SEO blacklisting for cloaking [closed]

Posted 2019-04-11 09:48

I am using postbacks to perform paging on a large amount of data. Since I do not have a sitemap for Google to read, there are products Google will never know about, because Google does not push any buttons.

I am cloaking: if the user-agent is that of a search engine, I spit out all the products with no paging. There may be workarounds for situations like this, such as hidden links to the paged URLs.

What about information you want indexed by Google but want to charge for? Imagine I have articles that users should be able to find in Google, but when a user visits the page, only half the content is displayed and they must pay for the rest.

I have heard that Google may blacklist you for cloaking. I am not being evil, just helpful. Does Google recognize the intention?

Tags: seo
4 Answers
甜甜的少女心 · 2019-04-11 10:14

Here is a FAQ by Google on that topic. I suggest using CSS to hide some content: for example, give plain links to your products as an alternative to your buttons and set display:none; on them. The layout stays intact and search engines will still find your pages. Most search engines will not detect cloaking and similar techniques on their own, but competitors may report you. In any case: don't risk it. Use sitemaps, RSS feeds, XML documents, or even PDF files with links to expose your whole range of products. Good luck!

Aperson · 2019-04-11 10:23

Highly doubtful. If you are serving different content based on IP address or User-Agent from the same URL, it's cloaking, regardless of the intentions. How would a spider parse two sets of content and figure out the "intent"?

There is intense disagreement over whether "good" cloakers are even helping the user anyway.

Why not just add a sitemap?
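
A sitemap is a plain XML file following the sitemaps.org protocol. A minimal sketch, assuming hypothetical example.com product URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/products/1</loc>
  </url>
  <url>
    <loc>https://example.com/products/2</loc>
  </url>
</urlset>
```

Each `<loc>` entry is an absolute URL you want crawled, so every product behind your paging buttons can be listed without any cloaking.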

等我变得足够好 · 2019-04-11 10:25

I don't think Google will recognize your intent, unfortunately. Have you considered creating a sitemap dynamically? http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40318

何必那么认真 · 2019-04-11 10:30

This is why Google supports the sitemap protocol. The sitemap needs to render as XML, but it can certainly be a code-generated file, so you can produce it on demand from the database. Then point to it from your robots.txt file, and also tell Google about it explicitly in your Google Webmaster Console.
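
Generating the sitemap on demand can be a small sketch like the following, assuming a hypothetical list of product URLs pulled from your database (the `products` list and example.com domain are placeholders):

```python
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Render a list of absolute URLs as sitemap-protocol XML."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )


# In a real app this list would come from a database query.
products = [
    "https://example.com/products/1",
    "https://example.com/products/2",
]
print(build_sitemap(products))
```

Serve the returned string from an endpoint such as /sitemap.xml with an XML content type, and add a `Sitemap:` line pointing at it in robots.txt.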
