Is there a way to prevent Googlebot from indexing part of a page?

Published 2019-06-15 18:16

Is it possible to fine-tune directives to Google to such an extent that it will ignore part of a page, yet still index the rest?

There are a couple of different issues we've come across which would be helped by this, such as:

  • RSS feed/news ticker-type text on a page displaying content from an external source
  • users who enter contact details (phone numbers, etc.) and want them visible on the site, but would rather they not be Google-able

I'm aware that both of the above can be addressed via other techniques (such as writing the content with JavaScript), but am wondering if anyone knows if there's a cleaner option already available from Google?
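The JavaScript workaround mentioned above usually amounts to keeping the sensitive text out of the served HTML and injecting it client-side. A minimal sketch of that idea, where the Base64 encoding, the sample phone number, and the element id are all illustrative assumptions rather than anything Google prescribes:

```javascript
// The phone number is stored Base64-encoded so it never appears
// as plain text in the page source delivered to crawlers.
const encodedPhone = "KzEgNTU1IDAxMjM=";

// Decode on the client. Note this is obfuscation, not a guarantee:
// crawlers that execute JavaScript (as Googlebot now does when
// rendering pages) may still see the decoded value.
function revealPhone(encoded) {
  return atob(encoded);
}

// In the browser you would then inject it into the page, e.g.:
// document.getElementById("phone").textContent = revealPhone(encodedPhone);
```

This keeps the detail usable for visitors while making it harder to scrape, but it is exactly the kind of indirect technique the question is hoping to avoid.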

I've been doing some digging on this and came across mentions of googleon and googleoff tags, but these seem to be exclusive to Google Search Appliances.

Does anyone know if there's a similar set of tags to which Googlebot will adhere?

Edit: Just to clarify, I don't want to go down the dangerous route of cloaking/serving up different content to Google, which is why I'm looking to see if there's a "legit" way of achieving what I'd like to do here.

8 answers
叛逆
#2 · 2019-06-15 19:22

Yes, you can definitely stop Google from indexing parts of your website by creating a custom robots.txt file and listing the sections you don't want indexed, such as /wp-admin/ or a particular post or page. Before creating it, check your site's existing robots.txt, for example at www.yoursite.com/robots.txt.
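Note that robots.txt operates at the URL level: it can exclude whole pages or directories from crawling, but not fragments within a single page, which is what the question actually asks about. A minimal sketch of such a file (the paths are examples, not recommendations):

```
# Rules applying only to Googlebot
User-agent: Googlebot
Disallow: /wp-admin/
Disallow: /private-page/

# All other crawlers may fetch everything
User-agent: *
Disallow:
```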

我命由我不由天
#3 · 2019-06-15 19:23

In short, no - unless you use cloaking, which is discouraged by Google.
