Just wanted to know if it is possible to disallow the whole site to crawlers and allow only specific web pages or sections. Is "Allow" supported by crawlers like FAST and Ultraseek?
Answer 1:
There is an Allow directive, but there's no guarantee that a particular bot will support it (much like there's no guarantee a bot will even check your robots.txt in the first place). You can probably tell by examining your web server logs whether specific bots are indexing only the parts of your site that you allow.
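As a rough illustration of that log check, here is a minimal Python sketch. It assumes a combined-format access log; the file name, bot string, and allowed prefix are placeholders you would swap for your own:

import re

# Hypothetical values -- adjust for your own log location, bot name, and allowed prefix.
LOG_FILE = "access.log"
BOT_NAME = "FAST"            # substring to look for in the User-Agent field
ALLOWED_PREFIX = "/public/section1/"

# Combined log format: the request path is the second token of the quoted request
# line, and the user agent is the last quoted field.
line_re = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*".*"([^"]*)"\s*$')

with open(LOG_FILE) as f:
    for line in f:
        m = line_re.search(line)
        if not m:
            continue
        path, user_agent = m.groups()
        if path == "/robots.txt":
            continue  # bots are expected to fetch robots.txt itself
        if BOT_NAME.lower() in user_agent.lower() and not path.startswith(ALLOWED_PREFIX):
            print("bot fetched a disallowed path:", path)

If a compliant bot shows up in that output, it is either ignoring your robots.txt or interpreting the Allow rule differently than you expect.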
The format for allowing just a particular page or section of your website might look like:

User-agent: *
Allow: /public/section1/
Disallow: /

Note the User-agent line, which is required for a valid record, and that for parsers which evaluate rules in order (first match wins), the Allow line should come before the broad Disallow.
This should prevent compliant bots from crawling or indexing anything except content under /public/section1/.
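If you want to sanity-check how a rules-compliant parser reads that file, Python's standard urllib.robotparser is one way; a minimal sketch (example.com and the test paths are placeholders):

from urllib import robotparser

# The same rules as above, fed to the stdlib robots.txt parser.
rules = [
    "User-agent: *",
    "Allow: /public/section1/",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Allowed: falls under /public/section1/
print(rp.can_fetch("*", "https://example.com/public/section1/page.html"))  # True
# Blocked: everything else matches Disallow: /
print(rp.can_fetch("*", "https://example.com/private/page.html"))          # False

Of course, this only tells you how one well-behaved parser interprets the rules; whether FAST or Ultraseek honor Allow is still up to their implementations.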