Disallow search bots on all subdomains via robots.txt

Posted 2019-05-31 07:13

I want to disallow search robots from accessing the entire domain, including all subdomains, using robots.txt and potentially .htaccess.

I also want to make sure that any new subdomains created in the future are blocked automatically, without having to add a robots.txt to the root of each subdomain every time.

Is this possible?

1 Answer

我只想做你的唯一
Answered 2019-05-31 07:54

If you want to block robots via robots.txt, you'll have to create one for each subdomain. I suggest a script that monitors your zone file and automatically creates a robots.txt for each new subdomain; a rough sketch of such a script follows below.
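Here is a minimal sketch of that idea in Python. The zone file path, the /var/www/<subdomain> docroot layout, the polling interval, and the simplified record-parsing regex are all assumptions; adjust them to match your own DNS and web server setup.

```python
# Hedged sketch: watch a BIND-style zone file for new subdomain records and
# drop a "Disallow: /" robots.txt into each subdomain's document root.
# Paths and the docroot layout below are assumptions, not a universal layout.
import re
import time
from pathlib import Path

ZONE_FILE = Path("/etc/bind/db.example.com")   # assumed zone file location
DOCROOT_BASE = Path("/var/www")                # assumed per-subdomain docroots
ROBOTS_BODY = "User-agent: *\nDisallow: /\n"   # block every path for all bots

# Very simplified record matcher: "<label> [ttl] IN A|AAAA|CNAME ..."
RECORD_RE = re.compile(
    r"^([a-z0-9-]+)\s+(?:\d+\s+)?IN\s+(?:A|AAAA|CNAME)\s+",
    re.IGNORECASE | re.MULTILINE,
)

def subdomains_in_zone() -> set[str]:
    """Return the subdomain labels that have A/AAAA/CNAME records in the zone file."""
    text = ZONE_FILE.read_text()
    return {m.group(1).lower() for m in RECORD_RE.finditer(text)}

def write_robots(sub: str) -> None:
    """Create a disallow-all robots.txt in the subdomain's docroot if it is missing."""
    robots = DOCROOT_BASE / sub / "robots.txt"
    if not robots.exists():
        robots.parent.mkdir(parents=True, exist_ok=True)
        robots.write_text(ROBOTS_BODY)
        print(f"created {robots}")

if __name__ == "__main__":
    seen: set[str] = set()
    while True:
        current = subdomains_in_zone()
        for sub in current - seen:
            write_robots(sub)
        seen = current
        time.sleep(300)  # re-check the zone file every five minutes
```

You could run this from cron or as a small service; the key point is that every subdomain ends up serving a robots.txt that disallows everything.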

Another solution is HTTP Basic Auth. It will block all bots from crawling the subdomains (CNAMEs), but it will also require users to enter a username and password.
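For example, assuming Apache with AllowOverride configured to permit AuthConfig, a per-site .htaccess along these lines would do it (the .htpasswd path is an assumption; create the file with the htpasswd utility):

```apache
# Require a username/password for everything under this docroot.
# Bots that cannot authenticate get a 401 and never see the content.
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```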

Or you could use iptables to restrict access by IP range.
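A hedged sketch, assuming your allowed clients sit in a single range (203.0.113.0/24 is a documentation range used here as a placeholder for your own addresses):

```sh
# Allow HTTP/HTTPS only from the trusted range, drop everything else.
iptables -A INPUT -p tcp -m multiport --dports 80,443 -s 203.0.113.0/24 -j ACCEPT
iptables -A INPUT -p tcp -m multiport --dports 80,443 -j DROP
```

Note that this blocks all other visitors, not just bots, so it only fits sites that are meant to be private.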

There are multiple solutions you can use to prevent robots from accessing your subdomains, and there are even more solutions to prevent search engines from adding your pages to their index.

Which solution you should use depends on who you want to let in (good bots, bad bots, regular users, etc.).
