Robots.txt for multiple domains

Posted 2019-01-25 06:58

We have a different domain for each language:

  1. www.abc.com
  2. www.abc.se
  3. www.abc.de

And then we have a different sitemap.xml for each site. In robots.txt, I want to add a sitemap reference for each domain.

  1. Is it possible to have multiple sitemap references, one per domain, in a single robots.txt?
  2. If there are multiple references, which one does a crawler pick?

Tags: seo
3 Answers
迷人小祖宗 · 2019-01-25 07:24

Based on Hans2103's answer, I wrote this variant, which should be safe to include in just about every web project:

# URL rewrite solution for robots.txt for multiple domains on a single docroot
# (Apache does not allow trailing comments on directive lines, so the notes sit above each condition.)
# Skip if the request resolves to an existing directory...
RewriteCond %{REQUEST_FILENAME} !-d
# ...or to an existing file (a physical robots.txt wins)
RewriteCond %{REQUEST_FILENAME} !-f
# ...and only rewrite when the domain-specific robots file exists (-f needs an absolute path)
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]

With these conditions, the normal robots.txt is served untouched whenever it exists as a physical file; only when it does not is the request rewritten to robots/<domain.tld>.txt, and only if that file actually exists in the robots/ directory.

N.B.: The above rewrite has not yet been tested. Feel free to correct me if you find any flaws; I will update this post for future reference.

走好不送 · 2019-01-25 07:33

I'm using the following solution in .htaccess, placed after all domain redirects and the www-to-non-www redirect.

# Rewrite URL for robots.txt
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]

Create a new directory in your document root called robots, and inside it create one text file per domain containing that domain's robots directives (a sample file follows the list below):

  • /robots/abc.com.txt
  • /robots/abc.se.txt
  • /robots/abc.de.txt
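
For illustration, a minimal per-domain file might look like the following sketch. The sitemap URL is an assumption based on the domains in the question; point it at wherever your sitemap actually lives.

/robots/abc.se.txt (hypothetical contents):

# Allow all crawlers everywhere on this host
User-agent: *
Disallow:

# Absolute URL to this domain's sitemap (assumed location)
Sitemap: https://www.abc.se/sitemap.xml

Note that %{HTTP_HOST} holds the host exactly as requested, which is why the rule belongs after the www-to-non-www redirect; without that redirect the files would have to be named www.abc.com.txt, www.abc.se.txt, and so on.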
走好不送 · 2019-01-25 07:45

A robots.txt can only inform search engines of sitemaps for its own domain, so that is the only sitemap reference a crawler honors when it reads that domain's robots.txt. If all three domains map to the same website and share one robots.txt, the search engines will still effectively find each domain's sitemap.
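
For illustration: the Sitemap directive takes an absolute URL and may appear more than once, so a single shared robots.txt could reference all three sitemaps at once. The sitemap paths below are assumptions based on the domains in the question:

# Shared robots.txt served for all three domains (sitemap URLs assumed)
User-agent: *
Disallow:

Sitemap: https://www.abc.com/sitemap.xml
Sitemap: https://www.abc.se/sitemap.xml
Sitemap: https://www.abc.de/sitemap.xml

The Sitemap directive is independent of the User-agent groups, so its position in the file does not matter.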
