Restrict robot access for (specific) query string

Posted 2019-07-02 09:51

Using robots.txt, is it possible to restrict robot access for specific query string (parameter) values?

e.g.

http://www.url.com/default.aspx  #allow
http://www.url.com/default.aspx?id=6  #allow
http://www.url.com/default.aspx?id=7  #disallow

2 Answers
Answer 2 · 2019-07-02 10:35
User-agent: *
Disallow: /default.aspx?id=7  # disallow
Disallow: /default.aspx?id=9  # disallow
Disallow: /default.aspx?id=33 # disallow

etc...

You only need to specify the URLs that are disallowed; everything else is allowed by default.
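One thing to keep in mind is that robots.txt rules are prefix matches, so `Disallow: /default.aspx?id=7` also blocks `/default.aspx?id=70`. You can sanity-check rules like these with Python's standard-library `urllib.robotparser` (a sketch; the rules are fed in directly rather than fetched from the site):

```python
import urllib.robotparser

# Parse the rules above directly instead of fetching robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /default.aspx?id=7",
    "Disallow: /default.aspx?id=9",
    "Disallow: /default.aspx?id=33",
])

print(rp.can_fetch("*", "http://www.url.com/default.aspx"))        # True (allowed)
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=6"))   # True (allowed)
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=7"))   # False (disallowed)
# Prefix matching means this is blocked too:
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=70"))  # False
```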

Answer 3 (by 来,给爷笑一个) · 2019-07-02 10:43

You can target just the query variable, such as:

Disallow: /default.aspx?id=*

Note that the `*` wildcard is not part of the original robots.txt specification; major crawlers such as Googlebot and Bingbot support it as an extension. Under the standard prefix-matching rules, plain `Disallow: /default.aspx?id=` has the same effect. Be careful with:

Disallow: /?id=

This only matches URLs at the site root, such as `/?id=7`; to block an `id` parameter on any path with wildcard-capable crawlers, you would need `Disallow: /*?id=`.
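Googlebot-style matching (prefix match, where `*` matches any run of characters and a trailing `$` anchors the pattern at the end of the URL) can be sketched as follows; the helper function here is my own illustration, not a library API:

```python
import re

def googlebot_match(pattern: str, path: str) -> bool:
    """Sketch of Googlebot-style robots.txt rule matching:
    rules are prefix matches, '*' matches any run of characters,
    and a trailing '$' anchors the pattern at the end of the URL."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'; anchor at the start.
    regex = "^" + "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

print(googlebot_match("/default.aspx?id=*", "/default.aspx?id=7"))  # True
print(googlebot_match("/*?id=", "/default.aspx?id=7"))              # True
print(googlebot_match("/?id=", "/default.aspx?id=7"))               # False: only matches the root
```

This illustrates why `Disallow: /?id=` is narrower than it looks: the rule is anchored at the start of the path, so it never reaches a query string on `/default.aspx`.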
