Using robots.txt, is it possible to restrict robot access for specific query string (parameter) values?
i.e.
http://www.url.com/default.aspx #allow
http://www.url.com/default.aspx?id=6 #allow
http://www.url.com/default.aspx?id=7 #disallow
You only need to specify the URLs that are disallowed. Everything else is allowed by default.
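For instance, a minimal sketch of a robots.txt that blocks only the id=7 URL from the question (assuming standard prefix matching, where rules are compared against the path plus query string):

User-agent: *
# Matches /default.aspx?id=7 (and anything starting with it); /default.aspx and /default.aspx?id=6 stay crawlable
Disallow: /default.aspx?id=7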
Can just the query variable be defined, such as
Disallow: /default.aspx?id=*
or better still
Disallow: /?id=
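For reference, a sketch of those two candidate rules, with caveats: the * wildcard is an extension honored by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt standard, a trailing * is redundant under prefix matching, /?id= only matches URLs whose path is exactly /, and either rule would block every id value rather than only id=7.

User-agent: *
# Prefix match, no wildcard needed: blocks /default.aspx?id=6, /default.aspx?id=7, ...
Disallow: /default.aspx?id=
# Wildcard form: blocks any URL containing ?id=, regardless of the page name
Disallow: /*?id=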