ASP.NET Request.Browser.Crawler - Dynamic Crawler List

Published 2019-01-15 14:05

Question:

I read "Why Request.Browser.Crawler is Always False in C#" (http://www.digcode.com/default.aspx?page=ed51cde3-d979-4daf-afae-fa6192562ea9&article=bc3a7a4f-f53e-4f88-8e9c-c9337f6c05a0).

Does anyone use a method to dynamically update the crawler list, so that Request.Browser.Crawler becomes genuinely useful?

Answer 1:

I've been happy with the results supplied by Ocean's Browsercaps. It supports crawlers that Microsoft's config files haven't bothered to detect. It will even parse out which version of the crawler is hitting your site, though I don't really need that level of detail.
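For context, once those richer browser definitions are registered in your application's browser capabilities configuration, the stock property starts returning true for the extra bots, so the calling code stays trivial. A minimal sketch of such a call site (the page class name and the tracking step are placeholders):

using System;
using System.Web;
using System.Web.UI;

public class TrackedPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The framework fills in these capabilities from the user-agent string
        // using whatever browser definition files are installed.
        HttpBrowserCapabilities caps = Request.Browser;
        if (caps.Crawler)
        {
            // caps.Browser / caps.MajorVersion report the matched crawler
            // name and version; skip per-visitor tracking for bots.
            return;
        }

        // ... normal tracking / personalization for human visitors ...
    }
}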



Answer 2:

You could check Request.UserAgent against a regex.

Peter Bromberg wrote a nice article about writing an ASP.NET Request Logger and Crawler Killer.

Here is the method he uses in his Logger class:

// requires: using System.Web; and using System.Text.RegularExpressions; at the top of the file
public static bool IsCrawler(HttpRequest request)
{
    // set the next line to "bool isCrawler = false;" to use this to deny certain bots
    bool isCrawler = request.Browser.Crawler;
    // Microsoft doesn't properly detect several crawlers
    if (!isCrawler)
    {
        // put any additional known crawlers in the Regex below
        // you can also use this list to deny certain bots instead, if desired:
        // just set "bool isCrawler = false;" for the first line in the method
        // and only keep the ones you want to deny in the following Regex list
        Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
        // UserAgent can be null for some clients, so guard before matching
        isCrawler = request.UserAgent != null && regEx.Match(request.UserAgent).Success;
    }
    return isCrawler;
}
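
To get back to the original question about keeping the list current, one option (my own variation, not from Bromberg's article) is to move the pattern list into appSettings so it can be edited without recompiling; the "CrawlerPatterns" key name and the fallback pattern below are assumptions:

using System.Configuration;
using System.Text.RegularExpressions;
using System.Web;

public static class CrawlerDetector
{
    // "CrawlerPatterns" is an assumed appSettings key, e.g. in web.config:
    // <add key="CrawlerPatterns" value="Slurp|Teoma|Ask Jeeves|Googlebot|bingbot" />
    // Editing web.config recycles the app, so the compiled regex is rebuilt with the new list.
    private static readonly Regex CrawlerRegex = new Regex(
        ConfigurationManager.AppSettings["CrawlerPatterns"] ?? "Slurp|Teoma|Googlebot|bingbot",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static bool IsCrawler(HttpRequest request)
    {
        // Start from the framework's own detection, then fall back to the configured list.
        if (request.Browser != null && request.Browser.Crawler)
            return true;

        string userAgent = request.UserAgent;
        return !string.IsNullOrEmpty(userAgent) && CrawlerRegex.IsMatch(userAgent);
    }
}

You could then call it from Global.asax or an HttpModule, e.g. if (CrawlerDetector.IsCrawler(HttpContext.Current.Request)) { /* log or deny */ }.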