You can check (with a regular expression) against Request.UserAgent.
Peter Bromberg wrote a good article on writing an ASP.NET Request Logger and Crawler Killer in ASP.NET.
Here is the method he uses in his Logger class:
using System.Text.RegularExpressions;
using System.Web;

public static bool IsCrawler(HttpRequest request)
{
    // Set the next line to "bool isCrawler = false;" to use this method to deny certain bots.
    bool isCrawler = request.Browser.Crawler;
    // Microsoft doesn't properly detect several crawlers.
    if (!isCrawler)
    {
        // Put any additional known crawlers in the Regex below.
        // You can also use this list to deny certain bots instead, if desired:
        // just set "bool isCrawler = false;" on the first line of the method
        // and list only the bots you want to deny in the following Regex.
        Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
        isCrawler = regEx.Match(request.UserAgent).Success;
    }
    return isCrawler;
}
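For context, here is a minimal sketch of how such a check could be wired into the ASP.NET pipeline from Global.asax. It assumes the method above lives in a class named Logger, as described; the 403 block-and-end policy is only one illustrative option, not something prescribed by Bromberg's article:

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = HttpContext.Current.Request;

        // Check the incoming user agent against the crawler detection above.
        // (Assumes IsCrawler is a static method on a class named Logger.)
        if (Logger.IsCrawler(request))
        {
            // Example policy (an assumption): deny the bot outright with a 403.
            HttpContext.Current.Response.StatusCode = 403;
            HttpContext.Current.Response.End();
        }
    }
}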
Nice! I'll look into it. – 2009-01-11 00:56:11