apache - Security: block bad spiders and bots from a website using .htaccess and HTTP_USER_AGENT

Tags: apache .htaccess redirect security user-agent

When building .htaccess rules to block common spiders and bots, which HTTP_USER_AGENT values should be filtered?

 RewriteEngine On
 RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
 RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
 RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
 RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
 RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
 RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
 RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
 RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
 RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
 RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
 RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
 RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
 RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
 RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
 RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
 RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
 RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
 RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
 RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
 RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
 RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
 RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
 RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
 RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
 RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
 RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
 RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
 RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
 RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
 RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
 RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
 RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
 RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
 RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
 RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
 RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
 RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
 RewriteCond %{HTTP_USER_AGENT} ^Zeus
 ## Return 403 Forbidden error.
 RewriteRule .* - [F]
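For readability, this kind of ruleset can often be collapsed into a single case-insensitive condition using regex alternation. A minimal sketch of that variant (only a handful of the agents above are shown, not the complete list):

 RewriteEngine On
 # One condition matching any of the listed agents; [NC] makes it case-insensitive.
 RewriteCond %{HTTP_USER_AGENT} (BlackWidow|ChinaClaw|HTTrack|WebZIP|Zeus) [NC]
 # Return 403 Forbidden to matching clients.
 RewriteRule .* - [F]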

Best Answer

Here is how I do it:

SetEnvIfNoCase user-agent "^BlackWidow" bad_bot 
# and all others from your list
SetEnvIfNoCase user-agent "^Zeus" bad_bot 
<FilesMatch "(.*)">
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</FilesMatch>
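Note that Order/Allow/Deny is Apache 2.2 access-control syntax. On Apache 2.4 (mod_authz_core), the same idea would look roughly like this sketch, assuming the bad_bot variable is set by the SetEnvIfNoCase lines as above:

SetEnvIfNoCase User-Agent "^BlackWidow" bad_bot
# and all others from your list
SetEnvIfNoCase User-Agent "^Zeus" bad_bot
<RequireAll>
    # Allow everyone except requests flagged as bad_bot
    Require all granted
    Require not env bad_bot
</RequireAll>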

Regarding apache - Security: block bad spiders and bots from a website using .htaccess and HTTP_USER_AGENT, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/7113948/
