1. Add the following below the opening <?php tag in the site's entry file, index.php:

// Read the visitor's User-Agent
$ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
// Blacklist of malicious USER_AGENT strings
$now_ua = array('FeedDemon','BOT/0.1 (BOT for JCE)','CrawlDaddy','Java','Feedly','UniversalFeedParser','YisouSpider','ApacheBench','Swiftbot','ZmEu','Indy Library','oBot','jaunty','YandexBot','AhrefsBot','MJ12bot','WinHttp','EasouSpider','HttpClient','Microsoft URL Control','YYSpider','Python-urllib','lightDeckReports Bot');
// Block empty USER_AGENT: mainstream scrapers such as dedecms, and some SQL injection tools, send an empty USER_AGENT
if (!$ua) {
    header("Content-type: text/html; charset=utf-8");
    die('Please do not scrape this site!');
} else {
    foreach ($now_ua as $value) {
        // Case-insensitive substring match against the blacklist (eregi() was removed in PHP 7)
        if (stripos($ua, $value) !== false) {
            header("Content-type: text/html; charset=utf-8");
            die('Please do not scrape this site!');
        }
    }
}
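To confirm the filter behaves as expected, a quick probe can be run from the PHP CLI. This is only a minimal sketch: the target URL http://example.com/ is a placeholder for your own domain and the test User-Agent strings are arbitrary examples; it simply prints the start of each response so you can see whether the block message comes back for an empty or blacklisted User-Agent.

<?php
// Minimal verification sketch (the URL is a placeholder -- replace with your own site).
$tests = ['', 'Python-urllib/3.9', 'Mozilla/5.0 (ordinary browser)'];

foreach ($tests as $agent) {
    $context = stream_context_create([
        'http' => [
            'header'        => "User-Agent: {$agent}\r\n",
            'ignore_errors' => true,   // keep the response body even on a 403 from .htaccess
        ],
    ]);
    $body  = file_get_contents('http://example.com/', false, $context);
    $label = ($agent === '') ? '(empty UA)' : $agent;
    // Show only the start of the response so the block message is easy to spot
    echo $label . ' => ' . substr((string) $body, 0, 80) . "\n";
}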
2. Add the crawlers you want to block to the .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Python-urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} YisouSpider [NC]
RewriteRule .* - [F]
[Reference] Blocking crawlers by their User-Agent signature with an .htaccess file:

RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^Ezooms [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Ezooms/1.0 [NC]
RewriteRule .* - [F]
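Since the .htaccess conditions repeat the blacklist already kept in index.php, a small generator can keep the two lists in sync. The sketch below goes beyond the original post: the preg_quote() escaping, the OR-chained [NC] flags and the closing RewriteRule .* - [F] (deny with 403) are choices made here, and entries containing spaces would need extra quoting before being pasted into .htaccess.

<?php
// Sketch: emit the RewriteCond block from the same blacklist used in index.php.
$now_ua = ['Python-urllib', 'YisouSpider', 'AhrefsBot', 'MJ12bot', 'EasouSpider'];

$lines = ['RewriteEngine On'];
$last  = count($now_ua) - 1;
foreach ($now_ua as $i => $agent) {
    // OR-chain every condition except the last one
    $flags   = ($i === $last) ? '[NC]' : '[NC,OR]';
    $lines[] = 'RewriteCond %{HTTP_USER_AGENT} ' . preg_quote(trim($agent)) . ' ' . $flags;
}
$lines[] = 'RewriteRule .* - [F]';   // assumed: respond 403 to matching crawlers

echo implode("\n", $lines), "\n";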