左右不逢缘 posted 4 days ago

Share: a robots.txt blocklist for malicious junk spiders/crawlers

User-agent: GPTBot
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Bingbot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: Uptimebot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: MegaIndex.ru
Disallow: /

User-agent: ZoominfoBot
Disallow: /

User-agent: Mail.Ru
Disallow: /

User-agent: SeznamBot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: ExtLinksBot
Disallow: /

User-agent: aiHitBot
Disallow: /

User-agent: Researchscan
Disallow: /

User-agent: DnyzBot
Disallow: /

User-agent: spbot
Disallow: /

User-agent: YandexBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Applebot
Disallow: /

User-agent: CensysInspect
Disallow: /

User-agent: MauiBot
Disallow: /

Crystαl posted 4 days ago

robots.txt is useless here. Many crawlers simply don't honor the robots protocol at all; you need to put these bots in a User-Agent blacklist instead...
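The User-Agent blacklist the replies recommend is usually done in the web server or a reverse proxy, but the matching itself can be sketched in a few lines. A minimal sketch, assuming a hand-picked bot list taken from the robots.txt above (the list, function name, and matching rule are illustrative choices, not a fixed standard):

```python
import re

# Assumed blacklist drawn from the robots.txt shared above; trim or extend
# it for your own site. Matching is on substrings of the UA header, since
# real bot UAs embed the bot name inside a longer string.
BAD_BOTS = [
    "GPTBot", "Amazonbot", "AhrefsBot", "SemrushBot", "MJ12bot",
    "DotBot", "DataForSeoBot", "BLEXBot", "CensysInspect", "MauiBot",
]

# One case-insensitive pattern matching any blacklisted token.
BAD_BOT_RE = re.compile("|".join(map(re.escape, BAD_BOTS)), re.IGNORECASE)

def is_bad_bot(user_agent: str) -> bool:
    """Return True if the User-Agent header matches the blacklist."""
    return bool(BAD_BOT_RE.search(user_agent or ""))
```

In practice you would call `is_bad_bot()` on each request's `User-Agent` header (e.g. in a WSGI middleware) and return 403 on a match; the same token list can be dropped into an nginx `map`/`if` rule or an Apache `RewriteCond` for server-level blocking.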

浅生 posted 4 days ago

Crawlers? I can set up protection against them on my end.

婷姐 posted 4 days ago

Putting them in a UA blacklist works better than this.