Share: Malicious junk spider crawlers
User-agent: GPTBot
Disallow: /
User-agent: Amazonbot
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: Bingbot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: DataForSeoBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: Uptimebot
Disallow: /
User-agent: MJ12bot
Disallow: /
User-agent: MegaIndex.ru
Disallow: /
User-agent: ZoominfoBot
Disallow: /
User-agent: Mail.Ru
Disallow: /
User-agent: SeznamBot
Disallow: /
User-agent: BLEXBot
Disallow: /
User-agent: ExtLinksBot
Disallow: /
User-agent: aiHitBot
Disallow: /
User-agent: Researchscan
Disallow: /
User-agent: DnyzBot
Disallow: /
User-agent: spbot
Disallow: /
User-agent: YandexBot
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: Applebot
Disallow: /
User-agent: CensysInspect
Disallow: /
User-agent: MauiBot
Disallow: /

Reply: robots.txt is pretty much useless; plenty of crawlers simply ignore the robots protocol. You have to put these into a User-Agent blacklist instead....

Reply: Crawlers I can block on my end. Putting them in a UA blacklist works better than this.
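For illustration, here is a minimal sketch of such a UA blacklist as an nginx rule (nginx is an assumption; the thread does not name a web server). The regex simply reuses the bot names from the robots.txt above, with dots escaped; trim or extend the list to taste.

    # nginx sketch (assumed server): place inside the server {} block.
    # Rejects any request whose User-Agent contains one of the blacklisted
    # bot names, using a case-insensitive regex match.
    if ($http_user_agent ~* "(GPTBot|Amazonbot|AhrefsBot|Bingbot|DotBot|DataForSeoBot|SemrushBot|Uptimebot|MJ12bot|MegaIndex\.ru|ZoominfoBot|Mail\.Ru|SeznamBot|BLEXBot|ExtLinksBot|aiHitBot|Researchscan|DnyzBot|spbot|YandexBot|CCBot|Applebot|CensysInspect|MauiBot)") {
        return 403;    # hard-refuse the request instead of just asking nicely
    }

Unlike robots.txt, this is enforced server-side, but it still only catches crawlers that announce themselves honestly in the User-Agent string; bots that spoof a browser UA need IP-level or behavioral filtering on top.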