AI crawlers, such as Anthropic's Claude and OpenAI's Deep Research, are causing disruptions across the internet. They drive up server resource usage, leaving website administrators with unexpected bandwidth and hosting bills. Many of these crawlers ignore robots.txt directives while consuming web server resources, hitting small and independent web admins the hardest. Tools such as Anubis and AI Labyrinth are being deployed to block AI crawlers, or to keep them occupied, and protect web resources.
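For admins who still want to state their preferences, a robots.txt file can name known AI crawler user agents, though as noted above compliance is voluntary and some crawlers ignore it entirely. A minimal sketch, using the crawler tokens Anthropic and OpenAI publish for their bots (ClaudeBot and GPTBot):

```
# Disallow known AI crawler user agents site-wide.
# Compliance with robots.txt is voluntary; this is a signal, not a block.
User-agent: ClaudeBot
Disallow: /

User-agent: GPTBot
Disallow: /

# All other crawlers may access the site normally.
User-agent: *
Disallow:
```

Because the file is purely advisory, admins dealing with non-compliant crawlers typically pair it with active measures such as the proof-of-work challenges Anubis issues or the decoy pages AI Labyrinth serves.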