Cloudflare is introducing two new tools that let website owners control whether AI bots can access their content for model training. The first lets customers create and manage a robots.txt file that blocks or allows specific crawlers; Cloudflare's managed robots.txt offering automatically updates its directives to block popular AI crawlers and to keep them from crawling sites that contain ads. The second tool blocks AI bots only on the portions of a site that are monetized through ads: a new feature detects when ads are shown on a hostname, so customers can block AI bots on just those pages. Together, the tools give more granular control over AI bot activity and aim to protect content creators from having their work scraped for AI model training without consent.

With these controls, website owners can decide how their content is used by AI models and push for fair compensation for their work. Cloudflare frames the effort as part of building a better Internet by protecting independent publishers and promoting transparency around AI-powered crawlers. The company says it will continue to monitor the IETF's pending proposal on this topic and add more granular AI bot management controls as needed.
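To make the robots.txt mechanism concrete, here is a minimal sketch of the kind of directives a managed robots.txt might contain, checked with Python's standard-library `urllib.robotparser`. The specific crawler names used (GPTBot, CCBot, Google-Extended) are publicly documented AI crawler user agents, but the exact list Cloudflare's managed offering blocks is an assumption here, and `example.com` is a placeholder domain.

```python
from urllib import robotparser

# Illustrative directives of the kind a managed robots.txt might generate.
# The crawler names below are real, publicly documented AI user agents,
# but the exact set Cloudflare blocks is an assumption in this sketch.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

# Parse the directives and test which user agents may fetch a page.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/articles/"))      # False: AI crawler blocked
print(rp.can_fetch("Mozilla/5.0", "https://example.com/articles/")) # True: regular agents allowed
```

Note that robots.txt is advisory: well-behaved crawlers honor it, which is why Cloudflare pairs it with enforcement at the network edge rather than relying on the file alone.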