
The AI engineering stack we built internally — on the platform we ship

Blog post from Cloudflare

Post Details
Company: Cloudflare
Date Published:
Author: Ayush Thakur, Scott Roe-Meschke, and Rajesh Bhatia
Word Count: 1,096
Language: English
Hacker News Points: -
Summary

Cloudflare has integrated AI into its engineering stack over the past eleven months: 93% of its R&D organization now uses AI coding tools built on Cloudflare's own platform. Early work on internal MCP servers and AI tooling led to the formation of the iMARS team, part of the Dev Productivity group responsible for internal tooling such as CI/CD and automation. Adoption has measurably increased developer velocity, with a notable rise in merge requests and use across 295 teams.

A central component of this integration is the AI Gateway, which handles authentication, routing, and data policies for AI requests, giving the organization a single point of security and cost control. Workers AI, Cloudflare's serverless AI inference platform, reduces cost and latency by running open-source models on Cloudflare's global network, which suits tasks such as documentation review and other lightweight inference.

The architecture is organized into a platform layer, a knowledge layer, and an enforcement layer, with a centralized proxy Worker handling integration and management of AI tools. Because this infrastructure is composed primarily of shipping products, it exemplifies Cloudflare's practice of running internally on the platform it sells, in line with the broader goals of Agents Week.
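The post does not include code, but the routing role of the centralized proxy Worker can be sketched as a small decision function: lightweight tasks (like documentation review) are pointed at a smaller open-source model on Workers AI, and every request is addressed through an AI Gateway URL so authentication, routing, and data policies apply centrally. This is an illustrative sketch, not Cloudflare's implementation; the account ID, gateway name, task names, and model choices here are assumptions.

```typescript
// Hypothetical routing logic for a centralized proxy Worker.
// ACCOUNT_ID and GATEWAY are placeholders; the model IDs are examples of
// open-source models available on Workers AI, chosen for illustration.

type Task = "docs-review" | "code-completion" | "chat";

interface Route {
  model: string;      // Workers AI model identifier
  gatewayUrl: string; // AI Gateway endpoint that enforces central policy
}

const ACCOUNT_ID = "ACCOUNT_ID";  // placeholder account ID
const GATEWAY = "internal-ai";    // placeholder gateway name

function routeRequest(task: Task): Route {
  // Lightweight inference (e.g. documentation review) stays on a small
  // model; heavier interactive tasks get a larger one.
  const model =
    task === "docs-review"
      ? "@cf/meta/llama-3.1-8b-instruct"
      : "@cf/meta/llama-3.1-70b-instruct";
  return {
    model,
    // AI Gateway endpoints follow the pattern
    // gateway.ai.cloudflare.com/v1/{account}/{gateway}/{provider}/...
    gatewayUrl: `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY}/workers-ai/${model}`,
  };
}
```

Routing through the gateway URL rather than calling the model endpoint directly is what makes the enforcement layer possible: policy, logging, and cost controls sit in one place regardless of which team or tool issues the request.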