Cloudflare's Internal AI Tools Drive Near-Universal Engineer Adoption
At Cloudflare, artificial intelligence has moved from experiment to essential utility. Over 93% of the company's research and development staff now use AI tools for their daily coding work. This shift, achieved in under a year, is supported by a platform processing 241 billion tokens each month. The result is a tangible increase in productivity: weekly merge requests rose from approximately 5,600 to regularly exceeding 8,700.
The initiative began with a specialized internal team focused on deploying AI agents. This group, now part of the developer productivity organization, systematically integrated AI into core workflows. Every single code merge request is now examined by an AI reviewer. The system also streamlines onboarding and enforces repository standards.
The technical implementation is built on Cloudflare's own commercial products. AI Gateway acts as the central nervous system, managing over 20 million requests monthly while enforcing zero-data-retention policies. For inference, engineers use a mix of external models and Cloudflare's serverless Workers AI platform. Kimi K2.5, a long-context model recently added to the platform, now processes about 7 billion tokens daily and is reported to be significantly more cost-effective than some third-party options.
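The gateway-fronted inference path can be sketched as follows. This is a minimal illustration, not Cloudflare's internal code: the account and gateway identifiers are hypothetical placeholders, and only the URL layout follows Cloudflare's public AI Gateway convention of prefixing provider endpoints with a per-gateway base path.

```python
# Sketch of routing a Workers AI inference call through AI Gateway.
# ACCOUNT_ID and GATEWAY_ID are hypothetical; the URL layout follows
# Cloudflare's documented public pattern:
#   https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}/...

ACCOUNT_ID = "acct_example"      # hypothetical placeholder
GATEWAY_ID = "eng-ai-gateway"    # hypothetical placeholder

def gateway_url(provider: str, model: str) -> str:
    """Build the gateway endpoint that fronts a given provider/model."""
    return (
        f"https://gateway.ai.cloudflare.com/v1/"
        f"{ACCOUNT_ID}/{GATEWAY_ID}/{provider}/{model}"
    )

def build_inference_request(prompt: str, model: str) -> dict:
    """Assemble a request descriptor; the caller attaches credentials."""
    return {
        "url": gateway_url("workers-ai", model),
        "method": "POST",
        "json": {"prompt": prompt},
    }

req = build_inference_request("Summarize this diff.", "@cf/meta/llama-3.1-8b-instruct")
print(req["url"])
```

Because every call passes through the gateway, policies such as logging, caching, and data retention can be enforced in one place regardless of which model serves the request.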
A key to adoption was seamless integration. Engineers access everything through a single command-line login. Behind the scenes, a proxy application manages authentication and routing, preserving user anonymity. The company also developed a protocol to connect AI agents to internal tools like Jira, GitLab, and Backstage, creating a unified portal for over 180 functions.
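A unified tool portal of this kind amounts to a registry that maps namespaced tool names to internal service handlers, so an agent needs only one entry point. The sketch below is a simplified illustration under that assumption; the tool names and return values are hypothetical, not Cloudflare's actual catalog.

```python
# Minimal sketch of a unified tool portal: a registry mapping namespaced
# tool names (e.g. "jira.create_issue") to handlers. Names and payloads
# here are hypothetical illustrations, not Cloudflare's actual catalog.

from typing import Callable, Dict

REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def tool(name: str):
    """Decorator registering a handler under a namespaced tool name."""
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@tool("jira.create_issue")
def create_issue(args: dict) -> dict:
    # A real handler would call the Jira API via the authenticating proxy.
    return {"status": "created", "summary": args["summary"]}

@tool("gitlab.open_mr")
def open_mr(args: dict) -> dict:
    return {"status": "opened", "branch": args["branch"]}

def call_tool(name: str, args: dict) -> dict:
    """Single entry point: an agent requests any internal tool by name."""
    if name not in REGISTRY:
        raise KeyError(f"unknown tool: {name}")
    return REGISTRY[name](args)
```

With 180-plus functions behind one interface, agents discover capabilities by name rather than carrying per-service client code.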
To manage complexity, Cloudflare introduced a 'Code Mode' that prevents AI context windows from being overloaded with tool definitions. Each code repository now contains an auto-generated AGENTS.md file, detailing standards and boundaries for AI use. Enforcement is automated; an AI reviewer in the CI pipeline assesses all code changes, delegating analysis to specialized sub-agents for security, performance, and documentation.
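The delegating-reviewer pattern can be sketched as below. This is an assumption-laden toy, not Cloudflare's pipeline: the sub-agent checks are stand-in string heuristics where a real system would invoke model-backed reviewers, and the verdict labels are invented for illustration.

```python
# Sketch of a CI reviewer delegating a diff to specialized sub-agents
# (security, performance, documentation) and aggregating their findings.
# The checks are placeholder heuristics, not real analysis.

def security_agent(diff: str) -> list:
    return ["possible hardcoded secret"] if "password=" in diff else []

def performance_agent(diff: str) -> list:
    # Flag diffs that add more than one loop, as a crude stand-in check.
    return ["nested loops added"] if diff.count("for ") > 1 else []

def docs_agent(diff: str) -> list:
    return [] if '"""' in diff else ["missing docstring"]

SUB_AGENTS = {
    "security": security_agent,
    "performance": performance_agent,
    "documentation": docs_agent,
}

def review(diff: str) -> dict:
    """Fan the diff out to every sub-agent, then merge the results."""
    findings = {name: agent(diff) for name, agent in SUB_AGENTS.items()}
    verdict = "needs-work" if any(findings.values()) else "approve"
    return {"verdict": verdict, "findings": findings}
```

Splitting review across narrow sub-agents keeps each one's context small, the same pressure that motivates Code Mode's limit on tool definitions in the main context window.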
The company announced these internal metrics in April 2026, emphasizing that the stack is built entirely on products sold to its customers. This large-scale internal deployment, often called 'dogfooding,' demonstrates the platform's capability. As one engineer noted, the 93% adoption rate across R&D was achieved because the tools are simply part of the job now. For Cloudflare, the edge may be that its entire AI infrastructure runs on its global network, turning internal experience into a product roadmap.
Source: Webpronews