No active AI roles currently tracked. Primary focus: Agent · Engineering.
What leadership said about AI on earnings calls (above the line, stacked by event type) vs how many AI roles the company actually posted (below the line). Each side scales to its own peak — read shape, not absolute height.
Trajectory events (left half, cyan) vs active AI roles posted (right half, slate), bucketed by stage. Darker cell = more activity for this company.
Company is in discussions regarding AI pay-per-crawl to control and monetize AI bot traffic.
CFO Thomas Seifert on the efficiency gains of an AI-first operating model.
“By fully embracing an agentic AI-first organizational structure and operating model, as Cloudflare, Inc.’s revenue scales, our efficiency and productivity will scale even faster.”— Thomas Seifert
Cloudflare is seeing hundreds of billions of agentic requests per month, growing exponentially.
GPU utilization rates approach 70% to 80%, significantly higher than hyperscaler single-digit averages.
CEO Matthew Prince on the impact of AI on business models.
“In nearly every customer conversation, it is clear: the emergence of generative and agentic AI is not just redefining the economics of the Internet and software companies; it is redefining the business models of all companies, fundamentally reshaping how organizations are structured, operate, and create value.”— Matthew Prince
97% of engineers use AI coding tools, and 100% of production code contributions are reviewed by autonomous AI agents.
Internal usage of AI increased 600% in the last three months.
Announced workforce reduction of 1,100 employees to transition to an agentic AI-first operating model.
Cloudflare launched an official MCP server for a financial services partner to enable AI agents to interact with payment services.
“a leading financial services company has partnered with us to launch an official MCP server designed to allow AI agents like Claude or Cursor or OpenAI to interact directly with the company's payment services.”— Matthew Prince
Management claims superior capital efficiency for AI workloads compared to hyperscalers due to their specific business model.
“we only charge for the actual work that's getting done, that means that we're just getting, you know, oftentimes, as much as 10x the amount of work off of the same GPU that you might get with a hyperscaler.”— Matthew Prince
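The utilization figures cited elsewhere in these remarks make the "as much as 10x" claim easy to sanity-check: at 70–80% GPU utilization versus a hyperscaler's 5–10%, the same GPU does roughly 7x to 16x more work. A minimal sketch of that arithmetic (the utilization ranges come from the calls; treating work as linearly proportional to utilization is a simplifying assumption):

```python
# Work delivered per GPU modeled as proportional to utilization
# (a simplifying assumption; real throughput also depends on batching,
# model size, and hardware generation).
cloudflare_util = (0.70, 0.80)   # 70-80% utilization, per management
hyperscaler_util = (0.05, 0.10)  # 5-10% utilization, per the customer survey quote

low = cloudflare_util[0] / hyperscaler_util[1]   # worst case: 0.70 / 0.10
high = cloudflare_util[1] / hyperscaler_util[0]  # best case: 0.80 / 0.05

print(f"Same GPU delivers roughly {low:.0f}x to {high:.0f}x more work")
```

The quoted "as much as 10x" figure sits inside this 7x–16x range, so the two sets of numbers are mutually consistent.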
Cloudflare acquired Humenated and ASTRO to advance its 'Act four' strategy, focusing on new Internet business models and developer frameworks.
Cloudflare is partnering with Anthropic and others to support the Model Context Protocol (MCP) to enable agentic commerce.
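For context on what "supporting MCP" means mechanically: the Model Context Protocol frames agent-to-service calls as JSON-RPC 2.0 messages, so an agent invoking a commerce or payment tool sends a `tools/call` request. A minimal sketch of that message framing (the tool name and arguments here are hypothetical illustrations, not from Cloudflare's or its partners' actual servers):

```python
import json

def mcp_tools_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool and arguments, for illustration only.
msg = mcp_tools_call(1, "create_payment", {"amount_usd": 20, "merchant": "example"})
print(msg)
```

Because every tool invocation flows through this uniform message shape, a network that proxies agent traffic can observe, authenticate, and meter it — which is the infrastructure role the surrounding remarks describe.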
“we're at the center of that future.”— Matthew Prince, describing the company's involvement in protocol standards and partnerships with providers such as Anthropic, Visa, Shopify, and Mastercard to enable the agentic commerce shift.
CEO Matthew Prince articulated the company's vision for the 'agentic Internet' as a fundamental shift in software consumption.
“Now the agentic Internet is emerging, and we can already see its trends... Agents, in other words, are the ultimate infrastructure multiplier.”— Matthew Prince
Cloudflare highlighted the adoption of its AI crawl control product, which allows publishers to gain visibility into and monetize data usage by AI models.
“This customer was facing a massive increase in AI scraping, which was crushing their network and driving up infrastructure costs. They chose Cloudflare to gain visibility into which AI models are consuming their data, allowing them to protect and eventually monetize their unique content.”— Matthew Prince
Cloudflare disclosed that AI agent traffic on its network more than doubled in January.
“Over the month of January alone, the number of weekly requests generated by AI agents more than doubled across the Cloudflare network.”— Matthew Prince
Cloudflare deployed quantum-safe cryptography across its entire network for every customer to pre-empt future quantum computing risks.
Cloudflare initiated the NetDollar project to create a digital currency for agent-to-agent commerce, designed for regulatory compliance.
Management estimates that 80% of leading AI companies rely on Cloudflare's network.
Matthew Prince outlines Cloudflare's strategic role in setting protocols and guardrails for the agentic Internet.
“The agents of the future will inherently have to pass through our network and abide by its rules. And as they do, we will help set the protocol, guardrails, and business rules for the agentic Internet of the future.”— Matthew Prince
Matthew Prince predicts the inevitability of AI-powered agents facilitating commerce.
“And it seems inevitable that more and more commerce will be facilitated by AI-powered agents working on our behalf.”— Matthew Prince
Cloudflare products will be natively available within Oracle Cloud Infrastructure (OCI), broadening distribution across hybrid and multi-cloud deployments.
Cloudflare is increasing investment in GPU rollout to support demand for AI inference in 2025.
Cloudflare aims to play a foundational role in the post-search web, helping determine compensation for content creators and rules for AI agents.
“Cloudflare sits in a unique position to help figure out how content creators are compensated, what agents are allowed where and on what terms, and how the AI-driven web of the future will fit together.”— Matthew Prince
Cloudflare identifies opportunities for AI inference optimizations similar to training efficiencies, aiming for faster performance and lower prices.
“We are seeing that there are equivalent optimizations that can be made with AI inference on Cloudflare's platform, resulting in faster performance and lower prices for customers and higher margin, and less capex for us.”— Matthew Prince
AI Gateway users are achieving more than 10x price-performance improvements for AI agents by serving requests from Cloudflare's cache.
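The >10x price-performance gain from caching follows from a simple blended-cost model: if a fraction h of requests are served from cache at near-zero cost, the effective cost per request drops toward (1 − h) of the full inference cost. A toy model (the cost ratio and hit rate below are illustrative assumptions, not disclosed figures):

```python
def effective_cost(inference_cost: float, cache_cost: float, hit_rate: float) -> float:
    """Blended cost per request when hit_rate of requests are served from cache."""
    return hit_rate * cache_cost + (1.0 - hit_rate) * inference_cost

# Illustrative assumptions: a cache hit costs 1% of a full inference
# request, and 95% of requests hit the cache.
full = 1.0      # normalized cost of one uncached inference request
cached = 0.01   # assumed relative cost of a cache hit
h = 0.95        # assumed cache hit rate

blended = effective_cost(full, cached, h)
improvement = full / blended
print(f"price-performance improvement: {improvement:.1f}x")
```

With these assumed numbers the blended cost is about 0.06 per request, roughly a 17x improvement — comfortably above the >10x reported, and sensitive mainly to the hit rate.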
Cloudflare positions Workers as the go-to platform for AI inference and agentic workflows due to serverless architecture and price performance.
“The killer application for Cloudflare Workers is turning out to be AI. The model of programming is uniquely suited for building tools like AI agents, and our serverless architecture, which allows you to pay only for what you use based on CPU or GPU type, positions Workers to become the go-to platform for developers who want the best price performance for AI inference and agentic workflows.”— Matthew Prince
Matthew Prince asserts that Cloudflare's network architecture is uniquely positioned to handle the increasing demands of AI inference.
“And so, so far, there haven't been -- we have not hit limits that our engineering team hasn't found ways around. And I think that we're -- we feel pretty optimistic that even as AI continues to accelerate, the place that you're going to want to do inference is on Cloudflare's network.”— Matthew Prince
Management observed a notable shift in customer buying behavior from AI training to AI inference, resulting in the company's first multimillion-dollar Workers AI contract.
Matthew Prince highlights Cloudflare's ability to deliver higher GPU utilization for inference compared to hyperscale public clouds.
“What we see when we survey customers that are trying to manage this themselves, through hyperscale public cloud is that they're getting utilization rates that are sort of in the 5% to 10% range of the resources that they're buying. We're able to deliver much higher utilization.”— Matthew Prince
A rapidly growing AI company signed a $7 million pool-of-funds contract for Workers AI, moving all of its workloads to Cloudflare as its single inference cloud platform.
CEO thesis on OHTTP (Oblivious HTTP) and privacy in the context of AI development.
“As AI continues to develop, it's an area where a lot of people are rethinking the privacy of the internet and thinking about how can they incorporate more modern standards. And to that extent, I think you will see that if there is continued growth in this space, it actually may be a lot of the AI companies that are leading in that direction.”— Matthew Prince
CEO thesis on the future of AI inference split between edge devices and network edge.
“inference is primarily going to happen in two places. The first is on devices themselves... But there will always be either devices that are older and maybe don't have the latest chips on them or models that are bigger and require more compute power than your handheld device is able to deliver. And in those cases, you're going to hand that inference task off to something else.”— Matthew Prince
Partnership with Meta to provide Llama models on Cloudflare's platform.
Inference requests powered by Cloudflare AI increased more than 700% quarter over quarter.
Developer accounts using AI functions increased 67% quarter over quarter.
AI customer realized 40% cost improvement using Workers AI for inference.