What leadership said about AI on earnings calls (above the line, stacked by event type) vs how many AI roles the company actually posted (below the line). Each side scales to its own peak — read shape, not absolute height.
Trajectory events (left half, cyan) vs active AI roles posted (right half, slate), bucketed by stage. Darker cell = more activity for this company.
CFO Thomas Seifert on the efficiency gains of an AI-first operating model.
“By fully embracing an agentic AI-first organizational structure and operating model, as Cloudflare, Inc.’s revenue scales, our efficiency and productivity will scale even faster.”— Thomas Seifert
CEO Matthew Prince on the impact of AI on business models.
“In nearly every customer conversation, it is clear: the emergence of generative and agentic AI is not just redefining the economics of the Internet and software companies; it is redefining the business models of all companies, fundamentally reshaping how organizations are structured, operate, and create value.”— Matthew Prince
Management claims superior capital efficiency for AI workloads compared to hyperscalers, attributing it to Cloudflare's pay-only-for-work pricing model.
“we only charge for the actual work that's getting done, that means that we're just getting, you know, oftentimes, as much as 10x the amount of work off of the same GPU that you might get with a hyperscaler.”— Matthew Prince
CEO Matthew Prince articulates the company's vision for the "agentic Internet" as a fundamental shift in how software is consumed.
“Now the agentic Internet is emerging, and we can already see its trends... Agents, in other words, are the ultimate infrastructure multiplier.”— Matthew Prince
Matthew Prince outlines Cloudflare's strategic role in setting protocols and guardrails for the agentic Internet.
“The agents of the future will inherently have to pass through our network and abide by its rules. And as they do, we will help set the protocol, guardrails, and business rules for the agentic Internet of the future.”— Matthew Prince
Matthew Prince predicts the inevitability of AI-powered agents facilitating commerce.
“And it seems inevitable that more and more commerce will be facilitated by AI-powered agents working on our behalf.”— Matthew Prince
Cloudflare aims to play a foundational role in the post-search web, helping determine compensation for content creators and rules for AI agents.
“Cloudflare sits in a unique position to help figure out how content creators are compensated, what agents are allowed where and on what terms, and how the AI-driven web of the future will fit together.”— Matthew Prince
Cloudflare identifies opportunities for AI inference optimizations similar to training efficiencies, aiming for faster performance and lower prices.
“We are seeing that there are equivalent optimizations that can be made with AI inference on Cloudflare's platform, resulting in faster performance and lower prices for customers and higher margin, and less capex for us.”— Matthew Prince
Cloudflare positions Workers as the go-to platform for AI inference and agentic workflows due to serverless architecture and price performance.
“The killer application for Cloudflare Workers is turning out to be AI. The model of programming is uniquely suited for building tools like AI agents, and our serverless architecture, which allows you to pay only for what you use based on CPU or GPU type, positions Workers to become the go-to platform for developers who want the best price performance for AI inference and agentic workflows.”— Matthew Prince
Matthew Prince asserts that Cloudflare's network architecture is uniquely positioned to handle the increasing demands of AI inference.
“And so, so far, there haven't been -- we have not hit limits that our engineering team hasn't found ways around. And I think that we're -- we feel pretty optimistic that even as AI continues to accelerate, the place that you're going to want to do inference is on Cloudflare's network.”— Matthew Prince
Matthew Prince highlights Cloudflare's ability to deliver higher GPU utilization for inference compared to hyperscale public clouds.
“What we see when we survey customers that are trying to manage this themselves, through hyperscale public cloud is that they're getting utilization rates that are sort of in the 5% to 10% range of the resources that they're buying. We're able to deliver much higher utilization.”— Matthew Prince
CEO Matthew Prince's thesis on OHTTP and privacy in the context of AI development.
“As AI continues to develop it's an area where a lot of people are rethinking the privacy of the internet and thinking about how can they incorporate more modern standards. And to that extent, I think you will see that if there is continued growth in this space, it actually may be a lot of the AI companies that are leading, leading in that direction.”— Matthew Prince
CEO Matthew Prince's thesis that AI inference will split between edge devices and the network edge.
“inference is primarily going to happen in two places. The first is on devices themselves... But there will always be either devices that are older and maybe don't have the latest chips on them or models that are bigger and require more compute power than your handheld device is able to deliver. And in those cases, you're going to hand that inference task off to something else.”— Matthew Prince
No active AI roles currently tracked. Primary focus: Agent · Engineering.