What Cloudflare’s 1,100 Layoffs Tell Us About AI Workforce Strategy

Cloudflare cut 1,100 jobs the same week it posted record revenue. The pattern is now structural across the enterprise, and Forrester and Gartner predict half of these reductions will quietly reverse. Here is what a sound AI workforce strategy actually requires.

On 7 May, Cloudflare announced it was cutting 1,100 jobs, roughly 20 percent of its workforce. In the same earnings report, the company posted record quarterly revenue of $639.8 million, up 34 percent year on year. CEO Matthew Prince framed the redundancies as a restructuring for “the agentic AI era.” On the same earnings call, he also told analysts the company will continue hiring and expects to have more employees in 2027 than at any point in 2026.

That contradiction is the central tension of every AI workforce strategy decision being made right now. Cut 1,100 people this week, while maintaining that within eighteen months you will employ more than the roughly 5,500 people you had before the cut. Read it twice. Then look at the market reaction: the stock fell 24 percent the next day.

What Cloudflare Actually Said

The official Cloudflare announcement cites internal AI usage rising 600 percent in three months. Employees across engineering, HR, finance and marketing now run thousands of agent sessions per day. According to the company, that productivity surge has “fundamentally changed” what the organisation needs to look like.

The framing is worth dissecting. Cloudflare is not claiming a specific role was replaced by a specific model. It is claiming that the people using AI are so much more productive that the support roles around them no longer justify their numbers. Prince put it bluntly on the call: a lot of the support roles “are not going to be the roles that drive companies going forward.”

That is a workforce claim, not a technology claim. And it sets up a pattern that operational leaders inside EU organisations should recognise immediately.

The Cloudflare Pattern Is Not New

If you have been watching enterprise AI announcements for the past eighteen months, the script is familiar. A board approves a workforce reduction. The CEO frames it as an AI productivity story. The press release talks about the future. The market rewards or punishes depending on the mood that week. Eighteen months later, the work that was supposed to be absorbed by the technology turns out to still need doing, and the rehiring quietly begins.

The Forrester evidence

Forrester’s Predictions 2026: The Future of Work report puts the consequences on the record: 55 percent of employers that laid off workers because of AI now regret it, and Forrester predicts about half of those reductions will be quietly reversed, whether through rehiring at lower compensation, through offshore arrangements, or through a slow rebuild that nobody mentions in a press release.

The more interesting Forrester finding sits in a different paragraph. Among generative AI decision-makers, 57 percent expect AI to increase headcount at their organisations; only 15 percent expect it to decrease. The people closest to the technology are not the ones writing the redundancy announcements.

The Gartner prediction

Gartner’s February 2026 forecast covers the same territory from a different angle. By 2027, 50 percent of companies that attributed headcount reduction to AI will rehire staff to perform similar functions, often under different titles. The most useful figure in the Gartner research is the smaller one: only 20 percent of customer service leaders surveyed had actually reduced agent staffing because of AI. The majority kept headcount steady while handling more customer interactions. The hype is louder than the practice.

Why AI Workforce Strategy Keeps Failing

The problem inside most organisations is not whether AI works. The problem is how the AI workforce strategy decision gets made.

Decisions get framed before the work is mapped

In a sound AI workforce strategy, the question “what does AI change about this role” comes before the question “how many people do we still need.” In practice the order is reversed. The board approves a target. Departments are asked to absorb it. The work is then quietly redistributed across the people who remain, pushed to an offshore team, or shadow-AI’d into existence by employees who never received training on the tools they are now expected to operate.

Across the EU, this pattern is producing measurable strain. Our recent analysis of workforce data shows that AI is producing more work for individuals, not less, wherever role redesign has not happened in advance of tool deployment.

No one inside owns the question

In most organisations, the question “what should our AI workforce strategy actually look like” has no internal owner. HR runs the headcount; IT runs the tools; finance signs the spend; consultants come and go. The AI Strategy Lead, the person who should be holding the operational picture together, either does not exist or has been parked under a function that cannot make cross-departmental decisions.

That gap is what produces Cloudflare-style announcements. Without an internal owner, “agentic AI era” becomes the script that fills the silence.

What Sound AI Workforce Strategy Looks Like

The discipline is not complicated. It is just rarely done in the right order.

Map the work before cutting the people. Identify which specific tasks inside a role are genuinely automatable, which require human judgement, which require institutional context, and which carry accountability that cannot be delegated to a model. The map is a precondition for any honest workforce decision; without it, the headcount target is a press release waiting to be reversed.

Build the management capability before scaling the tooling. AI tools produce value only when the manager around them has redesigned the workflow they sit inside. Most organisations buy the tools first and discover the management problem second. By then the redundancies are already announced and the institutional knowledge has walked out of the building.

Preserve that institutional knowledge deliberately. The people most likely to be cut in an AI-justified reduction are often the same people who hold the undocumented context that makes the function work. Lose them and the rehiring follows, usually within a year, usually at lower compensation, often offshore. The savings on the spreadsheet rarely survive the rebuild.

The Operational Case for Getting It Right

There is a structural reason to take this seriously beyond the obvious one. Rehiring at lower pay, or quietly offshore, damages institutional trust in ways that take years to repair. Customers notice the service gap. Remaining employees notice who was cut and conclude, correctly, that the next round will be no more strategic than the last. The cost of getting AI workforce strategy wrong is not just the rehire; it is the slow degradation of the workforce’s belief that decisions are being made for reasons that survive contact with reality.

The organisations doing this well are not the ones moving fastest. They are the ones moving in the right order. That order is what Future Prep’s AI Operational Strategy Prep Track is built to deliver: an AI workforce strategy that maps the role, redesigns the workflow, equips the manager, and protects the institutional knowledge that AI cannot reconstitute on its own. The full Future Prep Suite adds the governance and digital sovereignty layers that sit alongside it.

If your board is about to make a headcount decision in the name of AI, give yourself ninety days to map the work first. The Cloudflare reaction shows what the market does when you do it the other way round.

I wrote the long version in a book, AI Is Not a Technology Project, for the executives who would rather read it before announcing the redundancies than after.
