- OpenAI projects it will spend $121 billion on computing power for AI research in 2028 — and will still burn $85 billion that year even after nearly doubling its sales from the prior year, per confidential financial documents shared with investors ahead of this year’s funding rounds
- Losses of that scale would dwarf those of virtually any other public company in recorded history, making OpenAI's upcoming IPO one of the most unconventional investment propositions Wall Street has ever evaluated
- Anthropic faces the same structural challenge — mounting computing costs that threaten to outpace even its most optimistic revenue forecasts — though its planned spending is considerably lower than OpenAI’s
- Both companies are accelerating the pace of new model releases while pouring ever-larger resources into training runs, with each incremental jump in AI intelligence becoming harder and more expensive to achieve — a dynamic showing no sign of slowing
What Happened?
Confidential financial documents shared by OpenAI and Anthropic with investors ahead of their respective funding rounds this year reveal the fundamental economic reality facing Silicon Valley's two largest AI labs. OpenAI projects it will spend $121 billion on computing power for AI research alone in 2028, and even after nearly doubling its sales from the prior year, will still burn $85 billion in losses. Losses of that magnitude would dwarf those of virtually any other public company in recorded history. Anthropic's planned computing outlays are considerably smaller, but its financials tell a similar story: mounting infrastructure costs that threaten to outpace even the most optimistic revenue scenarios. Both companies are accelerating the pace at which they release new AI model versions and pouring ever-larger resources into the training runs that produce them, a dynamic showing no sign of abating as each incremental jump in intelligence becomes harder and more expensive to achieve.
Why It Matters?
The disclosures illuminate the fundamental tension at the heart of the AI industry’s business model: the capabilities that make these models commercially valuable require spending at a scale that makes traditional profitability concepts almost meaningless in the near term. OpenAI’s projected 2028 losses of $85 billion — after nearly doubling revenue — are not the losses of a distressed company. They are the losses of a company deliberately outspending its revenue to compound its intelligence advantage before rivals can close the gap. The entire commercial logic rests on a single enormous bet: that the company commanding the capability frontier today will eventually monetize it so comprehensively that today’s losses look trivial. Anthropic’s projections tell a similar story with a more conservative cost structure. For investors evaluating AI lab IPOs, the critical question is not current profitability — there will be none for years — but whether the market position being purchased at IPO is durable enough to justify the terminal value implied by the valuation.
What’s Next?
Both companies are racing toward potentially record-breaking IPOs by the end of 2026. When their full prospectuses become public, as SEC rules will require closer to listing, investors will see the complete scope of the financial commitments they are making. The computing cost trajectory is the central variable: each successive generation of AI models has required exponentially more compute to train, and there is no current evidence that this curve is flattening. If OpenAI's $121 billion compute projection for 2028 proves accurate, the company will require sustained, massive capital raises simply to stay in the race. The existential tail risk is a compute-efficiency breakthrough by a competitor that renders today's expensive training approaches obsolete at a fraction of the cost, a scenario that could fundamentally reprice the entire sector. The AI arms race has never been more expensive, and IPO investors who buy in this year are making a generational bet that the companies commanding the most compute today will still command the most capable models a decade from now.
Source: The Wall Street Journal