"NVIDIA wins the AI chip race" — that's been the consensus for three years.
An S-1 filed on April 17, 2026 is making that consensus harder to hold. Cerebras Systems submitted its IPO registration with the SEC, and the numbers are worth paying attention to: $510M in 2025 revenue, a $23B valuation following a $1B Series H, and an OpenAI contract reportedly worth "more than $10 billion." CEO Andrew Feldman put it plainly after the filing: "This contract is about displacing NVIDIA from OpenAI's inference business."
That sounds like marketing. But the numbers behind it aren't.
TL;DR
| Item | Detail |
|---|---|
| S-1 Filed | April 17, 2026 |
| Exchange | Nasdaq (ticker: CBRS) |
| Target Valuation | $23B (Series H basis) |
| 2025 Revenue | $510M |
| OpenAI Contract | $10B+ (inference compute) |
| Target IPO Date | Mid-May 2026 |
| Core Technology | Wafer Scale Engine (WSE-3) |
| Previous IPO Attempt | 2024 (withdrawn due to G42 federal review) |
1. What Makes Cerebras Different
Cerebras builds chips differently. A standard GPU starts as one die among dozens on a silicon wafer; the fab dices the wafer apart and packages each die separately. Cerebras uses the entire wafer as a single chip.
The WSE-3 packs 4 trillion transistors and roughly 900,000 AI cores onto one die, with 44 GB of on-chip SRAM delivering on the order of 21 petabytes per second of memory bandwidth. There is no inter-chip communication latency because everything lives on one substrate. For AI inference workloads, meaning large language models served at scale with real-time response requirements, this architecture has a genuine edge over GPU clusters.
The argument for wafer-scale makes more sense when you think about what inference actually is: billions of users waiting for tokens. Every millisecond of latency compounds. That's why OpenAI, which serves hundreds of millions of users daily, is willing to make a $10B+ bet on this architecture.
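The latency argument can be made concrete with a back-of-envelope calculation. Single-stream token generation is roughly memory-bandwidth-bound: producing each new token requires streaming the model's weights through the compute units once. The sketch below uses illustrative assumptions (a hypothetical 70B-parameter model in fp16, an HBM-class accelerator at about 3.35 TB/s, and Cerebras' claimed on-wafer bandwidth of about 21 PB/s). It ignores batching, compute limits, and interconnect effects, so treat it as upper-bound intuition, not a benchmark.

```python
# Back-of-envelope: sequential decode speed is roughly memory-bandwidth-bound.
# All figures below are illustrative assumptions, not vendor benchmarks.

def max_tokens_per_second(params_billions: float,
                          bytes_per_param: float,
                          bandwidth_tb_s: float) -> float:
    """Upper bound on single-stream tokens/sec: every generated token
    requires moving all model weights through the compute units once."""
    bytes_per_token = params_billions * 1e9 * bytes_per_param
    bandwidth_bytes_per_s = bandwidth_tb_s * 1e12
    return bandwidth_bytes_per_s / bytes_per_token

# Hypothetical 70B-parameter model in fp16 (2 bytes per parameter):
gpu_hbm = max_tokens_per_second(70, 2, 3.35)      # ~3.35 TB/s HBM-class part
wafer_sram = max_tokens_per_second(70, 2, 21000)  # ~21 PB/s on-wafer SRAM (vendor-claimed)

print(f"HBM-bound ceiling:  {gpu_hbm:,.0f} tokens/s")
print(f"SRAM-bound ceiling: {wafer_sram:,.0f} tokens/s")
```

The point is not the exact numbers but the ratio: when weights sit in on-chip SRAM rather than external HBM, the bandwidth ceiling on sequential decoding moves by orders of magnitude, which is exactly the regime that matters for real-time serving.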
I wrote a deep-dive into the WSE-3 specs a month ago, when the IPO was still a "maybe." Seeing an actual S-1 drop this fast was genuinely surprising.
2. What the S-1 Actually Reveals
The headline metric is $510M in 2025 revenue. For a hardware startup competing against NVIDIA, that's a meaningful number. But the more revealing detail is the structure of the OpenAI relationship.
The S-1 discloses a Master Relationship Agreement with OpenAI for inference compute. OpenAI also loaned Cerebras $1B and received stock purchase warrants. These two companies are structurally intertwined. That's both the thesis and the risk.
Thesis: As OpenAI scales inference to support more users, faster models, and more agentic workflows, Cerebras sits in the critical path of that demand. AWS also signed a data center chip supply agreement, showing some customer diversification.
Risk: Heavy customer concentration is a red flag in any S-1. If OpenAI builds its own inference silicon (already underway internally), or shifts its infrastructure strategy, what does Cerebras' revenue look like? The filing doesn't give a clean answer to that.
The broader funding environment is frothy right now. Cursor reached a $50B valuation two weeks ago, then SpaceX acquired an option to buy it at $60B. Capital is flowing into AI infrastructure across the board. That context helps Cerebras' timing.
3. Can It Actually Challenge NVIDIA? The Honest Assessment
Let's step back here.
The CEO's "displacing NVIDIA" framing is accurate only in a narrow sense: inference, not training. NVIDIA's position in model training remains dominant: H100, H200, Blackwell, and the CUDA ecosystem built on top of them over 15+ years. The one-liner `tensor.cuda()` in PyTorch represents a decade of developer habit. That doesn't get replaced in a product cycle or two.
Cerebras is not positioned as a full replacement. It's positioned as faster and lower-latency for inference. In the right workloads, that's real. In others, NVIDIA still wins.
The 2024 IPO withdrawal is worth keeping in mind too. The previous S-1 was pulled amid a federal national-security review (CFIUS) of Cerebras' ties to G42, the Abu Dhabi-based AI holding group. That issue has reportedly been resolved, but geopolitical risk is a permanent fixture for any AI chip company selling to US hyperscalers.
What's different this time: the OpenAI contract gives institutional investors a concrete revenue anchor. That changes the IPO math significantly.
4. What This Means for Developers
You don't need to change any code today. But there are a few things worth tracking.
Inference costs will compete harder. If Cerebras establishes itself as a credible inference alternative, NVIDIA faces pricing pressure. That flows downstream: lower infrastructure costs for providers like OpenAI and Anthropic, which eventually show up in API pricing.
Vendor lock-in gets more negotiable. Right now, serious AI inference basically requires the CUDA stack. If Cerebras or other alternatives gain market share, enterprises that want to diversify away from a single chip vendor will have real options.
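The lock-in point is ultimately a software-abstraction question: code written directly against one vendor's stack is what makes switching expensive. A minimal sketch of what a vendor-neutral inference layer could look like, using entirely hypothetical class and function names (none of these correspond to a real Cerebras or NVIDIA SDK):

```python
# Sketch of a provider-agnostic inference layer. All names here are
# hypothetical illustrations, not a real Cerebras or NVIDIA API.
from abc import ABC, abstractmethod

class InferenceBackend(ABC):
    @abstractmethod
    def generate(self, prompt: str, max_tokens: int) -> str: ...

class CudaBackend(InferenceBackend):
    def generate(self, prompt: str, max_tokens: int) -> str:
        # In real code this would call a CUDA-based serving stack.
        return f"[cuda] {prompt[:20]}..."

class WaferScaleBackend(InferenceBackend):
    def generate(self, prompt: str, max_tokens: int) -> str:
        # In real code this would call a wafer-scale provider's API.
        return f"[wse] {prompt[:20]}..."

def get_backend(name: str) -> InferenceBackend:
    """Swap hardware vendors behind a config value instead of a rewrite."""
    backends = {"cuda": CudaBackend, "wse": WaferScaleBackend}
    return backends[name]()

reply = get_backend("wse").generate("Summarize the S-1 filing", max_tokens=128)
```

Teams that already route inference through an interface like this can treat a new hardware vendor as a config change; teams with CUDA calls scattered through application code cannot, and that difference is where the negotiating leverage comes from.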
You won't feel it immediately. Direct access to Cerebras hardware is still limited to a small set of partners. The impact of this IPO hits infrastructure first, and filters to application-layer developers through pricing and latency improvements over the next year or two.
That said — if you're building anything that touches real-time inference at scale, it's worth watching how Cerebras grows post-IPO.
Closing: A Crack in the NVIDIA Moat
Three years of "NVIDIA wins everything" is starting to pick up asterisks.
A successful $23B IPO for Cerebras wouldn't just be one company going public. It would be a market signal that NVIDIA alternatives in inference can actually work at scale. The $10B+ OpenAI contract is the proof of concept.
There's still a lot of uncertainty: customer concentration, geopolitical risk, NVIDIA's counterattack in inference. I don't know how this plays out, and honestly neither does anyone else.
But the S-1 is worth reading. It shows you where the money is flowing in AI infrastructure, and which bets people are making on what wins next.
Do you think Cerebras can genuinely compete with NVIDIA in inference? Or does NVIDIA's software moat hold regardless of hardware alternatives?
Sources
- AI chip startup Cerebras files for IPO — TechCrunch, April 18, 2026
- Cerebras Systems Files for IPO After $23B Valuation and OpenAI Deal — The AI Insider, April 21, 2026
- Cerebras IPO: $510M Revenue, $10B OpenAI Deal, $23B Valuation — Tech Insider, April 2026
- Inside Cerebras' IPO filing — Axios, April 20, 2026
Related reading:
- Cerebras WSE-3 Full Spec Breakdown: 4 Trillion Transistors vs NVIDIA - Deep dive into WSE-3 architecture and the NVIDIA comparison
- Why Cursor Is Now Worth $50 Billion: $2B ARR, 70% Fortune 1000 - The AI tool startup valuation explosion explained