Cerebras - Is it an NVIDIA killer? $CBRS Attacking AI’s Next Bottleneck: Inference
Everyone knows NVIDIA. The stock went from $10 to $1,000. NVIDIA is, by every measure, the most dominant company in the AI revolution.
Cerebras is IPOing this week (May 13, 2026) with a valuation around $50 billion.
Let me tell you about Cerebras.
They Didn't Try to Beat NVIDIA at Their Own Game
That's the first thing you need to understand. Every other chip company, from AMD and Intel to a dozen startups, tried to out-NVIDIA NVIDIA. Better GPU, faster memory, lower price. They all got crushed.
Cerebras looked at the problem differently. They asked one question: what is actually slowing AI down?
The answer isn't compute power. It's communication. Every time data has to travel between chips, across cables, across boards, across racks, you lose time. NVIDIA's approach is hundreds of tiny chips talking to each other constantly. That conversation has latency. And latency kills performance.
They took an entire silicon wafer, the thing factories normally cut into hundreds of separate chips, and kept it whole. One chip, the size of a dinner plate. By keeping far more compute and memory on one wafer-scale engine, Cerebras removes a major source of latency: data movement across many separate chips. The result, the company claims, is the fastest AI inference on the planet.
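To make the data-movement point concrete, here is a toy latency model. Every number below is an illustrative assumption, not a measured figure for any real interconnect: each chip-to-chip hop pays a fixed latency on top of a bandwidth cost, while traffic that stays on one die pays almost neither.

```python
# Toy model of why inter-chip hops hurt: every transfer pays a fixed
# per-hop latency plus a bandwidth term. All numbers are illustrative
# assumptions, not specs for any real hardware.

def transfer_us(bytes_moved: float, bw_gb_s: float, hop_latency_us: float) -> float:
    """Time in microseconds to move `bytes_moved` over a link.
    bw_gb_s is in GB/s, i.e. 1e3 bytes per microsecond."""
    return bytes_moved / (bw_gb_s * 1e3) + hop_latency_us

# Moving a 1 MB activation tensor across one hypothetical chip-to-chip
# link (assume 100 GB/s and a 2 us hop latency):
one_hop = transfer_us(1e6, bw_gb_s=100, hop_latency_us=2.0)      # ~12 us
# The same data staying in hypothetical on-wafer memory
# (assume 20,000 GB/s and near-zero latency):
on_chip = transfer_us(1e6, bw_gb_s=20_000, hop_latency_us=0.05)  # ~0.1 us
```

Under these made-up numbers, one inter-chip hop is two orders of magnitude slower than keeping the data on the wafer, which is the whole architectural bet.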
The Numbers Are Genuinely Insane
Cerebras's chip is 58x larger than NVIDIA's B200 and, by the company's own comparisons, has 2,625x more memory bandwidth. On typical inference workloads it runs about 15x faster; on certain specialized workloads it has hit 1,000x.
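Why does memory bandwidth matter so much for inference? Single-stream token generation is often bandwidth-bound: each new token requires streaming roughly all the model's weights through the compute units once, so bandwidth caps tokens per second. A back-of-envelope sketch, where the model size and bandwidth figures are illustrative assumptions (not vendor specs):

```python
# If weight reads dominate, decode speed is bounded by
# bandwidth / bytes-of-weights-read-per-token. Illustrative only.

def max_tokens_per_sec(bandwidth_gb_s: float, params_b: float,
                       bytes_per_param: float = 2.0) -> float:
    """Upper bound on single-stream decode tokens/sec."""
    bytes_per_token = params_b * 1e9 * bytes_per_param  # weights streamed per token
    return bandwidth_gb_s * 1e9 / bytes_per_token

# Hypothetical 70B-parameter model in FP16 (2 bytes per parameter):
gpu_bound   = max_tokens_per_sec(bandwidth_gb_s=8_000, params_b=70)       # HBM-class
wafer_bound = max_tokens_per_sec(bandwidth_gb_s=21_000_000, params_b=70)  # on-wafer SRAM class
```

Under these assumptions the HBM-class chip tops out near 57 tokens/sec per stream while the on-wafer figure is in the hundreds of thousands, which is why a bandwidth multiple translates so directly into an inference-speed multiple.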
Revenue went from $24.6 million in 2022 to $510 million in 2025, roughly a 175% CAGR. The growth is real, but it is still customer-concentrated and execution-heavy. And they have $24.6 billion in contracted backlog already on the books before they've sold a single public share.
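The CAGR figure above is just arithmetic on the two revenue numbers, and it checks out:

```python
# Compound annual growth rate from $24.6M (2022) to $510M (2025), 3 years.
start, end, years = 24.6, 510.0, 3
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR = {cagr:.0%}")  # prints: CAGR = 175%
```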
But NVIDIA Isn't Going Anywhere
NVIDIA did $215.9 billion in revenue in FY2026. Cerebras did $510 million in 2025. That scale gap is enormous. NVIDIA's CUDA software ecosystem has nearly two decades of developer loyalty baked in. Thousands of AI researchers have built their entire careers on CUDA. You don't switch that overnight. For a lot of people, you don't switch it ever.
NVIDIA still dominates training and the broader AI infrastructure stack. CUDA, NVLink, networking, software, and developer inertia are massive advantages. Cerebras’ cleanest wedge is not replacing that entire stack. It is winning the narrow but rapidly growing market where inference speed, latency, and memory movement matter more than general-purpose GPU flexibility.
And Cerebras lost $146 million at the operating level in 2025. The GAAP profit headline is real, but it got a boost from non-operating items; the core business is still burning cash. At the IPO valuation, they need years of near-flawless execution to justify the price.
So What Is Cerebras Actually Competing For?
Inference. Running AI in real time. Every time you use ChatGPT, Claude, or any AI product, that's inference. And inference is growing faster than training. Bloomberg Intelligence projects the inference market will grow at twice the speed of training infrastructure through 2029.
This is the shift that matters. The AI industry spent five years obsessing over who could build the biggest model. Now the obsession is who can run it fastest and cheapest at scale. That's a completely different competition. And Cerebras built their entire company for exactly this moment.
The Real Comparison
NVIDIA is the highway. Cerebras is the sports car built specifically for that highway.
NVIDIA built the infrastructure that made AI possible. That story is mostly told. The next decade isn't about who can build the biggest training cluster. It's about who can make AI fast enough, cheap enough, and responsive enough to run inside every product, every app, every workflow on earth.
That's an inference problem. And right now Cerebras is the best answer to that problem on the market.
Is it an NVIDIA killer? Not today. Maybe not ever in training.
But in inference? In the market that's actually growing fastest? Cerebras isn't chasing NVIDIA. They're running a completely different race.
IPO is May 14th. Ticker is $CBRS.
Do your own research. But pay attention to this one.
Cerebras is a credible public-market pure play on the inference bottleneck: memory bandwidth, latency, and data movement. NVIDIA remains the AI platform king, but Cerebras may own a valuable lane inside the next phase of AI infrastructure.