
OpenAI just spent $20 billion on new chips to make ChatGPT dramatically faster
Sam Altman just made the biggest infrastructure bet in AI history.
OpenAI is spending over $20 billion on a new type of computer chip specifically designed to make ChatGPT faster, smarter and more responsive than ever.
These aren’t regular chips. They come from Cerebras, whose wafer-scale processors (the largest chips ever manufactured) pack the processing power of an entire server rack onto a single piece of silicon the size of a dinner plate.
The goal is simple: make ChatGPT respond in real time with zero lag, enable AI agents that can think and act for minutes or hours without stopping, and process your requests faster than you can type them.
OpenAI is also taking an equity stake in Cerebras, meaning the company behind ChatGPT now partially owns the company building the hardware powering it.
For context, $20 billion is more than most countries spend on their entire technology infrastructure in a year.
OpenAI has already surpassed $25 billion in annual revenue. It is now spending most of that on getting faster.
900 million people use ChatGPT every week. Every single one of them is about to notice the difference.