Cerebras Prices at $185, Closes at $311, the AI Chip IPO of 2026
Cerebras priced its IPO at $185 on Wednesday night, well above the marketed range, sold 30 million shares, and pulled in $5.55 billion. Then on Thursday it opened at $350, peaked at $386, and closed up 68% at $311.07. That gives the wafer-scale chip company a roughly $95 billion market cap and makes this the biggest US tech IPO since Uber in 2019.
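The offering math checks out, and it's worth seeing how the headline numbers fit together. A quick sketch (all figures are taken from the reporting above; nothing here is new data):

```python
# Back-of-the-envelope check on the reported IPO figures.
ipo_price = 185.00        # offer price per share
shares_sold = 30_000_000  # shares sold in the offering
close_price = 311.07      # first-day closing price

# Gross proceeds: offer price times shares sold.
proceeds = ipo_price * shares_sold

# First-day "pop": close relative to the offer price.
first_day_pop = (close_price - ipo_price) / ipo_price

print(f"Gross proceeds: ${proceeds / 1e9:.2f}B")  # $5.55B
print(f"First-day pop:  {first_day_pop:.0%}")     # 68%
```

The $5.55 billion raised and the 68% close both fall straight out of the priced terms, so the reported numbers are internally consistent.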
The financials underwriting that price are real. Revenue last year was $510 million, up 76%. Net income swung from a $481.6 million loss the prior year to $88 million in the black. That makes Cerebras the rare AI-chip company shipping actual results: not a trillion-dollar capex story stock, but an operating business with a revenue ramp and a path to profitability.
What does Cerebras actually do that matters to agent builders? It builds wafer-scale-engine chips with 4 trillion transistors and 900,000 cores on a single piece of silicon. The thesis it sold the public market is that agent inference workloads, which are long-running, token-hungry, and latency-sensitive, favor wafer-scale dataflow over GPU cluster fabric. Cerebras's inference cloud already serves Llama 4, Qwen, Mistral, and others at speeds and per-token prices Nvidia can't match at scale.
The structural read is this: public-market investors are underwriting the agent-economy thesis at $95B before the agent economy has hit its first revenue inflection. Compare that to Fractile's $220M Series B yesterday, with Etched, Groq, and SambaNova all still private. Cerebras just took the agent-inference compute thesis public, and the market handed it the biggest first-day debut in seven years. The chip layer is no longer the boring substrate; it just became the most valuable narrative in agent infrastructure.