AI computing systems builder Cerebras Systems (CBRS) went public on May 14 following a highly successful IPO. The company is already being touted as a rival to AI giant Nvidia Corporation (NVDA). Cerebras's core idea was to make the "GPU" the entire silicon wafer: instead of tying multiple GPU chips together, its technology delivers similar or higher aggregate compute and memory bandwidth from a single wafer-scale engine.
Cerebras also claims that its Wafer-Scale Engine 3 offering is the fastest commercialized AI processor in the world, with inference up to 15 times faster than leading GPU-based solutions from Nvidia. The technology shows promise, and going public signals that the company is now ready to prove it to the world.
However, Nvidia has created an empire for itself. Its dominance is so stark that the company's name has become synonymous with artificial intelligence (AI). Advanced Micro Devices (AMD) is also a gigantic player in the market. So, Cerebras has a long way to go before it comes into contention with these behemoths.
About Cerebras Stock
Cerebras Systems designs and manufactures specialized AI computing systems built around its proprietary wafer-scale chips, which are optimized for large-scale machine learning training and high-speed inference. The company offers both on-premises systems that organizations can run as private supercomputers and cloud-based compute access, serving customers in research, enterprise, and large-scale AI applications.
Headquartered in Sunnyvale, California, CBRS focuses on enabling extremely fast model training and deployment while simplifying infrastructure requirements for deep learning workloads. After its first day of trading, Cerebras had a market capitalization of $66.95 billion.
In its May 14 debut, the stock gained 68.2% before closing at $311.07. The company raised $5.55 billion through its offering, making it the largest IPO since Uber (UBER) in 2019. This came at a time when the U.S. IPO market is facing concerns over reemerging tariff issues, private credit worries, and market volatility driven by tensions in the Middle East from the U.S.-Iran war.
And it's not the first time the company has tried to become a publicly listed company. In 2024, Cerebras attempted to go public in the U.S., but later withdrew its submission after it was flagged for heavy reliance on a single Middle Eastern customer.
At that time, Abu Dhabi-based and Microsoft-backed G42 accounted for 87% of revenue in the first half of that year. This time around, Cerebras has reduced its reliance on G42, which it says accounted for 24% of revenue last year. However, in 2025, the Mohamed bin Zayed University of Artificial Intelligence in the UAE generated 62% of the company's revenue.
How Are Cerebras’ Financials?
The company, which claims to build the fastest AI infrastructure in the world, did not come to market empty-handed. Cerebras generated $509.99 million in revenue for 2025, up 75.7% year-over-year (YOY).
It generated the majority of its top line from hardware sales, meaning sales of its AI systems and other equipment used for on-premises training and inference at customer locations. Last year, hardware revenue came to $358.44 million, up 69.1% YOY. The rest of the top line comes from the sale of cloud capacity through its dedicated-capacity and on-demand models, which stood at $151.55 million for last year.
On a GAAP basis, Cerebras earned net income of $237.83 million. Yet, on a non-GAAP basis, the company reported a net loss of $75.74 million, 247.9% wider YOY. As of year-end 2025, Cerebras' principal sources of liquidity were cash, cash equivalents, and restricted cash of $930.40 million and marketable securities of $406.50 million, with no outstanding debt.
Apart from notable customers in the Middle East, Cerebras has entered into a partnership with OpenAI, valued at more than $20 billion, to deploy 750 megawatts of Cerebras' high-speed AI compute. Furthermore, OpenAI has agreed to co-design its future models for upcoming Cerebras hardware. And finally, another notable multi-year deal was struck with AWS for the large-scale deployment of fast inference.
On the date of publication, Anushka Dutta did not have (either directly or indirectly) positions in any of the securities mentioned in this article. All information and data in this article is solely for informational purposes. For more information please view the Barchart Disclosure Policy here.