Welcome to Cautious Optimism, a newsletter on tech, business, and power. Share today’s free article with your colleagues and support by tapping subscribe.
📈 Trending Up: Strikes … Oura, it turns out (BG) … AI for sales … cloud backups … US GDP … storm surges in Florida … nuclear power, at last …
📉 Trending Down: Connectivity … Soviet Reddit … home buying limits in China … the Canadian birth rate … unicorn health …
YC drama: Drama in tech is often clarifying, and it seems that there’s little better fodder for discourse than open-source tech. The WordPress-Automattic-WPEngine beef is a good, recent example. Another is PearAI, a recent YC-backed startup that cloned some open-source software and then “slapped its own made-up closed license on it” that the company later admitted was “written by ChatGPT,” TechCrunch’s Julie Bort reports.
The startup has since backed off that license, but how to be a good citizen in open-source will never stop being a topic of conversation. Here, drama and argument help reinforce open-source norms by shaming those who flout them.
Cerebras is going public!
Just yesterday CO was carping about a dearth of IPOs in the United States during our look at Swiggy’s filing to list in India. Consider our words eaten, as after this newsletter went out, Cerebras Systems filed to go public here in the States.
Crunchbase reports that Cerebras raised $715 million while private, including a $250 million Series F that gave the company a valuation of just over $4 billion, calculated on a post-money basis.
So, what did Benchmark, Altimeter, Eclipse Ventures, Foundation Capital, and the Abu Dhabi Growth Fund get for their money? Let’s find out.
The basics
Cerebras builds big chips that can run AI workloads (Cerebras Wafer-Scale Engine, or WSE), hardware to support its chips, a megacluster of its chips that it calls an “AI Supercomputer,” software to run AI loads on its chips (Cerebras Software Platform or CSoft), an inference service delivered via an API, and other AI-related services.
At the core of it all is the Cerebras WSE, which is, effectively, a big-ass chip:
I’m not being wry; the size here really does matter. Cerebras has packed 900,000 compute cores onto a single chip, which it calculates is 52x the number in the “leading commercially available GPU.” The WSE also has 44 gigabytes of SRAM and can sling data at 21 petabytes per second. Those are big numbers.
Why does it matter that Cerebras has built a big chip that can handle AI workloads, when other companies offer smaller chips that you can simply mash together into a single cluster to do the same? Cerebras argues that parceling genAI training work out to so many smaller brains creates issues:
[T]raining a large GenAI model on GPUs in a tractable amount of time requires breaking up the model and calculations, and distributing the pieces across hundreds or thousands of GPUs, creating extreme communication bottlenecks and power inefficiencies.
And when it comes to inference — primer here, if you need it — the company argues that its big-chip model is also better:
During generative inference, the full model must be run for each word that is generated. Since large models exceed on-chip GPU memory, this requires frequent data movement to and from off-chip memory. GPUs have relatively low memory bandwidth, meaning that the rate at which they can move information from off-chip HBM to on-chip SRAM, where the computation is done, is severely limited. This leads to low performance as GPU cores are idle while waiting for data – they can run at less than 5% utilization on interactive generative inference tasks.
Thanks to the SPAC craze, I am perhaps too skeptical of charts in IPO filings, but here’s how Cerebras details the performance of its chips against rival offerings when running a well-known, open-source LLM:
Finally, Cerebras released the third version of its huge chip earlier this year. The WSE-3 is twice as performant as its predecessor without using more power or costing more. That’s a pretty hefty upgrade. So, how is the big chip selling? Let’s find out.
Hardware
Back in 2022, Cerebras sold $15.6 million worth of hardware products, inclusive of its chips. That figure scaled to $57.1 million in 2023, for growth of 266%. That’s tasty.
Even better, Cerebras continued to greatly expand its hardware-predicated top line this year. In the first half of 2023, the company sold just $1.56 million worth of physical gear — implying a massive H2 2023 ramp, mind — compared to $104.3 million worth of the stuff in the first half of 2024.
Put another way, Cerebras booked nearly twice its full-year 2023 hardware revenue in just the first two quarters of 2024. Even better, its Q1 2024 hardware revenue of $49.1 million was its all-time best, until Q2 2024 was tallied and beat it with $54.9 million worth of hardware top line.
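Those growth figures are easy to verify from the reported numbers. A quick back-of-the-envelope sketch, using the figures as cited above:

```python
# Hardware revenue figures from the Cerebras S-1, in millions of USD.
hw_2022 = 15.6
hw_2023 = 57.1
hw_h1_2024 = 104.3  # Q1 ($49.1M) + Q2 ($54.9M), with rounding

# 2022 -> 2023 full-year growth
growth_2023 = (hw_2023 - hw_2022) / hw_2022
print(f"2022 -> 2023 hardware growth: {growth_2023:.0%}")  # 266%

# H1 2024 alone versus all of 2023
ratio = hw_h1_2024 / hw_2023
print(f"H1 2024 vs full-year 2023: {ratio:.2f}x")  # 1.83x
```

The 1.83x ratio is the “nearly twice” claim above, straight from the filing’s segment figures.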
We cannot infer precisely how much of the company’s revenue acceleration is due to its new WSE-3 chip, or its other hardware products, but I have a pretty strong hunch that the new silicon is the key driver. Why? Everyone wants harder, faster, better, stronger AI models for less. To pull that off you need cheaper, faster model training and cheaper, faster AI inference. Cerebras is building for precisely that use case.
Services
Cerebras “generate[s] services and other revenue primarily through sales of one- to three-year support services, cloud-based computing services, and custom AI modeling services,” the company reports, and it’s a growing plank of its business.
Service-based revenue, the second of the company’s two revenue lines, grew from $9.0 million in 2022 to $21.6 million in 2023. The first half of 2024 alone saw a far sharper $32.1 million worth of services revenue.
Normally when we examine SaaS businesses we see high-margin software revenues, and zero-margin services. Is that the case with Cerebras?
Is it a good business?
As it turns out, Cerebras has a wonky and fun gross margin profile:
Cerebras 2023 services gross margin: 68.4%
Cerebras 2023 hardware gross margin: 20.2%
Cerebras H1 2024 services gross margin: 56.6%
Cerebras H1 2024 hardware gross margin: 36.3%
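Pairing those margins with the revenue figures cited earlier gives the implied gross profit in dollar terms. This is my own back-of-the-envelope pairing, not a table from the filing, so small discrepancies are rounding:

```python
# (revenue in millions of USD, gross margin) per line of business,
# combining the margin list above with the revenue figures cited earlier.
lines = {
    "2023 services":    (21.6,  0.684),
    "2023 hardware":    (57.1,  0.202),
    "H1 2024 services": (32.1,  0.566),
    "H1 2024 hardware": (104.3, 0.363),
}
for name, (revenue, margin) in lines.items():
    print(f"{name}: ~${revenue * margin:.1f}M implied gross profit")
```

That works out to roughly $26 million of total gross profit in 2023 against roughly $56 million in the first half of 2024 alone.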
The company is seeing its gross profit mix shift from services to hardware. Which is good, as I reckon that the company is going to drive the vast majority of its lifetime top line from selling chips, not services.
Summarizing to this point:
Big chips are good at AI workloads
AI workloads are getting bigger
People want bigger, better models
And so Cerebras is growing quickly as it sells chips and support to a corporate world hungry to leverage AI to reduce its labor costs.
Now, is Cerebras a good business today? No, but it’s nearly there. Here’s the income statement:
Cerebras’s gross profit in H1 2024 was more than double what it managed in all of 2023, against just 72% more revenue.
Thanks to that massive inflation in total gross profit, Cerebras’s operating loss fell about 50% in the first half of the year compared to the same portion of 2023. The company is on pace to lose more than $50 million less this year than it did last year — though I would expect slightly better results, frankly.
You could argue that Cerebras could use another few quarters to bake — to get its losses even lower, and to prove that its new chip can keep selling further from its release date (recall how quickly state-of-the-art anything in AI gets deprecated) — but what’s the fun in that? Here’s a company that clearly has massive potential, going public before all the risk has been removed from its business.
That means there’s a little meat on the table for retail investors, and that all the value Cerebras will, or may, generate has not yet been created and extracted. Good!
Now, will Cerebras win the AI chip war? Not if Nvidia has its way. Or Etched. Or Rebellions. Or one of the other myriad chip companies that want to host the math our AI thirst drives. But with its revenue growth to date, it’s hard to argue that Cerebras isn’t barking up at least one live tree.
I dig it. More on what it’s worth when we get an IPO price range.
Ok, one more thing
I’ve been hiding the ball a little on you. Remember when we noted that the Abu Dhabi Growth Fund invested in Cerebras? Well, Abu Dhabi is also G42’s HQ. G42 is a UAE-based AI company founded back in 2018 and chaired by Sheikh Tahnoon bin Zayed, the UAE’s national security adviser, whom the FT has described as G42’s patron.
G42, you may recall, recently raised $1.5 billion from Microsoft, and wants to build a regional AI hub. I put the larger G42 effort under the economic diversification drive of MENA countries that have historically relied on natural resources for national revenue. Regardless, the Abu Dhabi connection matters for more than merely fundraising purposes. As Cerebras notes in its filing (emphasis added):
Our total revenue for the six months ended June 30, 2024 increased by $127.7 million, or 1,474%, compared to the six months ended June 30, 2023. This increase was primarily due to hardware revenue, which grew by $102.7 million. The growth was attributable to a significant increase in the number of AI systems sold. Services and other revenue increased by $25.0 million primarily due to an increase in professional services and ongoing support services as a result of our growing installed base. We generated significant revenue from G42 for the six months ended June 30, 2024 and 2023, representing $119.1 million and $3.7 million, respectively, or 87% and 43%, respectively, of our total revenue. During the six months ended June 30, 2024, G42 represented $101.3 million, or 97%, and $17.8 million, or 56%, of hardware revenue and services and other revenue, respectively. During the six months ended June 30, 2023, G42 represented $3.7 million, or 52%, of services and other revenue. No hardware revenue was recognized from G42 during the six months ended June 30, 2023.
Ties to G42 run even deeper, with Cerebras writing elsewhere that it sells “Cerebras solutions via our cloud offering as well as via the Condor Galaxy Cloud owned by Group 42 Holding.” A customer, a partner, and an entity with ties to fundraising sources. Not a bad web of connections, if you are Cerebras and want both capital, and orders.
But such revenue concentration is doubly worrisome. First, it’s a risk because if G42 decides to pursue a different chip strategy, Cerebras could see much of its revenue evaporate. Second, it makes you wonder why no other company wants Cerebras chips as much as G42 does. Or, really, much at all?
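To put numbers on that concentration, here’s a quick sketch from the filing’s own figures for the six months ended June 30, 2024 (millions of USD):

```python
# G42 revenue and total revenue (hardware + services), per the S-1 excerpt above.
g42_revenue = 119.1
total_revenue = 104.3 + 32.1  # hardware + services, H1 2024

other_revenue = total_revenue - g42_revenue

print(f"G42 share of total revenue: {g42_revenue / total_revenue:.0%}")  # 87%
print(f"Revenue from everyone else: ~${other_revenue:.1f}M")  # ~$17.3M
```

So of roughly $136 million in half-year revenue, customers other than G42 contributed around $17 million.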
Perhaps Cerebras doesn’t have that much to worry about. G42 is, it turns out, buying shares:
Since June 30, 2024, we have sold an aggregate of 5,798,089 shares of our Series F-1 redeemable convertible preferred stock at a purchase price of $14.66 per share, for an aggregate purchase price of $85.0 million. Pursuant to the Preferred Stock Purchase Agreement, G42 agreed to purchase an aggregate of 22,851,296 shares of our Series F-2 redeemable convertible preferred stock (or, if purchased following the completion of this offering, shares of our Class N common stock) at a purchase price of $14.66 per share, for anticipated gross proceeds to us of $335.0 million. G42 has committed to purchase these shares by April 15, 2025.
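The share math in that excerpt checks out; a quick sketch:

```python
# Preferred stock figures from the filing excerpt above.
price_per_share = 14.66

f1_shares = 5_798_089    # Series F-1 already sold
f2_shares = 22_851_296   # Series F-2 that G42 agreed to purchase

print(f"F-1 proceeds: ~${f1_shares * price_per_share / 1e6:.1f}M")       # ~$85.0M
print(f"G42 F-2 commitment: ~${f2_shares * price_per_share / 1e6:.1f}M") # ~$335.0M
```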
Still, G42 remains a pretty minor shareholder, all told. It’s looking like more trad investors are going to take home a mint if the company debuts at a price above its last private sticker.
Enough for now. Here’s an IPO to get excited about.
How will public-market investors value an AI hardware come-up story in the wake of Nvidia’s recent ascendancy? How will its revenue concentration harm its valuation?
We’ll find out soon enough!
My dog: meatloaf
Alex: SEC filings