Can anyone slow Cursor?
This morning we’re talking about AI factories, Cursor’s new, shiny valuation, and why people are trying to cross blockchains with compute credits.
Welcome to Cautious Optimism, a newsletter on tech, business, and power.
Good morning and happy Monday, friends! It’s another interesting earnings week, with Palantir, Rivian, Uber, DoorDash, Rumble, Bumble, Shopify, Affirm, and other names dropping over the next few days. We’re gonna learn a lot.
Before we start: I wrote the newsletter today on a different computer than usual. I wasn’t logged into my normal media accounts, so I got a fresh look at the media landscape. So, every single thing is paywalled these days?
I get it — CO paywalls itself just often enough to keep its top line inching up, but since paywalling is no fun, we don’t do it often enough to drive optimal growth. My poor business sense aside, what’s worrying is just how much well-reported journalism is stuck behind a credit card portal while bilge is free. This is not a de novo concern, but damn, I understand it better now. — Alex
📈 Trending Up: No shit … Starlink … AI jobs crisis? … Canopy ($70M, Jump Capital) … Doppel ($35M, Bessemer) … company towns … internal/external dev tools … inverse credit scores … Microsoft AI models? … Astronomer ($93M, Bain Capital Ventures) … water wars … even more reasoning models …
AI leaderboard (iOS, free, today):
#1: ChatGPT
#25: Grok
#29: Meta AI
#43: Google Gemini
📉 Trending Down: Domestic cybersecurity … stablecoin legislation … rule of law … Q2 GDP predictions … Buffett’s daily work cadence … prices in China … my brain, after reading this … luxury spend? …
AI factories, redux
After Microsoft reported earnings last week, we took a look at its AI notes. As a quick refresher, Microsoft attributed “16 points” of year-over-year growth at Azure, its public cloud, to “AI services,” alongside other data points like 4x GitHub Copilot growth in the last year.
But what we took the most notice of was Microsoft’s rising AI workloads, measured in how many tokens it processed in the trailing quarter. Recall from Nvidia’s GTC keynote that tokens “transform images into scientific data,” they “decode the laws of physics,” and can be “reconstitute[d] into music, into words, into videos, into research, into chemicals, or proteins.”
So when Microsoft said that it “processed over 100 trillion tokens this quarter, up 5X year-over-year – including a record 50 trillion tokens last month alone,” we should have translated that a bit more. Not only did Microsoft expand its token rate from 20 trillion per quarter to 100 trillion in a year, it’s going to smash that figure in the current quarter. I would not be shocked if Microsoft processed more than 200 trillion tokens in the current quarter, implying at least 67 trillion tokens per month.
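To check the napkin math, here’s a quick sketch; the 200-trillion figure is my projection from above, not anything Microsoft has guided to:

```python
# Microsoft's reported token figures, plus a hypothetical current-quarter projection.
reported_quarter = 100e12   # tokens processed last quarter (reported)
yoy_multiple = 5            # "up 5X year-over-year" (reported)
record_month = 50e12        # record month inside that quarter (reported)

year_ago_quarter = reported_quarter / yoy_multiple  # implied year-ago quarterly rate
projected_quarter = 200e12  # assumption: my spitball for the current quarter
implied_per_month = projected_quarter / 3           # average monthly pace that implies

print(f"Year-ago quarterly rate: {year_ago_quarter / 1e12:.0f} trillion tokens")
print(f"A 200T quarter implies: {implied_per_month / 1e12:.0f} trillion tokens/month")
```

At a 5x pace, the year-ago quarter works out to roughly 20 trillion tokens, and a 200-trillion-token quarter implies about 67 trillion tokens a month, versus the 50-trillion record month just reported.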
What matters for our understanding of the world is twofold:
Microsoft is trimming its data center buildouts a little, but about half of Azure’s growth rate today comes from AI workloads, and those workloads are themselves growing very quickly.
Those facts underscore the AI factory model of the future that Nvidia detailed earlier this year.
What I want to highlight for us both this morning is that Nvidia’s pitch that AI-focused data centers should be considered AI factories, measured by their raw inputs (silicon, water, electricity) and outputs (processed/generated tokens) is not a future pitch. Instead, it’s an accurate description of AI number crunching today.
Hence why Meta AI’s decision to offload its API compute workloads to third parties is all the more interesting. You’d think that Meta would want to house that capacity internally. After all, we can consider GPU clouds to be a layer above trad data centers, right? AI data work sits atop prior CPU work. So, why not try to own the new processing layer? Microsoft and Google want to, at least.
Can anyone slow Cursor?
What’s driving the demand for AI compute? In part, coding tools. While AI pushes its way into writing, prototyping, marketing, sales, and other niches, it’s in the developer game that we’re seeing perhaps the most revenue creation from genAI apart from the hyperscalers.
So it’s not a shock that the FT reports that Cursor (Anysphere’s coding assist service) closed its long-expected new round. The round is worth some $900 million at a $9 billion valuation, and the FT adds that “OpenAI backer Thrive Capital” led it, with participation from a16z and Accel.
But even more interesting is that the FT did not report the same revenue number for Cursor that others have. Instead of the $300 million ARR threshold that we saw in the news recently, the FT says that the company reached $200 million ARR in April. That’s still an insanely quick double from the $100 million worth of annual recurring revenue that Cursor reached at or around the new year, but not as impressive as we might have thought.
No matter, I completely understand investors paying 45x ARR for Cursor. If you are putting capital into the company today you must expect it to reach $400 million ARR by year-end (I am spitballing, but a 2x from April after a 2x in a quarter or so seems reasonable), meaning that your Cursor check will close 2025 at a 22.5x ARR multiple.
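Sketching that multiple math in a few lines (the year-end ARR is my spitballed doubling, not a reported number):

```python
# Cursor multiple math; the year-end ARR is a hypothetical doubling, not reported.
valuation = 9e9        # reported round valuation
arr_april = 200e6      # FT-reported ARR as of April
arr_year_end = 400e6   # assumption: one more doubling by year-end

multiple_today = valuation / arr_april        # entry multiple on April ARR
multiple_year_end = valuation / arr_year_end  # same check, year-end ARR

print(f"Entry multiple: {multiple_today:.0f}x ARR")
print(f"Implied year-end multiple: {multiple_year_end:.1f}x ARR")
```

A $9 billion price on $200 million of ARR is 45x; if ARR doubles again to $400 million by December, the same check closes the year at 22.5x.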
It gets even cheaper from there. You can drop capital at insane multiples into companies, so long as they are growing at record-breaking speed. Can Windsurf slow Cursor down? Can anyone? If not, $9 billion could look mighty cheap in a year’s time.
Crypto + AI
Closing today with a brief note, but the movement to fuse AI and crypto is gaining some momentum. You might read the preceding sentence as meaningless buzzword slop, but the thesis in question isn’t that silly:
Lots of folks want to put AI to work
AI work requires lots of compute
Proof-of-Work crypto chains have demonstrated an ability to generate self-reinforcing networks that take in compute and output digital currency
So, why not make the Proof-of-Work work handling AI compute loads, instead of, say, doing math problems that generate no economic value apart from blockchain security?
From that perspective, you can kinda see the pitch. As I learned on TWiST last week, Bittensor is running a playbook that tracks our above-written argument. I remain skeptical that AI shops will be willing to use decentralized compute for AI workloads given that it sounds very slow compared to the sort of data centers the market is accustomed to. But, hey, who knows.
Enter Tether, which is building Tether.AI, a “fully open-source AI runtime” that will feature “no API keys” and “no central point of failure.” Details are super scant, but it seems that everyone in crypto is seeing the demand curve for AI and wanting a piece of the compute action.
What’s nice for you and me is that the AI-crypto crossover will work, or not, on the basis of the quality of its offering. What will prove tricky for Bittensor, Tether, and their rivals is that it’s going to be super hard to compete on price. After all, the hyperscalers are pretty good at squeezing pennies from their gear.
From a competition perspective, more is better, so I am not going to sit here and mock. But you really do wonder what crypto is for, apart from the old cross-border payments argument (now served by stablecoins). Perhaps handling some AI compute.