There are queries to go around
Welcome to Cautious Optimism, a newsletter on tech, business, and power.
Happy Thursday, friends; it’s a busy morning. We have Tesla and Alphabet earnings to parse, a new national AI policy and a POTUS speech to scan for hints about future copyright policy, and a thought-bubble concerning search volume.
Keep at least one eye on efforts to undercut Fed independence, a strategy that is rolling ahead at top speed. Turning the American central bank into a political fief so that borrowing costs can be artificially lowered in the near term is hardly a recipe for fiscal discipline, which makes support for POTUS’s anti-JPow crusade hard to square, as most of the voices shouting are also in favor, putatively, of smaller Federal deficits. To work! — Alex
📈 Trending Up: No shit … copper prices … the possibility of state ownership of TikTok, which is a terrible idea … tariffs … fund of funds … tensions between Thailand and Cambodia …
📉 Trending Down: Clean government in Ukraine? … courage … women’s rights in Saudi Arabia … Zelensky’s popularity … food supply in Gaza …
Tesla misses, Alphabet impresses
Shares of Tesla are off around 6% in pre-market trading this morning after the electric car giant reported its second-quarter results yesterday after the bell. Tesla’s revenue ($22.50 billion) came in under both expectations ($22.64 billion) and its year-ago result ($25.50 billion). The company’s 12% revenue decline and resulting top-line underperformance left operating income ($923 million) under market expectations ($1.23 billion). Tesla also missed adjusted EPS expectations.
Why are Tesla shares lower? The recurring joke that Tesla’s share price is unmoored from its financial performance is not, in fact, law. And Tesla is not promising its shareholders much in terms of near-term reacceleration.
Tesla’s Q2 deck did not supply a timeline to revenue growth; instead, the company said that thanks to “the impacts of shifting global trade and fiscal policies on the automotive and energy supply chains,” among other factors, it cannot yet forecast when it will start posting year-over-year growth.
Tesla’s executives spoke very positively about the company’s new robotaxi service — more than 7,000 miles driven with a “handful of vehicles” — but Elon Musk said that despite rapid anticipated service expansion he doesn’t expect self-driving taxis to have a “material impact” on Tesla’s operating results until “around the end of next year.”
The quicker-than-expected demise of certain government credits is affecting Tesla’s operations. Apart from inventory management, Tesla said that despite starting “production of the lower-cost model as planned in the first half of 2025,” it is “focus[ing] on building and delivering as many vehicles as possible […] before the EV credit expires,” and that, given “the additional complexity of ramping a new product, the ramp will happen next quarter slower than initially expected.”
Musk said that he expects Tesla to build Optimus bots at a pace of 100,000 per month in five years’ time.
There’s a lot more in Tesla’s earnings — especially if you are interested in energy generation and storage — but what matters for our purposes is that Tesla truly is stuck between its past and future today. It’s still a big company. It’s still profitable. Just less so, and it’s betting huge on new technologies while its legacy cash flows contract. Investors, it appears, are slightly nervous about how long the changeover period will last.
Next, Alphabet.
In contrast to Tesla, Google’s parent company crushed market expectations. It turned in $96.43 billion (+14%) worth of revenue against expectations of $94 billion, better earnings per share than forecast ($2.31 versus an expected $2.18), and, most critically, Google Cloud revenue of $13.62 billion (+32%), better than the $13.11 billion that was expected.
So, what are the AI notes that matter? Here’s a rundown:
Gemini models are popular: Some nine million “developers have now built with Gemini,” according to Sundar, who added, per an official release, that the “Gemini App now has more than 450 million monthly active users, and [Alphabet continues] to see strong growth and engagement, with daily requests growing over 50% from Q1.” Even more, some 85,000 “enterprises […] now build with Gemini,” driving “a 35x growth in Gemini usage year-over-year.”
Google’s AI compute load is skyrocketing: You can think of AI work in dollar or token terms. Thinking in dollars, Google Cloud saw the number of deals worth $250 million or more double year-over-year in Q2, while it signed as many $1 billion-plus contracts in H1 2025 as it did in all of 2024. The company also said that its AI cloud revenue grew more quickly than Google Cloud’s overall top line.
What about tokens? Of all the numbers that Google reported, this is my favorite:
At I/O in May, we announced that we processed 480 trillion monthly tokens across our surfaces. Since then we have doubled that number, now processing over 980 trillion monthly tokens, a remarkable increase.
Recall that Microsoft processed around 50 trillion tokens in the final month of calendar Q1. It will be interesting to see just how Redmond details its own AI inference growth.
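For a rough sense of scale, here is a back-of-envelope sketch comparing the two reported figures; the numbers are the companies’ own, but the comparison, and the assumption that both can be treated as monthly volumes, is mine:

```python
# Back-of-envelope: reported monthly AI token volume, Google vs. Microsoft.
# Both figures are company-reported; treating them as comparable is an assumption.
google_monthly_tokens = 980e12     # Google, per its Q2 2025 earnings call
microsoft_monthly_tokens = 50e12   # Microsoft, final month of calendar Q1

ratio = google_monthly_tokens / microsoft_monthly_tokens
print(f"Google's reported monthly token volume is roughly {ratio:.0f}x Microsoft's")
# The two figures cover different months and different product surfaces,
# so read this as an order-of-magnitude gap, not a like-for-like comparison.
```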
What else? Alphabet now expects capex of $85 billion this year, up from a previously indicated $75 billion. The company said during its earnings call that the greater capex is needed to “meet cloud customer demand.”
Shares of Alphabet are up 3.6% in pre-market trading.
The new national AI policy
Yesterday at an event co-hosted by the All-In podcast — my TWiST cohost Jason is also a cohost of All-In — the Trump administration released a new AI policy roadmap, and POTUS gave a speech.
You can read America’s AI Action Plan here. My quick takes are as follows:
Penalties are included for states that enact their own AI regulations; the goal here is to limit local action so AI companies have a simpler regulatory framework to operate under. Congressional leaders previously proved skeptical of efforts to limit states’ right to regulate AI inside their own borders.
The Federal government intends to only purchase AI services from LLM companies that “ensure that their systems are objective and free from top-down ideological bias.” Given that the Federal government could prove a large AI customer, here we see it set itself up as the arbiter of what constitutes an unbiased AI model, which seems like a risky idea.
New efforts to train American workers for an AI-led future, and retrain those who get supplanted.
Intelligent words regarding incentivizing the sharing of “high-quality datasets publicly,” and requiring research funded by the government to “disclose non-proprietary, non-sensitive datasets that are used by AI models during the course of research and experimentation.”
A bear-hug of open-source AI.
A curtailment of the Clean Air Act, Clean Water Act, and other environmental laws to ensure that data centers can be built quickly while also making “Federal lands available for data center construction and the construction of power generation infrastructure for those data centers.” Put another way, we’re going to privatize AI profits and socialize the impact on our land.
In his remarks, POTUS hit on a few things that matter.
The Trump Administration thinks copyright holders should allow AI companies to use their information sans payment:
You can’t be expected to have a successful AI program when every single article, book, or anything else that you’ve read or studied, you’re supposed to pay for. Gee. I read a book. I’m supposed to pay somebody. And, you know, we we appreciate that, but you just can’t do it because it’s not doable. […]
[Y]ou just can’t do it. China’s not doing it. And if you’re going to be beating China and right now, we’re leading China very substantially in AI very, very substantially, and nobody’s seen the amount of work that’s going to be bursting upon the scene. But you have to be able to play by the same set of rules.
The Trump Administration is hellbent on limiting state-level AI regulations:
If you are operating under 50 different sets of state laws, the most restrictive state of all will be the one that rules. So you could have a state run by a crazy governor, a governor that hates you, a governor that’s not smart, or maybe a governor that’s very smart but decides that he doesn’t like the industry and he can put you out of business because you’re going to have to go to that lowest common denominator. We need one common sense federal standard that supersedes all states, supersedes everybody so you don’t end up in litigation with 43 states at one time. You got to go litigation free. It’s the only way.
As your politics are largely the mean of the people you spend time with, it’s not hard to see where Trump’s AI views come from. But it’s notable how authoritarian some government skeptics become when they have power: the tech industry has gone from demanding that the government get off its back to asking it to clear the way for private business on public land, bend the executive toward removing state-level influence in favor of Federal fiat, and steamroll intellectual property rights that might impede income. That’s pretty heavy-handed stuff!
There are atheists in foxholes, but there doesn’t appear to be a single business leader who finds expedient intellectual inconsistency worth even a little blushing.
What’s up with search?
According to OpenAI, 500 million weekly ChatGPT users execute 2.5 billion prompts each day, globally. Some 330 million of those 2.5 billion come from the United States.
As I no longer see a real difference between search and non-search queries when using ChatGPT, I think we can say with confidence that OpenAI is handling billions of daily search queries.
Google, per the online data I could find, handles 13 to 14 billion searches per day. That makes it far and away larger than OpenAI today, but I think that we share a viewpoint on which figure is growing faster.
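Here’s the quick math in sketch form, assuming a 13.5 billion midpoint for Google’s daily searches (my assumption, not a reported figure) and treating every ChatGPT prompt as a query for comparison’s sake:

```python
# Back-of-envelope: ChatGPT prompt volume versus Google search volume.
chatgpt_daily_prompts = 2.5e9      # OpenAI: global daily prompts
chatgpt_us_daily_prompts = 330e6   # OpenAI: U.S. share of those prompts
google_daily_searches = 13.5e9     # assumed midpoint of the 13-14 billion estimates

share = chatgpt_daily_prompts / google_daily_searches
print(f"ChatGPT prompts equal roughly {share:.0%} of Google's daily search volume")
# If every prompt counted as a search, OpenAI would already be handling just
# under a fifth of Google's daily query volume.
```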
At the same time, Google is seeing success with its own AI search products: ‘AI Mode’ now has 100 million MAUs in the U.S. and India, while overall query volume continues to rise. This means that OpenAI is probably eating some query growth that Google might otherwise have accreted, but also that the search market is growing in query terms.
People are searching more than before. When we consider the search hopes of companies from Perplexity to OpenAI, therefore, we would do well to keep in mind that we’re discussing an expanding, not static, market. There are queries to go around.