The state AI regulation argument may spark a Republican divide
Also: The Orbanization of American media
Welcome to Cautious Optimism, a newsletter on tech, business and power.
📈 Trending Up: New, better AI models … OpenAI's social push … Kalshi … a quantum internet? … eVTOL drama … middle-class pain …
📉 Trending Down: Targeted sycophancy … telecom cybersecurity … SoftBank stock … Google's free cash flow … national sanity … Robinhood's market cap …
Things That Matter
Are we getting a rate cut or not? The CME FedWatch tool tracks the market-implied likelihood of rate moves at upcoming Fed meetings. Usually the market forms a consensus well in advance, pricing in central bank decisions before they are announced.
Things are a bit different for the December 10 Fed meeting. The market's prediction of a rate cut (a mere 25 basis points, mind) has see-sawed in recent days. FedWatch currently implies a 64% chance that we'll get a cut next month; a few days ago, the numbers were flipped.
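For the curious, FedWatch's headline number is backed out of 30-Day Fed Funds futures prices. The real methodology averages daily rates across the contract month; the sketch below is a simplified, illustrative version that assumes a single meeting with only two outcomes, hold or one 25 bp cut, and the input numbers are made up for the example, not live quotes.

```python
# Simplified sketch of how a FedWatch-style rate-cut probability is
# derived from a 30-Day Fed Funds futures price. Illustrative only:
# the actual CME methodology averages daily rates over the month.

def implied_cut_probability(futures_price: float,
                            current_rate: float,
                            cut_size: float = 0.25) -> float:
    """Probability of a cut, assuming two outcomes: hold or one cut."""
    implied_rate = 100.0 - futures_price           # futures quote convention
    expected_easing = current_rate - implied_rate  # easing already priced in
    return max(0.0, min(1.0, expected_easing / cut_size))

# Hypothetical inputs: current effective rate 3.875%, futures pricing
# in about 16 bps of easing ahead of the meeting.
p = implied_cut_probability(futures_price=96.285, current_rate=3.875)
print(f"{p:.0%}")  # prints "64%"
```

The key intuition: the futures quote is 100 minus the expected average rate, so the gap between today's rate and the rate the market expects, divided by the size of one cut, is the priced-in odds of that cut.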
This is uncertainty, and the markets hate that word and everything it represents. It's reasonable that the Fed is hard to read at present: the members of its voting committee are split on the best path forward as they weigh both sides of the central bank's dual mandate of full employment and constrained inflation. It's a tough balance at the best of times, let alone when unemployment is rising and inflation sits far above the 2% target.
Why are stocks taking a beating this week? Partly because the major narrative, the AI boom, has lost some of its near-term momentum, and rate cuts alone are no cure for trader worries.
Say one thing and do another, the Amazon way: While the e-commerce giant explained its massive 14,000-job layoff as a way to reduce management layers and move faster, the cuts actually included a large chunk of software developers. Up to 40% of the lost jobs in California, New York, New Jersey and Washington state were engineering roles.
This won't come as a surprise if you read Amazon employee forums, where one can find many complaints that the layoffs hit engineers the hardest, not managers.
OpenAI is worried about Gemini 3 eating some of its growth. Fair enough. Perhaps OpenAI needs to spend more of its time and money building better models instead of consumer-facing features?
The Orbanization of American media: One way that Viktor Mihály Orbán has retained power in Hungary since 2010 is through a media industry neutered by the state. With the press in its pocket, the ruling party can greatly influence public sentiment, and so stay in power while keeping up democratic trappings like regular elections. Orbán forged friendly control over the media partly thanks to his friends buying up most of the country's newspapers.
I am sure that the state of press freedom in smaller EU countries doesn't keep you up at night, but the example matters. The Guardian reports that Larry Ellison, the largest shareholder in Paramount (which his son controls), wants to axe CNN anchors whom the POTUS doesn't like if Paramount is able to buy Warner Bros Discovery.
The NY Post reported in October that the White House may move to block any deal that doesn't see Warner Bros Discovery sold to the Ellisons, who recently purchased CBS News' parent company, installed a POTUS-friendly editor in chief and brought on a Republican-aligned ombudsman.
Why might the White House block a bid by someone else to buy the asset? Because other buyers might not be willing to bend their editorial decisions to fête the President. What's even grosser: because it is presumed that no one else has a shot at getting the deal done, the Ellison family could get Warner Bros Discovery at a discount.
This is bad!
State AI regulations may spark a Republican divide
The tech and political press are buzzing over an upcoming effort aimed at tamping down U.S. statesâ efforts to regulate AI.
The House may try to include "state AI regulation preemption in a must-pass defense policy bill that's expected to be finalized in the coming weeks," Axios reported a few days ago. POTUS is also considering an executive order on the matter.
Given that a ban on state-level AI regulation failed to pass through Congress earlier this year, what authority might the President have to limit state action? Well, he doesn't have to do it directly.
If the executive order, tentatively titled "Eliminating State Law Obstruction of National AI Policy," is released in its expected form, the U.S. attorney general would create an "AI Litigation Task Force" within 30 days to challenge state AI laws, "including on grounds that such laws unconstitutionally regulate interstate commerce."
The order would also sic the FCC on states that regulate AI, stripping them of broadband funding.
The arguments from both sides are simple enough:
Those opposed to state rules worry that a patchwork of differing requirements will encumber AI companies, forcing them to move more slowly, which would in turn leave the domestic AI industry less competitive with China.
Those in favor of state action fret that removing states from the regulatory conversation yields the field to large technology companies and crimps states' ability to protect their local industries.
This is not a purely party-line division: both Republican senators and governors oppose stripping states of the power to regulate AI as they see fit.
Senator Josh Hawley said on X that the push "shows what money can do," responding to a tweet arguing that "AI amnesty will harm conservatives, children, communities, and creators."
Hawley said in May, during the One Big Beautiful Bill kerfuffle, that he would do "everything [he could] to kill" the push to crimp state-level regulation, and described the idea as "really terrible policy."
Governor Ron DeSantis had a lot to say about the House's search for a place to attach a moratorium on state-level AI regulation to a coming must-pass bill:
Stripping states of jurisdiction to regulate AI is a subsidy to Big Tech and will prevent states from protecting against online censorship of political speech, predatory applications that target children, violations of intellectual property rights and data center intrusions on power/water resources.
The rise of AI is the most significant economic and cultural shift occurring at the moment; denying the people the ability to channel these technologies in a productive way via self-government constitutes federal government overreach and lets technology companies run wild.
Not acceptable.
POTUS does not care about the minutiae of AI regulation. What he cares about is maintaining power, and a big chunk of the AI executive class supports the President. No wonder he wants to keep them happy.
However, large tech companies are not popular in many conservative circles. Many Republican leaders share that skepticism, and some of them are potential candidates for the GOP nomination in 2028.
This pits VP JD Vance, broadly aligned with and funded by tech companies, against potential rivals for the nomination. Once POTUS exits stage left, a real dust-up could start brewing in the Republican party, with the potential to split it along technology lines.
What about the law? The current technology-industry talking point is that state-level AI regulations run afoul of the Commerce Clause, though even a16z notes that there are limits to the argument. Expect reams of essays about just how burdensome different states' AI rules are, whether or not they unduly hinder interstate commerce, and calls for Congress to actually do something useful, like set national rules.
That last bit is a good point. A few weeks ago, the White House went after Anthropic for supporting a particular bill in California. The AI giant argued that it would prefer a national standard, but since Congress was useless, the company would be content with state-level action:
While we continue to advocate for that federal standard, AI is moving so fast that we can't wait for Congress to act. We therefore supported a carefully designed bill in California where most of America's leading AI labs are headquartered, including Anthropic. This bill, SB 53, requires the largest AI developers to make their frontier model safety protocols public and is written to exempt any company with an annual gross revenue below $500M, therefore only applying to the very largest AI companies. Anthropic supported this exemption to protect startups and in fact proposed an early version of it.
Wouldn't it be nice if we got a bipartisan bill out of Congress with reasonable, tailored and modest AI rules? Yes. Will we? No. So here we are, testing the limits of the Executive Branch's power yet again.
