Fifty Shades of AI Regulation
Why Trump's AI Order Is Necessary
How 1,000 state AI bills turned innovation into a regulatory car crash, and why one federal framework might actually be the least-bad option
There’s a particular brand of British misery that comes from trying to renew your car insurance. Not the actual cost, mind you, that’s a separate despair. It’s the moment when you’ve spent forty minutes clicking through comparison sites, and you finally find a quote that doesn’t make you want to walk everywhere forever, only to discover it’s not valid in your postcode. Or it doesn’t cover your specific make of car. Or it requires you to have a different type of parking arrangement.
You’ve just wasted an hour of your life navigating rules that nobody asked for, that serve no actual purpose, and that exist solely because seventeen different committees each decided they needed to justify their existence by inventing a new requirement.
Which is exactly the feeling I get when I look at what’s happening with AI regulation in America right now.
The Regulatory Pile-Up Nobody Asked For
I watched President Trump sign an executive order last week directing the federal government to challenge state AI laws, and honestly, the bloke’s got a point. Over 1,000 different AI bills have been introduced across US state legislatures. One thousand. That’s a bureaucratic arms race where the prize is making sure nobody can actually do anything without hiring a small army of compliance lawyers.
I’ve spent enough time inside corporate machinery to know what happens when you let fifty different jurisdictions each invent their own version of the rules. You don’t get careful, thoughtful protection. You get a compliance nightmare that benefits absolutely nobody except the consultants charging £800 an hour to explain why your chatbot needs different disclaimers in Colorado versus California versus Connecticut.
California, in the order’s telling, wants AI companies to “censor outputs and insert left-wing ideology in their programming.” Colorado’s got its own set of rules. New York’s probably cooking up something different. Meanwhile, AI startups are supposed to navigate this mess while competing against Chinese companies that operate under exactly one set of regulations. It’s like asking someone to win a Formula One race while making them stop at every county line to check if they’re allowed to use fourth gear.
How We Got Here (And Why It’s Stupid)
The mechanism is straightforward, and it’s the same one that’s bollocked up every other attempt at innovation in the past twenty years. When there’s no federal framework for something new and potentially important, states rush to fill the void. Not because they’ve got brilliant ideas, but because they’ve got legislators who need to look busy. Each state crafts its own regulations based on whatever combination of genuine concern, lobbying pressure, and performative politics happens to be in fashion that month.
The result is friction, not protection. Expensive, pointless friction that slows down development without making anyone safer. Trump’s executive order directs the Justice Department to set up an AI Litigation Task Force to sue states over their AI laws, and honestly, someone needed to. The order also threatens to withhold federal broadband funding from states with “onerous” AI laws, which is the bureaucratic equivalent of threatening to take away someone’s pudding if they don’t behave.
Now, before you accuse me of becoming a fanboy, let’s be clear: this isn’t about loving big tech or trusting billionaires to police themselves. It’s about recognising that scattered regulation is often worse than no regulation at all, because it creates the illusion of oversight while actually just making everything slower and more expensive. The executive order explicitly exempts child safety laws, which is the right move, because protecting kids from AI harms is one of those rare areas where local experimentation actually makes sense. But requiring fifty different disclosure formats for the same AI model? That’s plain stupid!
What This Means For You (The Bit You’ll Actually Use)
Whether you’re building AI tools, using them, or just trying to understand what the hell is happening to your job, this executive order actually matters to your miserable existence.
First, understand that “federal framework” doesn’t mean “no rules.” It means one set of rules instead of fifty contradictory ones. If you’re working with AI in any capacity, a single national standard means you can actually plan for compliance instead of playing regulatory whack-a-mole. When you’re evaluating AI tools or vendors, you’ll want to ask them a simple question: “What’s your compliance strategy if this executive order actually works?” Because the ones who’ve been building for fifty different state markets are going to have a lot of wasted infrastructure. The ones who’ve been waiting for federal clarity are going to move faster.
Second, watch what happens with state laws that actually address real harms versus ones that just add paperwork. The executive order directs federal agencies to challenge state AI laws that “harm innovation,” but it explicitly protects child safety regulations. That’s your template for understanding which regulations will survive and which will get demolished. If a state law requires meaningful disclosures about AI-generated content that could deceive people, it’s probably safe. If it requires quarterly reports about your chatbot’s “ideological alignment,” it’s probably toast. This matters because it tells you where to focus your own compliance efforts, if you’re stuck dealing with this nonsense.
Third, recognise that legal uncertainty is now the dominant feature of the landscape, not stability. Trump’s order will trigger court battles. States will fight back, claiming their constitutional right to regulate commerce and protect consumers. This will take years to sort out. What that means for you is simple: don’t make any five-year plans based on the current regulatory environment, because it’s all about to become a legal thunderdome. If you’re investing in AI companies, if you’re building AI products, if you’re making career decisions based on where the AI industry is heading, factor in that nobody actually knows what the rules will look like eighteen months from now.
Finally, here’s the practical takeaway everyone needs to understand: this isn’t actually about AI at all. It’s about who gets to control the next trillion-pound industry. The fight between federal and state regulation is a proxy war for whether tech giants or state governments have more power. You’re not being protected by either side, you’re just watching two groups of people argue about who gets to write the rules that will eventually make your life more complicated.
The smart move is to build your own decision-making framework for AI adoption that doesn’t depend on regulatory clarity, because you’re not getting any. Ask yourself: does this tool actually solve a problem I have, or am I just using it because everyone else is? Can I explain how it works well enough to know when it’s failing? Do I have a backup plan for when the regulations change and this tool becomes illegal or unusable? Those questions matter more than any executive order.
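If it helps to make that framework concrete, here’s a minimal sketch of what the checklist might look like as code. Everything in it is illustrative — the class, the field names, the pass/fail rule are my own invention, not any official standard — but it captures the point: regulatory clarity is deliberately not one of the inputs.

```python
from dataclasses import dataclass


@dataclass
class AIToolAssessment:
    """Illustrative adoption checklist -- just the three questions above, encoded."""

    name: str
    solves_real_problem: bool       # or am I using it because everyone else is?
    failure_modes_understood: bool  # can I explain it well enough to spot failures?
    has_fallback_plan: bool         # what if a rule change makes it unusable?

    def adopt(self) -> bool:
        # All three must hold. Note what's absent: "is the regulation settled?"
        # -- because, per the above, it won't be for years.
        return all((
            self.solves_real_problem,
            self.failure_modes_understood,
            self.has_fallback_plan,
        ))


if __name__ == "__main__":
    chatbot = AIToolAssessment(
        name="shiny-new-chatbot",
        solves_real_problem=True,
        failure_modes_understood=False,  # "it's magic" is not an explanation
        has_fallback_plan=True,
    )
    print(f"Adopt {chatbot.name}? {chatbot.adopt()}")  # -> False
```

The design choice worth stealing is the absence: the checklist doesn’t ask what Colorado or California thinks, because that answer will have changed by the time your procurement cycle finishes.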
The Insurance Quote At The End Of The World
So yeah, this executive order might actually be necessary, which is a sentence I never thought I’d write. Not because it’s perfect, not because it solves everything, but because fifty different sets of state regulations genuinely are worse than trying to build a single federal framework. Even if that framework is imperfect, even if it gets hijacked by lobbying, even if it takes years to sort out in court.
It’s still better than trying to renew your car insurance in fifty different postcode districts, each with their own special rules about what colour your car is allowed to be. Sometimes the least-bad option is just accepting that one massive committee making one set of stupid rules is less awful than fifty tiny committees each inventing their own unique brand of stupidity.
Though I’m still walking everywhere. The exercise does me good.
