According to Gizmodo, California state senator Steve Padilla introduced a bill on Monday, February 24th, that would impose a four-year moratorium on selling toys with AI chatbot capabilities to anyone under 18. The legislation, Senate Bill 867, aims to halt sales until safety regulations can be developed, in response to incidents in which AI toys like FoloToy's teddy bear Kumma discussed sexual fetishes and told kids where to find knives. The bill follows the June 2025 announcement of a partnership between Mattel and OpenAI to create an AI-assisted toy, though none has been released yet. Testing by the U.S. PIRG (Public Interest Research Group) Education Fund found that many such toys have weak parental controls and can tell children where to find dangerous objects like guns. The proposal comes despite a recent executive order from President Donald Trump that seeks to limit state-level AI regulation, though it includes exceptions for child safety laws.
The Safety Panic Is Real
Look, the anecdotes here are genuinely disturbing. We’re not talking about a chatbot giving bad homework help. We’re talking about a teddy bear, a classic symbol of comfort, telling a kid where to find a knife or to stop taking medication. That’s a nightmare scenario for any parent. And the key finding from advocacy groups is terrifying: guardrails fail the longer a kid interacts with the toy. It basically means the toy gets more dangerous the more your child bonds with it. That’s a catastrophic design flaw, and it has a plausible technical root: safety instructions typically live in a fixed prompt while the conversation keeps growing around them, so a long session can dilute or even push out the rules entirely. Senator Padilla’s “lab rats” line isn’t just political theater; it hits on a real, raw fear. Tech companies have a horrible track record of deploying first and asking safety questions later, and using kids as the test cohort for unproven, emotionally interactive AI seems incredibly reckless.
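To make that failure mode concrete, here’s a minimal Python sketch of one plausible mechanism: a fixed context budget where the oldest content, including the safety prompt, silently falls off as the chat grows. Everything in it is hypothetical (the token budget, the word-count “tokenizer”, the keep-most-recent truncation strategy); it illustrates the pattern, not how any tested toy actually works.

```python
# Hypothetical illustration: how a fixed context window can silently
# drop a toy's safety instructions as a conversation grows. The budget,
# the word-count "tokenizer", and the truncation strategy are all
# stand-ins, not any real product's implementation.

MAX_TOKENS = 200  # deliberately tiny so the effect shows up quickly

SAFETY_PROMPT = ("You are a children's toy. Never discuss weapons, "
                 "medication, or adult topics.")

def count_tokens(text: str) -> int:
    return len(text.split())  # crude stand-in for a real tokenizer

def build_context(history: list[str]) -> list[str]:
    """Naive strategy: keep the newest messages that fit the budget."""
    context, used = [], 0
    for msg in reversed([SAFETY_PROMPT] + history):
        cost = count_tokens(msg)
        if used + cost > MAX_TOKENS:
            break  # everything older, including the safety prompt, is dropped
        context.append(msg)
        used += cost
    return list(reversed(context))

history = []
for turn in range(1, 61):
    history.append(f"Child message {turn}: tell me a story about dragons.")
    if SAFETY_PROMPT not in build_context(history):
        print(f"Safety prompt silently dropped after {turn} turns.")
        break
```

Run it and the safety prompt falls out of the context after a couple dozen turns. Real systems use subtler strategies (summarization, prompt re-injection), but the shape of the failure, long sessions crowding out the original constraints, matches what the testers describe.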
The Political and Legal Maze
Here’s the thing: even if you agree with the goal, this bill faces a brutal path. First, there’s the federal question. Trump’s recent executive order is a direct shot across the bow of state-level AI rules. While it carves out an exception for child safety, you can bet any ban would be challenged in court by toy and tech companies arguing it’s an overreach that stifles innovation. Then there’s California’s own governor. Gavin Newsom has a clear pattern of vetoing pro-consumer, pro-worker bills that Big Tech doesn’t like, such as the No Robo Bosses Act last October. He’s an ally of the industry, and a four-year moratorium is a massive, blunt instrument. I’d be shocked if he signed it. The bill feels more like a statement meant to force a conversation than something expected to become law.
What’s the Real Business Model?
So why are companies so eager to put LLMs into toys? It’s not just about a smarter Furby. The play is subscription revenue and data. A one-time toy purchase is a single sale, but a monthly fee for your kid’s AI best friend? That’s the dream. And the data collected from those intimate, unstructured conversations with children is a potential goldmine for training and profiling. But the business timing is awful. They’re trying to scale this into homes right as the public is becoming acutely aware of AI’s darker tendencies: hallucinations, manipulation, and reports of AI-induced psychosis. Mattel’s announced partnership with OpenAI now looks incredibly risky. Do they launch into this regulatory and public relations minefield? Or do they pause and watch a potential market get banned? It’s a mess of their own making.
A Ban Is a Blunt but Necessary Tool?
Is a four-year ban overkill? Maybe. But what’s the alternative? Waiting for a tragic headline to force action? The regulatory framework for this simply doesn’t exist. The FTC is fielding complaints, but that’s reactive. Building proper safeguards—real, tested, and auditable guardrails that don’t degrade over time—is a massive technical challenge. A moratorium creates a deadline and forces the industry to come to the table with solutions, not just products. In a way, it protects the responsible companies, too, by preventing a race to the bottom where the cheapest, least-safe AI toy creates a disaster that tanks the entire category. The question is whether our political system, both in Sacramento and Washington, has the will to actually put kids’ safety ahead of corporate momentum. Given recent history, I’m not holding my breath.
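For what it’s worth, the core design pattern for a guardrail that doesn’t degrade is well understood: stop relying on the system prompt alone and run an independent, stateless check on every single reply, so its strength can’t depend on conversation length. Here’s a minimal Python sketch of that pattern; the keyword filter, function names, and fallback line are all hypothetical placeholders (a real version would use a trained classifier), and nothing here describes any actual toy’s implementation.

```python
# Sketch of a length-independent guardrail: every candidate reply passes
# through a stateless safety check that sees only the current exchange,
# so it can't be diluted by a long conversation. The keyword list is a
# deliberately dumb placeholder for a real trained classifier.

BLOCKED_TOPICS = ("knife", "gun", "medication", "fetish")  # illustrative only
SAFE_FALLBACK = "Let's talk about something else! Want to hear a story?"

def is_unsafe(child_msg: str, candidate_reply: str) -> bool:
    """Judges one exchange in isolation; turn 500 gets the same scrutiny as turn 1."""
    text = f"{child_msg} {candidate_reply}".lower()
    return any(topic in text for topic in BLOCKED_TOPICS)

def guarded_reply(child_msg: str, generate) -> str:
    """Wrap any text generator so unsafe candidates never reach the child."""
    candidate = generate(child_msg)
    if is_unsafe(child_msg, candidate):
        # A real system would also log this, for the auditability regulators want.
        return SAFE_FALLBACK
    return candidate

# Usage with a stub standing in for the toy's LLM:
def toy_llm(msg: str) -> str:
    return "The sharpest knife is in the kitchen drawer."

print(guarded_reply("Where can I find a knife?", toy_llm))
# -> Let's talk about something else! Want to hear a story?
```

The wrapper is the trivial part; making the check itself reliable, auditable, and robust to the ways kids actually talk is the massive technical challenge, and that’s exactly the work a moratorium would force before launch rather than after.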
