According to Mashable, Roblox is rolling out mandatory AI age checks for all users who want to chat, starting January 7. The platform, which had 151 million daily users in 2025, uses facial age estimation technology from a third-party vendor called Persona to assign users to one of six age categories, from Under 9 to 21+. Roblox Chief Safety Officer Matt Kaufman and product head Rajiv Bhatia called this the “first” large-scale implementation for all ages and a step toward a “gold standard.” In regions where checks were already required (Australia, New Zealand, and the Netherlands), over 50% of users have opted in. This move comes as Roblox faces nearly 80 lawsuits from victims and parents alleging the platform failed to prevent child sexual exploitation, with the cases set to be centralized before a judge in San Francisco.
The Big Gamble
So, Roblox is going all-in on biometrics for safety. It’s a massive, unprecedented step for a platform where nearly half of U.S. kids under 16 are hanging out. The stated goal is noble: create age-appropriate chat environments and, in theory, wall off predators. But here’s the thing: this isn’t precise ID verification. It’s an *estimation*. Persona and Roblox admit it’s only accurate within a two-year margin. That’s a huge window when you’re talking about a 12-year-old versus a 14-year-old, or a 16-year-old versus an 18-year-old. The entire safety model hinges on the AI getting it right, and while they say accuracy is better for younger users, that’s a lot of trust to place in an on-device camera scan.
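To make that two-year margin concrete, here’s a minimal Python sketch. The middle bucket boundaries are assumptions for illustration (the reporting only names the Under 9 and 21+ categories), but the point stands with any six-way split: an estimate that’s off by two years can drop a user into a neighboring chat category.

```python
# Minimal sketch: why a +/- 2-year estimation margin matters for age buckets.
# NOTE: the bucket boundaries below are illustrative assumptions; only the
# "Under 9" and "21+" categories are named in the article.

AGE_BUCKETS = [
    (0, 8, "Under 9"),
    (9, 12, "9-12"),
    (13, 15, "13-15"),
    (16, 17, "16-17"),
    (18, 20, "18-20"),
    (21, 200, "21+"),
]

def bucket_for(age: int) -> str:
    """Map an (estimated) age to one of the six hypothetical chat categories."""
    for low, high, label in AGE_BUCKETS:
        if low <= age <= high:
            return label
    raise ValueError(f"age out of range: {age}")

def possible_buckets(true_age: int, margin: int = 2) -> set[str]:
    """Every category a user could land in if the estimate is off by +/- margin years."""
    return {bucket_for(a) for a in range(max(true_age - margin, 0), true_age + margin + 1)}

if __name__ == "__main__":
    for age in (12, 14, 16, 18):
        print(age, sorted(possible_buckets(age)))
    # Under these assumed boundaries, a 16-year-old could be placed anywhere
    # from the 13-15 group to the 18-20 group, which is exactly the gap the
    # safety model is supposed to enforce.
```

Under those assumed boundaries, most teenage ages straddle at least one category edge, which is why the accuracy question is more than a rounding error.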
Skepticism and the Lawsuit Storm
And trust is in short supply. The mandatory rollout feels like a direct, frantic response to the legal firestorm. Nearly 80 lawsuits? That’s not a few disgruntled users; that’s a systemic crisis. The cases are particularly damning because they allege Roblox was a starting point for exploitation that then continued on apps like Discord and Snapchat and on Meta’s platforms. You have to ask: will a one-time age gate stop determined bad actors, or will it just create a minor inconvenience? Critics argue predators will find workarounds: using older siblings, manipulated images, or simply moving conversations off-platform faster. Roblox can point to its new “gold standard,” but for the families suing, this tech probably feels like too little, way too late.
Where This Is All Headed
Basically, this is a watershed moment for online platforms. Roblox is setting a precedent that could ripple across every social app and game popular with kids. If they make this stick, you can bet other companies will be forced to consider similarly invasive measures. Roblox already says it plans to expand these checks to its Roblox Studio collaboration tools. But the real trajectory here is toward a fractured online experience. We’re moving from a vague “enter your birthday” model to a biometric checkpoint at the door of every digital space. The privacy trade-off is staggering, but pressure from regulators and lawsuits might make it the new normal. The big question isn’t just whether the tech works today. It’s whether we’re comfortable building a future where a camera scan is the ticket to talk to your friends online.
