According to TechRadar, a group of 18 top European cybersecurity and privacy academics has published an open letter warning that the Chat Control legislation “still brings high risks to society without clear benefits for children.” Denmark recently withdrew the mandatory scanning clause after failing to win majority support, making scanning voluntary instead. The European Council was set to discuss adoption as early as December 8, 2025, and the proposal’s scope has since expanded beyond URLs, pictures, and videos to include text messages. The experts’ concerns appear to have already influenced today’s meeting: leaked cables show EU governments removing Chat Control from the agenda because a majority hasn’t been reached.
Voluntary Isn’t Safe
Here’s the thing that really worries me about this “compromise.” Making scanning voluntary instead of mandatory sounds like a win for privacy advocates, but the experts call it scanning “through the backdoor.” Once the infrastructure exists and companies start implementing it voluntarily, what’s to stop governments from making it mandatory later? The technology itself creates the surveillance capability, regardless of whether it’s optional today. And let’s be real: when big tech companies face pressure from EU regulators, “voluntary” often becomes “comply if you want to keep operating here.”
AI Isn’t Ready for This
The academics aren’t mincing words about the technology itself. They straight up say “current AI technology is far from being precise enough to undertake these tasks with guarantees for the necessary level of accuracy.” Think about what that means in practice. False positives could destroy lives – imagine being flagged as a predator because an AI misread an innocent conversation between parents discussing their child’s health. The potential for harm here is enormous, and we’re trusting this to systems that still can’t reliably distinguish between a dog and a muffin in photos.
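The false-positive worry is fundamentally a base-rate problem. As a rough sketch with entirely hypothetical numbers (the open letter gives none): even a scanner that correctly catches 99% of abusive content and wrongly flags only 1% of innocent content will, when genuine abuse material is rare among billions of messages, produce flags that are overwhelmingly false alarms.

```python
# Illustrative base-rate calculation. All numbers are hypothetical assumptions
# chosen for the sketch, not figures from the legislation or the open letter.

def precision_of_flags(prevalence: float, sensitivity: float,
                       false_positive_rate: float) -> float:
    """P(message is actually abusive | scanner flagged it), via Bayes' theorem."""
    true_flags = sensitivity * prevalence              # abusive and flagged
    false_flags = false_positive_rate * (1.0 - prevalence)  # innocent but flagged
    return true_flags / (true_flags + false_flags)

# Assume (hypothetically) 1 in 10,000 messages contains abusive content.
p = precision_of_flags(prevalence=1e-4, sensitivity=0.99, false_positive_rate=0.01)
print(f"{p:.1%} of flagged messages would actually be abusive")
```

Under these assumed numbers, only about 1% of flagged messages would be true positives; the other ~99% would be innocent conversations handed to reviewers or authorities, which is exactly the scenario the academics are warning about.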
Age Verification Nightmare
Now they want to introduce age verification on encrypted messaging services like WhatsApp and app stores. The experts point out that this “cannot be performed in a privacy-preserving way with current technology.” So to protect children, we’d need to collect everyone’s biometric data, behavioral patterns, or context information? That’s like burning down the village to save it. And what about the people who don’t have official documents or don’t want to hand over their ID to use basic messaging services? The experts warn this would cut off a “substantial fraction of the population” from essential online services.
Easy to Bypass Anyway
Here’s the kicker – these provisions would be ridiculously easy to bypass. The experts note they “can be easily evaded, by using providers outside the EU or VPNs to avoid geolocation checks.” So the actual criminals who this legislation is supposedly targeting? They’ll just use non-EU services or basic privacy tools. Meanwhile, regular citizens get subjected to mass surveillance. It’s security theater that sacrifices real privacy for the illusion of safety. The open letter makes it clear that there’s “no proven benefit, while the potential for harm and abuse is enormous.”
What Happens Now?
The fact that Chat Control got pulled from today’s COREPER agenda because they couldn’t reach a majority is significant. According to Patrick Breyer’s analysis, this could seriously delay adoption beyond the December 2025 target. But here’s my concern – when legislation gets delayed like this, it often comes back with minor tweaks but the same fundamental problems. The underlying push for mass scanning isn’t going away, and neither are the technical concerns that experts keep raising. This fight is far from over, and the stakes for digital privacy couldn’t be higher.
