According to Fast Company, Meta is facing explosive new allegations that it suppressed internal research showing that people who stopped using Facebook experienced less depression, anxiety, and loneliness. The claims come from a legal brief filed in the U.S. District Court for the Northern District of California as part of a lawsuit brought by several U.S. school districts against Meta, Snap, TikTok, and other social media companies. The alleged research project, called Project Mercury, was reportedly initiated in 2019 to study how the apps affect polarization, news habits, well-being, and social interactions. The plaintiffs claim the companies knew about these negative mental health impacts on children and young adults but failed to act and misled authorities. Meta strongly denies the allegations, calling them “cherry-picked quotes and misinformed opinions” designed to present a “deliberately misleading picture.”
Meta pushes back hard
Meta isn’t just denying these allegations; they’re coming out swinging. Their statement to Fast Company emphasizes a decade-long track record of listening to parents and shipping safety features, pointing specifically to Teen Accounts with built-in protections and parental controls. But here’s the thing: when a company insists the “full record” will show its innocence, it’s usually bracing for a long, ugly legal battle. And let’s be honest: this isn’t the first time we’ve heard about internal research at a social media company that never saw the light of day.
Part of a bigger pattern
This lawsuit feels like déjà vu, doesn’t it? We’ve been down this road before with tobacco companies, opioid manufacturers, and now social media giants. The playbook seems familiar: internal research identifies problems, legal teams get involved, and suddenly the research disappears or gets reinterpreted. What makes this case particularly damaging is that it involves children’s mental health. School districts are suing because they claim these platforms are harming their students. That’s not just bad PR; that’s potentially massive liability.
The business implications are huge
From a pure business perspective, Meta can’t afford to let this narrative gain traction. Their entire model depends on engagement, especially from younger users who represent future revenue streams. If internal research really shows their platform contributes to depression and anxiety, how do they square that with their mission to “bring the world closer together”? They’re walking a tightrope between protecting the business and addressing legitimate health concerns, and honestly, their current approach of adding parental controls feels like putting a band-aid on a bullet wound when the core product itself might be the problem.
What happens next?
This lawsuit is just getting started, and we’re likely to see years of legal wrangling. But the discovery process could be brutal for Meta: if internal documents and emails surface showing that executives knew about these mental health impacts and chose to suppress the research, the damage would be catastrophic. Meanwhile, the court of public opinion is already rendering its verdict. Every parent worried about their teen’s screen time is watching this case closely. The real question is whether it will actually force meaningful change or just become another legal settlement that changes nothing.
