Meta Faces Major Legal Challenge Over Harm to Teens

Another major legal challenge is mounting for Meta, with a group of U.S. parents and school leaders alleging that the company ignored major risk red flags in order to maximize usage and profit, despite repeated warnings, including some from its own internal leaders.
According to a new filing in the Northern District of California, which has been put forward by a collective of more than 1,800 plaintiffs, Instagram, TikTok, Snapchat, and YouTube have “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.”
Among the various claims in the suit, the group says that Meta:
- Has intentionally limited the effectiveness of its youth safety features, and has blocked tests of possible safety features that could impact growth
- Has implemented inadequate enforcement measures to combat sex trafficking in its apps, including the suggestion that Meta requires a user to be repeatedly detected (up to 17 times) engaging in such activity before it takes action
- Has ignored harms to teen users if they risked reducing the potential for more engagement
- Has stalled in its efforts to stop potential predators from contacting minors, also due to growth and usage concerns
- Has prioritized bigger projects, like the Metaverse, over funding improved child safety measures
The group claims to have gained insight from several former Meta staffers to this effect, reinforcing its case against the social media giant, which will now once again have to defend its teen protection efforts in court. It’s an accusation that Meta has faced before: CEO Mark Zuckerberg was hauled before U.S. Congress last year to respond to reports that the company had ignored teen safety concerns in favor of maximizing profit.
Meta has long maintained that it is always working to improve its systems, and that it takes such obligations seriously, while also pointing to flawed methodology behind many of these reports, suggesting that selective testing and broader media bias have unfairly targeted its apps.
Another element of the same legal filing, however, suggests that Meta has previously scrapped reports of this type when they failed to show the company in a positive light. According to the filing, Meta shut down internal research into the mental health effects of Facebook back in 2020, after initial results showed that users’ mental health improved when they stopped using the app.
As per Reuters:
“In a 2020 research project code-named ‘Project Mercury,’ Meta scientists worked with survey firm Nielsen to gauge the effect of ‘deactivating’ Facebook, according to Meta documents obtained via discovery. To the company’s disappointment, ‘people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison,’ internal documents said.”
The suit alleges that Meta buried these findings and canceled further work on this element, arguing internally that the results were tainted by the “existing media narrative” around the company.
Meta has denied the accusations, and has reiterated its efforts to address such concerns within its apps.
As per Meta spokesman Andy Stone:
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens.”
Meta plans to defend itself against the claims, and to show that it has made genuine efforts to engage with the available research and address such issues where possible. But it will now have to face this latest round of questions in a public forum, and with statements from former Meta execs on the record, it could be a messy and damaging proceeding for the business.
The full filing, as noted, also alleges that Snapchat’s age detection methods are ineffective, and that it uses compulsive engagement tools, like Snap Streaks, to keep users coming back. It also claims that TikTok “uses manipulative design techniques to boost engagement among youth,” while YouTube’s algorithms expose young users to harmful content.
It’s a wide-ranging suit, with a big pool of potentially impacted plaintiffs. And it could end up being another bad showing for Meta in particular, depending on how it proceeds.