Facebook parent company Meta accused of hiding internal study showing mental health harm from social media

The court documents, filed by school districts against Meta, Google, TikTok, and Snapchat, claim that the tech giants prioritised growth over safety, hiding the risks from parents and regulators.

A hearing on the filings is scheduled for January 26 in Northern California's US District Court, intensifying scrutiny on Big Tech's role in youth mental health crises.

Facebook parent Meta is once again facing serious allegations in a US class-action lawsuit, which alleges that the company deliberately suppressed internal research showing a causal link between its social media platforms, Facebook and Instagram, and harm to users’ mental health. The court documents, filed by school districts against Meta, Google, TikTok, and Snapchat, claim that the tech giants prioritised growth over safety, hiding the risks from parents and regulators while designing addictive features for teens and children.


The lawsuit, led by law firm Motley Rice on behalf of US school districts, presents a worrying picture of corporate negligence. The plaintiffs allege the social media platforms tacitly encouraged under-13 usage, ignored child sexual abuse material, and even paid child-focused groups like the National PTA to publicly endorse their safety claims.

Social media platforms accused of hiding risks

TikTok, for instance, is accused of boasting internally about influencing the PTA, with staff noting it would “do whatever we want going forward… they’ll announce things publicly, their CEO will do press statements for us.” The filings also say Meta stalled predator protections for years over growth concerns and set a “very, very, very high strike threshold”, requiring 17 violations before removing accounts flagged for sex trafficking.

At the heart of the Meta claims is “Project Mercury,” a 2020 study in which researchers partnered with Nielsen to test a one-week deactivation of Facebook and Instagram. Participants reported lower levels of depression, anxiety, loneliness, and social comparison. Yet, Meta allegedly halted the project, deeming the findings “tainted by the existing media narrative.”

An unnamed researcher noted, “The Nielsen study does show causal impact on social comparison,” adding an unhappy face emoji. Another staffer likened the silence to tobacco firms “doing research and knowing cigs were bad and then keeping that info to themselves.” 

Despite this, Meta assured Congress it couldn’t quantify harm to teen girls, the documents charge. In a 2021 text, CEO Mark Zuckerberg reportedly deprioritised child safety, writing: “I wouldn’t prioritise child safety as my top concern when I have a number of other areas I’m more focused on like building the metaverse.”

Meta responds to the allegations

Meta denies the accusations. Meta spokesperson Andy Stone stated, “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens.” He called the study flawed methodologically and insisted teen safety measures are effective, with immediate removals for sex trafficking flags.

“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions,” Stone added. Meta has moved to strike the documents, arguing the plaintiffs’ unsealing request was overbroad.

The filings also expose broader patterns. Meta Platforms allegedly optimised for teen engagement despite knowing it amplified harmful content, blocked effective safety tests to avoid growth hits, and pressured staff to justify inaction.

This article was first uploaded on November 23, 2025, at 12:02 am.