Reuters reports that unredacted court documents filed in a class-action suit against Meta and other social media firms allege that the company shut down internal research into how Facebook and Instagram harmed users' mental health after it found causal links between its platforms and that harm.

The 2020 study, code-named Project Mercury, was conducted by Meta scientists in collaboration with Nielsen to measure the effects of temporarily deactivating Facebook and Instagram. According to internal documents, users who stopped using Facebook for a week reported lower levels of depression, anxiety, loneliness, and social comparison.

Rather than publishing the findings or continuing the research, Meta reportedly pulled the plug on the project and internally dismissed the results as tainted by the prevailing media narrative about the company. Internal communications, however, show that some Meta employees believed the study's conclusions were valid.

One unnamed researcher is quoted as saying that "the Nielsen study does demonstrate causal influence on social comparison," while another reportedly compared the situation to the tobacco industry's suppression of unfavorable research.

Although this internal research suggested a cause-and-effect relationship between Meta's products and worse mental health, the filing asserts that Meta lied to Congress in 2021 by declaring that it could not determine whether its platforms were harmful to teenage girls.

Meta spokesperson Andy Stone defended the company's decision to halt the study, saying that the research's methodology was flawed. He added that Meta has worked for years to make its products safer, particularly for teenagers, and has acted on feedback from parents and researchers.

“A complete history will indicate that we have been listening to parents, researching the issues that are the most important, and creating actual changes that are protecting the teens for over a decade,” Stone said.

The claim that Meta concealed evidence of its products' harms is only one of many in a broader legal filing by the law firm Motley Rice, which is suing Meta, Google, TikTok, and Snapchat on behalf of school districts nationwide, alleging that the four companies knowingly withheld certain dangers of their platforms.

Among the claims are allegations that the social media firms promoted the use of their apps among children under 13, failed to act against child sexual abuse content, and sought to increase teenagers' social media use at the expense of school.

The plaintiffs also accuse the companies of trying to buy the goodwill of child-focused organizations to shield their products. One example cited is TikTok's sponsorship of the National PTA, which, according to internal messages described in the filing, TikTok believed gave it the ability to influence the organization's messaging.

The filings contain fewer details about the accusations against TikTok, Google, and Snapchat; the case laid out against Meta is far more comprehensive.

According to the filing, Meta allegedly designed its youth safety features to be ineffective and rarely used, and blocked testing of safety features that it feared could damage the company's growth.

One document cited in the filing reportedly shows that users engaged in sex trafficking had to be reported 17 times before being removed from the platform, a policy described internally as a “very high strike threshold.”

The filing also asserts that Meta knew that optimizing its products for teen engagement meant serving teens more harmful content, but pursued the strategy anyway. Meta is further accused of dragging its feet on preventing child predators from contacting minors, and of forcing employees to defend that decision. In 2021, Mark Zuckerberg reportedly said that child safety was not his top priority, as he was focused on building the metaverse.

Responding to these claims, Stone said that the safeguards Meta has put in place to protect teenagers work, and that the lawsuit distorts the company's efforts, calling the allegations cherry-picked.

Meta has filed a motion to strike the internal documents cited in the filing, arguing that the plaintiffs' request to unseal them is excessive. The matter is scheduled to be heard in the U.S. District Court for the Northern District of California on January 26, 2026.