Internal documents allege that Meta, the parent company of Facebook and Instagram, suppressed research showing a link between its platforms and negative mental health outcomes in users. The documents surfaced in a lawsuit filed by US school districts against Meta and other social media companies.
According to the unearthed documents, Meta conducted an internal research project in 2020, codenamed “Project Mercury,” to assess the impact of users deactivating Facebook and Instagram. The study reportedly found that people who stopped using Facebook for a week reported lower levels of depression, anxiety, loneliness, and social comparison.
The lawsuit claims that instead of publishing these findings or pursuing further research, Meta halted the project and internally attributed the negative results to the existing media narrative surrounding the company. That decision is now at the heart of the legal challenge.
The legal filing further alleges that despite Meta’s internal research documenting a causal link between its products and adverse mental health effects, the company informed Congress that it lacked the ability to determine whether its products were harmful to young girls. This contradiction is a key point of contention in the lawsuit.
The documents also indicate that Meta executives were privately informed of the validity of the research findings. One employee reportedly expressed concern that concealing the negative results would be akin to actions taken by tobacco companies, “who conduct research, know cigarettes are harmful, and then keep that information to themselves.”
The allegation that Meta concealed evidence of social media’s harmful effects is just one of several claims in the lawsuit, filed by Motley Rice, a law firm representing school districts across the country. The plaintiffs argue that the companies deliberately hid internally recognized dangers of their products from users, parents, and educators.
The lawsuit further accuses Meta and its competitors of implicitly encouraging children under the age of 13 to use their platforms, failing to address child sexual abuse content, and actively seeking to expand user reach, all while downplaying the potential mental health risks. The case is expected to bring further scrutiny to the practices of social media giants.