Meta’s Hidden Study
What the Courts Are Saying About Social Media & Mental Health
Recent court filings have uncovered a troubling narrative: Meta (the company behind Facebook and Instagram) may have downplayed or even buried its own research showing negative mental health effects, especially among teens.
What the Court Case Reveals
- Internal Meta research, including a “Teen Mental Health Deep Dive,” found that many young users felt worse about themselves after using Instagram.
- According to the filings, one in five teens said Instagram “makes them feel worse about themselves.”
- Meta allegedly knew that its platform’s design features, such as likes, follower counts, and “always-on” notifications, were driving social comparison, anxiety, and depression.
- In a striking comparison, internal documents quoted in the lawsuit say hiding the negative findings was “akin to the tobacco industry … doing research … and then keeping that info to themselves.”
- One project called Project Mercury, done with Nielsen, reportedly showed that people who “deactivated” Meta apps for a week experienced lower depression, anxiety, loneliness, and social comparison.
- Despite those findings, Meta allegedly stopped further work and publicly claimed the negative results were tainted by “media narrative.”
A deeper dive into ‘Project Mercury’
Research Design
- Meta ran a 2020 internal study code-named ‘Project Mercury’.
- They partnered with Nielsen, a well-known survey/research firm, to measure what happens when users deactivate (stop using) Facebook and Instagram for a week.
- The goal was likely to assess causal links between social-media usage and mental well-being.
Major Findings (Reported in Filings)
- Users who deactivated both platforms for a week reported lower levels of:
- Depression
- Anxiety
- Loneliness
- Negative social comparison (i.e. comparing themselves less to others)
- These results were described in internal documents as showing a “causal impact” on social comparison.
Meta’s Internal Reaction
- Rather than publicising the results or continuing the research, Meta reportedly stopped the study.
- Internally, some employees raised red flags: one research staffer allegedly compared Meta’s silence to the tobacco industry “doing research … and then keeping that info to themselves.”
- Another staffer stated that despite criticism, the research did, in fact, show a causal link to social comparison.
- However, Meta publicly defended terminating the project, claiming the methodology was flawed.
- According to filings, Meta cited the “existing media narrative” about social media harms as a reason to question the validity of the negative findings.
Broader Context
- The revelations come from court documents in a multi-district litigation case in the U.S. District Court for the Northern District of California.
- Plaintiffs (which include U.S. school districts) argue Meta knowingly designed social-media features (likes, follower counts, algorithmic recommendations) that trigger social comparison, which harms mental health.
- According to the complaint, Meta’s internal researchers identified “downward spirals” in teens related to self-esteem, comparison, anxiety, and depression.
Why This Isn’t Just About Tech: It’s a Public Health Issue
These are not isolated or speculative claims. Multiple states in the U.S. have filed lawsuits against Meta, asserting that the company knowingly designed features to maximise engagement, even at the cost of teen mental health.
The lawsuits argue that Meta’s internal data shows clear links between usage and anxiety, depression, and low self-esteem, but the company publicly downplayed or denied these risks.
Why This Matters Now
Meta’s case isn’t just a legal battle; it’s a wake-up call. The internal documents allege the company was fully aware of possible harms, yet continued with features that keep users engaged, particularly young users.
If you’re a business, you need to understand not just the tech risks, but also the societal and reputational risks that come with platform partnerships.