
Internal Facebook documents leaked by Frances Haugen showed that the company's own researchers had found that 'thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.' The researchers also found that the platform's algorithms amplified eating disorder content and that Instagram increased rates of anxiety and depression among teens. The company buried these findings.
“Social media companies know their algorithms are harming teenagers' mental health and they are hiding the evidence.”
What they said vs. what the evidence shows
“I don't think the research is that conclusive. Social media effects on teens are more nuanced than the narrative suggests.”
— Instagram Head Adam Mosseri · Sep 2021
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
Facebook knew. That's the uncomfortable conclusion drawn from thousands of internal documents that the company never intended to share with the public.
For years, Instagram's parent company insisted its photo-sharing platform was a benign space for self-expression and connection. When critics raised concerns about the app's impact on teenage mental health, Facebook executives pointed to limited evidence and anecdotal complaints. The company's public stance remained consistent: Instagram was not responsible for rising rates of anxiety, depression, and body dysmorphia among young users. If anything, the company argued, Instagram provided a positive community for teenagers to explore their identities.
This narrative collapsed in September 2021 when Frances Haugen, a former Facebook product manager, leaked internal research documents to journalists. The Wall Street Journal obtained what became known as "The Facebook Files," a trove of studies conducted by the company's own researchers between 2019 and 2021. One finding was particularly damning: thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.
The research went further. Facebook's internal team documented that the platform's recommendation algorithms actively amplified eating disorder content. They found that Instagram increased anxiety and depression among significant portions of its teen user base. Researchers discovered that the more time teenagers spent on Instagram, the more negative effects they experienced. These weren't speculative conclusions or external critiques. This was Facebook's own science, conducted by Facebook's own teams, using Facebook's own data.
For years, Facebook leadership had access to these findings. Instead of acting on them, the company buried the research. When researchers presented these uncomfortable truths internally, executives shelved the findings rather than implement meaningful changes. The company continued its public relations campaign claiming Instagram was safe for teenagers, even as internal memos told a different story.
Confirmed: They Were Right
The truth comes out. Officially documented.
The leaked documents soon reached policymakers and public health officials. In May 2023, the U.S. Surgeon General, citing this research among other evidence, issued an advisory on social media and youth mental health. The advisory warned that social media platforms posed genuine risks to adolescent mental health and called for immediate action to protect young users.
What makes this case significant is not merely that Instagram harms teen mental health—critics had suspected this for years. What matters is that Facebook's own researchers confirmed it, documented it meticulously, and that the company chose suppression over accountability. This represents a failure of corporate responsibility on a massive scale.
The implications extend beyond Instagram or even Facebook. This case demonstrates how large technology companies can commission rigorous internal research, discover uncomfortable truths about their products, and simply choose not to act. It reveals a system where corporate self-interest systematically outweighs public welfare, even when the stakes involve children's mental health.
For the public, this serves as a reminder that skepticism toward corporate claims remains warranted. When tech companies assure us their platforms are safe, we now know to ask: what internal research did they conduct? What did they find? And critically, why weren't we told?
Beat the odds
This had a 0% chance of leaking, yet someone talked anyway.
Conspirators (network): ~150
Secret kept: 0.5 years
Time to 95% exposure: 500+ years