
Internal Facebook research leaked by whistleblower Frances Haugen and published by the Wall Street Journal showed: "32% of teen girls said when they felt bad about their bodies, Instagram made them feel worse" and "We make body image issues worse for one in three teen girls." Among teens reporting suicidal thoughts, 13% of UK users and 6% of US users traced those thoughts to Instagram. Facebook never made this research public and gave misleading responses to congressional inquiries.
“Instagram is destroying teenage girls' mental health and Facebook knows it but hides the evidence to protect their profits.”
What they said vs. what the evidence shows
“The research is being used to paint a misleading picture. Instagram's effects on well-being are nuanced.”
— Facebook VP of Global Affairs Nick Clegg · Sep 2021
From "crazy" to confirmed
The Claim Is Made
This is the moment they called it crazy.
When Frances Haugen walked into the Wall Street Journal offices with internal Facebook documents in 2021, she carried something the company had worked hard to keep private: proof that executives knew Instagram was harming teenage girls and did nothing to stop it.
The research was damning and specific. Facebook's own scientists found that 32 percent of teen girls said Instagram made them feel worse about their body image when they were already struggling with it. Another finding was even more direct: "We make body image issues worse for one in three teen girls." Among the most vulnerable—teens already experiencing suicidal thoughts—13 percent of UK users and 6 percent of US users traced those thoughts directly to Instagram. These weren't theoretical concerns or outside allegations. They came from Facebook's own internal research teams.
For years, company leaders had publicly downplayed concerns about Instagram's mental health impact on young users. When Congress asked pointed questions about whether the platform harmed teenagers, Facebook provided responses that contradicted what their own researchers had discovered. The company gave misleading assurances that suggested the science was inconclusive or that Instagram actually benefited young people. Haugen's leak proved that wasn't a case of miscommunication or unclear findings—it was a deliberate omission.
What made this claim verifiable wasn't just one source. The Wall Street Journal published the documents alongside its own analysis, and congressional committees obtained the underlying documentation and heard Haugen's sworn testimony before the Senate Commerce Committee's consumer protection subcommittee. The research methodology was sound, the internal communications were authentic, and the timeline was clear: Facebook knew about these harms well before it made public statements suggesting otherwise.
Confirmed: They Were Right
The truth comes out. Officially documented.
The significance here extends beyond one company or one app. This case illuminates how technology firms can conduct rigorous internal research, reach uncomfortable conclusions about their own products, and then decide that protecting shareholder value matters more than protecting children. Facebook didn't lack the data to make different choices about Instagram's design and algorithmic recommendations. They had the data and chose not to act on it.
The stakes are particularly high because Instagram specifically targets young people. The company was simultaneously developing Instagram Kids, a version designed for children under 13, even while internal research showed the existing platform was damaging teenage girls' mental health. The company later paused the project after public pressure, but only after Haugen's revelations became public.
This also matters for public trust in institutional accountability. When companies mislead regulators and the public while possessing contrary evidence, it suggests that existing oversight mechanisms aren't sufficient. Congressional hearings can be deflected. Public relations statements can obscure findings. Only when someone with internal access decides that truth matters more than loyalty does the actual picture emerge.
Five years after the initial claims, Instagram and Meta have made incremental changes to how the app handles body image content and teen accounts. But the core question remains: How many other harmful findings are sitting in corporate research departments right now, known to companies but hidden from parents, regulators, and the public? Haugen's disclosure proved that for at least one major platform, the answer was "quite a few."
Beat the odds
By the numbers, this secret should never have leaked. Someone talked anyway.
- Conspirators in the network: ~150
- Secret kept for: 0.5 years
- Projected time to 95% chance of exposure: 500+ years