
Frances Haugen, a former Facebook data scientist, leaked tens of thousands of internal documents in 2021 showing that Facebook knew its algorithms promoted divisive and harmful content because such content drove engagement, and that its own internal research found Instagram toxic for teen girls. Despite knowing this, the company chose profits over safety. Haugen testified before Congress and filed whistleblower complaints with the SEC; the company later rebranded as Meta.
“Facebook's own research shows their products harm users, especially teens, and they chose profits over people. The algorithm is designed to addict.”
What they said vs. what the evidence shows
“The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers don't want their ads next to harmful content.”
— Mark Zuckerberg / Meta · Oct 2021
From "crazy" to confirmed
The Claim Is Made
This is the moment they called it crazy.
When Frances Haugen walked into a television studio in 2021, she carried something that would fundamentally challenge how we understand one of the world's most powerful companies. The former Facebook data scientist had spent years inside the company, and she had decided to go public with what she had witnessed: Facebook knew its platform was causing harm, and it had chosen not to fix it.
The claim itself was straightforward but devastating. Facebook, Haugen alleged, had deliberately engineered addictive features into its platform and prioritized engagement metrics over user safety. More specifically, the company's own internal research showed that Instagram, Facebook's photo-sharing subsidiary, was toxic for teenage girls—yet the company continued operating it without meaningful safeguards. These weren't casual observations or outside critiques. These were findings from Facebook's own scientists, buried in internal documents.
For years, Facebook had dismissed such concerns. Company executives and spokespeople had repeatedly stated that they took safety seriously, that their algorithms were designed to connect people and promote meaningful interaction. When critics raised concerns about mental health impacts or divisive content, Facebook's response was consistent: the science was inconclusive, and their responsibility was to respect user autonomy and free expression.
Then Haugen did something that changed everything. She took tens of thousands of internal documents out of Facebook's offices. These materials, which became known as the "Facebook Files," were provided to the Wall Street Journal and shared with the SEC and Congress. What they revealed was remarkable in its specificity and damning in its implications.
The documents showed that Facebook's own researchers had concluded Instagram was linked to body image issues, eating disorders, and suicidal ideation among teenage girls. Internal research also demonstrated that Facebook's algorithm systematically promoted divisive and inflammatory content because such posts generated engagement—the metric by which Facebook measured success. The company had identified the problem with precision. It understood the mechanism. And it did nothing substantial to address it.
Confirmed: They Were Right
The truth comes out. Officially documented.
In October 2021, Haugen testified before Congress, becoming the public face of the revelations. She spoke not as a conspiracy theorist but as an insider describing standard corporate practice: Facebook prioritized its own financial interests over the wellbeing of its users, particularly minors. The testimony was covered extensively by major news outlets and prompted investigations by regulators worldwide.
The impact was real but perhaps not transformative in the way some hoped. Facebook's parent company rebranded itself as Meta in a move many viewed as damage control. Regulatory scrutiny increased. But the company's fundamental business model—leveraging user data and attention for advertising revenue—remained intact. What changed was the documented record. The public could no longer claim ignorance. The internal emails, research papers, and presentations were now on the record, and they showed the company had known.
This matters because it establishes a pattern. The claim that tech companies prioritize profit over safety wasn't speculation anymore. It was documented fact. For anyone trying to understand whether corporations can be trusted to self-regulate, or whether transparency and accountability matter, the Facebook Files provided a clear answer. What was once dismissed as conspiracy theory had become the official record.
Beat the odds
By the numbers, this secret had roughly a 0% chance of leaking — someone talked anyway.
Conspirators in the network: ~150
Secret kept: 0.5 years
Time to 95% exposure: 500+ years