
In 2021, Frances Haugen leaked thousands of internal Facebook documents proving the company knew its engagement algorithms promoted hate speech, misinformation, and content that harmed teen mental health. Internal research showed that Instagram made body image issues worse for 1 in 3 teen girls. Facebook implemented election safeguards in 2020, then rolled them back after Biden won, allowing conspiratorial content to “fester.” Haugen testified to Congress that Facebook “chose to optimize for its own interests, like making more money.”
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. Facebook, over and over again, chose to optimize for its own interests.”
— Frances Haugen · Senate testimony, Oct 2021
What they said vs. what the evidence shows
“The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers don't want their ads next to harmful or angry content.”
— Mark Zuckerberg / Meta · Oct 2021
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
Facebook told the world its algorithms were designed to connect people and foster community. Internally, the company knew something different was happening—and chose not to stop it.
In 2021, whistleblower Frances Haugen walked out of Facebook's offices with thousands of internal documents that would fundamentally challenge the company's public narrative. What she revealed wasn't speculation or inference. It was Facebook's own research, conducted by Facebook's own scientists, proving the company understood exactly how its engagement algorithms amplified rage, misinformation, and harmful content—and did little to change course.
For years, Facebook executives faced criticism from activists, researchers, and regulators who argued the platform's algorithms prioritized engagement over safety. The company's standard response was consistent: these claims were exaggerated. Facebook maintained it was balancing user safety with free expression and innovation. Spokesperson after spokesperson insisted the company cared deeply about preventing harm.
The documents told a different story. According to internal research cited in the Wall Street Journal's "Facebook Files" investigation, the company's own data showed that its algorithm systematically boosted divisive content because it drove engagement. Content that provoked anger, fear, and outrage kept people on the platform longer. The algorithm learned this. It optimized for it.
The evidence extended beyond abstract metrics. Internal research revealed that Instagram, Facebook's photo-sharing platform, made body image issues demonstrably worse for teenage girls. One document noted that the platform made "body image issues worse for 1 in 3 teen girls." Facebook scientists flagged this finding. Instagram's leadership knew. Nothing substantive changed.
Confirmed: They Were Right
The truth comes out. Officially documented.
The documents also revealed a cynical approach to election integrity. Ahead of the 2020 presidential election, Facebook implemented safeguards to reduce misinformation and conspiratorial content. These measures worked. Then, with Biden's victory secured, Facebook rolled back many of those protections. The barrier to reaching viral status dropped again. Conspiratorial and false content was allowed to “fester,” in Haugen's characterization.
In October 2021, Haugen testified before Congress about what she'd witnessed. She was direct: “Facebook, over and over again, chose to optimize for its own interests, like making more money.” The company had prioritized profit margins over the mental health of teenagers, over election integrity, over curbing the spread of demonstrably false information.
Facebook's response followed the established playbook. The company released a statement arguing Haugen's claims were being taken out of context, that the research was being mischaracterized, that Facebook cared about safety. Some outlets ran both sides. But the documents themselves were difficult to explain away. They were internal. They were contemporaneous. They were damning.
What makes this case significant isn't that a large company prioritized profits—that's hardly shocking. What matters is the scale of the deception. Facebook spent years telling users, advertisers, lawmakers, and the public one thing while internal research showed something entirely different. The company didn't lack information. It lacked incentive to act on it.
This case demonstrates why institutional accountability matters. When powerful platforms shape public discourse, their internal research shouldn't remain internal. The gap between what companies tell the public and what they know privately is where real harm happens.
Beat the odds
This had a 0% chance of leaking — someone talked anyway.
Conspirators: ~100 (network)
Secret kept: 0.5 years
Time to 95% exposure: 500+ years
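The widget doesn't say how “time to 95% exposure” is computed. A common way to produce figures like this is a Grimes-style model in which each conspirator has a small, independent chance of leaking in any given year. Below is a minimal sketch under that assumption; both the model choice and the per-person leak rate are illustrative, not parameters published with this article.

```python
import math

def years_to_exposure(n: int, p_annual: float, target: float = 0.95) -> float:
    """Years until the probability of at least one leak reaches `target`,
    assuming each of `n` conspirators independently leaks with probability
    `p_annual` in any given year (a Grimes-style survival model)."""
    # P(secret intact after t years) = (1 - p_annual) ** (n * t)
    # Set that equal to 1 - target and solve for t.
    return math.log(1 - target) / (n * math.log(1 - p_annual))

# Illustrative inputs only; the per-person annual leak rate is assumed.
print(round(years_to_exposure(100, 0.0005), 1))  # 59.9
```

With these made-up inputs the model predicts roughly 60 years to 95% exposure, not 500+; the site's figure evidently rests on different parameters it doesn't disclose. Either way, the point stands: the documented leak arrived in half a year.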