
BBC investigation found that Meta and TikTok made deliberate decisions to remove evidence-based content safeguards, resulting in immediate surges of harmful content on their platforms.
A BBC investigation in March 2026 revealed that both Meta and TikTok made deliberate decisions to remove evidence-based content safety measures. The result was immediate and predictable: harmful content surged on both platforms.
The companies removed safeguards that had been proven to reduce harmful content — not because the measures were ineffective, but because they reduced engagement metrics.
This is the same pattern Frances Haugen exposed in 2021, when she leaked internal Facebook documents showing the company knew its algorithms amplified harmful content. Five years later, the companies are still making choices that prioritize engagement over user safety.
Beat the odds
This had a 0.1% chance of leaking, but someone talked anyway.

- Conspirators: ~150
- Secret kept: 2.4 years
- Time to 95% exposure: 500+ years