
In 2012, Facebook manipulated the news feeds of 689,000 users without their consent to study emotional contagion. The study altered the emotional content users saw in order to measure their psychological responses.
“We regularly conduct research to improve our services, always in compliance with our terms of service”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
In 2014, Facebook quietly disclosed that it had conducted a psychological experiment on nearly 700,000 of its users without their knowledge or consent. The experiment, carried out in January 2012, systematically manipulated users' news feeds to show them either more positive or more negative content, then measured how those changes affected their emotional states and posting behavior. What made this revelation especially striking was not just that it happened, but how casually the social media giant had treated the ethical implications.
The experiment was designed to test a concept called "emotional contagion"—the idea that emotions can spread between people through social networks. Facebook wanted to know if they could deliberately alter a user's emotional state by changing what appeared in their feed. To do this, they manipulated the algorithms controlling the news feeds of 689,000 users, reducing the amount of either positive or negative content each group saw. Researchers then analyzed whether those users posted more positive or negative content themselves in response.
When the study was published in the Proceedings of the National Academy of Sciences (PNAS) in June 2014, it sparked immediate controversy. The research itself was legitimate—the findings showed that emotional content in news feeds did influence users' moods and behavior. But the method used to conduct it raised profound ethical questions that Facebook had apparently not seriously grappled with.
Initially, Facebook's response was defensive. Company executives argued that the experiment was no different from the countless A/B tests the platform conducted routinely to improve user experience. They suggested that critics were overreacting and that the study fell under acceptable research protocols. The company pointed out that the experiment had been approved internally and argued that it was covered by the standard terms of service, which theoretically gave Facebook the right to use user data for research purposes. Some defended the work on purely scientific grounds, emphasizing that the results contributed to legitimate academic understanding of social networks.
But this dismissal didn't hold up under scrutiny. The crucial difference between normal platform optimization and this experiment was intent and transparency. Facebook wasn't trying to improve the user experience; they were deliberately trying to make some people feel worse to measure the psychological effect. They had not explicitly informed participants they were in a study. And perhaps most significantly, they had not obtained informed consent—the foundational principle of ethical research on human subjects.
The PNAS paper itself became the smoking gun. It documented exactly what Facebook had done, providing the specific details that proved the original claims were not exaggerated. The study was published in a top-tier peer-reviewed journal, giving it undeniable credibility and permanence. Facebook could not dispute its own research.
This episode matters because it fundamentally shifted how the public understood Silicon Valley's relationship with user data. It wasn't just about privacy or surveillance—it was about the willingness to conduct psychological experiments on millions of people without their knowledge for corporate interests. The incident raised urgent questions about what power technology companies actually held and what ethical guardrails, if any, constrained them.
The Facebook emotional contagion study revealed something that many suspected but few could prove: tech platforms were treating their users as subjects in ongoing experiments, not as customers deserving transparency and respect. The documented truth proved far more consequential than any conspiracy theory.
Beat the odds
This had a 0.7% chance of leaking; someone talked anyway.
Conspirators (network): ~150
Secret kept: 11.9 years
Time to 95% exposure: 500+ years