
In January 2012, Facebook secretly manipulated the news feeds of 689,003 users for one week, reducing exposure to either positive or negative emotional content to study "emotional contagion." The results, published in PNAS in June 2014, showed Facebook could make users sadder or happier by controlling what they saw. No informed consent was obtained. The study included children and adolescents without parental consent. Facebook's only justification was that users had agreed to its general terms of service. The lead author later apologized for "the way the paper described the research."
“Facebook conducted a secret psychological experiment on nearly 700,000 users, manipulating their emotions without telling them. This is unethical human experimentation at massive scale.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
Facebook conducted a psychological experiment on nearly 700,000 of its users without their knowledge or consent. For one week in January 2012, the social media giant secretly manipulated what appeared in users' news feeds to study whether it could alter their emotional state. The experiment worked exactly as designed—and that's precisely the problem.
The study, led by Facebook data scientist Adam Kramer with academic co-authors Jamie Guillory and Jeffrey Hancock, systematically reduced either positive or negative content in users' feeds. Some users saw fewer happy posts. Others saw fewer sad ones. The goal was to measure "emotional contagion": whether Facebook could make people feel worse or better simply by controlling the information they encountered.
Facebook's response was characteristically dismissive. The company claimed that by using the platform, all users had implicitly agreed to participate in research through its general terms of service. This wasn't a special study protocol with opt-in consent. This wasn't a notification sent to users explaining the experiment. This was a terms-of-service clause that most people never read, stretched to cover psychological manipulation of their emotions.
When the results were published in the prestigious Proceedings of the National Academy of Sciences in June 2014, the findings were unmistakable. Facebook could make people sadder. Facebook could make people happier. The company possessed a documented ability to influence the emotional state of hundreds of thousands of people simultaneously, and it had already done so without asking.
The evidence also revealed something more troubling: the 689,003 manipulated accounts included children and adolescents. No parental consent was obtained for these minors. No separate protections were afforded to a vulnerable population. A platform had conducted emotional psychology experiments on children without their parents knowing.
Confirmed: They Were Right
The truth comes out. Officially documented.
The academic response was swift. Institutional review boards, the bodies that oversee ethical research standards, questioned how the study had bypassed normal consent procedures. The lead author later acknowledged the controversy, apologizing for "the way the paper described the research," a carefully worded statement that stopped short of admitting the research itself was unethical.
What makes this case significant is what it revealed about corporate power and the vulnerability of digital citizens. Facebook didn't need a lab. It didn't need to recruit subjects. It had more than a billion captive users whose data and behavior it could manipulate at will, all justified by fine print they never read. The company demonstrated that consent, in the digital age, had become theoretical rather than actual.
This wasn't a bug in Facebook's system. It was a feature. The experiment worked because the platform was specifically designed to maximize engagement through content curation. Facebook simply weaponized that capability for research purposes, and the experiment stayed unknown for more than two years, until the paper's publication brought it to light.
The lesson persists today. Users still operate on the assumption that their online activity is benign scrolling. What this documented case proved is that every interaction is observable, every emotion is measurable, and every user is a subject in experiments they never consented to join. That asymmetry of knowledge and power between platforms and people remains the central issue of our digital age.
Beat the odds
This had a 0% chance of leaking; someone talked anyway.
Conspirators (network): ~100
Secret kept: 0.5 years
Time to 95% exposure: 500+ years