
In August 2021, Apple announced a system to scan all photos uploaded to iCloud using NeuralHash to detect CSAM. Security researchers and the EFF immediately warned it was "a backdoor masquerading as child safety" that governments could exploit to search for political content. More than 5,000 organizations and individuals signed an open letter opposing it. Apple delayed the rollout, then quietly killed the project in December 2022, admitting that scanning users' photos "would create new threat vectors for data thieves" and could open "the door to bulk surveillance."
“Apple is building a backdoor into every iPhone that will scan all your photos. Once built, this will inevitably be expanded to search for political content and anything governments want to find.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
In August 2021, Apple announced a system that would fundamentally change how the company handles user privacy. The Cupertino tech giant unveiled plans to scan every photo uploaded to iCloud using a technology called NeuralHash, ostensibly to identify child sexual abuse material (CSAM). Apple framed this as a necessary step in the fight against child exploitation, positioning itself as taking moral responsibility where other platforms had fallen short.
The company's messaging was careful and measured. Apple executives explained that the system would work on-device, hashing photos before they were uploaded to the cloud. Only photos whose hashes matched a database of known illegal imagery would be flagged for human review. The system, they argued, would catch criminals without exposing law-abiding users' private photos. It sounded like a technical solution to a genuine problem.
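For readers curious about the mechanics, here is a minimal sketch of perceptual-hash matching in general, the family of techniques NeuralHash belongs to. This is not Apple's implementation: NeuralHash used a neural network plus cryptographic blinding (private set intersection) rather than the toy average-hash below, and the `KNOWN_BAD_HASHES` set, distance cutoff, and review threshold here are hypothetical stand-ins (Apple's published threshold was roughly 30 matches).

```python
# Illustrative sketch only -- not Apple's NeuralHash. A toy "average hash"
# compared against a blocklist of known-bad hashes, with a per-account
# match threshold before anything is escalated for human review.
# Requires Pillow: pip install pillow

from PIL import Image

HASH_SIZE = 8  # 8x8 grid -> 64-bit perceptual hash

def average_hash(path: str) -> int:
    """Downscale to grayscale 8x8, set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known flagged imagery.
KNOWN_BAD_HASHES: set[int] = set()

MATCH_DISTANCE = 5     # max bit difference that still counts as a match
REVIEW_THRESHOLD = 30  # matches before review (Apple cited roughly 30)

def account_crosses_threshold(photo_paths: list[str]) -> bool:
    """True once enough of an account's photos match the database."""
    matches = sum(
        1
        for path in photo_paths
        if any(
            hamming(average_hash(path), known) <= MATCH_DISTANCE
            for known in KNOWN_BAD_HASHES
        )
    )
    return matches >= REVIEW_THRESHOLD
```

Even in this toy version, the critics' point is visible: the scanner has no opinion about what the hash database contains. Whoever supplies the database decides what gets flagged.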
The security community didn't see it that way. Within days of the announcement, the Electronic Frontier Foundation and cryptography researchers warned that Apple's system was "a backdoor masquerading as child safety." Their concern was fundamental: once Apple installed the capability to scan every photo on every device, the same technology could be repurposed by governments or other actors to search for political content, protest footage, or any other image a regime deemed problematic.
The warnings escalated quickly. More than 5,000 organizations and individuals signed an open letter opposing the plan, including major human rights groups and security experts. These weren't fringe voices—they represented mainstream technology and civil liberties institutions. The letter was clear: Apple had built a surveillance infrastructure that would be nearly impossible to control once activated.
Apple's initial response was to defend the system's architecture. Company representatives insisted the technology was designed to be resistant to misuse. But the pushback continued. The company delayed the rollout in September 2021, buying time to address the concerns.
Confirmed: They Were Right
The truth comes out. Officially documented.
Then silence. For over a year, the project disappeared from public discussion. In December 2022, Apple officially abandoned the initiative. The company's statement was telling. In a rare admission, Apple acknowledged that scanning users' photos "would create new threat vectors for data thieves" and could open "the door to bulk surveillance." These were the exact concerns security researchers had raised sixteen months earlier.
What's remarkable here is that the concern wasn't dismissed as paranoia or exaggeration—it was validated by Apple itself. The company's own analysis eventually confirmed what experts had warned: a system designed to detect illegal content would inevitably be weaponized for broader surveillance.
This case matters because it demonstrates how technical capability can outlive its stated purpose. Once built, surveillance tools are difficult to contain. The threat isn't always immediate government overreach—it's the infrastructure itself, left in place for the next administration, the next crisis, the next claim of necessity.
The death of Apple's CSAM scanning project shows that public pressure and expert testimony can still halt the surveillance machinery before it's fully implemented. But it also reveals how close we came to accepting a permanent change to how our devices work. In this instance, they knew—and so did we.
Beat the odds
This had a 0.1% chance of leaking — someone talked anyway.
Conspirators (network): ~150
Secret kept: 1.3 years
Time to 95% exposure: 500+ years
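The card's numbers line up with a Grimes-style conspiracy-viability model (David Grimes, PLOS ONE, 2016), in which the probability that a secret has been exposed grows as L(t) = 1 - exp(-n*p*t) for n insiders with a per-person annual leak probability p. Whether this site actually uses that model is an assumption on our part; the value of p below is Grimes' published best estimate, chosen because it roughly reproduces the figures above.

```python
import math

# Assumption: a Grimes-style exposure model, L(t) = 1 - exp(-n * p * t),
# where n = number of insiders and p = per-person annual leak probability.
# p below is Grimes' best estimate (~4.1e-6), not a figure published here.
N_INSIDERS = 150
P_LEAK = 4.09e-6

def exposure_probability(years: float) -> float:
    """Chance that at least one insider has leaked after `years`."""
    return 1 - math.exp(-N_INSIDERS * P_LEAK * years)

def years_to(target: float) -> float:
    """Years until the exposure probability reaches `target`."""
    return -math.log(1 - target) / (N_INSIDERS * P_LEAK)

print(f"{exposure_probability(1.3):.2%}")  # ~0.08%: the card's "0.1% chance"
print(f"{years_to(0.95):,.0f} years")      # ~4,900: comfortably "500+ years"
```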