
In 1993, the Clinton administration introduced the Clipper Chip (MYK-78), requiring all phones to include an NSA-designed encryption chip with built-in government backdoor access via key escrow. The encryption algorithm 'Skipjack' was classified, preventing peer review. In 1994, cryptographer Matt Blaze found a critical flaw that allowed the escrow system to be bypassed entirely. The chip was abandoned by 1996 after massive public opposition, but the concept of mandated encryption backdoors resurfaces regularly.
“The government wants to install a chip in every phone that gives them a master key to decrypt any conversation. This is a backdoor masquerading as security.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
In 1993, the Clinton administration proposed something that would have fundamentally changed how Americans could communicate in private. The plan was called the Clipper Chip, and it represented one of the most direct attempts by the U.S. government to build surveillance into consumer technology at a mandatory, structural level.
The idea seemed straightforward enough from the government's perspective. Every telephone would be required to contain an NSA-designed encryption chip called the MYK-78. This chip would scramble conversations so thoroughly that no one—not criminals, not foreign adversaries—could listen in. The catch was that the government always could. Each chip's unit key was split into two halves held in an "escrow" system by two separate agencies (NIST and the Treasury Department). Law enforcement could obtain a warrant, retrieve both halves, reconstruct the key, and decrypt any call it wanted.
Supporters of the program framed it as a necessary balance. Privacy advocates and the technology industry were told that this was the only way to have strong encryption in consumer products without completely tying the hands of law enforcement. The algorithm powering the system, called Skipjack, was kept classified—meaning independent security experts couldn't examine it for flaws. The government essentially asked the public to trust that the system was secure simply because the NSA said so.
The skepticism was immediate and widespread. If the government had a master key, the argument went, what prevented criminals from obtaining that key? What protected it from being misused? These weren't paranoid questions—they were basic security engineering questions that should have been answerable with evidence.
In 1994, cryptographer Matt Blaze set out to answer them. What he discovered was devastating to the program's credibility. The flaw wasn't in Skipjack itself, but in how the key escrow mechanism was implemented. Each encrypted call carried a Law Enforcement Access Field (LEAF) containing the escrowed key material, and the receiving chip verified that field with only a 16-bit checksum. Blaze showed that a rogue application could brute-force a forged LEAF that passed the chip's check, keeping the strong encryption fully functional while rendering the backdoor useless. The escrow mechanism that was supposed to be unbreakable could be bypassed entirely. Blaze published his findings in a paper titled "Protocol Failure in the Escrowed Encryption Standard," and the security research community confirmed his work.
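The arithmetic behind the attack is easy to demonstrate. The sketch below is a toy model, not the real protocol: Skipjack and the actual LEAF format were classified, so the checksum function, field sizes, and the `FAMILY_KEY` name here are all illustrative stand-ins. The only property it preserves is the one that mattered, namely that the receiving chip's integrity check on the LEAF was just 16 bits wide, so about 2^16 random tries defeat it.

```python
import os
import hashlib

# Illustrative secret baked into every chip; the attacker never learns it
# and can only use a chip as an accept/reject oracle.
FAMILY_KEY = os.urandom(16)


def chip_accepts(leaf: bytes, iv: bytes) -> bool:
    """Toy receiving chip: the only integrity check on an incoming LEAF
    is a 16-bit keyed checksum (a stand-in for the classified real one)."""
    tag = hashlib.sha256(FAMILY_KEY + leaf[:-2] + iv).digest()[:2]
    return tag == leaf[-2:]


def forge_leaf(iv: bytes) -> tuple[bytes, int]:
    """Blaze-style bypass: feed random LEAFs to the chip until one passes.
    Each random try succeeds with probability 2**-16, so the expected cost
    is ~65,536 trials -- trivial even on 1994 hardware."""
    trials = 0
    while True:
        trials += 1
        candidate = os.urandom(18)  # carries no real escrowed key material
        if chip_accepts(candidate, iv):
            return candidate, trials


iv = os.urandom(8)
bogus, n = forge_leaf(iv)
assert chip_accepts(bogus, iv)
print(f"bogus LEAF accepted after {n} trials")
```

The forged field satisfies the chip's check but contains garbage instead of the escrowed key, so the call still encrypts normally while law enforcement's decryption path gets nothing usable. A 16-bit check was simply too small a gate for the whole escrow scheme to rest on.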
Confirmed: They Were Right
The truth comes out. Officially documented.
This wasn't a theoretical vulnerability that required extraordinary circumstances to exploit. It was a fundamental design flaw that any competent engineer could understand and, with modest effort, exploit. The system the government wanted to make mandatory had a bypass that worked.
The revelations, combined with overwhelming public opposition from privacy advocates, technologists, and civil liberties groups, proved too much for the administration to overcome. By 1996, the Clipper Chip program was quietly abandoned.
What's instructive here isn't just that the government tried and failed. It's that the underlying impulse didn't disappear. Law enforcement and intelligence agencies continue to push for encryption backdoors, arguing that strong encryption hampers law enforcement and endangers national security. Each time, the same arguments resurface: trust us, we'll keep the keys safe, the encryption will work perfectly.
The Clipper Chip serves as a concrete historical reminder that these promises are fragile. It proves that well-intentioned security architecture can contain fatal flaws. And it demonstrates that the gap between what officials claim is technologically possible and what experts can actually achieve in practice is not always small.
Beat the odds
This had a 0.1% chance of leaking — someone talked anyway.
Conspirators (network size): ~150
Secret kept: 1.3 years
Time to 95% exposure: 500+ years