
In July 2019, Belgian broadcaster VRT NWS obtained over 1,000 Google Assistant recordings from a contractor, many captured without the wake word being spoken. Recordings included bedroom conversations, phone calls with sensitive information, and domestic arguments. Google acknowledged that contractors listen to about 0.2% of all audio snippets — which amounts to millions of recordings given the scale. Google settled a $68 million lawsuit for recording users without proper consent. The company had failed to disclose that human contractors reviewed recordings.
“Google is recording your conversations even when you haven't said 'Hey Google.' Human contractors are listening to recordings from your home.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
Every time you say "Hey Google," you assume you're triggering a private conversation with a machine. That assumption turned out to be dangerously naive.
In July 2019, Belgian broadcaster VRT NWS published an investigation that would fundamentally challenge how millions of people understood their smart home devices. The news outlet had obtained over 1,000 audio recordings captured by Google Assistant—and many of them shouldn't have been recorded at all. These weren't isolated glitches or edge cases. They were systematic failures in one of the world's most trusted technology companies.
The recordings told intimate stories of everyday life gone wrong. Conversations in bedrooms. Phone calls containing sensitive financial information. Domestic arguments that family members thought were private. A person undressing. Another triggering the device by accident while saying a similar-sounding phrase. These weren't hypothetical risks—they were documented, real moments where Google's technology was listening when users believed it wasn't.
Google's official position had always been reassuring. The company publicly stated that human contractors only reviewed audio when the wake word "Hey Google" or "OK Google" was clearly detected. This was the privacy safeguard users relied upon. The company had never explicitly disclosed—until forced to—that actual humans were listening to any recordings at all, let alone recordings captured without proper activation.
When confronted with VRT NWS's reporting, Google's response was measured and somewhat defensive. The company acknowledged that contractors do review audio snippets to improve the service, but insisted the practice was limited and privacy-conscious. They claimed that users had agreed to this arrangement in their terms of service. The explanation felt hollow to anyone who had actually read those terms or considered what 0.2% of recordings actually meant in practice.
Confirmed: They Were Right
The truth comes out. Officially documented.
Here's what made that 0.2% figure so troubling: Google processes billions of audio requests. Even a tiny percentage translates to millions of recordings being heard by human ears. Millions of conversations that users never knew were being transcribed, analyzed, and stored. The math was simple, but the implications were staggering.
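The arithmetic above can be sketched in a few lines. The one-billion snippet volume is an illustrative assumption, not a figure Google reported; only the 0.2% review rate comes from the article.

```python
# Back-of-the-envelope: what a 0.2% human-review rate means at scale.
# NOTE: the snippet volume below is a hypothetical round number for
# illustration; Google has not published its actual request counts.
audio_snippets = 1_000_000_000   # assumed yearly audio snippets
review_rate = 0.002              # Google's acknowledged 0.2%

reviewed = int(audio_snippets * review_rate)
print(f"{reviewed:,} snippets heard by human reviewers")  # 2,000,000
```

Even at a conservative volume, a "tiny" percentage lands in the millions, which is the point the paragraph is making.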
Google eventually acknowledged the core claim was fundamentally true. The company updated its privacy disclosures. In 2020, it settled a $68 million lawsuit with users who had been recorded without proper consent. The legal system had validated what VRT NWS had already proven: Google had failed its users' trust in a measurable way.
What makes this case significant isn't that Google was uniquely malicious. Rather, it exposed how the architecture of modern technology companies can prioritize data collection over transparency. A user cannot reasonably consent to what they don't know exists. Burying disclosure in lengthy terms of service isn't informed consent—it's obfuscation dressed up as policy.
Today's smart devices are standard fixtures in millions of homes. They sit in bedrooms and bathrooms, in living rooms where families argue and make love. That ubiquity makes the 2019 findings impossible to dismiss as ancient history. The question this case poses remains relevant: when a device is always listening, even partially, how much privacy can we actually claim to have? Google's settlement acknowledged the problem. Whether users have truly learned from it is another question entirely.
Beat the odds
This had a 0% chance of leaking, yet someone talked anyway.
Conspirators (network): ~150
Secret kept: 0.5 years
Time to 95% exposure: 500+ years