
TechCrunch revealed in January 2019 that Facebook's 'Research' app (successor to Onavo Protect) paid users aged 13-35 up to $20/month to install a VPN that gave Facebook root-level access to all phone activity — browsing history, private messages, emails, app usage, and location data. Facebook distributed the app through beta testing programs to circumvent Apple's App Store review. Apple revoked Facebook's enterprise certificate, temporarily disabling all internal Facebook iOS apps. This was part of Facebook's competitive intelligence operation to identify and copy rival apps.
“Facebook is literally paying teenagers to install spyware that monitors everything they do on their phones — every message, every app, every website.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
Facebook didn't just want to know what you did on their platform. They wanted to know everything—what apps you used, who you messaged, where you went, and what you searched for, even on competing services.
In January 2019, TechCrunch exposed a program that most users never knew existed. Facebook had been paying teenagers and young adults up to $20 per month to install a VPN application called "Research" on their phones. The catch: by installing it, users were essentially giving Facebook unrestricted access to their entire digital lives.
The Research app wasn't new. It was the successor to an app called Onavo Protect, which Facebook had acquired and repurposed. Where Onavo had presented itself as a privacy tool—a VPN that protected users on public WiFi—Facebook's Research app dropped the pretense. It was designed to collect comprehensive data on how people used their phones: every app they opened, every website they visited, every message they sent, even their location.
When this became public knowledge, Facebook's response was measured and defensive. The company said the program was legitimate research conducted with parental consent (for minors) and user agreement. They framed it as a standard market research initiative, no different from the kind of data collection academic institutions conduct. Facebook argued that users knowingly agreed to participate and understood what they were signing up for.
What made this program remarkable wasn't just what it collected, but how Facebook distributed it. The company didn't put Research on the Apple App Store. Instead, it used Apple's enterprise developer program—a system designed for companies to distribute internal apps to their own employees. By circumventing the App Store's review process, Facebook avoided Apple's scrutiny and the public visibility that comes with an official listing.
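To make the mechanism concrete: Apple's enterprise (in-house) distribution works by hosting a signed `.ipa` alongside a plist manifest, which users install via an `itms-services://` link rather than the App Store. Below is a minimal sketch of such a manifest; the URLs and bundle identifier are hypothetical placeholders, not Facebook's actual files.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>items</key>
  <array>
    <dict>
      <!-- Where the device downloads the signed app package from -->
      <key>assets</key>
      <array>
        <dict>
          <key>kind</key><string>software-package</string>
          <key>url</key><string>https://internal.example.com/apps/Research.ipa</string>
        </dict>
      </array>
      <key>metadata</key>
      <dict>
        <key>bundle-identifier</key><string>com.example.research</string>
        <key>bundle-version</key><string>1.0</string>
        <key>kind</key><string>software</string>
        <key>title</key><string>Research</string>
      </dict>
    </dict>
  </array>
</dict>
</plist>
```

A user who taps `itms-services://?action=download-manifest&url=https://internal.example.com/manifest.plist` and trusts the enterprise certificate gets the app installed with no App Store review at all, which is exactly why Apple restricts this channel to a company's own employees.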
Confirmed: They Were Right
The truth comes out. Officially documented.
The strategy backfired when Apple discovered the deception. In February 2019, barely a month after TechCrunch's report, Apple revoked Facebook's enterprise certificate entirely. The move was dramatic: it temporarily disabled not just the Research app, but all of Facebook's internal iOS applications, including Instagram and WhatsApp. Apple had decided that Facebook's abuse of the enterprise system warranted that level of response.
The evidence of what Facebook was doing came from multiple angles. TechCrunch had documented the app's functionality and interviewed people who participated in the program. Internal Facebook documents and communications revealed that the company understood exactly what it was collecting and that this data served a specific strategic purpose: competitive intelligence. By monitoring how people used rival apps, Facebook could identify which services were gaining traction and which features were driving engagement elsewhere.
This wasn't a misunderstanding or a case of vague terms and conditions catching users off guard. Facebook deliberately built an infrastructure to spy on teenage phone users, paying amounts small enough to make the proposition seem harmless while capturing information worth vastly more to advertisers and competitors.
The verification of this claim matters because it demonstrated how far a major tech company would go to exploit regulatory gaps and user trust simultaneously. It showed that consent, when obtained through obscurity and financial incentive, doesn't necessarily represent genuine understanding. And it revealed that even after being caught repeatedly in privacy violations, Facebook's instinct remained the same: collect more data, by any means available.
Beat the odds
This had a 0% chance of leaking — someone talked anyway.
Conspirators (network): ~150
Secret kept: 0.5 years
Time to 95% exposure: 500+ years