
In 2017, Google partnered with the Pentagon on Project Maven, using AI to analyze drone surveillance footage and identify targets. The contract was kept secret until exposed by Gizmodo in March 2018. Over 4,000 Google employees signed a letter demanding Google exit 'the business of war,' and at least 12 resigned in protest. Google eventually let the contract expire in March 2019 and published AI ethics principles prohibiting weapons development. Palantir subsequently took over the contract.
“Google is secretly helping the Pentagon build AI for drone warfare. They're using their machine learning to improve military targeting and surveillance.”
From “crazy” to confirmed
The Claim Is Made
This is the moment they called it crazy.
When Google employees discovered their company was building artificial intelligence specifically designed to improve the accuracy of military drone strikes, many felt betrayed. The tech giant they'd joined to "organize the world's information" was now organizing data for warfare—and doing it in the dark.
Project Maven began quietly in 2017 as a Pentagon initiative to automate the analysis of drone surveillance footage. Google's AI expertise made it the natural contractor. The company assigned roughly 100 engineers to the project, which involved developing machine learning algorithms capable of identifying and tracking human targets in high-resolution video feeds. For months, this arrangement remained confidential, known only to a select group of executives and government officials.
Google's official position was straightforward: this was a legitimate commercial contract, no different from any other technology partnership. The company suggested that helping the military apply AI responsibly was preferable to letting less careful contractors handle the work. Some executives argued the technology was purely defensive—analyzing data, not making targeting decisions. The Pentagon framed Maven as modernization, a way to process overwhelming amounts of surveillance data more efficiently.
But in March 2018, investigative reporting by Gizmodo exposed the secret arrangement. The article detailed exactly what Google was building and for whom, forcing the company to publicly acknowledge what it had kept hidden. What happened next revealed the depth of employee opposition.
Over 4,000 Google workers—roughly 10 percent of the workforce at the time—signed an open letter demanding their employer withdraw from Project Maven immediately. The letter was unambiguous: "We cannot support weapons development or technologies whose purpose is to harm people." Twelve employees went further, resigning on principle. Their departures sent a clear message that this wasn't an abstract ethical concern—it was serious enough to sacrifice careers.
Confirmed: They Were Right
The truth comes out. Officially documented.
Internal emails and subsequent reporting showed the company had underestimated the strength of employee objections. Google had apparently believed quiet implementation would avoid controversy. Instead, the secrecy amplified the backlash. Workers felt misled by leadership that had promoted the company as an ethical actor in technology while simultaneously advancing military surveillance capabilities.
By March 2019, exactly one year after the Gizmodo article, Google allowed its Project Maven contract to expire and published new AI ethics principles that explicitly prohibited developing weapons systems. The move seemed like a victory for the protesters, a demonstration that employee activism could actually constrain corporate behavior.
What's often overlooked is what happened next. The Pentagon didn't abandon Project Maven—it simply found another contractor. Palantir Technologies, a data analytics company with deeper ties to military and intelligence work, took over the contract and continued developing the same surveillance AI systems that Google had abandoned.
This pattern matters because it reveals how corporate ethics commitments can sometimes represent a reshuffling rather than a genuine reckoning. Google got to satisfy employee demands and burnish its ethics reputation. The Pentagon got its AI. Only the defense contractors changed. The real story isn't that Google employees won—it's that the system absorbed their resistance and proceeded anyway, just through different channels.
Beat the odds
By the numbers, this secret should have held for centuries. Someone talked anyway.
- Conspirators (network size): ~150
- Secret kept: 0.5 years
- Predicted time to 95% exposure: 500+ years