
3,000 Google Employees Ask CEO to Stop Working on Military Drones


Thousands of employees of Silicon Valley tech giant Google have penned an angry letter asking the company to end its collaboration with the United States Department of Defense.

Last year, it emerged that Google was involved in the so-called “Project Maven”— a joint effort to develop image-recognition software for military drones. Google’s computer vision algorithms were trained on millions of hours of drone footage with the aim of beefing up unmanned aerial vehicles’ ability to identify objects and people from overhead.

But the tech giant’s proximity to the US military has rubbed many Googlers the wrong way. According to The New York Times, a letter signed by more than 3,100 Google employees has been sent to CEO Sundar Pichai.

“Dear Sundar,
We believe that Google should not be in the business of war,” the letter reads.

“Therefore we ask that Project Maven be cancelled and that Google draft, publicize, and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology.”

The employees maintain that “building this technology to assist the US Government in military surveillance—and potentially lethal outcomes—is not acceptable” and that the project is bound to “irreparably damage Google’s brand and its ability to compete for talent.”

Google’s response arrived in the form of a statement from a spokesperson.

“An important part of our culture is having employees who are actively engaged in the work that we do. We know that there are many open questions involved in the use of new technologies, so these conversations—with employees and outside experts—are hugely important and beneficial,” the statement says.

“Maven is a well-publicized DoD project, and Google is working on one part of it—specifically scoped to be for non-offensive purposes and using open-source object-recognition software available to any Google Cloud customer. The models are based on unclassified data only. The technology is used to flag images for human review and is intended to save lives and save people from having to do highly tedious work.

“Any military use of machine learning naturally raises valid concerns. We’re actively engaged across the company in a comprehensive discussion of this important topic and also with outside experts, as we continue to develop our policies around the development and use of our machine-learning technologies.”

Many signatories of the letter will not be convinced by the company’s assurances: while the software’s intended use is ostensibly non-lethal, some suspect that it might be repurposed for more offensive tasks. And helping a drone identify a target can reasonably be seen as the first step toward killing that target.

Image via Wikimedia

