WASHINGTON – *American technology giant Google has allowed the US Defense Department to utilize its artificial intelligence (AI) technologies in military drone projects, stirring concern among experts who consider such cooperation unethical.*
The US military and CIA have long been using drones to carry out targeted killings and surveillance around the world.
It has now been revealed that the Pentagon is using Google's TensorFlow AI systems in Project Maven, which was established in July 2017 to analyze data and video footage.
The data will then be used to conduct more precise attacks by drones against targets in other countries, including Syria and Iraq.
Maven is one of many US drone projects aimed at automatically and quickly singling out points of interest so that analysts can work more efficiently in pinpointing new targets.
The Pentagon has so far spent $7.4 billion on AI and data processing. The use of machine learning and artificial intelligence tools has allowed military experts to easily analyze vast amounts of footage captured by the US military's fleet of over 1,100 drones.
“People and computers will work symbiotically to increase the ability of weapon systems to detect objects. Eventually we hope that one analyst will be able to do twice as much work, potentially three times as much, as they’re doing now. That’s our goal,” said Drew Cukor, chief of the DoD’s Algorithmic Warfare Cross-Function Team.
Google says the current cooperation in AI is in its first stages and will expand over time. Responding to internal criticism, the company insisted that its advanced tools are used only for “non-offensive” purposes.
“This specific project is a pilot with the Department of Defense, to provide open source TensorFlow APIs that can assist in object recognition on unclassified data. The technology flags images for human review, and is for non-offensive uses only,” said a Google spokesman.
That explanation is not enough for those Google employees who think their work is being used for attacks that have on many occasions led to civilian deaths.