By: Adaoma Okafor

Recently, Laura Nolan, a former Google engineer who worked on a collaboration project with the Department of Defense, warned that autonomous weapons could cause accidental mass killings.[1] Her concerns are not far-fetched. From Apple's Siri to Amazon's Alexa, artificial intelligence (“AI”) is constantly being applied to fill gaps in human knowledge and activity. The same is true of autonomous weapons, which is partly why some countries are beginning to engage in an AI arms race. Autonomous weapons are defined as lethal weapons that can select and engage targets without human intervention.[2]

Project Maven, formally known as the Algorithmic Warfare Cross-Functional Team, is one example.[3] The program is an effort by the Department of Defense to incorporate artificial intelligence into its counter-terrorism operations.[4] Its goal is to “turn the enormous volume of data available to the DoD into actionable intelligence and insights at speed.”[5] In other words, the project focuses on developing drones that are trained to attack in a war zone.[6]

AI weapons are increasingly being developed by the U.S. military and foreign governments, and their development has drawn in major technology companies such as Google, Amazon, and Microsoft.[7] So much so that Google ended its Project Maven contract after its employees signed a petition to leave the project.[8] While AI development in the autonomous weapons arena may continue to advance military cybersecurity objectives, the technology still carries important ethical implications. For example, can an AI discriminate between an enemy combatant and a civilian, and what would happen if it could not?

In 1980, the United Nations adopted the Convention on Certain Conventional Weapons.[9] The convention is aimed at mitigating the use of certain munitions in war zones.[10] It was designed to “ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”[11] The convention was approved by fifty states and entered into force in 1983.[12] While the United Nations has a convention in place, more must be done to protect unanticipated victims.

Human Rights Watch has suggested that while legal redress may be available in theory, laws imposing direct accountability on manufacturers or software developers will face legal challenges.[13] It reasons that the manufacturer of such a weapon and its software developer are likely to have roughly equivalent control over the device and, as a result, will continue to shift liability to one another.[14] However, it is possible that both the manufacturer and the software developer could be held jointly liable under a negligence theory. Negligence requires a duty, a breach of that duty, causation, and damages. Software developers and manufacturers arguably have a duty to monitor and update the software, and to ensure that accurate information is used by it. Developers and manufacturers train the AI on particular data sets, and an inaccurate data set could be the proximate cause of an erroneous prediction that results in harm: the breach of duty causes the wrong prediction, and damages follow. Even so, this approach may prove problematic, because issues of governmental immunity can arise.[15]

In 2013, the U.K. Supreme Court heard claims arising from the deaths of British soldiers in Iraq who had been transported in a vehicle that some called the “mobile coffin.”[16] The Court held that the plaintiffs could sue for negligence, but that the military would still be immune for “high-level policy decisions … or decisions made in the heat of battle.”[17] While the “mobile coffin” was not an autonomous weapon, the case shows that government officials will likely still be excused when unfortunate events happen. When AI accidentally kills people, it is unlikely that individuals will have any redress. As a result, it is important for the United States and the international community to develop ways to police themselves and hold themselves accountable in the event that an autonomous weapon misfires.


[1] See Isobel Asher Hamilton, A Former Google Engineer Warned that Robot Weapons Could Cause Accidental Mass Killings, Bus. Insider (Sept. 16, 2019), https://www.businessinsider.com/former-google-engineer-warns-against-killer-robots-2019-9.

[2] See Letter from Stuart Russell et al. to the 2015 International Joint Conference on Artificial Intelligence (July 28, 2015), https://futureoflife.org/open-letter-autonomous-weapons/?cn-reloaded=1.

[3] See Cheryl Pellerin, Project Maven Industry Day Pursues Artificial Intelligence for DoD Challenges, U.S. Dep’t of Def. (Oct. 27, 2017), https://www.defense.gov/Newsroom/News/Article/Article/1356172/project-maven-industry-day-pursues-artificial-intelligence-for-dod-challenges/.

[4] Id.

[5] Memorandum from Deputy Secretary of Defense on Establishment of an Algorithmic Warfare Cross-Functional Team (Project Maven) (Apr. 26, 2017) (on file with author).

[6] See Hamilton, supra note 1.

[7] See Adam Frisk, What is Project Maven? The Pentagon AI Project Employees Want Out of, Glob. News (Apr. 5, 2018), https://globalnews.ca/news/4125382/google-pentagon-ai-project-maven/.

[8] See Scott Shane & Daisuke Wakabayashi, ‘The Business of War’: Google Employees Protest Work for the Pentagon, N.Y. Times (Apr. 4, 2018), https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html.

[9] See The Convention on Certain Conventional Weapons, The United Nations Office at Geneva, https://www.unog.ch/80256EE600585943/(httpPages)/4F0DEF093B4860B4C1257180004B1B30 (last visited Sept. 23, 2019).

[10] See id.

[11] Id.

[12] See id.

[13] Human Rights Watch, Mind the Gap: The Lack of Accountability for Killer Robots (2015).

[14] Id.

[15] Id.

[16] Iraq Damages Cases: Supreme Court Rules Families Can Sue, BBC News (June 19, 2013), https://www.bbc.com/news/uk-22967853.

[17] Human Rights Watch, supra note 13.
