Lethal autonomous weapons (LAWs) are a type of autonomous military system that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons or killer robots. LAWs may operate in the air, on land, on water, underwater, or in space. The autonomy of current systems is restricted in the sense that a human gives the final command to attack, though there are exceptions with certain "defensive" systems.
Being autonomous as a weapon
Being "autonomous" has different meanings in different fields of study. In engineering it may refer to a machine's ability to operate without human involvement; in philosophy, to an individual's moral independence; in political science, to an area's capacity for self-governance. In military weapon development, however, the identification of a weapon as autonomous is not as clear-cut as in other areas.
The specific standard entailed in the concept of being autonomous can vary hugely between different scholars, nations and organizations.
Various definitions exist of what constitutes a lethal autonomous weapon. The official United States Department of Defense policy on Autonomy in Weapon Systems defines an autonomous weapon system as "A weapon system that, once activated, can select and engage targets without further intervention by a human operator." Heather Roff, a writer for Case Western Reserve University School of Law, describes autonomous weapon systems as "armed weapons systems, capable of learning and adapting their 'functioning in response to changing circumstances in the environment in which they are deployed,' as well as capable of making firing decisions on their own." This definition sets a fairly high threshold compared with the definitions of scholars such as Peter Asaro and Mark Gubrud, discussed below.
Scholars such as Peter Asaro and Mark Gubrud set the threshold lower and judge more weapon systems to be autonomous. They hold that any weapon system capable of releasing lethal force without the operation, decision, or confirmation of a human supervisor can be deemed autonomous. According to Gubrud, a weapon system operating partially or wholly without human intervention is autonomous. He argues that a weapon system does not need to be able to make decisions completely by itself to be called autonomous; it should be treated as autonomous as long as it is actively involved in one or more parts of the "preparation process", from finding the target to finally firing.
Other organizations, however, set the standard for autonomous weapon systems higher. The
British Ministry of Defence defines autonomous weapon systems as "systems that are capable of understanding higher level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives, without depending on human oversight and control - such human engagement with the system may still be present, though. While the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be."
As a result, the composition of a treaty between states requires a commonly accepted labeling of what exactly constitutes an autonomous weapon.
Automatic defensive systems
The oldest automatically triggered lethal weapons are the land mine, used since at least the 1600s, and the naval mine, used since at least the 1700s.
Anti-personnel mines are banned in many countries by the 1997
Ottawa Treaty; the states that have not joined include the United States, Russia, and much of Asia and the Middle East.
Some current examples of LAWs are automated "hardkill"
active protection systems, such as the radar-guided CIWS used to defend ships, in service since the 1970s (e.g., the US
Phalanx CIWS). Such systems can autonomously identify ''and attack'' oncoming missiles, rockets, artillery fire, aircraft and surface vessels according to criteria set by the human operator. Similar systems exist for tanks, such as the Russian
Arena, the Israeli
Trophy, and the German AMAP-ADS
. Several types of stationary
sentry guns, which can fire at humans and vehicles, are used in South Korea and Israel. Many
missile defence systems, such as
Iron Dome, also have autonomous targeting capabilities.
The main reason for not having a "human in the loop" in these systems is the need for rapid response. They have generally been used to protect personnel and installations against incoming projectiles.
Autonomous offensive systems
Systems with a higher degree of autonomy would include
drones or
unmanned combat aerial vehicles, e.g.: "The unarmed
BAE Systems Taranis jet-propelled combat drone prototype may lead to a
Future Combat Air System that can autonomously search, identify, and locate enemies but can only engage with a target when authorized by mission command. It can also defend itself against enemy aircraft" (Heyns 2013, §45). The
Northrop Grumman X-47B drone can take off and land on aircraft carriers (demonstrated in 2014); it is set to be developed into an
Unmanned Carrier-Launched Airborne Surveillance and Strike (UCLASS) system.
According to ''
The Economist'', as technology advances, future applications of unmanned undersea vehicles might include mine clearance, mine-laying, anti-submarine sensor networking in contested waters, patrolling with active sonar, resupplying manned submarines, and becoming low-cost missile platforms. In 2018, the U.S.
Nuclear Posture Review alleged that Russia was developing a "new intercontinental, nuclear-armed, nuclear-powered, undersea autonomous torpedo" named "
Status 6".
The
Russian Federation is actively developing
artificially intelligent missiles,
drones,
unmanned vehicles,
military robots and medical robots.
Israeli Minister
Ayoob Kara stated in 2017 that
Israel is developing military robots, including ones as small as flies.
In October 2018, Zeng Yi, a senior executive at the Chinese defense firm
Norinco
, gave a speech in which he said that "In future battlegrounds, there will be no people fighting", and that the use of lethal autonomous weapons in warfare is "inevitable". In 2019, US Defense Secretary
Mark Esper lashed out at China for selling drones capable of taking life with no human oversight.
The British Army deployed new unmanned vehicles and military robots in 2019.
The
US Navy is developing "ghost" fleets of
unmanned ships.
In 2020 a
Kargu 2 drone hunted down and attacked a human target in
Libya, according to a report from the
UN Security Council’s Panel of Experts on Libya, published in March 2021. This may have been the first time an autonomous killer robot armed with lethal weaponry attacked human beings.
Ethical and legal issues
Standard used in US policy
Current US policy states: "Autonomous … weapons systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force." The policy requires that autonomous weapon systems that kill people or use kinetic force, selecting and engaging targets without further human intervention, be certified as compliant with "appropriate levels" and other standards; it does not hold that such systems cannot meet these standards and are therefore forbidden.
"Semi-autonomous" hunter-killers that autonomously identify and attack targets do not even require certification.
Deputy Defense Secretary Robert Work said in 2016 that the Defense Department would "not delegate lethal authority to a machine to make a decision", but might need to reconsider this since "authoritarian regimes" may do so. In October 2016 President
Barack Obama stated that early in his career he was wary of a future in which a US president making use of
drone warfare could "carry on perpetual wars all over the world, and a lot of them covert, without any accountability or democratic debate". In the US, security-related AI has fallen under the purview of the National Security Commission on Artificial Intelligence since 2018. On October 31, 2019, the United States Department of Defense's Defense Innovation Board published the draft of a report outlining five principles for weaponized AI and making 12 recommendations for the ethical use of artificial intelligence by the Department of Defense that would ensure a human operator would always be able to look into the 'black box' and understand the kill-chain process. A major concern is how the report will be implemented.
Possible violations of ethics and international acts
Stuart Russell, professor of computer science at the University of California, Berkeley, has stated that his concern with LAWs is that he views them as unethical and inhumane. The main issue with these systems is that it is hard for them to distinguish between combatants and non-combatants.
Some economists and legal scholars are concerned about whether LAWs would violate
International Humanitarian Law, especially the principle of distinction, which requires the ability to discriminate combatants from non-combatants, and the
principle of proportionality, which requires that damage to civilians be proportional to the military aim. This concern is often invoked as a reason to ban "killer robots" altogether, but it is doubtful that it can serve as an argument against LAWs that do not violate International Humanitarian Law.
A 2021 report by the American
Congressional Research Service states that "there are no domestic or international legal prohibitions on the development or use of LAWs," although it acknowledges ongoing talks at the
UN Convention on Certain Conventional Weapons (CCW).
LAWs are said by some to blur the boundaries of who is responsible for a particular killing. Philosopher Robert Sparrow argues that autonomous weapons are causally but not morally responsible, similar to child soldiers. In each case, he argues there is a risk of atrocities occurring without an appropriate subject to hold responsible, which violates
''jus in bello''. Thomas Simpson and Vincent Müller argue that they may make it easier to record who gave which command. Likewise,
Steven Umbrello
, Émile P. Torres and
Angelo F. De Bellis argue that if the technical capacities of LAWs are at least as accurate as those of human soldiers, then the psychological shortcomings of human soldiers in war warrant that only these types of ethical LAWs be used. They likewise propose using the value sensitive design approach as a potential framework for designing these weapons to align with human values and
International Humanitarian Law.
Further, potential IHL violations by LAWs are, by definition, only applicable in conflict settings that involve the need to distinguish between combatants and civilians. Any conflict scenario devoid of civilians, for instance in space or the deep sea, would therefore not run into the obstacles posed by IHL.
Political economists
Christopher Coyne and Yahya Alshamy raise concerns about LAWs incentivizing militarism and undermining peaceful solutions. They emphasize that humans are uniquely capable of engaging in the dynamic search for peace because of their morality and mortality. They also express concern over a potential arms race and the establishment of a precedent for acquiring LAWs. Whereas many scholars see proliferation as conducive to deterrence through mutual assured destruction, Coyne and Alshamy argue that for proliferation to deter aggression, refraining from using the weapons must have some positive probability of breaking down: if there were no chance the weapons would be used, they would not deter anyone, and there would be no point in having them in the first place.
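This deterrence point reduces to a simple expected-value claim, which can be sketched as follows (the function name and figures below are illustrative assumptions for exposition, not from the cited authors):

```python
def expected_deterrent_cost(p_use: float, damage: float) -> float:
    """Expected cost an aggressor anticipates from an arsenal that
    would actually be used with probability p_use, inflicting
    `damage` if used (illustrative model, not from the source)."""
    return p_use * damage

# If restraint can never break down (p_use = 0), the arsenal imposes
# zero expected cost on an aggressor and so cannot deter:
print(expected_deterrent_cost(0.0, 1_000_000.0))   # 0.0
# Any positive probability of use restores a deterrent effect:
print(expected_deterrent_cost(0.05, 1_000_000.0))  # 50000.0
```

The sketch simply formalizes the authors' observation: a weapon whose use has zero probability contributes nothing to an aggressor's expected cost and hence cannot deter.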
Campaigns to ban LAWs

The possibility of LAWs has generated significant debate, especially about the risk of "killer robots" roaming the earth in the near or far future. The group
Campaign to Stop Killer Robots
formed in 2013. In July 2015, over 1,000 experts in artificial intelligence signed a letter warning of the threat of an
artificial intelligence arms race and calling for a ban on
autonomous weapons. The letter was presented in
Buenos Aires at the 24th
International Joint Conference on Artificial Intelligence
(IJCAI-15) and was co-signed by
Stephen Hawking,
Elon Musk,
Steve Wozniak,
Noam Chomsky,
Skype co-founder
Jaan Tallinn and
Google DeepMind co-founder
Demis Hassabis, among others.
According to
PAX For Peace (one of the founding organisations of the Campaign to Stop Killer Robots), fully automated weapons (FAWs) will lower the threshold of going to war as soldiers are removed from the battlefield and the public is distanced from experiencing war, giving politicians and other decision-makers more space in deciding when and how to go to war.
They warn that once deployed, FAWs will make democratic control of war more difficult, something that Daniel Suarez, IT specialist and author of ''Kill Decision'', a novel on the topic, has also warned about: according to him, FAWs might recentralize power into very few hands by requiring very few people to go to war.
There are websites protesting the development of LAWs that present the undesirable ramifications of continued research into applying artificial intelligence to the designation of weapons. These websites are regularly updated with news on ethical and legal issues, allowing visitors to catch up on recent international meetings and research articles concerning LAWs.
The Holy See has called on the international community to ban the use of LAWs on several occasions. In November 2018, Archbishop Ivan Jurkovic, the permanent observer of the Holy See to the United Nations, stated that "In order to prevent an arms race and the increase of inequalities and instability, it is an imperative duty to act promptly: now is the time to prevent LAWs from becoming the reality of tomorrow's warfare." The Church worries that these weapons systems have the capability to irreversibly alter the nature of warfare, create detachment from human agency and put in question the humanity of societies.
The majority of governments represented at a UN meeting to discuss the matter have favoured a ban on LAWs. A minority of governments, including those of Australia, Israel, Russia, the UK, and the US, have opposed a ban. The United States has stated that autonomous weapons have helped prevent the killing of civilians.
In December 2022, a vote of the San Francisco Board of Supervisors
to authorize San Francisco Police Department use of LAWs drew national attention and protests. The Board reversed this vote in a subsequent meeting.
No ban, but regulation
A third approach focuses on regulating the use of autonomous weapon systems in lieu of a ban. Military AI arms control will likely require the institutionalization of new international norms embodied in effective technical specifications combined with active monitoring and informal ('Track II') diplomacy by communities of experts, together with a legal and political verification process. In 2021, the United States Department of Defense requested a dialogue with the Chinese People's Liberation Army on AI-enabled autonomous weapons but was refused.
See also
* Artificial intelligence arms race
* List of fictional military robots
* ''Slaughterbots''
References
Further reading
* Heyns, Christof (2013), ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions’, UN General Assembly, Human Rights Council, 23 (3), A/HRC/23/47.
* Krishnan, Armin (2009), Killer robots: Legality and ethicality of autonomous weapons (Aldershot: Ashgate).
* Müller, Vincent C. (2016), ‘Autonomous killer robots are probably good news’, in Ezio Di Nucci and Filippo Santoni de Sio (eds.), Drones and responsibility: Legal, philosophical and socio-technical perspectives on the use of remotely controlled weapons, 67-81 (London: Ashgate).
* Sharkey, Noel E (2012), ‘Automating Warfare: lessons learned from the drones’, ''Journal of Law, Information & Science'', 21 (2).
* Simpson, Thomas W and Müller, Vincent C. (2016), ‘Just war and robots’ killings’, ''The Philosophical Quarterly'' 66 (263), 302–22.
* Singer, Peter (2009), Wired for war: The robotics revolution and conflict in the 21st Century (New York: Penguin)
* US Department of Defense (2012), ‘Directive 3000.09, Autonomy in weapon systems’.
* US Department of Defense (2013), ‘Unmanned Systems Integrated Road Map FY2013-2038’.
* Saxon, Dan (2022), Fighting Machines: Autonomous Weapons and Human Dignity (University of Pennsylvania Press).