Is the technology already capable of letting a killer robot locate someone, take aim, and shoot them completely autonomously? A group of United Nations experts maintains that it is, and that this has already happened in Libya in March 2020. It was reported to the Security Council, as published last week by the magazine New Scientist. This far-reaching document (whose authors include the Spaniard Luis Antonio de Albuquerque Bacardit) may represent a turning point: for the first time in history it is acknowledged that a machine has autonomously attacked a human being, which sparks deep ethical and legal debates because it shakes the international conventions on what is permitted in war.
Under the colloquial label of killer robots hides a vast field of weapons, explains defense analyst Jesús Manuel Pérez Triana: “These range from drones that fit in the palm of a hand to large devices such as the American Global Hawk, whose wingspan exceeds that of a Boeing 737 and which can cross the Atlantic Ocean without difficulty, as well as autonomous armored vehicles such as those developed by the Estonian company Milrem Robotics, which have already been tested in wars like the one in Mali.”
The use of partially autonomous drones (that is, those that still require some human intervention) has become so widespread that they have been a common sight in the wars in Libya and Syria, and they became “the real star” of the 2020 Nagorno-Karabakh conflict between Armenia and Azerbaijan, explains Pérez Triana. That conflict introduced the general public to so-called loitering munitions, which are based on the idea of not using a drone to launch weapons but of equipping the drone itself with a warhead: when it locates a target, an operator flies it into the target, because the drone itself is the weapon. “The obvious next step is to equip this kamikaze drone with a target recognition system that allows it to operate autonomously.”
To operate, autonomous drones are equipped with a camera and an image-processing algorithm: just as a program can be taught to recognize faces, it can be taught to recognize targets. And to attack them. Until now, air forces debated which would be the last crewed plane. “What we are seeing,” explains Pérez Triana, “is that a technological race has begun to mass-produce cheap kamikaze drones that can be launched in swarms thanks to distributed computing, in which each drone on its own has little processing power but, acting in a swarm, works as a hive mind. And if the aim is to launch swarms of drones en masse, you cannot depend on a human sitting behind a console.”
This step has already been taken, according to the letter sent by the group of experts to the UN Security Council. It recounts an episode that occurred in March 2020 during the war in Libya. On the ground were the forces of General Khalifa Hafter, 77, then the strongman of the Russian-backed east of the country. His troops launched an attack on Tripoli and were repelled by the army of the UN-recognized Prime Minister, Fayez Sarraj. “Logistics convoys and retreating Hafter-affiliated forces were subsequently hunted down and remotely engaged by unmanned combat aerial vehicles or lethal autonomous weapons systems such as the STM Kargu-2 [a Turkish-made military drone] and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition.” The letter does not reveal whether there were any fatalities. Industry sources in Spain, however, are skeptical of this possibility. “Beyond technical advances in purely academic fields, and as far as we know, industrial programs and the innovations brought to systems in use in Defense are at a very low level of autonomy,” they assert.
These killer robots, explains Rahul Uttamchandani, an expert lawyer at the firm Legal Army, contravene “all the principles on which modern warfare is based.” The basic ones are humanity (any person who does not participate, or has ceased to participate, in hostilities must be treated humanely); necessity (no weapons or methods may be used that cause damage excessive with respect to the intended military advantage); proportionality (the adversary must not be inflicted harm disproportionate to the objective of the armed conflict); and distinction (civilians and combatants must be distinguished at all times).
The Debate Over Legislation
Autonomous weapons (known as lethal autonomous weapons systems, or LAWS) and automated (or partially autonomous) ones have existed for a long time, recalls Joaquín Rodríguez Álvarez, PhD in Public International Law, professor at the UAB and head of the Stop Killer Robots campaign in Spain. Until now, he says, their use had been limited to defense. The most recent example is the Iron Dome, Israel’s defense system that, according to its army, managed to intercept 90% of the missiles launched by Hamas militias in the fighting that broke out last month.
There is now considerable international activity aimed at getting them banned. Possible negotiations, however, are at a standstill. The United States, the United Kingdom and the NATO countries maintain that existing International Humanitarian Law (IHL) already “provides a comprehensive framework to control the use of autonomy in weapon systems,” as the US representative stated in 2019 before the Convention on Certain Conventional Weapons (CCW). The legal reasoning, however, is full of loopholes that hinder any progress, according to Vicente Garrido Rebolledo, professor at the Rey Juan Carlos University and, between 2014 and 2017, member of the Advisory Board on Disarmament Matters of the United Nations Office for Disarmament Affairs under Secretaries-General Ban Ki-moon and António Guterres.
The Geneva Conventions constitute the backbone of this humanitarian legislation. However, neither the United States nor Turkey (among others) has signed some of the protocols, especially those referring to internal conflicts. This circumstance alone would render IHL ineffective for cases such as Libya’s. Moreover, for its four basic principles (humanity, necessity, proportionality and distinction) to be respected, acts of war must be attributable to someone. “In the case of autonomous weapons, how do you ask a machine to explain its behavior? No responsibility can be attributed to a machine. And if we have reached full automation, something I question, we will have moved to a higher stage that requires specific regulation,” explains Garrido Rebolledo. Another possibility is to agree on a moratorium until an accord is reached. “If that measure were adopted, the losers would be the states and the industries, whose watchword is not to talk about the matter.” That is the position adopted by the producer countries, with which Spain aligns itself.
“What is beyond debate is that we are witnessing only the prologue of what this can become,” predicts Pérez Triana. “We are only at the beginning of Terminator-style nightmares, which will unfold as the capabilities of artificial intelligence develop further.”