In the first days of the war in the Gaza Strip, which reaches the six-month mark this Sunday with more than 33,000 dead, the Israeli Army relied almost entirely on an algorithmic system that flagged 37,000 Palestinians as suspected Hamas militants, turning them and their homes into military targets.
So says an investigation published this week by the Israeli outlet Sicha Mekomit (known as +972 in its English version), based on intelligence sources, which indicated that soldiers unquestioningly adopted "kill lists" recommended by a previously untested artificial intelligence system nicknamed Lavender.
Once the names were suggested, "human personnel" spent about "20 seconds" on each target before authorizing a bombing, simply to confirm that the potential militant was male, the investigation details.
However, Lavender is not that sophisticated, the text says, and carries a margin of error of approximately 10%, so it can occasionally incriminate "people who have merely a loose connection to militant groups, or no connection at all."
Furthermore, according to the text, the army "systematically" attacked these people, many of them possibly very low-ranking militants, in their homes, "usually at night, while their entire family was present", because they were considered easier targets there, which increased the number of civilian deaths "allowed" for each combatant.
“We were not interested in killing Hamas operatives only when they were in a military building or participating in a military operation,” an intelligence officer told +972.
“On the contrary, the Israel Defense Forces bombed them in their homes, without hesitation, as a first option,” he added, claiming that “the system is designed to search for them in these situations.”
In the first weeks of the war, according to two sources cited anonymously in the investigation, the army decided that an attack on any junior Hamas operative identified by Lavender could result in the death of "15 or 20 civilians", an unprecedented ratio of non-combatants.
In previous wars, the military did not authorize any "collateral damage" when the targets were rank-and-file militants, while in the current offensive, if the target was a senior Hamas official or commander, sources said the army repeatedly contemplated the killing of more than 100 civilians.
Furthermore, the army preferred to use unguided munitions, commonly known as "dumb bombs", against these minor militants; their use in Gaza was documented by CNN in late December, and they are capable of destroying entire buildings and causing additional casualties.
“It is not advisable to waste expensive bombs on unimportant people; it is very expensive for the country and there is a shortage of these bombs,” explained an intelligence officer on condition of anonymity.
The use of this and other algorithm-based artificial intelligence systems, such as "Gospel", may be linked to the high number of civilian deaths in the Strip: according to unverified Israeli figures, some 13,000 of the more than 32,900 dead had some connection to Islamist groups, not counting the thousands of bodies that remain under tons of rubble.
According to World Bank data, 84% of medical facilities in the Palestinian enclave have also been damaged or destroyed, the water and sanitation system has nearly collapsed, operating at 5% of its output, and roads and infrastructure have been devastated by Israeli artillery and bombing.
How does Israel respond to the accusations?
The Israeli Army responded to the accusations published by the Israeli outlet Sicha Mekomit, stating that "the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist."
In a statement shared with The Guardian after the report was published, Israel said that its army has several types of tools and methods for gathering information, and that information management tools are only one part of the process of identifying targets in war.
“Information systems are merely tools for analysts in the target identification process. According to IDF directives, analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and the additional restrictions stipulated in the IDF directives,” they stated.
Israel also maintained that Lavender is not a system but a database used to cross-reference information obtained from intelligence sources in military operations. "It is not a list of confirmed military operatives eligible for attack," they added.
The army also stated that, before attacking each target, the IDF carries out an individual assessment of the operational advantage and collateral damage of striking a given area.
In the statement, Israel added that its military does not carry out attacks when it considers that the collateral damage outweighs the military advantage.
Israel faces growing international pressure as its military operation in Gaza reaches the six-month mark with more than 33,000 dead. Criticism of the country intensified this week after seven workers from the NGO World Central Kitchen (WCK) were killed in an Israeli strike in the Strip.
The army said this Friday that the attack on the aid workers was due to an "error", after soldiers mistakenly believed that two armed Hamas militants were traveling in the convoy. NGOs, however, claim that Israel has systematically attacked humanitarian groups currently operating inside the enclave.
The armed forces announced that two commanders involved in the operation will be dismissed and two others reprimanded following the investigation into the incident.
This Friday, the UN secretary-general said he was "deeply concerned" by reports that Israel is using artificial intelligence (AI) to identify targets in Gaza and rejected the idea that "life and death decisions" should be delegated to algorithms.
“I am deeply concerned by reports that the Israeli army's bombing campaign includes artificial intelligence as a tool to identify targets, particularly in densely populated residential areas, resulting in high numbers of civilian casualties,” Antonio Guterres told the press.
“No part of the life-and-death decisions that impact entire families should be delegated to the cold calculation of algorithms,” he insisted.
“I have been warning for years about the dangers of weaponizing artificial intelligence and reducing the essential role of human intervention,” Guterres stressed.
“AI should be used as a force for good, for the benefit of the world, and not contribute to war at an industrial level,” he added.
*With AFP and EFE