‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham | April 3, 2024
In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author, a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200, makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential "targets" for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a "human bottleneck for both locating the new targets and decision-making to approve the targets."
Such a machine, it turns out, actually exists. A new investigation by
+972 Magazine and Local Call reveals that the Israeli army has developed
an artificial intelligence-based program known as "Lavender," unveiled
here for the first time. According to six Israeli intelligence officers,
who have all served in the army during the current war on the Gaza Strip
and had first-hand involvement with the use of AI to generate targets
for assassination, Lavender has played a central role in the
unprecedented bombing of Palestinians, especially during the early
stages of the war. In fact, according to the sources, its influence on
the military's operations was such that they essentially treated the
outputs of the AI machine "as if it were a human
decision."
Formally, the Lavender system is designed to mark all suspected
operatives in the military wings of Hamas and Palestinian Islamic Jihad
(PIJ), including low-ranking ones, as potential bombing targets. The
sources told +972 and Local Call that, during the first weeks of the
war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants, and their homes, for possible air strikes.
During the early stages of the war, the army gave sweeping approval for
officers to adopt Lavender's kill lists, with no requirement to
thoroughly check why the machine made those choices or to examine the
raw intelligence data on which they were based. One source stated that
human personnel often served only as a "rubber stamp" for the machine's
decisions, adding that, normally, they would personally devote only
about "20 seconds" to each target before authorizing a bombing, just to make sure the Lavender-marked target is male. This was despite
knowing that the system makes what are regarded as "errors" in
approximately 10 percent of cases, and is known to occasionally mark
individuals who have merely a loose connection to militant groups, or no
connection at all.
Moreover, the Israeli army systematically attacked the targeted
individuals while they were in their homes, usually at night while their whole families were present, rather than during the course of
military activity. According to the sources, this was because, from what
they regarded as an intelligence standpoint, it was easier to locate the
individuals in their private houses. Additional automated systems,
including one called "Where's Daddy?" also revealed here for the first
time, were used specifically to track the targeted individuals and carry
out bombings when they had entered their family's
residences.
Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)
The result, as the sources testified, is that thousands of Palestinians, most of them women and children or people who were not involved in the fighting, were wiped out by Israeli airstrikes, especially during
the first weeks of the war, because of the AI program's
decisions.
"We were not interested in killing [Hamas] operatives only when they
were in a military building or engaged in a military activity," A., an
intelligence officer, told +972 and Local Call. "On the contrary, the
IDF bombed them in homes without hesitation, as a first option. It's
much easier to bomb a family's home. The system is built to look for
them in these situations."
The Lavender machine joins another AI system, "The Gospel," about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military's own publications.
A fundamental difference between the two systems is in the definition of
the target: whereas The Gospel marks buildings and structures that the
army claims militants operate from, Lavender marks people, and puts them on a kill list.
In addition, according to the sources, when it came to targeting
alleged junior militants marked by Lavender, the army preferred to only
use unguided missiles, commonly known as "dumb" bombs (in contrast to
"smart" precision bombs), which can destroy entire buildings on top of
their occupants and cause significant casualties. "You don't want to
waste expensive bombs on unimportant people; it's very expensive for the country and there's a shortage [of those bombs]," said C., one of
the intelligence officers. Another source said that they had personally
authorized the bombing of "hundreds" of private homes of alleged junior
operatives marked by Lavender, with many of these attacks killing
civilians and entire families as "collateral
damage."
In an unprecedented move, according to two of the sources, the army
also decided during the first weeks of the war that, for every junior
Hamas operative that Lavender marked, it was permissible to kill up to
15 or 20 civilians; in the past, the military did not authorize any
"collateral damage" during assassinations of low-ranking militants. The
sources added that, in the event that the target was a senior Hamas
official with the rank of battalion or brigade commander, the army on
several occasions authorized the killing of more than 100 civilians in
the assassination of a single commander.
Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)
The following investigation is organized according to the six
chronological stages of the Israeli army's highly automated target
production in the early weeks of the Gaza war. First, we explain the
Lavender machine itself, which marked tens of thousands of Palestinians
using AI. Second, we reveal the "Where's Daddy?" system, which tracked
these targets and signaled to the army when they entered their family
homes. Third, we describe how "dumb" bombs were chosen to strike these
homes.
Fourth, we explain how the army loosened the permitted number of
civilians who could be killed during the bombing of a target. Fifth, we
note how automated software inaccurately calculated the amount of
non-combatants in each household. And sixth, we show how on several
occasions, when a home was struck, usually at night, the individual
target was sometimes not inside at all, because military officers did
not verify the information in real time.
STEP 1: GENERATING TARGETS
'Once you go automatic, target generation goes crazy'
In the Israeli army, the term "human target" referred in the past to a
senior military operative who, according to the rules of the military's
International Law Department, can be killed in their private home even
if there are civilians around. Intelligence sources told +972 and Local
Call that during Israel's previous wars, since this was an "especially
brutal" way to kill someone, often by killing an entire family alongside the target, such human targets were marked very carefully
and only senior military commanders were bombed in their homes, to
maintain the principle of proportionality under international
law.
But after October 7, when Hamas-led militants launched a deadly assault on southern Israeli communities, killing around 1,200 people and abducting 240, the army, the sources said, took a dramatically
different approach. Under "Operation Iron Swords," the army decided to
designate all operatives of Hamas' military wing as human targets,
regardless of their rank or military importance. And that changed
everything.
The new policy also posed a technical problem for Israeli intelligence.
In previous wars, in order to authorize the assassination of a single
human target, an officer had to go through a complex and lengthy
"incrimination" process: cross-check evidence that the person was indeed
a senior member of Hamas' military wing, find out where he lived, his
contact information, and finally know when he was home in real time.
When the list of targets numbered only a few dozen senior operatives,
intelligence personnel could individually handle the work involved in
incriminating and locating them.
Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun)
However, once the list was expanded to include tens of thousands of
lower-ranking operatives, the Israeli army figured it had to rely on
automated software and artificial intelligence. The result, the sources
testify, was that the role of human personnel in incriminating
Palestinians as military operatives was pushed aside, and AI did most of
the work instead. According to four of the sources who spoke to +972 and
Local Call, Lavender, which was developed to create human targets in the current war, has marked some 37,000 Palestinians as suspected
"Hamas militants," most of them junior, for assassination (the IDF
Spokesperson denied the existence of such a kill list in a statement to
+972 and Local Call).
"We didn't know who the junior operatives were, because Israel didn't
track them routinely [before the war]," explained senior officer B. to
+972 and Local Call, illuminating the reason behind the development of
this particular target machine for the current war. "They wanted to
allow us to attack [the junior operatives] automatically. That's the
Holy Grail. Once you go automatic, target generation goes
crazy."
The sources said that the approval to automatically adopt Lavender's
kill lists, which had previously been used only as an auxiliary tool,
was granted about two weeks into the war, after intelligence personnel
"manually" checked the accuracy of a random sample of several hundred
targets selected by the AI system. When that sample found that
Lavender's results had reached 90 percent accuracy in identifying an
individual's affiliation with Hamas, the army authorized the sweeping
use of the system. From that moment, sources said that if Lavender
decided an individual was a militant in Hamas, they were essentially
asked to treat that as an order, with no requirement to independently
check why the machine made that choice or to examine the raw
intelligence data on which it was based.
"At 5 a.m., [the air force] would come and bomb all the houses that we had marked," B. said. "We took out thousands of people. We didn't go through them one by one; we put everything into automated
systems, and as soon as one of [the marked individuals] was at home,
he immediately became a target. We bombed him and his
house."
"It was very surprising for me that we were asked to bomb a house to
kill a ground soldier, whose importance in the fighting was so low,"
said one source about the use of AI to mark alleged low-ranking
militants. "I nicknamed those targets 'garbage targets.' Still, I found
them more ethical than the targets that we bombed just for 'deterrence': highrises that are evacuated and toppled just to cause destruction."
The deadly results of this loosening of restrictions in the early stage
of the war were staggering. According to data from the Palestinian
Health Ministry in Gaza, on which the Israeli army has relied almost exclusively since the beginning of the war, Israel killed some 15,000 Palestinians, almost half of the death toll so far, in the first six weeks of the war, up until a week-long ceasefire was agreed on Nov. 24.
Massive destruction is seen in Al-Rimal popular district of Gaza City after it was targeted by Israeli airstrikes, October 10, 2023. (Mohammed Zaanoun)
'The more information and variety, the better'
The Lavender software analyzes information collected on most of the 2.3
million residents of the Gaza Strip through a system of mass
surveillance, then assesses and ranks the likelihood that each
particular person is active in the military wing of Hamas or PIJ.
According to sources, the machine gives almost every single person in
Gaza a rating from 1 to 100, expressing how likely it is that they are a
militant.
Lavender learns to identify characteristics of known Hamas and PIJ
operatives, whose information was fed to the machine as training data,
and then to locate these same characteristics, also called "features," among the general population, the sources explained. An individual
found to have several different incriminating features will reach a high
rating, and thus automatically becomes a potential target for
assassination.
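To make the mechanism the sources describe easier to picture, here is a deliberately generic Python sketch of how a 1-to-100 rating can be produced by summing weighted binary features and squashing the result. It is purely illustrative: the real features, weights, and training method behind Lavender are not public, and every name and number below is invented for the example.

    # Illustrative only: a generic weighted-feature rating, not the actual Lavender model.
    # All feature names, weights, and the bias are invented for this sketch.
    import math

    EXAMPLE_WEIGHTS = {
        "feature_a": 3.0,  # stand-in for one learned "incriminating feature"
        "feature_b": 2.0,  # stand-in for another
        "feature_c": 1.5,
    }

    def rating(record_features, weights=EXAMPLE_WEIGHTS, bias=-3.0):
        """Map a set of binary features to a 1-100 score via a logistic squash."""
        z = bias + sum(w for name, w in weights.items() if record_features.get(name))
        prob = 1 / (1 + math.exp(-z))               # probability-like value in (0, 1)
        return max(1, min(100, round(prob * 100)))  # clamp to the 1-100 range

    print(rating({"feature_a": True, "feature_b": True, "feature_c": True}))  # high score
    print(rating({}))                                                          # low score

The only point of the sketch is that a record matching several weighted features lands near the top of the scale, which is the behavior the sources attribute to the system.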
In "The Human-Machine Team," the book referenced at the beginning of
this article, the current commander of Unit 8200 advocates for such a
system without referencing Lavender by name. (The commander himself also
isn't named, but five sources in 8200 confirmed that the commander is
the author, as reported
also by Haaretz.) Describing human personnel as a "bottleneck" that
limits the army's capacity during a military operation, the commander
laments: "We [humans] cannot process so much information. It doesn't
matter how many people you have tasked to produce targets during the war; you still cannot produce enough targets per day."
The solution to this problem, he says, is artificial intelligence. The
book offers a short guide to building a "target machine," similar in
description to Lavender, based on AI and machine-learning algorithms.
Included in this guide are several examples of the "hundreds and
thousands" of features that can increase an individual's rating, such as
being in a WhatsApp group with a known militant, changing cell phones
every few months, and changing addresses frequently.
"The more information, and the more variety, the better," the commander
writes. "Visual information, cellular information, social media
connections, battlefield information, phone contacts, photos." While
humans select these features at first, the commander continues, over
time the machine will come to identify features on its own. This, he
says, can enable militaries to create "tens of thousands of targets,"
while the actual decision as to whether or not to attack them will
remain a human one.
The book isn't the only time a senior Israeli commander hinted at the
existence of human target machines like Lavender. +972 and Local Call
have obtained footage of a private lecture given by the commander of
Unit 8200's secretive Data Science and AI center, "Col. Yoav," at Tel
Aviv University's AI week in 2023, which was reported on at the time in the Israeli media.
In the lecture, the commander speaks about a new, sophisticated target
machine used by the Israeli army that detects "dangerous people" based
on their likeness to existing lists of known militants on which it was
trained. "Using the system, we managed to identify Hamas missile squad
commanders," "Col. Yoav" said in the lecture, referring to Israel's May
2021 military operation in Gaza, when the machine was used for the first
time.
Slides from a lecture presentation by the commander of IDF Unit 8200's Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call.
The lecture presentation slides, also obtained by +972 and Local Call,
contain illustrations of how the machine works: it is fed data about
existing Hamas operatives, it learns to notice their features, and then
it rates other Palestinians based on how similar they are to the
militants.
"We rank the results and determine the threshold [at which to attack a
target]," "Col. Yoav" said in the lecture, emphasizing that
"eventually, people of flesh and blood take the decisions. In the
defense realm, ethically speaking, we put a lot of emphasis on this.
These tools are meant to help [intelligence officers] break their
barriers."
In practice, however, sources who have used Lavender in recent months
say human agency and precision were substituted by mass target creation
and lethality.
'There was no "zero-error" policy'
B., a senior officer who used Lavender, echoed to +972 and Local Call
that in the current war, officers were not required to independently
review the AI system's assessments, in order to save time and enable the
mass production of human targets without hindrances.
"Everything was statistical, everything was neat; it was very dry,"
B. said. He noted that this lack of supervision was permitted despite
internal checks showing that Lavender's calculations were considered
accurate only 90 percent of the time; in other words, it was known in
advance that 10 percent of the human targets slated for assassination
were not members of the Hamas military wing at
all.
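Taking the sources' figures at face value gives a sense of scale: a 10 percent error rate applied to the roughly 37,000 people the system marked works out to about 0.10 × 37,000 ≈ 3,700 individuals wrongly slated for assassination. This back-of-the-envelope figure is our extrapolation from the numbers the sources cite, not a number the sources themselves gave.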
For example, sources explained that the Lavender machine sometimes
mistakenly flagged individuals who had communication patterns similar to
known Hamas or PIJ operatives, including police and civil defense
workers, militants' relatives, residents who happened to have a name and
nickname identical to that of an operative, and Gazans who used a device
that once belonged to a Hamas operative.
"How close does a person have to be to Hamas to be [considered by an
AI machine to be] affiliated with the organization?" said one source
critical of Lavender's inaccuracy. "It's a vague boundary. Is a person
who doesn't receive a salary from Hamas, but helps them with all sorts
of things, a Hamas operative? Is someone who was in Hamas in the past,
but is no longer there today, a Hamas operative? Each of these features (characteristics that a machine would flag as suspicious) is inaccurate."
Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)
Similar problems exist with the ability of target machines to assess
the phone used by an individual marked for assassination. "In war,
Palestinians change phones all the time," said the source. "People lose
contact with their families, give their phone to a friend or a wife,
maybe lose it. There is no way to rely 100 percent on the automatic
mechanism that determines which [phone] number belongs to
whom."
According to the sources, the army knew that the minimal human
supervision in place would not discover these faults. "There was no
'zero-error' policy. Mistakes were treated statistically," said a source
who used Lavender. "Because of the scope and magnitude, the protocol was
that even if you don't know for sure that the machine is right, you know
that statistically it's fine. So you go for
it."
"It has proven itself," said B., the senior source. "There's something
about the statistical approach that sets you to a certain norm and
standard. There has been an illogical amount of [bombings] in this
operation. This is unparalleled, in my memory. And I have much more
trust in a statistical mechanism than a soldier who lost a friend two
days ago. Everyone there, including me, lost people on October 7. The
machine did it coldly. And that made it
easier."
Another intelligence source, who defended the reliance on the
Lavender-generated kill lists of Palestinian suspects, argued that it
was worth investing an intelligence officer's time only to verify the
information if the target was a senior commander in Hamas. "But when it
comes to a junior militant, you don't want to invest manpower and time
in it," he said. "In war, there is no time to incriminate every target.
So you're willing to take the margin of error of using artificial
intelligence, risking collateral damage and civilians dying, and risking
attacking by mistake, and to live with it."
B. said that the reason for this automation was a constant push to
generate more targets for assassination. "In a day without targets
[whose feature rating was sufficient to authorize a strike], we
attacked at a lower threshold. We were constantly being pressured:
'Bring us more targets.' They really shouted at us. We finished
[killing] our targets very quickly."
He explained that lowering Lavender's rating threshold would cause it to mark more people as targets for strikes. "At its peak, the system
managed to generate 37,000 people as potential human targets," said B.
"But the numbers changed all the time, because it depends on where you
set the bar of what a Hamas operative is. There were times when a Hamas
operative was defined more broadly, and then the machine started
bringing us all kinds of civil defense personnel, police officers, on
whom it would be a shame to waste bombs. They help the Hamas government,
but they don't really endanger soldiers."
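The dynamic B. describes, in which lowering the bar expands the pool of targets, is a general property of any score-and-threshold system and can be illustrated with synthetic data. The short Python sketch below uses random scores that bear no relation to the real system or to any real data.

    # Illustrative only: how lowering a decision threshold inflates the flagged set.
    # The scores are synthetic random numbers, not real data of any kind.
    import random

    random.seed(0)
    population = [random.randint(1, 100) for _ in range(1_000_000)]  # synthetic 1-100 ratings

    for threshold in (90, 80, 70, 60):
        flagged = sum(score >= threshold for score in population)
        print(f"threshold {threshold}: {flagged:,} flagged")

    # Each drop in the threshold sweeps a new band of lower-scoring records into the
    # flagged set, which is why where the bar is set determines how many targets exist.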
Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)
One source who worked with the military data science team that trained
Lavender said that data collected from employees of the Hamas-run
Internal Security Ministry, whom he does not consider to be militants,
was also fed into the machine. "I was bothered by the fact that when
Lavender was trained, they used the term 'Hamas operative' loosely, and
included people who were civil defense workers in the training dataset,"
he said.
The source added that even if one believes these people deserve to be
killed, training the system based on their communication profiles made
Lavender more likely to select civilians by mistake when its algorithms
were applied to the general population. "Since it's an automatic system
that isn't operated manually by humans, the meaning of this decision is
dramatic: it means you're including many people with a civilian
communication profile as potential targets."
'We only checked that the target was a man'
The Israeli military flatly rejects these claims. In a statement to
+972 and Local Call, the IDF Spokesperson denied using artificial
intelligence to incriminate targets, saying these are merely "auxiliary
tools that assist officers in the process of incrimination." The
statement went on: "In any case, an independent examination by an
[intelligence] analyst is required, which verifies that the identified
targets are legitimate targets for attack, in accordance with the
conditions set forth in IDF directives and international law."
However, sources said that the only human supervision protocol in place
before bombing the houses of suspected "junior" militants marked by
Lavender was to conduct a single check: ensuring that the AI-selected
target is male rather than female. The assumption in the army was that
if the target was a woman, the machine had likely made a mistake,
because there are no women among the ranks of the military wings of
Hamas and PIJ.
"A human being had to [verify the target] for just a few seconds," B.
said, explaining that this became the protocol after realizing the
Lavender system was "getting it right" most of the time. "At first, we
did checks to ensure that the machine didn't get confused. But at some
point we relied on the automatic system, and we only checked that [the
target] was a man; that was enough. It doesn't take a long time to
tell if someone has a male or a female voice."
To conduct the male/female check, B. claimed that in the current war,
"I would invest 20 seconds for each target at this stage, and do dozens
of them every day. I had zero added value as a human, apart from being a
stamp of approval. It saved a lot of time. If [the operative] came up
in the automated mechanism, and I checked that he was a man, there would
be permission to bomb him, subject to an examination of collateral
damage."
Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90)
In practice, sources said this meant that for civilian men marked in
error by Lavender, there was no supervising mechanism in place to detect
the mistake. According to B., a common error occurred "if the [Hamas]
target gave [his phone] to his son, his older brother, or just a
random man. That person will be bombed in his house with his family.
This happened often. These were most of the mistakes caused by
Lavender," B. said.
STEP 2: LINKING TARGETS TO FAMILY HOMES
'Most of the people you killed were women and children'
The next stage in the Israeli army's assassination procedure is
identifying where to attack the targets that Lavender
generates.
In a statement to +972 and Local Call, the IDF Spokesperson claimed in
response to this article that "Hamas places its operatives and military
assets in the heart of the civilian population, systematically uses the
civilian population as human shields, and conducts fighting from within
civilian structures, including sensitive sites such as hospitals,
mosques, schools and UN facilities. The IDF is bound by and acts
according to international law, directing its attacks only at military
targets and military operatives."
The six sources we spoke to echoed this to some degree, saying that
Hamas' extensive tunnel system deliberately passes under hospitals and schools; that Hamas militants use ambulances to get around; and that countless military assets have been situated near civilian buildings. The sources argued that many Israeli strikes kill civilians as a result of these tactics by Hamas, a characterization that human rights groups warn evades Israel's onus for inflicting the casualties.
However, in contrast to the Israeli army's official statements, the
sources explained that a major reason for the unprecedented death toll
from Israel's current bombardment is the fact that the army has
systematically attacked targets in their private homes, alongside their
families, in part because it was easier from an intelligence standpoint to mark family houses using automated systems.
Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel's system of mass surveillance in Gaza is designed.
Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)
The sources told +972 and Local Call that since everyone in Gaza had a
private house with which they could be associated, the army's
surveillance systems could easily and automatically "link" individuals
to family houses. In order to identify the moment operatives enter their
houses in real time, various additional automated software programs have been
developed. These programs track thousands of individuals simultaneously,
identify when they are at home, and send an automatic alert to the
targeting officer, who then marks the house for bombing. One of several
of these tracking programs, revealed here for the first time, is called
"Where's Daddy?"
"You put hundreds [of targets] into the system and wait to see who
you can kill," said one source with knowledge of the system. "It's
called broad hunting: you copy-paste from the lists that the target
system produces."
Evidence of this policy is also clear from the data: during the first
month of the war, more than half of the fatalities, 6,120 people, belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures.
The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel's deadliest war on the Strip), further suggesting the prominence of this policy.
Another source said that each time the pace of assassinations waned,
more targets were added to systems like Where's Daddy? to locate
individuals that entered their homes and could therefore be bombed. He
said that the decision of who to put into the tracking systems could be
made by relatively low-ranking officers in the military hierarchy.
"One day, totally of my own accord, I added something like 1,200 new
targets to the [tracking] system, because the number of attacks [we
were conducting] decreased," the source said. "That made sense to me.
In retrospect, it seems like a serious decision I made. And such
decisions were not made at high levels."
The sources said that in the first two weeks of the war, "several
thousand" targets were initially inputted into locating programs like
Where's Daddy?. These included all the members of Hamas' elite special
forces unit the Nukhba, all of Hamas' anti-tank operatives, and anyone
who entered Israel on October 7. But before long, the kill list was
drastically expanded.
"In the end it was everyone [marked by Lavender]," one source
explained. "Tens of thousands. This happened a few weeks later, when the
[Israeli] brigades entered Gaza, and there were already fewer
uninvolved people [i.e. civilians] in the northern areas." According
to this source, even some minors were marked by Lavender as targets for
bombing. "Normally, operatives are over the age of 17, but that was not
a condition."
Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills)
Lavender and systems like Where's Daddy? were thus combined with deadly
effect, killing entire families, sources said. By adding a name from the
Lavender-generated lists to the Where's Daddy? home tracking system, A.
explained, the marked person would be placed under ongoing surveillance,
and could be attacked as soon as they set foot in their home, collapsing
the house on everyone inside.
"Let's say you calculate [that there is one] Hamas [operative] plus
10 [civilians in the house]," A. said. "Usually, these 10 will be
women and children. So absurdly, it turns out that most of the people
you killed were women and children."
STEP 3: CHOOSING A WEAPON
'We usually carried out the attacks with "dumb bombs"'
Once Lavender has marked a target for assassination, army personnel
have verified that they are male, and tracking software has located the
target in their home, the next stage is picking the munition with which
to bomb them.
In December 2023, CNN reported that, according to U.S. intelligence estimates, about 45 percent of the
munitions used by the Israeli air force in Gaza were "dumb" bombs, which
are known to cause more collateral damage than guided bombs. In response
to the CNN report, an army spokesperson quoted in the article said:
"As a military committed to international
law and a moral code of conduct, we are devoting vast resources to
minimizing harm to the civilians that Hamas has forced into the role of
human shields. Our war is against Hamas, not against the people of
Gaza."
Three intelligence sources, however, told +972 and Local Call that
junior operatives marked by Lavender were assassinated only with dumb
bombs, in the interest of saving more expensive armaments. The
implication, one source explained, was that the army would not strike a
junior target if they lived in a high-rise building, because the army
did not want to spend a more precise and expensive "floor bomb" (with
more limited collateral effect) to kill him. But if a junior target
lived in a building with only a few floors, the army was authorized to
kill him and everyone in the building with a dumb
bomb.
Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)
"It was like that with all the junior targets," testified C., who used
various automated programs in the current war. "The only question was,
is it possible to attack the building in terms of collateral damage?
Because we usually carried out the attacks with dumb bombs, and that
meant literally destroying the whole house on top of its occupants. But
even if an attack is averted, you don't care; you immediately move on
to the next target. Because of the system, the targets never end. You
have another 36,000 waiting."
STEP 4: AUTHORIZING CIVILIAN CASUALTIES
'We attacked almost without considering collateral damage'
One source said that when attacking junior operatives, including those
marked by AI systems like Lavender, the number of civilians they were
allowed to kill alongside each target was fixed during the initial weeks
of the war at up to 20. Another source claimed the fixed number was up
to 15. These "collateral damage degrees," as the military calls them,
were applied broadly to all suspected junior militants, the sources
said, regardless of their rank, military importance, and age, and with
no specific case-by-case examination to weigh the military advantage of
assassinating them against the expected harm to civilians.
According to A., who was an officer in a target operation room in the
current war, the army's international law department has never before
given such "sweeping approval" for such a high collateral damage degree.
"It's not just that you can kill any person who is a Hamas soldier,
which is clearly permitted and legitimate in terms of international
law," A. said. "But they directly tell you: 'You are allowed to kill
them along with many civilians.'
"Every person who wore a Hamas uniform in the past year or two could be
bombed with 20 [civilians killed as] collateral damage, even without
special permission," A. continued. "In practice, the principle of
proportionality did not exist."
According to A., this was the policy for most of the time that he
served. Only later did the military lower the collateral damage degree.
"In this calculation, it could also be 20 children for a junior
operative ... It really wasn't like that in the past," A. explained.
Asked about the security rationale behind this policy, A. replied:
"Lethality."
Palestinians wait to receive the bodies of their relatives who were killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90)
The predetermined and fixed collateral damage degree helped accelerate
the mass creation of targets using the Lavender machine, sources said,
because it saved time. B. claimed that the number of civilians they were
permitted to kill in the first week of the war per suspected junior
militant marked by AI was fifteen, but that this number "went up and
down" over time.
"At first we attacked almost without considering collateral damage," B.
said of the first week after October 7. "In practice, you didn't really
count people [in each house that is bombed], because you couldn't
really tell if they're at home or not. After a week, restrictions on
collateral damage began. The number dropped [from 15] to five, which
made it really difficult for us to attack, because if the whole family
was home, we couldn't bomb it. Then they raised the number
again."