Israel’s AI identified 37K Palestinians for strikes with suspected militant scoring system

Israel has deployed artificial intelligence systems in Gaza that assign numerical scores to Palestinian civilians based on suspected militant affiliations, with one program identifying 37,000 potential targets in the war’s early weeks. These AI-powered targeting systems, including programs called “Lavender” and “Where’s Daddy,” represent what experts describe as a live testing ground for military technologies that will likely be exported globally, raising concerns about the future proliferation of AI-enabled warfare and surveillance tools.

The big picture: Israel has positioned itself as a leader in battlefield-tested AI weapons; its military described its 2021 Gaza operation as the “first AI war,” and the current conflict has escalated AI-assisted targeting to unprecedented levels.

How the targeting works: The Lavender program uses machine learning to assign Gaza residents numerical scores indicating their suspected likelihood of being Hamas members.

  • In the war’s opening weeks, the system flagged some 37,000 Palestinians, along with their homes, as potential targets based on assumed connections to Hamas.
  • Targeting criteria are alarmingly broad: being a young male, living in certain areas of Gaza, or displaying particular communication patterns can be deemed sufficient grounds for targeting.
  • An Israeli intelligence officer told +972 Magazine that “civil defense personnel, police officers” were flagged as targets, though “it would be a shame to waste bombs” on them.

Human oversight concerns: Despite claims of human approval, sources indicate the process has become largely automated with minimal meaningful review.

  • Human operators reportedly spend only “20 seconds” reviewing each AI-generated target, serving as a “rubber stamp” for machine decisions.
  • The primary human check involves confirming the target is male, with one source explaining this approach has made “human decision-making too robotic.”
  • Israeli officers acknowledged that the Lavender system makes “mistakes” in roughly 10 percent of cases.

Civilian casualties: The AI-enhanced targeting has coincided with unprecedented civilian harm in Gaza.

  • In October 2023 alone, at least 5,139 civilians, including 1,900 children, were killed, the highest monthly civilian death toll recorded since 2014.
  • Israel raised its acceptable civilian casualty threshold to 20 at the war’s start and permitted strikes that could “harm more than 100 civilians” on a case-by-case basis.
  • The majority of deaths occurred in residential buildings, where entire families were often killed together, an average of 15 family members per incident.

U.S. tech company involvement: Major American technology firms provide crucial infrastructure supporting Israel’s AI warfare capabilities.

  • In 2021, Google and Amazon signed a $1.2 billion contract with Israel, known as Project Nimbus, to help “store, process, and analyze data, including facial recognition, emotion recognition, biometrics and demographic information.”
  • Palantir, a data analytics company, collaborates with Israeli military organizations, and Israel’s AI programs rely on intelligence data about Palestinians, including information from the U.S. National Security Agency.
  • Other Silicon Valley companies involved include Shield AI, providing self-piloting drones, and Skydio, supplying reconnaissance drones.

Global export concerns: Technologies tested in Gaza are positioned for international sale as “battle-tested” solutions.

  • Israel is already one of the world’s largest arms exporters relative to its size, with AI technologies becoming core components of its defense export portfolio.
  • Surveillance platforms and automated targeting systems could be sold to authoritarian governments seeking to suppress dissent or control marginalized populations.
  • Matt Mahmoudi of Amnesty International warns that “U.S. technology companies contracting with Israeli defense authorities have had little insight or control over how their products are used.”

What they’re saying: Israeli military officials have emphasized the speed and scale advantages of AI targeting.

  • Lt. Gen. Aviv Kochavi noted the AI integration allows the IDF to identify as many targets in a month as it previously did in a year.
  • A colonel serving as chief of the IDF “target bank” told The Jerusalem Post that “AI targeting capabilities had for the first time helped the IDF cross the point where they can assemble new targets even faster than the rate of attacks.”
