NewsWave

Israel using AI to identify human targets raising fears that innocents are being caught in the net

12 April 2024


A report by Jerusalem-based investigative journalists, published in +972 Magazine, finds that AI targeting systems have played a key role in identifying – and potentially misidentifying – tens of thousands of targets in Gaza. This suggests that autonomous warfare is no longer a future scenario: it is already here, and the consequences are horrifying.

There are two technologies in question. The first, “Lavender”, is an AI recommendation system that uses algorithms to identify Hamas operatives as targets. The second, the grotesquely named “Where’s Daddy?”, is a system that tracks targets geographically so they can be followed into their family residences before being attacked. Together, these two systems automate the find-fix-track-target components of what modern militaries call the “kill chain”.

Systems such as Lavender are not autonomous weapons, but they do accelerate the kill chain and make the process of killing progressively more autonomous. AI targeting systems draw on data from computer sensors and other sources to statistically assess what constitutes a potential target. Vast amounts of this data are gathered by Israeli intelligence through surveillance on the 2.3 million inhabitants of Gaza.

Such systems are trained on a set of data to produce the profile of a Hamas operative: data about gender, age, appearance, movement patterns, social network relationships, accessories and other “relevant features”. They then match actual Palestinians against this profile by degree of fit. What counts as a relevant feature of a target can be set as stringently or as loosely as desired. In the case of Lavender, one of the key equations seems to have been “male equals militant”. This echoes the infamous “all military-aged males are potential targets” mandate of the US drone wars of the 2010s, in which the Obama administration identified and assassinated hundreds of people designated as enemies “based on metadata”.

What is different with AI in the mix is the speed at which targets can be algorithmically determined and the mandate for action this confers. The +972 report indicates that the use of this technology has led to the dispassionate annihilation of thousands of eligible – and ineligible – targets at speed and without much human oversight.

The Israel Defense Forces (IDF) were swift to deny using AI targeting systems of this kind, and it is difficult to verify independently whether, and to what extent, they have been used, or how exactly they function. But the functionalities the report describes are entirely plausible, especially given the IDF’s own boast of being “one of the most technological organisations” and an early adopter of AI.

With military AI programs around the world striving to shorten what the US military calls the “sensor-to-shooter timeline” and “increase lethality” in their operations, why would an organisation such as the IDF not avail themselves of the latest technologies?

The fact is, systems such as Lavender and Where’s Daddy? are the manifestation of a broader trend that has been under way for a good decade, and the IDF and its elite units are far from the only ones seeking to embed AI targeting systems in their processes.

When machines trump humans

Earlier this year, Bloomberg reported on the latest version of Project Maven, the US Department of Defense AI pathfinder programme, which has evolved from being a sensor data analysis programme in 2017 to a full-blown AI-enabled target recommendation system built for speed. As Bloomberg journalist Katrina Manson reports, the operator “can now sign off on as many as 80 targets in an hour of work, versus 30 without it”.

Directed by a machine: IDF forces are increasingly using artificial intelligence to identify and track suspected Hamas fighters in Gaza.
The Yomiuri Shimbun via AP Images

Manson quotes a US army officer tasked with learning the system, describing the process of concurring with the algorithm’s conclusions, delivered in a rapid staccato: “Accept. Accept. Accept”. Evident here is how deeply the human operator is embedded in digital logics that are difficult to contest. This gives rise to a logic of speed and increased output that trumps all else.

The efficient production of death is also reflected in the +972 account, which describes enormous pressure to accelerate and increase both the production of targets and the killing of those targets. As one of the sources says: “We were constantly being pressured: bring us more targets. They really shouted at us. We finished [killing] our targets very quickly”.

Built-in biases

Systems like Lavender raise many ethical questions pertaining to training data, biases, accuracy, error rates and, importantly, questions of automation bias. Automation bias cedes all authority, including moral authority, to the dispassionate interface of statistical processing.

Speed and lethality are the watchwords for military tech. But when AI is prioritised, the scope for human agency is marginalised; the logic of the system requires this, owing to humans’ comparatively slow cognition. It also erodes the human sense of responsibility for computer-produced outcomes.

I’ve written elsewhere how this complicates notions of control (at all levels) in ways that we must take into consideration. When AI, machine learning and human reasoning form a tight ecosystem, the capacity for human control is limited. Humans have a tendency to trust whatever computers say, especially when they move too fast for us to follow.

The problem of speed and acceleration also produces a general sense of urgency, which privileges action over non-action. This turns categories such as “collateral damage” or “military necessity”, which should serve as a restraint to violence, into channels for producing more violence.

I am reminded of the military scholar Christopher Coker’s words: “we must choose our tools carefully, not because they are inhumane (all weapons are) but because the more we come to rely on them, the more they shape our view of the world”. It is clear that military AI shapes our view of the world. Tragically, Lavender gives us cause to realise that this view is laden with violence.



Copyright © 2025 News Wave