---
_id: '58472'
abstract:
- lang: eng
  text: The “kill chain”—involving the analysis of data by human users of military
    technologies, the understanding of that data, and human decisions—has fast been
    replaced by the “kill cloud” that necessitates, allows, and exacerbates increased
    thirst for domination, violence against distant populations, and a culture of
    experimentation with human lives. This commentary reports an interdisciplinary
    discussion organised by the Disruption Network Lab that brought together whistleblowers,
    artists, and experts investigating the impact of artificial intelligence and other
    emerging technologies on networked warfare. The discussion exposed the problematics
    of networked warfare and the kill cloud, their colonial overtones, their effects
    on human subjects in real life, their erroneous scientific rationalities, and the
    (business) practices and logics that enable this algorithmic machinery of violence.
    The conference
    took place from the 29th of November to the 1st of December 2024 at the Kunstquartier
    Bethanien in Berlin, Germany.
article_type: review
author:
- first_name: Ishmael
  full_name: Bhila, Ishmael
  id: '105772'
  last_name: Bhila
citation:
  ama: 'Bhila I. Investigating the kill cloud: information warfare, autonomous weapons
    &#38; AI. <i>Digital War</i>. 2025;6(4). doi:<a href="https://doi.org/10.1057/s42984-025-00101-x">10.1057/s42984-025-00101-x</a>'
  apa: 'Bhila, I. (2025). Investigating the kill cloud: information warfare, autonomous
    weapons &#38; AI. <i>Digital War</i>, <i>6</i>(4). <a href="https://doi.org/10.1057/s42984-025-00101-x">https://doi.org/10.1057/s42984-025-00101-x</a>'
  bibtex: '@article{Bhila_2025, title={Investigating the kill cloud: information warfare,
    autonomous weapons &#38; AI}, volume={6}, DOI={<a href="https://doi.org/10.1057/s42984-025-00101-x">10.1057/s42984-025-00101-x</a>},
    number={4}, journal={Digital War}, publisher={Springer Science and Business Media
    LLC}, author={Bhila, Ishmael}, year={2025} }'
  chicago: 'Bhila, Ishmael. “Investigating the Kill Cloud: Information Warfare, Autonomous
    Weapons &#38; AI.” <i>Digital War</i> 6, no. 4 (2025). <a href="https://doi.org/10.1057/s42984-025-00101-x">https://doi.org/10.1057/s42984-025-00101-x</a>.'
  ieee: 'I. Bhila, “Investigating the kill cloud: information warfare, autonomous
    weapons &#38; AI,” <i>Digital War</i>, vol. 6, no. 4, 2025, doi: <a href="https://doi.org/10.1057/s42984-025-00101-x">10.1057/s42984-025-00101-x</a>.'
  mla: 'Bhila, Ishmael. “Investigating the Kill Cloud: Information Warfare, Autonomous
    Weapons &#38; AI.” <i>Digital War</i>, vol. 6, no. 4, Springer Science and Business
    Media LLC, 2025, doi:<a href="https://doi.org/10.1057/s42984-025-00101-x">10.1057/s42984-025-00101-x</a>.'
  short: I. Bhila, Digital War 6 (2025).
date_created: 2025-01-31T13:40:36Z
date_updated: 2025-01-31T13:43:54Z
doi: 10.1057/s42984-025-00101-x
intvolume: '6'
issue: '4'
keyword:
- autonomous weapons systems
- algorithmic warfare
- cloud computing
- war on terror
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://link.springer.com/article/10.1057/s42984-025-00101-x#article-info
oa: '1'
publication: Digital War
publication_identifier:
  issn:
  - 2662-1975
  - 2662-1983
publication_status: published
publisher: Springer Science and Business Media LLC
status: public
title: 'Investigating the kill cloud: information warfare, autonomous weapons & AI'
type: journal_article
user_id: '105772'
volume: 6
year: '2025'
...
---
_id: '56279'
abstract:
- lang: eng
  text: <jats:title>Abstract</jats:title><jats:p>Biases in artificial intelligence
    have been flagged in academic and policy literature for years. Autonomous weapons
    systems—defined as weapons that use sensors and algorithms to select, track, target,
    and engage targets without human intervention—have the potential to mirror systems
    of societal inequality that reproduce algorithmic bias. This article argues that
    the problem of ingrained algorithmic bias poses a greater challenge to autonomous
    weapons systems developers than most other risks discussed in the Group of Governmental
    Experts on Lethal Autonomous Weapons Systems (GGE on LAWS), which should be reflected
    in the outcome documents of these discussions. This is mainly because it takes
    longer to rectify a discriminatory algorithm than it does to issue an apology
    for a mistake that occurs occasionally. Highly militarised states have controlled
    both the discussions and their outcomes, which have focused on issues that are
    pertinent to them while ignoring what is existential for the rest of the world.
    Various calls from civil society, researchers, and smaller states for a legally
    binding instrument to regulate the development and use of autonomous weapons systems
    have always included the call for recognising algorithmic bias in autonomous weapons,
    which has not been reflected in discussion outcomes. This paper argues that any
    ethical framework developed for the regulation of autonomous weapons systems should,
    in detail, ensure that the development and use of autonomous weapons systems do
    not prejudice against vulnerable sections of (global) society.</jats:p>
author:
- first_name: Ishmael
  full_name: Bhila, Ishmael
  id: '105772'
  last_name: Bhila
citation:
  ama: Bhila I. Putting algorithmic bias on top of the agenda in the discussions on
    autonomous weapons systems. <i>Digital War</i>. Published online 2024. doi:<a
    href="https://doi.org/10.1057/s42984-024-00094-z">10.1057/s42984-024-00094-z</a>
  apa: Bhila, I. (2024). Putting algorithmic bias on top of the agenda in the discussions
    on autonomous weapons systems. <i>Digital War</i>. <a href="https://doi.org/10.1057/s42984-024-00094-z">https://doi.org/10.1057/s42984-024-00094-z</a>
  bibtex: '@article{Bhila_2024, title={Putting algorithmic bias on top of the agenda
    in the discussions on autonomous weapons systems}, DOI={<a href="https://doi.org/10.1057/s42984-024-00094-z">10.1057/s42984-024-00094-z</a>},
    journal={Digital War}, publisher={Springer Science and Business Media LLC}, author={Bhila,
    Ishmael}, year={2024} }'
  chicago: Bhila, Ishmael. “Putting Algorithmic Bias on Top of the Agenda in the Discussions
    on Autonomous Weapons Systems.” <i>Digital War</i>, 2024. <a href="https://doi.org/10.1057/s42984-024-00094-z">https://doi.org/10.1057/s42984-024-00094-z</a>.
  ieee: 'I. Bhila, “Putting algorithmic bias on top of the agenda in the discussions
    on autonomous weapons systems,” <i>Digital War</i>, 2024, doi: <a href="https://doi.org/10.1057/s42984-024-00094-z">10.1057/s42984-024-00094-z</a>.'
  mla: Bhila, Ishmael. “Putting Algorithmic Bias on Top of the Agenda in the Discussions
    on Autonomous Weapons Systems.” <i>Digital War</i>, Springer Science and Business
    Media LLC, 2024, doi:<a href="https://doi.org/10.1057/s42984-024-00094-z">10.1057/s42984-024-00094-z</a>.
  short: I. Bhila, Digital War (2024).
date_created: 2024-09-30T11:31:29Z
date_updated: 2024-09-30T11:55:06Z
doi: 10.1057/s42984-024-00094-z
language:
- iso: eng
publication: Digital War
publication_identifier:
  issn:
  - 2662-1975
  - 2662-1983
publication_status: published
publisher: Springer Science and Business Media LLC
status: public
title: Putting algorithmic bias on top of the agenda in the discussions on autonomous
  weapons systems
type: journal_article
user_id: '105772'
year: '2024'
...
