
usonian

(24,636 posts)
Wed Mar 4, 2026, 03:49 PM 18 hrs ago

Autonomous Weapon Systems and International Humanitarian Law: Selected Issues (International Red Cross)

https://www.icrc.org/en/article/autonomous-weapon-systems-and-international-humanitarian-law-selected-issues

ICRC’s concerns about AWS
In the ICRC’s view, as well as many States and other actors, AWS are weapon systems that, once activated, can select and engage one or more targets without further human intervention. After initial activation or launch, an autonomous weapon system triggers a strike in response to information from the environment received through sensors and on the basis of a generalized "target profile". As a result, the user does not choose, or even know, the specific target(s) and the precise timing and/or location of the resulting application of force.

The use of AWS entails serious risks due to the difficulty of anticipating and limiting their effects. The loss of human control and judgement in decisions over life and death raises profound humanitarian, legal and ethical concerns. In particular, AWS:


• pose risks of harm to those affected by armed conflict, both civilians and combatants, as well as dangers of conflict escalation;
• raise challenges for compliance with international law, including IHL, notably the rules on the conduct of hostilities; and
• raise fundamental ethical concerns by delegating life-and-death decisions to machines, which diminishes both the moral agency of the users and the human dignity of those against whom force is used.


Regardless of the sophistication of AWS and associated sensor, software and robotics technologies, it is important to emphasize that IHL obligations regarding the conduct of hostilities must always be fulfilled by humans. It is not the weapon system that must comply with IHL, but the humans using it.

Download position paper (473K PDF)
https://www.icrc.org/sites/default/files/2026-03/4896_002_Autonomous_Weapons_Systems_-_IHL-ICRC.pdf
Autonomous Weapon Systems and International Humanitarian Law: Selected Issues (International Red Cross) (Original Post) usonian 18 hrs ago OP
About a year ago... 2naSalit 16 hrs ago #1
Humans? We don't need no stinkin' humans. usonian 16 hrs ago #2

2naSalit

(101,905 posts)
1. About a year ago...
Wed Mar 4, 2026, 05:24 PM 16 hrs ago

"They" predicted AI would destroy humanity within a couple years. Here we are.
