Autonomous Weapon Systems and Compliance with International Humanitarian Law
It is undisputed that autonomous weapon systems (AWS) must be developed and used in compliance with international humanitarian law (IHL). It is also well-established that human–machine interaction must be taken into account to ensure compliance with IHL. However, how the rules of IHL should be interpreted and applied in the context of AWS remains, in some respects, unclear or disputed.
Zooming in on the specific case of AWS, this session aims to provide greater clarity on what IHL permits, prohibits, and requires concerning the development and use of military applications of AI. In particular, experts and participants are invited to reflect on the extent to which (if at all) IHL-mandated evaluations may be entrusted, fully or partially, to technical processes rather than to natural persons.
Organiser: Stockholm International Peace Research Institute