Created by Dr. Alexander Leveringhaus, who specializes in moral responsibility and robotic weapons, this initiative analyses how militaries can design ethically responsible combat systems using increasingly sophisticated and potentially autonomous technology.
Based on these touchpoints, states should be prepared to explain how control is applied over existing weapons systems, especially those with certain autonomous or automatic functions.
Tuesday, 12 April is dedicated entirely to discussing how to move "towards a working definition" of lethal autonomous weapons systems.
This chapter commences with an examination of the emerging technology supporting these sophisticated systems, detailing the autonomous features currently being designed for weapons and anticipating how technological advances might be incorporated into future weapon systems.
Project Narrative: We address how technological developments in artificial intelligence (AI) affect the relationships between society, AI and autonomous weapons systems.
Any statements renouncing these weapons systems are welcome and show how the discourse and debate within the armed forces of various countries is increasingly focusing not only on questions relating to the legality of fully autonomous weapons but also on much bigger concerns.
The 58-page Keeping Control report by Daan Kayser provides an overview of the positions of European states on lethal autonomous weapon systems, including on the call for a ban and on how to ensure weapons systems remain under meaningful human control.
"... New Zealand will develop a position on [lethal autonomous weapons systems] in concert with other governments when the international community is clearer about their potential impact and when there is a clearer understanding about how a line could be drawn between automated and autonomous weapons."