The technical sessions begin on Monday afternoon and Tuesday morning with a review of the state of play on research and development of autonomous weapons systems, as well as an exchange on the military rationale for autonomous functions in weapons systems.
Although autonomous weapons systems as described herein have not yet been deployed and the extent of their development as a military technology remains unclear, discussion of such questions must begin immediately and not once the technology has been developed and proliferated.
Artificial intelligence experts in Australia, Belgium, and Canada appealed to their respective government leaders to support the call to ban fully autonomous weapons systems as a matter of urgency.
Professor Stuart Russell described an emerging consensus in the artificial intelligence and robotics community against autonomous weapons systems, saying «most don't want to build systems that will kill.»
Mines Action Canada, as a co-founder of the Campaign to Stop Killer Robots, believes that the way forward must lead to a pre-emptive ban on autonomous weapons systems as a tool to prevent humanitarian harm without damaging research and development on autonomy and robotics for military or civilian purposes.
These are some of the questions we have been exploring in the domain of self-driving vehicles, care robots, as well as lethal autonomous weapons systems, or LAWS.
The advent of new weapons technologies such as fully autonomous weapons systems only underlines the need for us to continue to work together to ensure that the principles which guide us continue to be upheld.
He says questions about where we are heading on lethal autonomous weapons systems are warranted, as «the future is not a destiny, it is a choice.»
When developing new weapon systems, both LAWS and weapon systems with more advanced autonomous functions in general, states should remain within the boundaries of international law.
AAR Japan finds that technological developments aimed at lowering cost and risk to human soldiers, as well as increasing speed and efficiency, have led to the development of autonomous weapons systems with various levels of human control.
At the campaign's opening briefing, Nobel Peace Laureate Ms. Jody Williams of the Nobel Women's Initiative, a campaign co-founder, rejected the notion that autonomous weapons systems are «inevitable», stating «this is a decision that we as human beings can make.»
On Friday, 12 December, states attending the United Nations meeting in Geneva will decide on future work to address concerns over fully autonomous weapons systems, known as lethal autonomous weapons systems.
The UN Special Rapporteur on extrajudicial, summary or arbitrary executions, Professor Christof Heyns, is due to present his latest report (A/HRC/26/36) to the Human Rights Council on 12 June, recommending that the body «remain seized» with the issue of autonomous weapons systems and «make its voice heard as the international debate unfolds.»
In the future, these concerns, and others, may apply also to the use of autonomous weapons systems, or what are known as «killer robots», which, once activated, can select and engage targets and operate in dynamic and changing environments without further human intervention.
These same principles must also apply to weapons which will be developed in the future, such as fully autonomous weapons systems.
The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons, also known as lethal autonomous weapons systems or killer robots.
Pakistan, 26 October: The development of new types of weapons, such as lethal autonomous weapon systems (LAWS), remains a source of concern for the international community.
In the event that states adopt a new CCW protocol on lethal autonomous weapons systems — where talks have been underway since 2014 and another round is due in April — the report states that «it will be natural for autonomous weapons to be added to the list of weapon types that provide grounds for the exclusion of companies under the Fund's ethical guidelines, in the same way as it has done» before.
... India supports continued discussions in the CCW on lethal autonomous weapons systems (LAWS) as per an agreed mandate.
Non-Aligned Movement — delivered by Indonesia, 8 October: NAM is of the view that lethal autonomous weapons systems (LAWS) raise a number of ethical, legal, moral and technical questions, as well as questions related to international peace and security, which should be thoroughly deliberated and examined in the context of conformity to international law, including international humanitarian law and international human rights law.
The fact that CCW States Parties are dealing with the issue of lethal autonomous weapons systems, as proposed by France in 2013, is a significant development with regard to the Convention.
With respect to the scope of what was discussed at Chatham House, Scharre's depiction of the conference as focused only on «anti-material» autonomous weapons systems is confusing, as the conference addressed all types of autonomous weapons systems, including «anti-personnel.»
Ireland made its first public statement on the matter at the UN General Assembly in September 2013, stating that «our focus must always be to ensure respect for international humanitarian law and human rights,» principles that «must also apply to weapons which will be developed in the future, such as fully autonomous weapons systems.»
In the «research priorities» document section on «Computer Science Research for Robust AI» (page 3), the authors note that «as autonomous systems become more prevalent in society, it becomes increasingly important that they robustly behave as intended,» and state that the development of autonomous weapons and other systems has «therefore stoked interest in high-assurance systems where strong robustness guarantees can be made.»
As the international debate over autonomous weapons systems emerges, the autonomous features in these systems are coming under scrutiny.
The report by Heyns, who addressed the 2014 experts meeting and issued a 2013 report calling for a moratorium on autonomous weapons systems, recommends that the Human Rights Council «remain seized» with the issue and «make its voice heard as the international debate unfolds.»
Most states are now calling for a legally-binding instrument on fully autonomous weapons, known at the CCW as «lethal autonomous weapons systems.»
The 2014 experts meeting concentrated on the role played by autonomous weapons systems in situations of armed conflict, in part because their possible use in law enforcement and other situations is seen as a matter better suited to the Human Rights Council.
The 22-page Where to draw the line report by Frank Slijper documents the trend towards increasing autonomy in weapon systems by identifying systems with the ability to select and attack targets with automated «critical» functions, such as loitering munitions, autonomous fighter aircraft, and automated ground systems with varying levels of human control.
Pakistan (10 October): «Pakistan also supports the commencement of negotiations in the CD on the Prevention of an Arms Race in Outer Space and Negative Security Assurances, as well as on contemporary issues such as chemical and biological terrorism, lethal autonomous weapon systems (LAWS), and cyber weapons.»
Lebanon (4 October): «There's also a need to determine the linkages with issues such as nuclear safety, autonomous lethal weapons, and the network systems through which drones operate.»
The letter links to a document outlining «research directions that can help maximize the societal benefit of AI» that includes a list of legal, ethical, and other questions relating to «lethal autonomous weapons systems,» also known as fully autonomous weapons or killer robots.
At this week's CCW Fifth Review Conference, China for the first time said it sees a need for a new international instrument on lethal autonomous weapons systems, as it questioned the adequacy of existing international law to deal with the challenges posed.
Asaro's tone was chilling as he contemplated autonomous weapon systems and armed artificial intelligences:
Deliberations on autonomous weapons systems should not be limited to considering transparency measures or article 36 weapons reviews, as several nations have noted.
At «informal consultations» on Monday 11 November, France — as chair of this week's Convention on Conventional Weapons meeting — proposed a mandate to «discuss questions related to emerging technologies» in the area of «lethal autonomous weapons systems.»
Professor Noel Sharkey of ICRAC described precursors to fully autonomous weapons as «systems that can select targets on their own and attack them on their own.»
Brehm said the campaign welcomes the strong interest by governments in discussing concerns relating to the weapons, as shown by the first meeting on lethal autonomous weapons systems chaired by France at the Convention on Conventional Weapons (CCW) in May 2014.
The International Committee of the Red Cross (ICRC) convened its second experts meeting on autonomous weapons systems on 15-16 March, attended by representatives from 20 countries as well as campaign members.
The report finds a lack of clarity as to who would be accountable if an autonomous weapons system violates international law, and notes that «proactive and future-oriented work in many fields is needed to counteract ‹the tendency of technological advance to outpace the social control of technology.›»