- International humanitarian law continues to apply fully to all weapons systems,
including the potential development and use of lethal autonomous weapons systems.
This means that it continues to be humans, not machines, who are held accountable for complying with the law, and that human control is still required for force to be lawfully applied. The relevance of IHL remains a point of contention among states, however: some, such as the USA, maintain that existing IHL is sufficient to regulate lethal autonomous weapons systems, while others argue that it does not adequately address them. This is particularly contentious with respect to Article 36 reviews and the sharing of best practices, with many pointing out that the complex "black box" nature of functions enabled by machine learning cannot be properly assessed under the current legal review process, a gap that could undermine the process entirely.
- Human responsibility for decisions on the use of weapons systems must be retained
since accountability cannot be transferred to machines. This should be considered
across the entire life cycle of the weapons system.
- Human-machine interaction, which may take various forms and be implemented at
various stages of the life cycle of a weapon, should ensure that the potential use of
weapons systems based on emerging technologies in the area of lethal autonomous
weapons systems is in compliance with applicable international law, in particular
IHL. In determining the quality and extent of human-machine interaction, a range
of factors should be considered including the operational context, and the
characteristics and capabilities of the weapons system as a whole.
- Accountability for developing, deploying and using any emerging weapons system
in the framework of the CCW must be ensured in accordance with applicable
international law, including through the operation of such systems within a
responsible chain of human command and control.
- In accordance with States’ obligations under international law, in the study,
development, acquisition, or adoption of a new weapon, means or method of
warfare, a determination must be made whether its employment would, in some or all
circumstances, be prohibited by international law.
- When developing or acquiring new weapons systems based on emerging
technologies in the area of lethal autonomous weapons systems, physical security,
appropriate non-physical safeguards (including cyber-security against hacking or
data spoofing), the risk of acquisition by terrorist groups and the risk of
proliferation should be considered.
- Risk assessments and mitigation measures should be part of the design,
development, testing and deployment cycle of emerging technologies in any
weapons systems.
- Consideration should be given to the use of emerging technologies in the area of
lethal autonomous weapons systems in upholding compliance with IHL and other
applicable international legal obligations.
- In crafting potential policy measures, emerging technologies in the area of lethal
autonomous weapons systems should not be anthropomorphized.
- Discussions and any potential policy measures taken within the context of the CCW
should not hamper progress in or access to peaceful uses of intelligent autonomous
technologies.
- The CCW offers an appropriate framework for dealing with the issue of emerging
technologies in the area of lethal autonomous weapons systems within the context
of the objectives and purposes of the Convention, which seeks to strike a balance
between military necessity and humanitarian considerations.