Edited Legal Collections Data

Margulies, Peter --- "Making autonomous weapons accountable: command responsibility for computer-guided lethal force in armed conflicts" [2017] ELECD 1282; in Ohlin, Jens David (ed), "Research Handbook on Remote Warfare" (Edward Elgar Publishing, 2017) 405

Book Title: Research Handbook on Remote Warfare

Editor(s): Ohlin, Jens David

Publisher: Edward Elgar Publishing

ISBN (hard cover): 9781784716981

Section: Chapter 13

Section Title: Making autonomous weapons accountable: command responsibility for computer-guided lethal force in armed conflicts

Author(s): Margulies, Peter

Number of pages: 38

Abstract/Description:

In debates about autonomous weapons systems (AWS) in armed conflict, narratives have their own momentum. One could frame AWS as a variant of driverless cars: a means to reduce the havoc and mayhem caused by human error. Indeed, some commentators view AWS in armed conflict as a potential cure for defects in human perception and judgment. In contrast, AWS opponents warn of killer robots going rogue, and urge a ban on the development and deployment of AWS. Proponents of a ban often also raise the specter of impunity, asserting that it will be impossible to hold a human accountable for the mistakes of a computer (a ‘machine’ or ‘agent’, in data scientists’ parlance). This chapter argues that a ban on AWS is unwise. Adaptations in current procedures for the deployment and use of weapons can ensure that any AWS used in the field complies with international humanitarian law (IHL). Solving the AWS accountability problem hinges on the doctrine of command responsibility, applied in a three-pronged approach that the chapter calls ‘dynamic diligence’. Dynamic diligence is a demanding standard. First, it requires continual adjustments in the machine-human interface, performed within a military command structure staffed by persons who possess specialized knowledge of an AWS’s risks and benefits. Second, dynamic diligence requires ongoing assessment of the AWS’s compliance with IHL. This assessment starts with validation at the weapons review stage, prior to an AWS’s deployment, and includes frequent, periodic assessments of an AWS’s learning in the field, to ensure that field calculations enabled by the machine’s software are IHL-compliant.


URL: http://www.austlii.edu.au/au/journals/ELECD/2017/1282.html