
Smith, Bryant Walker --- "Lawyers and engineers should speak the same robot language" [2016] ELECD 232; in Calo, Ryan; Froomkin, Michael A.; Kerr, Ian (eds), "Robot Law" (Edward Elgar Publishing, 2016) 78

Book Title: Robot Law

Editor(s): Calo, Ryan; Froomkin, Michael A.; Kerr, Ian

Publisher: Edward Elgar Publishing

ISBN (hard cover): 9781783476725

Section: Chapter 4

Section Title: Lawyers and engineers should speak the same robot language

Author(s): Smith, Bryant Walker

Number of pages: 24

Abstract/Description:

Lawyers and engineers can, and should, speak to each other in the same language. Both law and engineering are concerned with the actual use of the products they create or regulate; both employ similar concepts and terms, and their roles interconnect. Yet confusion and inconsistency can leave a regulator’s system boundaries wholly incongruous with a developer’s system. This chapter emphasizes the importance of four concepts – systems, language, use, and users – to the development, regulation, and safety of robots. To guide the discussion, the author uses motor vehicle automation as an example and references a number of technical documents. The author finds that defining a system’s boundaries is a key conceptual challenge. Inconsistency in the use of language – particularly of the terms control, risk, safety, reasonableness, efficiency, and responsibility – leads to unnecessary confusion. Furthermore, there is no uniform understanding of “safety” from a technical, much less a legal, perspective. The author discusses how several concepts and terms are susceptible to multiple meanings and suggests more effective uses of them. Developers and regulators have interconnecting roles in ensuring the safety of robots and must thoughtfully coordinate the technical and legal domains without conflating them. Additionally, humans should be understood as part of the systems themselves, because they remain a key part of the design and use of automated systems. The systems analysis introduced in this chapter reveals the conceptual, linguistic, and practical difficulties that developers and regulators will confront on the path toward increasing automation. Sensibly defining automated systems requires a thoughtful dialogue between the legal and technical domains, conducted in the same robot language.


URL: http://www.austlii.edu.au/au/journals/ELECD/2016/232.html