Autonomous Systems (AS) may offer significant societal benefits but will also create new types of incidents and accidents. The ability to access, explain and understand data related to failures or accidents will be a fundamental requirement for ensuring safety, establishing liability and maintaining public trust. Drawing on Responsible Research and Innovation principles, we analysed a particular AS – autonomous vehicles (AVs) – with three objectives:
- Investigate the ethical risks and legal implications related to the collection, access and use of data.
- Test the legal usefulness of datasets.
- Evaluate public acceptance of data recorders (‘black boxes’) for AS.
A data recording device, or Black Box (BB), can have a positive impact on at least three key aspects of the deployment of AS:
- Safety – in case of failure, a BB permits investigation of what went wrong and learning from accidents, fostering a no-blame safety culture.
- Accountability – by logging parameters related to the working of an AS, including the AI decision-making process and interactions with human and environmental factors, a BB facilitates legal investigation after an accident (a minimal sketch of such a log record follows below).
- Trust – the possibility of reconstructing the timeline leading up to an incident and attributing actions to the user, producer, maintainer, etc., especially when many parties are involved, can increase public trust in and acceptance of AS.
In the RoAD project we focused on recording devices for autonomous vehicles, drawing on state-of-the-art standards and regulations.
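As a purely illustrative aid (the field names and values below are hypothetical assumptions, not taken from any standard or from the RoAD project), a single BB log entry could be modelled as a timestamped record tying together vehicle state, the AI's decision and its rationale, and any human or environmental interaction, so that a post-incident timeline can be reconstructed and actions attributed:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional
import json

@dataclass
class BlackBoxRecord:
    """One hypothetical black-box log entry for an autonomous vehicle."""
    timestamp: str                    # UTC time of the logged event
    vehicle_state: dict               # e.g. speed, heading, brake/throttle status
    ai_decision: str                  # action chosen by the driving system
    ai_rationale: str                 # short, machine-readable reason for the decision
    human_interaction: Optional[str]  # e.g. "driver override", or None
    environment: dict = field(default_factory=dict)  # e.g. detected road users, weather

def append_record(record: BlackBoxRecord, path: str = "blackbox.log") -> None:
    """Append one record as a JSON line so the timeline can be replayed in order."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

# Example: the driving system brakes for a detected pedestrian (a VRU).
append_record(BlackBoxRecord(
    timestamp=datetime.now(timezone.utc).isoformat(),
    vehicle_state={"speed_kmh": 28.4, "brake": True},
    ai_decision="emergency_brake",
    ai_rationale="pedestrian detected in planned path",
    human_interaction=None,
    environment={"detected_vru": ["pedestrian"], "visibility": "night"},
))
```

Appending one record per line keeps entries in chronological order, which is what timeline reconstruction and attribution of actions after an incident depend on.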
UNECE’s GRVA working party has drafted regulatory provisions for two types of BBs specifically designed for AVs: the Event Data Recorder (EDR) and the Data Storage System (DSSAV). The EDR collects information on the vehicle’s behaviour and crash severity when a specific triggering event occurs (e.g. airbag activation). The DSSAV, by contrast, is ‘always on’ and collects information on interactions between the driver and the system. However, datasets retrieved from data recorders such as the EDR and DSSAV may not contain enough data, or the right data, to support practical and legal investigations, especially in collisions or near-miss events involving vulnerable road users (VRUs).
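To make the operational difference concrete (a minimal sketch under assumed trigger names and sampling rates; the actual UNECE-specified data elements differ), an EDR can be pictured as a short rolling buffer that is frozen only when a crash-severity trigger such as airbag deployment fires, whereas a DSSAV-style recorder continuously appends driver-system interactions:

```python
from collections import deque
from typing import Deque, Dict, List

class EventDataRecorder:
    """Sketch of an EDR: a short rolling buffer, frozen only when a trigger fires."""

    def __init__(self, buffer_samples: int = 300) -> None:
        # Rolling pre-crash buffer; older samples are dropped automatically.
        self.buffer: Deque[Dict] = deque(maxlen=buffer_samples)
        self.frozen_events: List[List[Dict]] = []

    def sample(self, vehicle_data: Dict) -> None:
        self.buffer.append(vehicle_data)

    def on_trigger(self, trigger: str) -> None:
        # e.g. trigger == "airbag_deployment": persist the buffered pre-crash data.
        self.frozen_events.append([{"trigger": trigger}, *self.buffer])

class DriverSystemInteractionLog:
    """Sketch of a DSSAV-style recorder: 'always on', logging driver-system interactions."""

    def __init__(self) -> None:
        self.entries: List[Dict] = []

    def log(self, timestamp: str, interaction: str) -> None:
        # e.g. "system requested takeover", "driver overrode steering"
        self.entries.append({"time": timestamp, "interaction": interaction})
```

The contrast matters for investigations: a trigger-based buffer may never capture a near-miss with a vulnerable road user at all, because no triggering event fires.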
IEEE has drafted a standard (P7001) to define and measure transparency requirements for autonomous systems. Its purpose is to make it always possible to understand why and how an autonomous system made a particular decision. However, drivers or AV operators and their legal representatives may face barriers in accessing the datasets needed for an accident investigation, civil claim or criminal prosecution.
The ITU Focus Group on AI for Autonomous and Assisted Driving (FG-AI4AD) is working on evaluating public expectations for the post-collision behaviour of self-driving software. However, there may be resistance within civil society to data recording within AVs, given the potential ethical risks and legal implications of its use.
Outputs from the RoAD project included:
Ten Holter, C., Kunze, L., Pattinson, J.-A., Salvini, P., & Jirotka, M. (2022). Responsible innovation; responsible data. A case study in autonomous driving. Journal of Responsible Technology, 11(July), 100038. https://doi.org/10.1016/j.jrt.2022.100038
Image: autonomous vehicle by Berkah Icon from the Noun Project