The inherent dangers of automated vehicles


An urgent investigation is needed into the technology that allows self-driving cars to communicate with their operators, researchers say. Academics from the University of Leeds’ School of Law say digital interfaces may be unable to adequately communicate safety and legal information, which could result in accidents. The team has produced a study arguing that the terms and conditions presented to drivers via an interface should be subject to intense scrutiny. The researchers, Dr Jo-Ann Pattinson and Dr Subhajit Basu, are calling for an urgent review of the technology before automated cars are introduced onto roads.

Shared driving

Some of the first ‘automated’ vehicles to be deployed on our roads will require a system of shared driving with a human driver. While this creates technical and operational challenges, the law must also facilitate such transfers of control. One method may be to obtain the driver’s consent to share operational responsibility and to delineate legal responsibility between vehicle and driver in the event of an accident. Consent is a voluntary agreement in which an individual is aware of the potential consequences of their agreement, including the risks. The driver of a partially automated vehicle must therefore be informed of potential risks before giving consent to share operational responsibility.

Operational responsibility

The researchers argue that there are inherent dangers associated with shared operational responsibility, particularly where the driver is asked to take back control from the automated vehicle during the journey. Drivers are likely to experience a delay in regaining situational awareness, making such operational transfers hazardous. The study argues that where an interactive digital interface is used to convey information such as driver responsibility, risk and legal terms, drivers may fail to sufficiently process those communications due to fundamental weaknesses in human–machine interaction.

‘Extreme difficulty’

Dr Basu said: “The main safety messages surround the extreme difficulty most drivers will encounter when an AV suddenly transfers the driving back to them. Even if a driver responds quickly, they may not regain enough situational awareness to avoid an accident. The general public is not aware of their vulnerability, and it is doubted that an interface in an automated vehicle will communicate this point with sufficient clarity.” 

Driver training

The use of an interactive digital interface alone may be inadequate to effectively communicate information to drivers. If the problems identified are not addressed, the researchers argue, driver consent may be inconsequential and fail to provide a predictable demarcation of legal responsibility between automated vehicles and their drivers. Ongoing research into automated vehicle driver training is considered part of the preparation needed to design driver education that allows drivers to sufficiently understand the responsibilities involved in operating a partially automated vehicle. This has implications for future driver training, licensing and certification.

The paper, Legal issues in automated vehicles, has been published in a Nature journal.

Written by: the editors of Smart City Hub.
