Push for new body to probe ‘half-human’ vehicle crashes
Australia should set up a US-style investigation body to examine road incidents involving automated vehicles, two experts say, after a Tesla that its driver claims was in autopilot mode allegedly struck a nurse boarding a tram in Melbourne this week.
With partially self-driving cars increasingly prominent across the country, the experts recommend an approach that gathers information on how much technology or human error contributed to an accident, allowing manufacturers to improve safety.
The highest permitted level of driving automation in Australia is “level 2”, which allows vehicles to steer in their lane as well as brake and accelerate, but they cannot navigate traffic unsupervised. Several car manufacturers offer this functionality locally, including Tesla, Mercedes-Benz, Kia and Hyundai.
Guidelines issued by the National Transport Commission state that drivers must remain in control of their vehicle at all times, including keeping one hand on the steering wheel.
While the driver is legally liable for all road incidents, a system malfunction or a lack of clear information can sometimes be a contributing factor, as a US investigation concluded after a 2018 crash in California that killed a father of two.
Tia Gaffney, a road safety expert with the independent Australian Road Research Board, said police investigations focused on whether a crime had been committed, rather than gathering information on the vehicle and road environment.
She said a new investigation body could mimic the Australian Transport Safety Bureau, which investigates rail and aviation crashes.
“They break down incidents systematically, so we know how to improve next time. There is no such thing for vehicles, which was probably fine when we had purely human drivers. Now we have these half-human, half-machines operating, it really is important to establish that investigation when anything goes wrong,” Ms Gaffney told The Age.
“That’s fundamental to science in general: when you want to know what’s happening, you need that feedback loop. It’s crazy to me that there is still not such a thing.”
Tuesday’s incident in Armadale, in Melbourne’s south-east, was the first involving a Tesla for Victoria’s major collision investigators. The driver, 23-year-old loading dock manager Sakshi Agrawal, told a court hearing on Tuesday night the car was in autopilot when it allegedly dragged the victim 15 to 20 metres.
Ms Agrawal allegedly initially fled the crash, which occurred at 6.30am when street lights were on and it was still relatively dark, before returning shortly after.
Acute care nurse Nicole Lagos, 26, was in a critical condition at The Alfred hospital with life-threatening injuries to her upper and lower body after the crash.
While the incident is said to be the first of its kind in Victoria, the National Highway Traffic Safety Administration — an arm of the US government — has investigated 31 crashes involving partially automated vehicles since 2015.
Swinburne future urban mobility professor Hussein Dia said a central investigation body in Australia could gather information from manufacturers such as Tesla, which generally kept data on all of their vehicles, similar to how a plane’s black box operated.
“A regulation body, like for air crashes, is a very good idea,” he said.
Professor Dia added that manufacturers and governments should do more to clarify the features of semi-autonomous vehicles. Germany, for example, has banned Tesla from calling its system “autopilot” because the name implies an unrealistic level of autonomy.
“People see ‘autopilot’ and think it is completely driverless. It should be a consistent name such as ‘driver assist system’ across the industry,” he said.
“This is an exciting space and I would like to see autonomous cars on our roads tomorrow, because every year we lose 1.2 million people to road crashes [around the world] and over 90 per cent are human error. But I think we are at least five to 10 years away from fully driverless cars and there is much to improve in the meantime.”
Mercedes-Benz has developed Drive Pilot, the world’s first level-3 automated driving system, and the German manufacturer will accept liability for road incidents that occur while it is engaged.
The system has received approval in Germany, and Mercedes-Benz expects it to be on the roads of some US states by the end of the year. Australia will require significant legislative changes before level-3 vehicles are permitted to operate on the country’s roads.
The National Transport Commission acts as the national co-ordinating body between states and territories in Australia and has previously released guidelines around automated vehicle trials and enforcement.
Flinders University dean of law Tania Leiman, an automated vehicle expert, said extra training and information for drivers of semi-autonomous vehicles would also be beneficial.
“Technology provides the basis for much safer vehicles, saving a lot of lives and an enormous amount of money. We just need to equip the human part of that with the skills they need to properly operate those cars,” Professor Leiman said.
Tesla was contacted for comment.