Definitions of autonomous driving systems could lead drivers to rely on them more than they should, it has been claimed.
The subject was among those discussed at a Westminster Energy, Environment and Transport Forum conference on intelligent and autonomous transport in the UK.
At the event, which heard how the development of autonomous vehicles appeared to have been slower than expected compared with predictions from a year or two ago, David Williams, managing director of underwriting and technical services at AXA Insurance, told delegates he found the current definition of level three autonomy to be particularly problematic.
The widely used level system for describing stages of automation, devised by automotive body SAE, includes: level one, which provides steering or braking/acceleration support to the driver; level two, which provides both steering and braking/acceleration support; level three, which means the car can drive itself for a period, but the driver must be ready to intervene when requested; level four, meaning fully autonomous driving in a specified area; and level five, which means full autonomy everywhere.
Williams said he felt levels four and five could be accepted as autonomous, as the driver was not required to step in as a fallback, but this was not the case for level three.
He said: “If the driver is required as a fallback, then from my perspective it is not autonomous. My worry is we are hearing more and more people talking about autonomous from level three.
“You can say ‘conditional automation’ if you want, but the reality from my perspective is it is just very, very good driver assistance. I worry about the messages we are sending to people on the roads. If you are telling them that level three is autonomous, they will do things that they shouldn’t.
“We already know that people are doing things they shouldn’t at level two, so if we start talking about level three as being autonomous, they won’t be ready to take back control, and the definition of level three means they will need to at some point.”
Williams warned that research had shown the handover stage between autonomous and manual driving – required by level three – was the moment of greatest risk. He also said he was concerned that car manufacturers might push for level three to be given greater status.
“I worry that, particularly with us leaving Europe, the OEM lobby could be pushing very hard for scope creep, and for level three vehicles being described as autonomous,” he said. “I think that will cause confusion and potentially could also cause lots of accidents.”
Away from the debate about definitions, development of autonomous technology continues. Another speaker, Nissan head of UK external and government affairs Peter Stephens, explained some of the key lessons the manufacturer had picked up from its own trials. These included a drive from Milton Keynes to Sunderland completed 99% autonomously.
Stephens said: “One is about the clarity of the road environment – being able to pick out road markings and demarcate the kerb. There is a real challenge there around on-street parking and street furniture, in how easy it is for an autonomous car to be able to detect where the roadway is.
“There is the importance of diversity of technology, so it is not that one technology will work for all road conditions; particularly in terms of positioning and locating the vehicle, you need a variety of technologies – GPS, camera systems, Lidar, and so on.
“Finally there are challenges around traffic signal recognition, and I guess there is a question here about the extent to which we depend and design [based] on what I would call dumb road infrastructure, versus looking forward to future intelligent traffic systems.”
While there are obvious hurdles in the way, the underlying potential of the technology is still being acknowledged.
Catherine Lovell, deputy head of the Department for Transport’s Centre for Connected and Autonomous Vehicles, said: “They have the potential to radically improve road safety. In 2018, 85% of accidents resulting in personal injury involved driver error.
“Connected and autonomous vehicles don’t get tired, they don’t get drunk, they don’t get distracted looking at Instagram when they should be driving.
“They also have the potential to better calculate the safest course of action in unexpected events, something we know humans are not always terribly good at.”