The Future Is Looking Bright For Self-Driving Cars. But Are They Safe?
A topical debate in the automotive automation industry right now concerns the safety of automated lane-keeping systems (ALKS) in new self-driving cars, and whether they are safe enough to be relied upon on the roads.
This comes after years of crashes involving Tesla’s Autopilot feature being reported in the popular press.
Now, the UK government has announced that self-driving cars will be allowed on UK roads this year, with the Department for Transport confirming that ALKS will be legalised for hands-free driving; trials of self-driving cars are already taking place in cities such as Cambridge.
However, experts argue that ALKS should be regarded not as automated technology but as assistive technology, due to safety concerns over its ability to detect hazards and stop in time, among other issues.
How does the tech work?
ALKS enables a driver to become a passenger by taking over and driving the vehicle itself within a single lane, leaving the driver essentially “hands-free”.
The driver can also take back control, and sometimes the car will prompt the driver to do so when it can no longer navigate the road autonomously.
The technology does not allow lane changes, and if the driver fails to respond to a threat on the road within 10 seconds, the car will automatically switch on its hazard lights, slow down and stop.
Cars using ALKS or similar technologies have been on Britain’s roads for many years; Tesla uses a similar system within their “autopilot” function.
However, such systems have caused concern in the past: just two years ago, a Tesla driver was banned from driving for 18 months after hopping into the passenger seat while the car was on Autopilot.
With ALKS permitted in 2021, drivers will, for the first time, legally be able to disengage, go on their phones, play games and watch films behind the wheel.
The government has confirmed that drivers will not be required to monitor the road nor keep their hands on the steering wheel whilst their car is engaged in self-driving.
What is the problem?
If a car cruising along the road with a driver playing Candy Crush at the wheel is setting alarm bells off in your head, you are not alone.
Experts at Thatcham Research and The Association of British Insurers are urging caution over labelling this technology autonomous when driver engagement is still crucial to safety.
In a cautionary video, they used the example of a car with ALKS engaged approaching debris on the road. The system does not sense the debris and simply drives over it. An engaged driver, on the other hand, would see the debris and change lanes, avoiding damage to the car and risk to life. They then explored a similar scenario in which a model of a person stepping into the lane was used instead.
Furthermore, the group has been working with UK insurers and has drawn up 12 key principles that must be in place for autonomous vehicles to be safe on the road.
These range from sustainability to collision protection. The ALKS that the government has approved satisfies only two of the 12 principles.

Thatcham Research said: “This automated driving technology is not mature and can’t be relied upon to keep the driver safe. It should be regarded as assistive driving technology because it always relies upon a driver to take back control.”
Despite the Society of Motor Manufacturers and Traders insisting that these systems could prevent 47,000 serious accidents by reducing human error, there are immediate concerns that promoting them as autonomous will encourage negligence and endanger life.
As exciting and futuristic as it is, is this technology ready to be left unsupervised to face the roads?
Header Image: Elon Musk has pioneered autonomous vehicles with Tesla. Image via AP and Business Insider.