Lack of driver blamed for crash
Self-driving system drives car off the road
Tesla is again in the headlines for all the wrong reasons.
Two men died after a Tesla vehicle crashed in Texas. The car is believed to have been operating with no one in the driver’s seat before it hit a tree.
“There was no one in the driver’s seat,” Sgt Cinthya Umanzor of the Harris County Constable Precinct 4 said of the crash on Saturday night.
Bending the rules
The 2019 Tesla Model S was traveling at high speed and failed to negotiate a curve. On leaving the highway, the car hit a tree, bursting into flames. The emergency services took four hours to extinguish the blaze and used more than 100,000 litres of water.
After the fire was extinguished, authorities located two occupants: one in the front passenger seat and the other in the back seat of the Tesla. Both men were in their 50s, and the crash site was less than half a mile from the owner’s property.
Investigators believe the driver may have activated the car’s Autopilot mode. This driver assistance function in Teslas can control steering, acceleration and brakes within lanes on motorways. However, it is not a fully autonomous system, and according to the handbook all drivers are told they must remain aware at all times, keep their hands on the wheel and stay in full control of the car.
Tesla chief executive Elon Musk claims the car was not using Autopilot at the time. What’s more, he says the car was not equipped with the brand’s Full Self Driving (FSD) package.
Last year, German courts banned the manufacturer from making ‘self-driving’ claims. However, it is common for Tesla owners and fans of the brand to share video clips or images of Tesla models supposedly driving themselves.
For example, one American woman filmed her child asleep in the driver’s seat of a Model 3 while it travelled on the motorway.
Better education and training
Matthew Avery, research director at Thatcham Research, described the Texas crash as “incredibly sobering”. Commenting in Autocar, he says it is an “illustration as to why education and correct naming are so important to the safe use of driver assistance systems”.
“A lack of understanding of system capability is causing confusion around driver responsibility,” he added. “System names should not be misleading.”
In February 2020, an investigation into a fatal crash involving a Tesla Model X being driven on Autopilot in Mountain View, California, found that the driver was distracted by his mobile phone. Investigators concluded that the driver’s over-reliance on Autopilot and his distraction – likely from a mobile phone game app – caused the crash.
There is growing scrutiny of semi-automated driving systems following recent accidents. Tesla is currently preparing to launch its updated “full self-driving” software to more customers, while the US auto safety agency investigates 27 Tesla crashes.
However, Tesla CEO, Elon Musk, said in January that he is “highly confident the car will be able to drive itself with reliability in excess of human this year”.
Self-driving technology and autonomous vehicles are widely accepted to be the future of motoring. However, they must overcome safety and regulatory hurdles, whilst drivers need better education about the limits of intermediate systems.