The Autopilot function of Tesla’s smart electric cars is under intense scrutiny following a recent fatal accident.
Last week, Tesla boss Elon Musk claimed his cars with Autopilot engaged have a much lower chance of being involved in an accident than the average vehicle. “Because every Tesla is connected, we’re able to use the billions of miles of real-world data from our global fleet – of which more than 1 billion have been driven with Autopilot engaged – to understand the different ways accidents happen,” said the Tesla safety report he linked to.
Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle https://t.co/6lGy52wVhC
— Elon Musk (@elonmusk) April 17, 2021
The very next day, however, a Tesla crashed into a tree and caught fire, killing its two occupants, neither of whom appeared to have been in the driver’s seat. That put Musk on the defensive; he took to Twitter once more to insist the crash was not the fault of Tesla’s Autopilot function and that the car wasn’t even equipped with Full Self-Driving.
Your research as a private individual is better than professionals @WSJ!
Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.
— Elon Musk (@elonmusk) April 19, 2021
That prompted US consumer advocacy outfit Consumer Reports to get hold of a Tesla and take a closer look at its Autopilot. It claims to have found it easy to trick the car into ‘thinking’ there was someone in the driver’s seat when there wasn’t. It seems the only evidence a Tesla needs that someone is actively driving the car is some weight placed on the steering wheel, which is straightforward to simulate.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” said Jake Fisher, CR’s Senior Director of Auto Testing, who conducted the experiment. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat. It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Tesla doesn’t seem to have responded to the report, and Musk’s Twitter feed is uncharacteristically quiet on the matter. There’s no suggestion that this trick was employed in the crash above, but any time a ‘smart car’ crashes in strange circumstances, questions are going to be asked about the reliability of the technology.