Who's At Fault When Crashes Happen With Tesla's Autopilot?


Tesla regularly implores drivers who use the company’s semi-autonomous Autopilot system to pay attention to the road and keep their hands on the steering wheel at all times. But a new report on a fatal March crash involving a Tesla Model X says the driver had his hands on the wheel for only 34 of the final 60 seconds before the collision, and, in that time, he received no visual or audio warnings to put his hands back on the wheel.

It illustrates the disconnect in Tesla’s deployment of Autopilot: drivers are told to pay attention, yet the system simultaneously encourages inattention by letting lengthy stretches pass without any sort of prompt.


Now, Tesla CEO Elon Musk has made a show of complaining about coverage of crashes involving semi-autonomous systems, so I’ll state the obvious up front: yes, there’s plenty of inattentive driving on the road at any given moment. It’s a large part of the reason there are roughly 40,000 driving fatalities in the U.S. each year. Automakers should be working on solutions to reduce inattentive driving, including in situations like this crash.


But we’re in a world with cars that have systems capable of assuming driving tasks, like Tesla’s Autopilot or GM’s Super Cruise. These systems are going to proliferate in the coming years. So it’s vital to discuss what automakers can do to ensure drivers pay attention while they’re in use. And I think this report, while not ascribing probable cause for the crash, shows that driver misuse of Autopilot is just as much of a problem as Tesla’s design of it.


Look to Super Cruise, or even the shoddy system BMW installs in higher-end models, like the 740e I recently tested. Super Cruise won’t let you peel your eyes off the road for more than a few seconds while it’s in use, thanks to a (creepy) eye-tracking camera mounted on the steering column. And BMW’s system prompted me to put my hands back on the wheel every five seconds, no matter what.

Tesla expected drivers to be inattentive when using Autopilot. So what’s stopping the company from reining in Autopilot, even to a minor extent? It’s brilliant technology; I’ve marveled repeatedly at Autopilot’s capability in the past, and it’s bested only by Super Cruise—in my mind—because of how conservative GM’s approach to semi-autonomy is.


But the sort of dynamic seen here is why major companies developing self-driving cars are nowhere near the semi-autonomy playing field. Waymo, Google’s self-driving car unit, made an explicit decision to pass on developing a system akin to Autopilot because it’s such a tricky, if not impossible, balance to strike: it demands two different states of mind, easing up and letting the car handle driving tasks while also remaining fully attentive to assume control at a moment’s notice.


Human error is at fault in most fatal crashes, and it certainly factored into what happened here. But allowing drivers to go for extended periods without prompting them to keep their hands on the steering wheel only seems to invite this sort of tragedy.

Tesla has previously passed on adding sensors to ensure drivers look at the road or keep their hands on the wheel, according to a recent Wall Street Journal report, which said the safeguards were rejected over cost concerns.


Musk disputed the cost assertion and—without elaboration—said eye-tracking technology was rejected because it was ineffective.

Maybe so. Still, this NTSB report shows Tesla can do more. Tesla keeps pointing to driver error, but at some point it’ll have to look inward.