Limits Of Tesla's Autopilot And Driver Error Cited In Fatal Model S Crash

The U.S. National Transportation Safety Board ruled today that the “operational design” of Tesla’s Autopilot played a key role in a 2016 crash that left the owner of a Model S dead. The vehicle’s semi-autonomous system allowed the driver, Joshua Brown, to use Autopilot in ways “inconsistent” with warnings from Tesla, which “permitted his prolonged disengagements from the driving task” and ultimately contributed to his death.

The nearly three-hour hearing offered a wide-ranging discussion of the limitations of automated systems on the road today. The NTSB recommended that any maker of semi-autonomous vehicles prevent the technology from being used on roads where the vehicles aren’t suited to travel without human control.

Photo: Florida Highway Patrol

Brown, a 40-year-old self-described Tesla fanatic, died in May 2016 when his Model S sedan rammed into the side of a trailer on a big-rig truck that was turning left in front of the vehicle. The Model S was traveling at 74 mph using Autopilot, which failed to identify the truck and never slowed the vehicle down, the NTSB said.

But Brown and the truck driver bore some fault, as well, in the Florida crash, the agency said.

Photo: NTSB

The NTSB issued 13 findings related to the crash, including a pair of citations over Brown’s driving patterns. Brown relied too much on automation, and demonstrated a “lack of understanding of system limitations,” the agency said.

Part of that, the NTSB indicated, had to do with Tesla. Tesla has repeatedly said in the past that drivers must keep their hands on the wheel and pay attention to the road if they enable Autopilot.

But Tesla could’ve taken additional steps to prevent misuse of Autopilot, the NTSB said. The feature worked as designed but was used in ways that weren’t intended, and Tesla didn’t go far enough to ensure drivers remained alert. (Tesla has previously told a separate federal regulator that it evaluated the potential for some drivers to become inattentive while using the feature.)

“Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention,” said Robert Sumwalt, the NTSB chairman.

The truck driver, who refused to be interviewed by the NTSB, failed to yield the right of way to Brown, but the agency said both the Tesla driver and the truck driver had “at least 10 seconds” to see each other, and there’s “no evidence” that either took evasive action to prevent the crash.

In a statement, a Tesla spokesperson said the “safety of its customers comes first” and “Autopilot significantly increases safety, as [the National Highway Traffic Safety Administration] has found that it reduces accident rates by 40 percent.” (That 40 percent figure is the subject of an ongoing lawsuit.)

“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the spokesperson said. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

Brown’s death drew worldwide attention as the first known fatal crash involving a vehicle traveling on its own using semi-automated technology. A preliminary report released earlier this year by the NTSB found that Brown kept his hands on the wheel of his 2015 Model S 70D for only 25 seconds of a 37-minute period during which his car was in Autopilot at 74 mph. The vehicle gave numerous audio and visual warnings before the crash.

Tesla CEO Elon Musk said himself that a later update to Autopilot—improving the system’s ability to differentiate a truck from a sign—“very likely” would’ve prevented the crash.

The NTSB staff said that Brown was knowledgeable about the vehicle, but that the Tesla user manual offered sometimes conflicting accounts.

“A driver could have difficulties interpreting which roads it might be appropriate [to use Autopilot],” Ensar Becic, an NTSB human performance investigator, said during the hearing.

The NTSB also found that relying on sensors that determine whether a driver’s hands are on a steering wheel isn’t an effective method of determining whether they’re paying attention.

Above all, the NTSB’s 80-page report illustrated the limitations of the technology in currently available semi-autonomous cars. The Tesla Model S was not a fully self-driving car, stressed Sumwalt, the NTSB chairman, in a warning of sorts to consumers.

“It’s a long way from partially automated vehicles to self-driving cars,” he said. “And until we get there somebody still has to drive.”

DISCUSSION

Does this mean that that kid that got hung by the neck a couple of weeks ago gets to sue the rope manufacturer for failing to ensure the rope couldn’t be misused?

If you want ad absurdum examples, I have plenty. A modern car is a highly complex, extraordinarily dangerous piece of equipment and the scope for its misuse is frankly amazing. We’ve already seen what one person can do with one car, and the list of people who have deliberately set out to kill themselves with their own cars is very, very long indeed. Should the manufacturers be held directly responsible, at least in part, for those deaths too? Certainly there were steps that could have been taken, features that could have been included or left off, that would have prevented suicide by exhaust.

All of that said, I don’t think the judges are wrong to give this judgment. Autopilot is not ready for prime time, and the public isn’t mature enough to use it wisely. That day may never come, but shelving Autopilot while it continues to receive training and enhancement is easy enough to do. If it were my call to make, it would never have been rolled out, especially not with a name like Autopilot.

I think an autonomous system needs a few more rules. For instance, at no point should it be allowed to speed. If any liability can reflect back onto the manufacturer, then that’s one hurdle I simply won’t jump. If you want to use my robot, don’t ask it to break a law for you. It won’t.

Autopilot shouldn’t work without the driver’s hand on the wheel. It can’t ensure that you’re watching the road but it can at least ensure that you’re positioned to take over if it becomes overwhelmed. It doesn’t take a lot to confuse a robot; fruit flies manage to get from one place to the other without dying and they’re not known for their processing power. Forcing the monkey at the wheel to keep a hand on the wheel enforces his duty to take responsibility.
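To make those two rules concrete, here’s a minimal sketch (in Python, purely as an illustration) of how a supervisory check might gate a driver-assistance feature on a speed cap and a hands-on-wheel requirement. Every name and threshold here is a hypothetical assumption made up for the example, not anything from Tesla’s actual software.

    # Hypothetical sketch: a supervisory check that decides whether a
    # driver-assistance feature may stay engaged. Illustrative only.
    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        speed_mph: float            # current vehicle speed
        posted_limit_mph: float     # speed limit for the current road segment
        hands_on_wheel: bool        # steering-wheel sensor reading
        seconds_hands_off: float    # time since hands were last detected

    def assist_may_remain_engaged(state: VehicleState,
                                  hands_off_grace_s: float = 5.0) -> bool:
        # Rule 1: never allow the system to exceed the posted limit.
        if state.speed_mph > state.posted_limit_mph:
            return False
        # Rule 2: require hands on the wheel, with a short grace period
        # before disengaging and handing control back to the driver.
        if not state.hands_on_wheel and state.seconds_hands_off > hands_off_grace_s:
            return False
        return True

    # Example: 74 mph in a 65 zone with hands off for 30 seconds -> disengage.
    print(assist_may_remain_engaged(VehicleState(74, 65, False, 30.0)))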

Don’t want the responsibility, don’t drive the car. Simple.

Were it on me, I would have given these as two separate judgments. Brown signed off on a lot of paperwork and probably a few on-screen waivers in order to activate and then use Autopilot. Whether or not he was aware of the system’s capabilities, he took those liabilities upon himself when he clicked, “I Agree.” ‘Nuff said.

And then I would have issued a separate statement telling Tesla to pull Autopilot. Until Tesla can prove that Autopilot is at least as good as a good human driver, can differentiate between obstacles and scenery, and can be counted on to adhere to traffic controls at all times, Autopilot ought to be benched. There are lots of car features that have been disallowed; there’s no reason why this shouldn’t be another one. And likewise, those features sometimes come off the bench when the time is right for them. The time is not right for Autopilot, not yet.

Hey Elon, here’s a free tip: pull Autopilot. Don’t stop working on it, because I think it has the potential to be a winner. But until it’s ready to actually be that winner - and it isn’t yet - taking its availability away will protect Tesla and you from a lot of liability.