Back in March, a Tesla Model 3 crashed into a semi-truck turning onto a Florida highway, killing the driver. After a preliminary investigation, the National Transportation Safety Board has concluded Autopilot, Tesla’s semi-autonomous driving technology, was engaged at the time of the fatal crash.

The report is only two pages long, so there’s not much in the way of detail, but the basic narrative is this: the driver engaged Autopilot 10 seconds before the crash, and the vehicle stopped detecting his hands on the wheel eight seconds before the crash. Note that this is not enough information to say the driver definitely took his hands off the wheel; it only means the car did not detect any torque from the driver’s hands.

In any event, the semi-truck was making a left turn out of a side road, requiring it to cross traffic on the 55 mile-per-hour highway. The Tesla was traveling at 68 miles per hour. Neither Autopilot nor the driver attempted evasive maneuvers.

If you’ve got a very strong feeling of déjà vu right now, don’t worry, as we’re getting it too. The very first (or second, depending on how you count it) fatal crash while using Autopilot occurred under what sounds like similar circumstances, at least based on the limited information available so far. In that instance, a semi-truck was turning onto a Florida highway, and the driver of the Tesla struck the side of the trailer and died.

Tesla said back in 2016 that a subsequent update to Autopilot would’ve saved the driver’s life in that particular instance, but it is unclear how similar the two fatal incidents are beyond the superficial circumstances.

In a statement, a Tesla spokesperson said that the company provided NTSB with the vehicle log data shortly after the crash. “Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance.”

From this outline of events, it sure sounds like this is exactly the type of incident human factors researchers warned me about regarding semi-autonomous driving technology. The hand-off between computer and human, and their respective responsibilities for driving the car safely, can be muddled, leading to instances where neither detects a plain, obvious threat right in front of them.

Obviously, there’s still a lot to learn about this crash and precisely what happened in the 10 seconds prior to impact. But hopefully this can be a reminder that no car on the market right now can drive itself, no matter what any company’s CEO may say or do. Thinking otherwise could be fatal.