Why Tesla's Autopilot Isn't The Menace You Think It Is

Last week, we learned of the tragic death of Joshua Brown, who died when his Tesla Model S, cruising on its semi-autonomous Autopilot system, crashed into a truck crossing a divided highway. Today, Consumer Reports called on Tesla to disable the system until major tweaks are made, and to change its name. This is all a bit much, really.

The CR post, entitled “Tesla’s Autopilot: Too Much Autonomy Too Soon,” does make salient points about the nature of control and human responsibility, citing data to back up its concerns:

Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer’s control. At 65 mph, that’s between 100 feet and quarter-mile traveled by a vehicle effectively under no one’s control.

Before finally concluding that Tesla should:

  • Disable Autosteer until it can be reprogrammed to require drivers to keep their hands on the steering wheel
  • Stop referring to the system as “Autopilot” as it is misleading and potentially dangerous
  • Issue clearer guidance to owners on how the system should be used and its limitations
  • Test all safety-critical systems fully before public deployment; no more beta releases

Those are four very strong recommendations, but they’re an extreme reaction to an extreme circumstance. It’s all a bit of odd handwringing, borne not of a concern voiced this loudly when the system debuted, but of an incredibly unfortunate (and yes, possibly avoidable) death. We still don’t know, in fact, who or what was responsible for Brown’s death.

To blame Tesla’s Autopilot system is, at best, premature. Any one of three factors could be responsible for the crashes on which Consumer Reports is basing its demands. We could be dealing with freak accidents, unavoidable even with the best self-driving technology. We could be dealing with recklessly inattentive drivers. We could be dealing with perfectly responsible drivers let down by a flawed sensor array. I can’t make it clear enough that we still do not know the definitive causes of these crashes, and that this level of uncertainty is no basis for calling Autopilot a menace.

But that’s not all that should be said about CR’s premature call to arms.

Tesla’s oft-cited statistic, one death per 100 million miles traveled in regular, human-driven American cars as compared to one death per 130 million Autopilot miles, is neither a good statistic (it’s full of confounding variables, with no controls for the demographics of Tesla buyers or the times and places in which they typically drive compared to the rest of the populace, among other things) nor a valid explanation once we get into the murky moral and ethical areas of control and responsibility.
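
For what it’s worth, here’s a rough, back-of-the-envelope sketch, in Python and purely for illustration, of what those two figures amount to if you take them at face value. The 100-million and 130-million mileage denominators are the ones cited above; the per-mile rates and the percentage gap are simply derived from them and inherit every confounding caveat already mentioned:

    # Rough comparison of the fatality figures cited above.
    # The mileage denominators come from the article; everything else is arithmetic.
    HUMAN_MILES_PER_DEATH = 100_000_000      # all U.S. driving, per Tesla's claim
    AUTOPILOT_MILES_PER_DEATH = 130_000_000  # Autopilot miles, per Tesla's claim

    human_rate = 1 / HUMAN_MILES_PER_DEATH          # ~1.0e-8 deaths per mile
    autopilot_rate = 1 / AUTOPILOT_MILES_PER_DEATH  # ~7.7e-9 deaths per mile
    gap = 1 - autopilot_rate / human_rate            # ~0.23 at face value

    print(f"Human-driven: {human_rate:.2e} deaths per mile")
    print(f"Autopilot:    {autopilot_rate:.2e} deaths per mile")
    print(f"Autopilot's rate is {gap:.0%} lower, taken entirely at face value")

Taken at face value, that works out to a roughly 23 percent lower fatality rate on Autopilot, which is exactly the kind of headline number both sides reach for, and exactly the kind of number the confounders above can swallow whole.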

But at the moment, it’s the best leaping-off point that we have for the discussion we find ourselves mired in. It’s the one number that both sides can point to, and while it’s not perfect, it does lend credence to the notion that Autopilot itself is no worse than human drivers alone, if not better. The delay in reaction time may not even be a factor.

All that may not matter, however. CR’s main concern – and they’re not alone in it – is that humanity is not ready for a semi-autonomous system, and that any incident that may occur with it activated is blood on Tesla’s hands, and Tesla’s responsibility. I’m not so sure.

Ever since the automobile debuted, sometime in the past couple of hundred years, a measure of responsibility has been apportioned to the operator of the vehicle. And for over a hundred years now, that responsibility has carried far more weight than is often recognized. Cars have vast capabilities: the ability not just to cruise comfortably down a suburban street, but to exceed 200 miles per hour, to be handled dangerously, and yes, even to kill with ease.

And for as long as the automobile has been around, we’ve entrusted the humans that operate them to do so safely. It’s one of the many pacts we make each day with society to prevent it all from coming apart.

With Autopilot, its detractors argue that that very pact is becoming frayed around the edges, if not torn apart completely. Is it positioned deceptively, they wonder? Does it promise something it cannot provide? Does it encourage poor behavior, or even a dereliction of responsibility to one’s own life and to the lives of those around it? It’s billed as a “beta” system, which would ostensibly imply that it’s not ready for widespread use, so why should the public have access to it?

A lot of it centers on how we think of Autopilot, and systems like it. The detractors see the worst actors, the ones proudly jumping into the back seat as the car cruises down the highway, or reading a book, and point fingers at Tesla. The system allows drivers to take their hands off the wheel, something that supposedly should never be done, and it encourages distraction, and this is all one company’s fault.

But it really isn’t.

Consumer Reports argues that Tesla’s Autopilot system, as it sits, is too open to misuse. I don’t buy that. Nowhere in the system’s marketing, in its instructions, or even in its implied promises of capability does Tesla ever absolve drivers of the responsibility we’ve relied on for over a hundred years. It’s still the responsibility of the person sitting in the driver’s seat, not of the company that created the system. That’s how we’ve looked at cars since they were invented, and no one is going after Ferrari, for instance, for all the deaths caused when its cars are used improperly. We’re not demanding that Ferrari electronically limit its cars’ speed, adjustable only to GPS-informed local speed limits.

Sure, CR can tell people to keep their hands on the wheel at all times, but that just renders the system pointless. If we demand full, physical control of automotive systems to ensure safety, we might as well encourage people to constantly hover a foot over the brake when using bog-standard cruise control, lest some obstacle suddenly appear in their path. If we’re going to have these systems, let’s just have them, and learn to live with them, responsibly, like human beings.

Even the name, “Autopilot,” isn’t really that problematic, and its downsides are arguable at best. “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, Vice President of Consumer Policy and Mobilization for Consumer Reports, in the CR statement. But it’s called “autopilot,” and not “driverless system,” for a reason. Autopilot systems in airplanes don’t encourage pilots to leave the cockpit or take a nap right then and there. The pilot must still remain alert and aware, ready to take control at a moment’s notice; controlled-flight-into-terrain incidents have happened even with autopilot engaged.

Tesla even alludes to this same line of thinking in the warning it gives when a driver activates the car’s Autopilot system (emphasis mine):

Autosteer feature is currently in Beta:

Autosteer is for use on highways that have a center divider and clear lane markings, or where there is a car directly ahead to follow. It should not be used on other kinds of roads or where the highway has very sharp turns or lane markings that are absent, faded, or ambiguous. Similar to the autopilot function in airplanes, you need to maintain control and responsibility for your vehicle while enjoying the convenience of Autosteer.

Do you want to enable Autosteer while it is in Beta?

The system is not perfect, Tesla says, and you still need to keep it together.

Similarly, when Tesla launched the Autopilot system, it said that its cars “will be able to steer to stay within a lane, change lanes with the simple tap of a turn signal, and manage speed by reading road signs and using active, traffic aware cruise control.” There was nothing there about collision avoidance or crash mitigation. It’s as if Tesla described an absurdly rudimentary system, and drivers, not the company, have placed additional capability upon it.

Tesla’s announcement for the system, too, left no question as to whether this was a truly “driverless” system:

Our goal with the introduction of this new hardware and software is not to enable driverless cars, which are still years away from becoming a reality. Our system is called Autopilot because it’s similar to systems that pilots use to increase comfort and safety when conditions are clear. Tesla’s Autopilot is a way to relieve drivers of the most boring and potentially dangerous aspects of road travel – but the driver is still responsible for, and ultimately in control of, the car.

All of that, in so much carefully lawyered writing, leaves little room for doubt as to the system’s original intent.

But we’re still left with questions, questions that we need to work out as a society. Even if we can’t say for certain who is “driving,” can we say for certain who is responsible? If the driver is alert and aware when the system is activated, should we still hold the driver responsible for a collision in which Autopilot failed and the driver did not have enough time to take corrective action?

But the answer to all of our questioning should not be a draconian solution, one that insists the only fix is to do away with the system completely, and one that implies a life lies at the feet of Tesla CEO Elon Musk.

There’s too much we haven’t agreed upon to decree its removal just yet.

It is absolutely not an ideal situation; there remains a massive question as to what the point of a system that lets you relinquish control, even temporarily, is when you still need to sit at attention. And yes, what appears to be a massive gap in its hardware may or may not have led to the loss of a human life. But we as a society, and Tesla as a company, are clearly not ready to completely absolve drivers of their culpability behind the wheel.

So why are we pretending otherwise? Why are we demanding a nanny corporation when we’ve never demanded one before?

It’s all a little bit much.