Despite what many, many people seem to think, and despite some frankly confusing terminology on Tesla’s website and in its marketing materials, Autopilot is not a fully self-driving system.

As Tesla likes to remind us every time this happens, here is how the company itself describes the system’s intended use:

“Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle”. Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel”...

“Autopilot is intended for use with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”

Requiring the driver to be ready to take control at a moment’s notice is a defining characteristic of a Level 2 driving-automation system, which is not autonomous at all, but really more of an advanced driver-assistance system.

That means doing things like “checking on [your] dog” in the back seat while the car is moving at highway speeds is an absolutely idiotic idea. It’s also exactly what this driver did, which is why they weren’t able to take control when it was clear the car was about to run into the stopped police car and the disabled car in front of it.

Keep in mind, this was not a situation that would have confused a human driver: the cars were stopped, hazard lights on, with flares set out to warn approaching drivers that the cars were there, immobile. These were hardly hidden cars, and the circumstances were not unusual in the slightest.

The driver was charged with Reckless Driving and Reckless Endangerment because if you’re driving a car, you need to be paying attention, dummy, even if your love for Elon Musk and Tesla is so powerful and real that you can just feel it, deep inside you, where music is born.

If you still think that Tesla’s Autopilot system is close enough to being fully self-driving, let’s try an analogy: if you had a chauffeur who was an excellent driver 80 percent of the time but, just so you know, was also narcoleptic and could fall dead asleep without warning at any moment, would you be okay with being driven around by them? I’m not so sure I would.

That’s what’s going on here. Autopilot has no graceful fail-over; if something fails to work as it should, for any number of reasons, the system needs a human at the wheel, paying attention, to take immediate control. It may not even be aware there’s a problem until way too late, which is why it needs your practiced, moist, human driver’s eyes watching as well.

This particular wreck appears to have some similarities to a known Tesla Autopilot issue with stationary cars.

Remember, it’s a driver-assistance system. It’s not self-driving. So pull over to check on your dogs.

If you stop, it’ll be much easier to ask who’s a good boy/girl, too, and you need that information.