Tesla On Autopilot Crashes Into Cop Car Because Driver Was Checking On Dog And It's Not A Damn Self-Driving Car

Photo: Connecticut State Police

Just in case you needed a reminder as to whether you live in a fictional 2019 where we have fully autonomous, self-driving vehicles, or the actual 2019 where we have, at best, partially self-driving vehicles that require constant driver attention, here you go: over the weekend, a Tesla Model 3 with Autopilot engaged crashed into the back of both a police car and another vehicle. That’s because the 2019 we all live in is not one with truly self-driving vehicles on the road. Sorry.

The wreck happened early on Saturday, December 7, near Norwalk, Connecticut. The Tesla Model 3, with a license plate that helpfully reads “MODEL3,” just so you really know what car it’s bolted to, had its Autopilot system engaged at the time of the crash.

Despite what many, many people seem to think and some frankly confusing terminology on Tesla’s website and marketing materials, Autopilot is not a fully self-driving system.

As Tesla likes to remind us every time this happens, here’s how the company itself describes the system’s use:

“Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle”. Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel”...

Autopilot is intended for use with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”

Requiring the driver to be ready to take control at a moment’s notice is a defining characteristic of a Level 2 autonomous driving system, which is not autonomous at all, but really more of an advanced driver assist.
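
For the spec-sheet inclined, here’s a rough Python sketch of what that Level 2 contract amounts to. To be clear, every name in it (the class, the attention checks) is made up for illustration; it’s the shape of the deal, not anyone’s actual code.

```python
# A rough sketch of the Level 2 "contract": the system steers and manages
# speed, but only while a human actively supervises. Every name here is
# hypothetical and for illustration; not Tesla's actual code.

class Level2DriverAssist:
    def __init__(self) -> None:
        self.engaged = False

    def engage(self, driver_accepts_responsibility: bool) -> None:
        # The driver, not the system, is responsible at all times.
        if not driver_accepts_responsibility:
            raise RuntimeError("driver must agree to supervise at all times")
        self.engaged = True

    def control_step(self, hands_on_wheel: bool, eyes_on_road: bool) -> str:
        if not self.engaged:
            return "manual driving"
        if not (hands_on_wheel and eyes_on_road):
            # A Level 2 system has no fallback of its own: if the driver
            # stops supervising, all it can do is nag and hand control back.
            self.engaged = False
            return "ALERT: take over now (disengaging)"
        return "lane centering + adaptive cruise active"


assist = Level2DriverAssist()
assist.engage(driver_accepts_responsibility=True)
print(assist.control_step(hands_on_wheel=True, eyes_on_road=True))
print(assist.control_step(hands_on_wheel=False, eyes_on_road=False))  # checking on the dog
```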

That means doing things like “checking on [your] dog” in the back seat while the car is driving at highway speeds is an absolutely idiotic idea. It’s also exactly what this driver did, which is why they weren’t able to take control when it became clear the car was about to run into the stopped police car and the disabled vehicle in front of it.
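
And to put some back-of-the-envelope numbers on the “idiotic” part (the speeds and glance times here are my assumptions for illustration, not figures from the police report):

```python
# How much road goes by, unseen, while you're turned around checking on
# the dog? Speeds and glance durations are assumptions for illustration.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

for speed_mph in (55, 65, 75):
    for glance_s in (2.0, 4.0):
        distance_m = speed_mph * MPH_TO_MS * glance_s
        print(f"{speed_mph} mph, {glance_s:.0f}-second glance: "
              f"{distance_m:.0f} m of road with nobody watching")
```

At 65 mph, even a two-second glance is nearly 60 meters of highway, which is plenty of room for a stopped police car.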

Keep in mind, this was not a situation that would have confused a human driver: the cars were stopped, with their hazard lights on, along with flares set out to warn drivers that the cars were there, immobile. These were hardly hidden cars, and not unusual circumstances in the slightest.

The driver was charged with Reckless Driving and Reckless Endangerment because if you’re driving a car, you need to be paying attention, dummy, even if your love for Elon Musk and Tesla is so powerful and real that you can just feel it, deep inside you, where music is born.

If you still think Tesla’s Autopilot is close enough to fully self-driving to treat it that way, let’s try an analogy: if you had a chauffeur who was an excellent driver 80 percent of the time but, just so you know, was also narcoleptic and could fall dead asleep without warning at any moment, would you be okay with being driven around by them? I’m not so sure I would.

That’s what’s going on here. Autopilot has no graceful fail-over; if something fails to work as it should, for any number of reasons, the system needs an attentive human at the wheel to take immediate control. The system may not even be aware there’s a problem until it’s way too late, which is why it needs your practiced, moist, human driver’s eyes watching as well.
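
Here’s a tiny, entirely hypothetical sketch of why that’s so ugly: when perception silently misses something, there’s no error for the system to react to, so it never even knows to alert you. The only backstop is a human who’s already looking. None of these function names come from any real autonomy stack.

```python
# Hypothetical sketch of a silent perception miss. The hazard simply never
# shows up in the output, so downstream code sees nothing wrong and never
# alerts anyone. Not a real autonomy stack; illustration only.

def perception(sees_stopped_car: bool) -> list[str]:
    # The nasty failure mode: no exception, no warning, just an empty list.
    return ["stopped car ahead"] if sees_stopped_car else []

def assist_step(hazards: list[str]) -> str:
    # As far as the system knows, the road is clear.
    return "brake" if hazards else "maintain speed"

def human_step(attentive: bool) -> str:
    return "brake hard, steer around" if attentive else "checking on the dog"

hazards = perception(sees_stopped_car=False)
print("System:", assist_step(hazards))         # maintain speed
print("Driver:", human_step(attentive=False))  # ...and nobody is watching
```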

This particular wreck seems to have some similarities to a known Tesla Autopilot issue with stationary cars: the system can fail to react to a stopped vehicle in its path, especially at highway speeds.
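
One commonly cited explanation for that issue, and I’ll stress that this is a generic illustration and absolutely not Tesla’s actual code, is that radar-based cruise systems tend to throw away returns that aren’t moving relative to the ground, so they don’t panic-brake for every overpass, sign, and guardrail. The catch is that a stopped car in your lane looks exactly like that clutter:

```python
# Generic sketch of stationary-target filtering in a radar-based cruise
# system. All numbers, including the 29 m/s (~65 mph) ego speed, are
# assumptions for illustration.

EGO_SPEED = 29.0  # m/s

# Each radar return: (label, closing_speed). A target's ground speed is
# ego speed minus its closing speed.
radar_returns = [
    ("moving car ahead", 5.0),     # ground speed 24 m/s: tracked
    ("overhead sign", 29.0),       # ground speed 0: clutter, ignored
    ("stopped police car", 29.0),  # ground speed 0: ALSO ignored
]

for label, closing_speed in radar_returns:
    ground_speed = EGO_SPEED - closing_speed
    if abs(ground_speed) < 1.0:
        # Rejecting stationary returns avoids phantom braking for bridges
        # and signs, but a stopped car is indistinguishable by this test.
        print(f"{label}: filtered out as stationary clutter")
    else:
        print(f"{label}: tracked at ground speed {ground_speed:.0f} m/s")
```

That tradeoff cuts both ways, by the way: tune the filter in the other direction and you get phantom braking instead.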

Remember, it’s a driving assist system. It’s not self-driving. So pull over to check on your dogs.

If you stop, it’ll be much easier to ask who’s a good boy/girl, too, and you need that information.

DISCUSSION

Having an “advanced driver assistance” system that is JUST good enough to drive the car right up until the moment it can’t is idiotic. Humans are not wired to work very well in that sort of situation.

That’s before wondering why the hell the Tesla system can’t deal with this seemingly simple situation. I had a Cadillac as a rental recently that nailed the brakes due to a big plastic bag blowing across the road. Which isn’t ideal either, but I would rather the car try to stop when it shouldn’t than plow into a large stationary object.