Tesla Driver Watching Movie While Using Autopilot Crashes Into Cop Car Because Autopilot Doesn't Really Drive, Dummy


Some of you may recall how a week or so ago I had to defend myself from angry Tesla-stans regarding my statements about how Level 2 semi-automation—the level of autonomy where Tesla’s Autopilot operates—sucks. It sucks because the sort of vigilance tasks demanded of the not-really-driving driver are fundamentally incompatible with how humans behave. In case you need proof of this, just this morning fate provided us with one, as a guy watching a movie in his Tesla on Autopilot slammed into a cop car. Yep, a cop car. What a moron.

The accident happened early this morning, right here in my home state of North Carolina, on Highway 64 West, outside the town of Nashville. The driver of the Tesla Model S, Devainder Goli of Raleigh, is, according to the Charlotte Observer,

“...accused of violating the move-over law and watching television while operating a vehicle, according to officials.”


The “move-over law” requires drivers to slow down and move over when passing emergency vehicles parked on the side of the road, like the Nash County Sheriff’s car was, and I think it’s also implied in that law that you shouldn’t slam into the car, either. The fact that the driver was watching a movie on his phone is covered by the slightly archaic-sounding “watching television” violation.

All the information currently available about the crash states that the Tesla rammed into the back of the deputy’s vehicle, which was then pushed into a State Trooper’s car, also parked at the scene.

The Sheriff’s car and the Tesla were totaled, but thankfully there were no injuries. Based on the pictures shared from the NC Highway Patrol, it’s clear that the Sheriff’s car had its warning lights on, too.

So, for those of you who still think Tesla’s Autopilot is safer than a human driving, let’s look at the situation here: it seems this happened pre-dawn, so it was fairly dark out, but the weather was clear, and the Sheriff’s car, parked on the side of the road, had its normal and flashing blue and red police lights going.


Somehow Autopilot not only steered the car off the road, but directly into a parked police car with lights on. This is not the sort of mistake most human drivers would make. If the driver was using Autopilot properly—as in paying attention and not watching a fucking movie while he’s supposed to be driving—this would not have happened.

But here’s the problem: people simply aren’t good at delegating 80 or so percent of the task of driving to an automated system and then remaining vigilant enough to monitor that system. The temptation to let your focus wander is real and powerful, and with everyone carrying pocket-sized distraction machines at all times, there’s no shortage of opportunity.


Yes, the driver was misusing Autopilot and was a fool. But the fundamental design of Autopilot allows this to happen, and the way it’s marketed is deceptive enough to enable this sort of abuse.

Level 2 semi-automation does not work well with humans. Even if it’s found that this Model S was using an older version of Autopilot or whatever, that doesn’t matter: semi-autonomous systems that can demand immediate human intervention at any moment will always have issues like this.


Time to rethink things, Tesla stans.