Don't Use Tesla's Autopilot Like This


My god, people. Tesla’s Autopilot requires constant human supervision—Tesla knows it, we know it, you should know it. In case you don’t, here’s another helpful video: a Model S slamming into a highway barrier near Dallas after the system failed to recognize the roadway and merge into another lane.

As Electrek first noticed, the Tesla driver described what went wrong this week in a Reddit thread:

So I was driving in the left lane of a two lane highway. The car is AP1 and I’ve never had any problems until today. Autopilot was on didn’t give me a warning. It misread the road and hit the barrier. After the airbags deployed there was a bunch of smoke and my car rolled to a grinding stop. Thankfully no one was hurt and I walked away with only bruises.


Here’s a shot of what happened to his car as a result:


Another Redditor, remarkably, had video of the accident, and it showed a situation in which human intervention was almost certainly needed.


The road barrier seems to come up pretty abruptly, but a driver paying attention to the road likely could have reacted in time. The driver says the vehicle’s Forward Collision Warning never activated; it would have alerted him in some fashion if it had detected an object in the vehicle’s path.

That’s not to say Autopilot isn’t safe when used correctly. An investigation by the National Highway Traffic Safety Administration into the fatal 2016 crash involving a Tesla driver found that the crash rate for Tesla vehicles dropped nearly 40 percent after Autosteer was installed.


Even so, the video’s a clear reminder that Autopilot still requires your attention—for now—even if federal regulators have cleared the system.
