Tesla Driver In Marsh Crash Now Says It Was His Fault And Not Autopilot's


Tesla’s semi-autonomous Autopilot feature caught flak on Sunday, as it appeared a driver had blamed the system for causing his car to suddenly accelerate, head off a road, and roll over into a marsh. On Monday, his story changed. It wasn’t Autopilot’s fault, he said, it was his own.


The incident began on Sunday, when the Kandiyohi County Sheriff’s Office in Minnesota released a statement saying a 2016 Tesla Model S had ended up on its roof, leaving all five occupants with minor injuries.

“Clark stated that when he engaged the auto pilot (sic) feature that the vehicle suddenly accelerated causing the car to leave the roadway and overturn,” the sheriff’s office said in a press release. “Clark and his four adult passengers all sustained minor injuries.”


The police report said the Tesla was approaching an intersection that dead-ended, requiring the driver to turn right or left. The driver, 58-year-old David Clark, did neither.

Tesla said it has “no reason to believe that Autopilot ... worked other than as designed.” The automaker reiterated the need for drivers to remain fully alert and watch the road, even if Autopilot is engaged.


The whole thing got weird on Monday, when Clark backtracked and said his initial statement wasn’t accurate. In an email to a sheriff’s deputy that was distributed by Tesla, Clark said he was “pretty shook up at the time when you and I spoke but I did not intend to put the blame Tesla or the auto pilot system as I am aware that I need to be in control of the vehicle regardless if the auto pilot system is engaged or not.”

“I have had a chance to discuss with passengers and try to replay the sequence of events leading up to the accident,” Clark wrote, adding:

To the best of my recollection I had engaged the autopilot system but then I had disengaged it by stepping on accelerator. I then remember looking up and seeing the sharp left turn which I was accelerating into. I believe we started to make the turn but then felt the car give way and lose its footing like we hit loose gravel. That was the feeling that I was trying to describe to you that I had lost control of the vehicle.

The next thing I know tall grass is whipping past the windshield and we were traveling at an odd angle in the ditch and then flipped over the right side and ended up on the roof.

I am truly thankful for the safety features that Tesla had put into this car that saved all 5 of us from serious injury.


A sheriff’s deputy responded:

I understand the misunderstanding about our conversation the day of the accident. When we have accidents with injuries it is our responsibility to provide the information regarding details to the press outlets.

With the information we talked about and how you explained them to me, I was in the understanding that the vehicle was in autopilot based on how you explained it. Therefore, that’s how it was reported.


It’s a strange situation. Initially, the sheriff’s office issued a statement that didn’t attribute the claim blaming Autopilot for the crash to anyone, giving the impression that an investigation had revealed that to be the case. Tesla said it didn’t believe Autopilot was at fault, and the statement was then adjusted to attribute the Autopilot comment to Clark. Now, Clark says he had it wrong.

In the past, Tesla has released data logs in response to crashes attributed to sudden acceleration by Autopilot. Asked for the logs from Clark’s vehicle, the automaker pointed Jalopnik to its original statement, which said it’s conducting an internal investigation of the accident. The sheriff’s office didn’t immediately respond to a request for comment.


Crashes are jarring, especially rollovers, and it wouldn’t necessarily be surprising if Clark spoke too soon, before piecing together a full recollection of what transpired. But the situation’s still an odd one.

Autopilot has been scrutinized by regulators, following a crash last year that left a Tesla owner dead. Earlier this year, the National Highway Traffic Safety Administration cleared Autopilot in that crash, saying there was no evidence of a defect.


A class-action lawsuit filed in April accuses Tesla of making owners “beta-testers” for Autopilot and of rendering their vehicles “dangerous.” The case remains pending.

Taking Clark at his word, the incident is another testament to the importance of staying attentive behind the wheel, even when a car’s semi-automated features are engaged.