
Driver In Model X Crash Claims Tesla Is Trying To Cover Up Issues With Autopilot

Autopilot dashboard view. Image credit: Tesla

In an open letter, the driver of the Tesla Model X involved in a Montana crash while Autopilot was engaged alleges that Tesla never contacted him before issuing its conclusions on the accident, reports Electrek. He claims Tesla was not interested in learning why its system veered into a barrier at speed.


The driver, who identifies himself in the letter as Mr. Pang, says that Tesla is trying to cover up issues with its Autopilot system by blaming crashes where Autopilot was engaged on whoever is sitting behind the wheel. Pang writes, as quoted on Electrek:

Tesla never contacted me after the accident. Tesla just issued conclusion without thorough investigation, but blaming me for the crash. Tesla were trying to cover up the lack of dependability of the autopilot system, but blaming everything on my hands not on the steering wheel. Tesla were not interested in why the car veered right suddenly, nor why the car did not slow down during the crash.


When Electrek asked Tesla for comment, the company referred back to its previous statement on the incident, which insisted that Pang should have kept his hands on the steering wheel while Autopilot was engaged. A sensor on the Model X detects whether a driver's hands are on the wheel, and the car sounds a warning if the driver hasn't touched the wheel for a long enough period of time.

However, Pang also claims that the Autopilot system gave him no warning before veering off at speed:

After about 1 mile, we saw that road condition was good, and turned on autopilot again. The speed setting was between 55 and 60 mph. After we drove about another mile on state route 2, the car suddenly veered right and crashed into the safety barrier post. It happened so fast, and we did not hear any warning beep. Autopilot did not slow down at all after the crash, but kept going in the original speed setting and continued to crash into more barrier posts in high speed. I managed to step on the break, turn the car left and stopped the car after it crashed 12 barrier posts. After we stopped, we heard the car making abnormal loud sound. Afraid that the battery was broken or short circuited, we got out and ran away as fast as we could. After we ran about 50 feet, we found the sound was the engine were still running in high speed. I returned to the car and put it in parking, that is when the loud sound disappeared.

Even if he had wanted to react, Pang claims he couldn't have: the barrier was very close to the road, and he says only a tenth of a second passed between when Autopilot changed course and when the crash occurred.

Pang’s letter asks Tesla and Elon Musk to take more responsibility for accidents whenever one of their cars’ supposedly “beta” Autopilot systems is involved. There's only so much you can glean from data alone without talking to the driver and asking what they actually experienced behind the wheel.


Pang alleges that Tesla reaches its conclusions without contacting drivers not because drivers are unwilling to talk, but because Tesla doesn’t want any unflattering news to go public.

“We are willing to talk to Tesla concerning the accident anytime, anywhere, in front of the public,” Pang writes.


You can read Pang’s full letter here.

Moderator, OppositeLock. Former Staff Writer, Jalopnik. 1984 "Porschelump" 944 race car, 1971 Volkswagen 411 race car, 2010 Mitsubishi Lancer GTS.


