The owner of a Tesla Model S that crashed into a parked fire truck in Utah last Friday said she had the car’s semi-autonomous Autopilot system engaged at the time of the incident, police said Monday. The crash will likely bring yet more scrutiny to Tesla’s driver-assistance system, which is already under investigation by the National Transportation Safety Board.
In a statement, police in the city of South Jordan, Utah, reported that the 28-year-old female driver said in an interview that she had been using Autopilot at the time. The driver, police said, admitted that she was looking at her phone prior to the collision.
“Tesla has not yet received any data from the car and thus does not know the facts of what occurred, including whether Autopilot was engaged,” a company spokesperson said via email.
Tesla tells drivers in its owner’s manual that they must remain attentive at all times when Autopilot is engaged.
Tesla is facing a number of investigations right now, including one into a fatal March crash involving a Tesla Model X that had Autopilot engaged at the time.
Meanwhile, CEO Elon Musk fired back today at coverage of this crash, and others, that potentially involved Autopilot:
The crash happened around 6:30 p.m. MT Friday, when the Model S driver slammed into a stopped fire truck while traveling at 60 mph, police said. Witnesses indicated the Model S didn’t brake prior to impact.
The driver was transported to a nearby hospital with non-life threatening injuries, including a broken right ankle.
The investigation remains ongoing, police said.
Autopilot has long been the subject of scrutiny by regulators and news outlets, much to the dismay of Musk, who has asserted that reporting on crashes involving still-new semi-autonomous technology is akin to vehicular manslaughter.
Autopilot has been linked to at least three fatal crashes, including one as early as January 2016, when the driver of a Tesla Model S in China died after crashing into the back of a road-sweeping truck. The victim’s father has an ongoing case against Tesla, Jalopnik reported, claiming the automaker overstated the capabilities of Autopilot. (As with the Model X crash in March, Tesla blamed the driver, claiming the victim’s father told Tesla personnel that his son knew Autopilot very well and had read the Model S owner’s manual “over and over again.”)
In July 2016, a Tesla Model S driver in Florida died after his car crashed into a tractor trailer while Autopilot was engaged. The National Highway Traffic Safety Administration ultimately cleared Autopilot in the crash, but the NTSB later concluded that an “over-reliance” on the technology on the part of the victim, Joshua Brown, played a role in the collision.
Musk suggested this was an issue on a recent conference call. “When there is a serious accident, it is almost always, in fact, maybe always the case, that it is an experienced user,” he said. “And the issue is... more one of complacency, like we get too used to it.”
On Monday, the Wall Street Journal reported that Musk rejected adding eye-tracking technology to Autopilot, similar to General Motors’ Super Cruise system, over cost concerns. Musk countered that cost wasn’t the issue, claiming the technology was rejected because it’s “ineffective.”