Tesla and NHTSA have been spoiling for a fight for years now, or perhaps have been quietly fighting all along, given the Biden Administration’s renewed interest in flexing its regulatory muscles in the name of safety, at least relative to the Trump Administration’s. NHTSA struck a new blow on Thursday, saying it would expand its investigation into Tesla’s Autopilot after a series of crashes into emergency-scene vehicles.
First reported in August, NHTSA’s probe initially targeted 11 crashes involving Teslas. On Thursday, NHTSA said it had identified six more crashes, and that the crashes were tied to 15 injuries and one fatality. The crashes all involve collisions with first-responder vehicles, though NHTSA indicated Thursday that its investigation would go beyond those incidents.
At issue is, in part, how well Tesla’s various semi-autonomous systems perform even when the driver is apparently engaged. From NHTSA’s report:
The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.
All subject crashes occurred on controlled-access highways. Where incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact. Additional forensic data available for eleven of the collisions indicated that no drivers took evasive action between 2-5 seconds prior to impact, and the vehicle reported all had their hands on the steering wheel leading up to the impact. However, most drivers appeared to comply with the subject vehicle driver engagement system as evidenced by the hands-on wheel detection and nine of eleven vehicles exhibiting no driver engagement visual or chime alerts until the last minute preceding the collision (four of these exhibited no visual or chime alerts at all during the final Autopilot use cycle).
Still, NHTSA is curious whether Teslas do a good enough job of keeping drivers engaged. Or, at least, NHTSA has seen enough evidence to warrant a deeper look.
A driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect. This is particularly the case if the driver behavior in question is foreseeable in light of the system’s design or operation. For systems labeled as SAE Level 2 ADAS, important design considerations include the ways in which a driver may interact with the system or the foreseeable ranges of driver behavior, whether intended or unintended, while such a system is in operation. This is because these systems still depend upon the driver to maintain supervisory responsibility for the DDT, whereas the vehicle features perform only a support role. As such, ensuring the system facilitates the driver’s effective performance of this supervisory driving task presents an important safety consideration.
Accordingly, [Preliminary Evaluation 21-020] is upgraded to an Engineering Analysis to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision. In doing so, NHTSA plans to continue its assessment of vehicle control authority, driver engagement technologies, and related human factors considerations.
When NHTSA announced its Preliminary Evaluation in August, it covered around 765,000 Model S, X, Y, and 3 vehicles; on Thursday, the agency said that number had grown to around 830,000 vehicles, spanning model years 2014-2022. The upgrade to an Engineering Analysis is the last major step before NHTSA decides whether to issue a Recall Request Letter, which functions more like a recall order, and which Tesla could then challenge, including in federal court.
Tesla could also, of course, issue a voluntary recall if it determines there is a safety defect with its cars, something automakers do all the time, but I’m not sure anyone expects that in this case. I emailed Tesla for comment and will update this post if the company responds.