Last November, Elon Musk announced that Tesla’s so-called “Full Self-Driving Beta” software would become available to all Tesla owners. Just hours later, news broke that a Tesla Model S had caused an eight-car pile-up on San Francisco’s Bay Bridge. The crash sent nine people to the hospital and caused a massive traffic jam as emergency crews had to stop traffic for 90 minutes to bring in ambulances and clear the wrecked cars from the bridge. The driver claimed “Full Self-Driving” was active at the time of the crash.
Today, The Intercept published videos and photos of the crash that it obtained from a California Public Records Act request. We can’t embed them here, but you should definitely head over to the linked article to give the videos a watch. It’s a pretty bad pile-up, and people were injured, but thankfully, none of the injuries were life-threatening, and you don’t see anything graphic in the footage.
The video confirms initial reports that the Tesla was driving with traffic before changing lanes while braking for no discernible reason. The footage also appears consistent with previous reports of Tesla drivers regularly experiencing “phantom braking” in cars with FSD activated. The Intercept also reports that at least 285,000 Teslas in North America are now equipped with FSD.
In February of last year, the National Highway Traffic Safety Administration announced an investigation into Tesla's deceptively named driver-assistance feature. Over the course of nine months, NHTSA said it had received 354 complaints about Tesla phantom braking. The incidents continued to occur even after Tesla was forced to roll back its FSD update in October 2021.
As you can see in the videos published by The Intercept, Tesla clearly hasn't worked out the problem, despite knowing about it for more than a year now. But Elon Musk did recently tweet that the automaker plans to remove one of Tesla's few FSD safety features: the "steering wheel nag" part of its driver monitoring system. That Musk tweet also got the attention of NHTSA, which confirmed yesterday that it has contacted Tesla about the tweet as part of its larger investigation into the automaker's driver-assistance system.
On the same day, Tesla announced another update to its FSD policy that would suspend drivers who abused the system for only two weeks. Previously, inattentive drivers could be locked out of using FSD for as long as six months.
It’s not clear when NHTSA will wrap up its investigation or what it will do once the investigation is concluded. But yesterday, Ann Carlson, the acting head of the agency, told Reuters, “the resources require a lot of technical expertise, actually some legal novelty and so we’re moving as quickly as we can, but we also want to be careful and make sure we have all the information we need.”
Since 2016, the agency has reportedly opened at least three dozen special investigations into crashes involving Teslas where driver-assistance software was likely in use. So far, 19 deaths have been attributed to these crashes.