
Of Course Semi-Autonomous Systems Can Fail

The IIHS study was released on Tuesday.
Photo: Eric Risberg (AP)

We’ve aired our grievances about the fallibility of semi-autonomous driving features time and again, and a new study out this week confirms the fact: Cars with driver-assist systems that include things like lane-keep and auto-steering and braking may not see stopped vehicles. Worse, reports the Associated Press, they could even steer you into a crash.


The study from the Insurance Institute for Highway Safety scrutinized five systems from Tesla, Mercedes, BMW, and Volvo on a track and public roads, reports the AP, which said, “The upshot is while they could save your life, the systems can fail under many circumstances.”


“We have found situations where the vehicles under semi-automated control may do things that can put you and your passengers at risk, and so you really need to be on top of it to prevent that from happening,” David Zuby, the institute’s chief research officer, told the news agency.

Such as:

Among the scariest findings by the Virginia-based institute was one involving the system in two Tesla vehicles, the Model S and Model 3. The institute tested the system with the adaptive cruise control turned off but automatic braking on. At 31 miles per hour, both Teslas braked and mitigated a crash but still hit a stationary balloon. They were the only two models that failed to stop in time during tests on a track.

Yet when the adaptive cruise control, which keeps a set distance from cars in front, was activated, the Teslas braked earlier and more gently and avoided the balloon, the agency said.

The Model 3 was the only vehicle in the test group that “failed to respond to stopped vehicles ahead of them,” the AP reports.

Tesla told Jalopnik that the Model S tested by IIHS used dated software, and that it recommends installing software updates as soon as they’re live. The automaker also said that in IIHS’s automatic emergency braking test, the vehicles braked significantly to mitigate the impact of the crash, and that the Model 3 and S manuals state that the AEB feature is “designed to reduce the severity of an impact,” not to avoid a collision. IIHS has also rated the Model 3 and S AEB systems as “superior” in the past.


Still, the findings underscore the very squishy reality surrounding these semi-automated systems. They require a driver’s full attention while at the same time giving the false impression that perhaps full attention isn’t necessary.

And it’s a crucial thing to examine and point out. Examples abound of drivers abusing these kinds of systems, and this year alone we’ve seen a number of severe—sometimes fatal—crashes involving cars that were being driven in a semi-autonomous mode.


The IIHS tested Tesla’s Autopilot system, along with cars equipped with similar features from BMW, Volvo and Mercedes. Zuby was frank with the AP, saying that while the systems do increase safety, the tests show they are not 100 percent reliable.

Many of the scenarios discovered by IIHS are covered in the vehicles’ owner’s manuals, which tell drivers they have to pay attention. But Zuby said not many people read their owner’s manuals in detail. Even though the systems have names like Tesla’s “Autopilot” or Volvo’s “Pilot Assist,” they are not self-driving vehicles, Zuby said. “They will help you with some steering or speed control but you really better be paying attention because they don’t always get it right,” he said.

Many of the cars’ lane-centering systems failed, especially on curves or hills. The BMW, Model S and Volvo “steered toward or across the lane line regularly,” requiring driver intervention, the IIHS said.


And that’s something we can’t underscore enough if you use these systems (which we indeed do like!): pay attention to the road while they’re in use.

Senior Reporter, Jalopnik/Special Projects Desk




From the very moment automakers started implementing this “autonomous” tech into mainstream vehicles, it was built on a fallacy, because they marketed it as a replacement for the healthy driving habits everyone should have: physically turning your head when changing lanes, turning or backing up, checking your mirrors, staying aware of your surroundings at all times, and so on. It’s unfortunate, because automakers did supply documentation and warnings saying these systems are not a replacement for all those things, but clearly the marketing department and the engineering department don’t communicate much.

It’s good to see pieces like this explain the issue, and I hope many more follow suit in explaining to the public to stop being dumb and relying solely on their tech rather than being good, responsible drivers. I’m sure this autonomous stuff will become pretty safe and fantastic one day, but we are not there yet. Not even close.