South Korean musician and actor Ji Chang Son made headlines recently for filing a lawsuit against Tesla after his Model X blasted through the wall of his house when he was attempting to park it. That much isn’t in dispute. What is somewhat up in the air is that Son claims the car accelerated on its own, and that it’s happened to other people, too. I’m not buying it.
Son appears to blame a mixture of the car’s electronic brain and its autonomy systems, while absolving himself of any responsibility for the incident. But the facts of the matter don’t really point in that direction at all. They—and the other cases Son cites—all seem to point in a more human direction.
We’re now entering a new era, though. One in which our cars record everything going on in an effort to save our lives and improve their own reliability. And in cases of sudden unintended acceleration, that data can be more illuminating than ever before. Tesla says that, universally, the data has borne out that drivers are to blame.
“In every case we’ve seen where unintended acceleration was alleged, the vehicle diagnostic logs confirmed the acceleration was due to the driver pressing the accelerator pedal,” a Tesla spokesperson told Jalopnik.
After Son crashed into his house in the September incident, he immediately went to Tesla and told them what happened, according to the Tesla spokesperson. Tesla responded by having a technician examine his car’s data, at which point Tesla says it found that Son depressed the throttle pedal to the maximum. (For the record, Tesla refused to share any of the specific data of Son’s case with Jalopnik, citing customer privacy requirements.)
Unsatisfied with this turn of events, Son went to Facebook, where he recounted his version of the incident, before finally saying (translated from the original Korean):
How could they make me out to be such a shameless person to put my life on the line that way? They boast that Tesla X is the safest car, but to my family, it is a name that we will never forget.
Son did a few Korean media interviews about the incident, and filed a class action lawsuit against Tesla in California, alleging that not only did his Model X careen wildly out of control and into his house, but that other people have experienced the issue, too.
Teslas all over are experiencing a “problem,” the suit alleges, and furthermore, Tesla knows about it and refuses to stop it. Which brings us to the meat of Son’s incident, as it is laid out in the lawsuit. He was pulling into his garage like normal, when his Tesla jerked forward:
...as Plaintiff Ji Chang Son slowly pulled into his driveway, the vehicle spontaneously began to accelerate at full power, jerking forward and crashing through the interior wall of the garage, destroying several wooden support beams in the wall and a steel sewer pipe, among other things, and coming to rest in Plaintiffs’ living room.
The complaint continues, telling a tale of courage and bravery, with Son’s own child helping him escape from the vehicle.
The suit then alleges that the Tesla Model X is defective because the car cannot save itself from a crash, whether that’s because the driver never asked for the acceleration in the first place, or because they did and the car failed to overrule them. But is that the case?
Understanding Sudden Unintended Acceleration
The crux of the matter involves a phenomenon known as “sudden unintended acceleration,” or SUA. Notice that the phrasing of that term is very careful—it’s not “that one time your car leapt wildly out of control and ran through a house,” but neither is it “Jesus, man, you just dumped your heavy right foot on the accelerator for no reason.”
A few years back a sort of mass panic ensued as literally thousands of Toyota drivers claimed their cars were violently lurching towards the horizon. Some blamed Toyota’s software. Wonky floor mat placement was cited in some incidents. But some drivers went as far as to claim that even when they positively stood on the brakes, they couldn’t get the things to stop.
In the end, the federal government found that the vast majority of cases involved drivers, well, dumping their heavy right feet on accelerator pedals for no reason. Pedal misapplication, in other words. It turns out that they were positively standing on the throttle pedal, and got confused:
“We found that when a complaint alleged the brakes didn’t work, what most likely happened was pedal misapplication,” said deputy NHTSA administrator Ron Medford.
A Fantastic, Wonderful, Slightly Glitchy Car
Son’s lawsuit begins by saying that just like the Tesla Model S, the Tesla Model X is a “computer on wheels.” After spending a few paragraphs noting how “futuristic” the Model X is, the suit comes to the kicker:
As is true for all computers, however, Tesla vehicles are only as good as the hardware, engineering, and programming of their onboard computers. As even casual computer users know, even the most sophisticated and successful computer companies in history, such as Microsoft and Apple, regularly release computers and software with bugs, glitches, and unanticipated problems that cause their computers to unexpectedly crash, malfunction, or work differently than intended.
These bugs have serious consequences for users of traditional computer products. But for a computer that controls a 5,000 pound machine that can explosively accelerate to 60 miles per hour in under 3 seconds, the consequences of a computer glitch would be catastrophic.
In theory, that part of the suit is not wrong. Computer glitches happen with some frequency, as anyone who has ever touched a computer can attest. And as Son’s suit notes, the Tesla Model X has in fact suffered from a few “glitches” of its own, if that’s what we want to call them. Those snazzy falcon wing doors didn’t exactly work properly when they first debuted. Tesla drivers complained about climate control issues. There are videos of the self-parking feature not actually self-parking where it should.
But none of those glitches seems to be a significant safety issue, let alone one so drastic as a behemoth of a car launching itself without the driver’s command. And even if that’s what happened to Son, you’d have to find a pattern of similar events to really ascribe any fault to Tesla here.
Son’s lawsuit contends that the pattern does in fact exist, and he feels he can prove it. Specifically, by citing complaints to the National Highway Traffic Safety Administration, or NHTSA.
The NHTSA Complaints
Son’s lawsuit alleges that there are eight SUA complaints to NHTSA, and I can verify that SUA complaints do exist in NHTSA’s database. The suit then contends, prima facie, that because those complaints exist, the Model X has a known electrical fault, and that Tesla must do something about it.
But looking at the actual NHTSA complaints themselves presents another story. Stories of paranoia, suspicion, disbelief, and a general refusal of personal responsibility.
“OUR 5 DAY OLD TESLA X WHILE ENTERING A PARKING STALL SUDDENLY AND UNEXPECTEDLY ACCELERATED AT HIGH SPEED ON ITS OWN CLIMBING OVER GRASS AND CRASHED INTO A BUILDING,” one complaint from June 2016 contends. There’s no placement of guilt, no blame whatsoever. It was just a thing that happened.
Another complaint, while ostensibly about “unintended acceleration,” seems more likely to be a case of unfamiliarity with a car that can accelerate to highway speeds in 2.9 seconds. “Wow guys,” it begins:
I WAS DRIVING INTO A PARKING LOT AND I JUST LIGHTLY PRESSED THE ACCELERATOR AS I WAS GOING UNDER 10 MPH AND ALL OF A SUDDEN MY X WENT FROM 10 TO OVER 40 MPH IN ABOUT 2 SECONDS! I DIDN’T EVEN KNOW THE THING COULD ACCELERATE THAT FAST! CAN ANYBODY EXPLAIN WHAT THE HECK MIGHT’VE HAPPENED?
Other complaints are filled with similar tales of brutal acceleration, coupled with innocent pleas of “I lightly pressed the accelerator,” or the notion that they were stepping on the brake, followed by communications with Tesla revealing that no, the vehicle’s logs showed a 100 percent throttle application by the driver.
The biggest common thread among them seems to be that if there is blame to be directed somewhere, it’s to be directed at Tesla. Never at the drivers themselves.
But there was one complaint that stuck out to me, if for nothing else than its own claimed technical sophistication:
IT APPEARS THAT TESLA IS STILL USING AN ACCELERATOR PEDAL THAT HAS ONLY A SINGLE SENSOR.
THIS PEDAL IS SUPPLIED BY FORD WHO HAVE A HISTORY OF UA INCIDENTS INVOLVING ELECTRONIC THROTTLE CONTROL.
SOME MANUFACTURERES HAVE NOW SWAPPED TO A DUAL SENSOR 6 WIRE SYSTEM.
IN THE EVENT OF A BAD INPUT FROM ONE SENSOR THIS TRIGGERS A FAULT CODE AND DISABLES THE ACCELERATOR.
TESLA HAS REPEATEDLY CLAIMED THAT IN ALL THESE INCIDENTS THE LOGS SHOW 100% THROTTLE.
THIS CAN EASILY HAPPEN DUE TO DIRT OR WHISKERS ON THE SENSOR.
THE LOG WOULD BE UNABLE TO SHOW WHETHER IT WAS A GENUINE CASE OF PEDAL PUSHED TO THE FLOOR OR AN ELECTRICAL GLITCH.
For someone without an understanding of how throttle sensors work, or for someone who hasn’t realized that it’s actually quite easy to get in touch with Tesla (the CEO loves interacting with randos on Twitter, even), that all sounds very credible, at least at first.
Tesla’s Defense
Leaving aside for a minute that a dirty throttle sensor would actually result in a loss of demanded power, rather than an increase, I emailed Tesla to see if they had any comment on this specific complaint. It turns out, Tesla says, that this claim was inaccurate, too. Tesla isn’t using a single-sensor pedal at all.
“The accelerator pedal that Tesla uses contains two sensors and has 6 wires,” a Tesla spokesperson told me. “In the case of a fault on either sensor, the system enters a failsafe condition.”
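To make that distinction concrete, here is a minimal sketch of how a dual-sensor pedal lets software catch a bad signal. The names, thresholds, and structure are my own invention for illustration, not Tesla’s firmware; the point is only the general plausibility check: if either channel reads out of range, or the two channels disagree, the controller stops trusting the pedal and drops into a failsafe.

```python
# Illustrative sketch only -- hypothetical values, not Tesla's firmware.
PLAUSIBLE_RANGE = (-0.05, 1.05)  # bounds for a normalized 0..1 pedal signal
MAX_DISAGREEMENT = 0.10          # how far the two channels may drift apart

def pedal_request(sensor_a: float, sensor_b: float) -> float:
    """Return the requested throttle (0..1), or 0.0 in the failsafe condition."""
    lo, hi = PLAUSIBLE_RANGE
    for reading in (sensor_a, sensor_b):
        if not (lo <= reading <= hi):
            return 0.0  # implausible signal (short or open circuit): failsafe
    if abs(sensor_a - sensor_b) > MAX_DISAGREEMENT:
        return 0.0      # channels disagree (dirt, "whiskers", drift): failsafe
    # Healthy pedal: average the two channels and clamp to 0..1.
    return max(0.0, min(1.0, (sensor_a + sensor_b) / 2))

print(pedal_request(0.82, 0.80))  # healthy pedal: ~0.81
print(pedal_request(0.82, 0.20))  # one bad channel: 0.0, not a phantom full throttle
```

The second sensor exists for exactly the scenario the complaint above imagines: a single dirty channel can’t masquerade as a floored pedal, because the other channel has to agree with it.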
Even with all that, however, there was one thing that was still nagging me. Shouldn’t the Tesla Model X’s sensors detect objects in front of the vehicle? And if so, shouldn’t they prevent gross pedal misapplication events like these from happening?
A spokesperson for Tesla told Jalopnik that the sensors do detect objects, including walls, in front of the vehicle. But the reason why they don’t stop the car in situations like Son’s can be boiled down to the notion that, in the end, the human is always the boss.
Humans are fallible, but when they truly want something, the car will give it to them. Tesla’s cars have software designed to recognize when a driver has misapplied the throttle, such as a light tap on the accelerator when the driver is actually trying to carefully park. But if a driver gives it a boot-full?
“Pedal misapplications make up only a super tiny percent of the occasions that drivers apply the accelerator pedal,” the Tesla spokesperson said. “Therefore, so as not to interfere with the driver’s ability to accelerate when they need to, the system only limits acceleration in clearly unambiguous cases of pedal misapplication and may not mitigate all driver pedal errors.”
In short, if the driver really stomps on the throttle, the car will really give the driver every ounce of its power, anything the car senses in the way be damned. That might seem unsafe on the surface, but it actually reinforces the importance of human driving. No camera is yet as good as the combination of the human eye and brain. If a human says they truly need to accelerate, well then the car “trusts” the human to make that decision.
The driver is still always the one ultimately in charge.
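If you want a feel for what that kind of gating might look like, here’s a rough sketch. To be clear, the thresholds and the shape of the condition are invented by me for illustration; Tesla hasn’t published its actual logic. The idea is simply that the car intervenes only when a throttle request looks unambiguously like a parking-lot mistake, and defers to the driver in every other case.

```python
from typing import Optional

def commanded_torque(throttle: float, speed_mph: float, obstacle_m: Optional[float]) -> float:
    """Return the torque fraction (0..1) actually sent to the motors.

    Hypothetical logic for illustration only -- not Tesla's code.
    """
    clearly_misapplied = (
        obstacle_m is not None and obstacle_m < 3.0  # wall or car right in front
        and speed_mph < 5.0                          # parking-speed maneuver
        and throttle > 0.95                          # pedal suddenly floored
    )
    if clearly_misapplied:
        return 0.2 * throttle   # blunt the launch, but don't ignore the driver entirely
    return throttle             # every other case: the driver gets what they asked for

print(commanded_torque(1.0, 3.0, 1.5))   # floored while creeping at a wall: limited
print(commanded_torque(1.0, 3.0, None))  # floored with nothing detected ahead: full power
```

Whatever Tesla’s real criteria are, they’re evidently narrower than this toy version, since by the company’s own account a committed stomp like the 100 percent application in Son’s logs still gets full power.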
Aside from the string of public complaints in its database, NHTSA’s own expert findings actually seem to back up Tesla in this case. “A NHTSA study shows that these crashes can occur up to 16,000 times per year in the United States – that’s almost 44 incidents per day,” one report notes.
And furthermore, these incidents are actually more likely to occur when a driver is trying to be careful, such as when they’re parking, than when they’re doing something as simple as just driving down the highway:
These incidents are initiated most frequently in vehicles that are traveling at very low speeds, such as when attempting to park the vehicle in parking lots and driveways. They can also occur in other situations in which braking is commonly required, including intersections and highway exit ramps. Many drivers recognize that a pedal error occurred after the incident, but are unable to correct the error in time to prevent a crash. This happens because once the initial pedal error occurs, the situation develops rapidly, often in the confined space of a parking lot, with drivers only having a few seconds to correct the issue while they are often startled and stressed by the unexpected acceleration of the vehicle.
In Son’s specific case, Tesla says that they detected a 100 percent throttle input from the driver, and the car responded accordingly. And furthermore, the system cuts power entirely if it detects both the brake pedal and the throttle pedal to be depressed at the same time. If Son wanted to slow down, all he had to do was tap the brake.
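That brake-override behavior is the simplest piece of all of this to picture. A minimal sketch, again with invented names rather than Tesla’s actual code: if the brake is detected while the accelerator is also applied, propulsion drops to zero rather than the car trying to arbitrate between the two pedals.

```python
def drive_torque(accel_fraction: float, brake_pressed: bool) -> float:
    """Return commanded drive torque as a fraction of maximum (illustrative only)."""
    if brake_pressed and accel_fraction > 0.0:
        return 0.0          # both pedals down: cut propulsion entirely, the brake wins
    return accel_fraction   # otherwise, honor the accelerator as usual

print(drive_torque(1.0, brake_pressed=True))   # 0.0 -- the brake always wins
print(drive_torque(0.3, brake_pressed=False))  # 0.3 -- normal driving
```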
But with systems like Autopilot able to increase the speed of the vehicle, couldn’t the vehicle launch itself on its own, while its sensors erroneously read a physical depression of the accelerator pedal?
Again, Tesla says no:
In Model S and Model X, there are two redundant sensors located on the accelerator pedal that monitor the pedal’s physical position. An independent monitor in the vehicle continuously compares the readings from both of these sensors, and both sensor readings must remain consistent in order for the vehicle to provide full torque. Neither Autopilot, nor the vehicle’s controls have the ability to physically move this pedal. If the pedal moves, as confirmed by these sensors, the movement is caused by an external source; typically a human foot.
In all of these cases, Tesla’s statements are quite clear. A throttle pedal push is a throttle pedal push, and there’s no mistaking it for anything else.
Nevertheless, Son’s suit alleges that, in large part because of the NHTSA complaints, “Tesla knew that the Model X was defectively designed or manufactured, unsafe, and was not suitable for its intended use,” and that Tesla knew that the Model X “would fail without warning.”
At the heart of it, Son feels that he was not at fault, and even if he was at fault, the car should have stopped him regardless.
Which leads us to another question:
What Is It We Want From Our Cars?
As we enter the brave new world of autonomous driving, what do we want from our cars in the here and now?
Do we still want to maintain our control while these systems still work their kinks out? Or do we demand that they have the power to override drivers if the car feels the driver is wrong? Is it responsible on our part to even demand such a thing when we know that these systems aren’t perfect? Or is it worth it, because they could prevent incidents like the one that happened to Son?
For now, Tesla (and pretty much every other carmaker) has said that as long as these systems have not yet reached perfection, the driver is still responsible and in control. Even if the driver screws up.
Until these systems are “perfect,” we can’t really ask them to be our benevolent chauffeurs.