The consumer advocacy groups Center for Auto Safety and Consumer Watchdog called on the Federal Trade Commission today to investigate Tesla for “deceptive advertising and marketing practices and representations” made regarding Tesla’s Autopilot semi-autonomous driving system. The letter specifically points out advertising, statements, and marketing materials that give the impression that the Autopilot system is far more capable than it actually is right now.
Tesla disagreed, and said its claims around Autopilot’s abilities are very clear.
In a letter sent to Federal Trade Commission Chairman Joseph Simons, the two consumer groups state the problem as follows:
Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is. After studying the first of these fatal accidents, the National Transportation Safety Board (NTSB) determined that over-reliance on and a lack of understanding of the Autopilot feature can lead to death. The marketing and advertising practices of Tesla, combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of “self-driving.”
The crux of the issue lies partly in the nature of all Level 2 semi-autonomous systems: they require a driver to be ready to take control should the situation demand it, even with warnings in place like Autopilot's. I don't think it's reasonable to expect a person to take control if needed. That's not how people work, and we've seen over and over again that people will find ways to not pay attention in semi-autonomous vehicles. (Indeed, you are trusting the car's systems to know if a problem even exists, and they may not; warnings happen in best-case and predictable scenarios, like when a sensor becomes obstructed or road lines become too hard to "see.")
The bigger problem, and the issue directly addressed by this request for an investigation, is that Tesla's marketing, advertising, promotion, and general attitude toward its Autopilot system imply that it's much more than it is: a full self-driving system, which it isn't.
If you think I'm being unfair in saying that Tesla suggests Autopilot is a full self-driving system, I'd point out that Tesla's own website blares "Full Self-Driving Hardware" before explaining, in smaller print, the caveat that the hardware is only "capable" of full self-driving "in the future."
What Tesla is saying is that its Model S vehicles supposedly have the hardware potential to have “full self-driving” in the future, but at the moment, they do not have that ability. It’s this descriptor that watchdog groups find misleading—they worry it could imply to less-savvy customers (of which, let’s admit, there are many) that the car can drive itself now or at least is capable of doing things it actually can’t.
The letter points out this example, and cites a number of others as well, to make its case that Tesla is overselling the capabilities of Autopilot, and that this is having dangerous, and sometimes fatal, results. From the letter:
“Tesla is the only automaker to market its Level 2 vehicles as “self-driving”, and the name of its driver assistance suite of features, Autopilot, connotes full autonomy. In addition to these formal marketing and advertising ploys, Elon Musk, Tesla’s CEO, frequently misleads and deceives consumers about Autopilot’s safety and capabilities. Also, technical aspects of Autopilot, such as allowing for prolonged periods without touching the steering wheel with no way of determining whether drivers are in fact monitoring their driving environment, a required task for drivers of SAE Level 2 vehicles, deceive and mislead consumers into believing Autopilot is more advanced than it is. These formal and informal representations, combined with the technical features of Tesla vehicles, lead reasonable consumers to believe that Autopilot is more than mere driver assistance.
Visitors to the Autopilot page on Tesla’s website are greeted with large typeface proclaiming, “Full Self-Driving Hardware on All Cars” and below that message, in standard sized typeface, “All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.” After presenting two links to order a Model S or Model X, ostensibly with “full self-driving hardware”, there is a video of a Tesla vehicle driving itself through a suburb; the video begins with the typed words “THE PERSON IN THE DRIVER’S SEAT IS ONLY THERE FOR LEGAL REASONS. HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF.”
The letter goes on to cite examples of Tesla and/or Elon Musk, via his personal social media accounts or statements made to the media, making Autopilot out to be safer or more capable than it actually is.
The two advocacy groups claim that Tesla is in violation of Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices, regarding how it markets and advertises the Autopilot feature, and they make a pretty compelling argument.
The press release is here. If you want to read the letter in full, I have it posted below. It's now up to the FTC to decide whether an investigation of Tesla regarding this matter will happen.
But a Tesla spokesperson countered by saying the company has been clear about what Autopilot can and can't do, and that owners understand that.
“The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of.”
Tesla says it spells out specifically what the Autopilot system is capable of on its purchase page and the Autopilot page. It also provided samples of the on-screen alerts that appear in the instrument panel display when Autopilot is engaged, along with the warning display that Autosteer shows on the car's screen.
We’ll see what the FTC has to say about this matter soon. But all of this comes at a tough time for Tesla, which is also facing an investigation by the National Transportation Safety Board for various crashes, production issues with the new Model 3, labor issues, and a potential cash crunch.
This story has been updated to clarify the nature of Autopilot's warnings and the potential hazards of Level 2 semi-autonomous cars.