The Way We Define Automated Driving Is Hell And It's All Of Our Faults

Photo: Cadillac

Reporters who cover the burgeoning self-driving car industry threw a fit this week over a New York Times column that failed to accurately capture the automated capabilities of a new Volvo S90. “Driverless Cars Made Me Nervous,” the piece’s headline reads, ignoring the fact that no car on the road today can operate without a driver behind the wheel.


It’s hard to talk about this stuff without seeming like a pedantic mope, but this is becoming a problem. It’s important, and it’s something we’ve harped on relentlessly as the auto industry shifted gears this year to accelerate the development of fully-autonomous cars. The National Transportation Safety Board recently emphasized the point, saying that an “overreliance” on automation was part of what led to the 2016 death of a Tesla owner driving in his car’s semi-autonomous Autopilot mode. YouTube videos abound of Tesla owners pushing the limits of what their vehicles can handle.

Intentional or not, blanket statements offered by everyone from our current transportation secretary to the writer of the NYT column, Pulitzer Prize winner David Leonhardt, can give the distinct impression that cars on the road today come equipped with automated technology far more capable than anything that actually exists.

For Leonhardt’s part, he set the scene with a neologism that immediately made my blood boil:

On my fourth day in a semi-driverless car, I finally felt comfortable enough to let it stop itself. Before then, I’d allowed the car — a Volvo S90 sedan — to steer around gentle turns, with my hands still on the wheel, and to adjust speed in traffic. By Day 4, I was ready to make a leap into the future.

Semi-driverless? Did Leonhardt vanish into thin air when the S90's adaptive cruise control was set? Did he hop into the backseat? It implies—at least, to me, but I’m not an NYT editor—that a driver isn’t necessary for part of the ride.

I highly doubt that’s what Volvo conveyed to Leonhardt, and if it did, the automaker should brush up on its safety instructions, because there’s no car on the road today with automated functions that doesn’t require the driver to pay attention. No matter what.


Who’s to blame for this? And by this, I mean the dizzying array of definitions deployed to describe what’s (generally) a combination of adaptive cruise control and whatever fancy name an automaker gives the technology that keeps a car within a lane.

Car and Driver reporter Pete Bigelow captured most of them yesterday:


Everyone’s to blame. Reporters, the standards set by engineers and adopted (for the most part) by the industry, automakers, regulators, policymakers. That’s why I’m sympathetic to Leonhardt. It’s a confounding mess.

The SAE Chart

The industry leans heavily on a ranking system defined by the Society of Automotive Engineers. At level 0 (say, a Ford Pinto), the driver is in control of everything. At level 1, the car can assist with a single task, such as adaptive cruise control. At level 2—Tesla’s Autopilot, for instance—the vehicle can handle significant driving maneuvers, like lane changes on a highway.


Level 5 is the gold standard, the purportedly idyllic state of automation for a car, in which the driver is eliminated and the car is expected to handle all driving tasks without any oversight.

But look at the SAE’s definitions:


The narrative definitions are probably impossible for the layman to understand. The breakdown next to them makes more sense—at level 3, SAE says, the system monitors the driving environment.

OK. Audi’s calling its new Traffic Jam Pilot a level 3 system, and it will first be installed in the 2018 A8 sedan. Putting aside the relevant concerns about a level 3 system, here’s how Audi says it’ll work:

The traffic jam pilot manages starting, accelerating, steering and braking. The driver no longer needs to monitor the car permanently. They can take their hands off the steering wheel permanently and, depending on the national laws, focus on a different activity that is supported by the car, such as watching the on-board TV.


But when the system “reaches its limits,” Audi says, the driver needs to be prepared to grab control. That seems to jibe with SAE’s take that drivers need to be alert to grab the wheel on a moment’s notice—but shouldn’t they also be monitoring the driving environment then? Both seem to go hand-in-hand.

It’s Not Just The NYT

Reporters, those tasked with making sense of the world, also repeatedly goof up. The Drive’s Lawrence Ulrich wrote a great piece about Leonhardt’s column that hits on this point directly.

There are no “driverless cars” in showrooms, or “self-driving cars,” or whatever weasel words can snare eyeballs and advertising. Someday, there will be. Some automakers have promised too much, too soon. That bugs us, too, but it’s part of their job. It’s largely the media, led by the usual Silicon Valley sycophants, that keeps hyping cars that don’t exist, and misrepresenting the features that do. That’s not their job.


Excellent point! There’s no driverless or fully-autonomous car for sale today. But even Ulrich botched one part of his otherwise solid piece.

The S90 can’t even change lanes on a highway when cued by the driver’s turn signal, which is something that Teslas, Mercedes and now Cadillacs can do.


The new Cadillac Super Cruise system explicitly doesn’t allow lane changes on a highway—something I asked GM reps about last month after testing the new CT6 equipped with the semi-autonomous features. (It’s possible down the line, one told me, but for now, “We want [drivers] in the loop during lane changes.”)


Over dinner one night, Robb Bolio, Cadillac’s vehicle performance manager who was involved in the Super Cruise project, was drilled about how he’d define the feature under the SAE chart. He balked, and probably rightly so. Later, I got into a brief back-and-forth with another reporter about whether Super Cruise is level 2 (my take) or level 3 (his). We agreed to disagree.

Examples of journalists overstating the technology are everywhere. In describing the new federal policy guidelines for self-driving cars, the Washington Post last month opened its piece with, “Each day, driverless cars carry passengers around U.S. cities big and small.” Whoops.


That’s not to say Jalopnik’s free of sin here. I slip up, for sure. The SAE chart is definitely confusing, but just last month I leaned on it to discuss the new Audi system. In April, I called Google’s fleet of cars for a pilot project “autonomous”—the implication being they’re fully-autonomous. But that’s not the case, at least from a passenger standpoint: an engineer still needs to be in the driver’s seat to resume control of the wheel if needed.

It’s hard, though. If most of the auto industry has glommed onto the SAE levels as a way forward, extricating yourself from them to describe a car’s automated functions becomes an exercise in mental gymnastics.


Self-driving cars being tested on the road today are indeed impressive—a vehicle handling turns on its own is something to behold. But until we’re able to sit back and safely take a nap on the expressway, until cars without a steering wheel and pedals are ferrying passengers around town, the phrase “driverless” or “fully-autonomous” should probably be used only to describe hypothetical scenarios of the future.

What To Do

I don’t have a full-fledged solution to offer here. Just last week, I debated a colleague about Tesla’s decision to name its semi-autonomous suite “Autopilot.” My esteemed coworker felt the name was appropriate; I didn’t. His argument was that Autopilot accurately describes the situation—airline pilots don’t take a nap when they engage their plane’s autopilot.


Sure, but people are also capable of doing stupid things. Tesla even designed the system knowing that some drivers would be inattentive idiots. (The company also stresses that the system is intended for use on divided highways, yet still allows it to be engaged on low-speed city streets. But I digress.) Still, my colleague said some of the onus should be placed on drivers themselves. They should ask questions until they’re confident and comfortable about what they’re getting themselves into. I think he has a point.

But automakers, tech developers, policymakers, and regulators (who seem to prefer a hands-off approach to overseeing the fledgling industry), also need to accept that some stringent, clear definitions are needed to prevent another fatal accident from happening—a scenario that could derail any momentum the autonomous car industry has going for it. Maybe new legislation could be the vehicle for that.


Reporters should do their best to accurately describe the situation at hand. Personally, I think the phrase “semi-autonomous” gets closest to the target for something like a Tesla or the new Super Cruise system. Common sense suggests a Tesla Model S or a Cadillac CT6 that drives itself for several miles on the highway is performing the task “autonomously” or is “driving itself.” Can it do that all the time? No. So just say that: “Super Cruise can handle driving the CT6 on the freeway, allowing you to sit back without your hands on the wheel. But you still have to pay attention to the road.” Some variation of that isn’t hard.

More-advanced cars being tested on the road in California, say, those that can handle harder tasks like turning a corner? Same thing: Plainly state what those cars can do, but also mention that someone’s behind the wheel to resume control of the vehicle if necessary. California requires automakers to submit annual reports on how often their test drivers had to take the wheel back. Those offer perhaps the clearest evidence we have that no car on the road is fully-autonomous. Even Google’s self-driving car project, Waymo, recorded 124 so-called disengagements in 2016, and Waymo’s considered a front-runner in the autonomous car race.


Or, if the SAE levels are here to stay, here’s a potentially even easier way to sum things up. Level 1-4: You still need to pay attention. Level 5: Get drunk.

Senior Reporter, Jalopnik/Special Projects Desk



Suggestion: rate systems on how quickly they require driver intervention in the event of a change in environment. A regular car would require near-instant intervention. A vehicle with collision avoidance might give the driver a few seconds. A few seconds may not be enough for the driver to be completely inattentive (watching TV, sleeping), but it may be long enough for them to put down their phone, so it can be enough to help fight the scourge of texting and driving.

30s-1m would be enough for the driver to come to from being completely distracted (by a TV show), while 10 minutes would be enough to wake up (this is basically the case with a car that can drive itself unless it starts snowing, in which case it finds a place to pull over). 30 minutes or more would be as fully autonomous as we are likely to ever see (I doubt any system will handle sudden volcanoes or alien invasions well).
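As a rough illustration only, the commenter's tiers might be sketched in code like this (the thresholds and labels are my own interpretation of the comment, not anything from SAE or an automaker):

```python
# Illustrative sketch of the commenter's proposal: rate driver-assist
# systems by how much warning time they give before a human must take
# over. All thresholds and labels here are hypothetical.

def required_attention(takeover_seconds: float) -> str:
    """Map a system's takeover-warning budget to what the driver may do."""
    if takeover_seconds < 3:
        return "eyes on road, hands near wheel"    # a regular car: near-instant
    if takeover_seconds < 30:
        return "may put down the phone in time"    # collision avoidance: a few seconds
    if takeover_seconds < 600:
        return "may watch TV, must stay awake"     # 30s-1m to come to from a show
    if takeover_seconds < 1800:
        return "may nap, car pulls over if stuck"  # ~10 min is enough to wake up
    return "effectively fully autonomous"          # 30 min+: as good as it gets

print(required_attention(2))     # a regular car
print(required_attention(45))    # TV-watching territory
print(required_attention(3600))  # the nap-on-the-expressway dream
```

The appeal of a scale like this is that it answers the only question drivers actually care about — how long can I safely look away? — rather than asking them to parse the SAE's narrative definitions.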