Everyone’s to blame. Reporters, the standards set by engineers and adopted (for the most part) by the industry, automakers, regulators, policymakers. That’s why I’m sympathetic to Leonhardt. It’s a confounding mess.

The SAE Chart

The industry focuses heavily on a ranking system defined by the Society of Automotive Engineers. At level 0 (say, a Ford Pinto), the driver’s in control of everything. At level 1, the car offers a single sustained assist, like adaptive cruise control. At level 2—Tesla’s Autopilot, for instance—the vehicle can handle significant driving maneuvers, like lane changes on a highway.


Level 5 is the gold standard, the purportedly idyllic state of automation for a car, in which the driver is eliminated and the car is expected to handle all driving tasks without any oversight.

But look at the SAE’s definitions:


The narrative definitions are probably impossible for the layman to understand. The breakdown next to them makes more sense—at level 3, SAE says, the system monitors the driving environment.
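The breakdown condenses into something like the following lookup. This is a sketch in my own paraphrase, not SAE's official wording; the "example" labels are illustrative.

```python
# A rough paraphrase of the SAE J3016 levels as summarized above.
# Labels are illustrative, not SAE's official language.
SAE_LEVELS = {
    0: {"example": "no automation (a Ford Pinto)", "monitors_environment": "driver"},
    1: {"example": "single sustained assist (adaptive cruise)", "monitors_environment": "driver"},
    2: {"example": "combined steering and speed (Autopilot)", "monitors_environment": "driver"},
    3: {"example": "conditional automation (Traffic Jam Pilot)", "monitors_environment": "system"},
    4: {"example": "high automation within a limited domain", "monitors_environment": "system"},
    5: {"example": "full automation, no driver needed", "monitors_environment": "system"},
}

def who_watches_the_road(level: int) -> str:
    """Return who is responsible for monitoring the driving environment."""
    return SAE_LEVELS[level]["monitors_environment"]
```

Much of the confusion in this piece boils down to that one field: at level 2 the human watches the road, at level 3 the machine does.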

OK. Audi’s calling its new Traffic Jam Pilot a level 3 system; it will first appear in the 2018 A8 sedan. Putting aside the relevant concerns about a level 3 system, here’s how Audi says it’ll work:

The traffic jam pilot manages starting, accelerating, steering and braking. The driver no longer needs to monitor the car permanently. They can take their hands off the steering wheel permanently and, depending on the national laws, focus on a different activity that is supported by the car, such as watching the on-board TV.


But when the system “reaches its limits,” Audi says, the driver needs to be prepared to grab control. That seems to jibe with SAE’s take that drivers need to be alert to grab the wheel on a moment’s notice—but shouldn’t they also be monitoring the driving environment then? Both seem to go hand-in-hand.

It’s Not Just The NYT

Reporters, those tasked with making sense of the world, also repeatedly goof up. The Drive’s Lawrence Ulrich wrote a great piece about Leonhardt’s column that hits on this point directly.

There are no “driverless cars” in showrooms, or “self-driving cars,” or whatever weasel words can snare eyeballs and advertising. Someday, there will be. Some automakers have promised too much, too soon. That bugs us, too, but it’s part of their job. It’s largely the media, led by the usual Silicon Valley sycophants, that keeps hyping cars that don’t exist, and misrepresenting the features that do. That’s not their job.


Excellent point! There’s no driverless car or fully autonomous car for sale today. But even Ulrich botched one part of his otherwise sharp piece.

The S90 can’t even change lanes on a highway when cued by the driver’s turn signal, which is something that Teslas, Mercedes and now Cadillacs can do.


The new Cadillac Super Cruise system explicitly doesn’t allow lane changes on a highway—something I asked GM reps about last month after testing the new CT6 equipped with the semi-autonomous features. (It’s possible down the line, one told me, but for now, “We want [drivers] in the loop during lane changes.”)


Over dinner one night, Robb Bolio, Cadillac’s vehicle performance manager who was involved in the Super Cruise project, was drilled about how he’d define the feature under the SAE chart. He balked, and probably rightly so. Later, I got into a brief back-and-forth with another reporter about whether Super Cruise is level 2 (my take) or level 3 (his). We agreed to disagree.

Examples of journalists overstating the technology are everywhere. In describing the new federal policy guidelines for self-driving cars, the Washington Post last month opened its piece, “Each day, driverless cars carry passengers around U.S. cities big and small.” Whoops.


That’s not to say Jalopnik’s free of sin here. I slip up, for sure. The SAE chart is definitely confusing, but just last month I leaned on it to discuss the new Audi system. In April, I called Google’s fleet of cars for a pilot project “autonomous”—the implication being they’re fully autonomous. But that’s not the case, at least from a passenger standpoint: an engineer still needs to be in the driver’s seat to resume control of the wheel if needed.

It’s hard, though. If most of the auto industry has glommed onto the SAE levels as a way forward, extricating yourself from them to describe a car’s automated functions can become an exercise in mental gymnastics.


Self-driving cars being tested on the road today are indeed impressive—a vehicle handling turns on its own is something to behold. But until we’re able to sit back and safely take a nap on the expressway, until cars without a steering wheel and pedals are ferrying passengers around town, the phrases “driverless” and “fully autonomous” should probably be used only to describe hypothetical scenarios of the future.

What To Do

I don’t have a full-fledged solution to offer here. Just last week, I debated a colleague about Tesla’s decision to name its semi-autonomous suite “Autopilot.” My esteemed coworker felt the name was appropriate; I didn’t. His argument was that Autopilot accurately describes the situation—airline pilots don’t take a nap when they engage their plane’s autopilot.


Sure, but people are also capable of doing stupid things. Tesla even designed the system knowing that some drivers would be inattentive idiots. (They also stress that the system is intended to be used on divided highways, yet still allow it to be engaged on low-speed city streets. But I digress.) Still, my colleague said some of the onus should be placed on drivers themselves. They should ask questions until they’re confident and comfortable about what they’re getting themselves into. I think he has a point.

But automakers, tech developers, policymakers, and regulators (who seem to prefer a hands-off approach to overseeing the fledgling industry), also need to accept that some stringent, clear definitions are needed to prevent another fatal accident from happening—a scenario that could derail any momentum the autonomous car industry has going for it. Maybe new legislation could be the vehicle for that.


Reporters should do their best to accurately describe the situation at hand. Personally, I think the phrase “semi-autonomous” gets closest to the target for something like a Tesla or the new Super Cruise system. Common sense suggests a Tesla Model S or a Cadillac CT6 that drives itself for several miles on the highway is performing the task “autonomously” or is “driving itself.” Can it all the time? No. So just say that: “Super Cruise can handle driving the CT6 on the freeway, allowing you to sit back without your hands on the wheel. But you still have to pay attention to the road.” Some variation of that isn’t hard.

More advanced cars being tested on the road in California, say, those that can handle harder tasks like turning a corner? Same thing: Plainly state what those cars can do, but also mention that someone’s behind the wheel to resume control of the vehicle if necessary. California requires automakers to submit annual reports on how often their test drivers had to take the wheel back. Those offer perhaps the clearest evidence we have that no car on the road is fully autonomous. Even Google’s self-driving car project, Waymo, recorded 124 so-called disengagements in 2016, and Waymo’s considered a front-runner in the autonomous car race.
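For context on that number: Waymo’s public California DMV filing for 2016 covered roughly 635,868 test miles (the mileage figure is from that report, not cited above), which makes the back-of-the-envelope rate easy to work out:

```python
# Back-of-the-envelope disengagement rate for Waymo's 2016 testing.
# 124 disengagements is the number cited above; ~635,868 test miles is
# from Waymo's public California DMV report (my addition to the piece).
disengagements = 124
test_miles = 635_868

rate_per_1000_miles = disengagements / test_miles * 1000
print(f"{rate_per_1000_miles:.2f} disengagements per 1,000 miles")  # prints "0.20 disengagements per 1,000 miles"
```

A fifth of a handoff per thousand miles sounds small, but it’s still a human grabbing the wheel, which is exactly why “fully autonomous” doesn’t apply.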


Or, if the SAE levels are here to stay, here’s a potentially even easier way to sum things up. Levels 1-4: You still need to pay attention. Level 5: Get drunk.