Regardless, a number of Autopilot crashes have happened, and some of them illustrate precisely the “human factors” issues these researchers warn about.

In the fatal crash of Joshua Brown, which was the subject of a National Transportation Safety Board investigation, the NTSB found the driver over-relied on the automation, demonstrating a “lack of understanding of the system’s limitations,” and that Tesla didn’t go far enough to ensure drivers remained alert. The Tesla spokesperson, speaking generally about Autopilot, defended the company’s approach by pointing out that any car can be misused.

In other words, the Joshua Brown case is perfectly in line with decades of human factors research. It is both a technical problem and a behavioral one.

Not every company is as cavalier about its driver-assist programs. Cadillac’s Super Cruise is generally regarded as one of the most responsible semi-automated systems in the industry because it uses a camera mounted on the steering column to ensure the driver’s eyes stay on the road. But its rollout has been slow: to date it’s available on only one car, the Cadillac CT6 sedan, and not even on the rest of the Cadillac lineup. That may speak to GM’s more conservative approach.

Lisa Talarico, who works as a lead on driver monitoring system performance for Super Cruise, said dealerships are “well trained” on the feature, but drivers are only required to watch a short video about the system, one she described as a “very general overview.”

But she added that in small test studies they’ve found “only a small percentage increase” in driver off-road glances when Super Cruise is activated versus manual driving. (Several other manufacturers, including Nissan and Volvo, declined to be interviewed for this story.)

Talarico also said Super Cruise’s technical specifications are detailed in the owner’s manual. Indeed, every car’s manual will likely have ample descriptions of any automated systems the car includes.

But Manser, the Texas A&M University human factors researcher, warns that even if someone actually reads the damn thing, owner’s manuals typically focus on what you shouldn’t do, which serves to absolve the manufacturer of legal liability. Proper human factors training is about a whole lot more than that.

What this training ought to include, according to the researchers I spoke to, is a deep understanding of how the technology works so humans can anticipate problems before they occur. You didn’t really need to know how a car worked in order to drive one in the manual age, but now you probably should understand what your car’s cameras can (and can’t) detect, and how that information is interpreted, in order to spot dangers on the road.

“I think that these new cars, oddly, so paradoxically, so ironically, in doing more for us, they don’t require us to know less,” Casner noted. “They require us to know more.”

In much the same way pilots had to gain a deeper understanding of automated systems, these experts say, drivers now have to as well. Indeed, that very lack of understanding may have contributed to the Boeing crashes, in which the pilots couldn’t figure out how to disengage the automated system that was plunging them toward the ground.

That’s a tall task for a country that has largely given a driver’s license to damn near anyone who wants one. In the 1970s, 95 percent of eligible students received driver’s education through public schools, but thanks in part to a 1983 NHTSA study that concluded Georgia teenagers who took driver’s ed were not better drivers, public schools—often faced with budget cuts—largely did away with such programs. (A more recent and more comprehensive eight-year study in Nebraska found that driver’s ed classes do, in fact, make teenagers better drivers.) Additionally, many schools cut behind-the-wheel training over liability issues.

Combined with the widespread use of largely unregulated online driving courses, that makes it far too easy for teenagers to plunk down a few hundred dollars of their (or their parents’) cash and get a license without learning much about operating a car safely. And of course, the privilege of driving, once earned when you’re 17 years old, generally lasts for your entire life.

It’s hard to imagine many customers sitting through an extensive tutorial or presentation before buying a car, given that such a program might be more demanding than what they had to do to get their license in the first place. If you found out you had to take, say, even just a one-hour class at the dealership in order to buy a car, would you do it? Or would you buy a different car?

It’s All About the Benjamins (And Also Liability)

What ultimately may prevent the auto industry from taking the same training steps as the airline industry is the issue of liability. Manufacturers and possibly airlines are liable if a plane crashes due to computer error—Boeing has already been sued for the Ethiopian Airlines crash—because customers are paying, in part, for a safe journey. The automation hand-off on planes is between computers and pilots, who are employees of the airlines. In either case, the customer is not taking control of anything.

With semi-autonomous cars, the legal picture is much murkier, as has been discussed in countless law reviews and academic journals. For one, drivers are simultaneously the “co-driver” and the customer. On top of that, the current legal framework basically makes the driver responsible for what the car does, except in cases of extreme manufacturing defect, tampering, or other edge scenarios.

“Ultimately, in our society, the driver will always be responsible for what happens with that car,” Manser says. “If you’re under, say, [Level] 3 or [Level] 4 automation and you’re looking at your phone real briefly and you get in a crash, it’s going to come back to you as the driver. You’re responsible for whatever that car does.”

Consider, for example, when Uber’s self-driving test car struck and killed a pedestrian while in fully autonomous mode. Prosecutors declined to hold Uber accountable even though it was the company’s car, equipment, and algorithm that functionally killed this person.

Prosecutors have not yet determined whether to press charges against the safety driver, who was allegedly streaming a TV show on her phone at the time of the crash, exhibiting the very boredom and thought creep that reams of literature on the airline industry warn about. (Uber quickly settled a civil suit with the victim’s family.)

At the very least, the legal framework for both criminal and civil cases is nowhere close to addressing questions such as: when a hand-off between computer and human fails, is it the algorithm’s fault or the human’s?

As a result, the automotive industry has no obvious incentive right now to train drivers properly. Instead, it has every incentive to let misconceptions that exaggerate these systems’ capabilities linger, because they may boost sales. To this end, one Tesla salesman told me that for some customers he takes test cars on a road near the store with a sharp turn to demonstrate how well Autopilot works, even though the road is not a highway, the only place Autopilot is supposed to be used.

Little wonder, between stunts like this, tweets from Elon Musk that mischaracterize Autopilot’s capabilities, and Musk himself misusing the feature on 60 Minutes, that Tesla owners feel empowered to use Autopilot where it’s not intended.

Even Cadillac, the most cautious of the manufacturers, debuted Super Cruise with an ad spot featuring the tagline “it’s only when you let go that you begin to dare.”

Manser told me he feels sorry for drivers because they’re being thrown into what he calls an “untenable position.” As he described it to me, it sounded almost like a trap.

Drivers are being sold features that delegate more and more of the responsibility for actually driving the car to computer programs created by the manufacturer. And the manufacturers and dealers give drivers only the briefest tutorial on the system, hand them the keys, and tell them to have fun.

What could possibly go wrong?

Is It Actually Safer?

The counterargument to all this, one I heard several times in these interviews, is that surely these driver assistance features will be safer than humans alone, who are by and large terrible at driving. In 2017, the latest year for which data is available, 37,133 people died in motor vehicle crashes, according to the IIHS. How could computers be any worse?

The researchers I spoke to who are concerned about the lack of driver training happily acknowledge semi-autonomous driving features have the potential to make roads safer if done right. But as of now, these features are for use on highways. And highway driving is one thing humans actually do pretty well.

According to Kidd, the Highway Loss Data Institute researcher, about a third of all miles traveled in the U.S. are on highways, but highways account for only nine percent of crashes, meaning highway travel is significantly under-represented in terms of where we’re crashing our cars.
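To make that gap concrete, here is a rough back-of-the-envelope sketch in Python. The mile and crash shares are the approximate figures Kidd cites; the calculation itself is mine and purely illustrative, not an official statistic.

    # Rough per-mile crash-rate comparison using the shares Kidd cites:
    # about a third of U.S. miles, but only about nine percent of crashes,
    # are on highways. Illustrative only.
    highway_mile_share = 1 / 3
    highway_crash_share = 0.09

    # Relative crash rate per mile (1.0 = the average across all driving)
    highway_rate = highway_crash_share / highway_mile_share                  # ~0.27
    non_highway_rate = (1 - highway_crash_share) / (1 - highway_mile_share)  # ~1.37

    print(f"highway:     {highway_rate:.2f}x the overall per-mile crash rate")
    print(f"non-highway: {non_highway_rate:.2f}x the overall per-mile crash rate")
    print(f"non-highway miles are ~{non_highway_rate / highway_rate:.1f}x riskier")

By that rough math, a highway mile carries roughly a fifth of the crash risk of a non-highway mile, which underscores Kidd’s point: these systems are being deployed where human drivers need the least help.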

“People are pretty good at traveling on the highways,” Kidd says, “so to what extent will it actually lead to a marked improvement in safety? That question is still out there.”

Part of the reason the question is still out there is that we don’t have good data on, well, anything related to semi-autonomous features. Kidd says there are three reasons for this. First, the technology is still too new. Second, there is no good public information on which vehicles on the road actually have semi-autonomous capabilities. And third, independent researchers don’t know when drivers are actually using the features.

(Tesla does issue quarterly safety reports, but they include only accidents or near-misses per mile with Autopilot engaged versus not engaged, a somewhat misleading comparison because Autopilot is, per company policy, supposed to be used exclusively on highways, which, as stated above, have a much lower crash-per-mile rate for human drivers, too. On a recent earnings call, Musk said the company won’t be releasing more data because he believes people would “sort of like data mine the situation and try to turn a positive into negative.”)

Kidd says the best solution here would be some kind of data sharing agreement between manufacturers and the government, akin to what the Federal Aviation Administration has in place to detect issues early and present industry-wide solutions. But the government would probably have to regulate that into existence, which could take years—the law always lags behind technological advancement, as has become painfully obvious in recent years.

Recently, GM, Ford, and Toyota announced they would launch a consortium for autonomous vehicle standards, but the details are still murky, and it isn’t clear if such a group would delve into semi-autonomous features. Nor is it clear how transparent the group will be with its data.

Even so, Kidd is skeptical an industry-wide data sharing program will happen any time soon because manufacturers are racing to the trillion-dollar goal of fully self-driving cars and don’t want to risk sharing their secret sauce with competitors. Indeed, car companies have ample incentive to hide as much information about their proprietary semi-autonomous systems as possible.

The flip side is that, because of the way liability for car crashes works, car companies also have very little incentive to enter the kinds of safety-focused data-sharing agreements Kidd advocates for. After all, car companies are only liable for deaths or injuries on the road in cases of manufacturing defects.

But people crashing their cars because they don’t understand the technological or engineering limitations of their machines? Hell, that’s driving.

Please, Sir, May I Have Some Training

My last question for the researchers was what any individual consumer should do if they buy a car with semi-autonomous features. How can they train themselves? How can they be safe?

I expected their answers to make me feel slightly better about the whole situation. Instead, they only made me feel worse.

Casner’s message, short of retooling the entire driver’s education system, including bringing it back to high schools, was, essentially: be really careful.

“My biggest message is that there is so much more going on here than meets the eye … more than what your common sense is telling you,” he said. “And more than what the car companies are telling you. There is more to this than pushing a few buttons, sitting back and relaxing, and enjoying the ride. There are things every driver ought to know about the car, about themselves, and about this new weird driving task they’re about to undertake.”

Manser, using the careful language of an academic who has spent years researching a subject, advises people to “understand every operational aspect.” When the system turns on and off. How to do it yourself if the system doesn’t. What it’s going to do in emergencies. What it’s going to do in “normal” situations.

“They need to form an accurate mental model of how that system operates,” Manser continued. “Because if your model and the system’s model are different, there’s going to be conflict.”

Where can people find that information if not in the owner’s manual or the dealerships? I told Manser it sounded like he was saying people need training that doesn’t exist yet.

“Yeah,” he conceded. “That’s probably right.”