The Problem Isn’t Media Coverage Of Semi-Autonomous Car Crashes

Scene of the recent Utah Tesla crash into a fire truck
Photo: South Jordan police

In some corners of the world where Elon Musk’s word is taken as gospel, there’s a prevailing view that it’s irresponsible for news outlets to cover car crashes involving semi-autonomous technology as diligently as we’ve seen lately. Whether it’s Tesla investors stanning hard for the company, claiming “not one” reporter covers regular car crashes, or even journalists themselves casually agreeing with Musk, you see this perspective everywhere. It’s wrong.


This view either represents an innate desire to prevent any scrutiny toward a still-very-new technology, or a complete lack of understanding of how journalism works. Perhaps both. Regardless, no one should take it seriously.

Here are a few examples of this, following news that a Tesla Model S operating on Autopilot crashed into a fire truck late last week:

First, Ross Gerber, a Tesla super-fan who runs an investment firm that’s super bullish on the automaker:

A clip from Fred Lambert of Tesla fanatic news site Electrek, which operates effectively as an in-house newsletter for the automaker:


And Tesla CEO Elon Musk, hours before it came out that Autopilot was linked to this crash:


The overarching view here is that, when compared to human-driven cars, it’s a farce to highlight crashes involving semi-autonomous cars even if it’s a new technology. As Musk would put it (using a questionable comparison in its own right due to some seemingly funky math), there’s one fatality per 320 million miles driven in Tesla cars. For every other car, it’s one fatality every 86 million miles driven. In that light, why cover semi-autonomous crashes?

There’s a simple explanation: it’d be wholly irresponsible to avoid scrutinizing whether this technology can perform in the real world. At a time when most people don’t even trust the idea of self-driving cars, it’s needed.


The fledgling autonomous tech industry is still working out kinks in its driving systems, as evidenced by a very jarring crash in March, in which an Uber-owned self-driving car fatally struck a pedestrian. Put simply, the technology hasn’t been proven to be safer than human drivers. Any suggestion otherwise by Musk or his kin is completely misleading.

Take one research paper by the University of Michigan from last year (emphasis mine):

Yet for consumers to accept driverless vehicles, the researchers say tests will need to prove with 80 percent confidence that they’re 90 percent safer than human drivers. To get to that confidence level, test vehicles would need to be driven in simulated or real-world settings for 11 billion miles. But it would take nearly a decade of round-the-clock testing to reach just 2 million miles in typical urban conditions.


Tesla’s fleet hasn’t come close to that figure. A RAND study echoed that point, finding it would take accumulating perhaps billions of miles on the road to demonstrate autonomous cars are safer. Arguing this technology has already proven itself safer masks the fact that the technology itself is still a work in progress.

What’s more, there are simply far fewer Teslas on the road than non-Autopilot-equipped cars. So how is that mileage comparison viable? It falls apart as soon as you dig an inch deep.


And that’s the upshot here. Over the last few years, the auto industry has collectively decided that autonomous cars are inherently safer, capable of drastically reducing the roughly 94 percent of crashes that stem from human error, and has, so far, developed and deployed that technology in a regulatory vacuum.

What I’m saying is that it’s inherently a news story. The job for reporters is to appropriately scrutinize whether that theory bears out. By most accounts (outside of Musk and his kin), that’s what’s going on, and has gone on, for years.


Outrage would be appropriate if this were coverage of what seems like a run-of-the-mill crash involving a wrecked Tesla, one that, notably, left the driver with only a minor injury. Every Tesla crash isn’t news. Every Tesla fire isn’t news.

What is newsworthy is that the driver of that car claims Autopilot was on and she was staring at her phone instead of the road at the time of the collision.


Tesla tells drivers they need to pay attention while Autopilot is engaged, but time and again, these crashes keep happening. They’ve happened since Autopilot’s debut. A curious observer might wonder if that means Tesla could do something else to prevent drivers from becoming distracted while Autopilot is being used—or, perhaps, if we should use semi-autonomous systems at all—but instead, this is what Musk, Gerber, and Electrek think is the real story:


Yes, it’s surprising, in light of the pictures, that the driver escaped with only a broken ankle. But what actually caused the wreck is that she was using a system Tesla knows can lull drivers into complacency, and instead of watching the road (as Tesla implores drivers to do), she was staring at her phone. I’m not sure how to take Musk’s open disdain for coverage of this crash other than that he truly believes this is a problem that can’t, or shouldn’t, be solved.

The kicker is that Musk, Gerber, et al. refuse to acknowledge the consistent, daily coverage of serious car crashes in the U.S. Anyone who flips on a nightly news broadcast is sure to find some sort of report on a crash. It’s literal sustenance for TV news stations.


Rather than push for, say, an increased budget for federal auto safety regulators, or promote better driving habits, or call for stronger, more obvious safety standards, Musk wants to be left alone. Look elsewhere, he says, and pay no mind as crashes linked to misuse of Autopilot continue to pile up. That attitude alone invites more coverage, but somehow Musk either doesn’t realize this or, perhaps, relishes it.


None of this negates the fact that Autopilot was pioneering tech that put Tesla out front as a key innovator in autonomous driving, or that Tesla’s reputation has benefited a great deal from it. When it works as designed, it’s cutting-edge and genuinely impressive.

But the point is this: We’re dealing with still-unproven technology that more automakers are installing on cars. It would be malpractice not to cover this transition. Musk needs to accept this and should, instead, focus on how to keep improving Autopilot and prevent drivers from misusing it. That would be a far better use of anyone’s time.

Senior Reporter, Jalopnik/Special Projects Desk


Eddie Brannan

The problem is disproportionate coverage that creates the incorrect impression that semi-autonomous vehicles—and in particular Teslas—are inherently dangerous in and of themselves.

No one is suggesting that these events shouldn’t be covered (that I have seen, anyway) or that it’s irresponsible to do so, just that they should be put in context as part of the reporting.

People, myself included, see a general feverish coverage of anything Tesla that generally reveals more bias than factual reporting. That snide “In some corners of the world where Elon Musk’s word is taken as gospel” remark is a case in point. We all know the positives and negatives of the business model, but the incessant scrutiny of every accident—nearly all of which turn out to be user error—seems unwarranted given the overall safety record, which is rarely mentioned to give the articles context.

TL/DR Fake news! Witch-hunt! Sad!