Does Tesla's Autopilot Suffer From A Dangerous Blind Spot?

A grim milestone has been set in the history of automobiles: the first person was killed while a semi-autonomous car was driving. While it will be up to the National Highway Traffic Safety Administration to officially determine what went wrong, the circumstances of this accident, coupled with a similar but far more minor incident and Tesla’s own words, suggest a large and potentially dangerous blind spot in the Tesla Autopilot system.

The clues that point to the Tesla’s blind spot come from the circumstances and impact location of both recent Autopilot-related crashes. Both incidents involved a Model S hitting a truck trailer, with the point of impact located at the windshield.

In one instance, the auto-parking accident that occurred in May, the impact happened at low speed with nobody in the car; this more recent crash ended in tragedy because the car was occupied and the speeds involved were significant.

These crashes raise a lot of issues, including the role of the driver and the level of responsibility and reaction time one can reasonably expect of a person in a car that’s effectively driving itself, but for right now I want to focus on just one key part of these wrecks: the point of impact.

The Tesla Model S in Autopilot mode seems to have a large, important blind spot above the car’s hood. The primary forward-facing sensors used by Autopilot are a radar emitter located centrally on the front of the car, below the upper false grille area, and a camera mounted at the top of the windshield in front of the rear-view mirror assembly.

The camera provides the system with lane-keeping information and speed limit data, while the radar unit detects cars ahead and determines how far away they are. These sensors combine with a dozen ultrasonic sensors around the car that give a limited-range (about 16 feet) “view” of the surrounding area covering a full 360°.

While these sensors do give the car’s electronic brains an impressive sense of the surrounding world, they seem to leave a large hole from, essentially, the hoodline of the car on up. The upper camera doesn’t appear to be tasked with looking for obstacles in that volume of space, and the forward radar assembly is blissfully unaware of what is happening more than a foot or so above it.

Tesla’s own literature seems to confirm this blind area, as their Autopark/Autopilot instructions include this:

Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling.

This one sentence manages to give a pretty good idea of the range that the Tesla system is capable of seeing: a horizontal band of reality that’s about as thick as the car’s “face” and hovering about six feet above the ground.
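If that blind-band theory is right, the logic is simple enough to sketch out. Here’s a minimal, purely hypothetical illustration in Python; the band limits and obstacle heights are my own guesses, not anything Tesla has published, but they show why a high-riding trailer bed could slip right past a sensor suite that only watches a low horizontal slice of the world:

```python
# A purely illustrative model of the suspected "horizontal band" of coverage.
# None of these numbers come from Tesla; the band limits and obstacle heights
# are hypothetical guesses used only to show the geometry.

# Vertical slice of the world the sensors are assumed to cover, in meters
# above the road surface (roughly fascia height up to around the hoodline).
BAND_BOTTOM_M = 0.3
BAND_TOP_M = 1.0

def overlaps_detection_band(obstacle_bottom_m: float, obstacle_top_m: float) -> bool:
    """True if any part of the obstacle falls inside the assumed sensor band."""
    return obstacle_bottom_m < BAND_TOP_M and obstacle_top_m > BAND_BOTTOM_M

# A crossing car's bodywork sits squarely in the band, so it gets detected.
print(overlaps_detection_band(0.2, 1.4))   # True

# A high-riding trailer bed starts above the band, so it does not.
print(overlaps_detection_band(1.2, 4.0))   # False
```

By the same hypothetical math, a trailer wearing aero skirts that reached down into that band would register, which comes up again below.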

The two wrecks where the cars impacted truck trailers happened because those trailers existed in the space above what the Tesla was able to detect; the cars simply didn’t ‘see’ the trailers, which occupied the space above their field of sensory input, so the front ends went underneath and the cars smacked their own greenhouses into the trailers.

It appears that, in the Tesla’s sensor-informed electronic mind, the car views itself as a sort of floating mattress-shaped being. The car does not seem to be aware of its own structure above the beltline.

This idea is also supported by Tesla’s own description of the accident:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla’s mention of the white side of the trailer against the bright sky does suggest that some sort of computer-vision system is at least attempting to see objects above the hood, though without much success here.

White is likely the most common color for an 18-wheeler’s trailer, and bright skies are hardly uncommon during that prime driving time known as “day,” so it seems a pretty big oversight if the Autopilot system cannot detect a huge, white truck trailer whose volume is located above the sensor range of the radar window.

According to the Florida Highway Patrol, the wreck occurred at 3:40 p.m., so the sun would have been somewhat lower in the western sky, behind the Tesla driver. The truck would have been illuminated along with a fairly bright sky, though, to be honest, I’ve never had an issue seeing a large truck crossing a road in daytime even in similar lighting.

The bigger issue is that, according to the FHP’s press release,

V01 [the truck] proceeded to make a left turn on to NE 140th Court directly in front of V02 [the Tesla] as it was oncoming.

I mean, a tractor-trailer turning directly in front of you as you’re driving pretty much anything is very bad news.

The Tesla hit the trailer itself and, if the accident diagram is accurate, at about the halfway point along the trailer. That means the truck had a bit of time to get partially across the road before impact, and a human driver paying attention would likely have seen it and applied the brakes before impact.
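For a rough sense of scale, here’s a back-of-envelope stopping-distance calculation; the speed, reaction time, and deceleration figures below are my own assumptions, not numbers from the crash report, but they suggest that anything, human or computer, that spotted the trailer a few seconds out would have had room to shed a lot of speed:

```python
# Hypothetical stopping-distance arithmetic. The speed, reaction time, and
# deceleration are illustrative assumptions, not figures from the crash report.

SPEED_MPH = 65.0          # assumed highway speed
REACTION_TIME_S = 1.5     # typical human perception-reaction time
DECELERATION_MS2 = 7.5    # hard braking on dry pavement, roughly 0.75 g

speed_ms = SPEED_MPH * 0.44704                  # convert mph to m/s
reaction_distance = speed_ms * REACTION_TIME_S  # distance covered before braking starts
braking_distance = speed_ms ** 2 / (2 * DECELERATION_MS2)

print(f"Reaction distance: {reaction_distance:.0f} m")
print(f"Braking distance:  {braking_distance:.0f} m")
print(f"Total:             {reaction_distance + braking_distance:.0f} m")
```

Under those assumptions the total comes to roughly 100 meters, or a bit over three seconds of travel at that speed, so even a late reaction could have taken a great deal of energy out of the impact.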

Because the Tesla did not “see” the truck, that does not seem to have happened here, and the car went under the trailer at speed. If there is, in fact, a sensor blind area above the hoodline, then had the truck been fitted with aero skirts or some other form of lower bodywork, perhaps the radar system would have detected it as an obstacle and the collision-avoidance braking would have engaged, which could have lessened the severity of the impact to the Tesla’s windshield.

Tesla suggests that if the Model S had impacted the front or rear of the trailer, its crash safety system would have helped, which is undoubtedly true. Like a lot of modern cars, the Model S has a forward collision avoidance system that detects when the car is about to rear-end another vehicle. This system seems to be more dependent on the low-mounted radar window, which is why it was ineffective against the high side of the trailer.

Maybe Tesla needs to add a high-mounted secondary radar window so the car is “aware” of its entire body?

I would have thought the upper camera would do some sort of object-checking for collision reasons, but the nature of these wrecks and Tesla’s warning about ceiling-hung objects strongly suggests otherwise.

This is unfortunate, because it is precisely in that upper volume of the car that most people keep their heads and other vital parts.

Mobileye, the company that makes camera-based computer-vision systems for autonomous driving, issued this statement about the wreck:

“We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”

While this kind of wreck is by no means the most common, it certainly does happen, and it often has catastrophic effects. Pretty much no car on the road today can take an impact to the A-pillars as well as it can to the lower parts of the car.

This is usually less of an issue because objects on a collision course with the windshield tend to be very, very visible to a human driver. But if that driver has been in a car that’s been nearly driving itself for quite a while, there’s no sound reason to expect and rely on that person being alert and ready to leap in and take evasive action.

It does seem that the low-mounted, forward radar window is handling much of the collision detection, since the system will shut down if this radar window is blocked, though it’s possible that the camera assists as well. In fact, there have been many rumors, as recently as last month, about a new tri-focal camera system that looks set to become standard on the Model S very soon, and this three-camera setup is said to improve Autopilot functionality.

It’s not clear if this new camera setup will cover the suspected Model S over-hood blind spot, but I suspect that after these two highly public wrecks, Tesla is likely to modify their software and maybe hardware so that this crucial part of the car has the sensor coverage it deserves.

If Tesla’s current Autopilot system does, in fact, have no ability to detect objects that could collide with the car’s greenhouse, that’s a pretty significant and dangerous oversight. A car’s above-beltline structure is quite vulnerable to impact damage, and is also the place most likely to be full of soft, fragile humans.

I think, after these wrecks, Tesla should temporarily disable Autopilot until they have a system in place that ensures that no object can impact the windshield area undetected. This could just be a software update, or it could require an upper radar unit, or perhaps the new camera setup will help.

That is, of course, if all my speculation here is true. It may well not be, but the nature of these recent wrecks and Tesla’s own warnings make an upper-body blind spot seem a likely possibility.

An autonomous car needs to be aware of the actual volume of space it takes up. Human drivers, for all our faults, have a very real and visceral sense of where our heads are; anyone who has involuntarily ducked while driving under an alarmingly low bridge can attest to this. We’re even able to learn, eventually, where the rough boundaries of the car are, even if we only get there after a bunch of curbed wheels and nicked bumpers.

We can’t ignore the human element here, either. It’s likely that the driver was not paying attention to the road prior to the accident. We’ve even had some reports that he may have been watching a movie in the car, and while that’s hardly advisable, I don’t think the driver deserves all the blame here.

The Tesla Autopilot system, despite all of Tesla’s ass-covering claims of betahood, is used by many people as a true, full self-driving system, which it really isn’t. It’s a mistake to expect people to remain always vigilant and ready to leap into action when the car’s doing most of the driving. Sure, they should be ready, but let’s be honest. People just don’t work that way.

It’s also worth remembering that any car, driven by human or software, would be well and truly boned in a situation where a truck turns in front of them. Driving anything anywhere involves risks, no matter what.

But if Tesla wants people to test its self-driving (or nearly self-driving) system, for free, on public roads, those people need to be fully aware of what the system’s limitations are. If it does turn out that a Model S Autopilot system cannot detect obstacles above the sightlines of the radar unit, then that needs to be made absolutely clear.