When Elon Musk announced a major update to Tesla’s semi-autonomous driving system, Autopilot, he also suggested that the new version 8.0 could have prevented the first fatal crash of a semi-autonomous car. I’m not convinced, at least not yet.
Here’s what Musk said about the update:
We’re making much more effective use of radar. It will be a dramatic improvement in the safety of the system done entirely through software.
Essentially, the big change seems to be that the Autopilot system will rely more on Tesla’s integrated radar system, as opposed to the primarily camera-based setup that Autopilot formerly used.
During yesterday’s press conference, Musk said it was “very likely” that the new version of Autopilot would have prevented the wreck. That first fatal wreck occurred back in May, when a Tesla Model S, under Autopilot control, drove into and under a big rig trailer that was crossing the road. The driver, Joshua Brown, was killed when the Tesla passed under the trailer, the side of which struck the car’s greenhouse.
The nature of the wreck implied that the Autopilot system did not see the large, white trailer with its camera-based vision system, and that the radar system—which was always a part of Autopilot—also did not notice the large trailer.
I had originally speculated, based on this wreck, another (much lower-speed and non-fatal) crash, and Tesla’s own warnings, that the Tesla Autopilot system has a radar blind spot encompassing the upper part of the car, from the beltline up.
The other, low-speed crash involved a Model S driving itself under another big rig trailer until the trailer struck the car’s greenhouse, and Tesla’s warning for its Autopark system (which shares sensors with Autopilot) reads:
Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. (emphasis mine)
All of this suggests a radar/sensor blind spot that encompasses the car’s greenhouse.
My skepticism that this new update could have prevented the May fatal crash is based on this blind spot, which seems to be a physical limitation of the hardware as opposed to a software issue.
Radar has always been part of Autopilot, especially for calculating when to employ emergency braking. That Tesla has given the radar system a greater role in Autopilot is great, but so far we’ve seen no evidence that the radar transceiver is any more capable of “seeing” above the hood and in the greenhouse area of the car than before.
Musk does mention the Autopilot system’s handling of overhead signs. As he describes in a blog post:
The third part is a lot more difficult. When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake.
Now, if the overhead sign is far enough away, it could be seen even by a low-mounted radar system. Musk suggests that for overhead obstacles like this, a database will be collected with GPS coordinates so the car will know what to be worried about:
This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
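The process Musk describes could be sketched roughly like the following. This is only an illustration of the logic as stated in the blog post, not Tesla’s actual implementation; the class, the threshold, and the coordinates are all hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of the fleet-learning whitelist Musk describes:
# cars note stationary radar objects, and an object that several cars
# pass safely (no driver braking) gets added to a geocoded whitelist.
SAFE_PASS_THRESHOLD = 5  # assumed value for "several cars"

class RadarWhitelist:
    def __init__(self, threshold=SAFE_PASS_THRESHOLD):
        self.threshold = threshold
        self.safe_passes = defaultdict(int)  # geocoded object -> safe-pass count
        self.whitelist = set()

    def report_pass(self, geocode, driver_braked):
        """A car drove past a stationary radar return at `geocode`.
        If the driver did not brake, count it as a safe pass."""
        if geocode in self.whitelist:
            return
        if not driver_braked:
            self.safe_passes[geocode] += 1
            if self.safe_passes[geocode] >= self.threshold:
                self.whitelist.add(geocode)

    def should_brake_for(self, geocode):
        """Brake for a stationary radar return only if it isn't whitelisted."""
        return geocode not in self.whitelist

# Usage: five cars pass an overhead sign (hypothetical location) safely.
wl = RadarWhitelist()
sign = (40.7128, -74.0060)
for _ in range(5):
    wl.report_pass(sign, driver_braked=False)
print(wl.should_brake_for(sign))  # the sign is now whitelisted, so: False
```

Note that, as Musk says, this only helps with stationary objects at known locations, which is exactly why it wouldn’t apply to a truck crossing the road.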
Of course, this doesn’t help for a truck crossing the road. Maybe the improved radar setup would have seen it? Maybe the radar-bouncing method Musk describes for seeing what’s in front of the vehicle in front of the Tesla would help, too. It’s all possible, of course. But I’d like to know.
I’ve reached out to Tesla to see if they can give me real information on the field of view for the radar emitter. Based on Tesla’s warnings and at least two wrecks, at the moment it does not seem that the radar is capable of seeing close objects above the car’s hood, even ones that may impact the windshield or the greenhouse.
This seems to be a hardware limitation, and as such may not be solvable with a software update. If that’s the case, then, no, I don’t think this update would have saved Joshua Brown.
Hopefully, Tesla will get back to me with more details soon. We’ll start researching the possibility of some sort of test, too. This seems like it’d be good to know.
Correction: This post initially stated that Joshua Brown’s fatal crash occurred in June. It was only revealed in June. It actually happened back in May.
UPDATE: I got a response from Tesla:
... our radar functions like any radar with a field of view that extends like a cone from the front of the sensor. As such, radar used in our cars sees well above the hood and well off either side of the road, with the extent increasing as distance from the sensor increases. It is not necessary that a tall object be seen immediately above the car from zero distance; that object will not have arrived there without passing through the field of view of the car’s sensors. Autopilot does the same thing with this information as a human does: track an object while visible, then infer its movement relative to the vehicle when it drops out of view.
So, according to this, I’m wrong, and the radar should have no problem seeing the area above the hood. Because the field of view is a cone, the coverage above the car grows the further an object is from the sensor. This should address the blind spot I was concerned with, but it doesn’t explain what went wrong in the Brown crash: why didn’t the radar see the truck? I know it wasn’t the primary sensor system, but it was still involved.
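To get a feel for the geometry Tesla is describing, here’s a back-of-the-envelope sketch. The mount height and beam angle below are assumptions of mine for illustration; Tesla hasn’t published the actual figures.

```python
import math

# Rough geometry of a cone-shaped radar field of view.
# Both numbers are assumed, not Tesla's real specs.
MOUNT_HEIGHT_M = 0.5           # assumed: radar mounted low, behind the fascia
VERTICAL_HALF_ANGLE_DEG = 10.0  # assumed vertical beam half-angle

def coverage_top_m(distance_m):
    """Height above the road that the cone's upper edge reaches
    at a given distance from the sensor."""
    return MOUNT_HEIGHT_M + distance_m * math.tan(
        math.radians(VERTICAL_HALF_ANGLE_DEG))

# With these assumptions, the top of the cone at a few ranges:
for d in (10, 30, 60):
    print(f"at {d} m, cone reaches ~{coverage_top_m(d):.1f} m above the road")
```

Under these assumed numbers, the cone would clear the top of a typical trailer well before the car reaches it, which matches Tesla’s point: a tall object passes through the field of view on approach and can be tracked after it leaves it.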
I’m curious to see if there’s any way to test this.