Joshua D. Brown was killed earlier this year when his Tesla Model S, cruising on Tesla’s Autopilot system, crashed into the broad side of a truck after mistaking it for an overhead road sign. With Tesla’s newest update of its system to version 8.0, the company aims to change that sort of outcome with radar and machine learning.
Tesla CEO Elon Musk announced a raft of changes to Autopilot on a conference call with reporters today, including expanded use of radar. The system should now be able to take highway off-ramps, and it should better detect when someone wants to merge into your lane, thanks to an improved ability to see another car’s turn signal.
It’s not like Tesla’s Model S and Model X haven’t had radar this whole time. Before 8.0, Autopilot used an optical camera as its primary sensor for collision avoidance, with radar in a supplementary role. Now the camera and radar are more like equal partners.
The update is entirely software-based, using only the sensors already on existing vehicles. That raises the question of why Tesla didn’t do this before, but apparently it’s not so simple, as Tesla explained in a blog post:
After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.
On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you, can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it.
In short, the radar system can generate a lot of false positives. Tesla’s main plan for solving that problem is its fleet-learning system, in which every Tesla records data about how it drives and the conditions it drives in, so the company can improve its vehicles using massive amounts of real-world data.
Now if a Tesla sees what it “thinks” is a road sign, it will keep driving rather than brake. The same goes for the second, third, and possibly even the fourth Tesla that drives by.
By the time a fifth Tesla has driven safely past a given point, the global fleet of Teslas should be confident that that particular location contains a street sign, and not, say, a truck crossing a divided road as in the Brown case. That location is then added to a “whitelist” of safe areas.
But if a location is not on that whitelist, the car will know that there’s an obstacle ahead, and it should brake. Or, as Tesla puts it:
Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn’t notice the object ahead. As the system confidence level rises, the braking force will gradually increase to full strength when it is approximately 99.99% certain of a collision. This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.
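The whitelisting-plus-graduated-braking process Tesla describes above can be sketched in a few lines of code. This is purely an illustrative reconstruction of the described logic, not Tesla’s actual implementation; the class, the five-pass threshold, and the braking curve are all assumptions drawn from the article’s description.

```python
# Hypothetical sketch of the geocoded radar whitelist and
# confidence-scaled braking described above. All names and
# thresholds are illustrative assumptions, not Tesla's code.
from collections import defaultdict

SAFE_PASS_THRESHOLD = 5         # assumed: safe passes needed to whitelist a radar object
FULL_BRAKE_CONFIDENCE = 0.9999  # "approximately 99.99% certain of a collision"

class RadarWhitelist:
    def __init__(self):
        # maps a geocoded location to how many cars have driven
        # safely past the stationary radar return seen there
        self.safe_passes = defaultdict(int)

    def record_safe_pass(self, location):
        self.safe_passes[location] += 1

    def is_whitelisted(self, location):
        return self.safe_passes[location] >= SAFE_PASS_THRESHOLD

def braking_force(collision_confidence, max_force=1.0):
    """Mild braking at low confidence, full braking near certainty."""
    if collision_confidence >= FULL_BRAKE_CONFIDENCE:
        return max_force
    # below the threshold, brake only gently in proportion to confidence
    return max_force * collision_confidence * 0.3

wl = RadarWhitelist()
location = (27.0061, -82.1400)  # illustrative lat/lon grid cell
for _ in range(5):              # five cars drive safely past the radar object
    wl.record_safe_pass(location)
print(wl.is_whitelisted(location))  # True: object treated as a fixed sign, no braking
```

The key design point in Tesla’s description is that the fleet, not any single car, decides what counts as a harmless stationary object, and braking ramps up smoothly with confidence instead of flipping on all at once.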
In addition to the machine learning component, the radar data is now being processed in finer detail, creating a more vivid radar picture of the world. That’s actually huge, since radar can see through things like rain, snow, and heavy fog, which have been known to give the camera-based Autopilot system headaches in the past.
In fact, Tesla’s system can now actually bounce radar waves off the road in front of it, so it could even “see” ahead of a car traveling in front of it, as Musk put it in a conference call:
With an additional level of sophistication we can use the radar through the car in front of you by bouncing the radar off and around the car. By using the radar pulse and [optical camera] to look at the echo in front of the car, as well as what’s in front of you, so if the car in front of you can’t see the object, the Tesla can and still brake.
In other words, if the car in front of you doesn’t see something and smashes into it, hopefully you won’t do the same. And if the car in front of you only spots the obstacle in time to swerve at the last second, hopefully you won’t have to make the same evasive maneuver, since your car already knows what’s coming.
Of course, it’s not an ideal system. Musk was quick to make the point that he believes there is no such thing as “perfect safety,” and that there will always be deaths and injuries on the road, since “the world is a very big place with a huge number of people and a huge number of circumstances.”
(At one point, he even brought up incidents of people dying by being strangled by bed sheets and crushed by vending machines, but noted that no one is trying to ban vending machines, so there’s that.)
The radar system also has a hard time detecting soft bodies, though it should see something as big and dense as a moose. A smaller animal, like a deer, will be harder to spot.
All of that comes on top of a host of other small improvements to what we’ve consistently found to be the best semi-autonomous system on the road today.
We don’t know for certain if all of this adds up to a vastly better system, but we’re probably going to find out very soon.