California requires all companies that test self-driving cars on public roads in the state to report miles driven and the number of “disengagements,” or times a human driver takes over control. Cruise co-founder and CTO Kyle Vogt believes this reporting method is a poor metric for comparing companies, and is causing companies to test and demo in easier environments in order to reduce reported disengagements.
In a post on Medium, Vogt says that at Cruise, disengagements are sometimes used as a courtesy to other drivers, or as a cautious reaction from the driver to a situation that could have been handled by the vehicle. He explains, “Have you ever been in the backseat of a human-driven car and felt the urge to grab the wheel when something crazy happens on the road? It’s exactly like that.”
Autonomous vehicle (AV) companies need to be extra careful when it comes to safety, as well as the perception of safety. An AV working correctly is not news, but disengagements, running red lights, and crashes very much are, and that coverage can affect the perception of AV safety independent of the actual safety record. This leads to the well-controlled demos that Vogt takes issue with.
“Companies carefully curate demo routes, avoid urban areas with cyclists and pedestrians, constrain geofences and pickup/dropoff locations, and limit the kinds of maneuvers the AV will attempt during the ride — all in order to limit the number of disengagements. Because after all, an AV is only ready for prime time if it can do dozens, hundreds, or even thousands of these kinds of trips without a human touching the wheel. That’s the ultimate sign that the technology is ready, right? Wrong.”
Vogt says Cruise will release data that better conveys AV performance once its vehicles are ready for deployment. This will include “a) data on the true performance of human drivers and AVs in a given environment and b) an objective, apples-to-apples comparison with statistically significant results.”
I am looking forward to this information and hope it can more clearly convey AV safety, which often gets fudged and obfuscated with incomplete data. I’m sure autonomous vehicles will be a great thing for overall safety and accessibility at some point in time. But to know when that some-point-in-time is, we need a clearer picture of AV safety and how it really compares to human drivers in actual environments.