Making cars drive themselves is very, very hard. Sure, we’ve come an awful long way since Stanley, that Stanford-modified Volkswagen Touareg, won the second DARPA Grand Challenge back in 2005 (the first, in 2004, had no finishers at all), but we’ve still got a long way to go before full Level 5 autonomy. No matter what anyone says. In case you don’t believe me, why not check out this video of a Waymo robotaxi getting very confused by some traffic cones and causing all kinds of trouble, including escaping from Waymo’s own support team.
The video is from YouTuber JJRicks Studios, who has made many videos documenting his rides in Waymo’s fleet of self-driving, Level 4 (meaning no driver needed, but limited to a set, known area of operation) converted Chrysler Pacifica minivans, which Waymo operates as a robotaxi fleet in Phoenix.
Waymo’s automated vehicles have had some incidents in the past, and used to employ safety drivers in the car, ready to take over if needed; Waymo has since eliminated the safety drivers in-car, and now the Pacificas are un-human’d (well, except for the passengers) with safety monitoring happening remotely.
In this video, JJRicks hails a robot ride, and, as seems to be his habit, documents the entire process on video. I suppose luckily for him (but what would be unlucky for anyone who wanted to, you know, get to their destination instead of getting YouTube clicks) things went pretty significantly off the rails.
It’s a long video, but he’s thoughtfully added a bunch of annotations in the timeline, so you can skip around to the good stuff:
It’s interesting to see how this all unfolds. For the first part of the video, the car behaves fairly well. It drives pretty reasonably, and executes some difficult maneuvers like making an unprotected left turn. It also takes some odd shortcuts through neighborhoods, perhaps because its routing data suggested a less challenging path that way? It’s not really clear what the goal was there.
I should also mention that weather and visibility were about as good as one could hope for, and road conditions just about everywhere seemed excellent.
The things that managed to completely stymie this driving robot were the same ones that anyone who has tried autocrossing has very likely already abused: orange traffic cones.
Interestingly, the car’s vision system (which fuses camera, lidar, and other sensors) can clearly see the cones around the corner, as you can see in the visualization screen here:
So the car knows the cones are there, and seems to be attempting to plot a course that sort of avoids them. The presence of the cones has already confused the car, which remains paused at this point before the turn, blocking traffic. It’s not a big deal really, as it’s a side street and at least one other car is able to get around the baffled vehicle.
When the car is stuck and calls for help to Waymo’s roadside assistance, what I really like is how much that delights the passenger, who laughs and actually claps his hands in glee. Really, Waymo couldn’t have hoped for a better passenger to have in the car when it goes to hell like this.
Then, while waiting for the support vehicle, some algorithm in the car’s silicon brain must have arrived at a decision, because the Waymo car made the turn, its fear of cones temporarily overcome, only to stop in the middle of the lane, another terrifying cone dead ahead.
Looking at the visualization again, we see the cones clearly marked, and the car mostly in the coned-off lane, but with its butt sticking out into the open lane of traffic.
At this point, a Waymo support vehicle is on its way to put a human driver in the car to rescue it; Waymo keeps support vehicles in the area of its cars to handle these types of situations. The company claims it doesn’t assign support vehicles to AVs one-to-one, though JJRicks seems skeptical of that.
Then, strangely, the AV reverses out of its stuck position (again, reversing into an active lane of traffic, not what I’d call a great driving idea) and attempts to move again, but then stops, which only accomplishes having the car block the entire lane instead of just half of it.
The support driver is almost at the car, but before they arrive, a road maintenance worker removes the cone from in front of the confused car, and comes to the window to ask what the hell is going on, basically, as other cars honk and line up behind the Waymo van.
Removing the cone must have given the car an idea, because before the Waymo support van could arrive, the car decides to make its escape, and starts driving again.
The Waymo support van is covering some ground, and for a moment it seems like the car has “fixed itself,” but then its old nemesis, the Orange Cone, shows up, and the car stops yet again:
It seems to be trying to plot some kind of path between the cones? Does it think it’s in a slalom?
Eventually, Waymo’s roadside assistance arrives and takes over, but only after the car had stopped in confusion about three times and run from the Waymo support van twice.
Nothing seemed really all that dangerous, though there was this near miss:
Did you see that? The Waymo car had to brake suddenly to avoid driving into the car passing on the left. That’s not great.
Waymo did issue an official statement (in the middle of JJRicks’ YouTube video) about the incident:
So, Waymo says it has “already assessed the event and improved [its] operational process,” which I hope is corporate-talk for we explained to our cars what cones are.
Ultimately, this is a fantastic example of both how far AVs have come and how far they have yet to go. It gives us a sense of what needs to get fixed before we can actually hit Level 5, or even fully working Level 4.
Traffic cones are not unusual objects to encounter on a road, and I’m amazed the car was so confused by them. They’re small, immobile, and easy to navigate around. Why was this such an issue, and, even more importantly, why did the car appear not to understand what they were?
The real world is full of so, so many more unpredictable and dangerous things than traffic cones; things fall off of cars and trucks all the time, strange animals bolt into traffic, people do unexpected things — Level 5 autonomy is still a dream that will require a lot of work to reach.
Of course, I suppose this is all part of the process of learning! Traffic cones, though, still seem pretty remedial to me.
Technology is amazing, but I wouldn’t hand over your keys to a robot just yet.