Watch the Roborace DevBot 2.0 Race up the Hill at Goodwood Without a Driver

Screenshot: Goodwood Road & Race

Roborace wanted to show the gathered crowds that it had a car capable of driving itself up the hill at Goodwood, at speed. To do that, the autonomous race car company put YouTuber Seb Delanny in the cockpit to drive part of the way up the hill to an area I’m calling the “show-off zone”: the stretch where most of the donuts and burnouts happen, where people line both sides of the track, and where most of the action is. Once there, he got out of the car and waved goodbye as the DevBot 2.0 carried on its merry way up the hill with nobody onboard.


There is a huge difference between running up a driveway on full autonomous LIDAR tech and driving itself around a circuit with a dozen more of these things jockeying for position, or even navigating the mean streets of the real world at normal speeds. That being said, I’m pretty impressed with how smooth and seamless this self-driving racer seems to be. While I’m sure someone trained in the art of motorsport could still drive quicker than this computer can, this certainly looks faster than I expected it to be.

And here’s the view of the full run, including everything Delanny had to do inside the cockpit from the start to the show-off zone. Then the onboard video continues to show the ghost pilot take over and shoot away. It looks like the car was never really exceeding 100 kph, but that’s still mighty fine for the short climb.

It would seem that when there is no canopy of trees, the car relies partly on pre-marked GPS waypoints to navigate the course, but since the run up Goodwood has mostly foliage overhead, the car reverts to pure LIDAR navigation. You can see some of the tech involved in this video Roborace put out last week.
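For the curious, here is a minimal sketch of what that kind of GPS-or-LIDAR fallback might look like in code. This is purely illustrative, based on the behavior described above, not on Roborace’s actual software; the function names, accuracy threshold, and the scan-matching stub are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    x: float        # metres east of the start line
    y: float        # metres north of the start line
    heading: float  # radians


# Hypothetical quality threshold: when the GPS fix is worse than this,
# treat it as unusable (e.g. under the tree canopy) and trust LIDAR instead.
MIN_GPS_ACCURACY_M = 0.5


def localize(gps_fix, lidar_scan, prior_pose, course_map):
    """Pick a pose estimate from whichever sensor is trustworthy right now.

    gps_fix:    (Pose, accuracy_m) tuple, or None when no fix is available.
    lidar_scan: live point cloud from the car's LIDAR.
    prior_pose: last known pose, used to seed scan matching.
    course_map: pre-recorded point-cloud map of the hillclimb course.
    """
    if gps_fix is not None:
        pose, accuracy_m = gps_fix
        if accuracy_m <= MIN_GPS_ACCURACY_M:
            # Open sky: trust the pre-surveyed GPS waypoint frame.
            return pose
    # Canopy or signal dropout: fall back to matching the live scan
    # against the pre-recorded map of the course.
    return match_scan_to_map(lidar_scan, course_map, seed=prior_pose)


def match_scan_to_map(scan, course_map, seed):
    # Placeholder for an ICP/NDT-style scan matcher; a real stack would
    # refine `seed` by aligning `scan` to `course_map`.
    return seed
```

The point of the sketch is just the switch itself: when the sky is open the car can lean on surveyed GPS positions, and when the trees close in it has to figure out where it is purely from what the LIDAR sees against a stored map of the hill.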

Jalopnik contributor with a love for everything sketchy and eclectic.


DISCUSSION

At 0:40 engineer Holly Nall says “It’s really important that the car can localize, it still needs to understand its position in the world.” But in fact humans drove for a century, and mostly still drive today, without knowing their car’s “position in the world” at all, or in other words its geodetic coordinates. People don’t drive in relation to the ITRS; they don’t even drive in relation to information about the next block. They drive in relation to a.) their perception of the part of the world that is within a few seconds’ drive time (ordinarily about five, rarely more than ten) of their current location, and b.) their apprehension of the constantly shifting immediate dangers within that very limited range.

Furthermore, all the roads they drive on were designed to be used by drivers with that very limited, constantly updated range of knowledge.

This is the fundamental flaw of robotic cars these days. I’m not saying all humans are flawlessly great drivers, obviously not, but what fallible primates do on the road is very impressive if you think about it. If manufacturers of robotic cars want their computers to drive as well as mere humans do, on roads that were designed for the weaknesses and strengths of human driving ability, then they should emulate the way humans deal with those roads. But they don’t, and the way autonomous driving is being implemented these days, they can’t; they don’t even try to do what every driver, even a drunk one, does on a second-by-second basis.

The best of them can say “The car in front of me is slowing down.” But I haven’t yet heard of any that say “Whoa, look at all those brake lights, traffic is slowing down,” much less “It was all woods along this highway but over there are a few houses, I bet we’re coming up on a town, better keep my eyes peeled,” or “Why are there all these pedestrians along the side of the road?”