What It's Like To Ride Around In Nissan's Autonomous Car
Jason Torchinsky, 8/27/13 2:08pm
Filed to: Autonomous Cars, Nissan, Robot Cars

Nissan just announced that by 2020, they'll be ready to sell you an autonomous, self-driving car. And I believe them, since I've just ridden in one of these robo-cars. You want to know what it's like to ride in an autonomous car? Boring. And that's exactly what they want it to be.

Riding around in the sensor-packed Nissan Leaf autonomous test cars is really quite mundane, because the cars do pretty much exactly what they're supposed to, with a minimum of fuss. These modified Leafs have a pair of large LCDs on the dash that display what the car is "seeing," and while these are fun to watch, they won't make it to the production vehicles.

Here's about five minutes of video of me riding in the autonomous Leaf while asking the engineer some questions:

Nissan's autonomous cars use GPS to navigate to the destination, but almost all of the moment-to-moment driving is handled via real-time, on-board sensors. The cars have visual cameras, radar, sonar, and laser range-finding to figure out where they are in relation to the world around them. The visible-light input from the camera can detect and identify cars, people, animals, buildings, sasquatches, rolling debris, everything. I asked the engineer how it determined what was what, and he explained it was a combination of scale and visual matching to a database.

The idea that the camera is grabbing images and then matching them against a visual database to determine what it's looking at is fascinating. I asked how many objects and cars were in the database. They wouldn't tell me. I asked what about things that were car-scaled but didn't look like cars, like the Wienermobile, for example. Would a giant hot dog confuse it? They said giant driving food is no problem.