Nissan just announced that by 2020, they'll be ready to sell you an autonomous, self-driving car. And I believe them, since I've just ridden in one of these robo-cars. You want to know what it's like to ride in an autonomous car? Boring. And that's exactly what they want it to be.
Riding around in the sensor-packed Nissan Leaf autonomous test cars is really quite mundane, because the cars do pretty much exactly what they're supposed to, with a minimum of fuss. These modified Leafs have a pair of large LCDs on the dash that display what the car is "seeing," and while these are fun to watch, they won't make it to the production vehicles.
Here's about 5 minutes of video of me riding in the autonomous Leaf, while asking the engineer some questions:
Nissan's autonomous cars use GPS for the destination, but almost all the actual driving is handled via real-time, on-board sensors. The cars have visual cameras, radar, sonar, and laser range-finding to figure out where they are in relation to the world around them. The car's visible-light input from the camera is capable of detecting and identifying cars, people, animals, buildings, sasquatches, rolling debris, everything. I asked the engineer how it determined what was what, and he explained it was a combination of scale and visual matching to a database.
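To make that "scale plus database matching" idea concrete, here's a toy sketch of how a classifier could pick the database entry whose typical real-world size best matches a detection. Every class name, dimension, and scoring rule here is my own hypothetical illustration, not anything Nissan described in detail:

```python
# Toy illustration of scale-based matching against an object database.
# The classes, sizes, and distance metric are hypothetical, not Nissan's.

OBJECT_DATABASE = {
    # class: (typical width in meters, typical height in meters)
    "car":        (1.8, 1.5),
    "pedestrian": (0.5, 1.7),
    "truck":      (2.5, 3.5),
}

def classify(detected_width_m, detected_height_m):
    """Return the database class whose typical size best matches the
    detection, using a simple relative-size distance."""
    best_class, best_score = None, float("inf")
    for name, (w, h) in OBJECT_DATABASE.items():
        score = abs(detected_width_m - w) / w + abs(detected_height_m - h) / h
        if score < best_score:
            best_class, best_score = name, score
    return best_class

print(classify(1.9, 1.4))  # a car-sized detection → "car"
```

A real system would combine a size check like this with appearance matching (which is presumably how it tells a Wienermobile from a car of the same length), but the principle, measure the detection, compare it to known object profiles, is the same.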
The idea that the camera is grabbing images and then matching them in a visual database to determine what it's looking at is fascinating. I asked how many objects and cars were in the database. They wouldn't tell me. I asked what about things that were car-scaled, but didn't look like cars? Like the Wienermobile, for example? Would a giant hot dog confuse it? They said giant driving food is no problem.
When in autonomous operation, the steering wheel lights up with a series of blue LEDs along the wheel's rim. The car's speed is determined by overall conditions and the speed limit, and is currently governed at about 50 mph. The car will pass a slower car if the average speed becomes slower than desired (by what metric wasn't clear) and can be "asked" to pass if the person in the driver's seat activates the turn indicator.
In fact, even if you're not actually driving, you can guide the car by activating the turn signals, which will then cause the car to find the quickest way to safely change lanes or, in lower-speed, urban environments, take the turn. This raises an interesting question: is that automated driving, or just very, very assisted driving?
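The passing behavior described above boils down to a small decision rule: pass when traffic drops below the desired speed, or when the driver explicitly asks via the turn indicator. Here's a minimal sketch, with function and parameter names that are my own assumptions rather than Nissan's:

```python
# Hypothetical sketch of the passing logic described above.
# Names and the "adjacent lane clear" precondition are my assumptions.

def should_pass(avg_speed_mph, desired_speed_mph,
                signal_activated, adjacent_lane_clear):
    """Decide whether the car should initiate a passing maneuver."""
    if not adjacent_lane_clear:
        return False  # never change lanes into occupied space
    if signal_activated:
        return True   # the driver "asked" via the turn indicator
    # Autonomous decision: traffic is slower than the car wants to go
    return avg_speed_mph < desired_speed_mph

print(should_pass(42, 50, False, True))  # traffic is slow → True
print(should_pass(50, 50, False, True))  # keeping pace → False
print(should_pass(50, 50, True, True))   # driver asked → True
```

Note that both paths go through the same safety gate: whether the maneuver was requested or self-initiated, it only happens if the adjacent lane is clear.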
Other interesting details: there's a big, red SOS button above the windshield that you can smack if you're riding along and choke on your massive car-hoagie or stab yourself while doing a bit of wood carving on your morning commute. When hit, the SOS button causes the car to come to a safe halt, puts the hazard lights on, and contacts someone, presumably 911 or Nissan's concierge services.
Nissan spoke a bit about their development of the autonomous cars, and some of what they foresee in the future. What I found especially interesting was their assessment of human ability vs. machine ability. Humans come out way behind the machines in terms of reaction speed and ability to process visual information, but what I haven't yet gotten a good answer on is how they measured people as though they were machines. Human visual processing at 10^3 FPS I suppose is relatively easy to determine, but how did they arrive at 10^4 petaflops for our brains' "processing speed?" I'll have to dig into that further, even if I end up finding out the 1 MHz 6502 in my old Apple II is way smarter than me.
I rode in the car as it searched for a parking spot and during a simulated highway run, and both times, if you weren't paying attention to the guy not-driving in the front seat, it felt just like being in any car.
When parking, the car can scan for open spots, and can wait for an exiting vehicle to clear a spot, determine it's open, and then take it. I was in it as it parked "Japanese style" which the technician insisted was backing into a spot. It's capable of pulling in nose-first as well, which is apparently what we prefer here in the US.
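The parking behavior above can be sketched as a simple priority rule: take an open spot if one exists, otherwise wait on a spot that a vehicle is in the middle of vacating, otherwise keep scanning. This is a hypothetical illustration of that logic, not Nissan's implementation:

```python
# Hypothetical sketch of the spot-search behavior described above:
# prefer an open spot, else wait for an exiting vehicle, else keep looking.

def next_action(spot_states):
    """spot_states: list of 'open', 'occupied', or 'exiting'.
    Returns a description of what the car would do next."""
    for i, state in enumerate(spot_states):
        if state == "open":
            return f"park in spot {i}"
    for i, state in enumerate(spot_states):
        if state == "exiting":
            return f"wait for spot {i} to clear"
    return "keep scanning"

print(next_action(["occupied", "exiting", "occupied"]))  # waits on spot 1
print(next_action(["occupied", "open", "exiting"]))      # parks in spot 1
```

Whether it then backs in "Japanese style" or pulls in nose-first would just be a final maneuver choice once the spot is claimed.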
It can't yet track a person with a bunch of bags walking to their car, which I suggested would be a crucial addition for crowded urban centers. When it spots a person with either keys in hand or a bunch of bags, it should also have a speaker that asks "HELLO. ARE YOU LEAVING?" and, when it gets a "yes" back, follows that person to their car like a crazed stalker.
Nissan's engineers also claimed that their system was "unhackable," since all the information needed to drive is generated on-board rather than received wirelessly, save perhaps for the GPS data used to determine the destination. That seems like the same vulnerability every autonomous car has, but at least the safety and collision-avoidance systems are less likely to be compromised.
Nissan's autonomous cars do seem to work. In the Leaf platform, giving up the experience of driving isn't exactly a huge loss, anyway. And that's likely the kind of car this system will be sold on. I think GT-Rs will be safe from robo-control, for a while at least.