We're wary of self-driving cars in general, and Google's driverless car technology is anything but proven. Still, we have to admit that seeing Google's behind-the-scenes process is extremely cool. Join us as we cautiously examine how Google's trying to make self-driving cars work.
The fascinating part here is how the self-driving car sees the world, as demonstrated in this video (it gets especially interesting around nine minutes in) from the IEEE International Conference on Intelligent Robots and Systems. A 64-beam laser, radar, and cameras simultaneously detect traffic lights and read their states, track pedestrians, and watch other cars.
Since all of these inputs are tracked omnidirectionally, the car's situational awareness is partially superior to a normal human's. Granted, it probably can't sense if two meth-addicted transvestite sex workers are having a fight that's about to spill onto the street, but it seems to sense most high-likelihood events.
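For the curious, here's a rough sense of what that omnidirectional fusion might look like in code. This is a toy sketch we put together, not anything from Google's software; every name, data structure, and threshold in it is made up for illustration.

```python
from dataclasses import dataclass

# Toy sketch of sensor fusion, not Google's actual code: each sensor
# reports detections with a bearing and range around the car, and the
# car merges agreeing detections into one 360-degree track list.
# All thresholds here are hypothetical.

@dataclass
class Detection:
    sensor: str          # "lidar", "radar", or "camera"
    bearing_deg: float   # angle around the car, 0-360
    range_m: float       # distance to the object in meters
    label: str           # "pedestrian", "car", "traffic_light", ...

def fuse(detections, match_deg=5.0, match_m=2.0):
    """Greedily merge detections that agree on bearing and range."""
    tracks = []
    for d in detections:
        for t in tracks:
            if (abs(t["bearing_deg"] - d.bearing_deg) < match_deg
                    and abs(t["range_m"] - d.range_m) < match_m):
                t["sensors"].add(d.sensor)  # same object, second sensor confirms it
                break
        else:
            tracks.append({"bearing_deg": d.bearing_deg, "range_m": d.range_m,
                           "label": d.label, "sensors": {d.sensor}})
    return tracks

detections = [
    Detection("lidar", 12.0, 30.0, "car"),
    Detection("radar", 13.5, 29.2, "car"),          # same car, seen by radar too
    Detection("camera", 181.0, 8.0, "pedestrian"),  # behind the car: still tracked
]
for t in fuse(detections):
    print(t["label"], "at", t["bearing_deg"], "deg, seen by", sorted(t["sensors"]))
```

The point of the toy example is the 360-degree part: a human driver has a windshield and mirrors, while the fused track list doesn't care whether an object is ahead of or behind the car.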
More impressive than how it sees the world is how it reasons about that data. The video of the autonomous Prius shows it judging the intentions of cars at a four-way stop and, after yielding to people jumping their turn, easing its way into the intersection to announce its own intentions. Using previously collected data about the world, it can tell what's a permanent object (a light pole) and what's new.
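That permanent-versus-new distinction is easy to picture as a lookup against a prior map. Here's a minimal sketch of the idea under our own assumptions (made-up coordinates and tolerance, nothing from Google's system):

```python
import math

# Hypothetical prior map: static objects surveyed on an earlier drive,
# keyed by (x, y) position in meters.
PRIOR_MAP = {
    (104.2, 88.0): "light pole",
    (120.5, 90.3): "mailbox",
}

def classify(detection_xy, tolerance_m=1.5):
    """Return 'permanent' if the detection matches a surveyed static object."""
    for (mx, my), name in PRIOR_MAP.items():
        if math.hypot(detection_xy[0] - mx, detection_xy[1] - my) < tolerance_m:
            return f"permanent ({name})"
    return "new"  # not on the prior map: could be a pedestrian or a parked car

print(classify((104.6, 87.7)))   # near the surveyed light pole -> permanent
print(classify((110.0, 88.0)))   # nothing surveyed here -> new
```

Anything that wasn't there on the last pass is, by definition, worth paying attention to, which is why the prior map matters as much as the sensors themselves.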
Do we believe an autonomous car can make better driving decisions than we can? Absolutely not. Can it make better decisions than some other drivers? It looks that way.
More info on how it works at IEEE Spectrum.
(Hat tip to Jeremy!)