The more you think about autonomous cars, the more questions you raise, and each question prompts still more thinking. It’s like being trapped in the world’s most useless perpetual motion machine. Occasionally, though, an interesting question surfaces, like this one: what will crash-testing an autonomous car entail?
Much of the East Coast is about to get slammed by a nor’easter, with areas in and around Washington D.C. set to receive up to two feet or more of snow. Schools have been canceled, airports are functionally shut down, and even D.C.’s Metro is set to shutter for the weekend. But you’re hungry, and you’ve either got…
I tried to stay out of the conversation about that now-infamous crash in which a car swerved and hit two people on a motorcycle. I followed along as the comments climbed past 4,000 between the initial post and news of the driver’s arrest. But the thing just keeps nagging at me.
Devices like laser-guided bombs and nonlethal weapons have the potential to reduce civilian casualties and wanton suffering. But as these new technologies emerge, are humans actually becoming more ethical about waging war, or is killing just becoming easier?
If we start holding robots responsible for their actions – and accidents – we let their human designers and operators off the hook.