Google's Waymo Asked People To Test Its Semi-Autonomous Car Tech. What Happened Next Will Not Surprise You

Back in 2013, Google’s self-driving car unit ran an experiment: it gave its semi-autonomous driving technology to a handful of employees with long highway commutes, as Waymo CEO John Krafcik recounted at IAA Frankfurt. Because the technology was not fully tested yet, Google made those employees agree to several stipulations:

  • They must pay attention and keep their eyes on the road at all times
  • They can take their hands off the wheel, but they must stay alert
  • Google will watch them via in-car cameras, and if they don’t follow these rules, Google will take the car away from them

You will absolutely believe what happened next:

Texting while driving. Applying makeup. Using a laptop. And, of course, falling asleep. That last one, according to Krafcik, was the incident that made Google shut down the project, which lasted only a few weeks.


That project later became Waymo, the Google subsidiary now dedicated entirely to fully autonomous driving. In Waymo’s corporate narrative, this short experiment was the catalyst for its “pivot” from semi-autonomous highway driving aids to full self-driving technology that requires no human at all:

Now, the good news is, we had gained a priceless insight into a real conundrum. Our AutoPilot system was very advanced. It used multiple cameras, radar and lidar, with a massive on-board computer. In fact, it was so advanced that human drivers became too comfortable, too quickly. And so we realized this was a big problem for driver assist technology — the better you make it, the more likely humans would be to trust it too much.

So we pivoted in 2013, with inspired resolve, and with the belief that the only way to solve the problem of roadway safety, and the only way to deliver the opportunity of mobility for all, was to take the human completely out of the loop. We committed then to full autonomy, no driver monitoring, nor driver’s license, required.


Waymo is acknowledging a fundamental truth here. A Level 2 autonomy system like Tesla’s Autopilot has only two possible outcomes: either it is buggy as hell and provides no real benefit, or it works well and drivers trust it too much. That is simply human nature, and there is no way around it.


In fact, what’s striking about Waymo’s story is how much it sounds like a bizarro-world Tesla (the fact that Google even called its experiment “AutoPilot” seems a bit too on the nose here). In the end, Tesla developed a similar technology with the same pitfalls and, instead of worrying about how it would affect driver behavior, opted to sell it for an extra $6,000 per car while feeding the perception that it is practically full autonomy, even when it very clearly is not.