I suppose I may as well start this off with a reminder that no sinister baby-eating organizations are paying me to say bad things about Tesla. I think Tesla designs some amazing cars, and has revolutionized the electric car landscape. There. That said, I also think it gets an absurd amount of unsettling adoration, especially for its Level 2 semi-autonomous Autopilot system. Today I saw a tweet that gave Autopilot credit for saving a person’s life, complete with dashcam video, but what I see is a very common situation a human driver would have handled better. Let’s dig in.
Okay, so here’s the tweet in question, complete with video:
So, the situation is this: the Tesla’s owner was driving along I-5 North in Los Angeles, with Autopilot engaged, when a truck changed lanes in front of the Tesla. The Tesla reacted by maintaining its speed and crossing over the unbroken white line that marks where the highway divides between the I-5 North and the 101 North, putting the car onto the 101 North instead of its intended path.
The Tesla owner says the truck was “changing lanes last minute,” that the actions of the car “saved my life,” and that Autopilot is “life saving tech.”
My problem here is that any human driver would have seen that the truck had its turn indicator on, and was making a pretty normal lane change. You can see the turn indicator blinking from the very start of the dashcam clip; the truck was clearly intending to change lanes, and I don’t see how you’d call it “last minute” unless you just started paying attention as it began actually changing lanes.
We’ve all been in this exact same situation, and we’ve all done the same thing: let off the gas a bit to slow down slightly, and let the truck change lanes in front of us. It’s trivial, really. Most drivers can’t even remember how many times they’ve done it.
You just slow down a bit. Not a brake stomp, just a slight slowing to let the truck in. No biggie.
Stubbornly maintaining the same speed and ignoring the truck that is clearly signaling a lane change until it actually begins to move into the lane, as Autopilot did, is just bad driving; plus, Autopilot put the car onto the wrong freeway for the driver’s destination, which she did note she corrected in another tweet:
Again, this is shitty driving! Not slowing down for a truck clearly signaling a lane change, then speeding up and whipping back across a solid white line to the other side of the highway split, may not be technically illegal in California (there seems to be a lot of confusion on that point), but it’s not good driving by any stretch.
I’m not going to blame her too much for that maneuver, though; I’ve lived in LA, and being shunted to the 101 when you wanted the 5 can be a colossal ass-pain.
On some level, sure, it’s impressive that a machine can navigate a highway at all without immediately slamming into a guard rail or driving into a ditch, no question. But, what happened in this video is definitely not an example of Autopilot saving anyone’s life.
It’s much more of an example of Autopilot not noticing things a human driver would, no problem, and making some poor decisions as a result. It didn’t wreck, sure, but it did drive in a way that could create dangerous results.
Based on the Twitter profile, this person clearly invests a lot of her identity in her admiration for Elon Musk and his various endeavors, and that’s absolutely her right. You do you, even if that means venerating Elon. Have at it.
But I think if you’re going to call out Autopilot as “life saving” then it’s worth some scrutiny, and, in this case at least, I think that accolade is misplaced.
Am I wrong, here? I’m happy to hear what everyone has to say, even the Tesla-stans who will accuse me of being a Luddite who craves to see our highways turn into rivers of blood. So chime in!