Recently, a Tesla fan account shared a video of a Tesla using so-called Full Self-Driving on a two-lane road. As the video plays, you can see the car approach a marked crosswalk. The center screen shows that the car’s sensors have detected a pedestrian in the crosswalk, but unlike the cars traveling in the opposite direction, the Tesla doesn’t stop and yield. Instead, it continues driving as if the pedestrian weren’t there.
“One of the most bullish / exciting things I’ve seen on Tesla Full Self-Driving Beta 11.4.1. It detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so,” they wrote.
Sorry, but that’s not remotely exciting or bullish. And while it may be what some humans do, it’s also against the law. A pedestrian crossing at a marked crosswalk has the right of way, and drivers are required to yield to them. The fact that the Tesla didn’t stop is a major problem. Why doesn’t the software know to yield to pedestrians?
Is that just too complex a problem to solve yet? Is it because Tesla thinks it knows better than the people who make the laws? Either way, it’s clear that FSD shouldn’t be allowed to operate in areas where pedestrians may be present.
The fact that Tesla fans look at that video and see amazing technology rather than a major flaw in FSD is also terrifying, even if it’s not exactly surprising. They seem to think that Teslas should be allowed to break the law whenever the car decides it can do so safely. In this one instance, no one got hurt, but that’s not the point. Being a special fancy person in a special fancy car doesn’t give you the right to ignore the rules of the road. And it’s not like FSD has a perfect track record for safety.
At a time when pedestrian deaths continue to rise, we need to be doing more to keep people safe, not cheering on irresponsible uses of half-baked technology.