This Tesla Using FSD Beta Trying To Drive Into A Cyclist Is Comedy Gold

I mean, the potential danger here isn't funny, but the whole setup really is.

Screenshot: YouTube

We all know that there’s plenty of wildly dedicated, glassy-eyed Tesla fans out in the world who will seize your arm at a party and talk toward you about the Glory of Elon and the Good News about Tesla, gripping your bicep tighter and tighter until you emit an involuntary gasp of pain. Sometimes they make videos showing off the latest Full Self-Driving (FSD) Beta release, and breathlessly remind you how incredible and life-changing it all is. Then, sometimes, we get videos like this one, where the praise must pause as the car tries to turn right into a cyclist.

This video is especially good because it features one of the Elon-Tesla Co-Prosperity Sphere’s most ardent cheerleaders, Omar Qazi (who tweets under the handle Whole Mars Catalog), who provides a near-constant flow of praise and justification of the FSD system and Tesla in general. This is why this incident works so well: the comedic timing is ideal, as the car decides to turn right into that cyclist just after Omar says “you can actually make thousands of people drive safer just with a software update.”

And then the car tries to steer right into the dude on the bike.

Here, watch:

Gold. It’s gold.

The reactions right after the driver has to grab control of the car to spare the cyclist are very telling, too.

Omar starts to laugh, a reaction I can hardly blame him for, while HyperChange, the driver, asks “are we gonna have to cut that?”

Immediately after that, the driver claims “it wouldn’t have hit him,” which I suppose may be true, but, then again, the fucking thing did turn the wheel to steer directly at the cyclist, so I’m not so sure I’d trust anything at that point.

From this point on, Omar goes into full justification mode, suggesting that the incident doesn’t need to be cut from the video (a good call, though pretty telling that the first instinct of the video poster was to ask about cutting the incident) and noting that the FSD Beta system “knew something was up” because it was “beeping at you.”

Okay, sure, it was beeping. The driver also suggests they didn’t get “that close,” and both agree the cyclist “didn’t notice,” which really doesn’t make anything better.

Omar goes on to suggest that this sort of thing is “shocking to people because it’s new,” though I may take the bold position that it’s shocking to people because it fucking turned right toward a cyclist who was clearly visible for no good discernible reason. I think maybe that’s a bigger shock than, you know, “newness.”

The justifications that follow here are really fascinating. Omar claims the system "functioned exactly as designed" because "it detected that there's a potentially dangerous situation" (you know, the dangerous situation caused by that very same system deciding to turn the car's steering wheel so the car's vector of travel intersected directly with a guy on a bike) and, as a result of this situation, "the car started beeping at you to make sure it had your attention."

Let’s just think about this logic for a minute. That’d be like saying you did the right thing by calling 911 while you were attempting to stab someone, because you were alerting the authorities to a potentially dangerous situation where a person, you, was lunging with a knife at another person.

This is crazy talk. The system that alerted the driver is the same system that turned the car at the guy on the bike. That’s how the system is designed? Jeezis.

Then the driver says something that’s even stranger, philosophically:

“Well, the whole time you’re driving on human pilot, you’re making your car avoid hitting a biker. You’re constantly making your car avoid hitting a biker...but then you’re surprised that you’re doing it for one second while on FSD Beta.”

What the fuck is going on here? Okay, first, “human pilot” is just a silly way to talk about driving, and if you’re characterizing normal driving as “constantly making your car avoid hitting” anything, let alone a person on a bike, then I’d have to say your fundamental view of driving is deeply, dangerously wrong.

How did this dude drive before? Did he just steer at various walls and fences and trees and dogs and clutch at the steering wheel in a panic as he got close to the various obstacles, until he eventually pinballed his way to his destination?

When you drive, you’re controlling the car. You’re not “making your car avoid hitting” things, you’re adjusting the speed and direction yourself, on a path free of hitting things. A car is not this thing that wants to whip around whacking into everything that you have to jump in and stop; that’s closer to what’s happening with systems like FSD. The process is not the same at all. The car has no agency when it’s under “human pilot.”

People aren’t upset because they had to jump in to not hit a dude for “one second,” they’re pissed because the idiot car tried to do it at all, which human drivers generally know not to do.

All of this thinking here is just embarrassing poison.

I’m not sure if this particular drive was a factor in this, but Omar did seem to lose some of his enthusiasm for FSD Beta, based on some recent Twitter activity:

I understand your annoyance, there.

Tesla’s done some amazing things, no question. There are good reasons why they’re the best-known EV maker at the moment, and why they’re so popular. And, for all of its flaws, there’s a lot of impressive technology in the FSD Beta package.

But it’s not done, not by a long shot. It’s not being deployed or tested responsibly, and the reactions to its clear failings, the excuses, the strange mental Cirque du Soleil-type gymnastics needed to justify it: this is all really bad.

There are ways to approach the problem of automated driving well, and this is very much not it.

This time, nobody got hurt, and it was all pretty funny. This time.