Elon Musk Promises You Can Pay For 'Full Self-Driving' Subscription In 2021 Even Though It Won't Really Exist Then

Image: Jason Torchinsky/Tesla/Twitter

For many Tesla owners and fans, the deceptively named Full Self-Driving (FSD) system that Tesla CEO Elon Musk has been promising is a very exciting and desirable thing. Of course, it’s also a $10,000 option, and not everyone has that kind of scratch just sitting around. Benevolent Elon, though, has a solution for you: You can pay for it by subscription starting early next year! Hot damn! Too bad it’s not really “full self-driving,” though.

Here’s Elon’s reply to a tweeted question that revealed the plan:

Okay, so, Tesla leasers can pay for FSD monthly early next year, and then they won’t have to drop $10,000 for a system that has yet to be released, instead just paying about $100 or more a month for a system that has yet to be released.

And it’s worth mentioning that while what we’ve seen of FSD’s capabilities is very impressive, with exciting videos from Tesla showing it off, like this one:

...it is in no way a Level 5 autonomous system, and remains a Level 2 semi-autonomous system, no matter how capable it is. Why is that? Well, I’ll let Tesla explain in their own words (emphasis mine):

Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.

See that? It does a hell of a lot of driving on its own, no question, but, like any Level 2 system, it requires the driver to be ready to take over with no notice, at any time at all. This means that if you want an autonomous vehicle so you can sleep or watch movies or have a mind-clearing wank or simply not pay attention to driving, you are very much out of luck, because FSD will not do that for you.

Tesla is very aware of this, of course, and they say the same thing over and over in several paragraphs about Autopilot and FSD:

The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving features evolve, your car will be continuously upgraded through over-the-air software updates.

...

Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel.”

...

As with all Autopilot features, you must be in control of your vehicle, pay attention to its surroundings and be ready to take immediate action including braking. This feature is in Beta and may not stop for all traffic controls. While Traffic and Stop Sign Control is enabled on surface streets with Autosteer active, your speed will be confined to the noted limit. Please review the Owner’s Manual for additional information, instructions for use and warnings.

This brings us to the inherent problems with Level 2 systems: people do not work well with them. This is not a tech problem, it’s a conceptual problem, and it’s not just me who thinks so.

FSD’s additional capabilities are going to make the problems we’ve already seen (people sleeping, watching movies, or simply not paying attention because their Tesla’s Autopilot is doing most of the work) far worse, because the car can now do even more, right up until it can’t.

Without any sort of redundancy in its camera-dependent (as in, no LiDAR) FSD hardware, anything from a mud puddle to a swarm of insects to ice, snow, or bird shit could completely disable the car’s ability to drive, at which point it will ask the person behind the wheel to take over, possibly at highway speeds, with zero awareness of whether that person is even awake.

All this effort spent figuring out how to make a car drive better under human supervision is meaningless for the goal of full autonomy without a reasonable plan for handling failovers and handoffs to the human driver.

Really, the effort put into FSD’s impressive spectrum of capabilities would have been better spent figuring out how to handle safe failovers, but I’m not surprised it didn’t go that way: safe failovers aren’t just unsexy, they force you to confront the flaws in these autonomous systems, and that’s bad for marketing.

But, until a self-driving car can safely get out of harm’s way when its abilities are compromised, we’re all stuck at Level 2, no matter how full-featured that Level 2 system is.
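Getting “safely out of harm’s way” actually has a formal name in the SAE framework: the minimal risk condition, which a Level 3-and-up system must be able to reach on its own when its sensing degrades, instead of just dumping control on the human. Here’s a toy sketch of that decision logic in Python; every name and the eight-second takeover window are my own illustrative assumptions, not anything from Tesla’s actual software:

```python
from enum import Enum

class Action(Enum):
    CONTINUE = "continue"        # sensing nominal: keep driving
    ALERT_DRIVER = "alert"       # sensing degraded: demand a human takeover
    HANDED_OFF = "handed_off"    # human took over: system disengages
    MINIMAL_RISK = "pull_over"   # no takeover in time: slow down, get clear

def decide(cameras_ok: bool, driver_took_over: bool,
           seconds_since_alert: float, takeover_window: float = 8.0) -> Action:
    """Pick the next action for a car whose only forward sensing is cameras."""
    if cameras_ok:
        return Action.CONTINUE
    if driver_took_over:
        return Action.HANDED_OFF
    if seconds_since_alert < takeover_window:
        return Action.ALERT_DRIVER
    # The hard part this article is talking about: executing this maneuver
    # safely using the very sensors that just failed.
    return Action.MINIMAL_RISK
```

The point of the sketch is the last branch: a Level 2 system effectively stops at ALERT_DRIVER and hopes, while anything higher has to be able to execute MINIMAL_RISK on its own.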

Figuring out how to safely deal with a car with compromised sensors or cameras on its own is by no means an easy task. How should it navigate if it can no longer see? Will it need to communicate its impaired state to any nearby vehicles, to request data about the immediate surroundings or to at least warn them to keep clear?

Maybe. But that would require industry-wide standards and cooperation, and we know how carmakers suck at that, too.

Tesla’s FSD is dangerous, period, because it’s a Level 2 system that requires human vigilance and at the same time does everything it can to convince the human that they do not need to be vigilant. It’s inherently, conceptually flawed, and every Tesla stan who’s about to tell me how advanced the tech is and how good it is and how I’m the one who’s murdering thousands of people because I’m standing in the way of Elon’s Golden Path can fuck right off.

Tesla’s FSD, whether you’re paying for it monthly or by dropping a briefcase full of $20s on the desk of a Tesla store, is a problem. The best thing they can do is put development resources into figuring out how to get past the Level 2 barrier and make a system that’s actually capable of safe failovers and hand-offs.

Once the human is not required to always be paying attention and ready to leap into action, these systems can actually start doing some good. Until then, it’s just a showy, dangerous tech jerk-off that isn’t helping anybody.

So, yeah, you can pay for this monthly in 2021, if you want. Bravo.

Senior Editor, Jalopnik • Running: 1973 VW Beetle, 2006 Scion xB, 1990 Nissan Pao, 1991 Yugo GV Plus, 2020 Changli EV • Not-so-running: 1977 Dodge Tioga RV (also, buy my book!: https://rb.gy/udnqhh)

DISCUSSION

D-Livs

I fucking hate Jalopnik. This is not a website for car enthusiasts — it’s a website for people who recognise good things 15 years after the fact.

Your skepticism sucks. I’ve ridden in engineering prototypes of self driving cars. Have you, Jason?

They work. There are multiple companies developing this. Try being excited for technology for once.

Instead of the current tech journalism model whereby people who could never land tech jobs and possess no skills whatsoever opine and militate furiously about it, imagine we had the sportscaster model whereby former great technologists comment on developments in real time. Twitter is 1000x more valuable where you can find and follow the people who have worked on these things. Hell, linkedin is great as people like Sam Livingstone write articles dripping with substance reviewing car design. Car Design News is worth a subscription!

The dangerous thing here is Jalopnik brainwashing the unwashed masses, teaching them falsehoods and doubling down on the incorrect definition of autopilot.