I’ve always been a little baffled by the intense brand loyalty some people feel for car companies, but I think this particular expression is far more unsettling than any “rather push a whatever than drive a whatever” hat. I’m talking about a very happy Tesla owner who has posted an update to his will (most likely non-binding) on a Tesla forum that would prevent Tesla from being sued in case he’s killed while Autopilot is active. Oh boy.
In the unlikely case you’re not aware, Autopilot is what Tesla calls their semi-autonomous driving system. Autopilot is not a fully autonomous system, and some argue that it’s not really finished, and that Tesla owners are acting as de facto beta testers every time they use the system.
While the system generally works quite well and has an impressive overall track record, it’s not flawless, and there have been some accidents that may be related to Autopilot use, including at least one fatal accident.
These accidents, along with the recent pedestrian death caused by an impact with an Uber test car operating autonomously, may be what inspired a Tesla owner with the online handle Gixxxerking to post this on the Tesla owners’ forum:
I’m making this post public just in case...
I’m a willing participant in Tesla’s goal to revolutionize transportation. I realize that risk have to be taken to develop and mature any technology. I do not believe Tesla has any intent to harm me or my family. I also do not expect perfection.
Having said that, if I’m ever injured or killed while AP is active and it’s determined to be a result of the technology, DO NOT SUE TESLA on my behalf or pass any legislation. I understand the risk and believe the risk are acceptable. I’m a willing participant in the development of technology that makes driving easier, safer and automated. My expectation is that in a very short time as a result of people who are willing to use this technology, most and eventually all cars will be self driving which will ultimately save many hundreds of thousands of lives. This goal is worth the risk to my own safety to allow companies willing to take the risk to mature the technology.
Huh. Okay, first, nobody really thinks this will be legally binding as a will. And this may all just be a troll, which is definitely not outside the realm of possibility. So let’s just skip that technical argument and think about what’s being said here.
First, I fully understand and support the right of a driver to drive a car that’s not really safe. Pretty much all of the ancient, tiny shitboxes I drive are deathtraps, by modern standards, but I know the risks and I’m happy to drive them.
Still, this is something different. The car being referred to here isn’t some out-of-date vintage car, it’s a cutting-edge modern car. The way Gixxxerking words his “will” addendum seems to imply that he’s aware that by using Autopilot, he’s accepting that it’s still, effectively, a product in development, and as such is imperfect, and those imperfections could, conceivably, end up killing him.
And, if that happens, he’d rather nobody bother Tesla about it.
There’s a lot going on here. Remember, this is a die-hard Tesla supporter, someone who so believes in what Tesla is trying to do that he’s literally willing to put his own life on the line. This intense Tesla fan also, incredibly, feels the need to include this sentence in his post:

I do not believe Tesla has any intent to harm me or my family.
That’s sort of insane, right? What’s going on when we even feel the need to make statements like that about a carmaker at all? I mean, sure, I genuinely don’t think Tesla is interested in harming anyone or anyone’s family, but at the very least this should hint at a PR issue for Tesla.
The idea that this Tesla owner and fan is happy to risk his life to help Tesla gather the data they need to perfect their autonomous driving system raises the question: what do all the other Tesla owners think?
Every car you drive carries the risk of getting killed in a crash. We all know this, and we accept the risk. But if you die because of a design flaw in, say, your Lexus’ power steering motor, would you ask that nobody sue Lexus because their power steering system, once fully developed, could one day help make driving safer?
Also, it’s not just the Tesla driver who would be affected by an accident caused by a failure of Autopilot; the car is being used on public roads, after all, and there are any number of failure scenarios that could involve other people or vehicles.
Gixxxerking confirmed that the post is “100% genuine” and, while he knows it’s not legally binding, he stands by the content.
I can’t help but feel like this post, this whole idea, is indicative of some real, deep-seated problems in the Tesla community. It shows that, even among Tesla’s biggest supporters, there’s a perception that Autopilot isn’t ready for prime time, while at the same time the esteem Tesla owners have for the company and its products is such that they’re willing to literally die and leave the company blameless, even at the risk of depriving their families of any recourse.
Perhaps it’s just this one owner who feels this way. And perhaps he’s on to something, in a roundabout way: should Autopilot still be considered a beta system? Should people who choose to use it be informed that it’s still being developed? Should other drivers be made aware when they’re around a car using the system?
I don’t think any Tesla under the control of Autopilot is automatically some menacing deathtrap by a long shot, and I believe such technologies have great potential to make driving much safer.
But, I have to admit, this “will” is strange and unsettling. Tesla isn’t some altruistic, benevolent force that just wants to make the world better. They’re a company, they want to make money, and if their mistakes end up killing people, there’s no reason they shouldn’t be held responsible, no matter how cool you think they are.