I really should just make a find-and-replace template for this article, because we’ve had to write about idiots sleeping at the wheels of their Teslas driving on Autopilot so many times now. If these are the ones getting reported, it’s very likely this happens far more often. That, I suppose, is why we should cover all of these: because this is stupid and dangerous, and will get people killed. Autopilot is not autonomous driving. So don’t fucking sleep at the wheel if it’s on, dummies. Our current dummy is a 20-year-old from British Columbia, who was sleeping and speeding.
The incident happened on July 19, when Royal Canadian Mounted Police got a call about a 2019 Tesla Model S speeding on the highway, with both front seats fully reclined, and both occupants sawing maple logs.
The speed limit on that stretch of Highway 2 about 60 miles south of Edmonton is 110 kph, or about 68 mph, and radar readings from the cops clocked the Tesla as doing 150 kph, which is over 93 mph.
According to RCMP Sgt. Darrin Turnbull, who spoke with CBC News, the Tesla was actively accelerating over the speed limit:
“We believe the vehicle was operating on the autopilot system, which is really just an advanced driver safety system, a driver assist program. You still need to be driving the vehicle,” Turnbull said.
“But of course, there are aftermarket things that can be done to a vehicle against the manufacturer’s recommendations to change or circumvent the safety system.”
After the responding officer activated their vehicle's emergency lights, the Tesla began to accelerate. Vehicles ahead of the Tesla moved out of the way, and the car sped up into the open lane, Turnbull said.
“Nobody appeared to be in the car, but the vehicle sped up because the line was clear in front.”
I do like how the Tesla accelerated after the cop put the flashing lights on, suggesting that Autopilot either does not recognize cop lights or is programmed to cheese it when it sees them.
The report doesn’t mention it, but I guess the driver finally woke up enough to pull the car over, since Autopilot didn’t appear interested in doing that itself.
Once again, it seems we need to remind Tesla owners that they do not own autonomous cars. Autopilot, despite all the confusion, is only a driver-assist tool, a Level 2 semi-autonomous system that requires the driver to be ready to take control with zero warning.
Tesla has safeguards, specifically steering-wheel torque sensors to detect hands on the wheel and warning messages on the car’s displays, but these can be defeated in a variety of ways, evidently including ones available even to someone asleep in a fully reclined seat.
That reclined seat is especially troubling: if the system needed the human to take over, the time it would take to sit up and get into driving position would be far too long. And the system can fail in any number of ways, even something as simple as dirt or an insect obscuring the view of a camera or sensor.
This is less a technological issue than a conceptual one: humans simply are not good at “vigilance tasks” like monitoring Autopilot, something that has been known for years.
I’ve reached out to Tesla for comment, but they never get back to any media requests, so don’t get your hopes up.