Alleged Whistleblower Shares Raw Video Of Self-Driving Semi Truck Crashing Into Median 🤯

An internal document, viewed by the WSJ, states the truck abruptly veered left because a person in the cab didn’t properly reboot the autonomous driving system before engaging it. That caused the AI to execute a left-turn command that was 2.5 minutes old. If the truck was traveling at 65 mph, that command was meant for a spot nearly three miles down the road… which isn’t good. The command should have been erased from the system, but it wasn’t.
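For context, that figure is just back-of-the-envelope arithmetic: 65 mph works out to about 1.08 miles per minute, so over the 2.5 minutes the command sat in the system the truck would cover roughly 2.7 miles, hence “nearly three miles.”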


On its website, TuSimple acknowledged the investigation and said it takes responsibility for finding and resolving safety issues.

Researchers at Carnegie Mellon University dispute that it was all human error. They say common safeguards – like making sure the system can’t respond to commands more than a couple hundredths of a second old, or making it impossible to engage an improperly functioning self-driving system – would have prevented the crash. They also suggest the system shouldn’t permit an autonomously driven truck to make such a sharp turn while traveling at 65 mph.
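To make those safeguards concrete, here’s a minimal sketch of what such a command gate might look like. This is illustrative Python only, not TuSimple’s or CMU’s actual code, and every name and threshold in it is a hypothetical stand-in:

```python
import time

# Hypothetical thresholds, chosen to mirror the safeguards the CMU researchers describe.
MAX_COMMAND_AGE_S = 0.02       # reject commands more than a couple hundredths of a second old
MAX_STEER_DEG_AT_SPEED = 5.0   # cap on steering angle at highway speed
HIGHWAY_SPEED_MPH = 55.0


class CommandGate:
    """Illustrative gate sitting between the planner and the steering actuator."""

    def __init__(self, system_healthy: bool):
        self.system_healthy = system_healthy

    def engage(self) -> bool:
        # Safeguard: refuse to engage unless the self-driving stack is fully functional.
        if not self.system_healthy:
            print("engage refused: system not fully functional")
            return False
        return True

    def accept(self, steer_deg: float, issued_at: float, speed_mph: float) -> bool:
        age_s = time.monotonic() - issued_at
        # Safeguard: drop stale commands instead of executing them minutes later.
        if age_s > MAX_COMMAND_AGE_S:
            print(f"dropped stale command ({age_s:.3f}s old)")
            return False
        # Safeguard: disallow sharp steering inputs at highway speed.
        if speed_mph >= HIGHWAY_SPEED_MPH and abs(steer_deg) > MAX_STEER_DEG_AT_SPEED:
            print(f"rejected {steer_deg:.1f} degree turn at {speed_mph:.0f} mph")
            return False
        return True


if __name__ == "__main__":
    gate = CommandGate(system_healthy=True)
    if gate.engage():
        # A command issued 2.5 minutes (150 seconds) ago fails the staleness check.
        stale_command_time = time.monotonic() - 150.0
        gate.accept(steer_deg=30.0, issued_at=stale_command_time, speed_mph=65.0)
```

Under a scheme like this, the 2.5-minute-old left-turn command would miss the staleness window by roughly four orders of magnitude, and a truck whose system hadn’t been properly rebooted would never engage in the first place.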

“This information shows that the testing they are doing on public roads is highly unsafe,” said Phil Koopman, an associate professor at Carnegie Mellon who has contributed to international safety standards for autonomous vehicles, referring to the company’s disclosures.

TuSimple said that after the accident, it modified its autonomous-driving system so that a human can’t engage it unless the computer system is fully functional. A former TuSimple engineer said the move was long overdue. A TuSimple spokesman said in response that the April crash was the only incident in which a company truck was at fault.


While this crash involved a truck with two people on board, TuSimple is also testing driverless “Ghost Rider” trucks on public roads. That started back in December of 2021, and it was only supposed to happen after 500 practice runs, but the company reportedly completed fewer than half that number before the December drive.

This accident follows years of management pushing back against what some former employees describe as serious safety and security issues.

In late 2021, a group of employees raised some of these issues with the legal department, according to people familiar with the matter. A presentation cited the company’s alleged failure to regularly check its software for vulnerabilities, as well as its use of unencrypted communications to manage trucks, which could give hackers an opening to intercept data traveling between engineers and the vehicles’ systems, the people said.

Safety drivers, meanwhile, have flagged failures in a mechanism that is supposed to let them shut off the self-driving system by turning the steering wheel, a standard safety feature that didn’t always work, other people familiar with the matter said. Company management dismissed the safety drivers’ concerns, the people said.


A spokesperson for TuSimple says the company “actively solicits and reviews flags, concerns and risks our employees identify so they can be addressed.”

TuSimple has been a leader in autonomous truck development since it launched in 2015. It’s backed by UPS, U.S. Xpress and Volkswagen.