Manufacturers To Blame For Self-Driving Car Crashes, Say UK Legal Experts

Legal experts in the UK published a new report outlining who is at fault when a self-driving car crashes.

Who is to blame when a self-driving car crashes?
Photo: Leon Neal / Staff (Getty Images)

If many automotive experts are to be believed, we’re on the cusp of a self-driving revolution. That means that as automakers finalize their autonomous-driving technology, lawmakers must write the rules that govern self-driving cars, including deciding who is at fault when one crashes.

Since 2018, researchers at the Law Commission in the UK have been doing just this as they try to outline the possible rules for self-driving cars. And today, the organization has published its findings.

The headline takeaway from the report is that when self-driving tech is engaged, “the person in the driving seat would no longer be responsible for how the car drives.”


Instead, the commission says that responsibility for any incident would rest squarely with the manufacturer of the technology. This means the “user-in-charge” of an autonomous vehicle would have immunity from a wide range of offenses, such as dangerous driving, speeding or running a red light.

Under the proposal, the Authorized Self-Driving Entity (ASDE) would hold all responsibility for any autonomous car on the road. If a self-driving vehicle operates in a criminal or unsafe manner, the company that made it or authorized its use on the road would be liable for regulatory sanctions.


Under the proposed framework, the driver of a self-driving car would be demoted to being a “user-in-charge” whenever self-driving features are engaged. They would only be responsible for “other driver duties,” such as carrying insurance, checking loads or ensuring that children wear seat belts.

Finally, in vehicles like autonomous buses or taxis, which may be authorized to drive themselves without anyone in the driver’s seat, any occupant would legally be considered a passenger. In this case, a licensed operator would be responsible for overseeing the journey.

Tesla currently markets its Level 2 driver-assist system as “Full Self-Driving.”
Photo: Tesla

And I guess this makes sense. When it comes to self-driving tech, manufacturers want you to put as much faith in their machine as you would in a taxi driver, since the technology is advertised as driverless. After all, you, the passenger, wouldn’t be to blame if your Uber driver ran a red light.


But the commission hasn’t just set out to pin the blame for any incidents on someone. Oh no, it also called for clarity on the definitions of autonomous driving tech.

According to the Law Commission:

“The report recommends introducing a new Automated Vehicles Act, to regulate vehicles that can drive themselves. It recommends drawing a clear distinction between features which just assist drivers, such as adaptive cruise control, and those that are self-driving.”


This is good news, as there remains a lot of confusion about what counts as a self-driving car and what is merely a high-tech driver assist. And if you want to understand the distinction better, Jason Torchinsky went into great detail about the levels of autonomous driving here.

So to avoid this confusion for anyone marketing a self-driving car or anyone considering buying one, the Law Commission has called for “safeguards to stop driver assistance features from being marketed as self-driving.”


It suggests that this would help to minimize the risk of collisions caused by members of the public “thinking that they do not need to pay attention to the road while a driver assistance feature is in operation.”

The report has now been presented to the governments in England, Scotland and Wales. Each will decide whether to accept the commission’s recommendations and introduce legislation to bring them into effect.