If A Self-Driving Car Causes A Catastrophic Accident, Who’s Responsible?

Posted in: Car Insurance, News.

Self-driving cars promise to be better drivers than any human could ever be. But that doesn’t mean they won’t make mistakes, or encounter situations they weren’t prepared for. And when that happens, we need to be able to determine who’s culpable for the damage they cause.

On March 18, 2018, a woman in Tempe, Arizona, was struck and killed by a self-driving car. It marked the first time a pedestrian had been killed by an autonomous vehicle, and it served as a sobering reminder for many.

A reminder that, for all the advances we’ve made in autonomous driving, and for all the ways advanced sensor technology and artificial intelligence can help make the roads safer, driverless cars are still fallible. They still make mistakes. They still encounter situations they cannot adequately respond to.

They can, if not properly managed, still result in a tragic death.

Safety concerns aside, there are also the legal implications of self-driving cars to consider. Namely, who can be held culpable when a self-driving car is involved in a catastrophic accident? Who should be held responsible?

Even in an accident involving human drivers, fault and responsibility are not easily determined. Bringing AI into the mix only further complicates matters. Was it a software issue that caused the accident, or did a human driver do something unsafe?

In the case of the woman killed by the self-driving Uber, she was crossing the road outside a crosswalk – clearly unsafe behavior. Yet had an attentive human driver been behind the wheel, they very likely would have seen her and stopped in time.

Fortunately, the answer to this question can be found within autonomous vehicles themselves. They are, in essence, mobile computers – data-rich platforms filled with myriad sensors and monitoring systems. This information can be retrieved at the scene of an accident to help reconstruct the sequence of events leading up to the incident.

Yet that’s assuming the party that controls the vehicle and its sensors allows that information to be collected without tampering. That’s assuming the data remains intact, or that the vehicle stores that data at all. That’s assuming that, even with the sensor data, it’s clear what caused the accident.

Beyond those caveats, the current rules are fairly clear-cut: if the technology on a vehicle fails, it’s the vehicle’s owner and operator who is held liable to anyone they injure. Yet what if the driver had no control over the car – what if the vehicle was completely autonomous?

That’s where it gets more complicated. There is currently no legal framework for fully autonomous cars, and without a court case to test our existing laws and regulations, it’s not likely one will emerge anytime soon.

Uber settled out of court with the family of the woman their self-driving car killed. Most other accidents involving autonomous vehicles, similarly, have not gone to court. No one, after all, wants to be the one to set a negative precedent for the entire industry.

And until someone does – until we have a clear legal precedent – the question of liability in an accident involving an autonomous car will remain exactly that: an open question.

Ryan B. Bormaster is the managing attorney at Bormaster Law. The law firm practices in a number of areas but specializes in 18 Wheeler Accidents, Accidents with Commercial Vehicles such as Work Trucks, and Catastrophic Injuries of all kinds.