It had to happen eventually. A self-driving car struck and killed a pedestrian in Arizona, and already car companies, driverless-software companies, and politicians are pointing fingers and distancing themselves from whatever went wrong.
And the issue hasn't even made it to court yet. When it does, the blame game will get even more intense - and more confusing.
What Happened in the Arizona Uber Crash?
The investigation is in the early stages, but this is what we know so far:
- A woman was walking her bicycle across the road - not at a crosswalk - when a self-driving car operated by Uber struck and killed her.
- The vehicle never slowed or changed its course to avoid the pedestrian.
- Dash-cam video from the car shows that the "safety driver" was not looking at the road in the seconds leading up to the accident.
- The operator also did not have his hands hovering above the steering wheel, which backup drivers are instructed to do so they can quickly take control of the vehicle in case of an emergency.
This shouldn't be a surprise to anyone - as I've written before, the software Uber uses is not foolproof, and the company has already identified problems with its cars recognizing and avoiding bicycles on the road.
Arizona's Governor Stops the Self-Driving Car Program
Uber quickly announced it would voluntarily "pause" its self-driving vehicle program, but Arizona Governor Doug Ducey still announced that the state will no longer allow the company to use autonomous vehicles. He said the accident was an "unquestionable failure" to meet Arizona's commitment to safety.
When Uber started the self-driving tests in Arizona, Ducey was an enthusiastic supporter. In fact, he used an executive order to invite Uber to test its vehicles in Arizona - without enacting any regulations on the technology. The governor's office posted a video of Ducey taking part in the first test-drive in the state.
Rival makers of driverless technology, including Intel and Waymo, have said their systems would have avoided the deadly accident, implying that the software itself - or the designer - is to blame.
How Will Courts Decide Who Is Liable for Autonomous Car Crashes?
So who is at fault when a self-driving car hits someone?
It's possible the backup driver could share some liability if it turns out he did, in fact, disregard safety procedures by not watching the road and not keeping his hands near the steering wheel.
But surely the "backup driver" can't be solely responsible when a "self-driving" car hits and kills someone. So who is to blame? Is it the vehicle manufacturer? The designer of the driverless software? These are questions that can only be answered over time as these kinds of cases make their way into court.
The Arizona case has another interesting complication - the woman who was killed was crossing the road without using a crosswalk. Does that mean it was her fault and neither the safety driver nor the car has any liability?
If a similar accident happened here in South Carolina, our comparative negligence system would apply: if a jury decided she was more than 50 percent responsible, her family could not collect damages.
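To make the math behind that rule concrete, here is a minimal sketch of how a modified comparative negligence system with a greater-than-50-percent bar works in principle: if the injured person is more than half at fault, recovery is barred entirely; otherwise the award is reduced in proportion to their share of fault. The function name and dollar figures are hypothetical illustrations, not legal advice, and real verdicts involve many factors a formula can't capture.

```python
def recoverable_damages(total_damages: float, plaintiff_fault_pct: float) -> float:
    """Illustrative model of a modified comparative negligence rule.

    If the plaintiff is found more than 50 percent at fault, they
    recover nothing. Otherwise, their award is reduced in proportion
    to their own share of fault. (Hypothetical example only.)
    """
    if plaintiff_fault_pct > 50:
        return 0.0
    return total_damages * (100 - plaintiff_fault_pct) / 100

# Suppose a jury finds $100,000 in total damages:
print(recoverable_damages(100_000, 30))  # 30% at fault -> 70000.0
print(recoverable_damages(100_000, 51))  # over the 50% bar -> 0.0
```

Note the sharp cliff at the bar: a plaintiff found 50 percent at fault still recovers half the damages, while one found 51 percent at fault recovers nothing - which is why the jury's allocation of fault matters so much in cases like this one.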
At some point, the courts will have to figure out how to determine liability for self-driving vehicle accidents. But maybe not yet. If there is a lawsuit filed over this accident - and it's safe to assume there will be - Uber and anyone else named in the suit could settle the case before trial. Then we'll have to wait a while longer to figure out who to blame when self-driving cars kill people.
South Carolina hasn't had any self-driving car incidents yet, but they are coming. Although Arizona has temporarily stopped testing on its roads, each autonomous car company is determined to be the leader of the pack as it breaks into a new and profitable market.
If you've been involved in a car accident in SC, your Myrtle Beach personal injury lawyer on the Axelrod team can help you to recover maximum compensation from the at-fault party. Call today at (843) 916-9300 or fill out our contact form to set up a free initial consultation about your case.