LET'S HAVE A VOTE AND HEAR YOUR SAY: Who'll Be Responsible When a Self-Driving Car Crashes?
Fully self-driving cars aren't quite here yet, but they're coming.
Gov. Jay Inslee signed an executive order earlier this month welcoming companies to use Washington roads as a testing ground for driverless cars.
With the vast majority of collisions caused by human error, self-driving cars carry the potential to dramatically decrease crashes on the road. But, especially while self-driving cars share the road with standard ones, it's unlikely that collisions will disappear entirely.
An Uber self-driving car crashed in Arizona in March, and a Tesla in autopilot mode crashed in Florida last year, killing its driver.
Regardless of who or what was at fault in those instances, self-driving cars will still have to interact with drivers, bicyclists and pedestrians. That raises the question for this week's Traffic Lab Q&A: Who's responsible if a self-driving car is involved in, or causes, a crash?
"If an accident occurs and it's deemed the error was on the part of the self-driving auto, who gets sued?" asked John McLain, of Whidbey Island. "Do you seek redress from the vehicle's owner or from the manufacturer? If two self-driving vehicles collide, how would insurance companies sort out who pays for the property damage?"
None of the answers to those questions are settled, but scholars and experts in the field tend to share two general assumptions: Widespread use of self-driving cars will mean fewer collisions, and there will be some shifting of liability from drivers to manufacturers.
"In comparison to the automotive industry today, the automated driving industry will likely bear a bigger slice of a smaller pie of total crash costs," writes Bryant Walker Smith, a law professor at the University of Southern California and an expert on the law of self-driving cars.
States are generally responsible for establishing liability and insurance rules and, indeed, the federal government's guidance on self-driving cars, released last year, put the onus on the states.
"States should consider how to allocate liability among (autonomous vehicle) owners, operators, passengers, manufacturers and others when a crash occurs," the guidance from the U.S. Department of Transportation says.
In Washington state, as of now, there's no clear answer. And there really isn't much of an answer anywhere else either.
Inslee's executive order only requires companies testing self-driving cars to carry car insurance, as any driver has to do under state law.
Kara Klotz, a spokeswoman for state Insurance Commissioner Mike Kreidler, said the office is monitoring the issue but doesn't currently have regulations specific to self-driving cars.
"It's a whole new way of thinking about auto insurance, and we let the insurance industry kind of innovate and come up with products, and then we'll review those," Klotz said.
And the insurance industry's response, generally, is: We're working on it.
Michigan late last year became the first state to pass legislation addressing self-driving cars and insurance. That law says that if an autonomous vehicle's operating system is at fault for a collision, then the manufacturer is responsible.
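Purely as an illustration, a Michigan-style rule amounts to a single conditional: if the automated system was driving and its error caused the crash, liability points to the manufacturer; otherwise ordinary driver liability applies. The short Python sketch below shows that shape. The field and function names are hypothetical assumptions, not the statute's or any insurer's actual terms.

# Hypothetical sketch of a Michigan-style liability allocation rule.
# All names and fields are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class CrashClaim:
    automated_system_engaged: bool   # was the self-driving system in control?
    automated_system_at_fault: bool  # did investigators attribute the error to it?

def liable_party(claim: CrashClaim) -> str:
    if claim.automated_system_engaged and claim.automated_system_at_fault:
        return "manufacturer"    # operating-system error: manufacturer responsible
    return "driver/owner"        # otherwise, ordinary driver liability applies

print(liable_party(CrashClaim(True, True)))    # manufacturer
print(liable_party(CrashClaim(False, False)))  # driver/owner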
Right now, in the very early stages of a possible wave of self-driving cars, the insurance industry is most focused on getting access to the information it will need to determine who or what is at fault in a collision, said Bob Passmore, an assistant vice president with the Property Casualty Insurers Association of America.
Currently, if two cars collide, either the drivers work it out, or police or insurers talk to the drivers and examine the evidence to make a determination of who's at fault.
As cars become more automated, that's going to shift, with drivers becoming less able to say what caused a collision.
"It's not just what you did or didn't do, it's what the car did or did not do," Passmore said. "The only way to talk to a vehicle that drives itself is to get the data and the information that the vehicle used to make its decisions."
The insurance industry wants access to that data. Carmakers want to make sure their proprietary self-driving systems - oftentimes top secret - won't be disclosed. And drivers have a justifiable concern about privacy.
There's already a broad body of product-liability law that has been applied in famous automotive cases - the Ford Pinto's fuel system, Firestone tires and Takata air bags, to name a few.
The challenge, experts say, will be to take existing law and use it to determine whether a self-driving system is at fault.
That could be done on a case-by-case basis - was the self-driving car speeding? Did it run a red light? - or it could require changes to the law.
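As an illustration of that case-by-case approach, the questions above could in principle be put to a vehicle's recorded data. The Python sketch below uses entirely hypothetical event-record fields; real self-driving logs are proprietary and far more detailed.

# Hypothetical sketch: checking a crash event record against familiar traffic rules.
# The record fields are assumptions for illustration, not a real data format.
from dataclasses import dataclass

@dataclass
class EventRecord:
    speed_mph: float
    speed_limit_mph: float
    signal_state: str          # "red", "yellow" or "green" when the car entered
    entered_intersection: bool

def possible_vehicle_faults(rec: EventRecord) -> list:
    faults = []
    if rec.speed_mph > rec.speed_limit_mph:
        faults.append("speeding")
    if rec.entered_intersection and rec.signal_state == "red":
        faults.append("ran a red light")
    return faults

rec = EventRecord(speed_mph=41.0, speed_limit_mph=35.0,
                  signal_state="red", entered_intersection=True)
print(possible_vehicle_faults(rec))  # ['speeding', 'ran a red light']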
"We're taking a sort of existing body of law and we're applying that body of law to new facts," said Jim Whittle, an associate general counsel for the American Insurance Association. "Will it fit? Will it need to be adjusted? Do we want to have a more efficient system for resolving these matters? Those are the sort of public-policy questions that may need to evolve."
No one knows exactly when those questions are going to need to be answered. Partially autonomous vehicles are already here. Fully autonomous ones - with no driver input necessary - are likely only a few years away. But it could be decades before they dominate the roadways.
"A year from now, my guess is this is going to be the number 1 issue in the industry," said Kenton Brine, president of the NW Insurance Council. "The technology is already ahead of the organization of the system and how we're insuring it."
LET'S VOTE ON THIS AND HAVE YOUR SAY. Thanks!