Who gets the blame when driverless cars crash? Everybody.
You’re unlikely to be able to buy your own completely automated driverless car in the next few years, ABA members learned Friday afternoon. But participants in “Driverless Cars in the Fast Lane: Liability Ahead,” a program organized by the ABA’s Section of Science and Technology Law, did learn how autonomous vehicles could change the technological, safety, legal and insurance landscapes.
Sarah Thornton, a graduate student in mechanical engineering at CARS, the Center for Automotive Research at Stanford, explained some of the basics of how driverless cars work. If level zero of automation is a conventional car and level one is a car with cruise control, she said, level three might be partial automation of the kind now available through the “autopilot” feature of the Tesla Model S, in which the driver retains some responsibilities. Level four, full automation, might not be commercially available for a couple of generations, she said. Despite recent advances, plenty of technical problems remain to be solved.
Another technical discussion came from Thomas Lue, senior counsel in Google’s Advanced Technology and Projects group. For Google, he said, the primary motivation for developing autonomous cars is safety: Federal statistics show that about 35,000 Americans are killed in car crashes every year, and human error is a cause in almost every case. Google’s self-driving cars have reached Thornton’s level four in some situations, he said, but they need more testing, particularly on local roads full of traffic signals, obstacles and other hazards. Google hopes to start that testing in four cities soon.
Law professor Bryant Walker Smith of the University of South Carolina discussed some of the legal aspects of getting driverless cars on the road. There are no major legal barriers to the use of the cars, he said, but there are minor issues in state law. New York, for example, requires drivers to keep one hand on the wheel at all times. What happens when manufacturers introduce autonomous vehicles that don’t even have steering wheels?
Smith suggested that the few states that have already passed autonomous-vehicle laws are jumping the gun. The federal National Highway Traffic Safety Administration has signaled its interest by suggesting that “drivers” need not be human for regulatory purposes; any guidance it issues could help states come up with robust laws.
Stephen Wu of the Silicon Valley Law Group in San Jose, California, both moderated the program and discussed product liability. In his introduction, Wu mentioned a fatal crash earlier this year in which the driver was using the autopilot feature of Tesla’s Model S. Who’s responsible for that crash? In a conventional car crash, it could be a driver, an automaker, a parts manufacturer or perhaps a local government that didn’t maintain the road. In the crash of an autonomous vehicle, he said, you can add to that list software developers, hackers, security providers, data storage providers and more. There’s also a danger that data collected by a “smart car” could be shared inappropriately, compromising privacy or safety.
Wu, a former chair of the Section of Science and Technology Law, also raised a question of ethics: Should a driverless car be programmed to save as many people as possible, or to save its passengers? That question is a consideration for Thornton’s lab as well. And cars may one day “talk” to each other to avoid multicar crashes, Wu suggested.
Finally, Laura Ruettgers, special counsel to the law firm of Severson & Werson in San Francisco, discussed the insurance implications of autonomous vehicles. Massive safety strides could put insurers out of business, she said, or could shift the insurance burden from end users to manufacturers.
Ruettgers analyzed the Tesla crash, which took place in Florida last May. The autopilot feature relies on third-party technology, and that supplier or Tesla itself could be blamed, she said. But witnesses suggested that the driver may have been watching a movie when he should have been monitoring the road, so Tesla could point to driver error. As long as drivers still share some responsibility, she said, this kind of finger-pointing could be common.
“Is this the end of insurance as we know it? Not yet,” she said. “Your insurance premiums might come down, but nobody is out of business.”
Follow along with our full coverage of the 2016 ABA Annual Meeting.