Robot Rules
Robots may now be confined to sweeping living rooms and working assembly lines, but futurists and attorneys agree they are destined to take on a much greater role soon. Bill Gates has compared the development of robots to the earliest personal computers in the 1970s.
However, unlike that stationary pile of microchips on your desktop, robots have the potential to act in the real world. Attorneys and legal scholars are now puzzling over how harmful actions of robots will be assigned liability, and particularly how robotic maneuvers will fit into traditional legal concepts of responsibility and agency.
“Robots present a unique situation,” says Ryan Calo, a fellow at the Stanford Law School’s Center for Internet and Society. “Like the computer, it runs on software, but it can touch you. It doesn’t have a particular purpose like a lawn mower or a toaster; it’s more like a platform that you can program to do all kinds of different things. And it can act on the world, so it has legal repercussions. It might be very difficult to ascertain where the liability lies when a robot causes some physical harm.”
One possible avenue would be to view the robot as an agent of its owner, who would be presumed liable for the robot’s actions, but Calo says it’s not so simple. He has blogged about robotics and the law, and led a panel on the subject last year.
“Let’s say you rent a robot from the hospital to take care of Granny, and the neighborhood kids hack into the robot and it menaces Granny and she falls down the stairs. Who’s liable?” he asks. Possibilities include the hospital (which released it), the manufacturer (it’s easy to hack into), the neighborhood kids, or the consumer who failed to do something easy like update the software.
LEVELS OF EXPOSURE
Damages are another puzzler. For defective software, society applies the economic-loss rule: a consumer could sue Microsoft for the value of the software, but not for any subsequent harm the defective software may have caused.
“Society tolerates Microsoft Word eating your thesis, but it won’t tolerate a robot running into somebody,” Calo says. “If you look at cases where computers have caused physical injury, then you could recover—for example, if the computer gave a cancer patient too much radiation.”
Calo favors limited immunity for the robot industry, similar to section 230 of the federal Communications Decency Act, which gives “interactive computer services” immunity for information put on their sites.
“There is a real danger if there is complete legal uncertainty and you have a couple of bad incidents involving sympathetic plaintiffs,” Calo says. “That could put a chill on robot development.”
Attempts to get comment for this article from manufacturers of today’s robots were unsuccessful.
Stephen S. Wu, a partner at Cooke Kobrick & Wu in Los Altos, Calif., and incoming chair of the ABA’s Section of Science & Technology Law, says society is prepared to deal with robots in the next five to 10 years. But product liability cases involving robots could ramp up with the anticipated increase in automated machines, Wu says, citing a study predicting that half of all highway miles in 2030 will be driven robotically.
“In the longer run, though, you have to think about robots having intelligence that could surpass our own,” Wu says. “And then we could say if the robots are more intelligent than we are, maybe they should have intellectual property rights.”
Dan Siciliano, faculty director of the corporate governance center at Stanford University, says robot law will most likely develop as an extension of traditional legal concepts. He expects plaintiffs lawyers will “name everybody” in any putative suit over damage caused by robots. “You will see liability work its way up the chain,” he says.
And don’t be surprised to see some green lizard on your TV one day touting a brand-new form of society’s most traditional risk-spreading tool: robot insurance.