In May 2016, a man died when his Tesla Model S crashed while in “autopilot mode,” a semi-autonomous feature that includes automated steering to keep the car within its lane. The accident raised questions about who was at fault: the driver or the car?
Tesla maintained that “human error” caused the accident (a truck turned into the lane without warning) and said it had always made clear that autopilot mode should be engaged only in certain circumstances, such as highway driving with clear lane markers.
In September 2017, the National Transportation Safety Board found that both humans and machines were at fault. Tesla said it would evaluate the board’s findings while emphasizing that “autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”
Text by Victor Li.