Does facial recognition technology incorporate racial bias?
Police departments across the country are using facial recognition software to match images captured by their surveillance cameras with photos of fugitives or mug shot databases. But are police failing to account for the possibility of mismatches and failed matches that could affect blacks more than whites?
Police in Chicago, Los Angeles, Dallas and West Virginia are acquiring or considering more advanced camera surveillance systems, some of which could capture facial images and make identifications, the Atlantic reports. Sheriff’s departments in Florida and Southern California, meanwhile, use smartphone or tablet facial recognition systems to check images of drivers and others against mug shot databases. Florida and several other states put every driver’s license photo into the system.
“With the click of a button,” the Atlantic reports, “many police departments can identify a suspect caught committing a crime on camera, verify the identity of a driver who does not produce a license, or search a state driver’s license database for suspected fugitives.”
But recent research highlights a potential problem. The algorithms used in facial recognition systems may perform worse at identifying African American faces than white faces.
One 2012 study of three facial recognition systems found that the algorithms used performed 5 percent to 10 percent worse when identifying blacks than whites. Another study, conducted in 2011, found that algorithms developed in China, Japan and South Korea had more success identifying Asian faces than Caucasian faces, while algorithms developed in France, Germany and the United States did a better job identifying Caucasians.
Errors in the system could mark innocent citizens as crime suspects. “And though this technology is being rolled out by law enforcement across the country, little is being done to explore—or correct—for the bias,” the article says.
The article is written by Clare Garvie, a law fellow at the Center on Privacy & Technology at Georgetown Law, and by Jonathan Frankle, a staff technologist at the center.
Hat tip to the Marshall Project.