Facial-Recognition Software Might Have a Racial Bias Problem

Police in Los Angeles are using facial-recognition software to identify suspects. The software is more likely to misidentify or fail to identify African Americans than people of other races, which could result in innocent citizens being marked as suspects in crimes. Yet as law enforcement agencies across the U.S. roll out the technology, little is being done to examine or correct this bias.

Sheriff’s departments across Florida and Southern California have been outfitted with smartphone- or tablet-based facial-recognition systems. With the click of a button, many police departments can identify a suspect caught on camera committing a crime. Yet the facial-recognition algorithms used by police are not required to undergo public or independent testing for accuracy or bias before being deployed on everyday citizens. More worrying still, the limited testing that has been done on these systems has uncovered a pattern of racial bias.

In 2010, NIST observed that facial-recognition accuracy had improved tenfold between each round of its testing. But research suggests those gains are not distributed equally: many algorithms display troubling differences in accuracy across race, gender, and other demographics. A 2012 study that used a collection of mug shots from Pinellas County, Florida, to test the algorithms of three commercial vendors also found evidence of racial bias.
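To make "differences in accuracy across demographics" concrete, here is a minimal sketch, in Python, of how per-group error rates can be tallied in an evaluation like NIST's. The group labels and trial outcomes below are invented purely for illustration; they are not drawn from the article or from any real benchmark.

from collections import defaultdict

# Each trial: (demographic group, ground truth "same person?", algorithm's answer)
trials = [
    ("group_a", True, True), ("group_a", True, True), ("group_a", True, False),
    ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, True), ("group_b", True, False), ("group_b", True, False),
    ("group_b", False, True), ("group_b", False, False),
]

stats = defaultdict(lambda: {"misses": 0, "genuine": 0, "false_alarms": 0, "impostor": 0})

for group, same_person, predicted_same in trials:
    s = stats[group]
    if same_person:
        s["genuine"] += 1
        if not predicted_same:
            s["misses"] += 1        # failed to identify a genuine match
    else:
        s["impostor"] += 1
        if predicted_same:
            s["false_alarms"] += 1  # flagged two different people as the same person

for group, s in sorted(stats.items()):
    fnmr = s["misses"] / s["genuine"]        # false non-match rate (missed identifications)
    fmr = s["false_alarms"] / s["impostor"]  # false match rate (misidentifications)
    print(f"{group}: FNMR={fnmr:.2f}  FMR={fmr:.2f}")

In this toy data, group_b suffers both more missed identifications and more false matches than group_a, which is the shape of the disparity the studies describe.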
