Facial Recognition Software Is Biased Based on Gender and Race: Study


Facial recognition technology is no longer a gimmick. It is increasingly being used in everyday contexts and shows no signs of slowing down. Beyond unlocking smartphones, the technology has spread into surveillance and law enforcement; Chinese police officers, for example, are already using it to identify suspects.

However, according to new research out of MIT’s Media Lab, facial recognition technology appears to be subject to biases based on gender and race. Joy Buolamwini, a researcher at the MIT Media Lab, built a dataset of 1,270 faces and tested the accuracy of three facial analysis systems from Microsoft, IBM, and Megvii (a Chinese firm). The results showed significant inaccuracies in gender classification.

When the systems were shown photos of lighter-skinned males, they identified them with near-perfect accuracy. However, the systems misidentified up to 12 percent of darker-skinned males and up to 35 percent of darker-skinned females.
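The disparity described above is straightforward to measure once a classifier’s predictions are grouped by demographic subgroup: count misclassifications per group and divide by group size. A minimal sketch in Python (the records below are hypothetical illustrations, not the study’s actual data):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate per demographic subgroup.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its error rate (0.0-1.0).
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if prediction != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical predictions illustrating the kind of gap the study reports.
sample = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "female", "male"),    # misclassified
    ("darker_female", "female", "female"),
]
print(error_rates_by_group(sample))
# → {'lighter_male': 0.0, 'darker_female': 0.5}
```

Comparing these per-group rates, rather than one overall accuracy number, is what exposes the bias: a system can score well on aggregate accuracy while failing badly on an underrepresented subgroup.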

“Overall, male subjects were more accurately classified than female subjects replicating previous findings (Ngan et al., 2015), and lighter subjects were more accurately classified than darker individuals.” – Joy Buolamwini

In response, IBM said it had steadily improved its facial analysis software and was “deeply committed” to “unbiased” and “transparent” services. Microsoft said, “We have already taken steps to improve the accuracy of our facial recognition technology” and that it was investing in research “to recognize, understand and remove bias.” Megvii did not reply.

That said, this isn’t the first time facial recognition technology has been shown to be inaccurate. Back in 2015, Google was called out by a software engineer after its Photos app labeled his Black friend as “gorillas.”
