Research reveals weaknesses of face recognition systems with people of color and women

In theory, the impartiality of computer systems should eliminate any prejudice against groups of people. In practice, however, these systems are built by humans, whose biases are sometimes transferred to the machines.

That, at least, is what recent US government research on facial recognition systems reveals: a system's ability to match two images of the same person depends on the demographic characteristics of the subject. Specifically, face recognition systems are markedly worse at recognizing people of color and women than they are at recognizing Caucasian men.

Across the algorithms tested, the researchers observed higher rates of false positives for Asian and African-American faces than for Caucasian faces, with differences ranging from 10 to 100 times depending on the algorithm.
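To make the notion of a false positive concrete, here is a minimal sketch of the matching step most face recognition systems share: each face image is mapped to an embedding vector, and a pair is declared a "match" when the similarity of the two vectors exceeds a threshold. A false positive occurs when images of two different people clear that threshold. The random vectors and the 0.6 threshold below are purely illustrative assumptions, not the method or settings used in the study.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match when similarity exceeds the (illustrative) threshold.

    A false positive is when this returns True for two *different* people;
    the study found that this rate varies sharply across demographic groups.
    """
    return cosine_similarity(emb_a, emb_b) >= threshold

def false_positive_rate(impostor_pairs, threshold: float = 0.6) -> float:
    """Fraction of different-person ('impostor') pairs wrongly matched."""
    hits = sum(is_match(a, b, threshold) for a, b in impostor_pairs)
    return hits / len(impostor_pairs)

# Toy demonstration: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=128), rng.normal(size=128)) for _ in range(1000)]
print(f"False-positive rate at threshold 0.6: {false_positive_rate(pairs):.4f}")
```

Because the false-positive rate is measured per demographic group, a system can look accurate on average while failing badly for a particular group, which is exactly the kind of disparity the research documented.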

For example, Microsoft's face recognition technology produced ten times more errors for women of color than for men of color. In general, racial minorities are far more likely to be misidentified than white people.

Even government scientists now confirm that this surveillance technology is flawed and biased. One false match can lead to a missed flight, lengthy interrogations, watchlist placements, dangerous confrontations with police, wrongful arrests or worse. – Jay Stanley, senior policy analyst at the American Civil Liberties Union

Notably, Democratic presidential candidate Bernie Sanders has pledged to ban police use of facial recognition technology as part of his criminal justice reform platform.
