
Machine learning fails the racial bias test

A new study by Lauren Rhue, an Information Systems PhD student at NYU Stern School of Business, showed that emotion analysis technology assigns more negative emotions to black men’s faces than to white men’s faces.

Published last month on phys.org, the study aims to better understand the hidden bias in artificial intelligence software.

To do so, Rhue used a data set of 400 NBA player photos from the 2016–2017 season, chosen because the players are similar in their clothing, athleticism, age and gender, the researcher said. And because the analysed photos are professional portraits, the players all look at the camera in the same way.

Rhue ran the images through two well-known pieces of emotion recognition software. Both assigned black players more negative emotional scores on average, even when they smiled more widely than white players.

Consider, for example, the official NBA pictures of Darren Collison and Gordon Hayward. Both players are smiling, and, according to the facial recognition and analysis program Face++, they have similar smile scores – 48.7 and 48.1 out of 100, respectively.

Despite this, Face++ rates Hayward’s expression as 59.7 per cent happy and 0.13 per cent angry, and Collison’s expression as 39.2 per cent happy and 27 per cent angry. Collison is viewed as nearly as angry as he is happy and far angrier than Hayward – despite the facial recognition program itself recognising that both players are smiling.

On the other hand, Microsoft’s Face API viewed both men as happy, although Collison (93 per cent happy) is viewed as less happy than Hayward (98 per cent). Despite his smile, Collison is even scored with a small amount of contempt, whereas Hayward has none.
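To make the comparison concrete, here is a minimal Python sketch of how one might request smile and emotion scores for a single portrait from the two services. It follows the REST endpoints and response fields the services documented publicly around the time of the study, so treat the URLs, parameter names and response keys as assumptions; the API keys and image URL are placeholders.

import requests

IMAGE_URL = "https://example.com/player_portrait.jpg"  # placeholder portrait URL

def facepp_scores(image_url):
    # Face++ "detect" endpoint; scores come back on a 0-100 scale.
    resp = requests.post(
        "https://api-us.faceplusplus.com/facepp/v3/detect",
        data={
            "api_key": "YOUR_FACEPP_KEY",        # placeholder credential
            "api_secret": "YOUR_FACEPP_SECRET",  # placeholder credential
            "image_url": image_url,
            "return_attributes": "smiling,emotion",
        },
    )
    resp.raise_for_status()
    attrs = resp.json()["faces"][0]["attributes"]
    return {"smile": attrs["smile"]["value"], **attrs["emotion"]}

def microsoft_scores(image_url):
    # Microsoft Face API "detect" endpoint; scores come back on a 0-1 scale.
    resp = requests.post(
        "https://westus.api.cognitive.microsoft.com/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},  # placeholder
        json={"url": image_url},
    )
    resp.raise_for_status()
    return resp.json()[0]["faceAttributes"]["emotion"]

print(facepp_scores(IMAGE_URL))     # e.g. smile, happiness, anger out of 100
print(microsoft_scores(IMAGE_URL))  # e.g. happiness, contempt between 0 and 1

Note the different scales: Face++ reports percentages out of 100, while Microsoft’s Face API reports values between 0 and 1, which is why the article quotes both as "per cent" figures.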

Rhue noticed that the same pattern emerged across all the NBA pictures. On average, Face++ rated black faces as twice as angry as white faces.
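Checking for such a pattern is a simple group comparison once per-player scores are collected. The pandas sketch below illustrates the idea; the column names are assumptions for illustration, and the two example rows reuse the Collison and Hayward figures quoted above rather than the study’s actual data set.

import pandas as pd

# One row per player; in practice these would be all 400 portraits
# scored with a function like facepp_scores() above.
scores = pd.DataFrame([
    {"player": "Darren Collison", "race": "black", "anger": 27.0},
    {"player": "Gordon Hayward",  "race": "white", "anger": 0.13},
])

# Average anger score per group; across the full data set, the study
# reports Face++ rating black faces roughly twice as angry as white faces.
print(scores.groupby("race")["anger"].mean())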

“This observation aligns with other research,” Rhue said, “which suggests that black professionals must amplify positive emotions to receive parity in their workplace performance evaluations.” Studies also show that people perceive black men as more physically threatening than white men, even when they are the same size.

“Some researchers argue that facial recognition technology is more objective than humans,” she added, “but my study suggests that facial recognition reflects the same biases that people have.

“Until facial recognition assesses black and white faces similarly, black people may need to exaggerate their positive facial expressions – essentially smile more – to reduce ambiguity and potentially negative interpretations by the technology.”

“Although innovative, artificial intelligence can perpetuate and exacerbate existing power dynamics, leading to disparate impact across racial/ethnic groups,” Rhue said. “Some societal accountability is necessary to ensure fairness to all groups because facial recognition, like most artificial intelligence, is often invisible to the people most affected by its decisions.”

BY SHACK15

Co-working space and blog dedicated to all things data science.
