
New AI tool can detect race and gender bias

Researchers at Penn State and Columbia University have developed a new artificial intelligence (AI) tool able to detect discrimination based on race or gender.

The researchers trained the algorithm using various types of data. With regard to gender discrimination, they used data from the US Census Bureau to determine whether men were receiving higher salaries than women.

“We found evidence of gender-based discrimination in salary,” said Vasant Honavar, Professor at Pennsylvania State University.

“Specifically, we found that the odds of a woman having a salary greater than $50,000 per year is only one-third that for a man. This would suggest that employers should look for and correct, when appropriate, gender bias in salaries,” he explained.
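For context, the figure Honavar quotes is an odds ratio: it compares the odds (not the raw probability) of earning above the threshold across the two groups. The snippet below is a minimal, purely illustrative sketch of that arithmetic on made-up records; it is not the researchers' tool, and the column names and values are hypothetical.

```python
import pandas as pd

# Purely illustrative sketch (not the researchers' tool): how an odds ratio
# like the "one-third" figure above can be computed from census-style records.
# The column names and values here are made up.
records = pd.DataFrame({
    "gender": ["F", "F", "F", "F", "M", "M", "M", "M"],
    "earns_over_50k": [1, 0, 0, 0, 1, 1, 0, 0],
})

def odds_of_high_salary(group: pd.DataFrame) -> float:
    p = group["earns_over_50k"].mean()  # P(salary > $50k) within the group
    return p / (1 - p)                  # odds = p / (1 - p)

odds_women = odds_of_high_salary(records[records["gender"] == "F"])
odds_men = odds_of_high_salary(records[records["gender"] == "M"])

# With these toy numbers the ratio comes out to roughly 0.33, i.e. the kind
# of disparity Honavar describes.
print(f"Odds ratio (women vs. men): {odds_women / odds_men:.2f}")
```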

The program was also trained to spot race-related bias. To do so, the researchers used data from the New York City Police Department’s stop-and-frisk programme to determine whether people of colour were discriminated against in arrests made after stops.
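A similarly simplified check on stop-and-frisk-style counts can be sketched with a 2x2 contingency table. Again, this is only an illustration with hypothetical numbers, not the method used in the study, which accounts for other attributes before attributing a disparity to discrimination.

```python
from scipy.stats import fisher_exact

# Hypothetical counts, for illustration only: stops that did or did not end
# in an arrest, broken down by group. These are not NYPD figures.
#            arrested  not arrested
table = [[30, 970],   # group A
         [15, 985]]   # group B

# Fisher's exact test returns the sample odds ratio and a p-value for the
# null hypothesis that arrest rates after a stop are the same in both groups.
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p-value = {p_value:.4f}")
```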

“You cannot correct for a problem if you don’t know that the problem exists,” Honavar said. “To avoid discrimination on the basis of race, gender or other attributes you need effective tools for detecting discrimination. Our tool can help with that.”

The news comes amid growing evidence that AI systems can pick up bias from their training data. Last October, for example, Amazon scrapped an AI recruiting tool that showed bias against women: the tool had been trained on historical hiring data dominated by male applicants, and it learned to replicate that imbalance.

More recently, Google announced the company was working towards making its artificial intelligence and machine learning models more transparent in order to tackle bias.

“Our tool,” Honavar said, “can help ensure that such systems do not become instruments of discrimination, barriers to equality, threats to social justice and sources of unfairness.”

The study’s results were presented at The Web Conference in San Francisco in May.

Image via Pixabay and Max Pixel.

BY SHACK15

Co-working space and blog dedicated to all things data science.
