Artificial intelligence: How to avoid racist algorithms

There is growing concern that many of the algorithms that make decisions about our lives – from what we see on the internet to how likely we are to become victims or instigators of crime – are trained on data sets that do not include a diverse range of people.

The result can be that the decision-making becomes inherently biased, albeit accidentally.

Try searching online for an image of “hands” or “babies” using any of the big search engines and you are likely to find largely white results.

In 2015, graphic designer Johanna Burai created the World White Web project after searching for an image of human hands and finding exclusively white hands in the top image results on Google.

Her website offers “alternative” pictures of hands that content creators can use online; as these images spread, search engines pick them up and the balance is redressed.

Google says its image search results are “a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online” and are not connected to its “values”.

Ms Burai, who no longer maintains her website, believes things have improved.

“I think it’s getting better… people see the problem,” she said.

“When I started the project people were shocked. Now there’s much more awareness.”

The Algorithmic Justice League (AJL) was launched by Joy Buolamwini, a postgraduate student at the Massachusetts Institute of Technology, in November 2016.

She was trying to use facial recognition software for a project but it could not process her face – Ms Buolamwini has dark skin.

“I found that wearing a white mask, because I have very dark skin, made it easier for the system to work,” she says.

“It was the reduction of a face to a model that a computer could more easily read.”

It was not the first time she had encountered the problem.

Five years earlier, she had had to ask a lighter-skinned room-mate to help her.

“I had mixed feelings. I was frustrated because a problem I’d seen five years earlier was still persisting,” she said.

“And I was amused that the white mask worked so well.”

Ms Buolamwini describes the reaction to the AJL as “immense and intense”.

This ranges from teachers wanting to show her work to their students and researchers wanting her to check their own algorithms for signs of bias, to people reporting their own experiences.

And there seem to be quite a few.

One researcher wanted to check that an algorithm being built to identify melanomas (a form of skin cancer) would work on dark skin.

“I’m now starting to think, are we testing to make sure these systems work on older people who aren’t as well represented in the tech space?” Ms Buolamwini says.

“Are we also looking to make sure these systems work on people who might be overweight, because of some of the people who have reported it? It is definitely hitting a chord.”


Source: BBC

