Facebook blames facial recognition software after mistaking Black men for ‘primates’

Facebook users who watched a year-old tabloid video featuring Black men were shown an auto-generated prompt asking if they would like to “keep seeing videos about Primates.”

Former Facebook content design manager Darci Groves called on the company to fix the egregious error after a friend sent her a screenshot of the racist prompt.

Facebook labels Black men as ‘primates’

The video was initially published by the Daily Mail on 27 June 2020 and featured clips of a white man who called the cops after an altercation with Black civilians. Taking to social media, Groves said:

“Um. This ‘keep seeing’ prompt is unacceptable, Facebook. And despite the video being more than a year old, a friend got this prompt yesterday”.

Groves also called on her friends and followers – specifically those who are still employed at Facebook – to escalate the matter.

Recommendation feature disabled

The Daily Mail’s video in question – titled ‘White man calls cops on Black men at marina’ – made no mention of primates or monkeys.

Clearly, Facebook’s topic recommendation feature didn’t get the memo. A Facebook spokesperson said the feature had been disabled.

“We disabled the entire topic recommendation feature as soon as we realised this was happening so we could investigate the cause and prevent this from happening again.”

Facebook says AI is ‘not perfect’

In a statement to the New York Times, Facebook said it was making improvements to its artificial intelligence-driven (AI) prompt recommendations, but added, “We know it’s not perfect”.

The spokesperson referred to the incident as a “clearly unacceptable error” and apologised to anyone “who may have seen these offensive recommendations”.

This is not the first time facial recognition software has been blamed for racist blunders when dealing with images of people who are not white.

Racist facial recognition software

Several studies have shown that facial recognition software is biased. Researchers at the National Institute of Standards and Technology said algorithms still falsely identify African-American and Asian faces.

The study evaluated facial recognition algorithms submitted by industry and academic developers on their ability to perform different tasks, and found that the software identifies Caucasian faces far more reliably.

Back in 2015, Google faced backlash when its algorithm tagged Brooklyn-born Jacky Alcine and his friend as gorillas.

By Cheryl Kahla