
Facebook recently apologized after an AI program mistakenly labeled a video featuring black men as “about primates.”

The video, posted on June 27, 2020 by The Daily Mail, shows clips of black men and police officers. An automatic prompt asked users if they would like to “continue to watch videos about primates,” even though the video had no connection to primates or any primate-related content.



“As we have said, while we have improved our AI, we know it is not perfect, and we have to make further progress,” Facebook said in a statement to The New York Times. “We apologize to all who may have seen these objectionable recommendations.”


A former Facebook content designer flagged the issue after a friend forwarded a screenshot of the prompt. A product manager for Facebook Watch reportedly called the error “unacceptable” and said the company would look “into the root cause.”


Facebook spokesman Danny Lever said in a statement to USA Today, “This was clearly an unacceptable error and, as soon as we realized this was happening, we disabled the entire topic recommendation feature so we could investigate the cause and prevent it from happening again.”

Facebook immediately disabled the AI program responsible for the error.


Technology companies have dealt with similar issues in the past, with some critics claiming that facial recognition technology is biased against people of color.

In 2015, Google Photos mistakenly labeled photos of black people as “gorillas.” Google apologized and attempted to correct the error, but Wired later found that the fix was to block the words “gorilla,” “chimp,” “chimpanzee” and “monkey” from search results.

Facebook did not respond to Granthshala News’ request for comment.