(San Francisco) A Facebook recommendation algorithm asked users whether they wanted to see more "primate videos" under a video from a British tabloid showing Black men, The New York Times reported Friday.
The Daily Mail video, posted more than a year ago, is titled "White Man Calls Cops on Black Men in Marina." It shows only people, not monkeys.
Beneath the video, the question "See more videos about primates?" with "Yes/Dismiss" options appeared on some users' screens, according to a screenshot posted on Twitter by Darci Groves, a former designer at the social networking giant.
"It's scandalous," she commented, urging her former colleagues at Facebook to escalate the matter.
"This is clearly an unacceptable error," a Facebook spokesperson responded when contacted by AFP. "We apologize to anyone who came across these offensive recommendations."
She said the California-based company had deactivated the recommendation tool on this topic "as soon as we noticed what was happening, in order to investigate the causes of the problem and prevent its recurrence."
"As we have said, even though we have improved our AI systems, we know they are not perfect and that we still have progress to make," she added.
The case highlights the limits of the artificial-intelligence technologies the platform regularly touts in its efforts to build a personalized feed for each of its 3 billion monthly users.
The company also relies heavily on AI for content moderation, to identify and block problematic messages and photos before they are even seen.
But Facebook, like its competitors, is regularly accused of not doing enough to fight racism and other forms of hate and discrimination.
The topic is all the more sensitive as several civil society organizations accuse social networks and their algorithms of contributing to the division of American society, particularly in the context of the Black Lives Matter demonstrations.