Twitter is examining why its artificial intelligence sometimes cuts black people’s faces out of photos.
The social-media giant uses technology called a neural network to create the cropped previews of photos that users see as they scroll through their feeds. But users discovered that the system often homes in on white faces when they appear in the same image as black faces.
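Twitter has not published the model at issue here, but the general idea behind saliency-based cropping is simple: a neural network assigns each region of the image a "salience" score, and the preview window is placed where that score is highest. The sketch below is a hypothetical illustration of that selection step, using a toy saliency map and a brute-force window search in place of a real model:

```python
import numpy as np

def best_crop(saliency: np.ndarray, crop_h: int, crop_w: int) -> tuple:
    """Return the (row, col) of the crop window with the highest summed saliency.

    `saliency` stands in for a neural network's per-pixel output;
    the real system's scoring model is not public.
    """
    h, w = saliency.shape
    best_score, best_pos = float("-inf"), (0, 0)
    for r in range(h - crop_h + 1):
        for c in range(w - crop_w + 1):
            score = saliency[r:r + crop_h, c:c + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy map: the (hypothetical) model scores the right side of the image
# as more salient, so the crop lands there.
sal = np.zeros((4, 8))
sal[:, 6:8] = 1.0
print(best_crop(sal, 4, 4))  # (0, 4): window covering columns 4-7
```

The bias users reported would live upstream of this step, in whatever the scoring model has learned to treat as salient; if brighter regions score higher, the crop follows the brightness, which is consistent with Davis's explanation below.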
“We tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do,” Twitter spokeswoman Liz Kelley tweeted Sunday after users pointed out the issue, which was reported earlier by The Verge. “We’ll open source our work so others can review and replicate.”
Programmer Tony Arcieri demonstrated the problem with an image that pictured both Senate Majority Leader Mitch McConnell and former President Barack Obama. The photo preview showed only McConnell’s face — even when Arcieri switched the position of their headshots and the color of their ties.
The same thing happened with another image featuring two cartoon characters from “The Simpsons”: Lenny, who is white, and Carl, who is black. Twitter cropped out Carl and only showed Lenny in the preview.
A third example with two men in suits showed the white man in the preview and cut out the black man. But a Twitter honcho pointed out that other factors, such as the color of the background, may have played a role.
“In this example it’s the brighter background being used to make the cropping decision,” Dantley Davis, Twitter’s chief design officer, said in a tweet. “If we stopped cropping photos this would go away, which is on the team’s mind.”
Twitter shares were down 1 percent at $39.75 as of 12:58 p.m. Monday.