Google Photos' image recognition feature is mostly accurate in predicting what's what. But on the rare occasions when it does go wrong, it can go terribly, shockingly, and in this case racially, wrong.
As mentioned above, the software does a pretty fine job of spotting objects in photos and sorting them into albums. That is why African-American programmer Jacky Alciné was shocked to find that the same software had labeled photos of him and a black friend as "gorillas" and filed them in an album of that name.
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4— diri noir avec banan (@jackyalcine) June 29, 2015
He called the tech giant out on Twitter and provided proof of the bug. To Google's credit, its engineers immediately took notice, sorted out the problem and issued an apology.
@jackyalcine Thank you for telling us so quickly! Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::— Yonatan Zunger (@yonatanzunger) June 29, 2015
Later on, a Google spokesperson told Yahoo Tech: “We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
While it looks like an honest mistake, and one can't really accuse Google of malice here, one thing is clear: Google Photos and similar software need to work harder on their ability to recognize people with darker skin tones.