Google Left Red-Faced As Its Software Tagged Black Friends As Gorillas

by
Zohaib Ahmed
The mistake certainly wasn't intentional, but it shows that Google Photos needs to up its game to avoid more faux pas like this.

Google Photos' image recognition feature is mostly accurate at predicting what's what. But on the rare occasions when it does go wrong, it can go terribly, shockingly and racially wrong.

As mentioned above, the software does a pretty fine job of spotting objects in photos and classifying them into albums, which is why African-American programmer Jacky Alcine was shocked to find that it had labeled photos of him and a black friend as two gorillas and filed them in an album of that name.

He called out the tech giant on Twitter and provided proof of the bug. To Google's credit, its engineers immediately took notice, sorted out the problem and issued an apology.

Later on, a Google spokesperson told Yahoo Tech: “We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.” 

While it looks like an honest mistake and one can't really accuse Google of malice here, one thing is clear: Google Photos and similar software need to work harder on their ability to recognize people with darker skin tones.
