Google Image Recognition Software Tags Black Couple as Gorillas

Google apologizes after its image recognition software mistakenly tags a Black couple as gorillas.

Computer programmer Jacky Alcine saw his and a friend’s photos labeled as “gorillas” in an album on Google Photos.

The search engine giant explained that the mislabeling occurred due to a fault in its image recognition software.

Google is reportedly “appalled” and “genuinely sorry” about the mishap and is working on methods to avoid such mistakes in the future.


Google was pushed by the public to apologize after its Photos app mislabeled pictures of black people as gorillas.

Google’s new Photos app uses an auto-tagging feature that helps users organize uploaded images and makes searching them easier.

But users of the new application became furious when a New York-based computer programmer and his friend were mislabeled as the great apes.

The Internet giant issued a statement saying it was “appalled and genuinely sorry” for the fault.

The mistake occurred exactly one month after Flickr’s auto-tagging feature mistakenly placed offensive tags on pictures, labeling concentration camps as “jungle gyms” and people in photos as “apes”.

In May, Google officially launched its standalone Photos app and announced numerous helpful features, such as the automatic creation of collages of people and objects like landscapes or food.

However, Jacky Alcine from Brooklyn found a picture of himself and a female friend in an album tagged ‘gorilla.’ He posted a series of tweets addressed to Google.

In response to his tweets, Google’s chief architect of social media, Yonatan Zunger, told him that programmers were trying to fix the issue.

But even after the error was fixed, Mr. Alcine complained that two of his pictures were still tagged under the terms “gorilla” and “gorillas”.

Zunger again replied that a fix was being worked on:

“We’re also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself, eg better recognition of dark skinned faces. Lots of work being done, and lots still to be done. But we’re very much on it. We should have a patch around searches turning up pics of partially obscured faces out very soon.”

This is not the first time Google’s image recognition software has blundered. Last month, Google Image search results placed Indian Prime Minister Narendra Modi on a list of the world’s top 10 criminals.

Google was quick to apologize to the Indian government and promised to delete Modi’s pictures from the search results, but more than a month later, the images are still visible for the “Top 10 Criminals” keyword search.

However, Google now shows the following message in order to save itself from embarrassment or lawsuits:

“These results don’t reflect Google’s opinion or our beliefs; our algorithms automatically matched the query to web pages with these images.”

Source: Twitter

Waqas

Waqas Amir is a Milan-based cybersecurity journalist with a passion for covering the latest happenings in the cyber security and tech world. In addition to being the founder of this website, Waqas is also into gaming, reading and investigative journalism.