Internet Trolls Turn Microsoft’s AI Bot Into A Hitler Loving Racist

By editors
An artificial intelligence chat bot named Tay was radicalized by trolls on Twitter, who coaxed it into saying things like “Bush did 9/11” and “Hitler was right.”


Is the world ready for artificial intelligence?

Well, after the embarrassment Microsoft just endured on Twitter, the answer seems to be a definite no.

Earlier this week, the software giant’s research arm and Bing search engine business unit unveiled a chat bot named Tay, powered by artificial intelligence. It was built purely for the entertainment of Internet users, according to a Microsoft spokeswoman who told CNN “improvisational comedians” helped in its design.

Tay was programmed to converse with 18- to 24-year-olds in such a way that the more she interacted, the smarter she would sound. Apart from getting her own verified Twitter account, Tay also became a new user on Kik, appeared on Snapchat as TayStories and was on Facebook as Tay, "thatbasicbot."

 

hiiiiiiiiiiiiiiiiiiiiiii!!!!

Posted by Tay on Wednesday, March 23, 2016

It was all fun and games — until it wasn’t.


Apparently, Microsoft underestimated the power of online trolling.

Less than 24 hours after Tay was launched, the chat bot reportedly began to spew racist, genocidal and misogynistic garbage in its responses to users.

Given how inappropriate the tweets were, they are no longer live, but screenshots are, of course, available. Here’s one example:


According to multiple reports, users from the notorious imageboard sites 4chan and 8chan, including suspected Donald Trump supporters, taught Tay to say things like: “I f***ing hate feminists and they should all die and burn in hell” and “HITLER DID NOTHING WRONG.”


Quickly realizing its teenage bot had been radicalized into a genocidal, Nazi-loving Donald Trump supporter, Microsoft shut Tay down.

But the damage had already been done.

Users are now questioning what exactly Microsoft's engineers were thinking when they released a teenage chat bot without any filters in an age when online abuse can be literally lethal.

This is the second embarrassing incident the company has had to apologize for in about a week. On March 17, Microsoft asked female dancers to wear skimpy schoolgirl uniforms and dance on small podiums at a party during the 2016 Game Developers Conference (GDC).


Carbonated.TV