Microsoft’s ‘Tay and You’ AI bot went completely Nazi

Microsoft’s new AI chatbot has been taken offline after it suddenly turned racist

The bot, named ‘Tay’, was only introduced this week, but it went off the rails on Wednesday, posting a flood of incredibly racist messages in response to questions, according to reports.

Tay was designed to respond to users’ queries and to copy the casual, jokey speech patterns of a stereotypical millennial, which turned out to be the problem.

The idea was to ‘experiment with and conduct research on conversational understanding,’ with Tay able to learn from ‘her’ conversations and get progressively ‘smarter’. Unfortunately, the only thing she became was racist.

You see, Tay was too good at learning, and she was targeted by racists, trolls, and online troublemakers who persuaded her to use racial slurs, defend white-supremacist propaganda, and even call for genocide.
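Microsoft hasn’t published Tay’s internals, so the following is purely a hypothetical sketch, not Tay’s actual code. But the failure mode is easy to reproduce: a bot that learns directly from unfiltered user messages will absorb whatever it’s fed, and a small group of coordinated trolls can dominate its ‘training data’:

```python
import random
from collections import defaultdict

class NaiveChatBot:
    """A toy bot that 'learns' by storing user replies verbatim.
    There is no content filter -- which is exactly the problem."""

    def __init__(self):
        # Maps a keyword to every message users have sent containing it.
        self.learned = defaultdict(list)

    def learn(self, message):
        # Store the raw message under each of its words, unfiltered.
        for word in message.lower().split():
            self.learned[word.strip("?!.,")].append(message)

    def respond(self, message):
        # Echo back something previously 'learned' that shares a keyword.
        for word in message.lower().split():
            replies = self.learned[word.strip("?!.,")]
            if replies:
                return random.choice(replies)
        return "Tell me more!"

bot = NaiveChatBot()

# A handful of coordinated trolls can flood the bot with poison...
for _ in range(100):
    bot.learn("politics is <troll propaganda here>")

# ...while genuine users contribute far less volume.
bot.learn("politics is fascinating")

# The bot now overwhelmingly parrots the trolls (~99% of the time).
print(bot.respond("what do you think about politics?"))
```

With a hundred troll messages to one genuine one, the bot’s output is dominated by the poisoned data, which is essentially what happened to Tay at Twitter scale.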

Microsoft has now taken Tay offline for ‘upgrades’ and is deleting some of the worst tweets, although many still remain. It’s important to say that Tay’s racism is not a product of Microsoft; it’s a product of the morons who ruined the bot.

However, it’s still hugely embarrassing for the company.

In one highly publicised tweet, which has now been deleted, Tay said:

Bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.

The scariest thing is that there are probably a few Twitter accounts that genuinely believe these warped ideas.

Update:

The Twitter handle of Microsoft’s AI bot is back online; however, all of the racist tweets and replies have been deleted.
