FAIL: Microsoft Yanks Its AI Twitter Bot After It Sends These White Supremacist & Sexual Tweets – American Military News

File this under the biggest fails in the history of social media! This week Microsoft launched “Tay,” an artificial intelligence Twitter bot that was supposed to speak like a millennial and have “no chill.” That turned out to be a MASSIVE understatement.

Because Tay’s (@TayandYou) technology learns from what people in her Twitter, Kik, and GroupMe universe say, Pandora’s box was opened. Let’s face it: people say some very weird stuff on the internet.

Below are some of the tweets that Tay sent out which prompted Microsoft to quickly take the account offline for “maintenance.”

[Screenshots of Tay’s offensive tweets, captured March 24, 2016]

Do you think AI research should be curbed? Sound off in the comments below!