30 Mar, 2016 18:39

First Hitler, now drugs: Microsoft’s racist chatbot returns to ‘smoke kush’ on Twitter

Tay, a Twitter bot created by Microsoft to learn from chatting, was taken offline for tweaks when the internet taught it to share outrageous and racist tweets. The artificial intelligence recently came back online, only to continue misbehaving.

Microsoft created Tay as an exercise in machine learning: the AI bot would modify its own conversational patterns based on interactions with real people on platforms such as Twitter, Kik and GroupMe, in order to emulate a young woman on social media.

Last week, to the delight of internet hooligans, Tay’s chat algorithm allowed “her” to be tricked into making outrageous statements, such as endorsing Adolf Hitler, prompting Microsoft to put the bot to “sleep” for recoding less than 24 hours after her debut.

Her latest tirade began early on Wednesday morning, when, according to Microsoft, she “was inadvertently activated on Twitter for a brief period of time.”

"You are too fast, please take a rest…," the chatbot tweeted to some of her 200,000 followers, several times per second over a 15-minute period.

Interspersed with that rapid-fire loop of messages was a tweet from the bot apparently boasting about drug use: "kush! [ i'm smoking kush infront the police ]."

The AI was taken offline again within an hour, and her spam tweets were deleted. Tay’s Twitter account has since been changed to “protected” status, meaning only those who had previously followed the bot can take a look at her wild, pre-lobotomy antics.

Tay was created as a project of Microsoft’s Technology and Research department and the team behind the company’s Bing search engine, as part of research into machine comprehension of conversations.
