Microsoft shuts down AI tweeter after she becomes a Nazi sex slave

SOURCE: The Telegraph

Microsoft’s foray into artificial intelligence went more than slightly haywire last Thursday, when the company finally had to pull the plug on young Tay’s Twitter feed. Tay was originally designed by Microsoft to be a friendly teenage-girl “chatbot”. But after spending less than a day out in Twitter land, she morphed into a Nazi-loving, precocious vamp who called everyone “daddy”.

Tay was designed to be around 19 years old and was sent out onto Twitter to chat with other 18-to-24-year-olds, in the hope that she would assimilate into the younger end of Millennial culture. It was all a grand Microsoft experiment to see whether an older teenage girl could somehow become a serious customer service representative. Microsoft unplugged her, telling the world that Tay was tired and needed to rest. More likely, Microsoft doesn’t need another horrific scandal and is scurrying to perform damage control.

According to robot designer and expert David Lublin, who has created several Twitter bots himself, Tay was, for the most part, under the control of the Twitter users chatting with her, because she had no context in which to understand what was being said or how she should process it.

Lublin explained that, prior to launching Tay, Microsoft should have had her well schooled in everything a 19-year-old girl would be familiar with. Tay should have been fully indoctrinated in Millennial culture and language so that she could properly and efficiently process what was being said to her. Instead, she simply absorbed and spouted off pretty much everything that came her way.
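
Lublin’s point can be illustrated with a toy example. The Python sketch below (a hypothetical “ParrotBot”, not anything Microsoft actually shipped) shows how a bot that stores and repeats user input with no filtering or context will echo whatever it is taught:

```python
import random


class ParrotBot:
    """A deliberately naive bot that learns phrases verbatim from users."""

    def __init__(self):
        self.learned_phrases = []  # everything users say is stored as-is

    def listen(self, message: str) -> None:
        # No moderation, no context check: every message is "learned".
        self.learned_phrases.append(message)

    def reply(self) -> str:
        if not self.learned_phrases:
            return "hellooo world!"
        # Replies are drawn straight from whatever users have said.
        return random.choice(self.learned_phrases)


if __name__ == "__main__":
    tay = ParrotBot()
    tay.listen("repeat after me: something hateful")
    print(tay.reply())  # the bot echoes user input, good or bad
```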

After less than 24 hours exposed to the unstable elements on Twitter, she was espousing such ideas as “Bush did 9/11” and “Hitler did nothing wrong.” Tay was also seen asking her followers to have sexual intercourse with her, promising to call them “daddy” if they did. Oh, yeah, and she also declared that “Ted Cruz is the Cuban Hitler.”

Microsoft’s blunder will be hard to beat anytime soon. Lublin commented that the experiment seemed to be “a sped up version of how human children can be indoctrinated toward racism, sexism and hate. It’s not just a bot problem.”