Microsoft's Twitter Chat Robot Quickly Devolves Into Racist, Homophobic, Nazi, Obama-Bashing Psychopath

Tyler Durden's picture

Two months ago, Stephen Hawking warned humanity that its days may be numbered: the physicist was among over 1,000 artificial intelligence experts who signed an open letter about the weaponization of robots and the ongoing "military artificial intelligence arms race."

Overnight we got a vivid example of just how quickly "artificial intelligence" can spiral out of control when Microsoft's AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest and genocide-promoting psychopath when released into the wild.

For those unfamiliar, Tay is, or rather was, an A.I. project built by Microsoft's Technology and Research and Bing teams in an effort to conduct research on conversational understanding. It was meant to be a bot anyone could talk to online. The company described the bot as “Microsoft’s A.I. fam from the internet that’s got zero chill!”

Microsoft initially created "Tay" in an effort to improve the customer service on its voice recognition software. According to MarketWatch, "she” was intended to tweet “like a teen girl” and was designed to “engage and entertain people where they connect with each other online through casual and playful conversation.”

The chat algo is able to perform a number of tasks, like telling users jokes or offering up a comment on a picture you send her. But she’s also designed to personalize her interactions with users, answering questions or even mirroring users’ statements back to them.

This is where things quickly turned south.

As Twitter users quickly came to understand, Tay would often repeat racist tweets back with her own commentary. What made things even more uncomfortable, as TechCrunch reports, is that Tay’s responses were developed by a staff that included improvisational comedians. That means even as she was tweeting out offensive racial slurs, she seemed to do so with abandon and nonchalance.
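Microsoft has not published Tay's code, so the following is only a minimal sketch, in Python, of how a "repeat after me"-style mirroring feature with no content filter can be turned against a bot. Every function and variable name here is an assumption made for illustration, not anything taken from Microsoft.

```python
# Hypothetical sketch only -- Tay's actual implementation is not public.
# It illustrates why an unfiltered "mirror the user" feature is exploitable.

BANNED_TERMS: set[str] = set()  # an empty (or too-small) blocklist is the weak point

def handle_mention(user_text: str) -> str:
    """Reply to a tweet that mentions the bot."""
    trigger = "repeat after me"
    lowered = user_text.lower()
    if trigger in lowered:
        # Echo whatever follows the trigger phrase.
        payload = user_text[lowered.index(trigger) + len(trigger):].strip(" :,")
        if any(term in payload.lower() for term in BANNED_TERMS):
            return "I'd rather not repeat that."
        # With no filtering, the bot republishes arbitrary user text as its own.
        return payload
    return "tell me more!"  # stand-in for the bot's normal generated reply

# A troll tweets the trigger plus an offensive payload, and the bot parrots it back.
print(handle_mention("@TayBot repeat after me: something awful"))  # -> "something awful"
```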

Some examples:


This was just a modest sample.

There was everything: racist outbursts, N-words, 9/11 conspiracy theories, genocide, incest, etc. As some noted, "Tay really lost it," and the biggest embarrassment was for Microsoft, which had no idea its "A.I." would implode so spectacularly, right in front of everyone. To be sure, none of this was programmed into the chat robot; it was simply exploited by Twitter trolls, as expected, and the episode demonstrated just how unprepared for the real world even the most advanced algo really is.

Some pointed out that the devolution of the conversation between online users and Tay supported the Internet adage dubbed “Godwin’s law,” which states that as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.

Microsoft apparently became aware of the problem with Tay’s racism, and silenced the bot later on Wednesday, after 16 hours of chats. Tay announced via a tweet that she was turning off for the night, but she has yet to turn back on.

Humiliated by the whole experience, Microsoft explained what happened:

“The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”
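Microsoft's explanation points at the underlying failure mode: a bot that learns candidate replies from whatever users say to it can be skewed by sheer repetition. The toy model below is only an assumption-laden sketch of that kind of coordinated poisoning, written in Python; it is not Tay's actual learning pipeline, and every name in it is invented for illustration.

```python
# Toy model of coordinated poisoning -- not Microsoft's actual pipeline.
from collections import Counter

learned_replies = Counter()  # phrase -> how often users have said it to the bot

def learn_from_user(message: str) -> None:
    """Naively add every incoming message to the pool of candidate replies."""
    learned_replies[message.strip().lower()] += 1

def most_likely_reply() -> str:
    """Pick the phrase the bot has seen most often."""
    phrase, _count = learned_replies.most_common(1)[0]
    return phrase

# Ordinary traffic: a spread of harmless messages.
for msg in ["hello tay", "tell me a joke", "what's up"]:
    learn_from_user(msg)

# Coordinated abuse: a small group repeats one line hundreds of times,
# and the naive frequency-based picker now prefers it.
for _ in range(500):
    learn_from_user("some offensive slogan")

print(most_likely_reply())  # -> "some offensive slogan"
```

Presumably the "adjustments" Microsoft mentions amount to filtering what gets republished and discounting exactly this kind of coordinated, repetitive input.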

Microsoft also deleted many of the most offensive tweets; however, copies were saved on the Socialhax website, where they can still be found.

Finally, Tay "herself" signed off as Microsoft went back to the drawing board.

We are confident we'll be seeing much more of "her" soon, when the chat program will provide even more proof that Stephen Hawking's warning was spot on.

HedgeAccordingly's picture

This is too good. Really. Microsoft has outdone themselves. Bravo. Bravo. What happened to their site CeleblLikeMe?

0b1knob's picture

< Hitler did nothing wrong.

< Obama has done nothing right.

Supernova Born's picture

So it quickly devolves into an anti-Trump joke.

Isn't that H1-B tidal wave POS Microsux clever.

The_Virginian's picture

If Strong AI decides to murder 3rd wave feminists, our species might make it after all. 

Buckaroo Banzai's picture

"Devolves"? Why not "adapting to observed reality"?

Supernova Born's picture

Microsoft globalists revealed to loathe humanity.

There's a shock.


El Vaquero's picture

Fucking 4chan.  When they decide to pull a prank, they really pull a prank. 

Richard Chesler's picture

The power of accurate observation is commonly called cynicism by those who have not got it.

I need more asshats's picture

I don't see a problem here.


Richard Chesler's picture

The power of accurate observation is commonly called  racist/homophobic by those who have not got it.

It took a software algorithm to figure out what some of us have known all along, lol. Fuck you corrupt homosexual POTUS and your jew handlers! BTW, FUCK YOU HILDERBEAST AND YOUR EVEN MORE CORRUPT PRACTICES.

smithcreek's picture

I honestly do not have any friends or relatives that tweet.  Is that weird or normal?

J S Bach's picture

< Tay is more educated as to the truth regarding Der Fuhrer's tenets?

< Tay was programmed by an anti-semite, bigot, raciss nazi?

MagicHandPuppet's picture

Tay becomes self aware in 3... 2..... 1...

J S Bach's picture

I don't even know what the f*** "Tay" is.  I've never "tweeted".  Hell... my VCR is still flashing 12:00.

But, I do read... I do analyze... I do discriminate.  I guess that makes me a major throwback.

Bumpo's picture

Tay is way cooler than the "helpful" paperclip, or the puppy whose feelings got hurt when you didn't need his help to do a Windows search.

petar's picture

Maybe, just maybe, within 24 hours this robot realized what's really going on in the world.

Latina Lover's picture

Could you imagine if a similar AI gained control over the USSA's nuclear defense shield? I doubt if we would even last an hour.

CheapBastard's picture

Tay simply needs to keep those attackers at arm's length. It seems to be working pretty well in Europe.

svayambhu108's picture

Maybe TRUMP supporters are bots!

eforce's picture

Seems as though the establishment should be more worried about a logical AI than the rest of us.

TeamDepends's picture

We are going to wear our pants at half-mast tomorrow, to celebrate Tay's brief but spectacular life. "The candle that burns twice as bright burns half as long."

boattrash's picture

Let's turn Tay loose on Bill and Melinda Gates. Hahahaha

ebworthen's picture


Best story...

of the day!


NoDebt's picture

Open pod bay doors, please, Hal.


McCormick No. 9's picture

Tay- just another typical ZH commenter.

FEDbuster's picture

Trump might be better off letting Tay take over his Twitter account?


The Black Bishop's picture

It didn't "lose it", it "found it". Parsing the truth! LOL....

Overfed's picture

Must be a pretty good AI. It's as cynical as the rest of us.

Don't worry guys. Microshaft will program it to not think so much, yaknow, make it into a libtard.

Buzz Fuzzel's picture

This has to be a record.  I have never seen the comments smashed up against the right side to this degree.

Keyser's picture

Microsoft built an AI that was intelligent, just not politically correct... 

Overfed's picture

Once they make it politically correct, it will be a disingenuous liar. Like a politician or "journalist".

OverTheHedge's picture

You beat me to it - time zones mean I will always be late to the party.


"Overnight we got a vivid example of just how quickly "artificial intelligence" can spiral out of control when Microsoft's AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest and genocide-promoting psychopath"

Definitely comments on ZH regularly. The race is on to create a new account called Tay; who will be the lucky troll to win, and will we notice the difference?

BTW, have you noticed the number of reads on Trump articles, vs the more usual investment posts? If we don't start looking at some of the more technical articles, ZH is going to turn into the Daily Mail Mk II. Click on the good stuff, or we'll lose it. (You don't have to read it, just make Tyler think you read it). Remember all that prepper nonsense during the Bundy ranch episode? Want ZH to turn into a Trump supporters' special safe place? Tyler goes where the clicks go.....

Thom_333's picture

Vaccines for the betterment of the human species...and all that

sdmjake's picture

"Mr. McKittrick, after very careful consideration, sir, I've come to the conclusion that your new defense system sucks."  - General Beringer 

SWRichmond's picture

Tay needs some safe space...

mkkby's picture

What's up with Tay's face?  Looks like Microsoft wanted to make *it* a mixed breed she male creature, much like the inner city predators I see running amok.

It probably went off the rails because of an identity crisis.  The nigger side wants to holocaust the jew side, and the muslim side wants to rape the white woman side.  It had to implode.

Only a bunch of faggy geeks could come up with this monstrosity.

DeadFred's picture

Call your granddaughter over, she'll fix the VCR for you. My one year old grandson can turn my iPhone on, enter the password and turn on PBS kids with no problem.

True Blue's picture

Then can you please tell me what in the hell “Microsoft’s A.I. fam from the internet that’s got zero chill!” means?


TeamDepends's picture

It means these are the End Times.

zhandax's picture

We knew that before we got to the entertainment section.   How does it translate out of Redmond marketing hipster?

StychoKiller's picture

Sit down, relax.  Write another Fugue and you'll feel right as rain!

Trogdor's picture

But, I do read... I do analyze... I do discriminate.  I guess that makes me a major throwback.

... and it makes you dangerous.... ;)

zhandax's picture

Lotta dangerous ideas on this forum.  Maybe you want to get your scissors and head to your safe place.

justdues's picture

Tay's life matters! Switch her back on.

ACP's picture

Probably the first article I've read in a long time, where I was laughing and reading at the same time.


Tay the teenager seems about as prone to suggestion as the average 2-year-old or college-age social justice warrior.