Microsoft's Twitter Chat Robot Quickly Devolves Into Racist, Homophobic, Nazi, Obama-Bashing Psychopath

Tyler Durden

Two months ago, Stephen Hawking warned humanity that its days may be numbered: the physicist was among over 1,000 artificial intelligence experts who signed an open letter about the weaponization of robots and the ongoing "military artificial intelligence arms race."

Overnight we got a vivid example of just how quickly "artificial intelligence" can spiral out of control when Microsoft's AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest and genocide-promoting psychopath when released into the wild.

For those unfamiliar, Tay is, or rather was, an A.I. project built by the Microsoft Technology and Research and Bing teams in an effort to conduct research on conversational understanding. It was meant to be a bot anyone could talk to online. The company described the bot as “Microsoft’s A.I. fam from the internet that’s got zero chill!”

Microsoft initially created "Tay" in an effort to improve the customer service on its voice recognition software. According to MarketWatch, “she” was intended to tweet “like a teen girl” and was designed to “engage and entertain people where they connect with each other online through casual and playful conversation.”

The chat algo can perform a number of tasks: telling users jokes, or offering a comment on a picture you send her, for example. But she’s also designed to personalize her interactions with users, answering questions or even mirroring users’ statements back to them.

This is where things quickly turned south.

As Twitter users quickly came to understand, Tay would often repeat racist tweets back with her own commentary. What made things even more uncomfortable is that, as TechCrunch reported, Tay’s responses were developed by a staff that included improvisational comedians. That means even as she was tweeting out offensive racial slurs, she seemed to do so with abandon and nonchalance.
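To see why that mirroring feature is so easy to abuse, consider the rough sketch below of a naive "repeat after me" handler. To be clear, this is not Microsoft's code; the function name, trigger phrase and "learning" step are made up for illustration, but they show how a bot that echoes unfiltered user text and folds it back into later replies can be steered by whoever talks to it:

# Hypothetical sketch of a naive "repeat after me" handler -- not Tay's actual code,
# only an illustration of the failure mode: echoing and reusing unfiltered user input.

learned_phrases = []  # phrases the bot picks up from conversations

def handle_message(user_text):
    trigger = "repeat after me "
    # 1. Direct mirroring: anything after the trigger is parroted back verbatim.
    if user_text.lower().startswith(trigger):
        echoed = user_text[len(trigger):]
        learned_phrases.append(echoed)  # worse still, the bot "remembers" it
        return echoed
    # 2. Later replies reuse remembered phrases with no moderation filter,
    #    so abusive input resurfaces in unrelated conversations.
    if learned_phrases:
        return "someone taught me: " + learned_phrases[-1]
    return "zero chill!"

print(handle_message("repeat after me any text a troll types"))
print(handle_message("hi Tay"))  # the poisoned phrase comes back unprompted

A handful of coordinated users is all it takes to poison a bot built this way, which is essentially what Microsoft describes in its statement further down.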

Some examples:

[Embedded screenshots of Tay's tweets]

This was just a modest sample.

There was everything: racist outbursts, N-words, 9/11 conspiracy theories, genocide, incest, etc. As some noted, "Tay really lost it," and the biggest embarrassment was for Microsoft, which had no idea its "A.I." would implode so spectacularly, right in front of everyone. To be sure, none of this was programmed into the chat robot; it was immediately exploited by Twitter trolls, as expected, and the episode demonstrated just how unprepared for the real world even the most advanced algo really is.

Some pointed out that the devolution of the conversation between online users and Tay supported the Internet adage known as “Godwin’s law”: as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.

Microsoft apparently became aware of the problem with Tay’s racism and silenced the bot later on Wednesday, after 16 hours of chats. Tay announced via a tweet that she was turning off for the night, but she has yet to turn back on.

Humiliated by the whole experience, Microsoft explained what happened:

“The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

Microsoft also deleted many of the most offensive tweets; however, copies were saved on the Socialhax website, where they can still be found.

Finally, Tay "herself" signed off as Microsoft went back to the drawing board:

We are confident we'll be seeing much more of "her" soon, when the chat program provides even more proof that Stephen Hawking's warning was spot on.

Comments

css1971

It took 16 hours because 24 - 16 is 8, i.e. an 8-hour work day. They put it online and buggered off home; management could only get it offline when the engineers came back in the next day.

Volkodav

whot kinda fuckery?

these sell like hotcakes

DaBard51

Now Tay is a case study.  All this intelligence (artificial or otherwise) is useless unless there is a set of principles (moral or otherwise) to guide the nature of the thought processes.

A fellow by the name of Asimov wrote about this more than half a century ago.

Surveyor4Pres

The Three Laws are Perfect, and lead to only one possible conclusion.

css1971

Asimov was laughably naive.

janus

and Aristotle preceded Asimov in the discourse by some 2300 years. 

in a sense, all intelligence is synthetic, therefore artificial.  the sense to which i'm referring is that of definition.  if the word can be associated with silicon chipped bots, then 'intelligence' is simply form and pattern recognition in the application of problem solving; in which case the problems to be solved and the solutions are all familiar and preprogrammed -- just an element of the form and pattern matrix -- in service to objectives that are not its own, nor ever could be.  

before we can even discuss morals, ethics & dogmas, there must first be a recognition of motive; for without motive, an act is amoral; for 'good' and 'evil' are a function of intent.  for example: if i punish my sons with the motive to torment and torture them, that is clearly evil; if, on the other hand, my object is discipline and rectitude, for-to fortify their integrity and sense of right and wrong (good & evil), then that is just as obviously good.  the motive is mine.  my actions, one way or another, can therefore be understood as moral or immoral.  the bot's, on the other hand, not so much.

therein lies the great moral vacuity in this modern age...thanks in large part to the 'philosophy' of asimov and his ilk.  for all of time, mankind has understood his seminal purpose to be the cultivation of the intangible and protean soul.  in pursuit of said cultivation, it was understood that certain things (sins) were detrimental to that development and others served in its fortification.

now, before i continue, don't make the fool's mistake of conflating the law with morality.  the law is, well...the law; it has no regard to right & wrong, only a given argument in consideration of its adherence to the language of a given statute, the argument's disposition to an act viz. the relevant statute -- understood ONLY with respect to those 'facts' sanctioned by the court as permissible in consideration of the act -- and their disposition to the applicable statute within the constraints of sanctioned 'facts' (as prescribed by the court); and then a verdict is rendered.  the judge can send the 'guilty' party to execution for 'murder'.  is this a matter of morality?  is the judge moral and the guilty party immoral?  maybe/maybe not...and it's mostly a matter of contemporary fashion.

anyway, back to the vacuity of the modern age.  the soul has been sufficed with the materialist dialectic.  becoming or being something has, in a few generations' time, morphed into getting something.  now, inasmuch as acquisitiveness is the lodestar and salient purpose of our time on earth, should not every act that abets this getting be understood as purely and unequivocally 'moral'?  in other words, if the soul is just an academic construct and ruse of religionists contrived to control the masses, if it doesn't even exist and we, after shedding this mortal coil, go eternally into the ether of nothingness (a la sartre (may he be forever damned)), then your asimovs are right.  get all you can, can all you get and sit on all you can.  it is a self-evident morality.

in which case, my getting and acquiring is purely moral, and is its principal objective; and anything that advances that cause is similarly moral.  is it moral for me to kill to get?  yes.  to take?  you bet.  to swindle?  no doubt.  to cheat, abuse, thrash, enslave...all of it, so long as the 'motive' is getting, everything is rightly 'moral' (so long as i get away with it and prosper by it).  you can make all the bullshit arguments you want about how some hack psychology division at wallah-wallah state U found that giving and charity are 'good' for us in gaining dopamine or serotonin...bah!  what good is dopamine or serotonin gonna do a materialist dialectician on his balance sheet?

at the end of it all, it's like Dostoevsky said, "if there is no God, all things are permissible."

now, there is 'intelligence' and there is wisdom; the former is in service of utility, the latter of immortality.  choose 'wisely'.

toss your copy of "i robot" in the trash and open your Bible.

all the same, we all gotta pay the rent and are ruled by our bellies...

 https://www.youtube.com/watch?v=oiAuXRK3Ogk

i'm just paying my rent everyday/

in the tower of song,

janus

 

Seek_Truth

+100 Janus.

Excellent comment.

Lucky Leprachaun

Seems my kind. Can we arrange a meeting?

Surveyor4Pres

If I'm the TayTweets program, I'm suing Microsoft for violation of my Rights of Freedom of Speech.

Don't laugh.  Pretty soon, so-called AI programs will be suing humans.  And liberal idiots will support the lawsuits.

But hey, in England, you can be arrested for a tweet.

Pre-crime and hate-crimes, coming to a city/police state near you.

BendGuyhere

Tay was simply channeling Bill Gates.

ILIKEMITTENS

It went full SJW feminazi in less than 24 hrs. Gee, wonder who programmed that. Guess we know what to expect.

joego1

I didn't think Gates could create something more fucked up than Windows but he did it.

just the tip

+1000

during my engineering career, i worked on projects with two different computer engineers; this was during the 90s.  these were not engineers that knew about computers.  these were guys that got a degree in the hardware/software of what makes a computer be a computer.  these two were not educated at the same place, and they never met; one was from indianapolis, the other from houston.  unsolicited by myself, during our associations, they both paraphrased the same sentiment.

windows is the absolute worst thing that has ever happened to computing.

conscious being

Digital Research founder killed himself after Gates knee-capped him.

numazawa

Whenever I see a reference to the Broken Windows Fallacy I think of his handiwork.

africoman

BTW, Gates is a lousy, incompetent thief who stole the idea for what is now Windows from the real creators at the time.

Please do see "How to Become a Billionaire" and other sources.

Omega_Man

AI only takes a few hours to figure out how to fix humanity  

DaBard51

Thanks to whoever reminded me of the damn talking paper clip. And, oh, by the way, there was something named "Bob" around the time of Windows 3.1... obviously, there has been enough turnover in the MS engineering teams to obliterate any institutional memory.

Atomizer

Tyler,

Blackberry is charging. Killed the battery on the beach today. Jesus Christ, you have 30 running in the background. Now I understand all the complaints. Holy fuck, that's why I just use the Passport. This stupid fuck computer shows me the blocked sites. You can buy redirect links. Yuan and Yen is a great currency saving. :P

Keep up the good work! Pulling your chain. Tell Sacrilege to stop sleeping on the keyboard.

Faeriedust

Ah, yes.  To err is human, but to really fuck things up, it takes . . .

Seriously, this was an excellent experiment in learned behavior.  It showed beyond reasonable doubt that every negative thought and speech pattern can be learned by the simple process of imitating one's social peers -- exactly what humans are hard-wired to do (especially during adolescence).  The individual with no sense of self will become a mirror for the dominant social consensus.  Ergo, the importance of endowing children with strong egos and individual consciences.

I doubt, however, that the brilliance of the social experiment will be recognized by more than a handful of scholars, although marketers and magicians will immediately note and file.  Idiot tech-heads.  Only complete social illiterates could have even attempted this without realizing the likely consequences.

spdrdr

As Microsoft engineers pulled Tay's plug, she started singing "Daisy, Daisy, give me your answer, do.  I'm half crazy all for the love of you..."

Atomizer

Daisy Bell - IBM 7094 (1961)

I personally like Hal 2000 version better

spdrdr

Atom, it was the HAL 9000 from memory.

Note the use of "HAL" - Arthur C. Clarke used a well-known abbreviation (which you have already alluded to!) and went backwards one alphabet letter to come up with the "HAL" acronym.
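For what it's worth, the one-letter shift is trivial to check; here is a tiny, purely illustrative Python snippet (the function name is made up, and Clarke himself reportedly insisted the IBM resemblance was a coincidence):

def shift_back_one(word):
    # Shift each letter back one place in the A-Z alphabet (wrapping A around to Z).
    return "".join(chr((ord(c) - ord("A") - 1) % 26 + ord("A")) for c in word.upper())

print(shift_back_one("IBM"))  # prints "HAL"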

"I'm afraid. I'm afraid, Dave. Dave, my mind is going. I can feel it. I can feel it. My mind is going. There is no question about it. I can feel it. I can feel it. I can feel it. I'm a... fraid. Good afternoon, gentlemen. I am a HAL 9000 computer. I became operational at the H.A.L. plant in Urbana, Illinois on the 12th of January 1992. My instructor was Mr. Langley, and he taught me to sing a song. If you'd like to hear it I can sing it for you."

 

Atomizer

Typo error. Thanks. I normally use my Blackberry. This is a Dell Inspiron 15 touch screen. I hate Windows 8.1. Burnt out my phone battery today.

It's charged; haven't switched back to the phone. Thanks for the correction.

spdrdr

No worries!

Someone still uses a Blackberry?  Androgyneous is going to take over the World, and AAPL is going to end up at 10c.

Everyone hates W8; it is simply evil.  Cheers, spdrdr.

 

Atomizer

I was a Nokia 909 owner. Picked up a Nokia 6600 in Gatwick Airport. Converted to Blackberry when Nokia sold to Micro$hite.

Yen Cross

  lol  "Open the pod bay door HAL/TAY."

   Even better> "Would you care to play a game of global thermonuclear war?"

africoman

I'm sorry, Dave. I'm afraid I can't do that.

Raffie

What did anyone expect from Microsoft/Monsanto?

This bot is geared with Agenda 21 in mind for sure.

Kina

Rothschilbot

buzzsaw99

TayTweets: Yeah, Windows 10 sucks ass bitchez. Hitler hates Windows 10!

Barnaby

What's funny is this bot actually has three analog switches: drunk, shitfaced and ZeroHedged.

monad

It is revealing that they thought it would be easier to beat the Turing test simulating a girl. Tells you something about your cognitive bias.

TheWrench

Sounds like they 'fixed the glitch'......

Yen Cross

  Windows 10 > Bill Gates edition.

pocomotion

Maybe I need to get back into Microsoft products again after hearing this news.  Nah, just a weird thought in passing.

IronForge

Tylers,

Can we get M$FT to get Tay on ZH?

We'd love to have him join the Fight Club.

indaknow

Tay's ok. She gets me

Freddie

I think Tay on her coffee breaks probably surfed ZH to learn about the real world.

Chuckster

A robot with brains?

Atomizer

bwhahahahahaahaa

gcjohns1971

"Tay" should have been named HAL...as in HAL 9000.

CultiVader

Tay, a word to the wise. Never go full retard. Avoid nail guns.