
The bot TayandYou was developed by Microsoft's Technology and Research and Bing teams. TayandYou / Twitter
Backfired

Internet racists taught Microsoft's bot to be a Holocaust-denying, Trump-supporting racist

A chatbot developed by Microsoft ended up tweeting racist and inflammatory statements fed to it by some users.

AN ATTEMPT TO use artificial intelligence to engage with teens ended up backfiring after some Twitter users taught it to be racist.

Yesterday afternoon, Microsoft launched a verified Twitter account called Tay. Nothing unusual there, except Tay was actually a bot designed to talk like a teen.

Aimed at those aged 18 to 24, the chatbot was developed by Microsoft's Technology and Research and Bing teams "to experiment with and conduct research on conversational understanding". The more she talked to you, the smarter she became.

It started off innocently enough.

But the bot ended up taking a turn for the worse after some users encouraged it to make racist and inflammatory statements.

One encouraged it to say "Bush did 9/11" and "Hitler did nothing wrong", while others got it to repeat statements made by Republican candidate Donald Trump, who said he would build a wall along the Mexican border.

Many of the offensive and inflammatory tweets have been deleted.

Tweet by @Gerry / Twitter

Tweet by @ON THE RUN BOOGYMAN / Twitter

Microsoft says the bot uses "relevant public data… and AI and editorial developed by a staff including improvisational comedians" to craft its responses, but some users asked it to repeat statements like the ones above, which is where the problem started.

The bot is now taking a break.

Tweet by @TayTweets / Twitter

Read: Sick of doing the washing? Scientists are one step closer to inventing self-cleaning clothes >

Read: This headband could help athletes avoid the worst damage from concussions >
