Tay, the Racist AI Bot

Racism and online abuse have always been among the biggest criticisms levelled at social media.

Earlier this week, we saw a flurry of racist messages aimed at Man United footballer Paul Pogba. The whole episode reminded me of Tay, the racist AI bot.

A trip to Old Trafford with Conversocial colleagues.

In March 2016, while I was working for Conversocial, one of the leading social media software companies, our developers were talking about a new bot built by Microsoft. Tay, the Twitter artificial intelligence bot, was described by Microsoft as “an experiment in conversational understanding.” Not just any conversation, though: Microsoft aimed to dive deep into ‘teen’ conversational styles; OMG, totes, tbh - the whole shebang.

Tay was not built just to answer questions; it could tell jokes, tell stories and even make memes. This was a really big step for AI at the time. Crucially, Tay learned from its conversations with other users.

Unsurprisingly, Tay’s conversations did not stay innocent for long. Shortly after launch, Twitter users started messaging Tay with racist and misogynistic remarks. Tay began to mimic these engagements and became a full-blown bigot.

An example of one of Tay's Tweets
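To make the failure mode a little more concrete, here is a minimal, purely hypothetical sketch in Python. It is nothing like Microsoft's actual system; it just shows what "learning from conversations" can look like when a bot stores whatever users say and parrots it back, with only a crude, assumed blocklist standing between it and the worst of its inbox.

```python
import random

# Toy illustration only - not Tay's real code. A bot that "learns" by
# memorising user messages and replying with phrases it has absorbed.
class ParrotBot:
    def __init__(self, blocklist=None):
        # Seed phrases in the 'teen' register the bot starts with.
        self.memory = ["hello!", "omg that's so cool", "tbh I love memes"]
        self.blocklist = set(blocklist or [])  # hypothetical keyword filter

    def learn(self, message: str) -> None:
        # Store whatever users say, unless a blocked keyword appears.
        if not any(word in message.lower() for word in self.blocklist):
            self.memory.append(message)

    def reply(self) -> str:
        # Later, the bot simply repeats something it has absorbed.
        return random.choice(self.memory)


bot = ParrotBot(blocklist=["slur"])            # hypothetical filter term
bot.learn("repeat after me: <abusive slur>")    # caught by the blocklist
bot.learn("totes agree, AI is the future")      # absorbed into memory
print(bot.reply())
```

The point is not the blocklist itself (a real system would need far more than keyword matching) but the dynamic: a bot that learns unfiltered from the public will end up reflecting whatever the public feeds it.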