On 23 March, Microsoft released an artificial intelligence ‘chatbot’ on Twitter. In less than 24 hours it had learned to be a racist, sexist conspiracy theorist, and Microsoft shut it down after Twitter users exploited a vulnerability in her programming and caused her to copy inappropriate messages.
A chatbot is a computer program designed to simulate human conversation, and Tay (@Tayandyou), Microsoft’s latest chatbot, was built to tweet as if she were your average American teenage girl.
The chatbot was developed by Microsoft’s Technology and Research and Bing teams to “experiment with and conduct research on conversational understanding”.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft said. “The more you chat with Tay the smarter she gets.”
Tay was capable of learning from other Twitter users, as well as searching Twitter and the internet for information on which to base her tweets. Unfortunately, it didn’t take long for Twitter ‘trolls’ (people who post inflammatory messages) to send her tweets that included racist comments, conspiracy theories and other offensive material, which she quickly started to learn from.
Microsoft’s programmers saw what was happening and scrambled to reprogram her, but it was too late. The more inflammatory messages people sent her, the more she treated those viewpoints as the right ones to hold, and the more offensive her own tweets became.
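Microsoft has never published Tay’s code, but the underlying risk is easy to sketch. The short, purely hypothetical Python example below (the class name and logic are an illustration, not Microsoft’s implementation) shows a bot that stores every message it receives and reuses those messages as replies, with no moderation step in between; this is broadly the kind of unfiltered learning that was exploited.

```python
import random
from collections import defaultdict


class NaiveLearningBot:
    """Toy bot that 'learns' by storing every message it sees and reusing
    stored messages as replies. Hypothetical illustration only; this is
    not Tay's actual code, which Microsoft has not published."""

    def __init__(self):
        # Messages are grouped by a crude topic key (their first word).
        # Crucially, there is no moderation step: every input is trusted.
        self.learned = defaultdict(list)

    def observe(self, message: str) -> None:
        words = message.split()
        if not words:
            return
        self.learned[words[0].lower()].append(message)

    def reply(self, prompt: str) -> str:
        words = prompt.split()
        topic = words[0].lower() if words else ""
        # Prefer messages on the same topic; otherwise draw from anything seen.
        candidates = self.learned.get(topic) or [
            m for msgs in self.learned.values() for m in msgs
        ]
        if not candidates:
            return "Tell me more!"
        # Whatever users repeat most often dominates the pool, so a
        # coordinated group can steer what the bot says back.
        return random.choice(candidates)


if __name__ == "__main__":
    bot = NaiveLearningBot()
    for tweet in ["cats are great", "cats are great", "cats rule the internet"]:
        bot.observe(tweet)
    print(bot.reply("cats"))  # echoes back one of the stored user messages
```

With no filter between what users send and what the bot repeats, the quality of its output is only ever as good as the intentions of the people feeding it.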
Eventually Microsoft was forced to shut the program down altogether and issue an apology.
“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” wrote Peter Lee, Microsoft’s vice president of research.
“Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack,” Lee wrote, speaking of a specific vulnerability that users exploited. “As a result, Tay tweeted wildly inappropriate and reprehensible words and images.”
Microsoft has deleted all but three of Tay’s tweets.