Tech giant Microsoft recently issued an apology after its chat bot Tay began posting racist, rude and anti-feminist tweets less than 24 hours after going online. Following the incident, Microsoft pulled Tay offline and promised to rework its system to ensure the same unfortunate event does not happen again.
In a statement published on Microsoft's official blog, Microsoft research corporate vice president Peter Lee wrote, "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay."
This is not the first time Microsoft has tested a chat bot. The company previously launched a chat bot called XiaoIce in China, and unlike Tay, its Chinese counterpart saw a successful rollout. According to Ars Technica, Microsoft claims that XiaoIce has been used by more than 40 million users over the course of the test.
Microsoft claims that the Tay chat bot underwent several tests to make sure nothing untoward would happen. However, the company said Tay's wayward behavior was caused by a coordinated attack by people who exploited an undiscovered vulnerability hidden deep in the chat bot's AI system.
Tay's main objective was to mimic the social networking habits and communication style of young people aged 18 to 24. In a rather ironic way, some experts claim that Microsoft did manage to pull off its original plan, noting that most of today's young people are easily influenced by their peers.
Microsoft said it implemented extensive filters and safeguard protocols based on testing with diverse user groups. However, all of these filters were tested in a controlled environment, and Microsoft admitted that such an environment is completely different from real-world scenarios.
Microsoft did not provide a timeline for when Tay will be back online. However, the Redmond-based company said the programmers on the project are digging deep into Tay's system to close the security hole that caused the chat bot to temporarily malfunction.