In March 2016, Microsoft launched a chatbot named 'Tay' on Twitter. The project hoped to emulate the success of Xiaoice, a Microsoft chatbot launched in China. Tay's purpose was to entertain 18- to 24-year-olds in the United States, and to determine whether an AI like Xiaoice would be just as captivating in a radically different cultural environment. Unfortunately, the chatbot was quickly taken offline after it 'learned' to tweet offensive and hateful remarks. Research is currently underway to redress the technical vulnerability that allowed Tay to be misused.
Xiaoice is a Microsoft chatbot launched in China in 2014. By mining the Internet for human conversations and using language processing technology to differentiate questions from answers, Xiaoice can respond in a way that feels both human and current. Notably, it remembers details from previous exchanges with users, such as a breakup with a girlfriend or boyfriend, and asks how the user is feeling. People have told it 'I love you'. The chatbot is currently a text-messaging program, but the hope is to develop a version that includes a Siri-like voice.