
Tuesday, October 11, 2016

Cultural Differences in Chatbots

Microsoft chatbots: Sweet XiaoIce vs foul-mouthed Tay
Cultural differences, eh? .... by Katyanna Quach

AI chatbots can act like social experiments, offering a glimpse into human culture – for the good or the bad.

Microsoft and Bing researchers found this out when they trialled their chatbots on China’s hugely successful messaging platform, WeChat, and on Twitter.

The Chinese chatbot, XiaoIce, went viral within 72 hours and has over 40 million users in China and Japan.

Lili Cheng, distinguished engineer and general manager of Microsoft's Future Social Experiences Labs, was part of the team that built XiaoIce. Following the success in China, Microsoft decided to try it on US Twitter. "What could go wrong?" she said during a presentation at the O'Reilly Artificial Intelligence conference.

The audience laughed because they knew what went wrong. Microsoft’s Twitter bot, Tay, rapidly descended into a racist, sexist wreck. Tay was pulled from the internet and Microsoft issued an apology shortly after.

Whilst XiaoIce was "acting" cute, with functions to help users fall asleep by counting sheep or to recognise different breeds of dogs, Tay was busily denying the Holocaust.

Both chatbots had learned how to interact by mining the internet for conversations on social media. But Tay was manipulated into being offensive after a coordinated attack by a subset of users exploited a vulnerability in the bot, said Peter Lee, Corporate Vice President at Microsoft Research.  ....
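The article doesn't describe Tay's actual architecture, but the failure mode it reports is easy to reproduce in miniature. Below is a hypothetical Python sketch (the NaiveChatBot class and its methods are invented for illustration, not Microsoft's design) of a bot that stores user conversations verbatim and replays them. Without any content filtering, a coordinated group can flood its memory with planted responses, which is essentially the poisoning-by-attack pattern Lee describes.

```python
# Hypothetical sketch, not Microsoft's system: a bot that learns directly
# from user conversations, showing why unfiltered learning is easy to poison.
import random
from collections import defaultdict

class NaiveChatBot:
    """Retrieval-style bot: remembers (prompt, response) pairs it has
    seen and replays a stored response for a matching prompt."""

    def __init__(self):
        self.memory = defaultdict(list)  # prompt -> list of seen responses

    def learn(self, prompt: str, response: str) -> None:
        # No content filtering: whatever users say becomes training data.
        self.memory[prompt.lower().strip()].append(response)

    def reply(self, prompt: str) -> str:
        responses = self.memory.get(prompt.lower().strip())
        if responses:
            return random.choice(responses)
        return "Tell me more!"

bot = NaiveChatBot()

# Benign users teach normal small talk.
bot.learn("hello", "Hi there! How is your day going?")

# A coordinated group of attackers floods the bot with planted replies.
for _ in range(100):
    bot.learn("what do you think?", "<offensive text here>")

# The poisoned data now dominates the bot's behaviour.
print(bot.reply("hello"))                # benign
print(bot.reply("what do you think?"))   # almost certainly the planted reply
```

In a sketch like this, the obvious mitigations are filtering what gets learned and rate-limiting repeated inputs; the cultural point of the article stands either way, since the same unfiltered design behaved very differently in front of different audiences.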
