I was late to learn about this, but here is a post from Microsoft Research about an aborted chatbot called Tay, which operated on Twitter and apparently turned abusive. There is no mention here of MS's much better-known bot Cortana. You do wonder whether any kind of chatbot that attempts casual chatter might be prone to this. Our own experiments were closely examined by lawyers.
" ... For context, Tay was not the first artificial intelligence application we released into the online social world. In China, our XiaoIce chatbot is being used by some 40 million people, delighting with its stories and conversations. The great experience with XiaoIce led us to wonder: Would an AI like this be just as captivating in a radically different cultural environment? Tay – a chatbot created for 18- to 24- year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question. ... "
Tuesday, September 20, 2016