Saturday, January 11, 2020

People Too Trusting of Virtual Assistants

Another look at the general idea of creating a 'personality' for assistants. Anthropomorphic features such as name, speaking style, voice, and gender can influence how we perceive an assistant's trustworthiness, truthfulness, value, and uses. Of course, simply making a system look more human does not by itself create the required trust. Again, many issues of context matter here. Combining human and machine capabilities is now common practice.

People Too Trusting of Virtual Assistants
University of Waterloo News

Researchers at the University of Waterloo in Canada have found that people tend to share more with online agents because of their tendency to assign them personalities and physical features like age, facial expressions, and hairstyles. The researchers asked 10 men and 10 women to interact with three conversational agents—Alexa, Google Assistant, and Siri. The team then interviewed each participant to determine their perception of the agents' personalities and what they would look like, before asking each person to create an avatar for each agent. The researchers found Siri was most frequently described as disingenuous and cunning, while Alexa was perceived as genuine and caring. Said University of Waterloo researcher Anastasia Kuzminykh, "How an agent is perceived impacts how it's accepted and how people interact with it; how much people trust it, how much people talk to it, and the way people talk to it."