Monday, November 15, 2021

AI That Feels

A space we have examined: how products can connect with consumers.


AI systems with emotional intelligence could learn faster and be more helpful


IN THE PAST YEAR, have you found yourself under stress? Have you ever wished for help coping? Imagine if, throughout the pandemic, you'd had a virtual therapist powered by an artificial intelligence (AI) system, an entity that empathized with you and gradually got to know your moods and behaviors. Therapy is just one area where we think an AI system that can recognize and interpret emotions could offer great benefits to people.

Our team hails from Microsoft's Human Understanding and Empathy group, where our mission is to imbue technology with emotional intelligence. Why? With that quality, AI can better understand its users, more effectively communicate with them, and improve their interactions with technology. The effort to produce emotionally intelligent AI builds on work in psychology, neuroscience, human-computer interaction, linguistics, electrical engineering, and machine learning.

Lately, we've been considering how we could improve AI voice assistants such as Alexa and Siri, which many people now use as everyday aides. We anticipate that they'll soon be deployed in cars, hospitals, stores, schools, and more, where they'll enable more personalized and meaningful interactions with technology. But to achieve their potential, such voice assistants will require a major boost from the field of affective computing. That term, coined by MIT professor Rosalind W. Picard in a 1997 book by the same name, refers to technology that can sense, understand, and even simulate human emotions. Voice assistants that feature emotional intelligence should be more natural and efficient than those that do not.

Consider how such an AI agent could help a person who's feeling overwhelmed by stress. Currently, the best option might be to see a real human psychologist who, over a series of costly consultations, would discuss the situation and teach relevant stress-management skills. During the sessions, the therapist would continually evaluate the person's responses and use that information to shape what's discussed, adapting both content and presentation in an effort to ensure the best outcome.

While this treatment is arguably the best existing therapy, and while technology is still far from being able to replicate that experience, it's not ideal for some. For example, certain people feel uncomfortable discussing their feelings with therapists, and some find the process stigmatizing or time-consuming. An AI therapist could provide them with an alternative avenue for support, while also conducting more frequent and personalized assessments. One recent review article found that 1 billion people globally are affected by mental and addictive disorders; a scalable solution such as a virtual counselor could be a huge boon.

There's some evidence that people can feel more engaged and are more willing to disclose sensitive information when they're talking to a machine. Other research, however, has found that people seeking emotional support from an online platform prefer responses coming from humans to those from a machine, even when the content is the same. Clearly, we need more research in this area.

About 1 billion people globally are affected by mental disorders; a scalable solution such as an AI therapist could be a huge boon.

In any case, an AI therapist offers a key advantage: It would always be available. So it could provide crucial support at unexpected moments of crisis or take advantage of those times when a person is in the mood for more analytical talk. It could potentially gather much more information about the person's behavior than a human therapist could through sporadic sessions, and it could provide reminders to keep the person on track. And as the pandemic has greatly increased the adoption of telehealth methods, people may soon find it quite normal to get guidance from an agent on a computer or phone display.

For this kind of virtual therapist to be effective, though, it would require significant emotional intelligence. It would need to sense and understand the user's preferences and fluctuating emotional states so it could optimize its communication. Ideally, it would also simulate certain emotional responses to promote empathy and better motivate the person.
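To make the sense-then-adapt loop concrete, here is a minimal, purely illustrative sketch. The keyword lexicon and response strings are hypothetical stand-ins; real affective-computing systems infer emotional state from voice prosody, facial cues, and language using trained models, not word lists.

```python
# Toy sketch of an emotion-aware assistant loop (illustrative only).
# A hypothetical keyword lexicon stands in for the sensing step that a
# real system would perform with trained machine-learning models.

STRESS_WORDS = {"overwhelmed", "anxious", "stressed", "panicked"}
CALM_WORDS = {"fine", "relaxed", "calm", "okay"}

def detect_state(utterance: str) -> str:
    """Classify an utterance into a coarse emotional state."""
    words = set(utterance.lower().split())
    if words & STRESS_WORDS:
        return "stressed"
    if words & CALM_WORDS:
        return "calm"
    return "neutral"

def respond(utterance: str) -> str:
    """Adapt tone and content to the detected state."""
    state = detect_state(utterance)
    if state == "stressed":
        return "That sounds hard. Let's try a short breathing exercise."
    if state == "calm":
        return "Great to hear. Want to review this week's goals?"
    return "Tell me more about how you're feeling."

print(respond("I feel overwhelmed by work"))
```

The point of the sketch is the structure, not the classifier: sensing feeds a state estimate, and the state estimate shapes both what the agent says and how it says it.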

The virtual therapist is not a new invention. The very first example came about in the 1960s, when Joseph Weizenbaum of MIT wrote scripts for his ELIZA natural-language-processing program, which often repeated users' words back to them in a vastly simplified simulation of psychotherapy. A more serious effort in the 2000s at the University of Southern California's Institute for Creative Technologies produced SimSensei, a virtual human initially designed to counsel military personnel. Today, the most well-known example may be Woebot, a free chatbot that offers conversations based on cognitive behavioral therapy. But there's still a long way to go before we'll see AI systems that truly understand the complexities of human emotion.
