
Monday, April 19, 2021

The Army Studies Real-Time Conversation

A considerable, interesting piece on the topic. It leads to the open question of how we will converse with our robot assistants, and on to other kinds of hybrid, cooperative work, with AI providing useful information about the meaning of statements and commands in context. Expect much more in this space in the coming years.

Army researchers create pioneering approach to real-time conversational AI, by the Army Research Laboratory, in TechXplore.

Spoken dialogue is the most natural way for people to interact with complex autonomous agents such as robots. Future Army operational environments will require technology that allows artificially intelligent agents to understand and carry out commands and to interact with Soldiers as teammates.

Researchers from the U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory and the University of Southern California's Institute for Creative Technologies, a Department of Defense-sponsored University Affiliated Research Center, created an approach to flexibly interpret and respond to Soldier intent derived from spoken dialogue with autonomous systems.

This technology is currently the primary component for dialogue processing for the lab's Joint Understanding and Dialogue Interface, or JUDI, system, a prototype that enables bi-directional conversational interactions between Soldiers and autonomous systems.

"We employed a statistical classification technique for enabling conversational AI using state-of-the-art natural language understanding and dialogue management technologies," said Army researcher Dr. Felix Gervits. "The statistical language classifier enables autonomous systems to interpret the intent of a Soldier by recognizing the purpose of the communication and performing actions to realize the underlying intent."

For example, he said, if a robot receives a command to "turn 45 degrees and send a picture," it could interpret the instruction and carry out the task.
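Before any statistical interpretation, a compound command like this has to be decomposed into individual intents and their arguments. A minimal rule-based sketch of that decomposition (the intent names and splitting heuristic are illustrative assumptions, not JUDI's actual pipeline):

```python
import re

def parse_command(utterance):
    """Split a compound spoken command into (intent, argument) pairs.

    Hypothetical intents: 'rotate' (with a degree argument) and
    'capture_image'; anything unrecognized is tagged 'unknown'.
    """
    intents = []
    # Assume sub-commands are joined by the word "and"
    for clause in re.split(r"\band\b", utterance.lower()):
        clause = clause.strip()
        m = re.search(r"turn\s+(-?\d+)\s*degrees", clause)
        if m:
            intents.append(("rotate", int(m.group(1))))
        elif "picture" in clause or "photo" in clause:
            intents.append(("capture_image", None))
        else:
            intents.append(("unknown", clause))
    return intents

print(parse_command("turn 45 degrees and send a picture"))
# [('rotate', 45), ('capture_image', None)]
```

A rule-based parser like this is brittle, which is precisely the motivation for the statistical approach the researchers describe next.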

To achieve this, the researchers trained their classifier on a labeled data set of human-robot dialogue generated during a collaborative search-and-rescue task. The classifier learned a mapping of verbal commands to responses and actions, allowing it to apply this knowledge to new commands and respond appropriately.
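The training setup described above, labeled command-to-intent pairs used to generalize to unseen commands, can be sketched with a simple bag-of-words Naive Bayes classifier. This is a generic stand-in for the article's unspecified statistical classification technique; the training pairs and intent labels are invented for illustration:

```python
import math
from collections import Counter, defaultdict

class IntentClassifier:
    """Multinomial Naive Bayes over bag-of-words features, mapping
    verbal commands to intent labels (a simplified stand-in for the
    statistical language classifier described in the article)."""

    def fit(self, commands, labels):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies
        self.label_counts = Counter(labels)
        self.vocab = set()
        for text, label in zip(commands, labels):
            words = text.lower().split()
            self.word_counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, command):
        words = command.lower().split()
        best_label, best_score = None, float("-inf")
        total = sum(self.label_counts.values())
        for label, count in self.label_counts.items():
            # Log prior plus Laplace-smoothed log likelihoods
            score = math.log(count / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in words:
                score += math.log((self.word_counts[label][w] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical labeled pairs in the style of a search-and-rescue task
data = [
    ("move forward two meters", "drive"),
    ("drive to the doorway", "drive"),
    ("turn left ninety degrees", "rotate"),
    ("rotate to face the window", "rotate"),
    ("send a picture", "capture_image"),
    ("take a photo of the room", "capture_image"),
]
clf = IntentClassifier().fit(*zip(*data))
print(clf.predict("take a picture of the doorway"))  # capture_image
```

The point of the statistical mapping is visible in the last line: "take a picture of the doorway" appears nowhere in the training data, yet word overlap with the labeled examples lets the classifier recover the underlying intent.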
