
Sunday, July 04, 2021

Channeling the Inner Voice of Robots

Is this like consciousness, or a cognitive infrastructure that feeds into our decisions and broader behavior in the way consciousness does? If it produces a better measure of decisions, why not. Consciousness as inner voice.

Channeling the Inner Voice of Robots  By Samuel Greengard, Commissioned by CACM Staff, June 29, 2021

Philosophers, psychologists, and neuroscientists have long studied how and why humans talk to themselves as they navigate tasks, manage decisions, and solve problems. "Inner speech is the silent conversation that most healthy human beings have with themselves," says Alain Morin, a professor in the Department of Psychology at Mount Royal University in Alberta, Canada.

Now, as robotics marches forward and devices increasingly rely on a combination of machine learning and conventional programming to navigate the physical world, researchers are exploring ways to imbue machines with an inner voice. This "self-consciousness" could provide insight into, and feedback on, how and why these systems make decisions, while helping them improve at various tasks.

The impact on service robots, computer speech systems, virtual assistants, and autonomous vehicles could be significant. "A cognitive architecture for inner speech may be the first step toward functional aspects of robot consciousness. It may represent the beginning of a new domain for human-robot interactions," explains Antonio Chella, professor of robotics and director of the RoboticsLab at the University of Palermo in Italy.

Robot Talk

Applying the principles of human speech and cognition to machines is a steep challenge.  "Consciousness is a very complex and fuzzy term," observes Angelo Cangelosi, professor of machine learning and robotics at the University of Manchester in the U.K. "While machines may not be aware in the way humans are aware, the idea of modeling speech characteristics to enrich interactions could deliver deeper insight into machine behavior."

The idea is taking shape. In April 2021, Chella and University of Palermo research fellow Arianna Pipitone equipped a robot named Pepper, from SoftBank Robotics, with a cognitive architecture that models inner speech. This allowed the robot to reason and interact at a deeper level, and to generate vocal feedback about how it arrived at answers and actions. This is possible because the parameters and attributes of the inner voice are different from those used for outward expression, the researchers note. ...
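The article does not describe the Palermo architecture in code, but the core idea, keeping a separate, reportable channel of self-directed reasoning alongside outward speech, can be sketched in a few lines. Everything below (class and method names, the toy fact lookup) is a hypothetical illustration, not the researchers' implementation:

```python
# Minimal sketch of an "inner speech" agent: it keeps a silent transcript
# of its reasoning steps, distinct from what it says aloud, and can
# vocalize that transcript when asked to explain itself.

class InnerSpeechAgent:
    def __init__(self):
        self.inner_log = []   # silent self-talk: intermediate reasoning
        self.outer_log = []   # outward speech: final answers only

    def _think(self, thought):
        self.inner_log.append(thought)

    def _say(self, utterance):
        self.outer_log.append(utterance)
        return utterance

    def answer(self, question, facts):
        """Answer a question from a toy fact table, narrating inwardly."""
        self._think(f"Heard question: {question!r}.")
        if question in facts:
            self._think(f"Found {question!r} in my knowledge base.")
            return self._say(facts[question])
        self._think(f"No fact matches {question!r}; I should admit that.")
        return self._say("I don't know.")

    def explain(self):
        """Replay the inner transcript as feedback on how the answer arose."""
        return " ".join(self.inner_log)


agent = InnerSpeechAgent()
facts = {"capital of Italy": "Rome"}
print(agent.answer("capital of Italy", facts))  # prints "Rome"
print(agent.explain())
```

The design point the sketch tries to capture is the one the researchers note above: the inner channel has its own parameters and contents, so explanation is a separate act of reporting on it rather than a replay of the outward speech.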
