
Tuesday, November 22, 2022

A Classic Difficult Problem for Robotic AI

If only we had seen an early demo of this.

Robots that Can Feel Cloth Layers May One Day Help with Laundry

By Stacey Federoff, Carnegie Mellon

New research from Carnegie Mellon University's Robotics Institute (RI) can help robots feel layers of cloth rather than relying on computer vision tools alone to see them. The work could allow robots to assist people with household tasks like folding laundry.

Humans use their senses of sight and touch to grab a glass or pick up a piece of cloth. It is so routine that little thought goes into it. For robots, however, these tasks are extremely difficult. The amount of data gathered through touch is hard to quantify and the sense has been hard to simulate in robotics — until recently. 

"Humans look at something, we reach for it, then we use touch to make sure that we're in the right position to grab it," said David Held, an assistant professor in the School of Computer Science and head of the Robots Perceiving and Doing (R-PAD) Lab. "A lot of the tactile sensing humans do is natural to us. We don't think that much about it, so we don't realize how valuable it is."

For example, to fold laundry, robots need a sensor to mimic the way a human's fingers can feel the top layer of a towel or shirt and grasp the layers beneath it. Researchers could teach a robot to feel the top layer of cloth and grasp it, but without the robot sensing the other layers of cloth, the robot would only ever grab the top layer and never successfully fold the cloth.

"How do we fix this?" Held asked. "Well, maybe what we need is tactile sensing."

ReSkin, developed by researchers at Carnegie Mellon and Meta AI, was the ideal solution. The open-source touch-sensing "skin" is made of a thin, elastic polymer embedded with magnetic particles to measure three-axis tactile signals. In a recent paper, researchers used ReSkin to help the robot feel layers of cloth rather than relying on its vision sensors to see them.
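The hardware idea is simple to sketch: the magnetometers under the elastic skin report a three-axis magnetic field, and contact shows up as a deviation from the field at rest. Below is a minimal Python sketch of that readout loop. Note that read_reskin(), the magnetometer count, and the threshold are illustrative stand-ins, not the real ReSkin driver API.

    import numpy as np

    NUM_MAGNETOMETERS = 5  # assumption: a small grid of magnetometers under the skin

    def read_reskin():
        """Hypothetical driver call: returns an (N, 3) array of (Bx, By, Bz)
        field readings, one row per magnetometer. The real ReSkin driver's
        interface may differ; this is a stand-in for illustration."""
        raise NotImplementedError("replace with the actual sensor driver")

    def capture_baseline(num_samples=100):
        """Average readings while nothing touches the skin, so later
        presses can be measured as deviations from rest."""
        return np.mean([read_reskin() for _ in range(num_samples)], axis=0)

    def tactile_signal(baseline):
        """Per-magnetometer deviation from rest. Pressing or shearing the
        elastic skin displaces the embedded magnetic particles, which
        shows up as a change in the measured field."""
        return read_reskin() - baseline

    def in_contact(baseline, threshold=5.0):
        """Crude contact test: any magnetometer whose field has shifted
        by more than `threshold` (units depend on the driver) is touched."""
        deltas = np.linalg.norm(tactile_signal(baseline), axis=1)
        return bool((deltas > threshold).any())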

"By reading the changes in the magnetic fields from depressions or movement of the skin, we can achieve tactile sensing," said Thomas Weng, a Ph.D. student in the R-PAD Lab, who worked on the project with RI postdoctoral fellow Daniel Seita and graduate student Sashank Tirumala. "We can use this tactile sensing to determine how many layers of cloth we've picked up by pinching with the sensor."

Other research has used tactile sensing to grasp rigid objects, but cloth is deformable, meaning it changes shape when touched, which makes the task even more difficult. Adjusting the robot's grasp on the cloth changes both its pose and the sensor readings. ...
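That coupling between grasp and signal suggests a feedback loop: pinch, read the skin, reclassify, adjust. The sketch below shows such a loop under the same assumptions as the snippets above; the gripper object and its pinch/release methods are invented for illustration.

    def grasp_n_layers(gripper, clf, baseline, target_layers, max_attempts=10):
        """Iteratively adjust the pinch depth until the classifier reports
        the desired number of cloth layers. `gripper` is a hypothetical
        interface with pinch(depth) and release(); a real control stack
        will look different."""
        depth = 1.0  # illustrative initial pinch depth, in millimeters
        for _ in range(max_attempts):
            gripper.pinch(depth)
            window = [tactile_signal(baseline) for _ in range(20)]  # short burst
            grabbed = layers_grasped(clf, window)
            if grabbed == target_layers:
                return True
            gripper.release()
            # Too few layers: pinch deeper; too many: pinch shallower.
            depth += 0.5 if grabbed < target_layers else -0.5
        return False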
