Monday, September 07, 2020

Using Sound to Classify

I have always promoted the idea of using sound sensors, and sound delivery, to activate contextual spaces. Here Carnegie Mellon explores this, calibrating it for sounds of action. We were more interested in how human behavior could change and how perception was influenced; here it's about robotics, and in particular about classification.

Sounds of Action: Using Ears, Not Just Eyes, Improves Robot Perception
Carnegie Mellon University
Byron Spice
August 14, 2020

Carnegie Mellon University researchers have conducted the first large-scale study of interactions between sound and robotic action to determine if sounds could help robots distinguish between objects and identify specific sound-causing actions. The team compiled a dataset from simultaneous video and audio recordings of 60 common objects as they slid or rolled around a tray attached to a robot arm and crashed into its sides, cataloging 15,000 interactions in all. The researchers also collected data by having the robot arm push objects along a surface. They learned, for example, that a robot could use knowledge gleaned from the sound of one set of objects to predict the physical properties of previously unseen objects. Robots that used sound were able to successfully classify objects 76% of the time.
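The classification idea described above can be illustrated with a minimal sketch: extract a spectral feature from each recording, average the features per object class, and assign a new sound to the nearest class centroid. This is not the CMU team's actual pipeline; the object names, resonant frequencies, synthetic "impact" sounds, and nearest-centroid classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000          # sample rate in Hz (assumed)
DURATION = 0.5     # seconds per recording

def synth_impact(freq):
    """Synthesize a toy 'impact' sound: a decaying sinusoid plus noise."""
    t = np.arange(int(SR * DURATION)) / SR
    tone = np.sin(2 * np.pi * freq * t) * np.exp(-8 * t)
    return tone + 0.05 * rng.standard_normal(t.size)

def spectral_feature(audio):
    """Log-magnitude spectrum: a crude stand-in for learned audio features."""
    return np.log1p(np.abs(np.fft.rfft(audio)))

# Two hypothetical object classes with different resonant frequencies.
classes = {"metal_cup": 1200.0, "wood_block": 300.0}

# "Train": average spectral features over a few recordings per class.
centroids = {
    name: np.mean([spectral_feature(synth_impact(f)) for _ in range(10)], axis=0)
    for name, f in classes.items()
}

def classify(audio):
    """Assign a recording to the class with the nearest feature centroid."""
    feat = spectral_feature(audio)
    return min(centroids, key=lambda name: np.linalg.norm(feat - centroids[name]))

print(classify(synth_impact(1200.0)))
print(classify(synth_impact(300.0)))
```

In the study, features would come from real microphone recordings synchronized with the robot arm's actions, and the classifier would be learned rather than a hand-built centroid rule; the sketch only shows why distinct objects with distinct acoustic signatures are separable at all.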
