A reminder that any data produced by a system's interaction with its environment can be probed for patterns, and can thus yield learnable details.
"This AI Uses Echolocation to Identify What You're Doing," in Wired, by Sophia Chen:
GUO XINHUA WANTS to teach computers to echolocate. He and his colleagues have built a device, about the size of a thin laptop, that emits sound at frequencies 10 times higher than the shrillest note a piccolo can sustain. The pitches it produces are inaudible to the human ear. When Guo’s team aims the device at a person and fires an ultrasonic pitch, the gadget listens for the echo using its hundreds of embedded microphones. Then, employing artificial intelligence techniques, his team tries to decipher what the person is doing from the reflected sound alone.
The technology is still in its infancy, but they’ve achieved some promising initial results. Based at the Wuhan University of Technology, in China, Guo’s team has tested its microphone array on four different college students and found that they can identify whether the person is sitting, standing, walking, or falling, with complete accuracy, they report in a paper published today in Applied Physics Letters. While they still need to test that the technique works on more people, and that it can identify a broader range of behaviors, this demonstration hints at a new technology for surveilling human behavior. ...
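The article doesn't detail the team's model, but the pipeline it describes (emit an ultrasonic tone, record the reflected sound, and hand the echo to a learned classifier) can be sketched. The Python below is a toy illustration only, not the Wuhan team's method: the sample rate, 40 kHz carrier, echo model, spectrogram features, and SVM classifier are all assumptions, and the "echoes" here are synthetic.

```python
# Toy sketch (not the published method): classify simulated ultrasonic echoes
# into activity classes by turning each echo into a spectrogram and training
# an off-the-shelf classifier. All signal parameters are assumptions.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

FS = 192_000        # assumed sample rate, high enough for ~40 kHz ultrasound
CARRIER_HZ = 40_000 # assumed carrier, roughly 10x a piccolo's top register
DURATION = 0.05     # 50 ms echo window (assumption)
CLASSES = ["sitting", "standing", "walking", "falling"]

def synth_echo(label_idx, rng):
    """Simulate an echo: a delayed, slightly Doppler-shifted copy of the
    emitted tone plus noise. Each class gets a different delay/Doppler
    profile so the toy data has learnable structure; real echoes are far
    more complex."""
    t = np.arange(int(FS * DURATION)) / FS
    delay = 0.002 + 0.003 * label_idx + rng.normal(0, 2e-4)   # seconds
    doppler = 1.0 + 0.001 * label_idx + rng.normal(0, 1e-4)   # frequency scale
    echo = np.zeros_like(t)
    mask = t > delay
    echo[mask] = np.sin(2 * np.pi * CARRIER_HZ * doppler * (t[mask] - delay))
    return echo + rng.normal(0, 0.3, size=t.shape)            # sensor noise

def features(signal):
    """Log-magnitude spectrogram flattened into a feature vector."""
    _, _, sxx = spectrogram(signal, fs=FS, nperseg=256, noverlap=128)
    return np.log1p(sxx).ravel()

rng = np.random.default_rng(0)
X = np.array([features(synth_echo(i % 4, rng)) for i in range(400)])
y = np.array([CLASSES[i % 4] for i in range(400)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy on the toy data:", clf.score(X_te, y_te))
```

The real system presumably feeds far richer input, hundreds of microphone channels rather than one synthetic trace, into the same kind of learning step, but the shape of the pipeline is the same: reflected sound in, activity label out.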
Tuesday, May 28, 2019