
Monday, December 30, 2019

Reconstructing Spoken Words from Nonhuman Brains

A fascinating challenge, with implications for the future understanding of animal brains and human speech perception.

Researchers Reconstruct Spoken Words as Processed in Nonhuman Primate Brains
Brown University
By Kevin Stacey

Researchers at Brown University used a brain-computer interface to reconstruct English words from neural signals recorded in the brains of rhesus macaque monkeys. The researchers recorded neural activity while the primates listened to recordings of individual one- or two-syllable English words and macaque calls. The team processed the neural recordings using algorithms designed to recognize neural patterns associated with particular words; the neural data was then translated into computer-generated speech. The research showed that recurrent neural networks produced the highest-fidelity reconstructions among the algorithms tested. Brown's Arto Nurmikko said, "The same microelectrodes we used to record neural activity in this study may one day be used to deliver small amounts of electrical current in patterns that give people the perception of having heard specific sounds."
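To make the decoding pipeline concrete, here is a minimal sketch of the general idea: a recurrent network maps a time sequence of neural firing-rate vectors to audio spectrogram frames, which could then be resynthesized into speech. This is not the authors' model; every dimension, weight, and name below is an illustrative assumption.

```python
import numpy as np

# Hypothetical sketch of recurrent neural decoding (not the study's
# actual architecture or parameters): map a (T, channels) sequence of
# neural firing rates to a (T, freq_bins) spectrogram estimate.

rng = np.random.default_rng(0)

N_CHANNELS = 96  # assumed microelectrode channel count
HIDDEN = 64      # assumed hidden-state size
N_FREQ = 40      # assumed spectrogram frequency bins

# Randomly initialized weights stand in for a trained network.
W_in = rng.normal(0, 0.1, (HIDDEN, N_CHANNELS))
W_rec = rng.normal(0, 0.1, (HIDDEN, HIDDEN))
W_out = rng.normal(0, 0.1, (N_FREQ, HIDDEN))

def decode(neural_seq):
    """Run an Elman-style recurrence over a (T, N_CHANNELS) sequence
    and return a (T, N_FREQ) spectrogram estimate."""
    h = np.zeros(HIDDEN)
    frames = []
    for x in neural_seq:
        # The hidden state carries context across time steps, which is
        # what lets recurrent models exploit temporal structure.
        h = np.tanh(W_in @ x + W_rec @ h)
        frames.append(W_out @ h)
    return np.array(frames)

# Fake neural recording: 50 time steps of Poisson-like firing rates.
spikes = rng.poisson(5.0, (50, N_CHANNELS)).astype(float)
spec = decode(spikes)
print(spec.shape)  # (50, 40)
```

In a real system the weights would be trained on paired neural recordings and audio, and the predicted spectrogram would be inverted back to a waveform; the sketch only shows why recurrence suits this task, namely that speech-evoked neural activity unfolds over time.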
