
Tuesday, June 12, 2018

Artificial Synapses

Synapses are where the learning takes place.  Energy efficiency is good; learning faster and more usably is better still.

AI Could Get 100 Times More Energy-Efficient with IBM's New Artificial Synapses  By Technology Review 

Neural networks are the crown jewel of the AI boom. They gorge on data and do things like transcribe speech or describe images with near-perfect accuracy (see "10 breakthrough technologies 2013: Deep learning").... 

From Technology Review

" ... The catch is that neural nets, which are modeled loosely on the structure of the human brain, are typically constructed in software rather than hardware, and the software runs on conventional computer chips. That slows things down.

IBM has now shown that building key features of a neural net directly in silicon can make it 100 times more efficient. Chips built this way might turbocharge machine learning in coming years.

The IBM chip, like a neural net written in software, mimics the synapses that connect individual neurons in a brain. The strength of these synaptic connections needs to be tuned in order for the network to learn. In a living brain, this happens in the form of connections growing or withering over time. That is easy to reproduce in software but has proved infuriatingly difficult to achieve with hardware, until now. .... 

 ....  The IBM researchers demonstrate the microelectronic synapses in a research paper published in the journal Nature. Their approach takes inspiration from neuroscience by using two types of synapses: short-term ones for computation and long-term ones for memory. This method “addresses a few key issues,” most notably low accuracy, that have bedeviled previous efforts to build artificial neural networks in silicon, says Michael Schneider, a researcher at the National Institute of Standards and Technology who is researching neurologically inspired computer hardware. .... " 
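As a rough illustration of the two-timescale idea described above, a synapse can be modeled in software as a fast short-term weight that absorbs frequent learning updates plus a slow long-term weight that is consolidated periodically. This is a toy sketch only, not the circuit or algorithm from the Nature paper; the class name, constants, and update rule below are illustrative assumptions.

# Toy sketch (not IBM's implementation): one synaptic weight split into a
# volatile "short-term" component used during computation and a stable
# "long-term" component that stores consolidated memory.

class TwoTimescaleSynapse:
    def __init__(self, consolidation_rate=0.1):
        self.w_short = 0.0                  # updated on every training step
        self.w_long = 0.0                   # updated only during consolidation
        self.consolidation_rate = consolidation_rate

    def effective_weight(self):
        # The weight actually used for computation combines both components.
        return self.w_long + self.w_short

    def apply_update(self, delta):
        # Frequent, small learning updates land on the short-term component.
        self.w_short += delta

    def consolidate(self):
        # Occasionally transfer part of the short-term change into long-term
        # storage, mimicking slow, durable memory formation.
        transfer = self.consolidation_rate * self.w_short
        self.w_long += transfer
        self.w_short -= transfer


if __name__ == "__main__":
    syn = TwoTimescaleSynapse()
    for step in range(100):
        syn.apply_update(0.01)              # pretend gradient-style updates
        if step % 10 == 9:
            syn.consolidate()               # periodic consolidation
    print(f"short-term={syn.w_short:.3f}, long-term={syn.w_long:.3f}, "
          f"effective={syn.effective_weight():.3f}")

The split mirrors the quoted description: the short-term component handles the rapid adjustments needed during computation, while the long-term component accumulates a durable record of what has been learned.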
