
Thursday, May 23, 2019

Lifelong Learning with Nets?

A considerable challenge. True, we can gather more data over time to improve our learning, but my guess is that the network architectures will change as well. How will that change the way we use data to produce solutions? Animals and humans continue to learn, yet they also seem to have limits on what they can learn, based on the structure of knowledge and how it is presented. Below is an abstract; a number of technical link references appear in the article itself.

Lifelong Learning in Artificial Neural Networks, by Gary Anthes, Communications of the ACM, June 2019, Vol. 62 No. 6, Pages 13-15, DOI: 10.1145/3323685

Over the past decade, artificial intelligence (AI) based on machine learning has reached breakthrough levels of performance, often approaching and sometimes exceeding the abilities of human experts. Examples include image recognition, language translation, and performance in the game of Go.

These applications employ large artificial neural networks, in which nodes are linked by millions of weighted interconnections. They mimic the structure and workings of living brains, except in one key respect—they don't learn over time, as animals do. Once designed, programmed, and trained by developers, they do not adapt to new data or new tasks without being retrained, often a very time-consuming task.

Real-time adaptability by AI systems has become a hot topic in research. For example, computer scientists at Uber Technologies last year published a paper that describes a method for introducing "plasticity" in neural networks. In several test applications, including image recognition and maze exploration, the researchers showed that previously trained neural networks could adapt to new situations quickly and efficiently without undergoing additional training.

"The usual method with neural networks is to train them slowly, with many examples; in the millions or hundreds of millions," says Thomas Miconi, the lead author of the Uber paper and a computational neuroscientist at Uber. "But that's not the way we work. We learn fast, often from a single exposure, to a new situation or stimulus. With synaptic plasticity, the connections in our brains change automatically, allowing us to form memories very quickly.   .... "
