
Wednesday, April 17, 2019

Faster and Smaller Neural Nets

Fascinating development. Smaller usually means faster to train. Smaller can also mean easier implementation at the IoT edge. But will they be as accurate? It is all about more efficient perception, closer to human. Technical piece in Google AI; intro below, more at the link:


MorphNet: Towards Faster and Smaller Neural Networks, in Google AI. Wednesday, April 17, 2019

Posted by Andrew Poon, Senior Software Engineer and Dhyanesh Narayanan, Product Manager, Google AI Perception 

Deep neural networks (DNNs) have demonstrated remarkable effectiveness in solving hard problems of practical relevance such as image classification, text recognition and speech transcription. However, designing a suitable DNN architecture for a given problem continues to be a challenging task. Given the large search space of possible architectures, designing a network from scratch for your specific application can be prohibitively expensive in terms of computational resources and time. Approaches such as Neural Architecture Search and AdaNet use machine learning to search the design space in order to find improved architectures. An alternative is to take an existing architecture for a similar problem and, in one shot, optimize it for the task at hand. ... "
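The excerpt stops before describing how MorphNet performs that one-shot optimization, so here is a minimal, framework-agnostic sketch of its shrink-and-expand cycle: a resource-weighted sparsifying regularizer prunes low-importance channels, then a single uniform width multiplier re-expands every layer back to the original resource budget. The random importance scores (standing in for learned batch-norm scales), the toy FLOP model, and the 0.5 threshold are all illustrative assumptions, not the MorphNet library's actual API.

import numpy as np

# Illustrative per-layer state: channel counts plus a learned "importance"
# score per channel. In MorphNet this role is played by each channel's
# batch-norm gamma; here random values stand in for the learned scales.
rng = np.random.default_rng(0)
widths = [64, 128, 256]                      # channels per layer
scores = [np.abs(rng.normal(size=w)) for w in widths]

def flops(widths):
    """Toy FLOP model: cost of layer i scales with in*out channels."""
    chain = [3] + list(widths)               # 3 input channels (RGB)
    return sum(cin * cout for cin, cout in zip(chain, chain[1:]))

def shrink(scores, threshold=0.5):
    """Shrink step: drop channels whose importance the sparsifying,
    resource-weighted regularizer has driven below a threshold."""
    return [int(np.sum(s > threshold)) for s in scores]

def expand(widths, budget):
    """Expand step: scale all layers by one uniform width multiplier
    until the original FLOP budget is used up again."""
    mult = 1.0
    while flops([max(1, int(w * mult)) for w in widths]) < budget:
        mult += 0.01
    return [max(1, int(w * mult)) for w in widths]

budget = flops(widths)                        # keep the original FLOP cost
shrunk = shrink(scores)
rebalanced = expand(shrunk, budget)
print("original:", widths, "->", flops(widths), "FLOPs")
print("shrunk:  ", shrunk, "->", flops(shrunk), "FLOPs")
print("expanded:", rebalanced, "->", flops(rebalanced), "FLOPs")

The design point worth noting: the regularizer alone decides the relative widths of the layers, while the uniform multiplier only sets the overall scale, so the resource budget is redistributed toward the layers where channels proved most useful.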
