
Tuesday, October 04, 2016

Neural Network Zoo

From the Asimov Institute.  This reminds me of work we also did with neural nets, but with far fewer choices of architecture.  Our chosen architectures depended only on the inputs and outputs, and then we varied the number of levels.  It was also rare to have as much data as is now common.  It's a good time to go back to these architectures and how they link to solution requirements, a kind of library.  I would like to see an animated simulation of their use and training for alternative kinds of applications.
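As a minimal sketch of that older style of architecture choice, the snippet below builds a plain feedforward net whose shape is fixed by the number of inputs and outputs, with only the count of hidden levels varied. The function name, the hidden width of 16, and the tanh activation are illustrative assumptions, not anything from the Zoo post.

```python
import numpy as np

def build_mlp(n_inputs, n_outputs, n_levels, hidden_width=16, seed=0):
    """Weight matrices for a feedforward net whose architecture is determined
    only by inputs/outputs, with the number of hidden levels as the free knob."""
    rng = np.random.default_rng(seed)
    sizes = [n_inputs] + [hidden_width] * n_levels + [n_outputs]
    return [rng.normal(scale=0.1, size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def forward(weights, x):
    """Forward pass: tanh on hidden levels, linear output layer."""
    for w in weights[:-1]:
        x = np.tanh(x @ w)
    return x @ weights[-1]

# Same input/output contract, shallow vs. deeper variants.
for levels in (1, 2, 3):
    net = build_mlp(n_inputs=4, n_outputs=2, n_levels=levels)
    print(levels, "levels ->", forward(net, np.ones(4)).shape)
```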

The Neural Network Zoo: 
" ... With new neural network architectures popping up every now and then, it’s hard to keep track of them all. Knowing all the abbreviations being thrown around (DCIGN, BiLSTM, DCGAN, anyone?) can be a bit overwhelming at first.

So I decided to compose a cheat sheet containing many of those architectures. Most of these are neural networks, some are completely different beasts. Though all of these architectures are presented as novel and unique, when I drew the node structures… their underlying relations started to make more sense.   by Fjodor Van Veen  ...  " 
