
Wednesday, July 05, 2017

Overview of Recurrent Neural Networks

An overview of recurrent neural networks and a tour of the different types used for deep learning applications.  Note in particular problems that require sequencing, such as forecasting.  Once again, nicely done by Jason Brownlee.  Subscribe to his work and look at his other writings and services.

A Tour of Recurrent Neural Network Algorithms for Deep Learning, by Jason Brownlee

Recurrent neural networks, or RNNs, are a type of artificial neural network that adds extra weights to the network to create cycles in the network graph, in an effort to maintain an internal state.

The promise of adding state to neural networks is that they will be able to explicitly learn and exploit context in sequence prediction problems, such as problems with an order or temporal component.
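To make the idea of a cycle that carries state concrete, here is a minimal sketch of an Elman-style recurrent update in plain NumPy.  This is an illustrative assumption on my part, not code from the post; the names, sizes, and random data are arbitrary.  The key point is that the hidden state h is fed back in at every time step, which is the internal state the cycle maintains.

import numpy as np

# Illustrative sketch of a basic recurrent update (not from the post).
def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state depends on the current input and the previous
    # hidden state -- this dependence on h_prev is the cycle in the graph.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5

W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)                    # internal state carried across steps
for x_t in rng.normal(size=(seq_len, input_size)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)    # state persists from step to step

In a real sequence-prediction setting the weights would be learned by backpropagation through time rather than left at random values; the loop above only shows how context from earlier steps flows into later ones.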

In this post, you are going to take a tour of recurrent neural networks used for deep learning.   ....

How top recurrent neural networks used for deep learning work, such as LSTMs, GRUs, and NTMs.
How top RNNs relate to the broader study of recurrence in artificial neural networks.
How research in RNNs has led to state-of-the-art performance on a range of challenging problems.

Note, we’re not going to cover every possible recurrent neural network. Instead, we will focus on recurrent neural networks used for deep learning (LSTMs, GRUs and NTMs) and the context needed to understand them.
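As a rough picture of how these cells are used in practice, here is a hedged sketch assuming TensorFlow/Keras (the post itself does not prescribe a framework); the shapes and layer sizes are placeholders.  Swapping the LSTM layer for a GRU or a simple RNN changes only the cell, not the overall setup, while NTMs are a research architecture without a comparable one-line layer.

import tensorflow as tf

# Illustrative sequence model (assumed framework and sizes, not from the post).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 4)),  # 10 time steps, 4 features per step
    tf.keras.layers.LSTM(32),              # could also be GRU(32) or SimpleRNN(32)
    tf.keras.layers.Dense(1),              # e.g. a one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")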

Let’s get started.  ....
