Been re-examining neural networks for time-series modeling and forecasting. In our long-ago modeling work with neural nets we had concluded they were not useful for this, but newer architectures, in particular recurrent neural networks (RNNs), make them worth another look. Here is an examination using TensorFlow:
Building Recurrent Neural Networks in Tensorflow
Posted by Ahmet Taspinar in DSC
Recurrent Neural Networks (RNNs) detect features in sequential data (e.g. time-series data). Examples of applications that can be built with RNNs are anomaly detection in time-series data, classification of ECG and EEG data, stock market prediction, speech recognition, sentiment analysis, etc.
This is done by unrolling the network into N copies of itself (if the data consists of N time steps).
In this way, the input data at the previous time steps t_{n-1}, t_{n-2}, ..., t_0 can be used when the data at time step t_n is evaluated. If the data at the previous time steps is correlated with the data at the current time step, these correlations are remembered; otherwise they are forgotten.
Because of this unrolling, the weights of the neural network are shared across all of the time steps, and the RNN can generalize beyond the example seen at the current time step and beyond the sequences seen in the training set. ...
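To make the unrolling and weight sharing concrete, here is a minimal TensorFlow sketch. It is not taken from Taspinar's article; the sequence length, state size, and variable names (W_x, W_h, b) are illustrative assumptions. One set of weights is created once and applied at every time step, so the N "copies" of the cell all share the same parameters:

import tensorflow as tf

# Minimal sketch of unrolling a simple RNN cell over N time steps.
# All names and sizes are illustrative assumptions, not the article's code.
num_steps, batch_size, input_dim, state_dim = 5, 4, 3, 8

# One set of weights, created once and reused at every time step.
W_x = tf.Variable(tf.random.normal([input_dim, state_dim]))
W_h = tf.Variable(tf.random.normal([state_dim, state_dim]))
b = tf.Variable(tf.zeros([state_dim]))

x = tf.random.normal([batch_size, num_steps, input_dim])  # toy input sequence
h = tf.zeros([batch_size, state_dim])                      # state at t_0

for t in range(num_steps):
    # The same W_x, W_h, b are applied at every step, so information from
    # t_{n-1}, t_{n-2}, ..., t_0 is carried forward through the state h.
    h = tf.tanh(tf.matmul(x[:, t, :], W_x) + tf.matmul(h, W_h) + b)

# h now summarizes the whole sequence and could feed a forecasting layer,
# e.g. tf.keras.layers.Dense(1), to predict the next value in a series.

In practice the same pattern is provided by built-in layers such as tf.keras.layers.SimpleRNN, but writing the loop out makes the weight sharing across the unrolled time steps explicit.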