Another way to make order matter in a neural network is to give the network some kind of memory. So far, all of our networks have performed each forward pass with no memory of any previous pass. It's time to change that with a recurrent neural network (RNN):

Figure: The scheme of an RNN
RNNs contain recurrent layers. Recurrent layers can remember their last activation and use it as their own input:

A recurrent layer takes a sequence as input. For each element, it computes a matrix multiplication (W * in), just like a Dense layer, and runs the result through an activation function, such as relu. It then retains its own activation. When the next item of the sequence arrives, it performs the matrix multiplication as before, but this time it also multiplies its previous activation with a second matrix (U * state). The recurrent layer adds the results of both operations together and passes the sum through the activation function again.
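To make the recurrence concrete, here is a minimal NumPy sketch of a single recurrent layer processing a sequence step by step. The weight names W and U, the dimensions, and the random data are illustrative assumptions, not taken from the text:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# Illustrative dimensions (assumptions): 3 input features, 4 hidden units
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # input-to-hidden weights, used in (W * in)
U = rng.standard_normal((4, 4))   # hidden-to-hidden weights, used in (U * state)

def recurrent_layer(sequence):
    state = np.zeros(4)           # the layer's retained activation
    for x in sequence:
        # Add the new input's contribution to the previous activation's
        # contribution, then run the sum through the activation function again
        state = relu(W @ x + U @ state)
    return state

# Example: a toy sequence of 5 timesteps with 3 features each
sequence = rng.standard_normal((5, 3))
print(recurrent_layer(sequence))
```

Because the same W and U are reused at every timestep, the layer's output depends on the order of the elements, which is exactly the memory effect described above.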
In Keras, we can use a simple RNN as follows:
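The original code listing is not reproduced here; the following is a minimal sketch of how Keras's SimpleRNN layer is typically used in a Sequential model. The unit count, input shape, output layer, and compile settings are illustrative assumptions:

```python
from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

model = Sequential()
# 16 recurrent units; input is a sequence of any length with 3 features
# per timestep (both numbers are illustrative assumptions)
model.add(SimpleRNN(16, activation='relu', input_shape=(None, 3)))
model.add(Dense(1))  # e.g. a single regression output

model.compile(optimizer='adam', loss='mse')
model.summary()
```

By default, SimpleRNN returns only its final activation; passing return_sequences=True would make it return the activation at every timestep instead, which is needed when stacking recurrent layers.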