This chapter covers advanced techniques for recurrent neural networks.
The techniques seen in Chapter 2, Classifying Handwritten Digits with a Feedforward Network, such as going deeper with more layers or adding a dropout layer, are more challenging to apply to recurrent networks and call for some new design principles.
Since adding new layers aggravates the vanishing/exploding gradient problem, a new family of techniques based on identity connections, as in Chapter 7, Classifying Images with Residual Networks, has proved to deliver state-of-the-art results.
The topics covered are:
Variational RNN
Stacked RNN
Deep Transition RNN
Highway connections and their application to RNN
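As a first taste of the identity connections mentioned above, the sketch below shows a single highway transition step in NumPy: a transform gate T blends a candidate transformation with an identity (carry) path, so the state, and hence the gradient, can pass through almost unchanged. The function and variable names here are illustrative, not from any particular library; initializing the transform-gate bias to a large negative value is a common trick that keeps the step close to the identity at the start of training.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_step(h, W_H, W_T, b_H, b_T):
    """One highway transition on state h.

    T gates between a transformed candidate H and the untouched
    input h (the carry path is weighted by 1 - T), which lets
    gradients flow through the identity branch.
    """
    H = np.tanh(h @ W_H + b_H)      # candidate transformation
    T = sigmoid(h @ W_T + b_T)      # transform gate in (0, 1)
    return T * H + (1.0 - T) * h    # gated mix of transform and carry

rng = np.random.default_rng(0)
d = 4
h = rng.standard_normal(d)
W_H = 0.1 * rng.standard_normal((d, d))
W_T = 0.1 * rng.standard_normal((d, d))

# A strongly negative transform bias keeps T near 0, so the step
# initially behaves almost like the identity function.
h_next = highway_step(h, W_H, W_T, np.zeros(d), -3.0 * np.ones(d))
print(h_next)
```

Stacking many such steps inside a recurrent transition is the idea behind the deep transition and highway RNN variants discussed later in this chapter.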