
Long short-term memory — LSTM


The LSTM is a variant of the RNN that is capable of learning long-term dependencies. LSTMs were first proposed by Hochreiter and Schmidhuber and subsequently refined by many other researchers. They work well on a large variety of problems and are the most widely used type of RNN.

We have seen how the SimpleRNN uses the hidden state from the previous time step and the current input in a tanh layer to implement recurrence. LSTMs also implement recurrence in a similar way, but instead of a single tanh layer, there are four layers interacting in a very specific way. The following diagram illustrates the transformations that are applied to the hidden state at time step t:

The diagram looks complicated, but let us look at it component by component. The line across the top of the diagram is the cell state c, which represents the internal memory of the unit. The line across the bottom is the hidden state h, and the i, f, o, and g gates are the mechanism by which the LSTM works around the vanishing gradient problem.
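
Although the diagram itself is not reproduced here, the computation it describes can be sketched in a few lines of NumPy. This is a minimal sketch of the standard LSTM update for a single time step; the function and parameter names (lstm_step, W, U, b) are illustrative placeholders, with W holding the weights applied to the previous hidden state, U the weights applied to the current input, and b the biases for each gate:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # i, f, and o are the input, forget, and output gates; g is the
    # candidate state computed from the current input and the previous
    # hidden state
    i = sigmoid(np.dot(W['i'], h_prev) + np.dot(U['i'], x_t) + b['i'])
    f = sigmoid(np.dot(W['f'], h_prev) + np.dot(U['f'], x_t) + b['f'])
    o = sigmoid(np.dot(W['o'], h_prev) + np.dot(U['o'], x_t) + b['o'])
    g = np.tanh(np.dot(W['g'], h_prev) + np.dot(U['g'], x_t) + b['g'])
    # the forget gate controls how much of the old cell state is kept,
    # and the input gate controls how much of the candidate state is added
    c_t = (c_prev * f) + (g * i)
    # the output gate controls how much of the squashed cell state
    # becomes the new hidden state
    h_t = np.tanh(c_t) * o
    return h_t, c_t

In Keras you never write this cell logic yourself; replacing SimpleRNN with the LSTM layer is enough, for example model.add(LSTM(HIDDEN_SIZE, input_shape=(SEQLEN, nb_features))), where HIDDEN_SIZE, SEQLEN, and nb_features stand in for whatever values the surrounding model uses.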