Keras 2.x Projects

By: Giuseppe Ciaburro

Overview of this book

Keras 2.x Projects explains how to leverage the power of Keras to build and train state-of-the-art deep learning models through a series of practical projects that look at a range of real-world application areas. To begin with, you will quickly set up a deep learning environment by installing the Keras library. Through each of the projects, you will explore and learn the advanced concepts of deep learning, and you will learn how to compute and run your deep learning models using the advanced offerings of Keras. You will train fully connected multilayer networks, convolutional neural networks, recurrent neural networks, autoencoders, and generative adversarial networks using real-world training datasets. The projects you will undertake are all based on real-world scenarios of all complexity levels, covering topics such as language recognition, stock volatility, energy consumption prediction, faster object classification for self-driving vehicles, and more. By the end of this book, you will be well versed in deep learning and its implementation with Keras. You will have all the knowledge you need to train your own deep learning models to solve different kinds of problems.

Long short-term memory in Keras

As we said in Chapter 6, Movie Reviews Sentiment Analysis Using Recurrent Neural Networks, the LSTM is a particular RNN architecture.

RNNs were developed to preserve a memory of past events; this behavior is not possible with ordinary feedforward networks, which is why RNNs are used in areas where classic networks fail to produce results, such as the prediction of time series (weather, stock quotes, and so on) in which each value depends on previous data.

An LSTM network consists of cells (LSTM blocks) that are linked together. Each cell is, in turn, composed of three types of gates: the input gate, the output gate, and the forget gate. They implement the write, read, and reset operations on the cell memory, respectively.
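
To make this concrete, here is a minimal sketch, not taken from the book, of how such a layer is typically declared in Keras 2.x; the number of cells (32), the input shape of 10 time steps with 1 feature, and the mean squared error loss are illustrative assumptions:

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
# 32 LSTM cells; each cell uses its input, forget, and output gates
# to decide what to write to, keep in, and read from its memory
model.add(LSTM(32, input_shape=(10, 1)))  # 10 time steps, 1 feature per step
model.add(Dense(1))  # single-value prediction head
model.compile(optimizer='adam', loss='mse')
model.summary()

Each of the 32 cells maintains its own memory across the 10 time steps, and the gates described above decide, at every step, how that memory is updated and exposed to the rest of the network.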

So, the LSTM modules are able to regulate what is stored and deleted. This is possible thanks to the presence of various elements called gates, which are composed...