Deep Learning with Keras

By: Antonio Gulli, Sujit Pal

Overview of this book

This book starts by introducing you to supervised learning algorithms such as simple linear regression, the classical multilayer perceptron, and more sophisticated deep convolutional networks. You will also explore image processing with recognition of handwritten digit images, classification of images into different categories, and advanced object recognition with related image annotations. An example of identifying salient points for face detection is also provided. Next, you will be introduced to Recurrent Networks, which are optimized for processing sequence data such as text, audio, or time series. Following that, you will learn about unsupervised learning algorithms such as Autoencoders and the very popular Generative Adversarial Networks (GANs). You will also explore non-traditional uses of neural networks, such as Style Transfer. Finally, you will look at reinforcement learning and its application to AI game playing, another popular direction of research and application of neural networks.

Keras functional API


The Keras functional API defines each layer as a function and provides operators to compose these functions into a larger computational graph. Here, a function is a transformation with a single input and a single output. For example, the equation y = f(x) defines a function f with input x and output y. Let us consider the simple Sequential model from Keras (for more information, refer to https://keras.io/getting-started/sequential-model-guide/):

from keras.models import Sequential
from keras.layers.core import Dense, Activation

model = Sequential([
   Dense(32, input_dim=784),
   Activation("sigmoid"),
   Dense(10),
   Activation("softmax"),
])

model.compile(loss="categorical_crossentropy", optimizer="adam")

As you can see, the sequential model represents the network as a linear pipeline, or list, of layers. We can also represent the network as the composition of the following nested functions. Here x is the input tensor of shape (None, 784) and y is the output tensor...
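To make this composition explicit, the same network can be rewritten with the functional API: each layer instance is called on a tensor and returns a new tensor, and the model is then defined by its input and output tensors. The following is a minimal sketch using the same layer sizes as the Sequential example above (Keras 2 syntax assumed):

from keras.models import Model
from keras.layers import Input, Dense, Activation

# input tensor; shape excludes the batch dimension, so the full shape is (None, 784)
x = Input(shape=(784,))

# each layer instance acts as a function from tensor to tensor
h = Dense(32)(x)
h = Activation("sigmoid")(h)
h = Dense(10)(h)
y = Activation("softmax")(h)

# the model is defined by its input and output tensors
model = Model(inputs=x, outputs=y)
model.compile(loss="categorical_crossentropy", optimizer="adam")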