Advanced Deep Learning with Keras

By: Rowel Atienza

Overview of this book

Recent developments in deep learning, including Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Deep Reinforcement Learning (DRL), are creating impressive AI results in our news headlines - such as AlphaGo Zero surpassing human world champions at the game of Go, and generative AI that can create paintings so human-like that they sell for over $400k. Advanced Deep Learning with Keras is a comprehensive guide to the advanced deep learning techniques available today, so you can create your own cutting-edge AI. Using Keras as an open-source deep learning library, you'll find hands-on projects throughout that show you how to create more effective AI with the latest techniques. The journey begins with an overview of MLPs, CNNs, and RNNs, which are the building blocks for the more advanced techniques in the book. You'll learn how to implement deep learning models with Keras and TensorFlow 1.x and then move forward to advanced techniques as you explore deep neural network architectures, including ResNet and DenseNet, and learn how to create autoencoders. You'll then learn all about GANs and how they can open new levels of AI performance. Next, you'll get up to speed with how VAEs are implemented, and you'll see how GANs and VAEs have the generative power to synthesize data that can be extremely convincing to humans - a major stride forward for modern AI. To complete this set of advanced techniques, you'll learn how to implement DRL methods such as Deep Q-Learning and Policy Gradient Methods, which are critical to many modern results in AI.

InfoGAN


To enforce the disentanglement of codes, InfoGAN proposes adding a regularizer to the original loss function that maximizes the mutual information between the latent codes c and G(z, c):

I(c; G(z, c)) (Equation 6.1.3)

The regularizer forces the generator to consider the latent codes when it formulates a function that synthesizes the fake images. In the field of information theory, the mutual information between the latent codes c and G(z, c) is defined as:

I(c; G(z, c)) = H(c) - H(c | G(z, c)) (Equation 6.1.4)
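To make Equation 6.1.4 concrete, here is a minimal sketch that computes H(c), H(c | x), and their difference for a made-up discrete joint distribution between a latent code c and a discretized generator output x. The numbers are purely illustrative and are not part of the InfoGAN model:

```python
import numpy as np

# Hypothetical joint distribution p(c, x) over a two-valued latent code c (rows)
# and a discretized generator output x (columns); the values are made up.
p_cx = np.array([[0.45, 0.05],
                 [0.05, 0.45]])

p_c = p_cx.sum(axis=1)              # marginal p(c)
p_x = p_cx.sum(axis=0)              # marginal p(x)

h_c = -np.sum(p_c * np.log2(p_c))   # H(c): entropy of the latent code

# H(c | x): conditional entropy of c after observing the output x
p_c_given_x = p_cx / p_x            # column j becomes p(c | x = j)
h_c_given_x = -np.sum(p_cx * np.log2(p_c_given_x))

# Mutual information as defined in Equation 6.1.4
mi = h_c - h_c_given_x
print("H(c) = %.3f bits, H(c|x) = %.3f bits, I(c; x) = %.3f bits"
      % (h_c, h_c_given_x, mi))
```

Because each output value almost pins down the code in this toy distribution, H(c | x) is small and the mutual information is close to H(c).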

where H(c) is the entropy of the latent code c and H(c | G(z, c)) is the conditional entropy of c after observing the output of the generator, G(z, c). Entropy is a measure of the uncertainty of a random variable or an event. For example, the information that the sun rises in the east has low entropy since it is almost certain, whereas the outcome of a lottery jackpot draw has high entropy since it is highly uncertain.
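As a quick numerical illustration of entropy as uncertainty, the short sketch below (with invented probabilities) contrasts an almost-certain event with a uniform one-in-a-million lottery draw:

```python
import numpy as np

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution (zero entries ignored)."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Almost certain: the sun rises in the east -> entropy close to 0 bits.
print(entropy_bits([0.999999, 0.000001]))

# Highly uncertain: a lottery draw with a million equally likely outcomes
# gives about log2(1,000,000) ~ 19.93 bits of entropy.
print(entropy_bits(np.full(1_000_000, 1e-6)))
```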

In Equation 6.1.4, maximizing the mutual information means minimizing H(c | G(z, c)), or decreasing the uncertainty in the latent code upon observing the generated output. This makes sense since, for example, in the...