So far in this chapter, we have studied two kinds of generative models: GANs and VAEs. There is a third kind, known as flow-based generative models, which directly learns the probability density function of the data distribution, something the previous two do not do. Flow-based models make use of normalizing flows, which sidestep the difficulty GANs and VAEs face in learning the distribution explicitly: a simple distribution is transformed into a more complex one through a series of invertible mappings. By repeatedly applying the change of variables rule, the initial probability density flows through the sequence of invertible mappings until, at the end, we obtain the target probability distribution.
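To make the change of variables rule concrete, here is a minimal sketch in NumPy. The affine flow, its parameters, and the function names are illustrative assumptions, not taken from the book; for an invertible mapping x = f(z), the density transforms as log p_x(x) = log p_z(f⁻¹(x)) + log |df⁻¹/dx|, and chaining mappings simply accumulates the log-determinant terms.

```python
import numpy as np

def log_prob_base(z):
    """Log density of the base distribution: a standard normal."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def affine_flow_log_prob(x, a, b):
    """Log density of x after pushing the base through x = a*z + b (a != 0).

    By the change of variables rule:
        log p_x(x) = log p_z((x - b) / a) - log |a|
    """
    z = (x - b) / a                              # invert the mapping
    return log_prob_base(z) - np.log(np.abs(a))  # change of variables term

def chained_log_prob(x, params):
    """Log density under a chain of affine flows.

    params is a list of (a, b) pairs applied in order z -> ... -> x,
    so we invert the last flow first; the log-determinants accumulate.
    """
    log_det = 0.0
    for a, b in reversed(params):
        x = (x - b) / a
        log_det += np.log(np.abs(a))
    return log_prob_base(x) - log_det
```

Because each mapping is invertible with a tractable Jacobian, the exact log density of a sample is available in closed form; this is precisely what GANs lack and VAEs can only bound.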
Hands-On Mathematics for Deep Learning
Overview of this book
Most programmers and data scientists struggle with mathematics, having either overlooked or forgotten core mathematical concepts. This book uses Python libraries to help you understand the math required to build deep learning (DL) models.
You'll begin by learning about the core mathematical and modern computational techniques used to design and implement DL algorithms. This book covers essential topics, such as linear algebra, eigenvalues and eigenvectors, singular value decomposition, and gradient-based algorithms, to help you understand how deep neural networks are trained. Later chapters focus on important neural networks, such as linear neural networks and multilayer perceptrons, with a primary focus on helping you learn how each model works. As you advance, you will delve into the math behind regularization, multi-layer DL, forward propagation, optimization, and backpropagation to understand what it takes to build full-fledged DL models. Finally, you'll explore convolutional neural network (CNN), recurrent neural network (RNN), and GAN models and their applications.
By the end of this book, you'll have built a strong foundation in neural networks and DL mathematical concepts, which will help you to confidently research and build custom models in DL.
Table of Contents (19 chapters)
Preface
Section 1: Essential Mathematics for Deep Learning
Linear Algebra
Vector Calculus
Probability and Statistics
Optimization
Graph Theory
Section 2: Essential Neural Networks
Linear Neural Networks
Feedforward Neural Networks
Regularization
Convolutional Neural Networks
Recurrent Neural Networks
Section 3: Advanced Deep Learning Concepts Simplified
Attention Mechanisms
Generative Models
Transfer and Meta Learning
Geometric Deep Learning
Other Books You May Enjoy