Hands-On Mathematics for Deep Learning

By: Jay Dawani

Overview of this book

Most programmers and data scientists struggle with mathematics, having either overlooked or forgotten core mathematical concepts. This book uses Python libraries to help you understand the math required to build deep learning (DL) models. You'll begin by learning about core mathematical and modern computational techniques used to design and implement DL algorithms. This book will cover essential topics, such as linear algebra, eigenvalues and eigenvectors, singular value decomposition (SVD), and gradient algorithms, to help you understand how to train deep neural networks. Later chapters focus on important neural networks, such as the linear neural network and multilayer perceptrons, with a primary focus on helping you learn how each model works. As you advance, you will delve into the math used for regularization, multi-layered DL, forward propagation, optimization, and backpropagation techniques to understand what it takes to build full-fledged DL models. Finally, you'll explore convolutional neural network (CNN), recurrent neural network (RNN), and GAN models and their applications. By the end of this book, you'll have built a strong foundation in neural networks and DL mathematical concepts, which will help you to confidently research and build custom models in DL.
Table of Contents (19 chapters)

Section 1: Essential Mathematics for Deep Learning
Section 2: Essential Neural Networks
Section 3: Advanced Deep Learning Concepts Simplified

Vector spaces and subspaces

In this section, we will explore the concepts of vector spaces and subspaces, which are central to our understanding of linear algebra. In fact, without an understanding of vector spaces and subspaces, we cannot truly understand how to solve linear algebra problems.

Spaces

Vector spaces are one of the fundamental settings for linear algebra, and, as the name suggests, they are spaces where all vectors reside. We will denote a vector space by V.

The easiest way to think of dimensions is to count the number of elements in the column vector. Suppose we have $x \in \mathbb{R}^n$; then $x$ has $n$ components. $\mathbb{R}^1$ is a straight line, $\mathbb{R}^2$ is all the possible points in the xy-plane, and $\mathbb{R}^3$ is all the possible points in xyz-space (that is, 3-dimensional space), and so on.
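
As a minimal sketch (assuming NumPy, which this book relies on elsewhere; the specific vectors here are made up for illustration), we can read off the dimension by counting the elements of a column vector:

import numpy as np

# A vector in R^2: a point in the xy-plane.
u = np.array([3.0, -1.0])

# A vector in R^3: a point in 3-dimensional xyz-space.
v = np.array([1.0, 2.0, 5.0])

# Counting the elements of each column vector gives the dimension
# of the space the vector lives in.
print(u.shape[0])  # 2, so u is in R^2
print(v.shape[0])  # 3, so v is in R^3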

The following are some of the rules for vector spaces (a quick numerical check of these rules follows the list):

  • There exists in V an additive identity element $0$ such that $v + 0 = v$ for all $v \in V$.
  • For all $v \in V$, there exists an additive inverse $-v$ such that $v + (-v) = 0$.
  • For all $v \in V$, there exists a multiplicative identity $1$ such that $1v = v$.
  • Vectors are commutative, such that for all $u, v \in V$, $u + v = v + u$.
  • Vectors are associative, such that $(u + v) + w = u + (v + w)$.
  • Vectors have distributivity, such that $a(u + v) = au + av$ and $(a + b)v = av + bv$ for all $u, v \in V$ and for all scalars $a, b$.
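
The following is a minimal sketch, assuming NumPy and arbitrarily chosen vectors in $\mathbb{R}^2$, that numerically spot-checks these rules; it is an illustration, not a proof:

import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
a, b = 2.0, -3.0
zero = np.zeros(2)  # the additive identity in R^2

# Additive identity and additive inverse
assert np.allclose(v + zero, v)
assert np.allclose(v + (-v), zero)

# Multiplicative identity
assert np.allclose(1.0 * v, v)

# Commutativity and associativity of addition
assert np.allclose(u + v, v + u)
assert np.allclose((u + v) + w, u + (v + w))

# Distributivity over vectors and over scalars
assert np.allclose(a * (u + v), a * u + a * v)
assert np.allclose((a + b) * v, a * v + b * v)

print("All vector space rules hold for these sample vectors.")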

A set of vectors $v_1, v_2, \ldots, v_n$ is said to be linearly independent if $a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0$ implies that $a_1 = a_2 = \cdots = a_n = 0$.
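
As a quick illustration (a NumPy sketch; testing independence via matrix rank is one standard approach, not necessarily the book's own method), we can stack the vectors as columns and check whether the matrix has full column rank:

import numpy as np

# Stack the vectors as columns of a matrix; they are linearly
# independent exactly when the matrix has full column rank.
independent = np.column_stack([[1.0, 0.0, 0.0],
                               [0.0, 1.0, 0.0],
                               [0.0, 0.0, 1.0]])
dependent = np.column_stack([[1.0, 0.0, 0.0],
                             [0.0, 1.0, 0.0],
                             [1.0, 1.0, 0.0]])  # third column = sum of the first two

print(np.linalg.matrix_rank(independent))  # 3 -> linearly independent
print(np.linalg.matrix_rank(dependent))    # 2 -> linearly dependent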

Another important concept for us to know is called span. The span of $v_1, v_2, \ldots, v_n$ is the set of all linear combinations that can be made using these n vectors. Therefore, if the vectors are linearly independent and span V completely, then they are a basis of V.

Therefore, the dimension of V is the number of basis vectors we have, and we denote it $\dim V$.
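
To make this concrete (again a NumPy sketch with made-up vectors), the dimension of the span of a set of vectors equals the rank of the matrix whose columns are those vectors:

import numpy as np

# Three vectors in R^3 that only span a 2-dimensional subspace,
# because the third is a linear combination of the first two.
vectors = np.column_stack([[1.0, 0.0, 1.0],
                           [0.0, 1.0, 1.0],
                           [1.0, 1.0, 2.0]])

# dim(span) is the rank of the matrix; any two of the independent
# columns form a basis of that 2-dimensional subspace.
print(np.linalg.matrix_rank(vectors))  # 2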

Subspaces

Subspaces are another very important concept; a subspace is a vector space contained within another vector space. Let's suppose V is a vector space, and we have a subset $S \subseteq V$. Then, S can only be a subspace if it follows the three rules, stated as follows (a numerical check follows the list):

  • $0 \in S$, which implies that S contains the zero vector
  • $u, v \in S$ implies $u + v \in S$, which implies that S is closed under addition
  • $v \in S$ and $a$ is a scalar implies $av \in S$, which implies that S is closed under scalar multiplication
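
Here is a minimal sketch, assuming NumPy; the subspace S = {(x, y, 0)} of $\mathbb{R}^3$ and the helper in_S are hypothetical choices made for this illustration:

import numpy as np

def in_S(v):
    # Membership test for S = {(x, y, 0)}, the xy-plane inside R^3.
    return np.isclose(v[2], 0.0)

u = np.array([1.0, 2.0, 0.0])
v = np.array([-3.0, 0.5, 0.0])

# S contains the zero vector and is closed under addition and scaling.
assert in_S(np.zeros(3))
assert in_S(u + v)
assert in_S(5.0 * u)

print("S satisfies the subspace rules for these samples.")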

If $S_1$ and $S_2$ are subspaces of V, then their sum is $S_1 + S_2 = \{u + v : u \in S_1, v \in S_2\}$, where the result is also a subspace of V.

The dimension of the sum is as follows:

$\dim(S_1 + S_2) = \dim S_1 + \dim S_2 - \dim(S_1 \cap S_2)$
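
As a sketch (assuming NumPy; the two subspaces, the xy-plane and the yz-plane of $\mathbb{R}^3$, are chosen for illustration), we can check the formula numerically: the rank of the stacked bases gives $\dim(S_1 + S_2)$, and rearranging the formula yields the dimension of the intersection:

import numpy as np

# Bases (as rows) for two subspaces of R^3:
# S1 = the xy-plane, S2 = the yz-plane.
S1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])
S2 = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])

dim_S1 = np.linalg.matrix_rank(S1)                     # 2
dim_S2 = np.linalg.matrix_rank(S2)                     # 2
dim_sum = np.linalg.matrix_rank(np.vstack([S1, S2]))   # 3, since S1 + S2 = R^3

# Rearranging the dimension formula gives the intersection's dimension:
dim_intersection = dim_S1 + dim_S2 - dim_sum           # 1, the y-axis
print(dim_sum, dim_intersection)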