Practical Discrete Mathematics

By: Ryan T. White, Archana Tikayat Ray

Overview of this book

Discrete mathematics is the study of countable, distinct elements, and its principles are widely used in building algorithms for computer science and data science. Knowledge of discrete math concepts will help you understand the algorithms, binary representations, and general mathematics that sit at the core of data-driven tasks. Practical Discrete Mathematics is a comprehensive introduction for those who are new to the mathematics of countable objects. This book will help you get up to speed with using discrete math principles to take your computer science skills to a more advanced level. As you learn the language of discrete mathematics, you’ll also cover methods crucial to studying and describing computer science and machine learning objects and algorithms. The chapters that follow will guide you through how memory and CPUs work. In addition, you’ll learn how to analyze data for useful patterns before finally exploring how to apply math concepts in network routing, web searching, and data science. By the end of this book, you’ll have a deeper understanding of discrete math and its applications in computer science, and you’ll be ready to work on real-world algorithm development and machine learning.
Table of Contents (17 chapters)

Part I – Basic Concepts of Discrete Math
Part II – Implementing Discrete Mathematics in Data and Computer Science
Part III – Real-World Applications of Discrete Mathematics

Understanding eigenvalues, eigenvectors, and orthogonal bases

In this section, we will learn about the mathematical concepts behind principal component analysis (PCA): eigenvalues, eigenvectors, and orthogonal bases. We will also learn how to find the eigenvalues and eigenvectors of a given matrix.
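Before getting into the theory, here is a minimal sketch of how eigenvalues and eigenvectors can be computed in practice with NumPy's numpy.linalg.eig. The 2 x 2 matrix is an arbitrary example chosen only for illustration; it is not from the book's datasets.

```python
import numpy as np

# An arbitrary 2x2 matrix chosen for illustration
A = np.array([[4, 1],
              [2, 3]])

# eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

print("Eigenvalues:", eigenvalues)   # e.g., [5. 2.] (order not guaranteed)
print("Eigenvectors (as columns):\n", eigenvectors)

# Verify the defining property A v = lambda v for the first pair
v = eigenvectors[:, 0]
print(np.allclose(A @ v, eigenvalues[0] * v))   # True
```

For this matrix, the characteristic polynomial is (4 - λ)(3 - λ) - 2 = (λ - 5)(λ - 2), so the eigenvalues are 5 and 2, matching the numerical output.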

Many real-world machine learning problems involve a large number of feature variables, sometimes numbering in the millions. This not only makes the data harder to store due to its massive size but also slows the training of machine learning models, making it harder to find an optimal solution. In addition, there is a greater chance of overfitting your model to the data. This problem is often referred to as the curse of dimensionality in the field of machine learning.

A solution to the curse of dimensionality is to reduce the dimensionality of datasets that have many feature variables. Let's try to understand this concept with the help of an example dataset: pizza.csv. This...
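Although the walkthrough of pizza.csv is cut off above, the core mechanics of PCA-based dimensionality reduction can be sketched here: center the data, compute the covariance matrix, eigendecompose it, and project the data onto the top eigenvectors, which form an orthogonal basis. The sketch below uses synthetic data as a stand-in for pizza.csv, whose columns are not shown in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a numeric dataset such as pizza.csv:
# 100 samples, 5 feature variables, with some correlation built in
X = rng.normal(size=(100, 5))
X[:, 1] = 2 * X[:, 0] + 0.1 * X[:, 1]

# 1. Center each feature at zero
Xc = X - X.mean(axis=0)

# 2. Covariance matrix of the features (5 x 5, symmetric)
C = np.cov(Xc, rowvar=False)

# 3. Eigendecomposition; eigh handles symmetric matrices and
#    returns real eigenvalues in ascending order
eigenvalues, eigenvectors = np.linalg.eigh(C)

# 4. Sort descending by eigenvalue and keep the top k eigenvectors,
#    which form an orthogonal basis for the reduced space
order = np.argsort(eigenvalues)[::-1]
k = 2
W = eigenvectors[:, order[:k]]

# 5. Project the centered data onto the reduced basis
X_reduced = Xc @ W
print(X_reduced.shape)   # (100, 2)
```

The eigenvectors with the largest eigenvalues point in the directions of greatest variance in the data, which is why keeping only the top k of them preserves most of the dataset's structure while shrinking its dimensionality.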