Practical Discrete Mathematics

By: Ryan T. White, Archana Tikayat Ray

Overview of this book

Discrete mathematics is the study of countable, distinct objects, and its principles are widely used in building algorithms for computer science and data science. A knowledge of discrete math concepts will help you understand the algorithms, binary representations, and general mathematics that sit at the core of data-driven tasks. Practical Discrete Mathematics is a comprehensive introduction for those who are new to the mathematics of countable objects. This book will help you get up to speed with using discrete math principles to take your computer science skills to a more advanced level. As you learn the language of discrete mathematics, you'll also cover methods crucial to studying and describing computer science and machine learning objects and algorithms. The chapters that follow will guide you through how memory and CPUs work. In addition to this, you'll understand how to analyze data for useful patterns, before finally exploring how to apply math concepts in network routing, web searching, and data science. By the end of this book, you'll have a deeper understanding of discrete math and its applications in computer science, and be ready to work on real-world algorithm development and machine learning.
Table of Contents (17 chapters)

Part I – Basic Concepts of Discrete Math
Part II – Implementing Discrete Mathematics in Data and Computer Science
Part III – Real-World Applications of Discrete Mathematics

Summary

In this chapter, we learned about eigenvalues, eigenvectors, and orthogonal bases, and how these concepts connect to form the foundation of dimensionality reduction. We then learned about the two types of dimensionality reduction methods – feature elimination and feature extraction. We discussed the steps of performing Principal Component Analysis (PCA), which falls into the feature extraction category of dimensionality reduction. We used the implementation of PCA from scikit-learn to apply the algorithm to our dataset, where we reduced the features in our pizza dataset from 7 to 2 and visualized the data. We were able to easily tell that the nutrients present in the pizzas manufactured by different companies were different. Lastly, we applied PCA to the MNIST dataset and found that only 300 principal components were needed to capture 90% of the variance in the dataset, as compared to the 784 feature variables that we had originally, reducing the dimensionality by...
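The workflow summarized above can be sketched with scikit-learn's `PCA` class. This is a minimal sketch on synthetic stand-in data (the chapter's pizza dataset is not reproduced here); the 7-column matrix mirrors the chapter's 7 nutrient features, and the random data itself is an assumption for illustration only.

```python
# Sketch of the chapter's PCA workflow on stand-in data (not the pizza dataset).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 7))   # 100 samples, 7 features (synthetic stand-in)

# PCA is sensitive to feature scale, so standardize each column first.
X_scaled = StandardScaler().fit_transform(X)

# Keep the top 2 principal components, as done for the pizza data.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_scaled)

print(X_2d.shape)                     # (100, 2) -- ready for a 2D scatter plot
print(pca.explained_variance_ratio_)  # fraction of variance each component captures
```

For the MNIST-style case, passing a float such as `PCA(n_components=0.90)` tells scikit-learn to keep the smallest number of components whose cumulative explained variance reaches 90%.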