A Handbook of Mathematical Models with Python

By: Dr. Ranja Sarkar

Overview of this book

Mathematical modeling is the art of transforming a business problem into a well-defined mathematical formulation. Its emphasis on interpretability is particularly crucial when deploying a model to support high-stakes decisions in sensitive sectors like pharmaceuticals and healthcare. Through this book, you'll gain a firm grasp of the foundational mathematics underpinning various machine learning algorithms. Equipped with this knowledge, you can modify algorithms to suit your business problem. Starting with the basic theory and concepts of mathematical modeling, you'll explore an array of mathematical tools that will empower you to extract insights and understand the data better, which in turn will aid in making optimal, data-driven decisions. The book lets you explore mathematical optimization and its wide range of applications, and concludes by highlighting the synergistic value derived from blending mathematical models with machine learning. Ultimately, you'll be able to apply everything you've learned to choose the most fitting methodologies for the business problems you encounter.
Table of Contents (16 chapters)

Part 1: Mathematical Modeling
Part 2: Mathematical Tools
Part 3: Mathematical Optimization

Linear algebra for PCA

PCA is an unsupervised method used to reduce the number of features of a high-dimensional dataset. An unlabeled dataset is decomposed into its constituent parts by matrix factorization (or decomposition), and those parts are then ranked according to the variance they explain. The projected data, which represents the original data, becomes the input used to train ML models.
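As a minimal sketch of this idea, the following NumPy snippet (the dataset and dimensions are illustrative, not from the book) factorizes a centered data matrix with SVD and projects it onto the top components; the reduced matrix is what would be fed to an ML model:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy high-dimensional dataset: 200 samples, 10 correlated features
# built from 3 latent factors, so it has (approximately) rank 3
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 10))

# Center the data, then factorize: X_centered = U @ diag(S) @ Vt
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Singular values come back sorted in decreasing order, so the first
# k rows of Vt are the top-k principal components; project onto them
k = 3
X_reduced = Xc @ Vt[:k].T   # shape (200, 3): input for model training
```

Because the toy data is generated from three latent factors, the first three components capture essentially all of its variance.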

PCA is defined as the orthogonal projection of data onto a lower-dimensional linear space called the principal subspace, achieved by finding new axes (or basis vectors) that preserve the maximum variance of the projected data; these new axes are known as principal components. PCA preserves information by ranking the variances of the projections: the highest variance lies along the first axis, the second highest along the second axis, and so forth. The working principle of this linear transformation is shown in Figure 3.2. It compresses the feature space by identifying a subspace that captures...
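The variance-ranking property can be checked numerically. In the hedged sketch below (the 2D dataset is an invented example), the principal axes are obtained as eigenvectors of the covariance matrix, and the variance of the data projected onto each axis equals the corresponding eigenvalue, in decreasing order:

```python
import numpy as np

rng = np.random.default_rng(1)
# Anisotropic 2D data: much more spread along the first feature
X = rng.normal(size=(500, 2)) * np.array([5.0, 1.0])
Xc = X - X.mean(axis=0)

# Eigendecomposition of the covariance matrix yields the principal axes;
# each eigenvalue is the variance of the data projected onto its axis
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # re-rank by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the new axes: the first axis carries the highest variance,
# the second the next highest, and so on
proj = Xc @ eigvecs
variances = proj.var(axis=0, ddof=1)     # matches eigvals, decreasing
```

The eigenvectors are orthonormal, which is exactly the orthogonality of the principal components that the definition above requires.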