A Handbook of Mathematical Models with Python

By: Dr. Ranja Sarkar

Overview of this book

Mathematical modeling is the art of transforming a business problem into a well-defined mathematical formulation. Its emphasis on interpretability is particularly crucial when deploying a model to support high-stakes decisions in sensitive sectors such as pharmaceuticals and healthcare. Through this book, you’ll gain a firm grasp of the foundational mathematics underpinning various machine learning algorithms. Equipped with this knowledge, you can modify algorithms to suit your business problem. Starting with the basic theory and concepts of mathematical modeling, you’ll explore an array of mathematical tools that will empower you to extract insights and understand the data better, which in turn will aid in making optimal, data-driven decisions. The book then explores mathematical optimization and its wide range of applications, and concludes by highlighting the synergistic value derived from blending mathematical models with machine learning. Ultimately, you’ll be able to apply everything you’ve learned to choose the most fitting methodologies for the business problems you encounter.
Table of Contents (16 chapters)

Part 1: Mathematical Modeling
Part 2: Mathematical Tools
Part 3: Mathematical Optimization

LDA – the difference from PCA

LDA and PCA are both linear transformation methods: PCA yields directions (principal components, or PCs) that maximize the variance of the data, whereas LDA yields directions that maximize the separation between data classes. The PCA algorithm disregards class labels entirely.
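As a minimal sketch of this point (assuming scikit-learn and its bundled Iris dataset, which are not part of this excerpt), note that PCA is fitted on the features alone; the labels never enter the computation:

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)     # y is available, but PCA never sees it

pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)          # directions of maximum variance only

print(X_pca.shape)                    # (150, 2)
print(pca.explained_variance_ratio_)  # fraction of variance captured by each PC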

LDA is a supervised dimensionality-reduction method that projects the data onto a subspace in a way that maximizes the separability between classes (groups); hence, it is used for pattern classification problems. LDA works well for data with multiple classes; however, it assumes that the classes are normally distributed and have equal covariance matrices. PCA tends to work well when the number of samples in each class is relatively small. In both cases, though, the number of observations ought to be much higher than the number of dimensions for the results to be meaningful.
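The following sketch (again assuming scikit-learn and the Iris dataset, purely for illustration) shows the supervised nature of LDA: the class labels drive the projection, and at most one fewer discriminant direction than the number of classes is available:

from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# Iris has 3 classes, so at most 2 discriminant directions exist
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)       # labels are required here, unlike PCA

print(X_lda.shape)                    # (150, 2)
print(lda.explained_variance_ratio_)  # separation captured by each direction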

LDA seeks a projection that discriminates data in the best possible way, unlike PCA, which seeks a projection that preserves maximum information in...