Training Systems using Python Statistical Modeling

By: Curtis Miller
Overview of this book

Python's ease of use and multi-purpose nature have made it one of the most popular tools for data scientists and machine learning developers. Its rich libraries are widely used for data analysis and, more importantly, for building state-of-the-art predictive models. This book is designed to guide you through using these libraries to implement effective statistical models for predictive analytics. You'll start by delving into classical statistical analysis, where you will learn to compute descriptive statistics using pandas. You will then focus on supervised learning, which will help you explore the principles of machine learning and train different machine learning models from scratch. Next, you will work with binary prediction models, such as data classification using k-nearest neighbors, decision trees, and random forests. The book also covers algorithms for regression analysis, such as ridge and lasso regression, and their implementation in Python. In later chapters, you will learn how neural networks can be trained and deployed for more accurate predictions, and which Python libraries can be used to implement them. By the end of this book, you will have the knowledge you need to design, build, and deploy enterprise-grade statistical models for machine learning using Python and its rich ecosystem of libraries for predictive analytics.
Table of Contents (9 chapters)

Singular value decomposition

In this section, we will discuss singular value decomposition, and demonstrate how to use it.

Singular value decomposition (SVD) is a powerful matrix decomposition technique from linear algebra, and it forms the basis of other important methods. For example, PCA can be computed by first finding the SVD of the (centered) data matrix. SVD is an advanced linear algebra technique, so describing it without linear algebra is difficult, but we can build some intuition for it.
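To make the PCA connection concrete, here is a minimal sketch using NumPy (the specific data is made up for illustration): after centering the data matrix, the rows of Vt from its SVD are the principal axes, and the squared singular values divided by n - 1 are the variances PCA reports. We check this against the eigenvalues of the covariance matrix.

```python
import numpy as np

# Synthetic data purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Center the columns, then decompose; the rows of Vt are the
# principal axes, and s**2 / (n - 1) are the explained variances
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_variance = s**2 / (len(X) - 1)

# This agrees with the eigendecomposition of the covariance matrix,
# which is how PCA is usually defined
cov = np.cov(Xc, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
print(np.allclose(explained_variance, eigvals))  # True
```

In practice, libraries such as scikit-learn compute PCA this way because the SVD is numerically more stable than forming the covariance matrix explicitly.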

We start with a collection of unit vectors, each orthogonal to the others. Any matrix, X, can be thought of as a mapping from one space to another, so the unit vectors we start with are mapped to vectors in a new space. For the right choice of starting vectors, the images under X are again orthogonal to one another, each scaled by a value known as a singular value of the matrix. SVD describes...
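The intuition above can be sketched with NumPy's `np.linalg.svd`, which returns the decomposition X = U diag(s) Vt; the matrix values here are arbitrary. The rows of Vt are the orthonormal starting vectors, the columns of U are their orthonormal images, and s holds the singular values that scale them.

```python
import numpy as np

# A small example matrix; the values are arbitrary
X = np.array([[2.0, 0.0],
              [1.0, 3.0],
              [0.0, 4.0]])

# Thin SVD: X = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The singular values are non-negative, sorted in descending order
print(s)

# The rows of Vt and the columns of U are orthonormal unit vectors
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True
print(np.allclose(U.T @ U, np.eye(2)))    # True

# X maps each row of Vt to the matching column of U, scaled by s:
# X @ Vt[i] == s[i] * U[:, i]
print(np.allclose(X @ Vt[0], s[0] * U[:, 0]))  # True

# The three factors reconstruct X exactly
print(np.allclose(U @ np.diag(s) @ Vt, X))  # True
```

This is the same routine used throughout the chapter's examples; for large matrices you would typically request a truncated decomposition instead of the full one.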