Principles of Data Science

Overview of this book

Need to turn your programming skills into effective data science skills? Principles of Data Science is designed to help you join the dots between mathematics, programming, and business analysis. With this book, you'll feel confident about asking, and answering, complex and sophisticated questions of your data, moving from raw, abstract statistics to actionable insights. With a unique approach that bridges the gap between mathematics and computer science, this book takes you through the entire data science pipeline. Beginning with cleaning and preparing data, and effective data mining strategies and techniques, you'll move on to build a comprehensive picture of how every piece of the data science puzzle fits together. You'll learn the fundamentals of computational mathematics and statistics, as well as the pseudocode used by data scientists and analysts today. You'll get to grips with machine learning, discover the statistical models that help you take control of and navigate even the densest datasets, and find out how to create powerful visualizations that communicate what your data means.

Logistic regression


Our first classification model is called logistic regression. I can already hear the questions forming in your head: what makes it logistic, and why is it called regression if you claim that this is a classification algorithm? All in good time, my friend.

Logistic regression is a generalization of the linear regression model adapted to fit classification problems. In linear regression, we use a set of quantitative feature variables to predict a continuous response variable. In logistic regression, we use a set of quantitative feature variables to predict probabilities of class membership. These probabilities can then be mapped to class labels, thus predicting a class for each observation.
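This two-step idea, predict probabilities of class membership, then map them to labels, can be seen in a minimal sketch. (This uses scikit-learn's LogisticRegression on toy data as an assumption; it is not the book's own code.)

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: one quantitative feature, binary class labels
X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression().fit(X, y)

# Step 1: probabilities of class membership (one column per class)
probs = model.predict_proba(X)

# Step 2: probabilities mapped to class labels for each observation
labels = model.predict(X)
```

Each row of `probs` sums to 1, and `predict` simply picks the class with the higher probability.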

When performing linear regression, we use the following function to make our line of best fit:

y = β₀ + β₁x

Here, y is our response variable (the thing we wish to predict), the betas β₀ and β₁ represent our model parameters, and x represents our input variable (a single one in this case, but it can take...
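Fitting this line of best fit means estimating β₀ and β₁ from data. A minimal sketch using NumPy's `polyfit` (the synthetic data and the least-squares helper are assumptions, not the book's code):

```python
import numpy as np

# Synthetic data generated from the true line y = 2 + 3x, plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 + 3 * x + rng.normal(scale=0.5, size=x.size)

# Ordinary least squares fit of degree 1: returns slope first, then intercept
beta_1, beta_0 = np.polyfit(x, y, deg=1)
```

With modest noise, the estimated β₀ and β₁ land close to the true values of 2 and 3.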