Regression Analysis with Python

By: Luca Massaron, Alberto Boschetti

Overview of this book

Regression is the process of learning relationships between inputs and continuous outputs from example data, which enables predictions for novel inputs. There are many kinds of regression algorithms, and the aim of this book is to explain which one is right for each set of problems and how to prepare real-world data for it. With this book, you will learn to define a simple regression problem and evaluate its performance. The book will help you understand how to properly parse a dataset, clean it, and create an output matrix optimally built for regression. You will begin with a simple regression algorithm to solve some data science problems and then progress to more complex algorithms. The book will enable you to use regression models to predict outcomes and make critical business decisions. Throughout the book, you will gain the knowledge to use Python to build fast, better linear models and to apply the results in Python or in any programming language you prefer.

Revisiting gradient descent


In the previous chapter, we introduced the gradient descent technique as a way to speed up processing. As we saw with Linear Regression, the fitting of a model can be done in two ways: in closed form or iteratively. The closed form gives the best possible solution in a single step (but it is a complex and time-consuming step); iterative algorithms, instead, reach the minimum step by step, with only a few calculations per update, and can be stopped at any time.
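
To make the contrast concrete, here is a minimal sketch that fits the same Linear Regression model both ways with NumPy. It is not taken from the book's own listings; the variable names and hyperparameters (eta, n_steps) are illustrative assumptions.

import numpy as np

np.random.seed(0)
n = 100
X = np.c_[np.ones(n), np.random.randn(n, 2)]   # design matrix with a bias column
w_true = np.array([1.0, 2.0, -0.5])
y = X.dot(w_true) + 0.1 * np.random.randn(n)

# Closed form: solve the normal equations in one (expensive) step
w_closed = np.linalg.solve(X.T.dot(X), X.T.dot(y))

# Iterative form: gradient descent, one cheap update per step,
# interruptible at any time with an approximate solution
w_gd = np.zeros(X.shape[1])
eta, n_steps = 0.05, 500                        # learning rate and number of updates
for _ in range(n_steps):
    gradient = X.T.dot(X.dot(w_gd) - y) / n
    w_gd -= eta * gradient

print(w_closed)                                 # the two estimates should nearly coincide
print(w_gd)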

Gradient descent is a very popular choice for fitting the Logistic Regression model; however, it shares its popularity with Newton's methods. Since gradient descent is the basis of iterative optimization, and we have already introduced it, we will focus on it in this section. Don't worry, there is no winner or single best algorithm: all of them can eventually reach the very same model, following different paths through the coefficients' space.
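
As a sketch of what batch gradient descent looks like for Logistic Regression, assuming the usual update on the average log-loss (the function and variable names below, such as fit_logistic_gd and alpha, are ours for illustration and not from the book):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_gd(X, y, alpha=0.1, n_epochs=2000):
    # Batch gradient descent on the average log-loss
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        p = sigmoid(X.dot(w))                 # predicted probabilities
        gradient = X.T.dot(p - y) / len(y)    # gradient of the loss w.r.t. w
        w -= alpha * gradient                 # step against the gradient
    return w

# Toy usage: two Gaussian clouds, one per class, with a bias column
np.random.seed(1)
X = np.vstack([np.random.randn(50, 2) - 1, np.random.randn(50, 2) + 1])
X = np.c_[np.ones(len(X)), X]
y = np.r_[np.zeros(50), np.ones(50)]
print(fit_logistic_gd(X, y))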

First, we should compute the derivative of the loss function. Let's make it a bit...
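
For reference, the log-loss minimized by Logistic Regression and its partial derivatives are given below. This is the standard textbook result, stated here for convenience rather than as the continuation of the excerpt:

J(\mathbf{w}) = -\frac{1}{n}\sum_{i=1}^{n}\Bigl[\, y_i \log \sigma(\mathbf{w}^{\top}\mathbf{x}_i) + (1 - y_i)\log\bigl(1 - \sigma(\mathbf{w}^{\top}\mathbf{x}_i)\bigr)\Bigr]

\frac{\partial J(\mathbf{w})}{\partial w_j} = \frac{1}{n}\sum_{i=1}^{n}\bigl(\sigma(\mathbf{w}^{\top}\mathbf{x}_i) - y_i\bigr)\, x_{ij}

where \sigma(z) = 1/(1 + e^{-z}) is the sigmoid function; the gradient descent update is then w_j \leftarrow w_j - \alpha \,\partial J/\partial w_j for some learning rate \alpha.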