Regression Analysis with R

By Giuseppe Ciaburro

Overview of this book

Regression analysis is a statistical process for estimating relationships between variables. Predictions are based on the causal effect of one variable upon another. Regression techniques for modeling and analysis are applied to large sets of data in order to reveal hidden relationships among the variables. This book gives you a rundown of regression analysis, explaining the process from scratch. The first few chapters explain the different types of learning, supervised and unsupervised, and how they differ from each other. We then cover supervised learning in detail, looking at the various aspects of regression analysis. The chapters are arranged to follow the steps of a data science process: loading the training dataset, handling missing values, performing EDA on the dataset, applying transformations and feature engineering, building the model, assessing model fit and performance, and finally making predictions on unseen datasets. Each chapter starts by explaining the theoretical concepts, and once the reader is comfortable with the theory, we move to practical examples to support the understanding. The practical examples are illustrated using R code and different R packages, such as R Stats, caret, and so on. Each chapter is a mix of theory and practical examples. By the end of this book, you will know the concepts and pain points of regression analysis, and you will be able to apply what you have learned in your projects.

Gradient Descent and linear regression


Gradient Descent (GD) is an iterative approach for minimizing a given function, or, in other words, a way to find a local minimum of that function. The algorithm starts from an initial estimate of the solution, which can be chosen in several ways; one approach is to randomly sample values for the parameters. We evaluate the slope of the function at that point, update the solution in the negative direction of the gradient, and repeat this process. The algorithm eventually converges where the gradient is zero, which corresponds to a local minimum.
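The update loop described above can be sketched in a few lines of R. This is a minimal illustrative implementation, not code from the book: the function name gradient_descent, the learning rate alpha, and the simulated data are all assumptions made for the example. It minimizes the mean squared error of a simple linear regression by repeatedly stepping in the negative direction of the gradient.

```r
# Illustrative sketch of gradient descent for simple linear regression.
# Names (gradient_descent, alpha, n_iter) are chosen for this example.
gradient_descent <- function(x, y, alpha = 0.01, n_iter = 1000) {
  b0 <- 0; b1 <- 0              # initial estimate of the solution
  n <- length(y)
  for (i in seq_len(n_iter)) {
    y_hat <- b0 + b1 * x
    # Gradients of the mean squared error with respect to b0 and b1
    grad_b0 <- (2 / n) * sum(y_hat - y)
    grad_b1 <- (2 / n) * sum((y_hat - y) * x)
    # Step in the negative direction of the gradient
    b0 <- b0 - alpha * grad_b0
    b1 <- b1 - alpha * grad_b1
  }
  c(intercept = b0, slope = b1)
}

# Simulated data: true intercept 2, true slope 3, small noise
set.seed(1)
x <- runif(100)
y <- 2 + 3 * x + rnorm(100, sd = 0.1)
gradient_descent(x, y, alpha = 0.1, n_iter = 5000)
```

With enough iterations and a suitably small learning rate, the estimates settle close to the true intercept and slope; too large a learning rate makes the iterates diverge instead of converge.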

The size of each steepest descent step can be chosen in relation to the size of the previous step. The gradient is basically the slope of the curve, as shown in the following figure:

In Chapter 2, Basic Concepts – Simple Linear Regression, we saw that the goal of OLS regression is to find the line that best fits the predictor in terms of minimizing the overall squared distance between itself and the response. In...
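As a quick reminder of the OLS goal mentioned here, R's built-in lm() function (from the stats package) finds this best-fitting line directly. The simulated data below is an assumption made for the example, not taken from the book.

```r
# OLS with lm(): finds the line minimizing the sum of squared residuals.
# Simulated data: true intercept 2, true slope 3, small noise.
set.seed(1)
x <- runif(100)
y <- 2 + 3 * x + rnorm(100, sd = 0.1)

fit <- lm(y ~ x)
coef(fit)                # estimated intercept and slope
sum(residuals(fit)^2)    # the minimized overall squared distance
```

Gradient descent, run long enough on the same data, approaches the same coefficients that lm() computes in closed form; the iterative method becomes attractive when datasets are too large for the direct solution.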