Regression Analysis with R

By: Giuseppe Ciaburro

Overview of this book

Regression analysis is a statistical process for estimating relationships between variables; its predictions are based on the causal effect of one variable upon another. Regression techniques for modeling and analysis are applied to large datasets in order to reveal hidden relationships among the variables. This book gives you a complete rundown of regression analysis, explaining the process from scratch. The first few chapters explain the different types of learning, supervised and unsupervised, and how they differ from each other. We then cover supervised learning in detail, addressing the various aspects of regression analysis. The chapters are arranged to follow the steps of a data science process: loading the training dataset, handling missing values, exploratory data analysis (EDA), transformations and feature engineering, model building, assessing model fit and performance, and finally making predictions on unseen datasets. Each chapter starts with the theoretical concepts and, once the reader is comfortable with the theory, moves on to practical examples that support the understanding. The practical examples are illustrated with R code using packages such as R Stats and Caret. By the end of this book you will know the concepts and pain points of regression analysis, and you will be able to apply what you have learned in your own projects.
Table of Contents (15 chapters)

Generalized Additive Model


A generalized additive model (GAM) is a GLM in which the linear predictor is given by a user-specified sum of smooth functions of the covariates plus a conventional parametric component. Assume that a sample of n objects has a response variable y and r explanatory variables x1, ..., xr. Under these assumptions, the regression equation becomes:

g(E(y)) = β0 + f1(x1) + f2(x2) + ... + fr(xr)

Here, the functions f1, f2, ..., fr are nonlinear smooth functions of the variables x, and g is the link function of the underlying GLM. In a GAM, the linear relationships between the response and the predictors are replaced by these nonlinear smooth functions, which model and capture the nonlinearities in the data.
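As a minimal sketch of this idea in R, the `mgcv` package fits GAMs by estimating a smooth function for each term wrapped in `s()`. The data below is simulated purely for illustration; the variable names and the true relationship are assumptions, not from the book.

```r
# Sketch: fitting a GAM with the mgcv package (install.packages("mgcv")).
# Simulated data with a nonlinear signal; names x1, x2, y are illustrative.
library(mgcv)

set.seed(1)
n  <- 200
x1 <- runif(n)
x2 <- runif(n)
# True relationship is nonlinear in x1 and x2, plus Gaussian noise
y  <- sin(2 * pi * x1) + x2^2 + rnorm(n, sd = 0.2)

# Each s() term corresponds to a smooth function f_j(x_j) in the
# regression equation above; the default (Gaussian) family uses the
# identity link, so g(E(y)) = E(y)
fit <- gam(y ~ s(x1) + s(x2))

summary(fit)  # effective degrees of freedom and significance of each smooth
predict(fit, newdata = data.frame(x1 = 0.5, x2 = 0.5))
```

Compared with a linear model `lm(y ~ x1 + x2)`, the smooth terms let the fit bend with the data instead of forcing a straight-line relationship for each predictor.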

We can view the GAM as a generalization of a multiple regression model without interactions between predictors. Among the advantages of this approach, besides its greater flexibility compared to the linear model, is the good convergence rate of the fitting algorithm for problems with many explanatory variables. The biggest drawback lies in the complexity of the parameter...