Although very similar to Lasso (seen in Chapter 6, Achieving Generalization), Least Angle Regression, or simply LARS, is a regression algorithm that, in a fast and smart way, selects the best features to use in the model, even when they are highly correlated with each other. LARS is an evolution of the Forward Selection (also called Forward Stepwise Regression) algorithm and of the Forward Stagewise Regression algorithm.
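As a quick taste of LARS in practice, the following sketch uses Scikit-learn's `Lars` estimator on synthetic data (the data and the choice of `n_nonzero_coefs=2` are illustrative assumptions, not part of the text):

```python
import numpy as np
from sklearn.linear_model import Lars

# Synthetic data: the target depends on only two of five predictors
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)

# Stop the LARS path after two features have entered the model
model = Lars(n_nonzero_coefs=2)
model.fit(X, y)
print(model.coef_)  # at most two coefficients are non-zero
```

Because LARS builds its solution path one feature at a time, capping `n_nonzero_coefs` acts as a built-in feature selector, much like tuning Lasso's penalty.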
Here is how the Forward Selection algorithm works, based on the hypothesis that all the variables, including the target one, have been previously normalized:
1. Of all the possible predictors for a problem, the one with the largest absolute correlation with the target variable y is selected (that is, the one with the most explanatory capability). Let's call it p1.
2. All the other predictors are now projected onto p1, and the projection is removed, creating a vector of residuals orthogonal to p1.
3. Step 1 is repeated on the residuals...
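The steps above can be sketched in NumPy as follows; this is a minimal illustration of Forward Selection, not the LARS algorithm itself, and it assumes (as the text does) that the columns of X and the target y have already been normalized:

```python
import numpy as np

def forward_selection(X, y, n_steps):
    """Greedy Forward Selection as described above (illustrative sketch).

    Assumes the columns of X and the target y are normalized."""
    X = X.copy()
    y = y.copy()
    selected = []
    for _ in range(n_steps):
        # Step 1: pick the predictor most correlated (in absolute value)
        # with the current target / residual
        corr = X.T @ y
        j = int(np.argmax(np.abs(corr)))
        selected.append(j)
        p = X[:, j]
        # Step 2: project y and the remaining predictors onto p and
        # subtract the projection, leaving residuals orthogonal to p
        y = y - (p @ y) / (p @ p) * p
        X = X - np.outer(p, (p @ X) / (p @ p))
        X[:, j] = 0.0  # this predictor is fully used up
    return selected
```

Each pass chooses one predictor and then works on what the chosen predictor cannot explain, which is exactly why the procedure can be greedy: once a feature is in, its contribution is never revisited.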