This concludes our description and implementation of linear and logistic regression, and of regularization as a technique to reduce overfitting. Your first analytical projects using machine learning will (or did) likely involve a regression model of some type. Regression models, along with the Naïve Bayes classifier, are the most accessible techniques for those without a deep knowledge of statistics or machine learning.

After completing this chapter, you should have a grasp of the following topics:

The concept of linear and nonlinear least squares-based optimization

The implementation of ordinary least squares regression as well as logistic regression

The impact of regularization, illustrated with an implementation of ridge regression
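As a compact reminder of how regularization shrinks model weights, here is a minimal sketch (not the chapter's implementation) of ridge regression restricted to a single feature with no intercept. In that case the closed-form solution reduces to a scalar, w = Σxᵢyᵢ / (Σxᵢ² + λ), so increasing λ pulls the weight toward zero; the object and method names below are illustrative only:

```scala
// Minimal single-feature ridge regression (no intercept), for illustration only.
// Closed form: w = sum(x*y) / (sum(x*x) + lambda); lambda = 0 recovers ordinary
// least squares, and a larger lambda shrinks the weight toward zero.
object RidgeSketch {
  def ridgeWeight(x: Array[Double], y: Array[Double], lambda: Double): Double = {
    require(x.length == y.length && lambda >= 0.0)
    val xy = x.zip(y).map { case (a, b) => a * b }.sum // sum of x_i * y_i
    val xx = x.map(a => a * a).sum                     // sum of x_i squared
    xy / (xx + lambda)
  }

  def main(args: Array[String]): Unit = {
    val x = Array(1.0, 2.0, 3.0)
    val y = Array(2.0, 4.0, 6.0)     // y = 2x exactly
    println(ridgeWeight(x, y, 0.0))  // lambda = 0: OLS recovers the slope 2.0
    println(ridgeWeight(x, y, 14.0)) // shrinkage: 28 / (14 + 14) = 1.0
  }
}
```

The same shrinkage effect carries over to the general multivariate case, where λ is added to the diagonal of the XᵀX matrix before inversion.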

Logistic regression is also the foundation of conditional random fields, as described in the *Conditional random fields* section in Chapter 7, *Sequential Data Models*, and of the multilayer perceptron, which is introduced in the *The multilayer perceptron...*