# Summary

This concludes the description and implementation of linear and logistic regression, along with the concept of regularization to reduce overfitting. Your first analytical projects using machine learning will (or did) likely involve a regression model of some type. Regression models, along with Naïve Bayes classification, are among the most accessible techniques for those without deep knowledge of statistics or machine learning.

At the completion of this chapter, you hopefully have a grasp on the following:

- The concept of linear and nonlinear least squares-based optimization
- The implementation of ordinary least squares regression as well as logistic regression
- The impact of regularization with an implementation of Ridge regression
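
The shrinkage effect of regularization summarized in the last point can be illustrated with a minimal sketch. The snippet below is not the chapter's implementation; it uses the closed-form Ridge solution for a single feature without an intercept, w = Σxy / (Σx² + λ), on illustrative data, where λ = 0 reduces to ordinary least squares:

```python
def ridge_weight(x, y, lam):
    """Ridge-regularized slope for a one-feature linear fit (no intercept).

    Closed form: w = sum(x*y) / (sum(x^2) + lambda).
    lam = 0.0 recovers the ordinary least squares estimate.
    """
    return sum(xi * yi for xi, yi in zip(x, y)) / (sum(xi * xi for xi in x) + lam)


# Illustrative data following the exact relation y = 2x
x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]

w_ols = ridge_weight(x, y, lam=0.0)    # unregularized fit recovers w = 2.0
w_ridge = ridge_weight(x, y, lam=1.0)  # the L2 penalty shrinks w toward zero

print(w_ols, w_ridge)
```

Increasing λ shrinks the weight further toward zero, trading a small increase in bias for lower variance, which is the mechanism by which Ridge regression reduces overfitting.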

Logistic regression is also the foundation of the conditional random fields introduced in the next chapter and of the artificial neural networks in Chapter 9, *Artificial Neural Networks*.

In contrast to the Naïve Bayes models (refer to Chapter 5, *Naïve Bayes...*