Mastering Machine Learning with scikit-learn - Second Edition

By: Gavin Hackeling
Overview of this book

Machine learning brings computer science and statistics together to build smart and efficient models. Using the powerful algorithms and techniques that machine learning offers, you can automate analytical modeling tasks. This book examines a variety of machine learning models, including popular algorithms such as k-nearest neighbors, logistic regression, naive Bayes, k-means, decision trees, and artificial neural networks. It discusses data preprocessing, hyperparameter optimization, and ensemble methods. You will build systems that classify documents, recognize images, detect ads, and more. You will learn to use scikit-learn's API to extract features from categorical variables, text, and images; evaluate model performance; and develop an intuition for how to improve your model's performance. By the end of this book, you will have mastered the concepts of scikit-learn required to build efficient models and carry out advanced tasks with a practical approach.

Training multi-layer perceptrons


In this section, we will discuss how to train a multi-layer perceptron. Recall from Chapter 5, From Simple Linear Regression to Multiple Linear Regression, that we can use gradient descent to minimize a real-valued function, C, of many variables. Assume that C is a function of two variables, v1 and v2. To understand how to change the variables in order to minimize C, we need to know how a small change in the variables changes the output. We will represent a change in the value of v1 with Δv1, a change in the value of v2 with Δv2, and a change in the value of C with ΔC. The relation between ΔC and the changes to the variables is given by:

$$\Delta C \approx \frac{\partial C}{\partial v_1} \Delta v_1 + \frac{\partial C}{\partial v_2} \Delta v_2$$
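As a quick numerical check of this approximation (an illustrative example, not from the book): take C(v1, v2) = v1^2 + v2^2, so that both partial derivatives at (v1, v2) = (1, 1) equal 2. A small step Δv1 = 0.01, Δv2 = -0.02 gives:

$$\Delta C \approx 2(0.01) + 2(-0.02) = -0.02$$

This is close to the exact change, C(1.01, 0.98) - C(1, 1) = 1.9805 - 2 = -0.0195.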
∂C/∂v1 is the partial derivative of C with respect to v1. For convenience, we will represent Δv1 and Δv2 as a vector:

$$\Delta v \equiv \begin{pmatrix} \Delta v_1 \\ \Delta v_2 \end{pmatrix}$$
We will also represent the partial derivatives of C with respect to the variables using the gradient vector of C, ∇C:

$$\nabla C \equiv \left( \frac{\partial C}{\partial v_1}, \frac{\partial C}{\partial v_2} \right)^T$$
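Continuing the illustrative example above (again, not from the book), for C(v1, v2) = v1^2 + v2^2 the gradient is:

$$\nabla C = \left( 2v_1, 2v_2 \right)^T$$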

We can rewrite our formula for ΔC as follows:

$$\Delta C \approx \nabla C \cdot \Delta v$$
On each iteration, ΔC should be negative to decrease the value of the cost function. We can guarantee that ΔC is negative by choosing Δv = -η∇C, where η is a small, positive hyperparameter called the learning rate. Substituting this choice into the previous formula gives ΔC ≈ ∇C · (-η∇C) = -η‖∇C‖², which is always non-positive because the squared norm of a vector cannot be negative. This yields the gradient descent update rule: v → v' = v - η∇C.
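The following is a minimal NumPy sketch of this update rule. It is an illustration that assumes the same toy quadratic cost used above, C(v1, v2) = v1^2 + v2^2; it is not code from the book:

import numpy as np

def C(v):
    # Toy quadratic cost: C(v1, v2) = v1^2 + v2^2 (illustrative assumption)
    return np.sum(v ** 2)

def grad_C(v):
    # Gradient of the toy cost: (dC/dv1, dC/dv2) = (2*v1, 2*v2)
    return 2 * v

eta = 0.1                    # learning rate
v = np.array([1.0, 1.0])     # initial values of v1 and v2

for _ in range(25):
    v = v - eta * grad_C(v)  # update rule: v -> v - eta * grad C(v)

print(v, C(v))               # v approaches (0, 0), the minimum of C

Each step moves v in the direction opposite the gradient, so C decreases at every iteration for a sufficiently small learning rate; with eta = 0.1 on this cost, each component of v shrinks by a factor of 0.8 per iteration.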