Practical Machine Learning Cookbook

By: Atul Tripathi
Overview of this book

Machine learning has become the new black. Today's challenge is the explosion of data: existing legacy data alongside incoming new structured and unstructured data. Discovering, understanding, analyzing, and predicting outcomes on this data with machine learning algorithms is complex. This cookbook will help you solve the everyday challenges you face as a data scientist. Applying a variety of data science techniques to multiple datasets drawn from real-world problems will help you appreciate which techniques suit which situations. The first half of the book provides recipes on fairly complex machine learning systems, where you'll learn to explore new areas of application of machine learning and improve its efficiency; it includes recipes on classification, neural networks, unsupervised and supervised learning, deep learning, reinforcement learning, and more. The second half of the book focuses on three machine learning case studies, all based on real-world data, and offers solutions to the specific machine learning issues in each one.

Stochastic gradient descent - adult income


Stochastic gradient descent, also known as incremental gradient descent, is a stochastic approximation of the gradient descent optimization method for minimizing an objective function that is written as a sum of differentiable functions:

Q(w) = (1/n) Σᵢ Qᵢ(w)

where Qᵢ(w) is the loss on the i-th training example. It searches for a minimum of this objective by iteration. In stochastic gradient descent, the true gradient of Q(w) is approximated by the gradient at a single example:

w := w − η ∇Qᵢ(w)

where η is the learning rate.

As the algorithm sweeps through the training set, it performs this update for each training example. Several passes can be made over the training set until the algorithm converges; when multiple passes are made, the data can be shuffled before each pass to prevent cycles. Typical implementations may use an adaptive learning rate so that the algorithm converges.
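To make the update rule concrete, the following is a minimal sketch of stochastic gradient descent fitting a least-squares line in R. The synthetic data, fixed learning rate, and epoch count are illustrative assumptions rather than part of this recipe:

# Minimal SGD sketch: fit y = w[1] + w[2]*x by squared-error loss.
# The data below is synthetic; eta and epochs are illustrative choices.
set.seed(42)
n <- 200
x <- runif(n)
y <- 3 * x + 1 + rnorm(n, sd = 0.1)   # true model: y = 1 + 3x plus noise

w <- c(0, 0)                          # w[1] = intercept, w[2] = slope
eta <- 0.05                           # fixed learning rate (eta)
epochs <- 50

for (epoch in seq_len(epochs)) {
  for (i in sample(n)) {              # shuffle each pass to prevent cycles
    pred <- w[1] + w[2] * x[i]
    err  <- pred - y[i]
    grad <- c(err, err * x[i])        # gradient of Q_i(w) for squared error
    w    <- w - eta * grad            # the update w := w - eta * grad Q_i(w)
  }
}
print(w)                              # should be close to c(1, 3)

Each inner iteration touches a single example, so the cost per update is independent of the dataset size; this is what distinguishes the method from batch gradient descent, which would average the gradient over all n examples before each step.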

Getting ready

To perform stochastic gradient descent, we will use a dataset derived from census data to predict income.

Step 1 - collecting and describing the data

The dataset titled adult.txt will be...
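A minimal sketch of loading and inspecting such a file in R follows. The comma-separated layout and the column names, which follow the UCI Adult (Census Income) data dictionary, are assumptions about how adult.txt is organized:

# Load the census data; the layout and UCI Adult column names are assumed.
col_names <- c("age", "workclass", "fnlwgt", "education", "education_num",
               "marital_status", "occupation", "relationship", "race", "sex",
               "capital_gain", "capital_loss", "hours_per_week",
               "native_country", "income")
adult <- read.csv("adult.txt", header = FALSE, col.names = col_names,
                  strip.white = TRUE, na.strings = "?")  # "?" marks missing values
str(adult)             # structure: type and sample values for each column
table(adult$income)    # distribution of the target variable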