
Bayesian Optimization


One of the major trade-offs of grid search and random search is that neither technique keeps track of the past evaluations of the hyperparameter combinations used for model training. Ideally, if some intelligence were built into this process that could inform it of the historic performance of the hyperparameter values already tried, along with a mechanism to steer subsequent iterations in the right direction, it would drastically reduce the number of iterations required to find the optimal set of values for the hyperparameters. Grid search and random search, however, miss on this front and iterate through all of the provided combinations without taking any cues from previous iterations.
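
As a point of contrast, the following is a minimal sketch (an illustrative assumption rather than an exercise from this book, using the caret and randomForest packages on the built-in iris data) of how grid search evaluates every combination in its grid independently, with no information flowing from one evaluation to the next:

library(caret)

data(iris)
set.seed(2019)

# Every value of mtry in this grid is evaluated, in order, regardless of
# how the previous values performed
grid <- expand.grid(mtry = c(1, 2, 3, 4))

rf_grid <- train(
  Species ~ ., data = iris,
  method = "rf",
  tuneGrid = grid,
  trControl = trainControl(method = "cv", number = 5)
)

rf_grid$results   # one row per combination, each evaluated independently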

With Bayesian optimization, we overcome this trade-off by enabling the tuning process to keep track of previous iterations and their evaluations, developing a probabilistic model that maps the hyperparameters to a probability score of the...
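
The following is a minimal sketch of this idea in R using the rBayesianOptimization package to tune a random forest; the dataset, train/validation split, bounds, and iteration counts are illustrative assumptions and not taken from the book's own exercises:

library(randomForest)
library(rBayesianOptimization)

data(iris)
set.seed(2019)

# Simple train/validation split used inside the objective function
val_idx   <- sample(nrow(iris), size = 0.3 * nrow(iris))
train_set <- iris[-val_idx, ]
val_set   <- iris[val_idx, ]

# Objective to maximize: must return a list with Score and Pred
rf_fit <- function(mtry, ntree) {
  model <- randomForest(Species ~ ., data = train_set,
                        mtry = round(mtry), ntree = round(ntree))
  acc <- mean(predict(model, val_set) == val_set$Species)
  list(Score = acc, Pred = 0)
}

# The optimizer records every (hyperparameters, Score) pair it has seen,
# fits a Gaussian process surrogate over that history, and uses an
# acquisition function to pick the next combination to evaluate
opt_res <- BayesianOptimization(
  rf_fit,
  bounds = list(mtry = c(1L, 4L), ntree = c(100L, 500L)),
  init_points = 5,   # random evaluations used to seed the surrogate
  n_iter = 10,       # guided iterations after the initial points
  acq = "ucb",       # upper confidence bound acquisition function
  verbose = TRUE
)

opt_res$Best_Par     # best hyperparameter combination found
opt_res$Best_Value   # validation accuracy achieved with it

Each guided iteration trades off exploring uncertain regions of the hyperparameter space against exploiting regions that the surrogate already predicts to score well, which is exactly the feedback loop that grid search and random search lack.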