Hyperparameter Tuning with Python

By: Louis Owen

Overview of this book

Hyperparameters are an important element in building useful machine learning models. This book curates numerous hyperparameter tuning methods for Python, one of the most popular coding languages for machine learning. Alongside in-depth explanations of how each method works, you will use a decision map that can help you identify the best tuning method for your requirements. You'll start with an introduction to hyperparameter tuning and understand why it's important. Next, you'll learn the best methods of hyperparameter tuning for a variety of use cases and specific algorithm types. This book will not only cover the usual grid or random search but also other powerful underdog methods. Individual chapters are also dedicated to the four main groups of hyperparameter tuning methods: exhaustive search, Bayesian optimization, heuristic search, and multi-fidelity optimization. Later, you will learn about top frameworks like Scikit, Hyperopt, Optuna, NNI, and DEAP to implement hyperparameter tuning. Finally, you will cover the hyperparameters of popular algorithms and best practices that will help you tune your hyperparameters efficiently. By the end of this book, you will have the skills you need to take full control over your machine learning models and get the best models for the best results.
Table of Contents (19 chapters)

Section 1: The Methods
Section 2: The Implementation
Section 3: Putting Things into Practice

What this book covers

Chapter 1, Evaluating Machine Learning Models, covers all the important things we need to know when it comes to evaluating ML models, including the concept of overfitting, the idea of splitting data into several parts, a comparison between random and stratified splits, and numerous methods for splitting the data.
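
For a flavor of what this looks like in code, here is a minimal sketch of a random versus a stratified split using scikit-learn's train_test_split (the toy imbalanced dataset and values here are illustrative, not from the book):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Illustrative imbalanced dataset: roughly 90% of samples in one class.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=42)

# Random split: class proportions in train/test may drift from the full data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Stratified split: preserves the class ratio in both partitions.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
```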

Chapter 2, Introducing Hyperparameter Tuning, introduces the concept of hyperparameter tuning, starting from its definition and moving on to its goal, several common misconceptions, and the distributions of hyperparameters.
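
As a small illustration of hyperparameter distributions, here is a sketch of sampling from a uniform versus a log-uniform distribution, assuming SciPy is available (the ranges are illustrative):

```python
from scipy.stats import loguniform, uniform

# Uniform: every value in [0.1, 1.0] is equally likely
# (a natural choice for, e.g., a subsampling ratio).
subsample = uniform(loc=0.1, scale=0.9).rvs(size=5, random_state=0)

# Log-uniform: each order of magnitude is equally likely
# (a natural choice for, e.g., a learning rate).
learning_rate = loguniform(1e-4, 1e-1).rvs(size=5, random_state=0)

print(subsample, learning_rate)
```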

Chapter 3, Exploring Exhaustive Search, explores each method in the first of the four groups of hyperparameter tuning methods, along with their pros and cons. There will be both high-level and detailed explanations for each of the methods. The high-level explanation will use a visualization strategy to help you understand more easily, while the detailed explanation will bring the math to the table.
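
As a quick taste of this group, here is a minimal grid search sketch using scikit-learn's GridSearchCV (the model and grid are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Exhaustive search: every combination in the grid is evaluated
# with 5-fold cross-validation.
param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```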

Chapter 4, Exploring Bayesian Optimization, explores each method in the second of the four groups of hyperparameter tuning methods, along with their pros and cons. There will also be both high-level and detailed explanations for each of the methods.
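
As a quick taste, here is a minimal Bayesian optimization sketch on a 1-D toy objective, assuming scikit-optimize (skopt) is installed; the objective is an illustrative stand-in for a real validation loss:

```python
from skopt import gp_minimize

def objective(params):
    x = params[0]
    return (x - 2) ** 2  # pretend this is a cross-validated loss

# A Gaussian-process surrogate model guides where to evaluate next,
# balancing exploration against exploitation.
result = gp_minimize(objective, [(-5.0, 5.0)], n_calls=20, random_state=0)
print(result.x, result.fun)
```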

Chapter 5, Exploring Heuristic Search, explores each method in the third of the four groups of hyperparameter tuning methods, along with their pros and cons. There will also be both high-level and detailed explanations for each of the methods.
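
As a quick taste, here is a minimal heuristic search sketch in plain Python: random-restart hill climbing over a single toy hyperparameter. This is purely illustrative; the methods in this chapter, such as genetic algorithms, are more sophisticated:

```python
import random

def score(x):
    return -(x - 2) ** 2  # pretend this is a validation score to maximize

best_x, best_s = None, float("-inf")
for _ in range(5):  # random restarts escape poor local optima
    x = random.uniform(-5, 5)
    for _ in range(100):  # greedy local steps
        candidate = x + random.gauss(0, 0.5)
        if score(candidate) > score(x):
            x = candidate
    if score(x) > best_s:
        best_x, best_s = x, score(x)
print(best_x, best_s)
```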

Chapter 6, Exploring Multi-Fidelity Optimization, explores each method in the fourth of the four groups of hyperparameter tuning methods, along with their pros and cons. There will also be both high-level and detailed explanations for each of the methods.
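
As a quick taste, here is a minimal successive-halving sketch, assuming scikit-learn 0.24 or later, where the halving searches are still marked experimental (the model and grid are illustrative):

```python
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
param_grid = {"max_depth": [2, 4, 8, None], "min_samples_split": [2, 5, 10]}

# Candidates start cheap (few samples); only the best third survives
# each round to be re-evaluated with more resources.
search = HalvingGridSearchCV(RandomForestClassifier(random_state=0),
                             param_grid, factor=3, random_state=0)
search.fit(X, y)
print(search.best_params_)
```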

Chapter 7, Hyperparameter Tuning via Scikit, covers all the important things about scikit-learn, scikit-optimize, and scikit-hyperband, along with how to utilize each of them to perform hyperparameter tuning.
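
For instance, a minimal sketch of scikit-optimize's drop-in BayesSearchCV interface might look like this (the search space is illustrative):

```python
from skopt import BayesSearchCV
from skopt.space import Real
from sklearn.svm import SVC
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# BayesSearchCV mirrors GridSearchCV's API but samples the space
# with a Bayesian surrogate instead of enumerating it.
search = BayesSearchCV(
    SVC(),
    {"C": Real(1e-2, 1e2, prior="log-uniform"),
     "gamma": Real(1e-3, 1e1, prior="log-uniform")},
    n_iter=16, cv=3, random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```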

Chapter 8, Hyperparameter Tuning via Hyperopt, introduces the Hyperopt package, starting from its capabilities and limitations and moving on to how to utilize it to perform hyperparameter tuning, along with all the other important things you need to know about it.
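
For instance, a minimal sketch of Hyperopt's core fmin loop might look like this (the objective is an illustrative stand-in for a cross-validated loss):

```python
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    x = params["x"]
    return (x - 2) ** 2  # the value returned is treated as the loss

# The search space is declared with hp.* distribution constructors.
space = {"x": hp.uniform("x", -5, 5)}
trials = Trials()

# TPE (Tree-structured Parzen Estimator) proposes each next point to try.
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print(best)
```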

Chapter 9, Hyperparameter Tuning via Optuna, introduces the Optuna package, starting from its numerous features and moving on to how to utilize it to perform hyperparameter tuning, along with all the other important things you need to know about it.
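
For instance, a minimal sketch of Optuna's define-by-run API might look like this (the objective is an illustrative stand-in for a real training run):

```python
import optuna

def objective(trial):
    # The search space is declared inline, inside the objective itself.
    x = trial.suggest_float("x", -5, 5)
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)  # log-scale search
    return (x - 2) ** 2 + lr  # pretend this is a validation loss

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```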

Chapter 10, Advanced Hyperparameter Tuning with DEAP and Microsoft NNI, shows how to perform hyperparameter tuning using both the DEAP and Microsoft NNI packages, starting from getting ourselves familiar with the packages and moving on to the important modules and parameters we need to be aware of.
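
For instance, a minimal genetic-algorithm sketch with DEAP might look like this; the two-variable toy objective is illustrative, not the book's exact setup (NNI, by contrast, is driven by a trial script plus a configuration file, so it is not sketched here):

```python
import random
from deap import base, creator, tools, algorithms

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

def evaluate(ind):
    x, y = ind
    return (-(x - 2) ** 2 - (y + 1) ** 2,)  # DEAP expects a tuple

toolbox = base.Toolbox()
toolbox.register("attr", random.uniform, -5, 5)
toolbox.register("individual", tools.initRepeat, creator.Individual,
                 toolbox.attr, n=2)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxBlend, alpha=0.5)
toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=0.5, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)

# Evolve a population of candidate hyperparameter pairs for 20 generations.
pop = toolbox.population(n=30)
algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=20, verbose=False)
print(tools.selBest(pop, 1)[0])
```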

Chapter 11, Understanding Hyperparameters of Popular Algorithms, explores the hyperparameters of several popular ML algorithms. There will be a broad explanation for each of the algorithms, including (but not limited to) the definition of each hyperparameter, what is impacted when the value of each hyperparameter is changed, and a priority list of hyperparameters ranked by impact.
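
As a small illustration, here is a sketch of key hyperparameters for one popular algorithm, random forest, with comments summarizing the usual direction of their impact (the values are illustrative, not recommendations from the book):

```python
from sklearn.ensemble import RandomForestClassifier

model = RandomForestClassifier(
    n_estimators=200,      # more trees: steadier estimates, longer training
    max_depth=10,          # deeper trees: lower bias, higher overfitting risk
    min_samples_split=5,   # larger values regularize by requiring bigger nodes
    max_features="sqrt",   # fewer features per split: more tree diversity
    random_state=42,
)
```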

Chapter 12, Introducing Hyperparameter Tuning Decision Map, introduces the Hyperparameter Tuning Decision Map (HTDM), which summarizes all of the discussed hyperparameter tuning methods as a simple decision map based on six aspects. There will also be three case studies that show how to utilize the HTDM in practice.

Chapter 13, Tracking Hyperparameter Tuning Experiments, covers the importance of tracking hyperparameter tuning experiments, along with the usual practices. You will also be introduced to several open source packages that are available and learn how to utilize each of them in practice.
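
As a small illustration, here is a sketch of logging one tuning trial with MLflow, used here as an assumed example of an open source tracker (the book's specific package list may differ, and the logged numbers are placeholders):

```python
import mlflow

# Each tuning trial becomes a run whose parameters and metrics are recorded,
# so experiments remain comparable and reproducible later.
with mlflow.start_run(run_name="rf-trial-1"):
    mlflow.log_param("n_estimators", 200)   # hyperparameters tried
    mlflow.log_param("max_depth", 10)
    mlflow.log_metric("cv_accuracy", 0.87)  # illustrative placeholder value
```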

Chapter 14, Conclusions and Next Steps, summarizes all the important lessons learned in the previous chapters and introduces several topics and implementations that may benefit you but are not covered in detail in this book.