Machine Learning for Imbalanced Data

By: Kumar Abhishek, Dr. Mounir Abdelaziz

Overview of this book

As machine learning practitioners, we often encounter imbalanced datasets in which one class has considerably fewer instances than the other. Many machine learning algorithms assume a balance between the majority and minority classes, leading to suboptimal performance on imbalanced data. This comprehensive guide helps you address class imbalance to significantly improve model performance. Machine Learning for Imbalanced Data begins by introducing you to the challenges posed by imbalanced datasets and the importance of addressing them. It then guides you through techniques that enhance the performance of classical machine learning models on imbalanced data, including various sampling and cost-sensitive learning methods. As you progress, you’ll delve into similar and more advanced techniques for deep learning models, employing PyTorch as the primary framework. Throughout the book, hands-on examples provide working, reproducible code that demonstrates the practical implementation of each technique. By the end of this book, you’ll be adept at identifying and addressing class imbalance, confidently applying techniques such as sampling, cost-sensitive learning, and threshold adjustment with both traditional machine learning and deep learning models.

Sampling techniques for deep learning models

In this section, we’ll explore sampling methods for deep learning models, such as random oversampling and weighted sampling. We’ll then transition into data augmentation techniques, which bolster model robustness and mitigate dataset limitations; while large datasets are ideal for deep learning, real-world constraints often make them hard to obtain. We’ll start with standard augmentation methods before discussing more advanced techniques such as CutMix and MixUp.

Random oversampling

Here, we will apply the plain old random oversampling we learned about in Chapter 2, Oversampling Methods, but with image data as input to a neural network. The basic idea is to randomly duplicate samples from the minority classes until each class has the same number of samples. This technique often performs better than no sampling at all.
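
The following is a minimal sketch of this idea in PyTorch. The synthetic tensors and variable names are illustrative assumptions, not the book's companion code; a real image dataset would take the place of the random tensors:

import numpy as np
import torch
from torch.utils.data import DataLoader, Subset, TensorDataset

# Synthetic imbalanced image data: 900 samples of class 0, 100 of class 1
# (stands in for a real image dataset in this sketch)
images = torch.randn(1000, 3, 32, 32)
labels = torch.cat([torch.zeros(900, dtype=torch.long),
                    torch.ones(100, dtype=torch.long)])
dataset = TensorDataset(images, labels)

# Randomly duplicate indices per class until every class matches the majority
label_arr = labels.numpy()
class_counts = np.bincount(label_arr)
max_count = class_counts.max()

balanced_indices = []
for cls in range(len(class_counts)):
    cls_indices = np.where(label_arr == cls)[0]
    # Sampling with replacement lets the minority class reach max_count
    resampled = np.random.choice(cls_indices, size=max_count, replace=True)
    balanced_indices.extend(resampled.tolist())

balanced_dataset = Subset(dataset, balanced_indices)
loader = DataLoader(balanced_dataset, batch_size=64, shuffle=True)
print(len(balanced_dataset))  # 1800, i.e., 900 per class

An on-the-fly alternative is torch.utils.data.WeightedRandomSampler, which draws minority samples more often at batch time rather than materializing duplicated indices up front; we’ll look at weighted sampling next.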

Tip

Make sure to train the model for enough...