Practical Guide to Applied Conformal Prediction in Python

By Valery Manokhin
Overview of this book

In the rapidly evolving landscape of machine learning, the ability to accurately quantify uncertainty is pivotal. This book addresses that need with an in-depth exploration of Conformal Prediction, a cutting-edge framework for managing uncertainty in a wide range of ML applications. Learn how Conformal Prediction excels at calibrating classification models, produces well-calibrated prediction intervals for regression, and resolves challenges in time series forecasting and imbalanced data. Discover specialised applications of Conformal Prediction in cutting-edge domains such as computer vision and NLP. Each chapter delves into a specific aspect, offering hands-on insights and best practices for enhancing prediction reliability. The book concludes with the nuances of multi-class classification, equipping you to integrate Conformal Prediction seamlessly across diverse industries. With practical Python examples using real-world datasets, expert insights, and open-source library applications, you will gain a solid understanding of this modern framework for uncertainty quantification. By the end of this book, you will have mastered Conformal Prediction in Python through a blend of theory and practical application, enabling you to confidently apply this powerful framework to quantify uncertainty in diverse fields.
Table of Contents (19 chapters)

Part 1: Introduction
Part 2: Conformal Prediction Framework
Part 3: Applications of Conformal Prediction
Part 4: Advanced Topics

Conformal prediction for time series and forecasting

Creating reliable prediction intervals (PIs) for time series forecasting was a longstanding, intricate challenge that remained unsolved until conformal prediction emerged.

This problem was underscored during the 2018 M4 forecasting competition, which required participants to submit PIs alongside point forecasts.

In the research paper titled Combining Prediction Intervals in the M4 Competition (https://www.sciencedirect.com/science/article/abs/pii/S0169207019301141), Yael Grushka-Cockayne from the Darden School of Business and Victor Richmond R. Jose from Harvard Business School analyzed 20 interval submissions. They assessed both the calibration and the precision of the predictions and compared performance across different forecast horizons. Their analysis concluded that the submissions failed to estimate uncertainty accurately.
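In this context, calibration means that the empirical coverage of the intervals matches their nominal level (a 95% PI should contain roughly 95% of the actual values), while precision refers to how narrow the intervals are. The following is a minimal sketch, using a synthetic random-walk series and a hypothetical fixed-width naive forecast, of how both quantities can be measured:

    import numpy as np

    def interval_metrics(y_true, lower, upper):
        # Empirical coverage: fraction of actuals falling inside their interval.
        covered = (y_true >= lower) & (y_true <= upper)
        # Precision proxy: average interval width (narrower is better at equal coverage).
        return covered.mean(), (upper - lower).mean()

    # Toy example: a naive one-step-ahead forecast with a fixed-width interval.
    rng = np.random.default_rng(42)
    y = rng.normal(size=200).cumsum()            # synthetic random-walk series
    forecast = np.concatenate(([y[0]], y[:-1]))  # naive forecast: previous value
    half_width = 2.0                             # hypothetical fixed half-width
    coverage, width = interval_metrics(y, forecast - half_width, forecast + half_width)
    print(f"coverage = {coverage:.1%}, mean width = {width:.2f}")

Among methods with comparable coverage, the one producing narrower intervals is preferred.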

Ensemble batch prediction intervals (EnbPI)

Conformal Prediction Intervals for Dynamic Time-Series (http...
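At its core, EnbPI fits an ensemble of models on bootstrap resamples of the training series and uses out-of-bag (leave-one-out) residuals to form intervals, avoiding the need for a separate calibration set. Below is a minimal from-scratch sketch of this idea, assuming mean aggregation, a scikit-learn regressor, and a fixed miscoverage level alpha; it omits the rolling residual update that makes the full method adaptive and is not the authors' reference implementation:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def enbpi_sketch(X_train, y_train, X_test, alpha=0.1, n_boot=20, seed=0):
        rng = np.random.default_rng(seed)
        n = len(X_train)
        models, in_bag = [], []
        # 1) Fit an ensemble on bootstrap resamples of the training data.
        for b in range(n_boot):
            idx = rng.integers(0, n, n)
            model = RandomForestRegressor(n_estimators=50, random_state=b)
            model.fit(X_train[idx], y_train[idx])
            models.append(model)
            bag = np.zeros(n, dtype=bool)
            bag[idx] = True
            in_bag.append(bag)
        # 2) Leave-one-out residuals: each training point is predicted only by
        #    models whose bootstrap sample did not contain it.
        residuals = np.full(n, np.nan)
        for i in range(n):
            oob = [m for m, bag in zip(models, in_bag) if not bag[i]]
            if oob:  # with n_boot = 20, nearly every point has out-of-bag models
                pred = np.mean([m.predict(X_train[i:i + 1])[0] for m in oob])
                residuals[i] = abs(y_train[i] - pred)
        residuals = residuals[~np.isnan(residuals)]
        # 3) Point forecast is the ensemble mean; the interval adds and subtracts
        #    an empirical quantile of the leave-one-out residuals.
        point = np.mean([m.predict(X_test) for m in models], axis=0)
        q = np.quantile(residuals, 1 - alpha)
        return point - q, point, point + q

Given feature and target arrays, a call such as enbpi_sketch(X[:800], y[:800], X[800:]) returns the lower bound, point forecast, and upper bound for the held-out horizon.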