Machine Learning for Time-Series with Python

By: Ben Auffarth

Overview of this book

The Python time-series ecosystem is huge and can be hard to get a good grasp of, since there are so many libraries and models to choose from. This book aims to deepen your understanding of time series by providing a comprehensive overview of popular Python time-series packages and to help you build better predictive systems. Machine Learning for Time-Series with Python starts by re-introducing the basics of time series and then builds your understanding of traditional autoregressive models as well as modern non-parametric models. Through practical examples and the theory behind them, you will become confident loading time-series datasets from any source, applying deep learning models such as recurrent neural networks and causal convolutional networks, and using gradient boosting with feature engineering. This book will also guide you in matching the right model to the right problem by explaining the theory behind several useful models. You'll also look at real-world case studies covering weather, traffic, biking, and stock market data. By the end of this book, you should feel at home with effectively analyzing and applying machine learning methods to time series.

What this book covers

Chapter 1, Introduction to Time-Series with Python, is a general introduction to the topic. You'll learn about time series and why they are important, along with common conventions, and you'll see an overview of applications and techniques that will be explained in more detail in dedicated chapters.

Chapter 2, Time-Series Analysis with Python, breaks down the steps for analyzing time-series. It explains statistical tests and visualizations relevant for making sense of and drawing insights from time-series.
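
To give a flavor of the tests covered there, here is a minimal sketch (not an excerpt from the book) of a stationarity check with the Augmented Dickey-Fuller test from statsmodels, run on a synthetic random walk:

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=500))  # a random walk is non-stationary

adf_stat, p_value, *_ = adfuller(series)
print(f"ADF statistic: {adf_stat:.3f}, p-value: {p_value:.3f}")
# A large p-value means the unit-root (non-stationarity) hypothesis cannot be rejected.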

Chapter 3, Preprocessing Time-Series, is about data treatment of time-series for both traditional techniques and machine learning. Methods such as naïve decomposition and STL (seasonal-trend decomposition using Loess) for separating seasonal and trend effects are covered, along with value normalizations and specific feature-extraction techniques such as catch22 and ROCKET.
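
As an illustration of seasonal-trend decomposition (a sketch of my own on synthetic data, not the book's example), STL is available in statsmodels:

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Synthetic monthly series: linear trend + yearly seasonality + noise.
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
rng = np.random.default_rng(42)
values = 0.05 * np.arange(96) + 2 * np.sin(2 * np.pi * np.arange(96) / 12)
y = pd.Series(values + rng.normal(scale=0.3, size=96), index=idx)

res = STL(y, period=12).fit()  # seasonal-trend decomposition using Loess
print(res.trend.tail())        # res.seasonal and res.resid hold the other components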

Chapter 4, Introduction to Machine Learning for Time-Series, provides an overview of the state of the art for univariate and multivariate time-series forecasts and predictions.

Chapter 5, Forecasting with Moving Averages and Autoregressive Models, focuses on forecasting, mostly on univariate time-series (see Chapter 12, Multivariate Forecasting, for multivariate time-series). Well-established traditional methods used in econometrics are introduced, explained, and applied to datasets.
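
For a taste of such a model, the following sketch (synthetic data, using the ARIMA class from statsmodels) fits an AR(1) model and produces a 10-step-ahead forecast:

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Simulate an AR(1) process: y_t = 0.8 * y_{t-1} + noise.
rng = np.random.default_rng(1)
y = [0.0]
for _ in range(299):
    y.append(0.8 * y[-1] + rng.normal())
series = pd.Series(y)

result = ARIMA(series, order=(1, 0, 0)).fit()  # AR order 1, no differencing, no MA terms
print(result.forecast(steps=10))               # 10-step-ahead point forecasts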

Chapter 6, Unsupervised Methods for Time-Series, introduces anomaly detection, change detection, and clustering. The chapter reviews industry practices at major technology companies such as Facebook, Amazon, Google, and others, and gives practical examples for both anomaly detection and change detection.
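
As a toy illustration of anomaly detection (not one of the chapter's case studies), a rolling z-score flags points that deviate strongly from their recent history:

import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
values = rng.normal(size=500)
values[250] += 8.0                    # inject an obvious anomaly
s = pd.Series(values)

rolling_mean = s.rolling(window=50, min_periods=10).mean()
rolling_std = s.rolling(window=50, min_periods=10).std()
z_score = (s - rolling_mean) / rolling_std

print(s[z_score.abs() > 4])           # should flag the injected point at index 250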

Chapter 7, Machine Learning Models for Time-Series, reviews recent research on machine learning for time-series at institutions such as the University of East Anglia and Monash University. Many techniques are summarized and compared throughout the chapter, and there's a practical section with many examples.

Chapter 8, Online Learning for Time-Series, introduces online learning, an often-neglected topic. Online models continuously update their parameters based on the latest samples, and some of them have mechanisms to deal with different kinds of drift – a common problem with time-series.
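
A minimal sketch of the online-learning loop, here assuming the river library (a popular choice for streaming models) is installed, looks like this:

from river import linear_model, metrics

model = linear_model.LinearRegression()
metric = metrics.MAE()

stream = [({"x": float(i)}, 2.0 * i + 1.0) for i in range(100)]  # toy data stream

for features, target in stream:
    y_pred = model.predict_one(features)  # predict before the target is revealed
    metric.update(target, y_pred)
    model.learn_one(features, target)     # update the parameters on this one sample

print(metric)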

Chapter 9, Probabilistic Models for Time-Series, covers probabilistic models for time-series. These include models with confidence intervals, such as Facebook's Prophet, Markov models, fuzzy models, and counterfactual causal models such as Bayesian structural time-series models, as proposed by Google.
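
As an illustration of a probabilistic forecast with uncertainty intervals, here is a minimal Prophet sketch on made-up daily data (assuming the prophet package is installed):

import pandas as pd
from prophet import Prophet

# Prophet expects a DataFrame with columns 'ds' (dates) and 'y' (values).
df = pd.DataFrame({
    "ds": pd.date_range("2020-01-01", periods=365, freq="D"),
    "y": range(365),
})

m = Prophet()
m.fit(df)
future = m.make_future_dataframe(periods=30)  # extend 30 days beyond the data
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())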

Chapter 10, Deep Learning for Time-Series, reviews recent literature and benchmarks for different tasks. The chapter explains techniques such as autoencoders, InceptionTime, DeepAR, N-BEATS, recurrent neural networks, ConvNets, and Informer. Deep learning still hasn't completely caught up with traditional statistical and other machine learning techniques; however, progress has been promising, and for certain applications, such as multivariate predictions, deep learning techniques are emerging as the state of the art, as can be seen in competitions such as M4.

Chapter 11, Reinforcement Learning for Time-Series, gives an overview of basic concepts in reinforcement learning. It introduces techniques relevant to time-series, such as bandit algorithms and Deep Q-Learning, which are applied to a recommender system and a trading algorithm.
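
For intuition, an epsilon-greedy bandit can be written in a few lines of plain Python (an illustrative toy, not the book's recommender or trading example):

import random

true_rewards = [0.2, 0.5, 0.8]  # unknown success probability of each arm
counts = [0, 0, 0]
estimates = [0.0, 0.0, 0.0]
epsilon = 0.1

for step in range(10_000):
    if random.random() < epsilon:
        arm = random.randrange(len(true_rewards))                     # explore
    else:
        arm = max(range(len(estimates)), key=estimates.__getitem__)   # exploit
    reward = 1.0 if random.random() < true_rewards[arm] else 0.0
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]         # incremental mean

print(estimates)  # should approach the true reward probabilities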

Chapter 12, Multivariate Forecasting, gives practical examples for multivariate multistep forecasts of energy demand with deep learning models.
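
As a rough sketch of what a multivariate, multi-step forecast looks like in code (a toy Keras LSTM on synthetic data, assumed here rather than taken from the book's energy-demand case study):

import numpy as np
import tensorflow as tf

n_features, window, horizon = 3, 24, 6
rng = np.random.default_rng(0)
series = rng.normal(size=(1000, n_features)).cumsum(axis=0)  # synthetic multivariate series

# Sliding windows: from the last `window` steps of all features,
# predict the next `horizon` values of feature 0.
X, y = [], []
for i in range(len(series) - window - horizon):
    X.append(series[i : i + window])
    y.append(series[i + window : i + window + horizon, 0])
X, y = np.array(X), np.array(y)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, n_features)),
    tf.keras.layers.Dense(horizon),  # one output per forecast step
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

print(model.predict(X[-1:]).shape)   # (1, horizon)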