
Practical Time Series Analysis

By: Avishek Pal, PKS Prakash

Overview of this book

Time series analysis allows us to analyze data that is generated over a period of time and has sequential interdependencies between the observations. This book describes mathematical techniques geared towards exploring the internal structures of time series data and generating powerful descriptive and predictive insights. The book is also full of real-life examples of time series and their analyses using cutting-edge solutions developed in Python. It starts with descriptive analysis to create insightful visualizations of internal structures such as trend, seasonality, and autocorrelation. Next, statistical methods for dealing with autocorrelation and non-stationary time series are described. This is followed by exponential smoothing to produce meaningful insights from noisy time series data. At this point, the focus shifts towards predictive analysis, introducing autoregressive models such as ARMA and ARIMA for time series forecasting. Later, powerful deep learning methods are presented for developing accurate forecasting models for complex time series, even when little domain knowledge is available. All the topics are illustrated with real-life problem scenarios and their solutions as best-practice implementations in Python. The book concludes with an Appendix that briefly discusses programming and solving data science problems using Python.

Recurrent neural networks


So far, we have used an MLP to develop a time series forecasting model. To predict the series at time t, we fed an input vector [x_{t-1}, ..., x_{t-p}] of the past p time steps to an MLP. These past p time steps are fed to the MLP as uncorrelated independent variables. One problem with this kind of model is that it does not implicitly consider the sequential nature of time series data, where observations are correlated with each other. The correlation in a time series can also be interpreted as the memory that the series carries over itself. In this section, we will discuss recurrent neural networks (RNNs), which are architecturally different from MLPs and are more appropriate for fitting sequential data.
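To make the architectural difference concrete, here is a minimal sketch (not the book's code) of a vanilla RNN forward pass in NumPy. All names and sizes below are illustrative assumptions. Unlike an MLP, which would learn a separate weight for each of the p lagged inputs, the RNN reuses the same weights at every time step, and the hidden state h carries information forward through the window, which is the "memory" the text refers to:

```python
import numpy as np

rng = np.random.default_rng(0)

p = 4            # number of past time steps fed to the network (assumed)
hidden = 8       # size of the hidden state (assumed)

# Shared parameters, reused at every time step.
W_xh = rng.normal(scale=0.1, size=(1, hidden))       # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden, hidden))  # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(hidden, 1))       # hidden -> output

def rnn_forecast(window):
    """One-step-ahead forecast from a window [x_{t-p}, ..., x_{t-1}]."""
    h = np.zeros(hidden)
    for x in window:                          # process the steps in order
        h = np.tanh(x * W_xh[0] + h @ W_hh)   # hidden state carries memory
    return (h @ W_hy).item()                  # estimate of x_t

series = np.sin(np.linspace(0, 10, 50))       # toy series for illustration
print(rnn_forecast(series[-p:]))
```

Because the weights here are random and untrained, the forecast itself is meaningless; the point is only the data flow: the same W_xh and W_hh are applied at each step, so the order of the inputs matters, whereas an MLP sees the window as an unordered vector of independent features.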

RNNs emerged as a good choice to develop language models that model the probability of occurrence of a word given the words that appear prior to it. So far, RNNs have been used to develop models that do text classification, for example, sentiment prediction, language translation...