Python for Finance Cookbook - Second Edition

By: Eryk Lewinson
Overview of this book

Python is one of the most popular programming languages in the financial industry, with a huge collection of accompanying libraries. In this new edition of the Python for Finance Cookbook, you will explore classical quantitative finance approaches to data modeling, such as GARCH, CAPM, and factor models, as well as modern machine learning and deep learning solutions. You will use popular Python libraries that, in a few lines of code, provide the means to quickly process, analyze, and draw conclusions from financial data. In this new edition, more emphasis is put on exploratory data analysis to help you visualize and better understand financial data. While doing so, you will also learn how to use Streamlit to create elegant, interactive web applications to present the results of technical analyses. Using the recipes in this book, you will become proficient in financial data analysis, be it for personal or professional projects. You will also understand which potential issues to expect with such analyses and, more importantly, how to overcome them.

Time series forecasting with Amazon’s DeepAR

We have already covered time series analysis and forecasting in Chapter 6, Time Series Analysis and Forecasting, and Chapter 7, Machine Learning-Based Approaches to Time Series Forecasting. This time, we will look at a deep learning approach to time series forecasting. In this recipe, we cover Amazon's DeepAR model, which was originally developed as a tool for demand/sales forecasting at the scale of hundreds, if not thousands, of stock-keeping units (SKUs).

A detailed description of DeepAR's architecture is beyond the scope of this book, so we will focus only on the model's key characteristics, listed below:

  • DeepAR creates a global model that is used for all the considered time series. It implements LSTM cells in an architecture that allows training on hundreds or thousands of time series simultaneously. The model also uses an encoder-decoder setup, which is common in sequence-to-sequence models (a minimal training sketch follows this list).
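To make the idea of a single global model trained on many series more concrete, below is a minimal sketch using GluonTS and its PyTorch-based DeepAREstimator. This is not necessarily the exact workflow used in this recipe: the synthetic daily series, the frequency, and all hyperparameters are illustrative assumptions, and import paths and argument names differ slightly between GluonTS versions.

import numpy as np
import pandas as pd

from gluonts.dataset.common import ListDataset
from gluonts.torch import DeepAREstimator

# Assumption: a few synthetic daily series stand in for the hundreds or
# thousands of SKU-level series that DeepAR was designed to handle.
N_SERIES, N_OBS, FREQ, PRED_LEN = 5, 365, "D", 30
rng = np.random.default_rng(42)
start = pd.Timestamp("2020-01-01")

# A single dataset holding all series -- DeepAR fits one global model
# across them instead of one model per series.
train_ds = ListDataset(
    [
        {
            "start": start,
            # level shift + mild trend + noise, purely for illustration
            "target": 100 + 10 * i + 0.05 * np.arange(N_OBS) + rng.normal(0, 2, N_OBS),
        }
        for i in range(N_SERIES)
    ],
    freq=FREQ,
)

# LSTM-based probabilistic estimator; the hyperparameters are placeholders.
estimator = DeepAREstimator(
    freq=FREQ,
    prediction_length=PRED_LEN,
    context_length=2 * PRED_LEN,
    trainer_kwargs={"max_epochs": 5},
)
predictor = estimator.train(training_data=train_ds)

# Each forecast is probabilistic: we can extract the median path and
# any quantile (here, the 90th percentile) for every series.
for forecast in predictor.predict(train_ds):
    print(forecast.median[:5], forecast.quantile(0.9)[:5])

Because DeepAR produces probabilistic forecasts, each forecast object contains sample paths rather than a single point prediction, which is why the sketch reads off the median and an upper quantile instead of one number.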