
A neural network from scratch in Python

To gain a better understanding of how NNs work, we will formulate the single-layer architecture and forward propagation computations displayed in Figure 17.2 using matrix algebra and implement it using NumPy. You can find the code samples in the notebook build_and_train_feedforward_nn.

The input layer

The architecture shown in Figure 17.2 is designed for two-dimensional input data X whose labels Y represent two different classes. In matrix form, both X and Y have shape N × 2, where N is the number of samples.

We will generate 50,000 random samples from two classes, arranged as two concentric circles with different radii, using scikit-learn's make_circles function so that the classes are not linearly separable:

import numpy as np
from sklearn.datasets import make_circles

# Generate N samples on two concentric circles with Gaussian noise
N = 50000
factor = 0.1    # ratio of inner to outer circle radius
noise = 0.1     # standard deviation of the Gaussian noise
X, y = make_circles(n_samples=N, shuffle=True,
                    factor=factor, noise=noise)
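
The factor parameter controls the ratio of the inner to the outer circle's radius, while noise adds Gaussian jitter to each point. To visually confirm that the two classes form concentric rings and cannot be separated by a straight line, a quick scatter plot helps; this is a minimal sketch that assumes matplotlib is available and is not part of the book's notebook:

import matplotlib.pyplot as plt

# Color each sample by its class label to show the two concentric circles
plt.scatter(X[:, 0], X[:, 1], c=y, s=1, cmap='coolwarm', alpha=0.5)
plt.gca().set_aspect('equal')
plt.title('make_circles: two classes that are not linearly separable')
plt.show()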

We then one-hot encode the one-dimensional class labels y into a two-dimensional array Y:

# One-hot encode: column c equals 1 for samples that belong to class c
Y = np.zeros((N, 2))
for c in [0, 1]:
    Y[y == c, c] = 1
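
With the inputs X and one-hot targets Y in place, the forward propagation mentioned above can be written directly in matrix form with NumPy. The following is a minimal sketch rather than the notebook's exact code; it assumes a single hidden layer with a logistic sigmoid activation and a softmax output, and the weight and bias names (W1, b1, W2, b2) and the hidden-layer size H are illustrative choices:

def sigmoid(z):
    # Logistic sigmoid, applied element-wise
    return 1 / (1 + np.exp(-z))

def softmax(z):
    # Row-wise softmax; subtracting the row max improves numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward_prop(X, W1, b1, W2, b2):
    # X: (N, 2) inputs; W1: (2, H); b1: (H,); W2: (H, 2); b2: (2,)
    Z1 = X @ W1 + b1          # hidden-layer pre-activations, shape (N, H)
    A1 = sigmoid(Z1)          # hidden-layer activations
    Z2 = A1 @ W2 + b2         # output-layer pre-activations, shape (N, 2)
    return softmax(Z2)        # class probabilities; each row sums to 1

# Example with H = 3 hidden units and small random initial weights
rng = np.random.default_rng(42)
H = 3
W1, b1 = rng.normal(scale=0.1, size=(2, H)), np.zeros(H)
W2, b2 = rng.normal(scale=0.1, size=(H, 2)), np.zeros(2)
probs = forward_prop(X, W1, b1, W2, b2)   # shape (N, 2)

Each row of probs holds the predicted probabilities for the two classes; training then amounts to adjusting W1, b1, W2, and b2 so that these probabilities match the one-hot targets Y.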