Machine Learning for Finance

By: Jannes Klaas

Overview of this book

Machine Learning for Finance explores new advances in machine learning and shows how they can be applied across the financial sector, including insurance, transactions, and lending. This book explains the concepts and algorithms behind the main machine learning techniques and provides example Python code for implementing the models yourself. The book is based on Jannes Klaas’ experience of running machine learning training courses for financial professionals. Rather than providing ready-made financial algorithms, the book focuses on advanced machine learning concepts and ideas that can be applied in a wide variety of ways. The book systematically explains how machine learning works on structured data, text, images, and time series. You'll cover generative adversarial learning, reinforcement learning, debugging, and launching machine learning products. Later chapters will discuss how to fight bias in machine learning. The book ends with an exploration of Bayesian inference and probabilistic programming.

Simple RNN


Another method to make order matter within neural networks is to give the network some kind of memory. So far, all of our networks have done a forward pass without retaining any memory of previous passes. It's time to change that with a recurrent neural network (RNN):

The scheme of an RNN

RNNs contain recurrent layers. Recurrent layers can remember their last activation and use it as their own input:

A recurrent layer takes a sequence as an input. For each element, it computes a matrix multiplication (W * in), just like a Dense layer, and runs the result through an activation function, such as relu. It then retains its own activation. When the next item of the sequence arrives, it performs the matrix multiplication as before, but this time it also multiplies its previous activation by a second matrix, the recurrent weight matrix. The recurrent layer adds the results of both operations together and passes the sum through the activation function again.
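The step described above can be sketched in plain NumPy. This is an illustrative toy, not the book's code; the names W (input weights), U (recurrent weights), and b (bias) are conventional labels, and the sizes are arbitrary:

```python
import numpy as np

# Toy forward pass of a simple recurrent layer.
# Assumed sizes for illustration: 5 timesteps, 3 input features, 4 units.
timesteps, input_dim, units = 5, 3, 4
inputs = np.random.randn(timesteps, input_dim)

W = np.random.randn(units, input_dim)  # input weight matrix
U = np.random.randn(units, units)      # recurrent weight matrix
b = np.zeros(units)                    # bias

state = np.zeros(units)  # the retained activation, initially zero
outputs = []
for x in inputs:
    # new activation = relu(W·x + U·previous_activation + b)
    state = np.maximum(0, W @ x + U @ state + b)
    outputs.append(state)
outputs = np.stack(outputs)  # shape: (timesteps, units)
```

Note how the same weights W and U are reused at every timestep; only the state changes, which is what lets the layer carry information forward through the sequence.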

In Keras, we can use a simple RNN as follows...
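The original listing is not shown here; a minimal sketch using Keras's built-in SimpleRNN layer might look like the following (the layer sizes, input shape, and sigmoid output head are assumptions for illustration):

```python
from tensorflow import keras

# Sequences of arbitrary length, 8 features per timestep (assumed shape).
model = keras.Sequential([
    keras.Input(shape=(None, 8)),
    keras.layers.SimpleRNN(32),                    # 32 recurrent units
    keras.layers.Dense(1, activation='sigmoid'),   # binary output head
])
model.summary()
```

By default, SimpleRNN returns only its final activation; passing return_sequences=True makes it return the activation at every timestep, which is needed when stacking recurrent layers.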