Machine Learning Quick Reference

By: Rahul Kumar

Overview of this book

Machine learning makes it possible to learn about the unknowns and gain hidden insights into your datasets by mastering many tools and techniques. This book guides you to do just that in a very compact manner. After giving a quick overview of what machine learning is all about, Machine Learning Quick Reference jumps right into its core algorithms and demonstrates how they can be applied to real-world scenarios. From model evaluation to optimizing model performance, this book introduces you to the best practices in machine learning. Furthermore, you will also look at more advanced aspects, such as training neural networks and working with different kinds of data, including text, time-series, and sequential data. Advanced methods and techniques, such as causal inference and deep Gaussian processes, are also covered. By the end of this book, you will be able to train fast, accurate machine learning models and use this book as a handy point of reference.

Recurrent neural networks


Our thought process always has a sequence. We always understand things in an order. For example, if we watch a movie, we understand the next sequence by connecting it with the previous one. We retain the memory of the last sequence and get an understanding of the whole movie. We don't always go back to the first sequence in order to get it.

Can a neural network act like this? Traditional neural networks cannot operate in this manner, and that is a major shortcoming. This is where recurrent neural networks make a difference: they contain a loop that allows information to persist from one step to the next:

Here, a neural network takes an input Xt and produces an output ht. A recurrent neural network can be viewed as multiple copies of the same network, each passing a message on to its successor.
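The recurrence above can be sketched in a few lines of NumPy. This is a minimal illustration of a single vanilla RNN step, not the book's implementation; the tanh activation and the weight names (W_xh, W_hh, b_h) are common conventions chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

# Illustrative weights: input-to-hidden, hidden-to-hidden (the "loop"), and bias
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One recurrent step: the new state ht depends on the current
    input Xt and on the previous state h(t-1)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

x_t = rng.standard_normal(input_size)   # input at time t
h_t = rnn_step(x_t, np.zeros(hidden_size))
print(h_t.shape)
```

The key point is the second term, `W_hh @ h_prev`: the previous state feeds back into the computation, which is exactly the loop in the diagram.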

If we were to unroll the preceding network, it would look like the following:

This chain-like nature reveals that recurrent neural networks are intimately related to sequences and lists...
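Unrolling can be demonstrated by applying the same step repeatedly over a sequence. The sketch below (assumed shapes and weight names, as in the previous example) makes the weight sharing explicit: every "copy" of the network in the unrolled chain uses the same parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size, T = 3, 4, 5

# One set of parameters, reused at every time step
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

xs = rng.standard_normal((T, input_size))  # a sequence of T inputs
h = np.zeros(hidden_size)                  # initial hidden state

states = []
for x_t in xs:  # each iteration is one "copy" of the network in the chain
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    states.append(h)

print(len(states), states[-1].shape)
```

Each element of `states` carries information from all earlier inputs forward, which is why this chain-like structure suits sequences and lists.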