
Python Machine Learning By Example - Third Edition

By : Yuxi (Hayden) Liu

Overview of this book

Python Machine Learning By Example, Third Edition serves as a comprehensive gateway into the world of machine learning (ML). With six new chapters on topics including movie recommendation engine development with Naïve Bayes, recognizing faces with support vector machines, predicting stock prices with artificial neural networks, categorizing images of clothing with convolutional neural networks, making predictions with sequences using recurrent neural networks, and leveraging reinforcement learning for making decisions, the book has been considerably updated for the latest enterprise requirements. At the same time, this book provides actionable insights on the key fundamentals of ML with Python programming. Hayden applies his expertise to demonstrate implementations of algorithms in Python, both from scratch and with libraries. Each chapter walks through an industry-adopted application. With the help of realistic examples, you will gain an understanding of the mechanics of ML techniques in areas such as exploratory data analysis, feature engineering, classification, regression, clustering, and NLP. By the end of this ML Python book, you will have gained a broad picture of the ML ecosystem and will be well-versed in the best practices of applying ML techniques to solve problems.
Table of Contents (17 chapters)

Training an RNN model

To explain how we optimize the weights (parameters) of an RNN, we first annotate the weights and the data on the network, as follows:

  • U denotes the weights connecting the input layer and the hidden layer.
  • V denotes the weights between the hidden layer and the output layer. Note here that we use only one recurrent layer for simplicity.
  • W denotes the weights of the recurrent layer; that is, the feedback layer.
  • x_t denotes the inputs at time step t.
  • s_t denotes the hidden state at time step t.
  • h_t denotes the outputs at time step t.

Next, we unfold the simple RNN model over three time steps: t − 1, t, and t + 1, as follows:

Figure 13.9: Unfolding a recurrent layer

We describe the mathematical relationships between the layers as follows:

  • We let a denote the activation function for the hidden layer. In RNNs, we usually choose tanh or ReLU as the activation function for the hidden layers...
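The unrolled computation above can be sketched in a few lines of NumPy: at each time step, the hidden state is s_t = a(U x_t + W s_{t-1}) and the output is h_t = V s_t, with tanh as the activation. The dimensions and random initialization below are illustrative assumptions, not values from the book:

```python
import numpy as np

# Hypothetical dimensions, chosen only for illustration
input_dim, hidden_dim, output_dim = 3, 5, 2
rng = np.random.default_rng(42)

# U: input -> hidden, W: hidden -> hidden (recurrent), V: hidden -> output
U = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
V = rng.normal(scale=0.1, size=(output_dim, hidden_dim))

def rnn_forward(xs):
    """Unrolled forward pass: s_t = tanh(U @ x_t + W @ s_{t-1}), h_t = V @ s_t."""
    s = np.zeros(hidden_dim)        # initial hidden state s_0
    states, outputs = [], []
    for x in xs:
        s = np.tanh(U @ x + W @ s)  # hidden state at time step t
        h = V @ s                   # output at time step t
        states.append(s)
        outputs.append(h)
    return states, outputs

# Three time steps, mirroring the unfolding over t-1, t, and t+1
xs = [rng.normal(size=input_dim) for _ in range(3)]
states, outputs = rnn_forward(xs)
```

Note how the same U, V, and W are reused at every time step; this weight sharing is what backpropagation through time must account for when computing gradients.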