The Deep Learning Workshop

By: Mirza Rahim Baig, Thomas V. Joseph, Nipun Sadvilkar, Mohan Kumar Silaparasetty, Anthony So

Overview of this book

Are you fascinated by how deep learning powers intelligent applications such as self-driving cars, virtual assistants, facial recognition devices, and chatbots to process data and solve complex problems? Whether you are familiar with machine learning or are new to this domain, The Deep Learning Workshop will make it easy for you to understand deep learning with the help of interesting examples and exercises throughout. The book starts by highlighting the relationship between deep learning, machine learning, and artificial intelligence, and helps you get comfortable with the TensorFlow 2.0 programming structure using hands-on exercises. You’ll understand neural networks, the structure of a perceptron, and how to use TensorFlow to create and train models. The book then guides you through the fundamentals of computer vision by performing image recognition exercises with convolutional neural networks (CNNs) using Keras. As you advance, you’ll be able to make your models more powerful by implementing text embeddings and working with sequential data using popular deep learning solutions. Finally, you’ll get to grips with bidirectional recurrent neural networks (RNNs) and build generative adversarial networks (GANs) for image synthesis. By the end of this deep learning book, you’ll have learned the skills essential for building deep learning models with TensorFlow and Keras.
Table of Contents (9 chapters)
Preface

6. LSTMs, GRUs, and Advanced RNNs

Overview

In this chapter, we will study and implement advanced models and variations of the plain Recurrent Neural Network (RNN) that overcome some of its practical drawbacks and are among the best-performing deep learning models available today. We will start by understanding the drawbacks of plain RNNs and see how the idea of Long Short-Term Memory (LSTM) overcomes them. We will then build and train a Gated Recurrent Unit (GRU)-based model. We will also work with bidirectional and stacked RNNs and explore attention-based models. By the end of this chapter, you will have built these models and assessed their performance on a sentiment classification task, observing for yourself the trade-offs involved in choosing between them.
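To give a sense of what these models look like in code, here is a minimal sketch of a bidirectional LSTM sentiment classifier written with TensorFlow 2.x and Keras. It uses the IMDB movie-review dataset bundled with Keras; the vocabulary size, sequence length, and layer sizes are illustrative assumptions rather than the chapter's exact setup.

import tensorflow as tf

VOCAB_SIZE = 10000   # keep only the most frequent words (illustrative choice)
MAX_LEN = 200        # pad/truncate each review to a fixed length (illustrative choice)

# Load the IMDB sentiment dataset that ships with Keras and pad the sequences
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=VOCAB_SIZE)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=MAX_LEN)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=MAX_LEN)

# Embedding -> bidirectional LSTM -> sigmoid output for binary sentiment
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128, validation_data=(x_test, y_test))

Swapping tf.keras.layers.LSTM for tf.keras.layers.GRU gives a GRU-based variant, and stacking recurrent layers with return_sequences=True gives a stacked variant; these are among the model families explored in this chapter.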