Neural Networks with Keras Cookbook

By: V Kishore Ayyadevara
Overview of this book

This book will take you from the basics of neural networks to advanced implementations of architectures using a recipe-based approach. We will learn how neural networks work and the impact of various hyperparameters on a network's accuracy, and how to leverage neural networks for structured and unstructured data. Later, we will learn how to classify and detect objects in images. We will also learn to use transfer learning for multiple applications, including a self-driving car that uses Convolutional Neural Networks. We will generate images by leveraging GANs and also by performing image encoding. Additionally, we will perform text analysis using word vector-based techniques. Later, we will use Recurrent Neural Networks and LSTMs to implement chatbot and machine translation systems. Finally, you will learn about transcribing images and audio, generating captions, and using Deep Q-learning to build an agent that plays the Space Invaders game. By the end of this book, you will have developed the skills to choose and customize multiple neural network architectures for the various deep learning problems you might encounter.

Encoder-decoder architecture with attention for machine translation

In the previous section, we learned that we could increase translation accuracy by enabling the teacher forcing technique, where the actual target word from the previous time step is used as an input to the decoder.
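As a quick reminder, the sketch below shows how teacher forcing is typically set up (the token IDs and start token are illustrative, not the book's exact values): the decoder input at each time step is the actual target word from the previous time step, shifted right by one position.

```python
import numpy as np

# Hypothetical tokenized target sentences, padded to a fixed length
decoder_target = np.array([[12, 45, 7, 0],
                           [3, 9, 21, 4]])

# Teacher forcing: the decoder input is the target shifted right by one step,
# with a start-of-sentence token (here, 1) prepended to each sentence
start_token = 1
decoder_input = np.hstack([np.full((decoder_target.shape[0], 1), start_token),
                           decoder_target[:, :-1]])
```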

In this section, we will extend this idea further and assign weights to the encoder's hidden states based on how similar the encoder and decoder vectors are at each time step. This way, certain input words receive a higher weight in the context vector fed to the decoder, depending on the decoder's time step.
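Conceptually, the attention weights can be computed as a softmax over the similarity between the decoder's hidden state at a given time step and every encoder hidden state. The following is a minimal NumPy sketch of this idea, with illustrative shapes rather than the book's exact code:

```python
import numpy as np

def attention_weights(encoder_states, decoder_state):
    # encoder_states: (source_len, hidden_dim); decoder_state: (hidden_dim,)
    scores = encoder_states @ decoder_state   # dot-product similarity per source word
    weights = np.exp(scores - scores.max())
    return weights / weights.sum()            # softmax over source positions

encoder_states = np.random.rand(5, 8)         # 5 source time steps, hidden size 8
decoder_state = np.random.rand(8)             # decoder state at the current time step
weights = attention_weights(encoder_states, decoder_state)
context = weights @ encoder_states            # weighted sum: the context vector
```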

How to do it...

With this in mind, let's look at how we can build the encoder-decoder architecture, along with the attention mechanism...
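Before walking through the recipe, here is a minimal Keras sketch of an encoder-decoder model with dot-product attention. The vocabulary sizes, sequence lengths, embedding size, and hidden size below are illustrative placeholders rather than the book's exact values:

```python
from keras.layers import Input, Embedding, LSTM, Dense, Dot, Activation, Concatenate
from keras.models import Model

src_vocab, tgt_vocab, embed_dim, hidden_dim = 5000, 5000, 128, 256
src_len, tgt_len = 20, 20

# Encoder: embed the source sentence and return hidden states for every time step
enc_in = Input(shape=(src_len,))
enc_emb = Embedding(src_vocab, embed_dim)(enc_in)
enc_out, enc_h, enc_c = LSTM(hidden_dim, return_sequences=True, return_state=True)(enc_emb)

# Decoder: embed the (teacher-forced) target input and run an LSTM over it,
# initialized with the encoder's final states
dec_in = Input(shape=(tgt_len,))
dec_emb = Embedding(tgt_vocab, embed_dim)(dec_in)
dec_out = LSTM(hidden_dim, return_sequences=True)(dec_emb, initial_state=[enc_h, enc_c])

# Attention: score every encoder state against every decoder state, softmax over
# the source positions, then take the weighted sum to obtain a context vector
scores = Dot(axes=[2, 2])([dec_out, enc_out])   # (batch, tgt_len, src_len)
attn = Activation('softmax')(scores)
context = Dot(axes=[2, 1])([attn, enc_out])     # (batch, tgt_len, hidden_dim)

# Combine the context vector with the decoder output and predict the next target word
combined = Concatenate(axis=-1)([context, dec_out])
outputs = Dense(tgt_vocab, activation='softmax')(combined)

model = Model([enc_in, dec_in], outputs)
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
```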