Deep Learning Quick Reference

By: Mike Bernico

Overview of this book

Deep learning has become essential for entering the world of artificial intelligence. This book makes deep learning techniques more accessible, practical, and relevant to practicing data scientists, moving deep learning from academia to the real world through practical examples. You will learn how TensorBoard is used to monitor the training of deep neural networks and how to solve binary classification problems using deep learning. You will then learn to optimize the hyperparameters of your deep learning models. The book then takes you through the practical implementation of training CNNs, RNNs, and LSTMs with word embeddings and seq2seq models from scratch. Later, the book explores advanced topics such as Deep Q-Networks for solving an autonomous agent problem, and how to use two adversarial networks to generate artificial images that appear real. For implementation, we use popular Python-based deep learning frameworks such as Keras and TensorFlow. Each chapter provides best practices and safe choices to help you make the right decisions while training deep neural networks. By the end of this book, you will be able to solve real-world problems quickly with deep neural networks.

Keras embedding layer

The Keras embedding layer allows us to learn a vector space representation of an input word, as we did with word2vec, while we train our model. When using the functional API, the embedding layer is always the second layer in the network, coming immediately after the input layer.

The embedding layer requires the following three arguments (a short sketch follows the list):

  • input_dim: The size of the vocabulary of the corpus.
  • output_dim: The size of the vector space we want to learn. This corresponds to the number of neurons in the word2vec hidden layer.
  • input_length: The number of words in the text we're going to use in each observation. In the examples that follow, we will use a fixed size based on the longest text we need to handle, and we will pad shorter documents with 0s.
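The following is a minimal sketch (not code from the book) of how these three arguments fit together in the functional API. The values vocab_size, embedding_dim, and max_length are hypothetical, chosen only for illustration, and the imports assume the standalone keras package (with TensorFlow 2 and later, tensorflow.keras works the same way):

    from keras.layers import Input, Embedding

    vocab_size = 10000   # input_dim: number of distinct words in the corpus
    embedding_dim = 100  # output_dim: size of the vector space we want to learn
    max_length = 140     # input_length: words per observation, after padding

    # Input layer: each observation is a sequence of max_length integer word indices
    sequence_input = Input(shape=(max_length,), dtype='int32')

    # Embedding layer: the second layer in the network, right after the input layer
    embedded = Embedding(input_dim=vocab_size,
                         output_dim=embedding_dim,
                         input_length=max_length)(sequence_input)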

An embedding layer will output a 2D matrix for each input document that contains one vector for each word in the sequence specified by the input_length argument.
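Continuing the hypothetical sketch above, we can confirm this: for each document the layer emits input_length vectors of size output_dim, so the batched output tensor has shape (batch_size, input_length, output_dim):

    from keras.models import Model

    # Wrap the two layers in a model just to inspect the shapes
    model = Model(inputs=sequence_input, outputs=embedded)
    model.summary()  # embedding output shape: (None, 140, 100)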