TensorFlow 2.0 Quick Start Guide

By: Tony Holdroyd

Overview of this book

TensorFlow is one of the most popular machine learning frameworks in Python. With this book, you will improve your knowledge of some of the latest TensorFlow features and will be able to perform supervised and unsupervised machine learning and train neural networks. After giving you an overview of what's new in TensorFlow 2.0 Alpha, the book moves on to setting up your machine learning environment using the TensorFlow library. You will perform popular machine learning tasks using techniques such as linear regression, logistic regression, and clustering. You will get familiar with unsupervised learning for autoencoder applications. The book will also show you how to train effective neural networks using straightforward examples in a variety of different domains. By the end of the book, you will have been exposed to a large variety of machine learning and neural network TensorFlow techniques.
Table of Contents (15 chapters)
Free Chapter
1. Section 1: Introduction to TensorFlow 2.0 Alpha
5. Section 2: Supervised and Unsupervised Learning in TensorFlow 2.0 Alpha
7. Unsupervised Learning Using TensorFlow 2
8. Section 3: Neural Network Applications of TensorFlow 2.0 Alpha
13. Converting from tf1.12 to tf2

Building and instantiating our model

As we have seen previously, one technique for building a model is to pass the required layers into the tf.keras.Sequential() constructor. In this instance, we have three layers: an embedding layer, an RNN layer, and a dense layer.

The first layer, the embedding layer, is a lookup table of vectors, with one vector for the numeric value of each character; its dimension is embedding_dimension. The middle, recurrent layer is a GRU whose size is recurrent_nn_units. The last layer is a dense output layer with vocabulary_length units.

The model looks up the embedding for each input character, runs the GRU for a single time step with that embedding as its input, and passes the result to the dense layer, which generates logits (log odds) for the next character.

A diagram showing this is as follows:

[Diagram: embedding layer → GRU → dense layer producing next-character logits]

The code that implements this model is, therefore, as follows:

def build_model...
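A minimal sketch of such a build_model function, assuming (as in the description above) a Sequential model with embedding_dimension, recurrent_nn_units, and vocabulary_length parameters, plus a hypothetical batch_size argument for the stateful GRU, could look like this:

import tensorflow as tf

def build_model(vocabulary_length, embedding_dimension,
                recurrent_nn_units, batch_size):
    # Stack the three layers described above: embedding -> GRU -> dense.
    model = tf.keras.Sequential([
        # Lookup table mapping each character's integer value to a vector
        # of length embedding_dimension.
        tf.keras.layers.Embedding(
            vocabulary_length, embedding_dimension,
            batch_input_shape=[batch_size, None]),
        # Recurrent layer: a GRU of size recurrent_nn_units. stateful=True
        # keeps the hidden state between calls so the model can be run one
        # time step at a time when generating text.
        tf.keras.layers.GRU(
            recurrent_nn_units,
            return_sequences=True,
            stateful=True,
            recurrent_initializer='glorot_uniform'),
        # Dense output layer producing one logit per character in the vocabulary.
        tf.keras.layers.Dense(vocabulary_length)
    ])
    return model

The model is then instantiated by calling build_model with the chosen hyperparameter values; the logits it produces for the final time step can be sampled, for example with tf.random.categorical, to pick the next character.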