TensorFlow 2.0 Quick Start Guide

By: Tony Holdroyd

Overview of this book

TensorFlow is one of the most popular machine learning frameworks in Python. With this book, you will improve your knowledge of some of the latest TensorFlow features and will be able to perform supervised and unsupervised machine learning and train neural networks. After giving you an overview of what's new in TensorFlow 2.0 Alpha, the book moves on to setting up your machine learning environment using the TensorFlow library. You will perform popular machine learning tasks using techniques such as linear regression, logistic regression, and clustering. You will get familiar with unsupervised learning for autoencoder applications. The book will also show you how to train effective neural networks using straightforward examples in a variety of different domains. By the end of the book, you will have been exposed to a wide variety of TensorFlow techniques for machine learning and neural networks.
Table of Contents (15 chapters)

1. Section 1: Introduction to TensorFlow 2.0 Alpha
5. Section 2: Supervised and Unsupervised Learning in TensorFlow 2.0 Alpha
7. Unsupervised Learning Using TensorFlow 2
8. Section 3: Neural Network Applications of TensorFlow 2.0 Alpha
13. Converting from tf1.12 to tf2

Recurrent architectures

Hence, a new architecture is required for handling data that arrives sequentially, where the input values, the output values, or both are of variable length; for example, the words in a sentence in a language translation application. In this case, both the input and the output of the model are of varying lengths, as in the fourth mode described previously. Also, in order to predict the next word given the current word, the previous words need to be known as well. This new neural network architecture is called a recurrent neural network (RNN), and it is specifically designed to handle sequential data.
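As a quick sketch of such an architecture, the following Keras snippet builds a model whose time dimension is left unspecified, so it accepts input sequences of variable length. The vocabulary size, embedding width, and hidden size are illustrative assumptions, not values taken from the book:

import tensorflow as tf

# Illustrative sizes (not from the book): a 1,000-token vocabulary
# embedded into 64-dimensional vectors.
model = tf.keras.Sequential([
    # mask_zero=True lets zero-padded batches represent variable-length sequences
    tf.keras.layers.Embedding(input_dim=1000, output_dim=64, mask_zero=True),
    # A single recurrent layer; its hidden state is carried across time steps
    tf.keras.layers.SimpleRNN(128),
    # Predict a distribution over the vocabulary for the next word
    tf.keras.layers.Dense(1000, activation='softmax'),
])

# The time dimension is unspecified, so sequences of any length are accepted;
# shorter sequences in a batch are simply zero-padded.
probs = model(tf.constant([[12, 7, 99, 0, 0], [3, 41, 8, 15, 2]]))
print(probs.shape)  # (2, 1000)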

The term recurrent arises because such models perform the same computation on every element of a sequence, where each output depends on the previous outputs. Theoretically, each output depends on all of the previous output items, but in practical terms, RNNs are limited to looking back just...
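The recurrence itself can be sketched in a few lines. In the following minimal NumPy illustration (the shapes and the weight names W_x, W_h, and b are assumptions for the sketch, not from the book), the same weights are reused at every time step, and each hidden state depends on the previous one:

import numpy as np

# Illustrative dimensions: 8-dimensional inputs, 16-dimensional hidden state,
# a sequence of 5 time steps.
input_dim, hidden_dim, steps = 8, 16, 5
rng = np.random.default_rng(0)

# The SAME parameters are applied at every element of the sequence.
W_x = rng.normal(size=(hidden_dim, input_dim))   # input-to-hidden weights
W_h = rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights
b = np.zeros(hidden_dim)                         # bias

x_seq = rng.normal(size=(steps, input_dim))  # one input vector per time step
h = np.zeros(hidden_dim)                     # initial hidden state

for x_t in x_seq:
    # h_t = tanh(W_x x_t + W_h h_{t-1} + b): each state depends on the last
    h = np.tanh(W_x @ x_t + W_h @ h + b)

print(h.shape)  # (16,), the final hidden state after the whole sequence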