
Hands-On Neural Networks with TensorFlow 2.0

By: Paolo Galeone

Overview of this book

TensorFlow, the most popular and widely used machine learning framework, has made it possible for almost anyone to develop machine learning solutions with ease. With TensorFlow (TF) 2.0, you'll explore a revamped framework structure, offering a wide variety of new features aimed at improving productivity and ease of use for developers. This book covers machine learning with a focus on developing neural network-based solutions. You'll start by getting familiar with the concepts and techniques required to build solutions to deep learning problems. As you advance, you’ll learn how to create classifiers, build object detection and semantic segmentation networks, train generative models, and speed up the development process using TF 2.0 tools such as TensorFlow Datasets and TensorFlow Hub. By the end of this TensorFlow book, you'll be ready to solve any machine learning problem by developing solutions using TF 2.0 and putting them into production.
Table of Contents (15 chapters)

Section 1: Neural Network Fundamentals
Section 2: TensorFlow Fundamentals
Section 3: The Application of Neural Networks

Regularization

Regularization is a way to deal with the problem of overfitting: the goal of regularization is to modify the learning algorithm, or the model itself, to make the model perform well—not just on the training data, but also on new inputs.

One of the most widely used solutions to the overfitting problem, and probably one of the simplest to understand and analyze, is known as dropout.

Dropout

The idea behind dropout is to train an ensemble of neural networks and average their results, instead of training only a single standard network. Dropout builds these new networks from a standard neural network by dropping out neurons, each with a fixed probability (the dropout rate).

When a neuron is dropped out, its output is set to zero.
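
The following is a minimal sketch (not code from the book) of how dropout can be applied in TensorFlow 2.0; the dropout rate of 0.5, the layer sizes, and the model architecture are arbitrary choices made only for illustration:

import tensorflow as tf

# Functional form: tf.nn.dropout zeroes each element with probability `rate`
# and scales the surviving elements by 1 / (1 - rate), so the expected
# activation stays the same between training and inference.
x = tf.ones((1, 10))
y = tf.nn.dropout(x, rate=0.5)  # roughly half of the elements become 0.0

# The same idea as a layer inside a Keras model: Dropout is active only
# during training (for example, inside model.fit); at inference time it
# simply passes its input through unchanged.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10),
])

Because every training step samples a different random mask, each step effectively trains a different thinned network, and the shared weights end up approximating an average over that ensemble.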