Hands-On Neural Networks with TensorFlow 2.0

By: Paolo Galeone

Overview of this book

TensorFlow, the most popular and widely used machine learning framework, has made it possible for almost anyone to develop machine learning solutions with ease. With TensorFlow (TF) 2.0, you'll explore a revamped framework structure, offering a wide variety of new features aimed at improving productivity and ease of use for developers. This book covers machine learning with a focus on developing neural network-based solutions. You'll start by getting familiar with the concepts and techniques required to build solutions to deep learning problems. As you advance, you’ll learn how to create classifiers, build object detection and semantic segmentation networks, train generative models, and speed up the development process using TF 2.0 tools such as TensorFlow Datasets and TensorFlow Hub. By the end of this TensorFlow book, you'll be ready to solve any machine learning problem by developing solutions using TF 2.0 and putting them into production.
Table of Contents (15 chapters)

Section 1: Neural Network Fundamentals
Section 2: TensorFlow Fundamentals
Section 3: The Application of Neural Networks

Exercises

This chapter was filled with various theoretical concepts to understand, so, just like the previous chapter, don't skip the exercises:

  1. What are the similarities between artificial and biological neurons?
  2. Does the neuron's topology change the neural network's behavior?
  3. Why do neurons require a non-linear activation function?
  4. If the activation function is linear, a multi-layer neural network is the same as a single-layer neural network. Why?
  5. How is an error in input data treated by a neural network?
  6. Write the mathematical formulation of a generic neuron.
  7. Write the mathematical formulation of a fully connected layer.
  8. Why can a multi-layer configuration solve problems with non-linearly separable solutions?
  9. Draw the graph of the sigmoid, tanh, and ReLU activation functions.
  10. Is it always required to format training set labels into a one-hot encoded representation?
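As a companion to exercises 6 and 7, here is a minimal NumPy sketch (not the book's code) of the formulations involved: a generic neuron computes an activation of a weighted sum, `activation(w · x + b)`, and a fully connected layer applies the same idea to every unit at once via a matrix product, `activation(Wx + b)`. The function and variable names are illustrative, not taken from the book.

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z)), applied element-wise
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b, activation=sigmoid):
    # A generic neuron: scalar output activation(w . x + b)
    return activation(np.dot(w, x) + b)

def dense(x, W, b, activation=sigmoid):
    # A fully connected layer: O = activation(W x + b),
    # with W of shape (units, inputs) and b of shape (units,)
    return activation(W @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=3)           # 3 input features
W = rng.normal(size=(4, 3))      # 4 units, each with 3 weights
b = np.zeros(4)
print(dense(x, W, b).shape)      # (4,) -- one output per unit
```

Note how, if `activation` were the identity, stacking two such layers would collapse into a single matrix product `W2 (W1 x)` — the observation behind exercise 4.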