TensorFlow Machine Learning Projects

By: Ankit Jain, Amita Kapoor

Overview of this book

TensorFlow has transformed the way machine learning is perceived. TensorFlow Machine Learning Projects teaches you how to exploit the benefits—simplicity, efficiency, and flexibility—of using TensorFlow in various real-world projects. With the help of this book, you’ll not only learn how to build advanced projects using different datasets but also be able to tackle common challenges using a range of libraries from the TensorFlow ecosystem. To start with, you’ll get to grips with using TensorFlow for machine learning projects; you’ll explore a wide range of projects using TensorForest and TensorBoard for detecting exoplanets, TensorFlow.js for sentiment analysis, and TensorFlow Lite for digit classification. As you make your way through the book, you’ll build projects in various real-world domains, incorporating natural language processing (NLP), Gaussian processes, autoencoders, recommender systems, and Bayesian neural networks, along with trending areas such as Generative Adversarial Networks (GANs), capsule networks, and reinforcement learning. You’ll learn how to use the TensorFlow on Spark API and GPU-accelerated computing with TensorFlow to detect objects, followed by how to develop and train a recurrent neural network (RNN) model to generate book scripts. By the end of this book, you’ll have gained the required expertise to build full-fledged machine learning projects at work.

Understanding categorical cross entropy loss


Cross entropy loss, or log loss, measures the performance of a classification model whose output is a probability between 0 and 1. Cross entropy increases as the predicted probability of a sample diverges from its actual label. Therefore, predicting a probability of 0.05 when the actual label has a value of 1 results in a large cross entropy loss.

Mathematically, for a binary classification setting, cross entropy is defined as the following equation:

$$CE = -\big(y \log(p) + (1 - y)\log(1 - p)\big)$$

Here, $y$ is the binary indicator (0 or 1) denoting the class for the sample, while $p$ denotes the predicted probability between 0 and 1 for that sample.
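To make these quantities concrete, here is a minimal sketch (not code from the book; it assumes TensorFlow 2.x and uses illustrative values) that computes binary cross entropy by hand and checks it against TensorFlow's built-in loss:

import tensorflow as tf

# Ground-truth labels and predicted probabilities (illustrative values)
y_true = tf.constant([1.0, 0.0, 1.0])
y_pred = tf.constant([0.95, 0.10, 0.05])

# Manual binary cross entropy: -(y*log(p) + (1 - y)*log(1 - p)), averaged over samples
manual = -tf.reduce_mean(
    y_true * tf.math.log(y_pred)
    + (1.0 - y_true) * tf.math.log(1.0 - y_pred))

# TensorFlow's built-in equivalent (expects probabilities by default)
builtin = tf.keras.losses.BinaryCrossentropy()(y_true, y_pred)

print(manual.numpy(), builtin.numpy())  # both approximately 1.05

Note that the third sample, with a predicted probability of 0.05 for a true label of 1, contributes -log(0.05), or roughly 3.0, to the sum, which is what drives the loss up, exactly as described above.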

Alternatively, if there are more than two classes, we define a new term known as categorical cross entropy. It is calculated as the sum of the separate losses for each class label per observation. Mathematically, it is given by the following equation:

$$CE = -\sum_{c=1}^{M} y_{o,c} \log(p_{o,c})$$
Here, $M$ denotes the number of classes, $y_{o,c}$ is a binary indicator (0 or 1) that indicates whether $c$ is the correct class for observation $o$, and $p_{o,c}$ is the predicted probability that observation $o$ belongs to class $c$.
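As before, a minimal sketch (assuming TensorFlow 2.x; the values are illustrative, not from the book) shows the categorical case with one-hot labels, computed both manually from the equation above and with TensorFlow's built-in function:

import tensorflow as tf

# One-hot labels for three observations over M = 3 classes (illustrative values)
y_true = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0]])

# Predicted class probabilities (each row sums to 1)
y_pred = tf.constant([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.2, 0.3, 0.5]])

# Manual categorical cross entropy: for each observation o, -sum_c y_{o,c} * log(p_{o,c})
manual = -tf.reduce_sum(y_true * tf.math.log(y_pred), axis=-1)

# TensorFlow's built-in equivalent (per-observation losses)
builtin = tf.keras.losses.categorical_crossentropy(y_true, y_pred)

print(manual.numpy())   # approximately [0.357, 0.223, 0.693]
print(builtin.numpy())  # matches the manual computation

Because each label row is one-hot, only the log probability of the correct class survives the inner sum, so the loss for each observation reduces to -log(p) for its true class.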