Neural Network Programming with TensorFlow

By: Manpreet Singh Ghotra, Rajdeep Dua

Overview of this book

If you're aware of the buzz surrounding terms such as "machine learning," "artificial intelligence," or "deep learning," you might know what neural networks are. Ever wondered how they help solve complex computational problems efficiently, or how to train efficient neural networks? This book will teach you just that. You will start with a quick overview of the popular TensorFlow library and how it is used to train different neural networks. You will gain a thorough understanding of the fundamentals and basic math of neural networks, and why TensorFlow is a popular choice. Then, you will proceed to implement a simple feedforward neural network. Next, you will master optimization techniques and algorithms for neural networks using TensorFlow. Further, you will learn to implement more complex types of neural networks, such as convolutional neural networks, recurrent neural networks, and deep belief networks. Over the course of the book, you will work on real-world datasets to get a hands-on understanding of neural network programming. You will also train generative models and learn about the applications of autoencoders. By the end of this book, you will have a fair understanding of how to leverage the power of TensorFlow to train neural networks of varying complexities, without any hassle. While learning about the various neural network implementations, you will also learn the underlying mathematics and linear algebra, and how they map to the appropriate TensorFlow constructs.

Effect of the number of neurons in an RBM layer in a DBN

Let's look at how changing the number of neurons in an RBM layer affects the test set's accuracy:
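The runs below use the yadlt library's deep belief network, but the underlying effect is easy to reproduce in isolation. As a minimal, self-contained sketch (not the book's actual code), the following NumPy snippet trains a single Bernoulli RBM with one-step contrastive divergence (CD-1), where the hidden-layer width is a parameter; the function name and the synthetic data are hypothetical, chosen only to show how reconstruction loss can be measured for different layer sizes:

```python
import numpy as np

def train_rbm(data, n_hidden, epochs=5, lr=0.1, seed=0):
    """Train a Bernoulli RBM with CD-1; return mean reconstruction loss."""
    rng = np.random.default_rng(seed)
    n_visible = data.shape[1]
    W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
    b_v = np.zeros(n_visible)   # visible biases
    b_h = np.zeros(n_hidden)    # hidden biases

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for _ in range(epochs):
        # Positive phase: hidden activations given the data
        h_prob = sigmoid(data @ W + b_h)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: one Gibbs step back to the visible layer
        v_recon = sigmoid(h_sample @ W.T + b_v)
        h_recon = sigmoid(v_recon @ W + b_h)
        # CD-1 parameter updates
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
        b_v += lr * (data - v_recon).mean(axis=0)
        b_h += lr * (h_prob - h_recon).mean(axis=0)

    # Mean squared reconstruction error over the dataset
    h_prob = sigmoid(data @ W + b_h)
    v_recon = sigmoid(h_prob @ W.T + b_v)
    return float(((data - v_recon) ** 2).mean())

# Compare reconstruction loss across RBM widths on synthetic binary data
data = (np.random.default_rng(1).random((200, 64)) < 0.3).astype(float)
for n in (128, 256, 512):
    print(n, train_rbm(data, n))
```

A lower reconstruction loss for a wider layer does not guarantee better downstream classification accuracy, which is exactly the effect the runs below illustrate.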

An RBM layer with 512 neurons

The following is the output of a DBN with 512 neurons in its RBM layer. The reconstruction loss has come down, but the test set accuracy has come down as well:

Reconstruction loss: 0.128517: 100%|██████████| 5/5 [01:32<00:00, 19.25s/it]
Start deep belief net finetuning...
Tensorboard logs dir for this run is /home/ubuntu/.yadlt/logs/run55
Accuracy: 0.0758: 100%|██████████| 1/1 [00:06<00:00, 6.40s/it]
Test set accuracy: 0.0689999982715

Notice that both the training accuracy and the test set accuracy have come down. This shows that increasing the number of neurons does not necessarily improve accuracy.

An RBM layer with 128 neurons

A 128-neuron RBM layer leads to higher test set accuracy but a lower overall accuracy:

Reconstruction loss: 0.180337: 100%|██████████| 5/5 [00:32<00:00, 6.44s/it]
Start deep belief net finetuning...