
Hands-On Convolutional Neural Networks with TensorFlow

By: Iffat Zafar, Giounona Tzanidou, Richard Burton, Nimesh Patel, Leonardo Araujo

Overview of this book

Convolutional Neural Networks (CNNs) are among the most popular architectures used in computer vision applications. This book is an introduction to CNNs through solving real-world problems in deep learning, while teaching you how to implement them in TensorFlow, a popular Python library. By the end of the book, you will be training CNNs in no time! We start with an overview of popular machine learning and deep learning models, and then get you set up with a TensorFlow development environment. This environment is the basis for implementing and training deep learning models in later chapters. Then, you will use Convolutional Neural Networks to work on problems such as image classification, object detection, and semantic segmentation. After that, you will use transfer learning to see how these models can solve other deep learning problems. You will also get a taste of implementing generative models such as autoencoders and generative adversarial networks. Later on, you will see useful tips on machine learning best practices and troubleshooting. Finally, you will learn how to apply your models to large datasets of millions of images.

Chapter 9. Training at Scale

So far in this book, the datasets we have used or looked at have ranged in size from tens of thousands of samples (MNIST) to just over a million (ImageNet). Although all these datasets were considered huge when they first came out, and required state-of-the-art machines to work with, the great speed at which technologies such as GPUs and cloud computing have advanced now makes it quick and easy to train models on them, even with relatively low-powered machines.

However, much of the power of deep neural networks comes from their ability to scale with the amount of data fed to them. In simple terms, this means that the more good, clean data you can use to train your model, the better the result is going to be. Researchers are aware of this, and the number of training samples in newly released public datasets has continued to increase.
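Training on datasets of this size usually means streaming examples from disk rather than loading everything into memory. As a rough illustration of that idea, the following is a minimal sketch of an input pipeline built with the TensorFlow 2.x tf.data API; the file pattern, feature keys, image size, and buffer sizes are illustrative assumptions, not code from the book.

# Minimal sketch of a streaming input pipeline over sharded TFRecord files.
# File pattern, feature keys, and sizes are assumptions for illustration.
import tensorflow as tf

def parse_example(serialized):
    # Assumed record layout: a JPEG-encoded image plus an integer label.
    features = tf.io.parse_single_example(
        serialized,
        {
            "image/encoded": tf.io.FixedLenFeature([], tf.string),
            "image/label": tf.io.FixedLenFeature([], tf.int64),
        },
    )
    image = tf.io.decode_jpeg(features["image/encoded"], channels=3)
    image = tf.image.resize(image, [224, 224]) / 255.0  # normalize to [0, 1]
    return image, features["image/label"]

def make_dataset(file_pattern="train-*.tfrecord", batch_size=64):
    # Shuffle the shard filenames, then read several shards in parallel.
    files = tf.data.Dataset.list_files(file_pattern, shuffle=True)
    dataset = files.interleave(
        tf.data.TFRecordDataset,
        cycle_length=4,
        num_parallel_calls=tf.data.AUTOTUNE,
    )
    # Shuffle within a bounded buffer rather than over the whole dataset.
    dataset = dataset.shuffle(10_000)
    dataset = dataset.map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
    dataset = dataset.batch(batch_size)
    # Overlap preprocessing with training on the accelerator.
    return dataset.prefetch(tf.data.AUTOTUNE)

Because the pipeline only ever holds a small buffer of examples in memory, the same code works whether the dataset has thousands of images or millions.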

As a result of this, it is highly likely that, if you start working on problems in industry, or maybe even just the latest...