Hands-On Convolutional Neural Networks with TensorFlow

By: Iffat Zafar, Giounona Tzanidou, Richard Burton, Nimesh Patel, Leonardo Araujo
Overview of this book

Convolutional Neural Networks (CNNs) are among the most popular architectures used in computer vision applications. This book is an introduction to CNNs through solving real-world problems in deep learning, while teaching you how to implement them in the popular Python library TensorFlow. By the end of the book, you will be training CNNs in no time! We start with an overview of popular machine learning and deep learning models, and then get you set up with a TensorFlow development environment. This environment is the basis for implementing and training deep learning models in later chapters. You will then use convolutional neural networks to work on problems such as image classification, object detection, and semantic segmentation. After that, you will use transfer learning to see how these models can solve other deep learning problems. You will also get a taste of implementing generative models, such as autoencoders and generative adversarial networks. Later on, you will see useful tips on machine learning best practices and troubleshooting. Finally, you will learn how to apply your models to large datasets of millions of images.

Making efficient pipelines


When we dealt with smaller datasets earlier, it was enough to load the entire dataset into computer memory. This is simple and works fine if your dataset is small enough; however, much of the time, this won't be the case. We will now look at how to overcome this issue.

In order to avoid loading all our data at once, we need to create a data pipeline that can feed our training data to the model. This pipeline will be responsible for, among other things, loading a batch of elements from storage, preprocessing the data, and finally feeding the data to our model. Luckily for us, this can all be easily accomplished using the TensorFlow tf.data API.
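
As a rough sketch of how these stages compose with the tf.data API (written in the TensorFlow 1.x style this book is based on; the dummy arrays and the preprocess function here are placeholders to illustrate the stages, not the book's actual data):

import numpy as np
import tensorflow as tf

# Dummy in-memory arrays, used only to illustrate the pipeline stages;
# with a large dataset you would read from storage instead (see below).
images = np.random.rand(100, 28, 28, 1).astype(np.float32)
labels = np.random.randint(0, 10, size=100).astype(np.int32)

def preprocess(image, label):
    # Hypothetical preprocessing step: rescale pixels to [-1, 1].
    return image * 2.0 - 1.0, label

dataset = tf.data.Dataset.from_tensor_slices((images, labels))
dataset = dataset.map(preprocess)           # preprocess each element
dataset = dataset.shuffle(buffer_size=100)  # shuffle with a buffer
dataset = dataset.batch(32)                 # group elements into batches
dataset = dataset.prefetch(1)               # prepare the next batch early

iterator = dataset.make_one_shot_iterator()
next_images, next_labels = iterator.get_next()

with tf.Session() as sess:
    batch_images, batch_labels = sess.run([next_images, next_labels])
    print(batch_images.shape, batch_labels.shape)  # (32, 28, 28, 1) (32,)

The prefetch call at the end is what lets TensorFlow prepare the next batch while the current one is being consumed by the model, which keeps the pipeline from becoming a training bottleneck.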

For these examples, we are going to assume that we have saved our data into multiple TFRecord files (two, in this case) like the ones described previously. Nothing changes if you have more than two; you just have to include all of their filenames when setting things up.

We start by creating a TFRecord dataset from a list...
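
As a minimal sketch of this first step, assuming two hypothetical filenames and feature keys that would have to match whatever was used when the records were actually written:

import tensorflow as tf

# Hypothetical filenames; list every TFRecord file that belongs to the set.
train_files = ["train_data_0.tfrecord", "train_data_1.tfrecord"]

dataset = tf.data.TFRecordDataset(train_files)

# Each record is a serialized tf.train.Example; the feature keys below
# are assumptions and must match the ones used when writing the files.
def parse_record(serialized_example):
    features = tf.parse_single_example(
        serialized_example,
        features={
            "image": tf.FixedLenFeature([], tf.string),
            "label": tf.FixedLenFeature([], tf.int64),
        })
    image = tf.decode_raw(features["image"], tf.uint8)
    label = tf.cast(features["label"], tf.int32)
    return image, label

dataset = dataset.map(parse_record)

From here, the same shuffle, batch, and prefetch stages shown earlier can be chained onto the dataset before creating an iterator for training.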