TensorFlow 2.0 Computer Vision Cookbook

By: Jesús Martínez

Overview of this book

Computer vision is a scientific field that enables machines to identify and process digital images and videos. This book focuses on independent recipes to help you perform various computer vision tasks using TensorFlow. The book begins by taking you through the basics of deep learning for computer vision, along with covering TensorFlow 2.x’s key features, such as the Keras and tf.data.Dataset APIs. You’ll then learn about the ins and outs of common computer vision tasks, such as image classification, transfer learning, image enhancing and styling, and object detection. The book also covers autoencoders in domains such as inverse image search indexes and image denoising, while offering insights into various architectures used in the recipes, such as convolutional neural networks (CNNs), region-based CNNs (R-CNNs), VGGNet, and You Only Look Once (YOLO). Moving on, you’ll discover tips and tricks to solve any problems faced while building various computer vision applications. Finally, you’ll delve into more advanced topics such as Generative Adversarial Networks (GANs), video processing, and AutoML, concluding with a section focused on techniques to help you boost the performance of your networks. By the end of this TensorFlow book, you’ll be able to confidently tackle a wide range of computer vision problems using TensorFlow 2.x.

Using convolutional neural network ensembles to improve accuracy

In machine learning, one of the most robust classifiers is, in fact, a meta-classifier known as an ensemble. An ensemble is composed of so-called weak classifiers: predictive models that are only a tad better than random guessing. When combined, however, they produce a rather robust algorithm that is particularly resistant to high variance (overfitting). Some of the most famous ensembles we may encounter are Random Forest and Gradient Boosting Machines.

The good news is that we can leverage the same principle when it comes to neural networks, thus creating a whole that's more than the sum of its parts. Do you want to learn how? Keep reading!
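Before diving into the recipe, here's a minimal sketch of the idea: train several copies of the same small CNN (differing only in their random initialization) and average their softmax outputs at inference time. The architecture and dataset (MNIST) below are illustrative assumptions for brevity, not the exact setup used in this recipe:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models


def build_cnn(num_classes=10):
    # A deliberately small CNN; each ensemble member uses the same topology.
    return models.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, 3, activation='relu'),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation='relu'),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation='relu'),
        layers.Dense(num_classes, activation='softmax'),
    ])


# MNIST stands in for the recipe's dataset purely to keep the sketch short.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., np.newaxis] / 255.0
x_test = x_test[..., np.newaxis] / 255.0

ensemble = []
for _ in range(3):  # three members; more members further reduce variance
    model = build_cnn()
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=3, batch_size=128, verbose=0)
    ensemble.append(model)

# Average the class probabilities of every member, then take the argmax.
probs = np.mean([m.predict(x_test, verbose=0) for m in ensemble], axis=0)
accuracy = np.mean(np.argmax(probs, axis=1) == y_test)
print(f'Ensemble accuracy: {accuracy:.4f}')
```

The key design choice is how the members' outputs are combined; averaging probabilities is the simplest option, while weighted averages or majority voting are common alternatives.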

Getting ready

This recipe depends on Pillow and tensorflow_docs, which can be easily installed like this:

$> pip install Pillow git+https://github.com/tensorflow/docs
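As a quick sanity check (not part of the recipe itself), you can confirm that both dependencies import cleanly before moving on:

```python
# Verify that Pillow and tensorflow_docs installed correctly.
import PIL
import tensorflow_docs as tfdocs

print(PIL.__version__)
print(tfdocs.__name__)
```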

We'll also be using the famous Caltech 101 dataset, available here: http://www...