Machine Learning Using TensorFlow Cookbook

By Luca Massaron, Alexia Audevart, Konrad Banachewicz

Overview of this book

The independent recipes in Machine Learning Using TensorFlow Cookbook will teach you how to perform complex data computations and gain valuable insights into your data. Dive into recipes on training models, model evaluation, sentiment analysis, regression analysis, artificial neural networks, and deep learning - each using Google’s machine learning library, TensorFlow. This cookbook covers the fundamentals of the TensorFlow library, including variables, matrices, and various data sources. You’ll discover real-world implementations of Keras and TensorFlow and learn how to use estimators to train linear models and boosted trees, both for classification and regression. Explore the practical applications of a variety of deep learning architectures, such as recurrent neural networks and Transformers, and see how they can be used to solve computer vision and natural language processing (NLP) problems. With the help of this book, you will be proficient in using TensorFlow, understand deep learning from the basics, and be able to implement machine learning algorithms in real-world scenarios.

What this book covers

Chapter 1, Getting Started with TensorFlow 2.x, covers the main objects and concepts in TensorFlow. We introduce tensors, variables, and placeholders. We also show how to work with matrices and various mathematical operations in TensorFlow. At the end of the chapter, we show how to access the data sources used in the rest of the book.
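
For a flavor of the material, here is a minimal sketch (not taken from the book) of creating a tensor and a variable and performing a couple of matrix operations in TensorFlow 2.x:

```python
import tensorflow as tf

# A constant tensor and a trainable variable
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # 2x2 matrix
w = tf.Variable(tf.random.normal([2, 2]))   # 2x2 variable, randomly initialized

# Matrix operations execute eagerly in TensorFlow 2.x
product = tf.matmul(a, w)      # matrix multiplication
transposed = tf.transpose(a)   # transpose

print(product.numpy())
print(transposed.numpy())
```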

Chapter 2, The TensorFlow Way, establishes how to connect all the algorithm components from Chapter 1, Getting Started with TensorFlow 2.x, into a computational graph in multiple ways to create a simple classifier. Along the way, we cover computational graphs, loss functions, backpropagation, and training with data.
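
As an illustration of these ideas, the following sketch (an assumption of the general pattern, not the book's exact code) trains a tiny linear model with a mean squared error loss, using tf.GradientTape for backpropagation and tf.function to compile the step into a graph:

```python
import tensorflow as tf

# Toy data: y = 3x + 1 plus a little noise
x = tf.random.normal([100, 1])
y = 3.0 * x + 1.0 + tf.random.normal([100, 1], stddev=0.1)

w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

@tf.function  # compiles the training step into a computational graph
def train_step():
    with tf.GradientTape() as tape:
        y_pred = w * x + b
        loss = tf.reduce_mean(tf.square(y - y_pred))  # L2 loss
    grads = tape.gradient(loss, [w, b])               # backpropagation
    optimizer.apply_gradients(zip(grads, [w, b]))
    return loss

for _ in range(100):
    loss = train_step()
print(float(w), float(b), float(loss))  # w near 3, b near 1
```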

Chapter 3, Keras, focuses on the high-level TensorFlow API named Keras. After introducing layers, the building blocks of Keras models, we cover the Sequential, Functional, and Subclassing APIs for creating Keras models.
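
As a quick taste, here is a sketch (not the book's exact example) of the same small classifier written with the Sequential API and then with the Functional API; the input size of 20 features and the 3 output classes are placeholder choices:

```python
import tensorflow as tf
from tensorflow import keras

# Sequential API: a linear stack of layers
sequential_model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    keras.layers.Dense(3, activation="softmax"),
])

# Functional API: explicit inputs and outputs wired together
inputs = keras.Input(shape=(20,))
hidden = keras.layers.Dense(64, activation="relu")(inputs)
outputs = keras.layers.Dense(3, activation="softmax")(hidden)
functional_model = keras.Model(inputs, outputs)

sequential_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
functional_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```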

Chapter 4, Linear Regression, focuses on using TensorFlow to explore various linear regression techniques, such as Lasso, Ridge, ElasticNet, and logistic regression. We conclude by extending linear models with the Wide & Deep approach. We show how to implement each model using estimators.
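
As a hint of the estimator workflow, the sketch below (a rough assumption, with a synthetic single feature named "x") fits a linear regressor with tf.estimator:

```python
import numpy as np
import tensorflow as tf

# One hypothetical numeric feature called "x"
feature_columns = [tf.feature_column.numeric_column("x")]
estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)

def input_fn():
    x = np.random.rand(100).astype(np.float32)
    y = 2.0 * x + 1.0                      # synthetic linear target
    dataset = tf.data.Dataset.from_tensor_slices(({"x": x}, y))
    return dataset.batch(16).repeat()

estimator.train(input_fn=input_fn, steps=200)
```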

Chapter 5, Boosted Trees, discusses the TensorFlow implementation of boosted trees – one of the most popular models for tabular data. We demonstrate the functionality by addressing a business problem of predicting hotel booking cancellations.
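
The general shape of such a model, sketched below with made-up feature names (lead_time, deposit_type) and random labels standing in for the hotel booking data, uses the tf.estimator boosted trees implementation:

```python
import numpy as np
import tensorflow as tf

# Hypothetical feature columns; the real recipe works on the hotel booking dataset
feature_columns = [
    tf.feature_column.numeric_column("lead_time"),
    tf.feature_column.indicator_column(
        tf.feature_column.categorical_column_with_vocabulary_list(
            "deposit_type", ["No Deposit", "Refundable", "Non Refund"])),
]

def input_fn():
    features = {
        "lead_time": np.random.randint(0, 365, 256).astype(np.float32),
        "deposit_type": np.random.choice(
            ["No Deposit", "Refundable", "Non Refund"], 256),
    }
    labels = np.random.randint(0, 2, 256)  # 1 = booking cancelled (synthetic)
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(32).repeat()

model = tf.estimator.BoostedTreesClassifier(
    feature_columns=feature_columns, n_batches_per_layer=8)
model.train(input_fn=input_fn, max_steps=100)
```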

Chapter 6, Neural Networks, covers how to implement neural networks in TensorFlow, starting with the concepts of operational gates and activation functions. We then show a shallow neural network and how to build up various types of layers. We end the chapter by teaching a TensorFlow neural network to play tic-tac-toe.
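
As a rough sketch of both ideas (not the book's code), here is a one-variable "operational gate" trained with gradient descent, followed by a shallow network with a single hidden layer:

```python
import tensorflow as tf

# Operational gate in miniature: learn a so that a * x matches a target
a = tf.Variable(1.0)
x, target = tf.constant(5.0), tf.constant(50.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.01)
for _ in range(100):
    with tf.GradientTape() as tape:
        loss = tf.square(a * x - target)
    opt.apply_gradients([(tape.gradient(loss, a), a)])
print(float(a))  # approaches 10.0

# A shallow, fully connected network with one hidden layer
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="sigmoid", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```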

Chapter 7, Predicting with Tabular Data, extends the previous chapter by demonstrating how to use TensorFlow for tabular data. We show how to process data by handling missing values and binary, nominal, ordinal, and date features. We also introduce activation functions such as GELU and SELU (particularly effective for deep architectures) and the correct usage of cross-validation to validate your architecture and parameters when you do not have enough data available.
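
A minimal sketch of GELU and SELU in a Keras model for tabular input (the input size of 30 features is a placeholder, and GELU requires TensorFlow 2.4 or later):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    # SELU pairs with the lecun_normal initializer (self-normalizing networks)
    tf.keras.layers.Dense(64, activation="selu",
                          kernel_initializer="lecun_normal", input_shape=(30,)),
    tf.keras.layers.Dense(64, activation=tf.keras.activations.gelu),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary target
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```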

Chapter 8, Convolutional Neural Networks, expands our knowledge of neural networks by illustrating how to use images with convolutional layers (and other image layers and functions). We show how to build a compact CNN for MNIST digit recognition and extend it to color images in the CIFAR-10 task. We also illustrate how to extend pretrained image recognition models to custom tasks. We end the chapter by explaining and demonstrating the StyleNet/neural style transfer and DeepDream algorithms in TensorFlow.
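
A compact CNN for MNIST along these lines might look like the following sketch (layer sizes are illustrative, not the book's exact architecture):

```python
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., tf.newaxis] / 255.0  # add channel dimension, scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # ten digit classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)
```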

Chapter 9, Recurrent Neural Networks, introduces a powerful architecture type (RNN) that has been instrumental in achieving state-of-the-art results on different modes of sequential data; applications presented include time-series prediction and text sentiment analysis.
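
A bare-bones sentiment classifier in this spirit (vocabulary size and sequence length are assumed preprocessing choices, not the book's settings) stacks an embedding, an LSTM, and a sigmoid output:

```python
import tensorflow as tf

vocab_size, max_len = 10000, 200  # assumed preprocessing settings

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64, input_length=max_len),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```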

Chapter 10, Transformers, is dedicated to Transformers – a new class of deep learning models that have revolutionized the field of Natural Language Processing (NLP). We demonstrate how to leverage their strength for both generative and discriminative tasks.
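
The core building block is self-attention; the sketch below (an assumption of the standard pattern, requiring TensorFlow 2.4 or later for MultiHeadAttention) wires a single encoder-style block with a residual connection and layer normalization:

```python
import tensorflow as tf

# A batch of token embeddings with shape (batch, seq_len=128, d_model=64)
inputs = tf.keras.Input(shape=(128, 64))
attention = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=16)
attended = attention(query=inputs, value=inputs, key=inputs)   # self-attention
x = tf.keras.layers.LayerNormalization()(inputs + attended)    # residual + norm
x = tf.keras.layers.Dense(64, activation="relu")(x)            # position-wise layer
encoder_block = tf.keras.Model(inputs, x)
encoder_block.summary()
```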

Chapter 11, Reinforcement Learning with TensorFlow and TF-Agents, presents the TensorFlow library dedicated to reinforcement learning. The structured approach allows us to handle problems ranging from simple games to content personalization in e-commerce.
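
As a taste of the TF-Agents workflow, the following sketch (assuming the tf_agents package and the CartPole-v0 Gym environment) builds a DQN agent on a wrapped environment:

```python
import tensorflow as tf
from tf_agents.agents.dqn import dqn_agent
from tf_agents.environments import suite_gym, tf_py_environment
from tf_agents.networks import q_network

# Wrap a classic control task as a TensorFlow environment
env = tf_py_environment.TFPyEnvironment(suite_gym.load("CartPole-v0"))

# A Q-network maps observations to one Q-value per action
q_net = q_network.QNetwork(env.observation_spec(), env.action_spec(),
                           fc_layer_params=(64,))

agent = dqn_agent.DqnAgent(
    env.time_step_spec(), env.action_spec(), q_network=q_net,
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3))
agent.initialize()
```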

Chapter 12, Taking TensorFlow to Production, gives tips and examples on moving TensorFlow to a production environment, taking advantage of multiple processing devices (for example, GPUs), and setting up TensorFlow distributed across multiple machines. We also show the various uses of TensorBoard and how to view computational graph metrics and charts. We end the chapter with an example of serving an RNN model as an API with TensorFlow Serving.
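
A minimal sketch tying a few of these pieces together (a rough assumption, not the book's recipe): train under tf.distribute.MirroredStrategy, log to TensorBoard, and export a SavedModel in the layout that TensorFlow Serving expects:

```python
import tensorflow as tf

# Replicate the model across all local GPUs (falls back to CPU if none are found)
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Log metrics and the graph for inspection in TensorBoard
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="./logs")

x = tf.random.normal([256, 10])
y = tf.random.normal([256, 1])
model.fit(x, y, epochs=2, callbacks=[tensorboard_cb])

# Export as a SavedModel; the version subdirectory ("1") is what TF Serving reads
model.save("./saved_model/my_model/1")
```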