Chapter 1, *Getting Started with TensorFlow*, covers the main objects and concepts in TensorFlow. We introduce tensors, variables, and placeholders. We also show how to work with matrices and various mathematical operations in TensorFlow. At the end of the chapter, we show how to access the data sources used in the rest of the book.

Chapter 2, *The TensorFlow Way*, shows how to connect the algorithm components from Chapter 1, *Getting Started with TensorFlow*, into a computational graph in multiple ways to create a simple classifier. Along the way, we cover computational graphs, loss functions, backpropagation, and training with data.

Chapter 3, *Linear Regression*, focuses on using TensorFlow to explore various linear regression techniques, such as Deming regression, lasso and ridge regression, elastic net regression, and logistic regression. We show how to implement each in a TensorFlow computational graph.

Chapter 4, *Support Vector Machines*, introduces support vector machines (SVMs) and shows how to use TensorFlow to implement linear SVMs, non-linear SVMs, and multi-class SVMs.

Chapter 5, *Nearest-Neighbor Methods*, shows how to implement nearest-neighbor techniques using numerical metrics, textual metrics, and scaled distance functions. We use nearest-neighbor techniques to perform record matching of addresses and to classify handwritten digits from the MNIST database.
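To give a flavor of the nearest-neighbor idea before the chapter's TensorFlow treatment, here is a minimal pure-Python sketch of a 1-nearest-neighbor classifier with a Euclidean distance metric; the toy points and labels are invented for illustration.

```python
# A minimal 1-nearest-neighbor classifier using Euclidean distance.
# Pure-Python sketch of the concept, not the book's TensorFlow code;
# the toy data below is invented for illustration.
import math

def nearest_neighbor(train_points, train_labels, query):
    """Return the label of the training point closest to `query`."""
    best_label, best_dist = None, math.inf
    for point, label in zip(train_points, train_labels):
        dist = math.dist(point, query)  # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

train_points = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
train_labels = ["a", "a", "b"]
print(nearest_neighbor(train_points, train_labels, (4.0, 4.5)))  # prints "b"
```

Swapping `math.dist` for an edit-distance function on strings gives the textual-metric variant used for address matching.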

Chapter 6, *Neural Networks*, covers how to implement neural networks in TensorFlow, starting with the concepts of operational gates and activation functions. We then show a shallow neural network and how to build various types of layers. We end the chapter by teaching a TensorFlow neural network to play tic-tac-toe.

Chapter 7, *Natural Language Processing*, illustrates various text processing techniques with TensorFlow. We show how to implement the bag-of-words technique and TF-IDF (term frequency-inverse document frequency) for text. We then introduce the CBOW (continuous bag-of-words) and skip-gram text representations and use these techniques in Word2Vec and Doc2Vec to make real-world predictions, such as predicting whether a text message is spam.
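As a taste of the TF-IDF weighting mentioned above, the following is a minimal pure-Python sketch (the book builds this inside a TensorFlow graph instead); the sample documents are invented for illustration. It uses the common definitions tf(t, d) = count(t, d) / len(d) and idf(t) = log(N / df(t)).

```python
# A minimal TF-IDF sketch in pure Python (illustrative only).
# tf(t, d): term count divided by document length.
# idf(t):   log(N / df(t)), where df(t) is the number of documents
#           containing term t and N is the total number of documents.
import math

def tfidf(docs):
    """Return a list of {term: tf-idf weight} dicts, one per document."""
    n_docs = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = {}
    for tokens in tokenized:
        for term in set(tokens):
            df[term] = df.get(term, 0) + 1
    weights = []
    for tokens in tokenized:
        counts = {t: tokens.count(t) for t in set(tokens)}
        weights.append({
            t: (c / len(tokens)) * math.log(n_docs / df[t])
            for t, c in counts.items()
        })
    return weights

docs = ["free prize click now", "meeting agenda for monday", "free free prize"]
print(tfidf(docs)[2])
```

Terms concentrated in few documents (such as "meeting" above) receive high weights, which is what makes TF-IDF features useful inputs for a spam classifier.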

Chapter 8, *Convolutional Neural Networks*, expands our knowledge of neural networks by illustrating how to use images with convolutional layers (and other image layers and functions). We show how to build a compact CNN for MNIST digit recognition and extend it to color images in the CIFAR-10 task. We also illustrate how to adapt pretrained image recognition models to custom tasks. We end the chapter by explaining and demonstrating the Stylenet/neural-style and DeepDream algorithms in TensorFlow.

Chapter 9, *Recurrent Neural Networks*, explains how to implement recurrent neural networks in TensorFlow. We show how to predict text-message spam, and extend the RNN model to generate text based on the works of Shakespeare. We also train a sequence-to-sequence model for German-to-English translation. We finish the chapter by demonstrating Siamese RNNs for record matching on addresses.

Chapter 10, *Taking TensorFlow to Production*, gives tips and examples for moving TensorFlow into a production environment, taking advantage of multiple processing devices (for example, GPUs), and setting up distributed TensorFlow across multiple machines. We end the chapter by showing how to serve an RNN model as an API with TensorFlow Serving.

Chapter 11, *More with TensorFlow*, demonstrates the versatility of TensorFlow by illustrating how to implement k-means clustering and genetic algorithms, and how to solve a system of ordinary differential equations (ODEs). We also show the various uses of TensorBoard, including how to view computational graph metrics and charts.