Production-Ready Applied Deep Learning

By: Tomasz Palczewski, Jaejun (Brandon) Lee, Lenin Mookiah

Overview of this book

Machine learning engineers, deep learning specialists, and data engineers encounter various problems when moving deep learning models to a production environment. The main objective of this book is to close the gap between theory and application by providing a thorough explanation of how to transform various models for deployment and distribute them efficiently, with a full understanding of the alternatives. First, you will learn how to construct complex deep learning models in PyTorch and TensorFlow. Next, you will acquire the knowledge you need to transform your models from one framework to the other and learn how to tailor them for the specific requirements that deployment environments introduce. The book also provides concrete implementations and associated methodologies that will help you apply this knowledge right away. You will get hands-on experience with commonly used deep learning frameworks and popular cloud services designed for data analytics at scale. Additionally, you will benefit from the authors’ collective experience of deploying hundreds of AI-based services at scale. By the end of this book, you will understand how to convert a model developed for a proof of concept into a production-ready application optimized for a particular production setting.
Table of Contents (19 chapters)

Part 1 – Building a Minimum Viable Product
Part 2 – Building a Fully Featured Product
Part 3 – Deployment and Maintenance

Going through the basic theory of DL

As briefly described in Chapter 1, Effective Planning of Deep-Learning-Driven Projects, DL is a machine learning (ML) technique based on artificial neural networks (ANNs). In this section, our goal is to explain how ANNs work without going too deep into the math.

How does DL work?

An ANN is essentially a set of connected neurons. As shown in Figure 3.1, neurons in an ANN and neurons in our brain behave in a similar way. Each connection in an ANN has a tunable parameter called a weight. When there is a connection from neuron A to neuron B, the output of neuron A gets multiplied by the weight of the connection, and the weighted value becomes an input of neuron B. The bias is another tunable parameter within a neuron: the neuron sums up all of its inputs and adds the bias. The last operation is an activation function, which maps the computed value into a different range. The value in the new range is the output of the neuron, which gets passed...
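
The following is a minimal sketch of the computation just described, using NumPy and a sigmoid as an example activation function; the function name and the specific input, weight, and bias values are illustrative assumptions rather than code from the book.

import numpy as np

def neuron_output(inputs, weights, bias):
    # Multiply each input by the weight of its connection, sum the results,
    # and add the neuron's bias.
    weighted_sum = np.dot(inputs, weights) + bias
    # Apply an activation function (here, a sigmoid) to map the value
    # into a different range, (0, 1).
    return 1.0 / (1.0 + np.exp(-weighted_sum))

# Hypothetical neuron with three incoming connections
inputs = np.array([0.5, -1.2, 3.0])   # outputs of the preceding neurons
weights = np.array([0.8, 0.1, -0.4])  # tunable connection weights
bias = 0.2                            # tunable bias of this neuron

print(neuron_output(inputs, weights, bias))  # the value passed on to the next neuron

In a real network, this computation is repeated for every neuron in a layer and the outputs become the inputs of the next layer; frameworks such as PyTorch and TensorFlow perform these weighted sums and activations for entire layers at once.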