Azure Machine Learning Engineering

By: Sina Fakhraee, Balamurugan Balakreshnan, Megan Masanz

Overview of this book

Data scientists working on productionizing machine learning (ML) workloads face challenges at every step owing to the many factors involved in getting ML models deployed and running. This book offers solutions to common issues, detailed explanations of essential concepts, and step-by-step instructions to productionize ML workloads using the Azure Machine Learning service. This practical guide shows how data scientists and ML engineers working with Microsoft Azure can train and deploy ML models at scale. Throughout the book, you’ll learn how to train, register, and productionize ML models using the Azure Machine Learning service. You’ll get to grips with scoring models in real time and in batch, explaining models to earn business trust, mitigating model bias, and developing solutions using an MLOps framework. By the end of this Azure Machine Learning book, you’ll be ready to build and deploy end-to-end ML solutions into a production system using the Azure Machine Learning service for real-time scenarios.
Table of Contents (17 chapters)

Part 1: Training and Tuning Models with the Azure Machine Learning Service
Part 2: Deploying and Explaining Models in AMLS
Part 3: Productionizing Your Workload with MLOps

Summary

In this chapter, the focus was on deploying your model as a REST endpoint to support real-time inferencing use cases. We saw how to leverage AMLS Studio for a low-code deployment experience, used the SDK v2 to deploy models to managed online endpoints, and then used the CLI v2 to drive deployments from configuration files. Together, these sections demonstrated deploying real-time web services through low-code, code-first, and configuration-driven approaches, giving you the flexibility to deploy in whichever way best fits your workflow.
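As a quick illustration of the code-first path summarized above, the following is a minimal sketch (not taken from the chapter) of deploying a registered model to a managed online endpoint with the Azure ML Python SDK v2 (azure-ai-ml). The subscription, resource group, workspace, endpoint name, deployment name, and model reference are all placeholders, and the sketch assumes an MLflow-format registered model so that no scoring script or environment needs to be supplied.

```python
# Minimal sketch: deploy a registered (MLflow) model to a managed online endpoint.
# All identifiers below are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

# Connect to the AMLS workspace
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Create (or update) the managed online endpoint
endpoint = ManagedOnlineEndpoint(
    name="my-realtime-endpoint",  # placeholder endpoint name
    auth_mode="key",
)
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Deploy a registered model to the endpoint
deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="my-realtime-endpoint",
    model="azureml:my-registered-model:1",  # placeholder model reference
    instance_type="Standard_DS3_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()

# Route all traffic to the new deployment
endpoint.traffic = {"blue": 100}
ml_client.online_endpoints.begin_create_or_update(endpoint).result()
```

The CLI v2 approach described in the chapter follows the same pattern, with the endpoint and deployment definitions expressed as YAML configuration files instead of Python objects.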

In the next chapter, we will learn how to leverage batch inferencing to support our use cases.