Azure Machine Learning Engineering

By: Sina Fakhraee, Balamurugan Balakreshnan, Megan Masanz

Overview of this book

Data scientists working on productionizing machine learning (ML) workloads face a breadth of challenges at every step owing to the countless factors involved in getting ML models deployed and running. This book offers solutions to common issues, detailed explanations of essential concepts, and step-by-step instructions to productionize ML workloads using the Azure Machine Learning service. You’ll see how data scientists and ML engineers working with Microsoft Azure can train and deploy ML models at scale by putting their knowledge to work with this practical guide. Throughout the book, you’ll learn how to train, register, and productionize ML models by making use of the power of the Azure Machine Learning service. You’ll get to grips with scoring models in real time and batch, explaining models to earn business trust, mitigating model bias, and developing solutions using an MLOps framework. By the end of this Azure Machine Learning book, you’ll be ready to build and deploy end-to-end ML solutions into a production system using the Azure Machine Learning service for real-time scenarios.
Table of Contents (17 chapters)

Part 1: Training and Tuning Models with the Azure Machine Learning Service
Part 2: Deploying and Explaining Models in AMLS
Part 3: Productionizing Your Workload with MLOps

Deploying a model for batch inferencing using the Studio

In Chapter 3, Training Machine Learning Models in AMLS, we trained a model and registered it in an Azure Machine Learning workspace. We are going to deploy that model to a managed batch endpoint for batch scoring:

  1. Navigate to your Azure Machine Learning workspace, select Models from the left menu bar to see the models registered in your workspace, and select titanic_survival_model_, as shown in Figure 7.4:
Figure 7.4 – List of models registered in the workspace
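If you want to check the registered model from code rather than in the Studio, a minimal sketch using the Azure Machine Learning Python SDK v2 (azure-ai-ml) might look like the following. The subscription, resource group, and workspace values are placeholders, and this snippet is an aside to the Studio walkthrough, not one of its steps:

from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient

# Connect to the workspace (replace the placeholder IDs with your own)
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# List every model registered in the workspace
for registered_model in ml_client.models.list():
    print(registered_model.name)

# Retrieve the latest version of the model registered in Chapter 3
titanic_model = ml_client.models.get(
    name="titanic_survival_model_", label="latest"
)
print(titanic_model.name, titanic_model.version)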

  2. Click on Deploy and select Deploy to batch endpoint, as shown in Figure 7.5:
Figure 7.5 – Deploy the selected model to a batch endpoint
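The Deploy to batch endpoint flow can also be expressed in code. The following is a rough sketch only, assuming the Python SDK v2 (azure-ai-ml), reusing the ml_client and titanic_model objects from the earlier snippet, and assuming an existing compute cluster named cpu-cluster; the batch settings shown are illustrative defaults rather than values from this chapter:

from azure.ai.ml.entities import BatchEndpoint, BatchDeployment
from azure.ai.ml.constants import BatchDeploymentOutputAction

# Create the managed batch endpoint
endpoint = BatchEndpoint(
    name="titanic-survival-batch-endpoint",
    description="Batch scoring endpoint for the Titanic survival model",
)
ml_client.batch_endpoints.begin_create_or_update(endpoint).result()

# Create a deployment under that endpoint using the registered model.
# Depending on how the model was registered (for example, non-MLflow),
# you may also need to supply an environment and a scoring script.
deployment = BatchDeployment(
    name="titanic-deployment",
    endpoint_name=endpoint.name,
    model=titanic_model,
    compute="cpu-cluster",  # assumed existing compute cluster
    instance_count=1,
    max_concurrency_per_instance=2,
    mini_batch_size=10,
    output_action=BatchDeploymentOutputAction.APPEND_ROW,
    output_file_name="predictions.csv",
)
ml_client.batch_deployments.begin_create_or_update(deployment).result()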

Selecting Deploy to batch endpoint opens the deployment wizard. Use the following values for the required fields:

  • Endpoint name: titanic-survival-batch-endpoint
  • Model: Retain the default of titanic_survival_model_
  • Deployment name: titanic-deployment
  • Environment...