MLOps with Red Hat OpenShift

By: Ross Brigoli, Faisal Masood
Overview of this book

MLOps with OpenShift offers practical insights for implementing MLOps workflows on the dynamic OpenShift platform. As organizations worldwide seek to harness the power of machine learning operations, this book lays the foundation for your MLOps success. Starting with an exploration of key MLOps concepts, including data preparation, model training, and deployment, you’ll prepare to unleash OpenShift capabilities, kicking off with a primer on containers, pods, operators, and more. With the groundwork in place, you’ll be guided through MLOps workflows, uncovering the applications of popular machine learning frameworks for training and testing models on the platform. As you advance through the chapters, you’ll focus on the open-source data science and machine learning platform, Red Hat OpenShift Data Science, and its partner components, such as Pachyderm and Intel OpenVINO, to understand their role in building and managing data pipelines, as well as deploying and monitoring machine learning models. Armed with this comprehensive knowledge, you’ll be able to implement MLOps workflows on the OpenShift platform proficiently.
Table of Contents (13 chapters)

  • Part 1: Introduction
  • Part 2: Provisioning and Configuration
  • Part 3: Operating ML Workloads

Deploying ML Models as a Service

In the previous chapter, you built a model using RHODS. In this chapter, you will start packaging and deploying your models as a service. You will see that you do not need any application development experience to expose your model. This capability enables your data science teams to be more agile in testing new models and making them available for consumption.

In this chapter, we will cover the following topics:

  • Packaging and deploying models as a service
  • Autoscaling the deployed models
  • Releasing new versions of the model
  • Securing the deployed model endpoint

Before we start, please make sure that you have completed the model-building steps and performed the configuration mentioned in the previous chapter. We’ll start by exposing our model as an HTTP service.
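To give you a feel for what the exposed service looks like, here is a minimal sketch of querying a deployed model over HTTP. It assumes the model has already been deployed through the RHODS model serving feature and that the endpoint follows the KServe V2 (Open Inference Protocol) REST format; the URL, model name, and input shape shown are hypothetical placeholders, so substitute the values from your own deployment.

    import requests  # standard HTTP client library

    # Hypothetical inference URL; replace it with the endpoint shown for
    # your deployed model in the RHODS dashboard.
    URL = "https://my-model-my-project.apps.example.com/v2/models/my-model/infer"

    # KServe V2-style request body: one input tensor with its name,
    # shape, datatype, and data. Adjust these to match your model.
    payload = {
        "inputs": [
            {
                "name": "input-0",
                "shape": [1, 4],
                "datatype": "FP32",
                "data": [[5.1, 3.5, 1.4, 0.2]],
            }
        ]
    }

    response = requests.post(URL, json=payload, timeout=30)
    response.raise_for_status()
    print(response.json())  # predictions are returned in the "outputs" field

The exact request schema depends on the serving runtime you choose, so treat this as an illustration of the pattern rather than the definitive payload; the chapter walks through the concrete steps for your deployment.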