Machine Learning on Kubernetes

By: Faisal Masood, Ross Brigoli

Overview of this book

MLOps is an emerging field that aims to bring the repeatability, automation, and standardization of the software engineering domain to data science and machine learning engineering. By implementing MLOps with Kubernetes, data scientists, IT professionals, and data engineers can collaborate and build machine learning solutions that deliver business value for their organization. You'll begin by understanding the different components of a machine learning project. Then, you'll design and build a practical end-to-end machine learning project using open source software. As you progress, you'll understand the basics of MLOps and the value it can bring to machine learning projects. You'll also gain experience in building, configuring, and using an open source, containerized machine learning platform. In later chapters, you'll prepare data, build and deploy machine learning models, and automate workflow tasks using the same platform. Finally, the exercises in this book will give you hands-on experience with Kubernetes and open source tools such as JupyterHub, MLflow, and Airflow. By the end of this book, you'll have learned how to effectively build, train, and deploy a machine learning model using the machine learning platform you built.
Table of Contents (16 chapters)

Part 1: The Challenges of Adopting ML and Understanding MLOps (What and Why)
Part 2: The Building Blocks of an MLOps Platform and How to Build One on Kubernetes
Part 3: How to Use the MLOps Platform and Build a Full End-to-End Project Using the New Platform

Packaging, running, and monitoring a model using Seldon Core

In this section, you will package the model file you built in Chapter 6, Machine Learning Engineering, into a container. You will then use a Seldon Deployment to deploy and access the model. Later in this book, you will automate this process; doing it manually here will strengthen your understanding of the components and how they work.

Before you start this exercise, please make sure that you have created an account with a public Docker registry. We will use the free quay.io as our registry, but you are free to use your preferred one:

  1. Let's first verify that MLflow and Minio (our S3 server) are running in our cluster:
    kubectl get pods -n ml-workshop | grep -iE 'mlflow|minio' 

You should see the following response:

Figure 7.11 – MLflow and Minio are running on the platform

  2. Get the ingress list for...