Serving models with MLflow
One of the benefits of using MLflow in Azure Databricks as the repository for our machine learning models is that it allows us to serve predictions from the Model Registry as REST API endpoints. These endpoints are updated automatically whenever a new version of a model is transitioned into one of the stages, so serving complements the lifecycle tracking that the MLflow Model Registry already provides.
Enabling a model to be served as a REST API endpoint is done from the Model Registry UI in the Azure Databricks workspace: go to the model's page in the Model Registry UI and click the Enable Serving button on the Serving tab.
Once you have clicked the button, which is shown in the following screenshot, the status will appear as Pending. After a couple of minutes, the status will change to Ready:
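Once the endpoint is Ready, it can be queried over HTTP with a Databricks personal access token. As a rough sketch, the request for this style of serving targets a URL of the form `https://<workspace-url>/model/<model-name>/<stage-or-version>/invocations`; the workspace URL, model name, and token below are placeholders, and the exact accepted payload format depends on your MLflow version (recent versions accept the `dataframe_split` input format shown here):

```python
import json


def build_scoring_request(instance, model_name, stage, token, columns, rows):
    """Assemble the URL, headers, and JSON body for a scoring call
    against a model served from the Databricks Model Registry.

    All arguments are placeholders to be replaced with your own
    workspace URL, registered model name, stage (or version number),
    and personal access token.
    """
    url = f"https://{instance}/model/{model_name}/{stage}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # Input rows encoded in MLflow's "dataframe_split" format:
    # column names once, then one list of values per row.
    body = json.dumps({"dataframe_split": {"columns": columns, "data": rows}})
    return url, headers, body


url, headers, body = build_scoring_request(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    "my-model",                                    # placeholder model name
    "Production",
    "<personal-access-token>",
    ["feature_a", "feature_b"],
    [[1.0, 2.0]],
)
# The request itself would then be sent with an HTTP client, e.g.:
#   requests.post(url, headers=headers, data=body)
```

Keeping the payload construction separate from the HTTP call makes it easy to inspect exactly what will be sent before pointing the request at a live endpoint.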
If you want to disable...