Mastering Machine Learning on AWS

By: Dr. Saket S.R. Mengle, Maximo Gurmendez

Overview of this book

Amazon Web Services (AWS) is constantly driving new innovations that empower data scientists to explore a variety of machine learning (ML) cloud services. This book is your comprehensive reference for learning and implementing advanced ML algorithms in the AWS cloud. As you go through the chapters, you'll gain insights into how these algorithms can be trained, tuned, and deployed in AWS using Apache Spark on Elastic MapReduce (EMR), SageMaker, and TensorFlow. While you focus on algorithms such as XGBoost, linear models, factorization machines, and deep nets, the book will also provide you with an overview of AWS as well as detailed practical applications that will help you solve real-world problems. Every application includes a series of companion notebooks with all the necessary code to run on AWS. In the next few chapters, you will learn to use SageMaker and EMR Notebooks to perform a range of tasks, from smart analytics and predictive modeling to sentiment analysis. By the end of this book, you will be equipped with the skills you need to effectively handle machine learning projects and implement and evaluate algorithms on AWS.
Table of Contents (24 chapters)
Section 1: Machine Learning on AWS
Section 2: Implementing Machine Learning Algorithms at Scale on AWS
Section 3: Deep Learning
Section 4: Integrating Ready-Made AWS Machine Learning Services
Section 5: Optimizing and Deploying Models through AWS
Appendix: Getting Started with AWS

Training and serving the TensorFlow model through SageMaker

Instead of training the model in a notebook instance, we now train it on the SageMaker infrastructure. In previous chapters, we used built-in estimators such as BlazingText, XGBoost, and Factorization Machines (FMs). In this section, we explore how to build our own TensorFlow models and train them through SageMaker, much as we did with those prebuilt models. To do this, we just have to teach SageMaker how our TensorFlow model should be constructed, and comply with some conventions regarding the format, location, and structure of the data. We specify all of this through a Python script.
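To make these conventions concrete, here is a minimal, hedged sketch of what such a training script's skeleton might look like. It is not the book's actual script: the hyperparameter names are placeholders, and the SM_MODEL_DIR and SM_CHANNEL_TRAINING environment variables follow SageMaker's script-mode conventions for passing data and model locations into the training container.

```python
# Illustrative skeleton of a SageMaker training script (not the book's code).
# SageMaker passes hyperparameters as command-line arguments and injects
# data/model locations through SM_* environment variables.
import argparse
import os


def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    # Hyperparameters (placeholder names) arrive as CLI arguments.
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--learning-rate", type=float, default=0.01)
    # Locations SageMaker sets inside the training container.
    parser.add_argument(
        "--model-dir", type=str,
        default=os.environ.get("SM_MODEL_DIR", "/opt/ml/model"))
    parser.add_argument(
        "--train", type=str,
        default=os.environ.get("SM_CHANNEL_TRAINING",
                               "/opt/ml/input/data/training"))
    return parser.parse_args(argv)


def train(args):
    # Build and fit the TensorFlow model here, reading training data
    # from args.train and saving the trained model under args.model_dir.
    pass
```

The key point is that the script itself never hardcodes S3 paths: SageMaker stages the data and collects the saved model through these well-known locations.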

SageMaker will rely on this Python script to perform the training within SageMaker training instances:

import sagemaker
from sagemaker import get_execution_role
import json
import boto3
from sagemaker.tensorflow import...
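The import above is truncated in this excerpt, so as a hedged sketch only: with the TensorFlow estimator from the SageMaker Python SDK, launching training would look roughly like the following. The entry-point filename, instance type, framework version, and S3 path are all illustrative assumptions, not values from the book.

```python
# Illustrative configuration for a SageMaker TensorFlow training job.
# Every value below is an assumption for the sketch, not the book's code.
estimator_config = {
    "entry_point": "train.py",              # training script SageMaker runs
    "train_instance_count": 1,              # number of training instances
    "train_instance_type": "ml.p2.xlarge",  # instance type (illustrative)
    "framework_version": "1.12",            # TensorFlow version (illustrative)
}

# With the SageMaker SDK available, training would then be launched
# roughly as follows (commented out so the sketch stays self-contained):
#   from sagemaker.tensorflow import TensorFlow
#   estimator = TensorFlow(role=get_execution_role(), **estimator_config)
#   estimator.fit({"training": "s3://<bucket>/<prefix>"})
```

Calling fit() uploads nothing from the notebook itself; it instructs SageMaker to provision the training instances, pull the data from S3, and run the entry-point script on them.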