Building Machine Learning Systems with Python - Third Edition

By: Luis Pedro Coelho, Willi Richert, Matthieu Brucher

Splitting into training and testing

As we saw in previous chapters, we split the dataset into training and testing data in order to obtain a principled estimate of the system's performance: a certain fraction of the data points (we will use 10 percent) is reserved for testing, and the rest is used for training.

However, because the data is structured differently in this context, the code is different. In some of the models we explore, setting aside 10 percent of the users would not work, since those models need at least some ratings from every user; instead, we set aside a fraction of the individual ratings (user-movie pairs) for testing.
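To make this concrete, the following is a minimal sketch of such a ratings-level split, assuming the ratings live in a dense user-by-movie array with zeros marking missing entries (the format returned by the load function below); the name get_train_test, the fraction argument, and the random seed are illustrative choices rather than the book's exact code:

import numpy as np

def get_train_test(reviews, fraction=0.1, seed=0):
    # reviews: dense user-by-movie array; 0 means "no rating"
    rng = np.random.RandomState(seed)
    users, movies = np.where(reviews > 0)         # positions of all known ratings
    n_test = int(len(users) * fraction)           # hold out ~10 percent of them
    test_idx = rng.choice(len(users), n_test, replace=False)

    train = reviews.copy()
    test = np.zeros_like(reviews)
    train[users[test_idx], movies[test_idx]] = 0  # hide held-out ratings from training
    test[users[test_idx], movies[test_idx]] = reviews[users[test_idx], movies[test_idx]]
    return train, test

Every user keeps most of their ratings in train, while the hidden entries in test can later be compared against the model's predictions.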

The first step is to load the data from disk, for which we use the following function:

def load(): 
    import numpy as np 
    from scipy import sparse 
 
    data = np.loadtxt('data/ml-100k/u.data') 
    ij = data[:, :2] 
    ij -= 1  # original data is in 1-based system 
    values = data...
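
The listing above is cut off; a plausible completion is sketched below, assuming the standard MovieLens 100k u.data layout (tab-separated columns: user id, item id, rating, timestamp) and a dense user-by-movie array of ratings as the return value. Treat it as a sketch of the idea rather than the book's exact code:

import numpy as np
from scipy import sparse

def load():
    # u.data columns: user id, item id, rating, timestamp (tab-separated)
    data = np.loadtxt('data/ml-100k/u.data')
    ij = data[:, :2]
    ij -= 1                              # original data is in a 1-based system
    values = data[:, 2]
    rows, cols = ij.T.astype(int)
    # build a sparse user-by-movie matrix of ratings, then return it as a dense array
    reviews = sparse.csc_matrix((values, (rows, cols))).astype(float)
    return reviews.toarray()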