Python Machine Learning Cookbook

By: Prateek Joshi, Vahid Mirjalili
Overview of this book

Machine learning is becoming increasingly pervasive in the modern data-driven world. It is used extensively across many fields, such as search engines, robotics, and self-driving cars. With this book, you will learn how to perform various machine learning tasks in different environments. We'll start by exploring a range of real-life scenarios where machine learning can be used, and look at various building blocks. Throughout the book, you'll use a wide variety of machine learning algorithms to solve real-world problems and use Python to implement them. You'll discover how to deal with various types of data and explore the differences between machine learning paradigms such as supervised and unsupervised learning. We'll also cover a range of regression techniques, classification algorithms, predictive modeling, data visualization techniques, recommendation engines, and more, with the help of real-world examples.
Table of Contents (19 chapters)
Python Machine Learning Cookbook
Credits
About the Author
About the Reviewer
www.PacktPub.com
Preface
Index

Constructing a k-nearest neighbors regressor


We learned how to use the k-nearest neighbors algorithm to build a classifier. Conveniently, the same algorithm can also be used as a regressor. Let's see how to do that.

How to do it…

  1. Create a new Python file, and import the following packages:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn import neighbors
  2. Let's generate some sample Gaussian-distributed data:

    # Generate sample data
    amplitude = 10
    num_points = 100
    X = amplitude * np.random.rand(num_points, 1) - 0.5 * amplitude
  3. We need to add some noise to the data to introduce some randomness. The goal of adding noise is to see whether our algorithm can get past it and still function in a robust way:

    # Compute target and add noise
    y = np.sinc(X).ravel() 
    y += 0.2 * (0.5 - np.random.rand(y.size))
  4. Let's visualize it as follows. Note that c='k' and facecolors='none' conflict in Matplotlib's scatter, so we use edgecolors to draw hollow black circles, and call plt.show() to display the figure:

    # Plot input data
    plt.figure()
    plt.scatter(X, y, s=40, edgecolors='black', facecolors='none')
    plt.title('Input data')
    plt.show()
  5. We just generated some data and evaluated...
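The text is cut off here, but the natural next step in a recipe like this is to fit a k-nearest neighbors regressor to the noisy data and predict over the input range. The following is a minimal sketch of that step using scikit-learn's KNeighborsRegressor; the parameter choices (n_neighbors=8, weights='distance') are illustrative assumptions, not taken from the truncated text:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Recreate the sample data from steps 2 and 3
amplitude = 10
num_points = 100
X = amplitude * np.random.rand(num_points, 1) - 0.5 * amplitude

# Compute target and add noise
y = np.sinc(X).ravel()
y += 0.2 * (0.5 - np.random.rand(y.size))

# Fit a k-nearest neighbors regressor; weighting neighbors by inverse
# distance makes closer points count more toward each prediction
n_neighbors = 8
knn_regressor = KNeighborsRegressor(n_neighbors, weights='distance')
knn_regressor.fit(X, y)

# Predict over a dense grid spanning the input range
x_grid = np.linspace(-0.5 * amplitude, 0.5 * amplitude, 500).reshape(-1, 1)
y_grid = knn_regressor.predict(x_grid)
```

The grid predictions (x_grid, y_grid) can then be plotted on top of the scatter from step 4 to see how well the regressor tracks the underlying sinc curve through the noise.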