Learning Data Mining with Python - Second Edition

By: Robert Layton

Overview of this book

This book teaches you to design and develop data mining applications using a variety of datasets, starting with basic classification and affinity analysis. It covers a large number of libraries available in Python, including the Jupyter Notebook, pandas, scikit-learn, and NLTK. You will gain hands-on experience with complex data types including text, images, and graphs. You will also discover object detection using Deep Neural Networks, currently one of the biggest and most challenging areas of machine learning. With restructured examples and code samples updated for the latest version of Python, each chapter of this book introduces you to new algorithms and techniques. By the end of the book, you will have great insight into using Python for data mining, along with an understanding of the algorithms and their implementations.

Training on Amazon's EMR infrastructure


We are going to use Amazon's Elastic MapReduce (EMR) infrastructure to run our parsing and model-building jobs.

In order to do that, we first need to create a bucket in Amazon's storage cloud. Open the Amazon S3 console in your web browser by going to http://console.aws.amazon.com/s3, and click on Create Bucket. Remember the name of the bucket, as we will need it later.
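If you prefer to script this step rather than use the console, a minimal sketch with boto3 (the AWS SDK for Python, an assumption here since the text uses the web console) looks like the following. The bucket name is a placeholder; S3 bucket names must be globally unique.

```python
def bucket_request(bucket_name, region="us-east-1"):
    """Build the keyword arguments for an S3 create_bucket call.

    us-east-1 is the default region and rejects an explicit
    LocationConstraint, so it is only set for other regions.
    """
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs


def create_bucket(bucket_name, region="us-east-1"):
    """Create the bucket; assumes AWS credentials are already configured."""
    import boto3  # imported here so the helper above stays dependency-free
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(**bucket_request(bucket_name, region))
```

For example, `create_bucket("my-blogs-bucket")` would create the bucket in the default region, matching the console step above.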

Right-click on the new bucket and select Properties. Then, change the permissions, granting everyone full access. This is not a good security practice in general, and I recommend that you change the access permissions after you complete this chapter. You can use advanced permissions in Amazon's services to give your script access and also protect against third parties viewing your data.

Left-click the bucket to open it and click on Create Folder. Name the folder blogs_train. We are going to upload our training data to this folder for processing on the cloud.
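The upload can also be scripted. Note that S3 has no real folders: the blogs_train folder created above is just a key prefix. The sketch below (boto3 again assumed; the local directory path is a placeholder) maps each local file to its key under that prefix and uploads it.

```python
import os


def training_keys(local_dir, prefix="blogs_train"):
    """Map each file in local_dir to its S3 key under the given prefix."""
    keys = {}
    for name in sorted(os.listdir(local_dir)):
        path = os.path.join(local_dir, name)
        if os.path.isfile(path):
            keys[path] = "%s/%s" % (prefix, name)
    return keys


def upload_training_data(local_dir, bucket_name):
    """Upload every file in local_dir into the blogs_train prefix."""
    import boto3
    s3 = boto3.client("s3")
    for path, key in training_keys(local_dir).items():
        s3.upload_file(path, bucket_name, key)
```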

On your computer...