
Learning Predictive Analytics with R

By: Eric Mayor

Overview of this book

This book is packed with easy-to-follow guidelines that explain the workings of the many key data mining tools of R, which are used to discover knowledge from your data. You will learn how to perform key predictive analytics tasks with R, such as training and testing predictive models for classification and regression, and scoring new data sets. All chapters will guide you in acquiring these skills in a practical way. Most chapters also include a theoretical introduction that will sharpen your understanding of the subject matter and invite you to go further. The book familiarizes you with the most common data mining tools of R, such as k-means, hierarchical regression, linear regression, association rules, principal component analysis, multilevel modeling, k-NN, Naïve Bayes, decision trees, and text mining. It also describes visualization techniques using the basic visualization tools of R, as well as lattice for visualizing patterns in data organized in groups. This book is invaluable for anyone fascinated by the data mining opportunities offered by GNU R and its packages.
Table of Contents (23 chapters)
Learning Predictive Analytics with R
Credits
About the Author
About the Reviewers
www.PacktPub.com
Preface
Exercises and Solutions
Index

ID3


ID3 is one of the simplest algorithms for producing decision trees with categorical classes and attributes. We explain it because of its simplicity (although we will not examine its use here), and we will build on this understanding when discussing the other algorithms.

ID3 relies on a measure called information gain to build the trees: at each node, it splits on the attribute that yields the highest information gain. The goal is to maximize the predictive power of the tree by reducing the uncertainty in the data.
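To make this concrete, here is a minimal sketch of information gain for a categorical attribute. The function names (`entropy`, `info_gain`) and the toy data are ours, not from any package or from the book's code:

```r
# Shannon entropy (in bits) of a vector of class labels.
entropy <- function(labels) {
  p <- table(labels) / length(labels)  # observed class proportions
  -sum(p * log2(p))
}

# Information gain of splitting `labels` by `attribute`:
# entropy before the split minus the weighted entropy after it.
info_gain <- function(labels, attribute) {
  weights <- table(attribute) / length(attribute)
  cond <- sapply(split(labels, attribute), entropy)
  entropy(labels) - sum(weights * cond)
}

# Toy data: does "outlook" help predict "play"?
play    <- c("yes", "yes", "no", "no", "yes", "no")
outlook <- c("sun", "sun", "rain", "rain", "sun", "rain")
info_gain(play, outlook)  # 1 bit: outlook perfectly separates play
```

ID3 would compute this gain for every candidate attribute and split on the one with the highest value; here, `outlook` removes all uncertainty about `play`, so its gain equals the full initial entropy of 1 bit.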

Entropy

Entropy is a measure of the uncertainty in a source of information. We discuss it before information gain because the latter relies on the computation of entropy.

Entropy is easily understood using an example. Let's consider three opaque boxes containing 100 M&Ms each. In box 1, there are 99 red M&Ms and 1 yellow. In box 2, there are as many red as yellow M&Ms (50 of each). In box 3, there are 25 red M&Ms and 75 yellow. Knowing this, we want to guess the color of the next M&M we pick from each of the boxes.
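We can compute the entropy of each box directly. The following helper is a small sketch of the standard Shannon entropy formula, H = -Σ p·log2(p), applied to the three boxes; the function name is ours:

```r
# Shannon entropy (in bits) of a vector of class proportions.
entropy <- function(p) {
  p <- p[p > 0]        # by convention, 0 * log2(0) contributes 0
  -sum(p * log2(p))
}

entropy(c(0.99, 0.01))  # box 1: ~0.08 bits (almost certain: red)
entropy(c(0.50, 0.50))  # box 2: 1 bit (maximal uncertainty)
entropy(c(0.25, 0.75))  # box 3: ~0.81 bits
```

The more lopsided the proportions, the lower the entropy: box 1 leaves us nearly certain of the next color, while box 2, with an even split, is maximally uncertain.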

As you have guessed, it...