Summary


In this chapter, we learned about classification using naive Bayes. This algorithm constructs tables of probabilities that are used to estimate the likelihood that new examples belong to various classes. The probabilities are calculated using a formula known as Bayes' theorem, which specifies how the probabilities of dependent events are related. Although applying Bayes' theorem in full can be computationally expensive, a simplified version that makes so-called "naive" assumptions about the independence of features can be applied to extremely large datasets.
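To make the calculation concrete, the following minimal R sketch applies Bayes' theorem under the naive independence assumption. The class priors, the two words, and all of the counts are hypothetical values chosen purely for illustration, not figures from the chapter:

# hypothetical priors: 20 of 100 training messages are spam
prior <- c(spam = 20 / 100, ham = 80 / 100)

# hypothetical per-class likelihoods that each word appears in a message
likelihood <- rbind(
  spam = c(free = 4 / 20, meeting = 1 / 20),
  ham  = c(free = 1 / 80, meeting = 10 / 80)
)

# naive assumption: treat the words as independent and multiply their likelihoods
joint <- prior * apply(likelihood, 1, prod)

# normalize so the class probabilities sum to one
posterior <- joint / sum(joint)
posterior

For a message containing both words, the resulting posterior values estimate how probable the spam and ham classes are given the observed features.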

The naive Bayes classifier is often used for text classification. To illustrate its effectiveness, we applied naive Bayes to the task of filtering spam SMS messages. Preparing the text data for analysis required specialized R packages for text processing and visualization. Ultimately, the model correctly classified nearly 98 percent of the SMS messages as spam or ham.
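For reference, the overall workflow can be sketched in a few lines of R using the tm and e1071 packages. The two placeholder messages and object names below are invented for illustration; the chapter applies the same steps to the full SMS dataset:

library(tm)     # text preparation: corpus and document-term matrix
library(e1071)  # provides the naiveBayes() implementation

# two placeholder messages standing in for the SMS data
sms_text <- c("WIN a free prize now", "are we still meeting for lunch")
sms_type <- factor(c("spam", "ham"))

# build a corpus, clean the text, and convert it to a document-term matrix
corpus <- VCorpus(VectorSource(sms_text))
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
dtm <- DocumentTermMatrix(corpus)

# recode word counts as categorical "Yes"/"No" features for naive Bayes
convert_counts <- function(x) factor(x > 0, levels = c(FALSE, TRUE), labels = c("No", "Yes"))
sms_features <- as.data.frame(lapply(as.data.frame(as.matrix(dtm)), convert_counts))

# train with Laplace smoothing, then classify messages
sms_model <- naiveBayes(sms_features, sms_type, laplace = 1)
table(predict(sms_model, sms_features), sms_type)

The final table compares predicted and actual classes; on the chapter's full dataset, the equivalent comparison yields the roughly 98 percent accuracy mentioned above.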

In the next chapter, we will examine...