Mastering Machine Learning with R - Second Edition

Overview of this book

This book will teach you advanced techniques in machine learning with the latest code in R 3.3.2. You will delve into statistical learning theory and supervised learning; design efficient algorithms; learn about creating recommendation engines; and use multi-class classification and deep learning. You will explore, in depth, topics such as data mining, classification, clustering, regression, predictive modeling, anomaly detection, and boosted trees with XGBOOST. More than just knowing the outcome, you'll understand how these concepts work and what they do. With a gentle learning curve on topics such as neural networks, you will also explore deep learning. By the end of this book, you will be able to perform machine learning with R in the cloud using AWS in various scenarios with different datasets.

Random forest


As with our motivation for using the Gower metric to handle mixed, and frankly messy, data, we can apply random forest in an unsupervised fashion. Selecting this method has several advantages:

  • Robust against outliers and highly skewed variables
  • No need to transform or scale the data
  • Handles mixed data (numeric and factors)
  • Can accommodate missing data
  • Can be used on data with a large number of variables; in fact, it can eliminate useless features by examining variable importance
  • The dissimilarity matrix produced serves as an input to the other techniques discussed earlier (hierarchical, k-means, and PAM) 
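As a minimal sketch of this workflow using the `randomForest` and `cluster` packages (the dataset, seed, and parameter values here are illustrative choices, not taken from the text):

```r
library(randomForest)
library(cluster)

set.seed(123)
x <- iris[, 1:4]  # numeric example data; mixed data frames work the same way

# With no response variable supplied, randomForest runs in unsupervised mode;
# proximity = TRUE returns a proximity matrix between observations
rf <- randomForest(x = x, ntree = 500, proximity = TRUE)

# Convert proximity to dissimilarity and pass it to PAM, just as we did
# with the Gower dissimilarity matrix
diss <- sqrt(1 - rf$proximity)
fit <- pam(as.dist(diss), k = 3, diss = TRUE)
table(fit$clustering)
```

The same dissimilarity matrix could instead be handed to `hclust()` for hierarchical clustering, which is one reason this technique pairs well with the methods covered earlier.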

A couple of words of caution. It may take some trial and error to properly tune the random forest with respect to the number of variables sampled at each tree split (the mtry argument in the function) and the number of trees grown. Studies show that growing more trees, up to a point, produces better results, and a good starting point is to grow 2,000 trees (Shi, T. & ...
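One way to take some of the trial and error out of choosing mtry is the `tuneRF()` helper in the `randomForest` package, which steps mtry up and down from its default and compares out-of-bag error. A brief sketch in a supervised setting (since `tuneRF()` requires a response; the dataset and parameter values are illustrative assumptions):

```r
library(randomForest)

set.seed(123)
# tuneRF steps mtry by stepFactor from the default and keeps moving while
# OOB error improves by at least `improve`; ntreeTry sets trees per trial
tuned <- tuneRF(x = iris[, 1:4], y = iris$Species,
                ntreeTry = 500, stepFactor = 2,
                improve = 0.01, trace = FALSE, plot = FALSE)
tuned  # matrix of candidate mtry values and their OOB error
```

The number of trees, by contrast, is usually not tuned so much as set generously: error stabilizes as trees are added, so a large value such as the 2,000 suggested above costs only computation time.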