Multivariate Adaptive Regression Splines (MARS)


How would you like a modeling technique that provides all of the following?

  • Offers the flexibility to build linear and nonlinear models for both regression and classification
  • Can support variable interaction terms
  • Is simple to understand and explain
  • Requires little data preprocessing
  • Handles all types of data: numeric, factors, and so on
  • Performs well on unseen data, that is, it handles the bias-variance trade-off well

If all of that sounds appealing, then I cannot recommend MARS models enough. The method was brought to my attention several months ago, and I have found it to perform extremely well. In fact, in a recent case of mine, it outperformed both a random forest and boosted trees on test/validation data. It has quickly become my baseline model, with all others serving as competitors. The other benefit I've seen is that it has eliminated much of the feature engineering I was doing. Much of that involved Weight-of-Evidence (WOE) and Information Values...
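To make the list above concrete, here is a minimal sketch of fitting a MARS model in R. MARS is implemented in the earth package (the name avoids the trademarked term "MARS"); the mtcars data and the formula below are illustrative assumptions on my part, not the case study from this chapter:

    # a minimal sketch, assuming the earth package is installed
    # (install.packages("earth")); mtcars is illustrative data only
    library(earth)

    # degree = 2 permits two-way interactions between hinge functions;
    # pruning via generalized cross-validation (GCV) happens automatically
    fit <- earth(mpg ~ ., data = mtcars, degree = 2)

    summary(fit)    # selected basis functions plus GCV and R-squared
    evimp(fit)      # variable importance from the pruning pass

    # predictions work exactly as with lm() or glm()
    predict(fit, newdata = mtcars[1:5, ])

    # for binary classification, earth accepts a glm argument, for example:
    # earth(am ~ ., data = mtcars, glm = list(family = binomial))

Note that degree = 1 (the default) yields a purely additive model, which is even simpler to explain; raising degree is how MARS supports the interaction terms mentioned in the list.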
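Since the paragraph above closes on Weight-of-Evidence and Information Value (IV), here is also a hedged sketch of one common way they are computed for a single categorical predictor against a binary target. The helper name woe_table is hypothetical, and the sign convention (events over non-events) varies across references:

    # hypothetical helper: WOE/IV for one categorical predictor x
    # against a binary target y coded 0 = non-event, 1 = event
    woe_table <- function(x, y) {
      counts <- table(x, y)                     # level-by-class counts
      dist   <- prop.table(counts, margin = 2)  # share of each class per level
      woe    <- log(dist[, "1"] / dist[, "0"])  # WOE = ln(% events / % non-events)
      iv     <- sum((dist[, "1"] - dist[, "0"]) * woe)
      list(woe = woe, iv = iv)
    }

    # illustrative usage: WOE of cylinder count against transmission type
    res <- woe_table(factor(mtcars$cyl), mtcars$am)
    res$woe
    res$iv

A predictor's IV summarizes how well it separates the two classes, which is why it is often used to screen candidate features before modeling.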