#### Overview of this book

This book teaches advanced machine learning techniques using up-to-date code in R 3.3.2. You will delve into statistical learning theory and supervised learning; design efficient algorithms; build recommendation engines; and apply multi-class classification and deep learning. You will explore, in depth, topics such as data mining, classification, clustering, regression, predictive modeling, anomaly detection, and boosted trees with XGBoost. More than just knowing the outcome, you will understand how these concepts work and why they matter. With a gentle learning curve on topics such as neural networks, you will move into deep learning. By the end of this book, you will be able to perform machine learning with R in the cloud using AWS, across a variety of scenarios and datasets.
#### Table of contents

- A Process for Success
- Linear Regression - The Blocking and Tackling of Machine Learning
- Logistic Regression and Discriminant Analysis
- Advanced Feature Selection in Linear Models
- More Classification Techniques - K-Nearest Neighbors and Support Vector Machines
- Classification and Regression Trees
- Neural Networks and Deep Learning
- Cluster Analysis
- Principal Components Analysis
- Market Basket Analysis, Recommendation Engines, and Sequential Analysis
- Creating Ensembles and Multiclass Classification
- Time Series and Causality
- Text Mining
- R on the Cloud
- R Fundamentals
- Sources

## Logistic regression

As previously discussed, our classification problem is best modeled with probabilities that are bounded by `0` and `1`. We can achieve this for all of our observations with a number of different functions, but here we will focus on the logistic function. The logistic function used in logistic regression is as follows:

*Probability(Y) = e^(B0 + B1x) / (1 + e^(B0 + B1x))*
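As a quick illustration (not from the book's code), the logistic function can be written directly in R; `logistic` is a hypothetical helper name, and base R's `plogis()` computes the same mapping:

```r
# The logistic function: squashes any real-valued input into (0, 1)
logistic <- function(x) {
  1 / (1 + exp(-x))
}

logistic(0)         # 0.5: a linear predictor of zero maps to even odds
logistic(c(-4, 4))  # values near 0 and near 1 at the extremes
# Base R already provides this as the CDF of the logistic distribution:
plogis(0)           # also 0.5
```

Whatever the input, the output stays strictly between `0` and `1`, which is exactly the property we need for modeling probabilities.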

If you have ever placed a friendly wager on horse races or the World Cup, you may understand the concept better as odds. A probability can be turned into odds with the formulation Probability(Y) / (1 - Probability(Y)). For instance, if the probability of Brazil winning the World Cup is 20 percent, then the odds are 0.2 / (1 - 0.2), which is equal to 0.25, translating to odds of one to four.
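The Brazil example can be verified in a line or two of R; `prob_to_odds` is an illustrative function name, not one from the book:

```r
# Convert a probability to odds: p / (1 - p)
prob_to_odds <- function(p) p / (1 - p)

prob_to_odds(0.2)  # 0.25, i.e. odds of one to four
```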

To translate the odds back to a probability, divide the odds by one plus the odds. The World Cup example is thus 0.25 / (1 + 0.25), which is equal to 0.2, or 20 percent. Additionally, let's consider the odds ratio. Assume that the odds of Germany winning the Cup are 0.18....
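The reverse conversion is equally short in R; again, the function name is a hypothetical helper for the formula just described:

```r
# Convert odds back to a probability: odds / (1 + odds)
odds_to_prob <- function(odds) odds / (1 + odds)

odds_to_prob(0.25)  # 0.2, recovering the original 20 percent
```

Note that the two conversions are inverses of each other, so round-tripping a probability through odds and back returns the value you started with.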