The Bayes theorem


Bayes' theorem helps us find the posterior probability of an event, given that another event has taken place:

P(A|B) = P(B|A) * P(A) / P(B)

Here, A and B can be thought of as the target and the features, respectively:

  • P(A|B): The posterior probability, that is, the probability of target A, given that feature B has taken place
  • P(B|A): The likelihood, that is, the probability of feature B, given target A
  • P(A): The prior probability of target A
  • P(B): The prior probability of feature B
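As a minimal numerical sketch of how the formula is applied, the following Python snippet plugs in made-up probabilities (the values 0.6, 0.4, and 0.5 are placeholders for illustration, not figures from the book):

# Bayes' theorem with made-up, illustrative probabilities.
# A is the target event, B is the observed feature.
p_a = 0.6          # P(A): prior probability of the target
p_b = 0.4          # P(B): prior probability of the feature
p_b_given_a = 0.5  # P(B|A): likelihood of the feature, given the target

# Posterior probability: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # prints 0.75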

How the Naive Bayes classifier works

We will try to understand all of this by looking at the example of the Titanic. While the Titanic was sinking, certain categories of passengers were given priority over others in terms of being saved. We have the following dataset (it comes from a Kaggle dataset):

Person category | Survival chance
Woman           | Yes
Kid             | Yes
Kid             | Yes
Man             | No
Woman           | Yes
Woman           | Yes
Man             | No
Man             | Yes
Kid             | Yes
Woman           | No
Kid             | No
Woman           | No
Man             | Yes
Man             | No
Woman           | Yes
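Before tabulating this by hand, here is a minimal sketch of how the same counts can be tallied programmatically, assuming pandas is available (the DataFrame and the column names Category and Survived are my own, not from the book):

import pandas as pd

# Recreate the small Titanic-style dataset from the preceding table.
data = pd.DataFrame({
    "Category": ["Woman", "Kid", "Kid", "Man", "Woman", "Woman", "Man", "Man",
                 "Kid", "Woman", "Kid", "Woman", "Man", "Man", "Woman"],
    "Survived": ["Yes", "Yes", "Yes", "No", "Yes", "Yes", "No", "Yes",
                 "Yes", "No", "No", "No", "Yes", "No", "Yes"],
})

# Frequency table of person category versus survival, with row/column totals.
counts = pd.crosstab(data["Category"], data["Survived"], margins=True)
print(counts)

# Proportion of each category in the data, for example Kid: 4/15 = 0.27.
print(counts["All"].drop("All") / len(data))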

Now, let's prepare a likelihood table for the preceding information:

Category | Survival chance: No | Survival chance: Yes | Grand Total | Proportion of total
Kid      | 1                   | 3                    | 4           | 4/15 = 0.27
Man...