A Naive Bayes classification algorithm assigns to an element the class that is most probable according to Bayes' theorem.
Let's say that A and B are probabilistic events. P(A) is the probability of A being true. P(A|B) is the conditional probability of A being true, given that B is true. If this is the case, then Bayes' theorem states the following:

P(A|B) = P(B|A) * P(A) / P(B)

P(A) is the prior probability of A being true, without knowledge of P(B) or P(B|A). P(A|B) is the posterior probability of A being true, taking into consideration the additional knowledge about the probability of B being true.
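As a quick sketch of the theorem in code, the posterior P(A|B) can be computed directly from P(B|A), P(A), and P(B). The probability values below are illustrative assumptions, not figures from this chapter:

```python
def posterior(p_b_given_a, p_a, p_b):
    """Compute P(A|B) via Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: prior P(A) = 0.01, likelihood P(B|A) = 0.9,
# evidence P(B) = 0.05.
print(posterior(p_b_given_a=0.9, p_a=0.01, p_b=0.05))  # 0.18
```

Note how a small prior keeps the posterior modest even when the likelihood P(B|A) is high, a point the medical-test example later in the chapter revolves around.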
In this chapter, you will learn about the following topics:
- How to apply Bayes' theorem in a basic way to compute the probability that a medical test result is correct, using a simple medical-test example
- How to grasp Bayes' theorem by proving its statement and its extension
- How to apply Bayes' theorem differently to independent and dependent variables in chess-playing examples
- How...