ID3 is one of the simplest algorithms for producing decision trees with categorical classes and attributes. We chose to explain it because of its simplicity (though we will not examine its use here), and we will then build upon this understanding when discussing the other algorithms.
ID3 relies on a measure called information gain to build the trees. The goal is to maximize the predictive power of the tree by reducing the uncertainty in the data.
Entropy is a measure of the uncertainty in a source of information. We discuss it first because information gain is computed from entropy.
Entropy is easily understood using an example. Let's consider three opaque boxes containing 100 M&Ms each. In box 1, there are 99 red M&Ms and 1 yellow. In box 2, there are equal numbers of red and yellow M&Ms (50 of each). In box 3, there are 25 red M&Ms and 75 yellow. Knowing this, we want to guess the color of the next M&M we pick from each of the boxes.
As you have guessed, it is easiest to predict the outcome for box 1 (the next M&M is almost certainly red) and hardest for box 2 (a coin flip between red and yellow), with box 3 falling in between. Entropy captures exactly this intuition: box 2 has the highest entropy, box 1 the lowest.
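To make the intuition concrete, here is a minimal sketch computing the standard Shannon entropy, H = -Σ p·log₂(p), for the color proportions of each box (the function name `entropy` and the use of Python here are our own illustration, not part of the original text):

```python
import math

def entropy(proportions):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in proportions if p > 0)

# Box 1: 99 red, 1 yellow -> very predictable, low entropy
print(round(entropy([0.99, 0.01]), 3))   # 0.081

# Box 2: 50 red, 50 yellow -> maximum uncertainty for two colors
print(round(entropy([0.50, 0.50]), 3))   # 1.0

# Box 3: 25 red, 75 yellow -> in between
print(round(entropy([0.25, 0.75]), 3))   # 0.811
```

Note that entropy peaks at 1 bit when the two colors are equally likely (box 2) and approaches 0 as the box becomes nearly pure (box 1), matching the guessing difficulty described above.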