Sometimes one decision tree is not enough, so a set of decision trees is combined to produce a more powerful model. Algorithms that do this are called ensemble learning algorithms, and they are not limited to using decision trees as base models.
The most popular ensemble learning algorithm is the random forest. Rather than growing one single tree, a random forest grows K trees. Each tree is trained on a random subset S of the training data (a bootstrap sample). As an extra twist, each tree also uses only a random subset of the features. To make a prediction, the trees take a majority vote, and the winning class becomes the prediction.
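The procedure above can be sketched directly: bootstrap a sample for each tree, restrict it to a random feature subset, and take a majority vote at prediction time. This is a minimal illustration, not a production implementation; the choices of K = 25 trees and 3 features per tree are arbitrary, and the synthetic dataset is only there to make the sketch runnable.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

K = 25  # number of trees (arbitrary choice for this sketch)
m = 3   # features sampled per tree (arbitrary choice for this sketch)
trees, feat_idx = [], []

for _ in range(K):
    # random subset S of the training data (bootstrap sample, drawn with replacement)
    rows = rng.integers(0, len(X), size=len(X))
    # random subset of the features for this tree
    cols = rng.choice(X.shape[1], size=m, replace=False)
    tree = DecisionTreeClassifier(random_state=0).fit(X[rows][:, cols], y[rows])
    trees.append(tree)
    feat_idx.append(cols)

def predict(X_new):
    # each tree votes using only its own feature subset; the majority class wins
    votes = np.array([t.predict(X_new[:, c]) for t, c in zip(trees, feat_idx)])
    return (votes.mean(axis=0) > 0.5).astype(int)

accuracy = (predict(X) == y).mean()
print(accuracy)
```

Because each tree sees different rows and different columns, the trees make different mistakes, and the majority vote averages those mistakes out.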
Let me explain this with an example. The goal is to predict, for a given person, whether he/she has good credit or bad credit.
To do this, we provide labeled training data: in this case, people described by features, each labeled as having good credit or bad credit. Now, because we do not want to create feature bias, we give every tree only a random subset of the features to work with.
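In practice the whole example can be run with scikit-learn's `RandomForestClassifier`, which handles the bootstrap sampling and feature randomness internally (it samples a feature subset at every split, controlled by `max_features`, rather than once per tree). The credit features below (income, debt ratio, years of history) and the labels are made-up illustrative data, not a real dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labeled training data:
# columns = [income (k$), debt ratio, years of credit history]
# label 1 = good credit, label 0 = bad credit
X = np.array([
    [65, 0.2, 10],
    [30, 0.8,  1],
    [80, 0.1, 15],
    [25, 0.9,  2],
    [55, 0.3,  8],
    [40, 0.7,  3],
])
y = np.array([1, 0, 1, 0, 1, 0])

# n_estimators is K; max_features="sqrt" limits the feature subset per split
clf = RandomForestClassifier(n_estimators=50, max_features="sqrt",
                             random_state=0).fit(X, y)

# predict for a new person: high income, low debt, long history
prediction = clf.predict(np.array([[70, 0.15, 12]]))
print(prediction)
```

Each of the 50 trees votes on the new person, and `predict` returns the majority class.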