Random forest hyperparameters
The range of random forest hyperparameters is large unless one already has a working knowledge of decision tree hyperparameters, as covered in Chapter 2, Decision Trees in Depth.
In this section, we will go over additional random forest hyperparameters before grouping them with the hyperparameters that you have already seen. Many of these hyperparameters will be used by XGBoost.
Random forests build decision trees via bagging, meaning that samples are selected with replacement. Because sampling is done with replacement, after all of the samples have been chosen, some samples will remain that were never selected for a given tree.
It's possible to hold back these leftover samples as a test set. After the model is fit on one tree, it can immediately be scored against this test set. When the hyperparameter is set to oob_score=True, this is exactly what happens.
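A minimal sketch of this option in scikit-learn, using the Iris dataset purely as a stand-in (the dataset choice and the random_state value are assumptions, not from the text):

```python
from sklearn.datasets import load_iris  # example dataset, an assumption
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# oob_score=True scores each sample using only the trees
# whose bootstrap sample left that sample out.
rf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=2)
rf.fit(X, y)

# The out-of-bag accuracy acts like a built-in validation score.
print(rf.oob_score_)
```

Because the score is computed on samples each tree never saw, no separate train/test split is required to get this estimate.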
In other words, oob_score provides a shortcut to get a...