Similar to the bagging function, adabag provides a cross-validation function for the boosting method, named boosting.cv. In this recipe, we will demonstrate how to perform cross-validation using boosting.cv from the adabag package.
Here, we continue to use the telecom churn dataset as the input data source to perform a k-fold cross-validation with the boosting method.
Perform the following steps to retrieve the minimum estimation errors via cross-validation with the boosting method:
- First, you can use boosting.cv to cross-validate the training dataset:
> churn.boostcv = boosting.cv(churn ~ ., v=10, data=trainset, mfinal=5, control=rpart.control(cp=0.01))
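If you have not yet prepared trainset, the following is a minimal setup sketch. It assumes the churn dataset shipped with the C50 package and a simple random 70/30 split; the split ratio and variable names are assumptions for illustration, not part of this recipe:

```
# Assumed setup: load the churn data from the C50 package and split it
library(C50)      # provides the churn dataset (churnTrain, churnTest)
library(adabag)   # provides boosting.cv
library(rpart)    # provides rpart.control
data(churn)
set.seed(2)
# Randomly assign each row to the training (70%) or testing (30%) set
ind = sample(2, nrow(churnTrain), replace = TRUE, prob = c(0.7, 0.3))
trainset = churnTrain[ind == 1, ]
testset  = churnTrain[ind == 2, ]
```

With trainset in place, the boosting.cv call above runs a 10-fold cross-validation (v=10) using five boosting iterations (mfinal=5).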
- You can then obtain the confusion matrix from the boosting results:
> churn.boostcv$confusion
Output:
                Observed Class
Predicted Class  yes   no
             no  119 1940
...
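- Besides the confusion matrix, the object returned by boosting.cv also stores the cross-validated error rate, which you can retrieve directly:

```
# Average estimation error over the v cross-validation folds
> churn.boostcv$error
```

You can use this error value to compare boosting against the bagging cross-validation result from the previous recipe.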