Bayesian Optimization Gradient Boosted Trees (BOGBRT) is another variant of Bayesian Optimization, this time utilizing Gradient Boosted Regression Trees (GBRT) as the surrogate model. Note that we can construct virtually endless variants of Bayesian Optimization in skopt, since any other regressor from sklearn can be passed as the base_estimator parameter. GBRT, however, is one of the built-in surrogate models that comes with predefined default hyperparameter values in the skopt package.
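To illustrate that flexibility, here is a minimal sketch (not from the book) that passes a regressor instance as base_estimator instead of one of the built-in strings. One caveat: the surrogate must support standard-deviation predictions, predict(X, return_std=True), which the wrapped regressors in skopt.learning provide; the pipe and hyperparameter_space objects are assumed to be defined as in the earlier sections:

from skopt import BayesSearchCV
from skopt.learning import ExtraTreesRegressor

# An Extra-Trees surrogate; the skopt.learning wrapper adds the
# return_std support that the acquisition function relies on
custom_surrogate = ExtraTreesRegressor(n_estimators=100, random_state=0)

clf = BayesSearchCV(pipe, hyperparameter_space, n_iter=50,
                    optimizer_kwargs={"base_estimator": custom_surrogate},
                    cv=5, random_state=0)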
Similar to the Implementing Bayesian Optimization Random Forest section, we can simply change the base_estimator parameter within optimizer_kwargs to "GBRT". The following code shows you how to implement BOGBRT in skopt:
Import BayesSearchCV from the skopt package:

from skopt import BayesSearchCV

Initiate the BayesSearchCV class:

clf = BayesSearchCV(pipe, hyperparameter_space, n_iter=50,
                    optimizer_kwargs={"base_estimator": "GBRT",
                                      # the excerpt truncates here; the
                                      # remaining settings are illustrative
                                      "random_state": 0},
                    cv=5, random_state=0)
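For completeness, the following end-to-end sketch shows the searcher being fit and inspected. The pipeline, search space, and dataset here are hypothetical stand-ins of our own choosing for the book's pipe and hyperparameter_space objects, not the book's actual setup:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from skopt import BayesSearchCV
from skopt.space import Integer

# Hypothetical stand-ins for the book's pipe and hyperparameter_space
pipe = Pipeline([("scaler", StandardScaler()),
                 ("model", RandomForestClassifier(random_state=0))])
hyperparameter_space = {
    "model__n_estimators": Integer(50, 300),
    "model__max_depth": Integer(2, 20),
}

X, y = load_breast_cancer(return_X_y=True)

clf = BayesSearchCV(pipe, hyperparameter_space,
                    n_iter=10,  # reduced from 50 to keep the demo fast
                    optimizer_kwargs={"base_estimator": "GBRT"},
                    cv=5, random_state=0)
clf.fit(X, y)

print(clf.best_params_)
print(clf.best_score_)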