Hyperparameter Tuning with Python
Bayesian Optimization Random Forest (BORF) is another variant of the Bayesian Optimization hyperparameter tuning methods that utilizes RF as the surrogate model. Note that this variant is different from Sequential Model Algorithm Configuration (SMAC), although both of them utilize RF as the surrogate model (see Chapter 4, Exploring Bayesian Optimization).
Implementing BORF with skopt is actually very similar to implementing BOGP, as discussed in the previous section. We just need to change the base_estimator parameter within optimizer_kwargs to RF. Let's use the same example as in the Implementing Bayesian Optimization Gaussian Process section, but change the acquisition function from EI to LCB. Additionally, let's change the xi parameter in acq_func_kwargs to kappa, since we are using LCB as our acquisition function. Note that we could also keep the same acquisition function; the changes made here are just to show how you...