Implementing Successive Halving
Successive Halving (SH) is implemented as a pruner in Optuna, meaning it is responsible for stopping unpromising trials early, whenever continuing them appears to offer no additional benefit. Because SH is implemented as a pruner, its resource definition (see Chapter 6) in Optuna refers to the number of training steps or epochs of the model, rather than the number of training samples, as it does in scikit-learn's implementation.
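To make the resource definition concrete, here is a minimal, self-contained sketch of the successive halving idea, where the resource is the number of training epochs. The `train_step` callable and the numeric configurations are hypothetical stand-ins for real model training; this is an illustration of the algorithm, not Optuna's internal implementation:

```python
def successive_halving(configs, train_step, min_resource=1, reduction_factor=2):
    """Toy successive halving: the resource is the epoch budget.

    Each rung evaluates every surviving configuration with the current
    budget, keeps only the best 1/reduction_factor of them, then multiplies
    the budget by reduction_factor for the promoted configurations.
    """
    resource = min_resource
    survivors = list(configs)
    while len(survivors) > 1:
        # Score every surviving config with the current epoch budget.
        scores = {c: train_step(c, resource) for c in survivors}
        # Promote the top 1/reduction_factor (higher score is better).
        keep = max(1, len(survivors) // reduction_factor)
        survivors = sorted(survivors, key=scores.get, reverse=True)[:keep]
        resource *= reduction_factor  # promoted configs train longer
    return survivors[0]

# Hypothetical scoring function: configs closer to 0.7 score higher.
best = successive_halving([0.1, 0.3, 0.5, 0.7, 0.9],
                          lambda c, r: -abs(c - 0.7))
```

Here the poorly scoring configurations are discarded after only `min_resource` epochs, so most of the total budget is spent on the few configurations that survive to the later rungs.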
We can use SH as a pruner alongside any sampler. This example shows how to perform hyperparameter tuning with the Random Search algorithm as the sampler and SH as the pruner. The overall procedure is similar to the one described in the Implementing TPE section; since we are utilizing SH as a pruner, we have to edit our objective function so that it reports intermediate results to the pruner during the optimization process. In this example, we can use Optuna's callback integration for TF Keras.