What’s next?
Even though we have discussed many hyperparameter tuning methods and their implementations in various packages, there are several important concepts that have not been covered in this book and that you may want to explore further. As for hyperparameter tuning methods, you can read more about CMA-ES, which belongs to the heuristic search group (https://cma-es.github.io/). You can also read more about the meta-learning concept to further boost the performance of your Bayesian optimization tuning results (https://lilianweng.github.io/posts/2018-11-30-meta-learning/). It is also worth noting that you can combine the manual search method with other hyperparameter tuning methods to boost the efficiency of your experiments, especially when you already have prior knowledge about a good range of hyperparameter values.
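If you want to experiment with CMA-ES right away, the `cma` package (pycma), which is linked from the URL above, exposes an ask-and-tell interface that is straightforward to wire to a cross-validation score. The following is a minimal sketch, assuming the `cma` package (pip install cma) and scikit-learn are installed; the choice of model (GradientBoostingClassifier), dataset (breast cancer), and the two tuned hyperparameters are illustrative assumptions, not examples taken from the earlier chapters:

```python
# A minimal sketch of CMA-ES for hyperparameter tuning (not the book's
# own example): the model, dataset, and tuned hyperparameters below are
# illustrative assumptions.
import cma
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(params):
    """Return the value CMA-ES should minimize (negative CV accuracy)."""
    log_lr, subsample = params
    model = GradientBoostingClassifier(
        learning_rate=10 ** log_lr,                     # search on a log scale
        subsample=float(np.clip(subsample, 0.5, 1.0)),  # keep within a valid range
        n_estimators=50,
        random_state=0,
    )
    return -cross_val_score(model, X, y, cv=3).mean()

# Start the search at learning_rate=0.1 (log10 = -1) and subsample=0.8,
# with an initial step size (sigma) of 0.3 and a small iteration budget.
es = cma.CMAEvolutionStrategy([-1.0, 0.8], 0.3, {"maxiter": 10, "seed": 0})
while not es.stop():
    candidates = es.ask()                     # sample a new population
    es.tell(candidates, [objective(c) for c in candidates])

print("Best hyperparameters found:", es.result.xbest)
print("Best (negative) CV accuracy:", es.result.fbest)
```

Note how prior knowledge enters this sketch: the starting point and the clipped range for subsample encode a manually chosen region of the search space, which is one way to combine manual search with an automated method.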
As for the packages, you can also learn more about the HpBandSter package, which implements the Hyperband, BOHB, and random search methods...