Summary
In this chapter, we have covered the most important aspects of the Optuna package. We have learned how to implement various hyperparameter tuning methods with the help of this package, along with the important parameters of its classes and how they relate to the theory covered in previous chapters. From now on, you should be able to utilize the packages discussed in the last few chapters to implement your chosen hyperparameter tuning method and, ultimately, boost the performance of your ML model. Equipped with the knowledge from Chapters 3 to 6, you will also be able to debug your code when there are errors or unexpected results, and to craft your own experiment configuration to match your specific problem.
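As a quick recap of the workflow discussed in this chapter, the following is a minimal sketch of an Optuna study: defining an objective function, choosing a sampler (here, TPE), and running the optimization. The dataset, model, and search space shown here are illustrative choices, not the chapter's own examples.

import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def objective(trial):
    # Define the hyperparameter search space via the trial object
    n_estimators = trial.suggest_int("n_estimators", 50, 300)
    max_depth = trial.suggest_int("max_depth", 2, 16)
    model = RandomForestClassifier(
        n_estimators=n_estimators, max_depth=max_depth, random_state=0
    )
    X, y = load_breast_cancer(return_X_y=True)
    # Return the score that Optuna should maximize
    return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

# Create a study with the TPE sampler and run a fixed number of trials
study = optuna.create_study(
    direction="maximize", sampler=optuna.samplers.TPESampler(seed=0)
)
study.optimize(objective, n_trials=20)
print(study.best_params, study.best_value)

Swapping the sampler or pruner in create_study is how the different tuning methods from this chapter are selected, while the objective function stays largely unchanged.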
In the next chapter, we will learn about the DEAP and Microsoft NNI packages and how to utilize them to perform various hyperparameter tuning methods. The goal of the next chapter is similar to...