Summary
In this chapter, we learned the essentials of the DEAP and Microsoft NNI packages: how to implement various hyperparameter tuning methods with them, what each of the important class parameters means, and how those parameters relate to the theory covered in the previous chapters. You should now be able to use these packages to implement your chosen hyperparameter tuning method and, ultimately, boost the performance of your ML model. Equipped with the knowledge from Chapters 3 to 6, you will also be able to debug your code when errors or unexpected results occur, and to craft your own experiment configuration to match your specific problem.
In the next chapter, we’ll learn about the hyperparameters of several popular ML algorithms. Each algorithm will be explained in detail, including (but not limited to) the definition of each hyperparameter...