Delineating hyperparameter types
As we develop a model and its training process, we define variables and set their values to shape both the training workflow and the model's structure. These values, such as the number of hidden nodes in a layer of a multilayer perceptron or the choice of optimizer and loss function, are known as hyperparameters, and they are specified by the model creator.

The performance of a machine learning model often depends on its architecture and on the hyperparameters selected during training, yet finding an optimal set of hyperparameters is not a trivial task. The simplest approach is grid search: enumerate every combination of hyperparameter values within a search space, then compare the evaluation metrics across those combinations. While this is straightforward and thorough, it is a tedious process. We will see how the new tf.keras
API implements three different search algorithms...
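Before turning to those algorithms, the exhaustive strategy itself can be sketched in a few lines. The following is a minimal, framework-free illustration of grid search; the search space, the hyperparameter names, and the `fake_evaluate` scoring function are all hypothetical stand-ins for training and evaluating a real model.

```python
from itertools import product

# Hypothetical search space; names and values are illustrative only.
search_space = {
    "hidden_nodes": [32, 64, 128],
    "optimizer": ["adam", "sgd"],
    "learning_rate": [0.01, 0.001],
}

def grid_search(space, evaluate):
    """Try every combination in the space and return the best one."""
    names = list(space)
    best_score, best_params = float("-inf"), None
    # product() builds all possible combinations of hyperparameter values.
    for values in product(*(space[n] for n in names)):
        params = dict(zip(names, values))
        score = evaluate(params)  # e.g. validation accuracy
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

# Stand-in for training a model and measuring its validation metric.
def fake_evaluate(params):
    return params["hidden_nodes"] * params["learning_rate"]

best, score = grid_search(search_space, fake_evaluate)
print(best, score)
```

With three values for one hyperparameter and two each for the others, this already means 3 × 2 × 2 = 12 full training runs, which is why grid search quickly becomes impractical as the search space grows.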