The parameters of a model typically refer to quantities such as the weights or the bias/intercept terms. However, there are many other values that must be set at the outset and are not optimized or learned during model training. These are referred to as hyperparameters. Indeed, even the choice of model itself (for example, a deep feedforward neural network, a random forest, or a support vector machine) can be seen as a hyperparameter.
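To make the distinction concrete, here is a minimal sketch (plain Python, with illustrative names) that fits a line y = w·x + b by gradient descent. The weight w and bias b are parameters, learned during training; learning_rate and n_epochs are hyperparameters, fixed before training begins.

```python
def fit_line(xs, ys, learning_rate=0.05, n_epochs=1000):
    """Fit y = w*x + b by gradient descent on mean squared error.

    w and b are parameters (learned); learning_rate and n_epochs
    are hyperparameters (chosen before training starts).
    """
    w, b = 0.0, 0.0                      # parameters, initialized then learned
    n = len(xs)
    for _ in range(n_epochs):
        # gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b

# Data generated from y = 2x + 1; the learned parameters should be close
# to w = 2 and b = 1, while the hyperparameters never change during training.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))
```

Changing the hyperparameters (say, a much larger learning_rate) alters how training proceeds, or whether it converges at all, without changing what the parameters mean.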
Even if we assume that we have somehow decided that a deep feedforward neural network is the best modeling strategy, there are still many hyperparameters to set. These may be explicitly specified by the user or implicitly set by relying on the default values the software provides.
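The interplay of explicit and implicit (default) hyperparameter values can be sketched as follows; the hyperparameter names and defaults here are hypothetical, but the pattern mirrors how many libraries resolve user settings against built-in defaults.

```python
# Illustrative defaults; real software ships its own documented values.
DEFAULTS = {
    "n_hidden_layers": 2,
    "hidden_units": 64,
    "learning_rate": 0.01,
    "batch_size": 32,
}

def resolve_hyperparameters(user_params=None):
    """Explicitly specified values override the implicit defaults."""
    params = dict(DEFAULTS)
    params.update(user_params or {})
    return params

# The user explicitly sets one hyperparameter; the rest are
# implicitly specified by the defaults.
hp = resolve_hyperparameters({"learning_rate": 0.001})
print(hp["learning_rate"])  # explicitly specified by the user
print(hp["batch_size"])     # implicitly specified via the default
```

A consequence worth noting: a model trained entirely "with defaults" is still a fully hyperparameterized model; the choices were simply made by the software's authors rather than by the user.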
The values chosen for the hyperparameters can have a dramatic impact on a model's accuracy and training speed. Indeed, we have already seen examples of trying different hyperparameters, such as different numbers of hidden...