Summary
Hyperparameter tuning is an extremely useful technique for finding optimal hyperparameters, not just in sequence tagging, but across many areas of machine learning. However, it comes at a cost: it is incredibly expensive, since it repeats the training and evaluation process tens, hundreds, or potentially thousands of times. In the real world, exploring each and every potentially useful hyperparameter value can take months, so the search space is usually narrowed down to a finite set of promising options. Hyperparameter tuning therefore requires a certain level of expert knowledge and experience to strike the right trade-off between exploration (trying out different not-yet-explored options) and exploitation (narrowing down the parameter values to the ones that generally seem to work well).
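To make the repeated train-and-evaluate loop concrete, here is a minimal sketch of exhaustive grid search over a small, finite search space. The parameter names, candidate values, and the `train_and_evaluate` scoring function are all hypothetical stand-ins; in practice that function would run a full, expensive training and validation cycle.

```python
from itertools import product

# Hypothetical search space, already narrowed down to a few plausible options.
search_space = {
    "learning_rate": [1e-3, 1e-2],
    "hidden_size": [64, 128],
}

def train_and_evaluate(params):
    # Placeholder for the real (expensive) train + validate step.
    # Here it simply scores each configuration by its closeness to an
    # arbitrary "good" setting, so the loop has something to maximize.
    return (-abs(params["learning_rate"] - 1e-2)
            - abs(params["hidden_size"] - 128) / 1000)

best_params, best_score = None, float("-inf")
keys = list(search_space)
# Try every combination of candidate values and keep the best one.
for values in product(*search_space.values()):
    params = dict(zip(keys, values))
    score = train_and_evaluate(params)
    if score > best_score:
        best_params, best_score = params, score

print(best_params)  # the highest-scoring configuration found
```

Even this toy loop runs `train_and_evaluate` once per combination, which is exactly why the cost grows multiplicatively with each hyperparameter added to the grid.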
In this chapter, we learned why hyperparameter tuning is useful, when to use it, and how. We learned...