Hyperparameter optimization in Flair
As we learned in Chapter 5, Training Sequence Labeling Models, the success of model training often depends on a potentially large number of correctly set hyperparameters. This creates the need to find a set of hyperparameter values that yields optimal performance, which is exactly what hyperparameter optimization does. Luckily, Flair offers hyperparameter tuning out of the box. Let's learn how to do it.
Hyperparameter optimization in Flair is essentially a wrapper around Hyperopt, which we briefly covered in the previous section. The extra advantage of this wrapper is that it already feeds some sequence-tagging-specific information into Hyperopt so that we don't have to. In the bare-bones Hyperopt coding exercise, we had to provide all three objects: the search space, the optimization method, and the objective function. In Flair, we only need to define the search space, and the framework will do the rest.