Natural Language Processing with Flair
Word embeddings play an important, if not essential, role in the performance of sequence tagging models. What embeddings are is covered in detail in Chapter 3, Embeddings in Flair, but for the purposes of this chapter it's enough to understand that embeddings are word representations, most often taking the form of real-valued vectors. These vectors can then be used as input to a number of downstream tasks, such as part-of-speech (PoS) tagging and named entity recognition (NER). Let's first quickly cover how Flair generates embeddings and how they are trained.
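To make the idea of "words as real-valued vectors" concrete, here is a minimal toy sketch. This is not Flair's API and the vectors are made up for illustration; it only shows how such vectors can be compared, with semantically related words ending up closer together:

```python
# Toy illustration (not Flair's API): word embeddings as
# real-valued vectors, compared via cosine similarity.
# The 4-dimensional vectors below are invented for this example.
import math

embeddings = {
    "cat": [0.9, 0.1, 0.3, 0.2],
    "dog": [0.8, 0.2, 0.35, 0.25],
    "car": [0.1, 0.9, 0.05, 0.7],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# "cat" should be more similar to "dog" than to "car".
print(cosine(embeddings["cat"], embeddings["dog"]) >
      cosine(embeddings["cat"], embeddings["car"]))  # True
```

Real embeddings are of course learned from data and typically have hundreds of dimensions, but the principle is the same: downstream models consume these vectors rather than raw strings.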
The term training embeddings often causes confusion. It dates back to older methods, where the result of training was a fixed set of word embeddings. The term makes less sense with Flair's way of training embeddings, and even less so given that...