Advanced Word Vector Algorithms
In Chapter 3, Word2vec – Learning Word Embeddings, we introduced you to Word2vec, the basics of learning word embeddings, and the two common Word2vec algorithms: skip-gram and CBOW. In this chapter, we will discuss several other word vector algorithms:
- GloVe – Global Vectors
- ELMo – Embeddings from Language Models
- Document classification with ELMo
First, you will learn about a word embedding technique known as Global Vectors (GloVe) and the specific advantages it has over skip-gram and CBOW.
You will also look at a recent approach for representing language called Embeddings from Language Models (ELMo). ELMo has an edge over other algorithms in that it can disambiguate word meanings as well as capture semantics. Specifically, ELMo generates "contextualized" word representations by using a given word along with its surrounding words, as opposed to treating a word's representation as a fixed vector that stays the same regardless of context.
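The difference between static and contextualized representations can be illustrated with a toy sketch. In the code below, a static lookup table returns the same vector for "bank" in every sentence, while a crude stand-in for contextualization (mixing a word's vector with the average of its neighbors' vectors) yields different vectors for "bank" depending on the sentence. Note that this mixing function, the tiny vocabulary, and the random vectors are all illustrative assumptions; real ELMo derives its contextual representations from bidirectional LSTM language models, which we cover later in this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy static embedding table (word -> fixed vector), purely illustrative.
vocab = ["the", "river", "bank", "opened", "money", "flooded"]
static = {w: rng.normal(size=4) for w in vocab}

def static_vector(sentence, i):
    # A static model ignores context entirely: same word, same vector.
    return static[sentence[i]]

def contextual_vector(sentence, i, window=2):
    # Crude stand-in for contextualization: mix the word's own vector
    # with the average of its neighbors' vectors. Real ELMo uses
    # bidirectional LSTM language models instead of this averaging.
    lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
    ctx = [static[w] for j, w in enumerate(sentence[lo:hi], lo) if j != i]
    return 0.5 * static[sentence[i]] + 0.5 * np.mean(ctx, axis=0)

s1 = ["the", "river", "bank", "flooded"]
s2 = ["the", "money", "bank", "opened"]
i1, i2 = s1.index("bank"), s2.index("bank")

# Static: identical vectors for "bank" in both sentences.
print(np.allclose(static_vector(s1, i1), static_vector(s2, i2)))
# Contextual: different vectors, because the surrounding words differ.
print(np.allclose(contextual_vector(s1, i1), contextual_vector(s2, i2)))
```

Running this prints `True` for the static comparison and `False` for the contextual one, capturing the core idea: a word like "bank" should receive different representations in "river bank" versus "money bank".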