Word embeddings capture the meaning of words: they translate a discrete input into a continuous representation that can be processed by neural networks.
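As a minimal sketch of this idea, the snippet below builds a toy embedding table with NumPy and looks up the vectors for a short sentence. The vocabulary, dimensions, and random initialization are illustrative assumptions; in practice the embedding matrix is learned during training.

```python
import numpy as np

# Hypothetical toy vocabulary: each word gets a discrete integer index.
vocab = {"the": 0, "cat": 1, "sat": 2}
embedding_dim = 4

# The embedding matrix maps each index to a dense vector.
# It is randomly initialized here; a real model learns these values.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

# Embedding lookup: a sequence of discrete tokens becomes
# a sequence of continuous vectors a neural net can process.
sentence = ["the", "cat", "sat"]
indices = [vocab[w] for w in sentence]
vectors = embedding_matrix[indices]  # shape: (3, 4)
```

The lookup is just an indexing operation into the matrix, which is why embedding layers are cheap at inference time: each word is replaced by its row of the matrix.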
Embeddings are the start of many applications linked to language:
- Generating texts, as we'll see in the next chapter
- Translation systems, where input and target sentences are sequences of words whose embeddings can be processed by end-to-end neural nets (Chapter 8, Translating and Explaining with Encoding-decoding Networks)
- Sentiment analysis (Chapter 5, Analyzing Sentiment with a Bidirectional LSTM)
- Zero-shot learning in computer vision, where the structure of the word embedding space enables us to find classes for which no training images exist
- Image annotation/captioning
- Neuropsychiatry, where neural nets can predict some psychiatric disorders in human beings with 100% accuracy
- Chatbots, or answering questions from a user (Chapter 9, Selecting Relevant Inputs or Memories with the Mechanism of Attention)
As with words, the principle of...