This chapter introduces the mechanism of attention, which enables neural networks to improve their performance by focusing on the relevant parts of their inputs or memories.
With such a mechanism, tasks such as translation, annotation, explanation, and segmentation, as seen in the previous chapter, achieve greater accuracy.
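The core idea can be sketched in a few lines: score every stored item against a query, turn the scores into weights with a softmax, and return the weighted sum. The function and variable names below are illustrative, not from any particular library; this is a minimal dot-product variant, assuming keys and values are plain NumPy arrays.

```python
import numpy as np

def attention(query, keys, values):
    # Score each key against the query (dot product), then normalize
    # with softmax so the weights sum to 1.
    scores = keys @ query                     # shape: (n_items,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The output is a weighted sum of the values: items whose keys
    # align with the query contribute more.
    return weights @ values                   # shape: (d_value,)

# Toy example: three stored items; the query aligns with the second
# and third keys, so their values dominate the result.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 5.0])
out = attention(query, keys, values)
```

Because the weighting is soft (every item contributes a little), the whole operation is differentiable and can be trained end to end with the rest of the network.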
A neural network's inputs and outputs may also be connected to reads and writes to an external memory. Such networks, called memory networks, are enhanced with an external memory and are capable of deciding what information to store or retrieve, and where.
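The read and write operations can themselves be framed as attention over memory slots, which keeps everything differentiable. The sketch below is a toy illustration of that idea under assumed names (ExternalMemory, read, write); the erase-then-add write rule follows the style used by Neural Turing Machines, but this is not any specific paper's model.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class ExternalMemory:
    """Content-addressed memory: reads and writes are soft attention
    over slots, so the whole operation stays differentiable."""
    def __init__(self, slots, width):
        self.M = np.zeros((slots, width))

    def read(self, key):
        # Weight each slot by its similarity to the key, return the blend.
        w = softmax(self.M @ key)
        return w @ self.M

    def write(self, key, erase, add):
        # Erase a fraction of each addressed slot, then add the new
        # content in proportion to the address weights.
        w = softmax(self.M @ key)[:, None]
        self.M = self.M * (1 - w * erase) + w * add

# Write one vector, then read it back with a content query.
mem = ExternalMemory(slots=4, width=3)
mem.write(key=np.ones(3), erase=np.ones(3), add=np.array([1.0, 2.0, 3.0]))
r = mem.read(key=np.array([1.0, 2.0, 3.0]))
```

Because the memory starts empty here, the write spreads the vector uniformly across the four slots, and the read returns the corresponding uniform blend; a trained controller would learn to produce sharper, more selective address weights.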
In this chapter, we'll discuss:
The mechanism of attention
Aligning translations
Focus in images
Neural Turing Machines
Memory networks
Dynamic memory networks