Decoding the original transformer architecture holistically
Before we look into the structure of the model, let’s talk about the basic intent of transformers.
As we covered in the previous chapter, transformers are also a family of architectures built around the concepts of an encoder and a decoder. The encoder encodes data into what is known as the code, and the decoder decodes the code back into a format that resembles the raw, unprocessed data. The very first transformer used both the encoder and the decoder to build the entire architecture and demonstrated its application in text generation, specifically machine translation. Subsequent adaptations and improvements used either only the encoder or only the decoder to achieve different tasks (for example, BERT keeps only the encoder stack, while GPT keeps only the decoder stack). In a transformer, however, the encoder's goal is not to compress the data into a smaller, more compact representation, but mainly to serve as a feature extractor. Additionally, the decoder's goal for transformers is not...
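To make the encoder-as-feature-extractor and decoder roles concrete, here is a minimal sketch of the original encoder-decoder layout using PyTorch's built-in `nn.Transformer` module. The vocabulary size, sequence lengths, and the omission of positional encodings are simplifying assumptions for illustration, not details taken from the original paper.

```python
import torch
import torch.nn as nn

# Illustrative sizes (assumptions, not values prescribed by the text).
vocab_size, d_model = 10_000, 512

embed = nn.Embedding(vocab_size, d_model)
transformer = nn.Transformer(
    d_model=d_model,
    nhead=8,
    num_encoder_layers=6,
    num_decoder_layers=6,
    batch_first=True,
)
to_logits = nn.Linear(d_model, vocab_size)

# Toy source and target token IDs: batch of 2, lengths 7 and 5.
src_tokens = torch.randint(0, vocab_size, (2, 7))
tgt_tokens = torch.randint(0, vocab_size, (2, 5))

# The encoder acts as a feature extractor: it turns the source sequence into a
# contextualized representation (often called "memory"), not a compressed bottleneck.
# Positional encodings are omitted here for brevity.
memory = transformer.encoder(embed(src_tokens))            # shape: (2, 7, 512)

# The decoder attends over that memory while processing the target sequence,
# and a final linear layer maps its features to next-token predictions.
decoded = transformer.decoder(embed(tgt_tokens), memory)   # shape: (2, 5, 512)
logits = to_logits(decoded)                                # shape: (2, 5, vocab_size)
```

Note that the encoder output `memory` has the same length and dimensionality as the embedded input rather than being squeezed into a narrow bottleneck, which is exactly the "feature extractor, not compressor" point made above.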