Summary
Transformers have proved highly successful across a range of natural language tasks, with many recently released chatbots outperforming earlier models in their ability to understand and manipulate human language. In this chapter, we looked at how transformers can be applied to the task of assigning emotions to informal texts and investigated how well they perform on this task across a range of datasets. We began with a brief look at transformers, focusing on the individual components of a transformer and how data flows through them. Transformers need a great deal of data to be effective and produce good results, as well as substantial computing power and training time. We then introduced Hugging Face, discussed why it is useful, and surveyed some of the more common pretrained models available on the Hugging Face platform, before moving on to how transformers are used for classification. Finally, we showed how to code classifiers using transformers.
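As a compact reminder of how data flows through a transformer's core component, the following is a minimal sketch of scaled dot-product self-attention in plain NumPy. The function name, toy dimensions, and random inputs are illustrative choices of our own, not taken from any particular library:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """One attention step: mix the value vectors V, weighting each one
    by how well its key in K matches each query in Q."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                            # weighted mix of values

# Toy example: 3 tokens, each a 4-dimensional embedding.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)       # self-attention: Q = K = V = X
print(out.shape)                                  # (3, 4): one context-aware vector per token
```

In a full transformer layer this step is repeated across several attention heads and followed by a feed-forward sublayer, but the shape of the data flow is the same: each token's output is a softmax-weighted combination of every token's value vector.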