Transformers for Natural Language Processing - Second Edition

By: Denis Rothman
Overview of this book

Transformers are...well...transforming the world of AI. There are many platforms and models out there, but which ones best suit your needs? Transformers for Natural Language Processing, 2nd Edition, guides you through the world of transformers, highlighting the strengths of different models and platforms, while teaching you the problem-solving skills you need to tackle model weaknesses. You'll use Hugging Face to pretrain a RoBERTa model from scratch, from building the dataset to defining the data collator to training the model. If you're looking to fine-tune a pretrained model, including GPT-3, then Transformers for Natural Language Processing, 2nd Edition, shows you how with step-by-step guides. The book investigates machine translation, speech-to-text, text-to-speech, question answering, and many more NLP tasks. It provides techniques to solve hard language problems and may even help with fake news anxiety (read Chapter 13 for more details). You'll see how cutting-edge platforms, such as OpenAI, have taken transformers beyond language into computer vision tasks and code creation using DALL-E 2, ChatGPT, and GPT-4. By the end of this book, you'll know how transformers work and how to implement them and resolve issues like an AI detective.

Step 5: Downloading the 345M-parameter GPT-2 model

We will now download the trained 345M-parameter GPT-2 model:

#@title Step 5: Downloading the 345M parameter GPT-2 Model
# run download_model.py, passing the model size as its argument
import os  # re-import os after the runtime is restarted
os.chdir("/content/gpt-2")
!python3 download_model.py '345M'

The path to the model directory is:

/content/gpt-2/models/345M

It contains the information we need to run the model:

Figure III.4: The GPT-2 Python files of the 345M-parameter model
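If you want to confirm the download without the figure, here is a minimal sketch, assuming the Colab path shown above, that lists the files download_model.py fetched:

import os

model_dir = "/content/gpt-2/models/345M"
# Print each downloaded file with its size in bytes
for filename in sorted(os.listdir(model_dir)):
    size = os.path.getsize(os.path.join(model_dir, filename))
    print(filename, size, "bytes")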

The hparams.json file contains the definition of the GPT-2 model, which the sketch after this list verifies:

  • "n_vocab": 50257, the size of the vocabulary of the model
  • "n_ctx": 1024, the context size
  • "n_embd": 1024, the embedding size
  • "n_head": 16, the number of heads
  • "n_layer": 24, the number of layers

encoder.json and vocab.bpe contain the tokenized vocabulary and the BPE word pairs. If necessary, take a few...
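To take that look from the notebook, here is a minimal sketch, assuming the same model directory as above, that opens both files:

import json
import os

model_dir = "/content/gpt-2/models/345M"

# encoder.json maps token strings to integer IDs
with open(os.path.join(model_dir, "encoder.json")) as f:
    encoder = json.load(f)
print("vocabulary size:", len(encoder))  # 50257, matching n_vocab

# vocab.bpe lists the learned BPE merges, one pair per line
with open(os.path.join(model_dir, "vocab.bpe"), encoding="utf-8") as f:
    merges = [line.split() for line in f if not line.startswith("#")]
print("BPE merge pairs:", len(merges))
print("first merges:", merges[:3])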