
Transformers for Natural Language Processing

By: Denis Rothman

Overview of this book

The transformer architecture has proved revolutionary, outperforming the classical RNN and CNN models in use today. With an apply-as-you-learn approach, Transformers for Natural Language Processing explores deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. The book takes you through NLP with Python and examines eminent models and datasets within the transformer architecture created by pioneers such as Google, Facebook, Microsoft, OpenAI, and Hugging Face.

The book trains you in three stages. The first stage introduces you to transformer architectures, starting with the original Transformer before moving on to RoBERTa, BERT, and DistilBERT models; you will discover training methods for smaller transformers that can outperform GPT-3 in some cases. In the second stage, you will apply transformers to Natural Language Understanding (NLU) and Natural Language Generation (NLG). Finally, the third stage will help you grasp advanced language understanding techniques such as optimizing social network datasets and fake news identification.

By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models from tech giants to various datasets.
Table of Contents (16 chapters)

Preface

Transformers are a game-changer for Natural Language Understanding (NLU), a subset of Natural Language Processing (NLP), which has become one of the pillars of artificial intelligence in a global digital economy.

The global economy has been moving from the physical world to the digital world.

We are witnessing the expansion of social networks versus physical encounters, e-commerce versus physical shopping, digital newspapers versus print, streaming versus physical theaters, remote doctor consultations versus physical visits, remote work instead of on-site tasks, and similar trends in hundreds more domains.

Artificial intelligence-driven language understanding will continue to expand exponentially, as will the volumes of data these activities generate. Language understanding has become the pillar of language modeling, chatbots, personal assistants, question answering, text summarization, speech-to-text, sentiment analysis, machine translation, and more.

Without AI language understanding, it would be incredibly difficult for society to use the Internet.

The Transformer architecture is both revolutionary and disruptive. The Transformer and subsequent transformer architectures and models are revolutionary because they changed the way we think of NLP and artificial intelligence itself. The architecture of the Transformer is not an evolution. It breaks with the past, leaving RNNs and CNNs behind. It takes us closer to seamless machine intelligence that will match human intelligence in the years to come.

The Transformer and subsequent transformer architectures, concepts, and models are disruptive. The various transformers we will explore in this book will progressively replace NLP as we knew it before their arrival.

Think of how many humans it would take to moderate the billions of messages posted on social networks per day, deciding whether they are legal and ethical and extracting the information they contain. Think of how many humans would be required to translate the millions of pages published each day on the web, or to manually review the millions of messages posted per minute. Think of how many humans it would take to transcribe the vast number of hours of streaming published per day. Finally, think about the human resources required to replace AI image captioning for the billions of images that continuously appear online.

This leads us to a deeper aspect of artificial intelligence. In a world of exponentially growing data, AI performs more tasks than humans could ever perform. Think of how many translators would be required just to translate one billion online messages, whereas machine translation has no quantitative limits.

This book will show you how to improve language understanding. Each chapter will take you through the key aspects of language understanding from scratch in Python, PyTorch, and TensorFlow.

The demand for language understanding keeps increasing daily in many fields, such as media, social networks, and research. Among hundreds of AI tasks, we need to summarize vast amounts of data for research, translate documents for every area of our economy, and scan all social media posts for ethical and legal reasons.

Progress needed to be made. The Transformer, introduced by Google, provides a novel approach to language understanding through its self-attention architecture. OpenAI offers transformer technology, and Facebook's AI Research department provides high-quality datasets. Overall, the Internet giants have made transformers available to all, as we will discover in this book.
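The mechanism at the heart of that architecture, scaled dot-product self-attention, can be sketched in a few lines of NumPy. This is a simplified illustration for intuition only, not code from the book: in a real transformer, the queries, keys, and values are learned linear projections of the input, and attention runs over multiple heads.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each value by how well its key matches the query,
    scaling the scores by sqrt(d_k) to stabilize the softmax."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

# Toy example: a sequence of 3 tokens with 4-dimensional embeddings.
# In self-attention, queries, keys, and values all come from the same input.
x = np.random.rand(3, 4)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4): one context-aware vector per token
```

Each output row is a mixture of all input rows, which is how every token can attend to every other token in a single step, with no recurrence or convolution involved.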

Transformers can outperform the classical RNN and CNN models in use today. English-to-French and English-to-German transformer translation models provide better results than GNMT (RNN), ConvS2S (CNN), and SliceNet (CNN), for example.

Throughout the book, you will work hands-on with Python, PyTorch, and TensorFlow. You will be introduced to the key AI language understanding neural network models. You will then learn how to explore and implement transformers.

The book's goal is to give readers the knowledge and tools for Python deep learning that are needed for effectively developing the key aspects of language understanding.