Getting Started with Google BERT

By: Sudharsan Ravichandiran
4.2 (50)
Overview of this book

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work. You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to use pre-trained BERT for downstream tasks by fine-tuning it for NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT, such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through MBERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT. By the end of this BERT book, you'll be well-versed in using BERT and its variants for performing practical NLP tasks.
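
The book works through such tasks with the Hugging Face transformers library. As a rough, minimal sketch (not code from the book), loading a pre-trained BERT model with a classification head for a sentiment analysis task might look as follows, assuming the bert-base-uncased checkpoint and two sentiment labels:

from transformers import BertTokenizer, BertForSequenceClassification
import torch

# Load pre-trained BERT with a (randomly initialized) two-class head.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained('bert-base-uncased',
                                                      num_labels=2)

# Tokenize a sample review and run it through the model.
inputs = tokenizer("I loved this movie", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The head only gives meaningful predictions after fine-tuning on a
# labelled sentiment dataset; here it simply illustrates the API.
predicted_label = logits.argmax(dim=-1).item()
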
Table of Contents (15 chapters)
Section 1 - Starting Off with BERT
Section 2 - Exploring BERT Variants
Section 3 - Applications of BERT
BERT Variants I - ALBERT, RoBERTa, ELECTRA, and SpanBERT

In this chapter, we will understand different variants of BERT, such as ALBERT, RoBERTa, ELECTRA, and SpanBERT. We will start by understanding how ALBERT works. ALBERT is essentially A Lite version of BERT; it introduces a few architectural changes to BERT in order to reduce training time. We will cover how ALBERT works and how it differs from BERT in detail.
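
As a quick illustration (a minimal sketch assuming the Hugging Face transformers library and the albert-base-v2 checkpoint, not code from this chapter), ALBERT can be loaded and queried for token representations in much the same way as BERT:

from transformers import AlbertTokenizer, AlbertModel

# Load the pre-trained ALBERT model and its SentencePiece tokenizer.
tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')
model = AlbertModel.from_pretrained('albert-base-v2')

# Get the contextual representation of every token in the sentence.
inputs = tokenizer("Paris is a beautiful city", return_tensors="pt")
outputs = model(**inputs)

# Shape: [batch_size, sequence_length, hidden_size]
print(outputs.last_hidden_state.shape)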

Moving on, we will learn about the RoBERTa model, which stands for Robustly Optimized BERT pre-training Approach. RoBERTa is one of the most popular variants of BERT and is used in many state-of-the-art systems. RoBERTa works similarly to BERT, but with a few changes in the pre-training steps. We will explore how RoBERTa works and how it differs from BERT in detail.
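
For comparison (again a minimal sketch assuming the Hugging Face transformers library and the roberta-base checkpoint, not code from this chapter), RoBERTa is used in exactly the same way; its differences from BERT, such as dynamic masking and dropping the next sentence prediction objective, lie in pre-training rather than in how the model is called:

from transformers import RobertaTokenizer, RobertaModel

# Load the pre-trained RoBERTa model and its byte-level BPE tokenizer.
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = RobertaModel.from_pretrained('roberta-base')

# As with BERT and ALBERT, we get one contextual vector per token.
inputs = tokenizer("Paris is a beautiful city", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)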

Going ahead, we will learn about the ELECTRA model, which stands for...
