Natural Language Understanding with Python

By: Deborah A. Dahl

Overview of this book

Natural Language Understanding facilitates the organization and structuring of language, allowing computer systems to effectively process textual information for a variety of practical applications. Natural Language Understanding with Python will help you explore practical techniques for harnessing NLU to create diverse applications. With step-by-step explanations of essential concepts and practical examples, you’ll begin by learning about NLU and its applications. You’ll then explore a wide range of current NLU techniques and their most appropriate use cases. In the process, you’ll be introduced to the most useful Python NLU libraries. Not only will you learn the basics of NLU, but you’ll also discover practical issues such as acquiring data, evaluating systems, and deploying NLU applications, along with their solutions. The book is a comprehensive guide that’ll help you explore techniques and resources that can be used for different applications in the future. By the end of this book, you’ll be well-versed in the concepts of natural language understanding, deep learning, and large language models (LLMs) for building various AI-based applications.
Table of Contents (21 chapters)

Part 1: Getting Started with Natural Language Understanding Technology
Part 2: Developing and Testing Natural Language Understanding Systems
Part 3: Systems in Action – Applying Natural Language Understanding at Scale

Using BERT – a classification example

In this example, we’ll apply BERT to a classification task, using the movie review dataset we saw in earlier chapters. We will start with a pretrained BERT model and fine-tune it to classify movie reviews. This is the same process you can follow if you want to apply BERT to your own data.
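
Before fine-tuning, the reviews need to be loaded in a form that TensorFlow can feed to the model in batches. The following is a minimal sketch of one way to do this, assuming the reviews are stored on disk as plain-text files in pos and neg subdirectories; the movie_reviews paths, batch size, and 80/20 validation split are illustrative assumptions, not the book’s exact code:

```python
import tensorflow as tf

# Illustrative paths: assumes a directory layout like
#   movie_reviews/train/pos, movie_reviews/train/neg,
#   movie_reviews/test/pos,  movie_reviews/test/neg
# Labels (0/1) are inferred from the subdirectory names.
BATCH_SIZE = 32
SEED = 42

train_ds = tf.keras.utils.text_dataset_from_directory(
    "movie_reviews/train", batch_size=BATCH_SIZE,
    validation_split=0.2, subset="training", seed=SEED)
val_ds = tf.keras.utils.text_dataset_from_directory(
    "movie_reviews/train", batch_size=BATCH_SIZE,
    validation_split=0.2, subset="validation", seed=SEED)
test_ds = tf.keras.utils.text_dataset_from_directory(
    "movie_reviews/test", batch_size=BATCH_SIZE)

# Cache and prefetch so fine-tuning isn't bottlenecked by file reads
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().prefetch(AUTOTUNE)
val_ds = val_ds.cache().prefetch(AUTOTUNE)
test_ds = test_ds.cache().prefetch(AUTOTUNE)
```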

Using BERT for a specific application starts with one of the pretrained models available from TensorFlow Hub (https://tfhub.dev/tensorflow), which is then fine-tuned with training data that is specific to that application. It is recommended to start with one of the small BERT models, which have the same architecture as BERT but are faster to train. Generally, the smaller models are less accurate, but if their accuracy is adequate for the application, it isn’t necessary to spend the extra time and compute resources that a larger model would require. Many models of various sizes can be downloaded from TensorFlow Hub.
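
As a concrete illustration, here is a minimal sketch of how one of the small BERT encoders and its matching preprocessing model can be wired into a Keras classifier and fine-tuned. The TensorFlow Hub handles, dropout rate, learning rate, and single-logit output for a positive/negative decision are example choices (check TensorFlow Hub for current model versions), and a plain Adam optimizer is used here instead of the AdamW schedule often recommended for BERT fine-tuning:

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text  # registers the ops used by the BERT preprocessing model

# Example handles for a small BERT encoder and its matching preprocessor;
# browse https://tfhub.dev/tensorflow for other sizes and newer versions.
PREPROCESS_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_HANDLE = "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/1"

def build_classifier():
    # Raw review text goes in; the preprocessing layer handles tokenization
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="review_text")
    preprocess = hub.KerasLayer(PREPROCESS_HANDLE, name="preprocessing")
    encoder = hub.KerasLayer(ENCODER_HANDLE, trainable=True, name="bert_encoder")
    outputs = encoder(preprocess(text_input))
    pooled = outputs["pooled_output"]              # summary vector for the whole review
    x = tf.keras.layers.Dropout(0.1)(pooled)
    logits = tf.keras.layers.Dense(1, name="classifier")(x)  # one logit: positive vs. negative
    return tf.keras.Model(text_input, logits)

model = build_classifier()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# Fine-tune on the movie review batches prepared earlier
# model.fit(train_ds, validation_data=val_ds, epochs=3)
```

Setting trainable=True on the encoder layer is what makes this fine-tuning rather than feature extraction: the pretrained BERT weights are updated along with the new classification head.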

BERT models...