Hands-On Python Natural Language Processing

By: Aman Kedia, Mayank Rasu
Overview of this book

Natural Language Processing (NLP) is a subfield of computational linguistics that enables computers to understand, process, and analyze text. This book caters to the unmet demand for hands-on training in NLP concepts and provides exposure to real-world applications along with a solid theoretical grounding. It starts by introducing you to the field of NLP and its applications, along with the modern Python libraries that you'll use to build your NLP-powered apps. With the help of practical examples, you'll learn how to build reasonably sophisticated NLP applications and cover various methodologies and challenges in deploying NLP applications in the real world. You'll cover key NLP tasks such as text classification, semantic embedding, sentiment analysis, machine translation, and developing a chatbot using machine learning and deep learning techniques. The book will also help you discover how machine learning techniques play a vital role in making your linguistic apps smart. Every chapter is accompanied by examples of real-world applications to help you build impressive NLP applications of your own. By the end of this NLP book, you'll be able to work with language data, use machine learning to identify patterns in text, and get acquainted with the advancements in NLP.
Table of Contents (16 chapters)

Section 1: Introduction
Section 2: Natural Language Representation and Mathematics
Section 3: NLP and Learning

Exploring memory-based variants of the RNN architecture

Before we close this chapter, we will briefly look at GRUs and stacked LSTMs.

GRUs

As we saw, LSTMs are large networks with many parameters, and updating all of those parameters at every training step is computationally expensive. Can we do better?

Yes! GRUs can help us here.
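To make the savings concrete, here is a back-of-the-envelope parameter count. This is a hedged sketch: it counts only the per-cell gate weights and biases (LSTM has four weighted transformations, GRU three) and ignores any framework-specific extras; the sizes chosen are illustrative.

```python
def rnn_gate_params(n_input, n_hidden, n_gates):
    """Parameters for a recurrent cell whose gates each apply
    weights to the concatenated [hidden, input] vector plus a bias."""
    return n_gates * (n_hidden * (n_input + n_hidden) + n_hidden)

n_in, n_hid = 128, 256  # illustrative sizes
lstm_params = rnn_gate_params(n_in, n_hid, 4)  # input, forget, output gates + cell candidate
gru_params = rnn_gate_params(n_in, n_hid, 3)   # update, reset gates + candidate state

print(lstm_params)  # 394240
print(gru_params)   # 295680 -- 3/4 of the LSTM's parameters
```

Whatever the exact layer sizes, dropping one gate removes a quarter of the recurrent weight matrices, which is where the GRU's speed-up comes from.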

GRUs use only two gates instead of the three we used in LSTMs. They combine the forget gate and the candidate-choice part of the input gate into a single gate, called the update gate. The other gate is the reset gate, which decides how much of the previous memory should be mixed with the newly computed information. The outputs of these two gates determine what to send across as the output from the cell and how the hidden state is to be updated. This is done via a candidate state (sometimes called the content state), which holds the new information. As a result, the number of parameters in the network is drastically reduced.
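The mechanics above can be sketched as a single GRU forward step in NumPy. This is a minimal illustration, assuming the common formulation in which each gate applies its weights to the concatenated [previous hidden state, input] vector; the variable names and sizes are ours, not the book's.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, Wz, Wr, Wh, bz, br, bh):
    """One GRU time step. Each weight matrix acts on [h_prev, x]."""
    hx = np.concatenate([h_prev, x])
    z = sigmoid(Wz @ hx + bz)  # update gate: how much new content to let in
    r = sigmoid(Wr @ hx + br)  # reset gate: how much old memory to expose
    # candidate (content) state, computed from the reset-scaled memory
    h_tilde = np.tanh(Wh @ np.concatenate([r * h_prev, x]) + bh)
    # blend old state and candidate according to the update gate
    return (1.0 - z) * h_prev + z * h_tilde

# Tiny usage example: run a 5-step sequence through one cell
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
weights = [rng.standard_normal((n_hid, n_hid + n_in)) for _ in range(3)]
biases = [np.zeros(n_hid) for _ in range(3)]
h = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h = gru_cell(x, h, *weights, *biases)
print(h.shape)  # (3,)
```

Note that the cell keeps a single hidden state vector, unlike the LSTM's separate cell and hidden states; the update gate alone decides the trade-off between keeping the old state and adopting the candidate.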

You can read more about GRUs here:...