Natural Language Processing with Python Quick Start Guide

By: Nirant Kasliwal

Overview of this book

NLP in Python is among the most sought-after skills among data scientists. With code and relevant case studies, this book will show how you can use industry-grade tools to implement NLP programs capable of learning from relevant data. We will explore many modern methods, ranging from spaCy to word vectors, that have reinvented NLP. The book takes you from the basics of NLP to building text processing applications. We start with an introduction to the basic vocabulary, along with a workflow for building NLP applications. We use industry-grade NLP tools for cleaning and pre-processing text, automatic question and answer generation using linguistics, text embedding, text classification, and building a chatbot. With each project, you will learn a new concept of NLP. You will learn about entity recognition, part-of-speech tagging, and dependency parsing for Q and A. We use text embedding for both clustering documents and making chatbots, and then build classifiers using scikit-learn. We conclude by deploying these models as REST APIs with Flask. By the end, you will be confident building NLP applications, and will know exactly what to look for when approaching new challenges.
Table of Contents (10 chapters)

Linguistics and NLP

This section is dedicated to introducing you to the ideas and tools that have been around for several decades of linguistics. The most traditional way to introduce this is to take an idea, talk about it at length, and then put all of it together.

Here, I am going to do this the other way around. We will solve two problems and, in the process, look at the tools we will be using. Instead of talking to you about a number 8 spanner, I am giving you a car engine and the tools, and I will introduce the tools as I use them.

Most NLP tasks are solved in a sequential pipeline, with the results from one component feeding into the next.
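To make the sequential-pipeline idea concrete, here is a minimal pure-Python sketch (illustrative toy functions only, not the spaCy pipeline the chapter actually uses), where each stage consumes the output of the previous one:

```python
from collections import Counter

def tokenize(text):
    """Stage 1: split raw text into lowercase word tokens."""
    return text.lower().split()

def remove_stopwords(tokens, stopwords=frozenset({"a", "an", "the", "is", "on"})):
    """Stage 2: drop common words that carry little signal."""
    return [t for t in tokens if t not in stopwords]

def count_terms(tokens):
    """Stage 3: term frequencies for downstream components."""
    return Counter(tokens)

def pipeline(text):
    # Results from one component feed directly into the next.
    return count_terms(remove_stopwords(tokenize(text)))

freqs = pipeline("The cat sat on the mat")
print(freqs["cat"])  # 1
```

Real pipelines (tagging, parsing, entity recognition) follow the same shape: an ordered chain of components, each enriching or transforming what it receives.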

There is a wide variety of data structures that are used to store pipeline results and intermediate steps. Here, for simplicity, I am going to use only the data structures that are already in spaCy and the native Python ones like...
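For instance, intermediate pipeline results fit comfortably in native Python containers (the tags below are hand-written for illustration; in the book, such annotations come from spaCy's own objects):

```python
# Native Python containers holding intermediate pipeline results.
# Tokens live in an ordered list; annotations in a dict keyed by token index.
tokens = ["I", "like", "NLP"]
pos_tags = {0: "PRON", 1: "VERB", 2: "PROPN"}  # hypothetical hand-assigned tags

# Zip them into (token, tag) pairs for the next pipeline stage.
annotated = [(tok, pos_tags[i]) for i, tok in enumerate(tokens)]
print(annotated)  # [('I', 'PRON'), ('like', 'VERB'), ('NLP', 'PROPN')]
```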