
Python Text Processing with NLTK 2.0 Cookbook

By : Jacob Perkins

Overview of this book

Natural Language Processing is used everywhere: in search engines, spell checkers, mobile phones, computer games, even your washing machine. Python's Natural Language Toolkit (NLTK) suite of libraries has rapidly emerged as one of the most efficient tools for Natural Language Processing. If you want to employ nothing less than the best techniques in Natural Language Processing, this book is your answer.

Python Text Processing with NLTK 2.0 Cookbook is your handy and illustrative guide, walking you through Natural Language Processing techniques step by step. It demystifies the advanced features of text analysis and text mining using the comprehensive NLTK suite.

This book cuts the preamble short: you dive right into the science of text processing with a practical, hands-on approach.

Get started with tokenization of text. Get an overview of WordNet and how to use it. Learn the basics as well as advanced features of stemming and lemmatization. Discover various ways to replace words with simpler and more common (read: more searched) variants. Create your own corpora and learn to create custom corpus readers for JSON files as well as for data stored in MongoDB. Use and manipulate part-of-speech taggers. Transform and normalize parsed chunks to produce a canonical form without changing their meaning. Dig into feature extraction and text classification. Learn how to handle huge amounts of data without any loss in efficiency or speed.

This book will teach you all that and beyond, in a hands-on, learn-by-doing manner. Make yourself an expert in using NLTK for Natural Language Processing with this handy companion.
Table of Contents (16 chapters)
Python Text Processing with NLTK 2.0 Cookbook
Credits
About the Author
About the Reviewers
Preface
Penn Treebank Part-of-Speech Tags
Index

Training a maximum entropy classifier


The third classifier we will cover is the MaxentClassifier, also known as a conditional exponential classifier. The maximum entropy classifier converts labeled feature sets to vectors using an encoding. The encoded vectors are then used to calculate weights for each feature, and these weights are combined to determine the most likely label for a feature set.
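To make the encode-weight-combine idea concrete, here is a minimal pure-Python sketch of a conditional exponential model. The feature names and weight values are hypothetical, chosen only for illustration; in practice NLTK's MaxentClassifier learns the weights during training rather than having them supplied by hand.

```python
import math

# Hypothetical hand-picked weights for joint (feature, label) pairs.
# A trained MaxentClassifier would learn these values from the data.
weights = {
    ('contains(great)', 'pos'): 1.2,
    ('contains(great)', 'neg'): -0.8,
    ('contains(awful)', 'pos'): -1.1,
    ('contains(awful)', 'neg'): 1.4,
}
labels = ['pos', 'neg']

def encode(featureset, label):
    # Encode a labeled feature set as the list of active
    # (feature, label) indicator pairs.
    return [(name, label) for name, value in featureset.items() if value]

def prob_classify(featureset):
    # Conditional exponential model: P(label | features) is proportional
    # to exp(sum of the weights of the active (feature, label) pairs).
    scores = {}
    for label in labels:
        total = sum(weights.get(pair, 0.0) for pair in encode(featureset, label))
        scores[label] = math.exp(total)
    z = sum(scores.values())  # normalizing constant
    return {label: s / z for label, s in scores.items()}

def classify(featureset):
    # Pick the label with the highest conditional probability.
    probs = prob_classify(featureset)
    return max(probs, key=probs.get)

print(classify({'contains(great)': True}))   # 'pos'
print(classify({'contains(awful)': True}))   # 'neg'
```

The point of the sketch is the shape of the computation, not the numbers: each active feature contributes its weight to every candidate label's score, and the exponentiated, normalized scores form a probability distribution over labels.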

Getting ready

The MaxentClassifier requires the numpy package, and optionally the scipy package, because the feature encodings use numpy arrays. Having scipy installed also means you will be able to use faster algorithms that consume less memory. You can find installation instructions for both at http://www.scipy.org/Installing_SciPy.

Tip

Many of the algorithms can be quite memory hungry, so you may want to quit all your other programs while training a MaxentClassifier, just to be safe.

How to do it...

We will use the same train_feats and test_feats from the movie_reviews corpus that we constructed before, and...