Clean Data

By: Megan Squire

Overview of this book

Is much of your time spent doing tedious tasks such as cleaning dirty data, accounting for lost data, and preparing data to be used by others? If so, then having the right tools makes a critical difference, and will be a great investment as you grow your data science expertise.

The book starts by highlighting the importance of data cleaning in data science, and will show you how to reap rewards from reforming your cleaning process. Next, you will cement your knowledge of the basic concepts that the rest of the book relies on: file formats, data types, and character encodings. You will also learn how to extract and clean data stored in RDBMS, web files, and PDF documents, through practical examples.

At the end of the book, you will be given a chance to tackle a couple of real-world projects.

Step eight – cleaning for lookup tables


In the Step seven – separate user mentions, hashtags, and URLs section, we created new tables to hold the extracted hashtags, user mentions, and URLs, and provided a way to link each row back to the original table via the id column. We followed the rules of database normalization by creating new tables that represent the one-to-many relationships between a tweet and its user mentions, hashtags, and URLs. In this step, we will continue optimizing the sentiment140 table for performance and efficiency.
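The one-to-many layout from step seven can be sketched as follows. This is an illustrative assumption, not the book's exact schema: the table and column names (sentiment140_hashtags, tweet_id, and so on) are invented for the example, and sqlite3 stands in for the book's MySQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Parent table: one row per tweet (column names are illustrative).
cur.execute("""CREATE TABLE sentiment140 (
    id INTEGER PRIMARY KEY,
    tweet_text TEXT)""")

# Child table: many hashtags can point back to one tweet via tweet_id,
# giving the one-to-many relationship described above.
cur.execute("""CREATE TABLE sentiment140_hashtags (
    id INTEGER PRIMARY KEY,
    tweet_id INTEGER REFERENCES sentiment140(id),
    hashtag TEXT)""")

cur.execute("INSERT INTO sentiment140 VALUES (1, 'loving #python and #sql')")
cur.executemany(
    "INSERT INTO sentiment140_hashtags (tweet_id, hashtag) VALUES (?, ?)",
    [(1, "python"), (1, "sql")])

# A join recovers every hashtag extracted from a given tweet.
cur.execute("""SELECT h.hashtag
    FROM sentiment140 t
    JOIN sentiment140_hashtags h ON h.tweet_id = t.id
    WHERE t.id = 1""")
print([row[0] for row in cur.fetchall()])  # -> ['python', 'sql']
```

The same pattern repeats for the user-mention and URL tables: each holds its own rows plus a foreign key back to the tweet's id.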

The column we are concerned with now is the query_phrase column. Looking at the column data, we can see that it contains the same phrases repeated over and over. These were apparently the search phrases that were originally used to locate and select the tweets that now exist in this dataset. Of the 498 tweets in the sentiment140 table, how many distinct query phrases are there, and how often does each one repeat? We can use the following SQL to detect...
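One way to count the repeats is a GROUP BY over query_phrase. The following sketch uses sqlite3 rather than the book's MySQL, and the sample phrases are invented stand-ins (the real table holds 498 tweets), but the counting query itself is standard SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sentiment140 (id INTEGER PRIMARY KEY, query_phrase TEXT)")

# Invented sample data standing in for the real 498-row table.
rows = [(1, "kindle2"), (2, "kindle2"), (3, "obama"),
        (4, "kindle2"), (5, "obama")]
cur.executemany("INSERT INTO sentiment140 VALUES (?, ?)", rows)

# Count how many times each search phrase appears, most frequent first.
cur.execute("""SELECT query_phrase, COUNT(*) AS cnt
    FROM sentiment140
    GROUP BY query_phrase
    ORDER BY cnt DESC""")
for phrase, cnt in cur.fetchall():
    print(phrase, cnt)
# -> kindle2 3
#    obama 2
```

A small distinct count relative to the row count is exactly the signal that query_phrase belongs in a separate lookup table, with the main table storing only a short foreign key.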