
Frank Kane's Taming Big Data with Apache Spark and Python

By : Frank Kane

Overview of this book

Frank Kane’s Taming Big Data with Apache Spark and Python is your companion to learning Apache Spark in a hands-on manner. Frank starts you off by teaching you how to set up Spark on a single system or on a cluster, and you’ll soon move on to analyzing large data sets using Spark RDDs and to developing and running effective Spark jobs quickly using Python. Apache Spark has emerged as the next big thing in the Big Data domain, rising from a promising technology to an established superstar in a matter of years. Spark lets you quickly extract actionable insights from large amounts of data in real time, making it an essential tool in many modern businesses. Frank has packed this book with over 15 interactive, fun-filled examples relevant to the real world, and he will empower you to understand the Spark ecosystem and implement production-grade, real-time Spark projects with ease.
Table of Contents (13 chapters)
Title Page
Credits
About the Author
www.PacktPub.com
Customer Feedback
Preface
7. Where to Go From Here? – Learning More About Spark and Data Science

Running the minimum temperature example and modifying it for maximums


Let's see this filter in action and find out the minimum temperature observed at each weather station in the year 1800. Go to the download package for this book and download two things: the min-temperatures Python script and the 1800.csv data file, which contains our weather information. When you're done, place them in your C:\SparkCourse folder, or wherever you're storing all the materials for this course:

When you're ready, go ahead and double-click on min-temperatures.py and open that up in your editor. I think it makes a little bit more sense once you see this all together. Feel free to take some time to wrap your head around it and figure out what's going on here and then I'll walk you through it.
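Before we open the real script, here is a minimal plain-Python sketch of the logic it performs: parsing each line, filtering down to minimum-temperature entries, and reducing to the lowest reading per station. The record layout (station ID, date, observation type, temperature) and the sample values below are assumptions for illustration only; they are not taken from the actual 1800.csv file.

```python
# Plain-Python sketch of the min-temperature logic (no Spark required).
# Assumed record layout: stationID,date,entryType,temperature
lines = [
    "ITE00100554,18000101,TMIN,-148",
    "ITE00100554,18000101,TMAX,-75",
    "EZE00100082,18000101,TMIN,-135",
    "EZE00100082,18000102,TMIN,-130",
]

def parse_line(line):
    """Split a CSV record into (stationID, entryType, temperature)."""
    fields = line.split(",")
    return (fields[0], fields[2], float(fields[3]))

# Keep only minimum-temperature observations, like rdd.filter(...)
tmin_entries = [rec for rec in (parse_line(l) for l in lines) if rec[1] == "TMIN"]

# Fold down to the lowest temperature per station, like reduceByKey(min)
min_temps = {}
for station, _, temp in tmin_entries:
    min_temps[station] = min(min_temps.get(station, temp), temp)

print(min_temps)  # {'ITE00100554': -148.0, 'EZE00100082': -135.0}
```

Modifying this for maximums, as the section title suggests, comes down to filtering on TMAX instead of TMIN and reducing with max instead of min; the Spark version we examine next follows the same shape.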

Examining the min-temperatures script

We start off with the usual boilerplate stuff, importing what we need from pyspark and setting up a SparkContext object that we're going to call MinTemperatures...
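As a reminder, that boilerplate typically looks something like the configuration fragment below; this is a sketch assuming a local single-machine setup, with the app name MinTemperatures as described above.

```python
from pyspark import SparkConf, SparkContext

# Configure Spark to run locally, and name the job MinTemperatures
conf = SparkConf().setMaster("local").setAppName("MinTemperatures")
sc = SparkContext(conf=conf)
```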