PySpark Cookbook

By: Denny Lee, Tomasz Drabas

Overview of this book

Apache Spark is an open source framework for efficient cluster computing with a strong interface for data parallelism and fault tolerance. The PySpark Cookbook presents effective and time-saving recipes for leveraging the power of Python and putting it to use in the Spark ecosystem. You’ll start by learning the Apache Spark architecture and how to set up a Python environment for Spark. You’ll then get familiar with the modules available in PySpark and start using them effortlessly. In addition, you’ll discover how to abstract data with RDDs and DataFrames, and understand the streaming capabilities of PySpark. You’ll then move on to using ML and MLlib to solve machine learning problems, and use GraphFrames to solve graph-processing problems. Finally, you will explore how to deploy your applications to the cloud using the spark-submit command. By the end of this book, you will be able to use the Python API for Apache Spark to solve problems associated with building data-intensive applications.

Installing Jupyter


Jupyter provides a convenient way to interact with your Spark environment. In this recipe, we will guide you through installing Jupyter on your local machine.

Getting ready

We require a working installation of Spark. This means that you will have followed the steps outlined in the first recipe, and in either the second or the third recipe. A working Python environment is also required.

No other prerequisites are required.

How to do it...

If you do not have pip installed on your machine, you will need to install it before proceeding.

  1. To do this, open your Terminal and type (on macOS):
curl -O https://bootstrap.pypa.io/get-pip.py

Or the following on Linux:

wget https://bootstrap.pypa.io/get-pip.py
  2. Next, type (applies to both operating systems):
python get-pip.py

This will install pip on your machine.

  3. All you have to do now is install Jupyter with the following command:
pip install jupyter
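
To confirm that both tools were installed correctly, you can check their versions; the exact version strings will vary with your environment:

# Both commands print a version string if the install succeeded:
pip --version
jupyter --version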

How it works...

pip is a management tool for installing Python packages from PyPI, the Python Package Index. This service hosts a wide range of Python packages and is the easiest and quickest way to distribute your own Python packages.

However, calling pip install does not only search for packages on PyPI: it can also install from VCS project URLs, local project directories, and local or remote source archives, as the examples below illustrate.
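
As a quick sketch of those alternative sources (the repository URL, local directory, and archive URL below are hypothetical placeholders, not real projects):

# Install directly from a Git repository (hypothetical URL):
pip install git+https://github.com/example-user/example-project.git

# Install from a local project directory (hypothetical path):
pip install ./example-project

# Install from a local or remote source archive (hypothetical URL):
pip install https://example.com/example-project-1.0.tar.gz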

Jupyter is one of the most popular interactive environments for developing code, and it supports a wide variety of languages: Python is not the only one.

Directly from http://jupyter.org:

"The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations, and narrative text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more."

Another way to install Jupyter, if you are using the Anaconda distribution of Python, is to use its package management tool, conda. Here's how:

conda install jupyter

Note that pip install will also work in Anaconda.

There's more...

Now that you have Jupyter on your machine, and assuming you followed the steps of either the Installing Spark from sources or the Installing Spark from binaries recipe, you should be able to start using Jupyter to interact with PySpark.

To refresh your memory, as part of the Spark installation scripts, we appended two environment variables to your bash profile file: PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS. We set the former to use jupyter and the latter to start a notebook service.
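
As a rough sketch, the relevant lines in the bash profile might look like the following; the exact notebook options, including the port, depend on how you configured the earlier recipes, and --no-browser and --port=6661 here are assumptions chosen to match the URL used below:

# Run the PySpark driver through Jupyter (assumed values; yours may differ):
export PYSPARK_DRIVER_PYTHON=jupyter
# Start a notebook server; the port matches http://localhost:6661 below:
export PYSPARK_DRIVER_PYTHON_OPTS='notebook --no-browser --port=6661'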

If you now open your Terminal and type the following, a Jupyter notebook server will start:

pyspark

When you open your browser and navigate to http://localhost:6661, you should see the Jupyter notebook home page.
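
If you prefer to confirm from the command line that the notebook server is up (again assuming the port configured above), you can check that it responds:

# Fetch only the HTTP headers; a 200 response (or a redirect to the
# login page) means the notebook server is running:
curl -I http://localhost:6661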

See also