Spark for Data Science

By: Srinivas Duvvuri, Bikramaditya Singhal

Overview of this book

This is the era of Big Data. The term ‘Big Data’ implies big innovation and enables a competitive advantage for businesses. Apache Spark was designed to perform Big Data analytics at scale, so it comes equipped with the necessary algorithms and supports multiple programming languages. Whether you are a technologist, a data scientist, or a newcomer to Big Data analytics, this book will give you the skills you need to perform statistical data analysis, data visualization, and predictive modeling, and to build scalable data products or solutions using Python, Scala, and R. With ample case studies and real-world examples, Spark for Data Science will help you ensure the successful execution of your data science projects.

RDD operations
Spark programming usually starts by choosing a suitable interface that you are comfortable with. If you intend to do interactive data analysis, then a shell prompt is the obvious choice. However, whether you pick the Python shell (PySpark) or the Scala shell (Spark-Shell) depends, to some extent, on your proficiency with those languages. If you are building a full-blown, scalable application, proficiency matters a great deal: you should develop the application in your language of choice among Scala, Java, and Python, and submit it to Spark, as sketched below. We will discuss this aspect in more detail later in the book.
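
To make the contrast with the interactive shells concrete, here is a minimal sketch of a standalone PySpark application; the file name, application name, and input path are illustrative assumptions, not taken from the book. Unlike the shells, a standalone application has to create its own SparkContext before it can do any work.

    # word_count.py -- a hypothetical, minimal standalone PySpark application.
    from pyspark import SparkConf, SparkContext

    if __name__ == "__main__":
        # In a standalone application we build the SparkContext ourselves;
        # in PySpark or Spark-Shell it is already provided as sc.
        conf = SparkConf().setAppName("WordCountApp")   # application name is arbitrary
        sc = SparkContext(conf=conf)

        # Count words in a text file ("input.txt" is a placeholder path).
        counts = (sc.textFile("input.txt")
                    .flatMap(lambda line: line.split())
                    .map(lambda word: (word, 1))
                    .reduceByKey(lambda a, b: a + b))

        print(counts.take(5))   # print a small sample of (word, count) pairs
        sc.stop()

Such a script would typically be handed to Spark with spark-submit, for example ./bin/spark-submit word_count.py from Spark's home directory.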

Creating RDDs

In this section, we will use both a Python shell (PySpark) and a Scala shell (Spark-Shell) to create an RDD. Both of these shells have a predefined, interpreter-aware SparkContext that is assigned to a variable sc.

Let us get started with some simple code examples. Note that the code assumes the current working directory is Spark's home directory.
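
Here is a minimal PySpark sketch of the two most common ways to create an RDD: from an in-memory collection with parallelize, and from a text file with textFile. The sample numbers and the README.md file (which ships with the Spark distribution) are illustrative choices.

    # The shell already provides the SparkContext as sc, so no imports are needed.

    # Create an RDD from an in-memory Python collection.
    numbers = sc.parallelize([1, 2, 3, 4, 5])
    print(numbers.count())        # 5

    # Create an RDD from a text file; README.md is found because the shell
    # was started from Spark's home directory.
    lines = sc.textFile("README.md")
    print(lines.first())          # the first line of the file

Both calls are lazy: no data is read or distributed until an action such as count() or first() is invoked.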