Apache Spark is growing rapidly in terms of technology, community, and user base. Two new APIs were introduced in 2015: the DataFrame API and the Dataset API. Both are built on top of the core API, which is based on RDDs. It is essential to understand the deeper concepts of RDDs, including Spark's runtime architecture and its behavior on various resource managers.
This chapter is divided into the following subtopics:
Starting Spark daemons
Spark core concepts
Pair RDDs
The lifecycle of a Spark program
Spark applications
Persistence and caching
Spark resource managers: Standalone, YARN, and Mesos