Apache Spark 2 for Beginners

By: Rajanarayanan Thottuvaikkatumana

Overview of this book

Spark is one of the most widely used large-scale data processing engines and runs extremely fast. It is a framework with tools that are equally useful to application developers and data scientists.

This book starts with the fundamentals of Spark 2 and covers the core data processing framework and API, installation, and application development setup. The Spark programming model is then introduced through real-world examples, followed by Spark SQL programming with DataFrames. An introduction to SparkR comes next. Later, we cover the charting and plotting features of Python in conjunction with Spark data processing. After that, we take a look at Spark's stream processing, machine learning, and graph processing libraries. The last chapter combines all the skills you learned from the preceding chapters to develop a real-world Spark application.

By the end of this book, you will have all the knowledge you need to develop efficient large-scale applications using Apache Spark.

Summary


Spark SQL is a highly useful library built on top of the Spark core infrastructure. It makes Spark programming accessible to a wider group of programmers who are well versed in the imperative style of programming but not as comfortable with functional programming. Apart from that, Spark SQL is the best library in the Spark family for processing structured data. Spark SQL-based data processing applications can be written using SQL-like queries or the DSL-style imperative programs of the DataFrame API. This chapter also demonstrated various strategies for mixing RDDs and DataFrames, and for mixing SQL-like queries with the DataFrame API. This gives application developers great flexibility to write data processing programs in the way they are most comfortable with, or in the way that is most appropriate to the use case, without compromising performance.
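As a compact recap (this is a minimal sketch, not code from the chapter; the account column names and sample values are purely illustrative), the following Scala program shows how an RDD can be turned into a DataFrame and how the same filter-and-sort logic can then be expressed both as a SQL-like query and in the DSL style of the DataFrame API:

import org.apache.spark.sql.SparkSession

object SparkSQLSummaryExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Spark SQL Summary Example")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Start from an RDD of tuples and convert it into a DataFrame
    val accountRDD = spark.sparkContext.parallelize(Seq(
      ("SB10001", "Roger", 3000.0),
      ("SB10002", "John", 2000.0),
      ("SB10003", "Roger", 8500.0)
    ))
    val accountDF = accountRDD.toDF("accNo", "owner", "balance")

    // SQL-like query style: register a temporary view and query it
    accountDF.createOrReplaceTempView("account")
    val highBalanceSQL = spark.sql(
      "SELECT accNo, balance FROM account WHERE balance > 2500 ORDER BY balance DESC")
    highBalanceSQL.show()

    // DSL style: the same result expressed with the DataFrame API
    val highBalanceDSL = accountDF
      .select("accNo", "balance")
      .where($"balance" > 2500)
      .orderBy($"balance".desc)
    highBalanceDSL.show()

    spark.stop()
  }
}

Either form yields the same result, and both are compiled by the Catalyst optimizer, which is why choosing one style over the other does not compromise performance.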

The Dataset API is the next-generation programming model, based on Datasets in...
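For completeness, here is a minimal sketch of the strongly typed Dataset abstraction; the Account case class and its fields are hypothetical and used only for illustration:

import org.apache.spark.sql.SparkSession

// Hypothetical case class used only for this illustration
case class Account(accNo: String, owner: String, balance: Double)

object DatasetSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Dataset API Sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // A strongly typed Dataset: each row is known to be an Account
    val accounts = Seq(
      Account("SB10001", "Roger", 3000.0),
      Account("SB10002", "John", 2000.0)
    ).toDS()

    // Typed transformations are checked at compile time,
    // unlike the column-name strings of the untyped DataFrame API
    val owners = accounts
      .filter(a => a.balance > 2500)
      .map(a => a.owner)
    owners.show()

    spark.stop()
  }
}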