Hands-On Data Analysis with Scala

By: Rajesh Gupta

Overview of this book

Efficient business decisions, backed by an accurate understanding of business data, help deliver better performance across products and services. This book helps you leverage popular Scala libraries and tools to perform core data analysis tasks with ease. The book begins with a quick overview of the building blocks of a standard data analysis process. You will learn to perform basic tasks such as the extraction, staging, validation, cleaning, and shaping of datasets. You will then dive deeper into the data exploration and visualization stages of the data analysis life cycle, and use popular Scala libraries such as Saddle, Breeze, Vegas, and PredictionIO to process your datasets. You will learn statistical methods for deriving meaningful insights from data, and you will create Apache Spark 2.x applications for complex data analysis in real time. You will discover traditional machine learning techniques for data analysis, and you will also be introduced to neural networks and deep learning from a data analysis standpoint. By the end of this book, you will be able to handle large sets of structured and unstructured data, perform exploratory analysis, and build efficient Scala applications for discovering and delivering insights.
Table of Contents (14 chapters)

Section 1: Scala and Data Analysis Life Cycle
Section 2: Advanced Data Analysis and Machine Learning
Section 3: Real-Time Data Analysis and Scalability

Using Spark to explore data

Spark SQL provides a convenient way to explore data and gain a deeper understanding of it. Spark's DataFrame construct can be registered as a temporary table, and SQL can then be run against these registered tables, performing all of the normal operations, such as joining tables and filtering data.
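In concrete terms, the pattern amounts to just two calls. The following is a minimal sketch, assuming a Spark 2.x shell where spark is the pre-created SparkSession and df is a hypothetical DataFrame with an age column; the walkthrough that follows builds such a view from scratch:

scala> // Expose the DataFrame to Spark SQL under the table name "people"
scala> df.createOrReplaceTempView("people")

scala> // Query it like any other SQL table
scala> spark.sql("SELECT * FROM people WHERE age > 30").show()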

Let's look at an example in the Spark shell to learn how to explore data, using the following steps:

  1. Start the Spark shell in a Terminal as follows:
$ spark-shell
  2. Define a Scala case class called Person with the following three attributes:
    • fname: String
    • lname: String
    • age: Int
scala> case class Person(fname: String, lname: String, age: Int)
defined class Person
  3. Create a Scala list consisting of a few persons and put it into a Spark dataset of Person as follows:
scala> val personsDS = List(Person("Jon", "Doe...
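
The listing above is truncated in the source. As a hedged sketch of how this step, plus the registration and querying that the opening paragraph describes, might look in full, assuming the Spark 2.x shell (where spark is the pre-created SparkSession and spark.implicits._, which provides the toDS() conversion, is imported automatically), and with the second person and both ages used purely as illustrative data:

scala> val personsDS = List(Person("Jon", "Doe", 21), Person("Jane", "Smith", 35)).toDS()
personsDS: org.apache.spark.sql.Dataset[Person] = [fname: string, lname: string ... 1 more field]

scala> // Register the dataset as a temporary table so SQL can be run against it
scala> personsDS.createOrReplaceTempView("persons")

scala> // Explore the data with an ordinary SQL filter
scala> spark.sql("SELECT fname, lname, age FROM persons WHERE age > 30").show()
+-----+-----+---+
|fname|lname|age|
+-----+-----+---+
| Jane|Smith| 35|
+-----+-----+---+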