Mastering Apache Cassandra 3.x - Third Edition

By : Aaron Ploetz, Tejaswi Malepati, Nishant Neeraj

Overview of this book

With ever-increasing rates of data creation, storing data quickly and reliably becomes a necessity. Apache Cassandra is an excellent choice for building fault-tolerant, scalable databases. Mastering Apache Cassandra 3.x teaches you how to build and architect your clusters, configure and work with your nodes, and program in a high-throughput environment, helping you understand the power of Cassandra's new features. After a brief recap of the basics, you'll move on to deploying and monitoring a production setup, then optimizing it and integrating it with other software. You'll work with the advanced features of CQL and the new storage engine to understand how they function on the server side. You'll explore the integration and interaction of Cassandra components, and then examine features such as the token allocation algorithm, CQL3, vnodes, lightweight transactions, and data modelling in detail. Last but not least, you will get to grips with Apache Spark. By the end of this book, you'll be able to analyse big data, and build and manage high-performance databases for your application.
Table of Contents (12 chapters)

SparkR

SparkR is an interactive CLI, built into Spark, that provides an R interface for processing large amounts of data, whether read from a single source or aggregated from multiple sources. This is the statistician's CLI for data interaction. As R is a statistician's language, SparkR is a little more complicated to work with than PySpark, due to the limitations and architecture of R.
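As a rough sketch of what an interactive SparkR session looks like, the following uses R's built-in `faithful` sample dataset and standard SparkR API calls (`createDataFrame`, `groupBy`, `summarize`); the application name is an arbitrary example:

```r
# Start a SparkR session (the sparkR shell does this for you automatically)
library(SparkR)
sparkR.session(appName = "SparkRExample")

# Convert a local R data frame into a distributed SparkDataFrame
df <- createDataFrame(faithful)

# Aggregate: count eruptions grouped by waiting time,
# then show the most common waiting times first
counts <- summarize(groupBy(df, df$waiting), count = n(df$waiting))
head(arrange(counts, desc(counts$count)))

sparkR.session.stop()
```

The point of the API is that `df` behaves much like a local R data frame, while the grouping and aggregation are executed by Spark across the cluster.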

SparkR can be found in the bin directory of the Spark binary installation. It also supports running in local or pseudo-distributed mode; depending on the mode, the master and worker web UIs may or may not exist, but the application web UI is accessible regardless. Refer to the SparkR documentation, R on Spark, for further information: https://spark.apache.org/docs/latest/sparkr.html.
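For illustration, launching the shell from the bin directory might look like the following; paths assume a Spark binary distribution unpacked at `$SPARK_HOME`, and the ports shown are Spark's defaults:

```shell
# Local mode with two worker threads: no standalone master/worker UIs
# exist here, but the application UI is served at http://localhost:4040
# for as long as the shell is running.
$SPARK_HOME/bin/sparkR --master "local[2]"

# Against a standalone cluster, the master UI (default port 8080) and
# worker UIs (default port 8081) become available as well.
$SPARK_HOME/bin/sparkR --master spark://master-host:7077
```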

Much like PySpark requires Python, SparkR requires the R package to be installed, which is built...