Big Data Analytics with R

By: Simon Walkowiak

Overview of this book

Big Data analytics is the process of examining large and complex data sets that often exceed available computational capabilities. R is a leading programming language of data science, with powerful functions to tackle problems related to Big Data processing. The book begins with a brief introduction to the Big Data world and its current industry standards, followed by an introduction to the R language covering its development, structure, real-world applications, and shortcomings. It then revises the major R functions for data management and transformation. Readers are introduced to cloud-based Big Data solutions (for example, Amazon EC2 instances, Amazon RDS, and Microsoft Azure with its HDInsight clusters) and given guidance on connecting R to relational and non-relational databases such as MongoDB and HBase. The book further expands to cover Big Data tools from the Apache Hadoop ecosystem, including HDFS and the MapReduce framework, as well as other R-compatible tools such as Apache Spark, its machine learning library Spark MLlib, and H2O.

Traditional limitations of R

The usual scenario is simple. You've mined or collected unusually large amounts of data as part of your professional work, or university research, and you appreciate the flexibility of the R language and its ever-growing, rich landscape of useful and open-source libraries. So what next? Before too long you will be faced with two traditional limitations of R:

  • Data must fit within the available RAM

  • R is generally very slow compared to other languages

Out-of-memory data

The first claim against using R for Big Data is that the entire dataset you want to process has to fit into the available RAM. Currently, most commercially sold, off-the-shelf personal computers are equipped with anything from 4 GB to 16 GB of RAM, so these values set the upper bound on the size of the data you can analyze with R. Of course, from these upper limits, you still need to deduct some additional memory resources for other processes...
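As a quick sanity check, base R's `object.size()` reports the approximate memory footprint of an object, which can help you estimate whether a dataset will fit in RAM before you load it. This is a minimal illustration, assuming standard 8-byte double-precision numerics:

```r
# A numeric vector of one million doubles occupies roughly
# 8 bytes per element, plus a small fixed overhead.
x <- rnorm(1e6)

# Report the approximate in-memory size of the object in megabytes.
print(object.size(x), units = "MB")
```

Scaling this up, a data frame of 100 million rows with a handful of numeric columns would already need several gigabytes of RAM, before accounting for the copies R often makes during transformations.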