Mastering Apache Storm

By: Ankit Jain

Overview of this book

Apache Storm is a real-time Big Data processing framework that processes large amounts of data reliably, guaranteeing that every message will be processed. Storm allows you to scale your processing as your data grows, making it an excellent platform for solving big data problems. This extensive guide covers Storm from the basics through advanced topics. The book begins with a detailed introduction to real-time processing and where Storm fits in to solve these problems. You'll learn to deploy Storm on clusters by writing a basic Storm Hello World example. Next, we'll introduce you to Trident, and you'll get a clear understanding of how to develop and deploy a Trident topology. We cover topics such as monitoring, Storm parallelism, schedulers, and log processing in an easy-to-understand manner. You will also learn how to integrate Storm with other well-known Big Data technologies, such as HBase, Redis, Kafka, and Hadoop, to realize the full potential of Storm. With real-world examples and clear explanations, this book will ensure you gain a thorough mastery of Apache Storm. You will be able to use this knowledge to develop efficient, distributed real-time applications that cater to your business needs.

Installation of Hadoop

Now that we have seen both the storage and processing parts of a Hadoop cluster, let's get started with the installation of Hadoop. We will be using Hadoop 2.2.0 in this chapter. Please note that this version is not compatible with Hadoop 1.x versions.

We will be setting up a cluster on a single node. Before starting, please make sure that you have the following installed on your system:

  • JDK 1.7
  • ssh-keygen
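Before proceeding, you can quickly confirm that these prerequisites are available on your PATH. The following is a minimal sketch; the exact version output will vary by system:

```shell
# Sketch: check whether each prerequisite tool is on the PATH.
# Prints "found" or "missing" for each; it does not verify versions.
for tool in java ssh-keygen; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```

To confirm the JDK version specifically, `java -version` prints the installed release.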

In case you don't have ssh-keygen, install it with the following command (on RHEL/CentOS it is provided by the openssh-clients package):

# yum install openssh-clients

Next, we will need to set up a passwordless SSH on this machine as it is required for Hadoop.

Setting passwordless SSH

The following are the steps for setting up a passwordless SSH:

  1. Generate your SSH key pair by executing the following command:
$ ssh-keygen -t rsa -P ''
Generating public/private rsa key pair.
Enter file in which to save the key (/home/anand/.ssh/id_rsa):
Your identification has been saved in /home/anand/.ssh/id_rsa.
Your public key has been saved in...
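For passwordless login to work, the generated public key must then be appended to the authorized_keys file with strict permissions. The commands below are a hedged sketch of those remaining steps, run against a throwaway temporary directory (KEY_DIR) so your real ~/.ssh is left untouched; in an actual setup the key goes into ~/.ssh:

```shell
# Sketch of the remaining passwordless-SSH steps, demonstrated in a
# temporary directory (KEY_DIR) instead of the real ~/.ssh.
KEY_DIR=$(mktemp -d)
ssh-keygen -t rsa -P '' -f "$KEY_DIR/id_rsa" -q     # non-interactive key pair
cat "$KEY_DIR/id_rsa.pub" >> "$KEY_DIR/authorized_keys"
chmod 600 "$KEY_DIR/authorized_keys"                # sshd rejects looser perms
ls -1 "$KEY_DIR"                                    # id_rsa, id_rsa.pub, authorized_keys
```

After appending the key to the real ~/.ssh/authorized_keys, `ssh localhost` should log in without prompting for a password.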