Distributed RPC


Distributed RPC is used to query and retrieve results from a Trident topology on the fly. Storm ships with a built-in distributed RPC (DRPC) server. The DRPC server receives an RPC request from the client and passes it to the topology; the topology processes the request and sends the result back to the DRPC server, which in turn returns it to the client.
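
To give a feel for the client side of this flow, the following is a minimal sketch of a DRPC client, assuming the DRPC server runs on nimbus-node on the default port 3772 and that the topology registers a DRPC function named count; the host, function name, and argument here are placeholders for illustration rather than values taken from the book's example:

import backtype.storm.utils.DRPCClient;

public class DrpcClientExample {
    public static void main(String[] args) throws Exception {
        // Connect to the DRPC server (3772 is Storm's default DRPC port).
        // "nimbus-node" and the function name "count" are placeholders.
        DRPCClient client = new DRPCClient("nimbus-node", 3772);

        // The DRPC server forwards the arguments to the topology and this
        // call blocks until the topology's result comes back.
        String result = client.execute("count", "Japan");

        System.out.println("Result: " + result);
    }
}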

We can configure the distributed RPC server by setting the following property in the storm.yaml file:

drpc.servers:
  - "nimbus-node"

Here, nimbus-node is the hostname or IP address of the machine running the distributed RPC server.
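
If you need to run more than one DRPC server, or change the port on which it listens, the same file can list additional hosts and set drpc.port (3772 is Storm's default); the second hostname below is purely a placeholder:

drpc.servers:
  - "nimbus-node"
  - "drpc-node-2"
# Port on which the DRPC server listens for client requests (default: 3772)
drpc.port: 3772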

Now, run the following command on the nimbus-node machine to start the distributed RPC server:

bin/storm drpc

Let's consider that we are storing the count aggregation of the sample Trident topology in the database and want to retrieve the count for a given country on the fly. We will need to use the distributed RPC feature to achieve this. The following example code shows how we can...
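
As a rough sketch only, and not the book's actual example, a Trident topology that persists a per-country count and exposes it through a DRPC function might be wired up as follows; the spout, the get_count function name, and the in-memory state are assumptions made purely for illustration (a real topology would query a database-backed state instead):

import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Values;
import storm.trident.TridentState;
import storm.trident.TridentTopology;
import storm.trident.operation.builtin.Count;
import storm.trident.operation.builtin.MapGet;
import storm.trident.testing.FixedBatchSpout;
import storm.trident.testing.MemoryMapState;

public class CountryCountDrpcTopology {
    public static void main(String[] args) throws Exception {
        // A tiny in-memory spout standing in for the sample topology's spout;
        // each tuple carries a single "country" field.
        FixedBatchSpout spout = new FixedBatchSpout(new Fields("country"), 3,
                new Values("India"), new Values("Japan"), new Values("India"));
        spout.setCycle(true);

        TridentTopology topology = new TridentTopology();

        // Persist a per-country count; MemoryMapState is used here only for
        // illustration in place of a real database-backed state.
        TridentState countryCounts = topology
                .newStream("country-spout", spout)
                .groupBy(new Fields("country"))
                .persistentAggregate(new MemoryMapState.Factory(),
                        new Count(), new Fields("count"));

        // DRPC stream: the client's argument (a country name) arrives in the
        // "args" field; MapGet looks up the stored count for that key.
        topology.newDRPCStream("get_count")
                .groupBy(new Fields("args"))
                .stateQuery(countryCounts, new Fields("args"), new MapGet(),
                        new Fields("count"));

        Config conf = new Config();
        StormSubmitter.submitTopology("drpc-count-demo", conf, topology.build());
    }
}

A client could then retrieve the count for a country with client.execute("get_count", "India"), along the lines of the DRPC client sketch shown earlier.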