Kafka performance testing


The Kafka contributors are still working on performance testing, and their goal is to provide a set of script files that help in running performance tests. Some of these are already provided in the Kafka bin folder:

  • kafka-producer-perf-test.sh: This script runs the kafka.perf.ProducerPerformance class, which produces incremental statistics for the producers and writes them to a CSV file

  • kafka-consumer-perf-test.sh: This script runs the kafka.perf.ConsumerPerformance class, which produces incremental statistics for the consumers and writes them to a CSV file; example invocations of both scripts follow this list
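
As a quick sketch, the producer and consumer tests might be invoked as shown below. The broker address, ZooKeeper address, topic name, and message counts are assumptions made for illustration, and the exact flag names differ between Kafka releases, so run each script without arguments to see the options supported by your version:

  # Hypothetical producer performance run against a local broker
  bin/kafka-producer-perf-test.sh --broker-list localhost:9092 \
      --topics kafka-perf-topic --messages 1000000 --message-size 1000

  # Hypothetical consumer performance run reading the same topic back
  bin/kafka-consumer-perf-test.sh --zookeeper localhost:2181 \
      --topic kafka-perf-topic --messages 1000000 --threads 1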

Some more scripts for pulling Kafka server and ZooKeeper statistics in CSV format are also provided. Once the CSV files are produced, an R script can be written to turn them into graph images.
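
A minimal sketch of capturing CSV metrics and handing them to R is shown below. The --csv-reporter-enabled and --metrics-dir options, the metrics directory, and the plot-metrics.r script name are assumptions; verify the options against your Kafka version, and the R script itself has to be written to match the CSV files you want to graph:

  # Hypothetical run that writes per-interval producer metrics as CSV files
  bin/kafka-producer-perf-test.sh --broker-list localhost:9092 \
      --topics kafka-perf-topic --messages 1000000 \
      --csv-reporter-enabled --metrics-dir /tmp/kafka-perf-metrics

  # plot-metrics.r is a hypothetical, user-written R script that reads the
  # CSV files from the metrics directory and renders graph images
  Rscript plot-metrics.r /tmp/kafka-perf-metrics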

Note

For detailed information on how to carry out Kafka performance testing, refer to https://cwiki.apache.org/confluence/display/KAFKA/Performance+testing.