Hadoop MapReduce v2 Cookbook - Second Edition: RAW


Hadoop counters to report custom metrics


Hadoop uses a set of counters to aggregate metrics for MapReduce computations. Counters are helpful for understanding the behavior of our MapReduce programs and for tracking the progress of MapReduce computations. We can also define custom counters to track application-specific metrics in our MapReduce computations.

How to do it...

The following steps show you how to define a custom counter to count the number of bad or corrupted records in our log processing application:

  1. Define the list of custom counters using enum:

    public static enum LOG_PROCESSOR_COUNTER {
      BAD_RECORDS
    };
  2. Increment the counter in your Mapper or Reducer:

    context.getCounter(LOG_PROCESSOR_COUNTER.BAD_RECORDS).increment(1);
  3. Add the following to your driver program to access the counters:

    Job job = Job.getInstance(getConf(), "log-analysis");
    ……
    Counters counters = job.getCounters();
    Counter badRecordsCounter = counters.findCounter(LOG_PROCESSOR_COUNTER.BAD_RECORDS);
    System.out.println("Bad records: " + badRecordsCounter.getValue());
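To see what the framework does with these increments, the steps above can be sketched as a plain-Java simulation: it uses an `EnumMap` in place of Hadoop's `Counters` class, and the sample log lines and the "fewer than four fields" validation rule are hypothetical stand-ins for whatever record check the real mapper performs. Each map or reduce task maintains local tallies like this; Hadoop then sums them across all tasks and reports the totals to the driver.

```java
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

public class CounterSketch {
    // Mirrors the enum defined in step 1 of the recipe.
    public enum LOG_PROCESSOR_COUNTER {
        BAD_RECORDS
    }

    public static void main(String[] args) {
        // Hypothetical input: two well-formed log lines and one corrupted one.
        List<String> logLines = List.of(
            "192.168.0.1 GET /index.html 200",
            "not-a-valid-log-line",
            "192.168.0.2 GET /about.html 404");

        // Stand-in for context.getCounter(...): one tally per enum constant.
        Map<LOG_PROCESSOR_COUNTER, Long> counters =
            new EnumMap<>(LOG_PROCESSOR_COUNTER.class);

        for (String line : logLines) {
            // Hypothetical validation rule: a record is "bad" if it has
            // fewer than four space-separated fields.
            if (line.split(" ").length < 4) {
                // Equivalent to counter.increment(1) in the mapper.
                counters.merge(LOG_PROCESSOR_COUNTER.BAD_RECORDS, 1L, Long::sum);
            }
        }

        // Equivalent to reading the counter in the driver after the job ends.
        System.out.println("BAD_RECORDS=" +
            counters.getOrDefault(LOG_PROCESSOR_COUNTER.BAD_RECORDS, 0L));
    }
}
```

Running this prints `BAD_RECORDS=1`, matching the single corrupted sample line.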