Chapter 1. Getting Started with Apache Hadoop
Apache Hadoop is a widely used, open source distributed computing framework for efficiently processing large volumes of data on clusters of inexpensive, commodity hardware. In this chapter, we will learn more about Apache Hadoop by covering the following topics:
- The history of Apache Hadoop and its trends
- Components of Apache Hadoop
- Understanding the Apache Hadoop daemons
- Introducing Cloudera
- What is CDH?
- Responsibilities of a Hadoop administrator