Introducing Hadoop, the Big Elephant
Hadoop was created by Doug Cutting, the same person who created Apache Lucene. Apache Hadoop is a software library designed and implemented for storing and processing large datasets: data so massive that it cannot fit on a single box and must be partitioned and distributed across multiple physical machines, transparently to the user, who still sees it, logically, as one big chunk. Hadoop's design is based on the Google File System and is meant to scale from a single-node deployment to thousands of nodes, each of which offers local computation and storage.
Hadoop does not depend on hardware for reliability. Instead, the software itself detects and handles failures at the node level, which allows it to deliver a highly available service on top of a cluster of ordinary computers.
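To make the "one logical chunk" idea concrete, here is a minimal sketch using Hadoop's Java FileSystem API. It writes a small file to HDFS and reads it back as a single stream, even though its blocks may be spread and replicated across many DataNodes. The NameNode URI, the file path, and the replication factor below are placeholders chosen for illustration, not values from this chapter.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsHelloWorld {
    public static void main(String[] args) throws Exception {
        // Point the client at the cluster's NameNode; this URI is a placeholder.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        // Ask HDFS to keep three copies of every block, so losing a node
        // does not make the data unavailable.
        conf.set("dfs.replication", "3");

        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/demo/hello.txt");

        // Write: the client streams bytes; HDFS splits them into blocks and
        // places the replicas on different DataNodes behind the scenes.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello, Hadoop!".getBytes(StandardCharsets.UTF_8));
        }

        // Read: the file comes back as one continuous stream, regardless of
        // which machines actually hold its blocks.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
            System.out.println(in.readLine());
        }

        fs.close();
    }
}
```

Notice that nothing in the client code names a DataNode: block placement, replication, and failure recovery all happen inside the cluster, which is exactly the transparency described above.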
Hadoop consists of three functional modules:
- Hadoop Distributed Filesystem: Distributed...