Hadoop Operations and Cluster Management Cookbook

By : Shumin Guo
Overview of this book

We are facing an avalanche of data. The unstructured data we gather can contain many insights that could hold the key to business success or failure. Harnessing the ability to analyze and process this data with Hadoop is one of the most highly sought-after skills in today's job market. Hadoop, by combining the computing and storage power of a large number of commodity machines, solves this problem in an elegant way!

Hadoop Operations and Cluster Management Cookbook is a practical and hands-on guide for designing and managing a Hadoop cluster. It will help you understand how Hadoop works and guide you through cluster management tasks.

This book explains real-world big data problems and the features of Hadoop that enable it to handle such problems. It breaks down the mystery of a Hadoop cluster and guides you through a number of clear, practical recipes that will help you to manage a Hadoop cluster.

We will start by installing and configuring a Hadoop cluster, while explaining hardware selection and networking considerations. We will also cover securing a Hadoop cluster with Kerberos, configuring cluster high availability, and monitoring a cluster. And if you want to know how to build a Hadoop cluster on the Amazon EC2 cloud, then this is the book for you.
Table of Contents (15 chapters)
Hadoop Operations and Cluster Management Cookbook
Credits
About the Author
About the Reviewers
www.PacktPub.com
Preface
Index

Manipulating files on HDFS


Besides commands for copying files from the local filesystem, HDFS provides commands for operating on files in place. In this section, we will show you how to work with files on HDFS, such as downloading files from HDFS, checking the contents of files, and removing files from HDFS.
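As a quick preview of the operations covered in this recipe, the commands below sketch the typical workflow. The file and directory names used here (file1, localdir) are placeholders for illustration, not files that necessarily exist on your cluster:

```shell
# Download a file from HDFS to a local directory
hadoop fs -get file1 localdir/file1

# Print the contents of an HDFS file to the terminal
hadoop fs -cat file1

# Remove a file from HDFS
hadoop fs -rm file1
```

Each command operates relative to the current user's HDFS home directory (for example, /user/hduser) unless an absolute path is given.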

Getting ready

We assume that our Hadoop cluster has been properly configured and all the daemons are running without any issues.

How to do it...

Perform the following steps to check the status of files and the directory on HDFS:

  1. List files of the user's home directory on HDFS using the following command:

    hadoop fs -ls .
    

    For example, this command gives the following output on my machine:

    Found 7 items
    drwx------ - hduser supergroup   0 2013-02-21 22:17 /user/hduser/.staging
    -rw-r--r-- 2 hduser supergroup 646 2013-02-21 22:28 /user/hduser/file1
    -rw-r--r-- 2 hduser supergroup 848 2013-02-21 22:28 /user/hduser/file2
    ...
    

    Note

    To recursively list files in the home directory, we can use the command hadoop fs -lsr .

  2. Check...