Raspberry Pi Super Cluster

By: Andrew K. Dennis

Overview of this book

A cluster is a type of parallel/distributed processing system consisting of a collection of interconnected stand-alone computers working together cooperatively. Using Raspberry Pi computers, you can build a two-node parallel computing cluster that enhances performance and availability. This practical, example-oriented guide will teach you how to set up the hardware and operating systems of multiple Raspberry Pi computers to create your own cluster. It will then walk you through installing the necessary software, such as Hadoop and MPICH, so that you can write your own programs, before moving on to cover topics such as MapReduce. Throughout this book, you will explore the technology with the help of practical examples and tutorials that help you learn quickly and efficiently.

Starting from a pile of hardware, you will be guided through exciting tutorials that turn it into your own super-computing cluster. You'll start out by learning how to set up your Raspberry Pi cluster's hardware. Following this, you will be taken through installing the operating system, and you will also be given a taste of what parallel computing is about. With your Raspberry Pi cluster successfully set up, you will then install software such as MPI and Hadoop. Having reviewed some examples and written some programs that explore these two technologies, you will wrap up with some fun ancillary projects. Finally, you will be provided with useful links to help take your projects to the next step.

A brief introduction to Apache Hadoop

Apache Hadoop is an open-source framework for developing distributed applications, hosted by the Apache Software Foundation. The framework contains a number of subprojects; the one we are interested in is Hadoop Core, also known as Hadoop Common.

The Hadoop Common project sits at the heart of the overall Hadoop framework. It enables the development of cloud computing environments on off-the-shelf hardware, such as the Raspberry Pi, and developers interact with it through its Java-based API.
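To give a flavour of what working with this Java API looks like, here is a minimal sketch that connects to HDFS and lists a directory. This is an illustrative example rather than code from the book: the NameNode address (hdfs://localhost:9000) and the /user path are placeholders that you would replace with your own cluster's values.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsHello {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point the client at the cluster's NameNode (placeholder address).
            conf.set("fs.defaultFS", "hdfs://localhost:9000");
            FileSystem fs = FileSystem.get(conf);
            // List the contents of an HDFS directory.
            for (FileStatus status : fs.listStatus(new Path("/user"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }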

Within Hadoop Common there are several significant areas that help us achieve our goal of developing parallel computing applications. Two of the most important areas are as follows:

  • Hadoop MapReduce environment (a short example follows this list)

  • Hadoop Distributed File System (HDFS)
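
The following sketch gives a taste of the MapReduce programming model using the classic word-count example, which Chapter 5 explores in depth. It assumes the standard org.apache.hadoop.mapreduce API; the class names TokenMapper and SumReducer are illustrative, not part of Hadoop itself.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // The map step emits (word, 1) for every word in its input split.
    class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // The reduce step sums the counts emitted for each word across the cluster.
    class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

The framework handles the plumbing between the two steps: it splits the input across the nodes, runs the mapper on each split, groups the intermediate pairs by key, and feeds each group to the reducer.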

In this chapter, both subjects will be touched upon during the installation and setup process, and Chapter 5, MapReduce Applications with Hadoop and Java, provides an in-depth look at HDFS and MapReduce...