Securing Hadoop

By: Sudheesh Narayan

Overview of this book

Security is one of the biggest Big Data concerns for enterprises today. How do we protect sensitive information in a Hadoop ecosystem? How can we integrate Hadoop security with existing enterprise security systems? What are the challenges in securing Hadoop and its ecosystem? These questions need to be answered in order to manage Big Data effectively. Hadoop, along with Kerberos, provides security features that make this possible. This book is a practitioner's guide to securing a Hadoop-based Big Data platform. It gives you a step-by-step approach to implementing end-to-end security, along with a solid grounding in the Hadoop and Kerberos security models. This practical, hands-on guide looks at the challenges involved in protecting sensitive data on a Hadoop-based Big Data platform and covers the Security Reference Architecture for securing Big Data. It takes you through the internals of the Hadoop and Kerberos security models, provides detailed implementation steps for securing Hadoop, and shows how to integrate enterprise security systems with Hadoop security and how to manage and control user access to the Hadoop ecosystem. You will also get acquainted with implementing audit logging and security incident monitoring within a Big Data platform.

Different Hadoop data encryption options


Let us look at the various options available for encrypting data in a Hadoop cluster.

Dataguise for Hadoop

Dataguise (DG) for Hadoop provides symmetric-key-based encryption of data. One of its key features is the ability to identify sensitive data and protect it with encryption or masking. It supports encryption of data ingested through the Hadoop API, Sqoop, and Flume, so it can be used to encrypt data moving into and out of the Hadoop ecosystem. Administrators can also schedule scans of the Hadoop ecosystem at regular intervals to detect sensitive data and encrypt or mask it. More details on Dataguise are available at http://dataguise.com/products/dghadoop.html.
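Dataguise's own interfaces are proprietary and not shown here. As a generic illustration of the underlying idea, the following minimal sketch encrypts a file with a symmetric AES-256 key before loading it into HDFS, using only the standard openssl and hadoop fs command-line tools; the key file ./aes.key and the HDFS path /data/raw/ are placeholders for this example.

# Encrypt the file locally with a symmetric key before it enters the cluster
openssl enc -aes-256-cbc -salt -in customers.csv -out customers.csv.enc -pass file:./aes.key
hadoop fs -put customers.csv.enc /data/raw/

# Retrieve and decrypt it later with the same key
hadoop fs -get /data/raw/customers.csv.enc .
openssl enc -d -aes-256-cbc -in customers.csv.enc -out customers.csv -pass file:./aes.key

In a real deployment, the key itself must be protected and distributed through a key management system rather than kept as a plain file alongside the data.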

Gazzang zNcrypt

Gazzang zNcrypt provides transparent block-level encryption along with the ability to manage the keys used for encryption. zNcrypt acts as a virtual filesystem that intercepts application-layer requests to access files and encrypts each block as it is written to disk. It leverages Intel AES-NI hardware acceleration for maximum cryptographic performance, and it provides role-based access control and policy-based management of the encryption keys. This can be used to implement security for multiple classification levels in a secured Hadoop cluster.

eCryptfs for Hadoop

eCryptfs is a stacked cryptographic filesystem for Linux. It stores cryptographic metadata in the header of each encrypted file, so when encrypted files are copied between hosts, they can be decrypted with the proper key held in the Linux kernel keyring. We can set up a secured Hadoop cluster with eCryptfs on each node; this allows data to be shared transparently between nodes while ensuring that all data is encrypted before it is written to disk.

More information on eCryptfs is available at the following link: https://launchpad.net/ecryptfs.
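As a minimal sketch of such a setup, assuming the DataNode stores its blocks under /data/dfs and the ecryptfs-utils package is installed, the data directory can be mounted over itself with eCryptfs on each node, so Hadoop keeps using the same path while everything beneath it is encrypted on disk:

# Mount the Hadoop data directory over itself as an eCryptfs layer (run on every node)
sudo mount -t ecryptfs /data/dfs /data/dfs

# The mount prompts for a passphrase, cipher (for example, aes), and key size;
# options such as ecryptfs_cipher=aes,ecryptfs_key_bytes=32 can also be passed
# with -o for unattended mounts, typically from /etc/fstab or a configuration
# management tool.

The Hadoop daemons should be stopped before the mount is applied and restarted afterwards, so that they reopen their files through the encrypted layer.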