Securing Hadoop

By: Sudheesh Narayan

Overview of this book

Security of Big Data is one of the biggest concerns for enterprises today. How do we protect the sensitive information in a Hadoop ecosystem? How can we integrate Hadoop security with existing enterprise security systems? What are the challenges in securing Hadoop and its ecosystem? These questions need to be answered in order to manage Big Data effectively. Hadoop, along with Kerberos, provides security features that enable Big Data management while keeping data secure. This book is a practitioner's guide to securing a Hadoop-based Big Data platform. It provides a step-by-step approach to implementing end-to-end security, along with a solid foundation in the Hadoop and Kerberos security models. This practical, hands-on guide looks at the challenges involved in securing sensitive data on a Hadoop-based Big Data platform and covers the Security Reference Architecture for securing Big Data. It takes you through the internals of the Hadoop and Kerberos security models and provides detailed implementation steps for securing Hadoop. You will also learn how to integrate enterprise security systems with Hadoop security, how to manage and control user access to the Hadoop ecosystem seamlessly, and how to implement audit logging and security incident monitoring within a Big Data platform.

Challenges for securing the Hadoop ecosystem


Big Data brings challenges not only for storing, processing, and analyzing data, but also for managing and securing these large data assets. Hadoop was not built with security in mind. As enterprises started adopting Hadoop, a Kerberos-based security model evolved within it. But given the distributed nature of the ecosystem and the wide range of applications built on top of Hadoop, securing Hadoop from an enterprise context is a big challenge.

A typical Big Data ecosystem has multiple stakeholders who interact with the system. For example, expert users (business analysts and data scientists) within the organization interact with the ecosystem using business intelligence (BI) and analytical tools, and need deep access to the data to perform various analyses. That access must still be constrained by role: a business analyst in the finance department, for instance, should not be able to see data belonging to the HR department. BI tools, in turn, need varying levels of system access to the Hadoop ecosystem, depending on the protocol and data they use to communicate with it.
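One common way to enforce such departmental separation at the storage layer is through HDFS permissions and ACLs (available when dfs.namenode.acls.enabled is set to true). The following is a minimal, illustrative Java sketch using the Hadoop FileSystem API; the directory layout, group name, and policy are hypothetical, and real deployments would typically manage such rules through a dedicated authorization layer such as Apache Sentry rather than ad hoc code:

    import java.util.Arrays;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.AclEntry;
    import org.apache.hadoop.fs.permission.AclEntryScope;
    import org.apache.hadoop.fs.permission.AclEntryType;
    import org.apache.hadoop.fs.permission.FsAction;

    public class DepartmentAcls {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());

            // Hypothetical layout: each department owns its own data directory.
            Path financeData = new Path("/data/finance");

            // Grant the finance analysts group read access to finance data;
            // everyone outside the owning user/groups gets no access at all.
            fs.modifyAclEntries(financeData, Arrays.asList(
                new AclEntry.Builder()
                    .setScope(AclEntryScope.ACCESS)
                    .setType(AclEntryType.GROUP)
                    .setName("finance-analysts")
                    .setPermission(FsAction.READ_EXECUTE)
                    .build(),
                new AclEntry.Builder()
                    .setScope(AclEntryScope.ACCESS)
                    .setType(AclEntryType.OTHER)
                    .setPermission(FsAction.NONE)
                    .build()));
        }
    }

The same policy could equally be applied from the command line with hdfs dfs -setfacl; the point is that directory-level ACLs give each department its own boundary without duplicating data.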

One of the biggest challenges for Big Data projects within enterprises today is securely integrating external data sources (social blogs, websites, existing ERP and CRM systems, and so on). This external connectivity needs to be established so that the data extracted from these external sources is available in the Hadoop ecosystem.
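A recurring concern with this connectivity is credential handling: ingestion jobs should not embed database passwords in scripts or plaintext job configurations. Hadoop's CredentialProvider API lets a job resolve secrets from a protected keystore at runtime instead. The following is a minimal sketch; the alias and keystore path are hypothetical:

    import java.util.Arrays;

    import org.apache.hadoop.conf.Configuration;

    public class ExternalSourcePassword {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Hypothetical keystore, created beforehand with something like:
            //   hadoop credential create crm.db.password -provider jceks://hdfs/user/etl/ingest.jceks
            conf.set("hadoop.security.credential.provider.path",
                     "jceks://hdfs/user/etl/ingest.jceks");

            // Resolves the alias from the keystore rather than a plaintext value.
            char[] password = conf.getPassword("crm.db.password");
            try {
                // ... hand the password to the JDBC/ingestion client here ...
            } finally {
                // Clear the secret from memory once the connection is established.
                Arrays.fill(password, '\0');
            }
        }
    }

Tools such as Sqoop can consume credentials supplied this way (or via a password file), which keeps the secret out of shell history and job configurations.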

Hadoop ecosystem tools such as Sqoop and Flume were not originally built with full enterprise-grade security. Cloudera, MapR, and a few others have made significant contributions towards making these ecosystem components enterprise grade, resulting in Sqoop 2, Flume-ng, and Hive Server 2. Apart from these, there are multiple security-focused projects within the Hadoop ecosystem, such as Cloudera Sentry (http://www.cloudera.com/content/cloudera/en/products/cdh/sentry.html), Hortonworks Knox Gateway (http://hortonworks.com/hadoop/knox-gateway/), and Intel's Project Rhino (https://github.com/intel-hadoop/project-rhino/). These projects are making significant progress towards providing enterprise-grade security in Apache Hadoop. A detailed understanding of each of these ecosystem components is needed to deploy them in production.

Another area of concern within enterprises is the need to integrate existing enterprise Identity and Access Management (IDAM) systems with the Hadoop ecosystem. With such integration, enterprises can extend their Identity and Access Management policies to the Hadoop ecosystem. However, these integrations bring multiple challenges, as Hadoop was not inherently built with such enterprise integrations in mind.
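At the core of most such integrations is Kerberos: enterprise identities (for example, principals synchronized from Active Directory into a KDC) are used to authenticate to Hadoop services. As a minimal illustration, a service application can authenticate with a keytab through Hadoop's UserGroupInformation API; the principal name and keytab path below are hypothetical:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosLogin {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Switch Hadoop from the default "simple" authentication to Kerberos.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Hypothetical principal and keytab, provisioned from the
            // enterprise KDC or Active Directory.
            UserGroupInformation.loginUserFromKeytab(
                "etl-svc@EXAMPLE.COM",
                "/etc/security/keytabs/etl-svc.keytab");

            // Subsequent Hadoop RPC calls run as the authenticated principal.
            FileSystem fs = FileSystem.get(conf);
            System.out.println(fs.exists(new Path("/data")));
        }
    }

The mapping from Kerberos principals to local users and groups (and from there to authorization decisions) is exactly where most of the enterprise-integration complexity lies.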

Apart from ecosystem integration, there is often a need to bring sensitive information into the Big Data ecosystem in order to derive patterns and inferences from these datasets. As we move these datasets into the Big Data ecosystem, we need to mask or encrypt the sensitive information. Traditional data masking and encryption tools don't scale well to Big Data volumes, so we need to identify new means of masking and encrypting large-scale datasets.
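One approach that does scale is to perform the masking inside the cluster itself, as part of a distributed job, so the work parallelizes with the data. The sketch below illustrates the idea with a Hadoop MapReduce mapper that replaces a sensitive column (here, a hypothetical SSN field in comma-separated records) with a one-way hash during ingestion; the record layout and field position are assumptions:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Masks the third column (a hypothetical SSN field) of comma-separated
    // records with a SHA-256 hash, so analysts can still join on the value
    // without seeing the raw identifier.
    public class MaskingMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final int SSN_FIELD = 2;

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",", -1);
            if (fields.length > SSN_FIELD) {
                fields[SSN_FIELD] = sha256Hex(fields[SSN_FIELD]);
            }
            context.write(NullWritable.get(), new Text(String.join(",", fields)));
        }

        private static String sha256Hex(String input) throws IOException {
            try {
                MessageDigest digest = MessageDigest.getInstance("SHA-256");
                StringBuilder hex = new StringBuilder();
                for (byte b : digest.digest(input.getBytes(StandardCharsets.UTF_8))) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            } catch (Exception e) {
                throw new IOException(e);
            }
        }
    }

Note that an unsalted hash of a low-entropy identifier such as an SSN can be reversed by brute force; a production job would use a keyed hash (for example, an HMAC with a secret pulled from a credential store) or format-preserving encryption instead.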

Usually, as the adoption of Big Data increases, enterprises quickly move to a multi-cluster, multi-version scenario, where multiple versions of the Hadoop ecosystem operate within the enterprise. Also, sensitive data that was earlier kept off the Big Data platform slowly makes its way in. This brings additional challenges in addressing security in such a complex environment, as a small lapse in security could result in a huge financial loss for the organization.