Data Lake Development with Big Data

By : Pradeep Pasupuleti, Beulah Salome Purra
Overview of this book

A Data Lake is a highly scalable platform for storing huge volumes of multistructured data from disparate sources, with centralized data management services. This book explores the potential of Data Lakes and the architectural approaches to building them, so that they can ingest, index, manage, and analyze massive amounts of data using batch and real-time processing frameworks. It guides you through building a Data Lake that is managed by Hadoop and accessed as required by other Big Data applications, and shares best practices for developing the Data Lake's capabilities. It focuses on architecting Data Governance, security, data quality, data lineage tracking, metadata management, and semantic data tagging. By the end of this book, you will have a good understanding of how to build a Data Lake for Big Data.
Table of Contents (13 chapters)

Data Governance components

Data Governance comprises metadata management and lineage tracking, Data Security and privacy, and Information Lifecycle Management. These common components cut across the Data Intake, management, and consumption tiers of the Data Lake. The following sections explore each of these components in detail.

Metadata management and lineage tracking

Big Data often relies on extracting value from huge volumes of unstructured data. The first thing we do after this data enters the Data Lake is classify it and "understand" it by extracting its metadata. Metadata is the fundamental building block on which the success of any Data Governance endeavor depends.

Metadata captures vital information about the data as it enters the Data Lake and indexes this information while it is stored so that users can search for metadata before they access the data and perform any manipulation on it. Metadata capture is fundamental to make data more accessible and extract...
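The capture-then-index flow described above can be sketched in a few lines of Python. This is a minimal illustration, not the book's implementation: the field names, the `capture_metadata`/`register`/`search` helpers, and the in-memory catalog are all hypothetical stand-ins for a real metadata repository such as the one a Hadoop Data Lake would use.

```python
import hashlib
import os
import time


def capture_metadata(path, source_system):
    """Capture basic technical metadata for a file as it enters the lake.

    The schema here is illustrative; a production catalog would record
    far richer business and operational metadata.
    """
    stat = os.stat(path)
    with open(path, "rb") as f:
        checksum = hashlib.sha256(f.read()).hexdigest()
    return {
        "file_name": os.path.basename(path),
        "source_system": source_system,
        "size_bytes": stat.st_size,
        "ingested_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "sha256": checksum,  # supports integrity checks and lineage tracking
    }


# A toy in-memory "catalog": metadata is indexed here so that users can
# search it before touching the underlying data.
catalog = []


def register(record):
    catalog.append(record)


def search(**criteria):
    """Return catalog entries whose fields match all given criteria."""
    return [
        r for r in catalog
        if all(r.get(key) == value for key, value in criteria.items())
    ]
```

In this sketch, a consumer would call `search(source_system="crm")` to discover what has landed from a given source before reading any files, which is exactly the search-before-access pattern the paragraph above describes.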