Modern Data Architecture on AWS

By: Behram Irani

Overview of this book

Many IT leaders and professionals are adept at extracting data from a particular type of database and deriving value from it. However, designing and implementing an enterprise-wide holistic data platform with purpose-built data services, all seamlessly working in tandem with the least amount of manual intervention, still poses a challenge. This book will help you explore end-to-end solutions to common data, analytics, and AI/ML use cases by leveraging AWS services. The chapters systematically take you through all the building blocks of a modern data platform, including data lakes, data warehouses, data ingestion patterns, data consumption patterns, data governance, and AI/ML patterns. Using real-world use cases, each chapter highlights the features and functionalities of numerous AWS services to enable you to create a scalable, flexible, performant, and cost-effective modern data platform. By the end of this book, you’ll be equipped with all the necessary architectural patterns and be able to apply this knowledge to efficiently build a modern data platform for your organization using AWS services.
Table of Contents (24 chapters)

Part 1: Foundational Data Lake
Part 2: Purpose-Built Services and Unified Data Access
Part 3: Govern, Scale, Optimize, and Operationalize

Sensitive data discovery with Amazon Macie

In the previous section, we saw how AWS Lake Formation helps with access control, which is a vital piece of data governance. When datasets contain confidential or sensitive data, you can use Lake Formation to selectively grant access to only certain columns by tagging those columns and granting permissions through the tags.
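
As a rough illustration of this tag-based approach, the following minimal sketch uses boto3 to define an LF-Tag, attach it to the confidential columns of a table, and grant SELECT on anything carrying that tag. The database, table, column, and role names are hypothetical placeholders:

import boto3

lf = boto3.client("lakeformation")

# Define an LF-Tag that marks columns by data classification
# (tag key/values and all resource names below are hypothetical).
lf.create_lf_tag(TagKey="classification", TagValues=["sensitive", "public"])

# Attach the tag to the confidential columns of a table in the Glue Data Catalog.
lf.add_lf_tags_to_resource(
    Resource={
        "TableWithColumns": {
            "DatabaseName": "sales_db",
            "Name": "customers",
            "ColumnNames": ["ssn", "email"],
        }
    },
    LFTags=[{"TagKey": "classification", "TagValues": ["sensitive"]}],
)

# Grant SELECT only on resources tagged classification=sensitive
# to the role that is allowed to see this data.
lf.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/analysts"
    },
    Resource={
        "LFTagPolicy": {
            "ResourceType": "TABLE",
            "Expression": [
                {"TagKey": "classification", "TagValues": ["sensitive"]}
            ],
        }
    },
    Permissions=["SELECT"],
)

Only principals granted permissions through a matching tag expression can then query the tagged columns.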

The big assumption we made was that the data stewards of the data lake are already aware of all the confidential data it contains, along with the S3 buckets and file names where that data resides. In a large data lake implementation with many contributing source systems, finding sensitive data and classifying it accordingly is like finding a needle in a haystack.
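
This discovery problem is exactly what Amazon Macie automates. As a minimal sketch, assuming Macie is already enabled in the account, a one-time classification job could be launched against a data lake bucket with boto3; the account ID, bucket name, and job name below are hypothetical placeholders:

import uuid
import boto3

macie = boto3.client("macie2")

# Start a one-time sensitive data discovery job over the raw data lake bucket,
# using Macie's managed data identifiers (PII, credentials, financial data, etc.).
macie.create_classification_job(
    clientToken=str(uuid.uuid4()),
    jobType="ONE_TIME",
    name="data-lake-sensitive-data-scan",
    managedDataIdentifierSelector="ALL",
    s3JobDefinition={
        "bucketDefinitions": [
            {"accountId": "123456789012", "buckets": ["my-data-lake-raw"]}
        ]
    },
)

# After the job completes, findings report which objects contain sensitive data
# (this lists recent findings across the account).
finding_ids = macie.list_findings()["findingIds"]
if finding_ids:
    for finding in macie.get_findings(findingIds=finding_ids)["findings"]:
        print(finding["resourcesAffected"]["s3Object"]["key"], finding["type"])

The resulting findings can then drive the tagging and permission grants shown earlier.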

Many use cases require that data assets be classified and tagged so that permissions can be granted only to the personas who should have access to the data. Doing this also ensures that such sensitive data is tracked as it migrates...