Modern Data Architecture on AWS

By: Behram Irani
Overview of this book

Many IT leaders and professionals are adept at extracting data from a particular type of database and deriving value from it. However, designing and implementing an enterprise-wide holistic data platform with purpose-built data services, all seamlessly working in tandem with the least amount of manual intervention, still poses a challenge. This book will help you explore end-to-end solutions to common data, analytics, and AI/ML use cases by leveraging AWS services. The chapters systematically take you through all the building blocks of a modern data platform, including data lakes, data warehouses, data ingestion patterns, data consumption patterns, data governance, and AI/ML patterns. Using real-world use cases, each chapter highlights the features and functionalities of numerous AWS services to enable you to create a scalable, flexible, performant, and cost-effective modern data platform. By the end of this book, you’ll be equipped with all the necessary architectural patterns and be able to apply this knowledge to efficiently build a modern data platform for your organization using AWS services.
Table of Contents (24 chapters)

Part 1: Foundational Data Lake
Part 2: Purpose-Built Services and Unified Data Access
Part 3: Govern, Scale, Optimize, and Operationalize

Data mesh on an Amazon S3-based data lake

If you recall from the previous chapter on data governance, we used AWS Lake Formation (LF) to provide fine-grained access control, via the AWS Glue Data Catalog, to data residing in the S3 data lake. The same LF permissions mechanism can also be used to share data across AWS accounts, which opens the door to a true data mesh architecture in which the data lake no longer has to be a central repository for the whole enterprise. Instead, each line of business (LOB) can establish its own data lake on S3 inside its own AWS account. Some LOB accounts act as data owners: they produce, store, and consume their own data for analytics purposes from their own S3 data lake. When one LOB needs access to datasets that belong to another LOB, instead of copying data around, the producer and consumer LOBs can leverage LF's cross-account sharing mechanism.
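To make the cross-account sharing step concrete, here is a minimal sketch of how a producer LOB might grant a consumer account SELECT access on one of its Glue Data Catalog tables through Lake Formation's `grant_permissions` API. The account IDs, database name, and table name are hypothetical placeholders; the sketch only builds and prints the request payload, with the actual boto3 call shown in a comment, since running it requires a real producer account with LF admin rights.

```python
# Sketch: cross-account table sharing with AWS Lake Formation.
# All account IDs and catalog names below are illustrative placeholders.
import json

PRODUCER_ACCOUNT = "111122223333"   # hypothetical LOB that owns the S3 data lake
CONSUMER_ACCOUNT = "444455556666"   # hypothetical LOB that needs read access


def build_cross_account_grant(database: str, table: str) -> dict:
    """Build the request for lakeformation.grant_permissions, granting the
    consumer account SELECT on a Glue Data Catalog table in the producer
    account's catalog."""
    return {
        # The external account is the principal receiving the grant.
        "Principal": {"DataLakePrincipalIdentifier": CONSUMER_ACCOUNT},
        "Resource": {
            "Table": {
                "CatalogId": PRODUCER_ACCOUNT,   # catalog that owns the table
                "DatabaseName": database,
                "Name": table,
            }
        },
        "Permissions": ["SELECT"],
        # Empty grant option: the consumer can query but not re-share the table.
        "PermissionsWithGrantOption": [],
    }


request = build_cross_account_grant("sales_db", "orders")
print(json.dumps(request, indent=2))

# In the producer account, with Lake Formation admin rights, the grant
# would be issued as:
#   boto3.client("lakeformation").grant_permissions(**request)
```

On the consumer side, the shared table typically arrives as an AWS Resource Access Manager (RAM) invitation; once accepted, the consumer creates a resource link in its own Glue Data Catalog and queries the data in place, with no copies made.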

Let’s introduce the use case for implementing...