Data Engineering with Google Cloud Platform

By Adi Wijaya
Overview of this book

With this book, you'll understand how the highly scalable Google Cloud Platform (GCP) enables data engineers to create end-to-end data pipelines, from storing and processing data and orchestrating workflows to presenting data through visualization dashboards. Starting with a quick overview of the fundamental concepts of data engineering, you'll learn the various responsibilities of a data engineer and how GCP plays a vital role in fulfilling those responsibilities.

As you progress through the chapters, you'll leverage GCP products to build a sample data warehouse using Cloud Storage and BigQuery, and a data lake using Dataproc. The book gradually takes you through operations such as data ingestion, data cleansing, transformation, and integrating data with other sources. You'll learn how to design IAM for data governance, deploy ML pipelines with Vertex AI, leverage pre-built GCP models as a service, and visualize data with Google Data Studio to build compelling reports.

Finally, you'll find tips on how to boost your career as a data engineer, take the Professional Data Engineer certification exam, and get ready to become an expert in data engineering with GCP. By the end of this data engineering book, you'll have developed the skills to perform core data engineering tasks and build efficient ETL data pipelines with GCP.
Table of Contents (17 chapters)

  • Section 1: Getting Started with Data Engineering with GCP
  • Section 2: Building Solutions with GCP Components
  • Section 3: Key Strategies for Architecting Top-Notch Data Pipelines

Exercise: Build data pipeline orchestration using Cloud Composer

We will continue our bike-sharing scenario from Chapter 3, Building a Data Warehouse in BigQuery.

This exercise will be divided into five different DAG levels. Each DAG level will have specific learning objectives, as follows:

  • Level 1: Learn how to create a DAG and submit it to Cloud Composer.
  • Level 2: Learn how to create a BigQuery DAG.
  • Level 3: Learn how to use variables.
  • Level 4: Learn how to apply task idempotency.
  • Level 5: Learn how to handle late data.

It's important to understand that getting started with Airflow is as easy as writing the Level 1 DAG, but as we go through the higher levels, you will see the challenges that arise when applying it in practice.

In reality, you can choose to follow all of the best practices or none of them; Airflow won't forbid either. Using this leveling approach, you can learn step by step, from the simplest DAG to the most complicated...
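As a preview of one of those best practices, the Level 4 objective (task idempotency) usually means keying a task's output to its execution date and overwriting that slice on reruns, so running the same task twice for the same date yields exactly the same result. A minimal sketch in plain Python, with a dictionary standing in for storage and a made-up table name (in BigQuery you would target a date partition such as `bike_trips$20210101` with a truncate-style write disposition):

```python
from datetime import date

# Simulated storage: partition name -> rows.
storage = {}

def load_trips(execution_date: date, rows: list) -> str:
    """Idempotent load: overwrite the partition for this execution date."""
    partition = f"bike_trips${execution_date:%Y%m%d}"
    storage[partition] = list(rows)   # overwrite, never append
    return partition

# Rerunning the task for the same date leaves exactly one copy of the data.
p1 = load_trips(date(2021, 1, 1), [{"trip_id": 1}])
p2 = load_trips(date(2021, 1, 1), [{"trip_id": 1}])
assert p1 == p2 and len(storage[p1]) == 1
```

The key design choice is that the destination is derived deterministically from the execution date, so a retry or backfill replaces its own earlier output instead of duplicating it.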