Data Engineering with Google Cloud Platform

By: Adi Wijaya

Overview of this book

With this book, you'll understand how the highly scalable Google Cloud Platform (GCP) enables data engineers to create end-to-end data pipelines, from storing and processing data and orchestrating workflows to presenting data through visualization dashboards. Starting with a quick overview of the fundamental concepts of data engineering, you'll learn the various responsibilities of a data engineer and how GCP plays a vital role in fulfilling those responsibilities. As you progress through the chapters, you'll be able to leverage GCP products to build a sample data warehouse using Cloud Storage and BigQuery and a data lake using Dataproc. The book gradually takes you through operations such as data ingestion, data cleansing, transformation, and integrating data with other sources. You'll learn how to design IAM for data governance, deploy ML pipelines with Vertex AI, leverage pre-built GCP models as a service, and visualize data with Google Data Studio to build compelling reports. Finally, you'll find tips on how to boost your career as a data engineer, take the Professional Data Engineer certification exam, and get ready to become an expert in data engineering with GCP. By the end of this data engineering book, you'll have developed the skills to perform core data engineering tasks and build efficient ETL data pipelines with GCP.
Table of Contents (17 chapters)

Section 1: Getting Started with Data Engineering with GCP
Section 2: Building Solutions with GCP Components
Section 3: Key Strategies for Architecting Top-Notch Data Pipelines

Exercise – deploying Cloud Composer jobs using Cloud Build

In this section, we will continue working with Cloud Build pipelines. This time, I will show you how this practice applies to data engineering: we will create a CI/CD pipeline that deploys a Cloud Composer DAG.
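
To set expectations before we walk through the actual steps, here is a minimal sketch of what a Cloud Build configuration for this kind of deployment can look like. It simply syncs the repository's dags folder into the Composer environment's DAG bucket; the bucket name is a placeholder, and the pipeline we build in this exercise may differ in its exact steps.

```yaml
# cloudbuild.yaml - a minimal sketch, not the exercise's final pipeline.
steps:
  # Sync the repository's dags/ folder to the Composer environment's
  # DAG folder in Cloud Storage. Composer automatically picks up new
  # or changed DAG files from this bucket.
  - name: gcr.io/cloud-builders/gsutil
    args:
      - '-m'
      - 'rsync'
      - '-r'
      - 'dags'
      - 'gs://<your-composer-dag-bucket>/dags'
```

With a file like this in the repository, a Cloud Build trigger on the main branch is enough to redeploy the DAGs on every push, which is the behavior we are aiming for in this exercise.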

In this exercise, we will use the DAG from Chapter 4, Building Orchestration for Batch Data Loading Using Cloud Composer. Let's briefly recap the exercises from that chapter.

In Chapter 4, Building Orchestration for Batch Data Loading Using Cloud Composer, we learned how Cloud Composer works and how to develop DAGs that define data pipelines. These pipelines can use Airflow operators to manage BigQuery, Cloud SQL, and GCS, or to run simple bash scripts. In those exercises, we practiced five levels of DAGs, with the level one DAG being the simplest and the level five DAG the most complex. To deploy a DAG...
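
As a quick refresher on what such a DAG looks like, the following is a minimal, hypothetical level one-style DAG that only runs a bash command on a schedule (assuming an Airflow 2 Composer environment; the DAG ID, schedule, and command are illustrative, not the exact values from the Chapter 4 exercises):

```python
# A minimal sketch of a level one-style DAG: a single task running a bash command.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="level_1_dag",            # illustrative DAG ID
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",      # run once per day
    catchup=False,
) as dag:
    print_hello = BashOperator(
        task_id="print_hello",
        bash_command="echo 'Hello from Cloud Composer'",
    )
```

The higher-level DAGs from Chapter 4 follow the same structure; they simply replace this single bash task with operators that load data into BigQuery and chain more tasks together.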