Modern Data Architectures with Python

By Brian Lipp
Overview of this book

Modern Data Architectures with Python will teach you how to seamlessly incorporate your machine learning and data science workstreams into your open data platforms. You’ll learn how to take your data and create open lakehouses that work with any technology using tried-and-true techniques, including the medallion architecture and Delta Lake. Starting with the fundamentals, this book will help you build pipelines on Databricks, an open data platform, using SQL and Python. You’ll gain an understanding of notebooks and applications written in Python using standard software engineering tools such as Git, pre-commit, Jenkins, and GitHub. Next, you’ll delve into streaming and batch-based data processing using Apache Spark and Confluent Kafka. As you advance, you’ll learn how to deploy your resources using infrastructure as code and how to automate your workflows and code development. Since any data platform’s ability to handle and work with AI and ML is a vital component, you’ll also explore the basics of ML and how to work with modern MLOps tooling. Finally, you’ll get hands-on experience with Apache Spark, one of the key data technologies in today’s market. By the end of this book, you’ll have amassed a wealth of practical and theoretical knowledge to build, manage, orchestrate, and architect your data ecosystems.
Table of Contents (19 chapters)
Part 1: Fundamental Data Knowledge
Part 2: Data Engineering Toolset
Part 3: Modernizing the Data Platform
Part 4: Hands-on Project

Databricks Workflows

Now that we’ve gone through the YAML deployment of workflows in dbx, we will look at the web console. On the main Workflows page, we can create a new workflow by clicking the Create job button at the top left:

Figure 9.1: Create job
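
As a quick refresher on the dbx side, a workflow like the one created here in the console can equally be described declaratively in the project’s deployment YAML. The sketch below is illustrative only: all names are hypothetical, and the exact keys depend on your dbx version (older releases use a jobs block instead of workflows):

# conf/deployment.yml: a minimal, hypothetical example of a dbx deployment file
environments:
  default:
    workflows:
      - name: "my_workflow"
        job_clusters:
          - job_cluster_key: "main_cluster"
            new_cluster:
              spark_version: "11.3.x-scala2.12"
              node_type_id: "i3.xlarge"
              num_workers: 1
        tasks:
          - task_key: "etl_step"
            job_cluster_key: "main_cluster"
            python_wheel_task:
              package_name: "my_package"
              entry_point: "etl_entry"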

When you create a workflow, you will be presented with a diagram of the workflow and a menu for each step:

Figure 9.2: My_workflow

Be sure to match the package name and entry point with what is defined in setup.py if you’re using a package:

Figure 9.3: Workflow diagram
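
For example, if the workflow’s Python wheel task references the package my_package with the entry point etl_entry (both hypothetical names used only for illustration), setup.py has to expose a matching console script, roughly like this:

# setup.py: a minimal sketch; the name and entry point must match the
# package_name and entry_point configured in the workflow task.
from setuptools import setup, find_packages

setup(
    name="my_package",
    version="0.1.0",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # "etl_entry" is what the workflow task calls; it maps to the
            # main() function in my_package/jobs/etl.py.
            "etl_entry = my_package.jobs.etl:main",
        ],
    },
)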

When you run your workflow, you will see each instance run, its status, and its start time:

Figure 9.4: Workflow run
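
The same run information is also available outside the console. As a sketch, assuming you have a personal access token and the numeric job ID shown in the console, the Jobs REST API (version 2.1) can list recent runs along with their states and start times:

import os

import requests

# Assumes DATABRICKS_HOST (the workspace URL) and DATABRICKS_TOKEN are set in
# the environment; the job ID below is a hypothetical placeholder.
host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]
job_id = 123

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id, "limit": 5},
)
resp.raise_for_status()

for run in resp.json().get("runs", []):
    state = run.get("state", {})
    print(
        run["run_id"],
        state.get("life_cycle_state"),  # for example, RUNNING or TERMINATED
        state.get("result_state"),      # for example, SUCCESS or FAILED
        run.get("start_time"),          # epoch milliseconds
    )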

Here is an example of a two-step workflow that has failed:

Figure 9.5: Workflow flow

You can see your failed runs individually in the console:

Figure 9.6: Workflow run failed
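
The error message and traceback behind a failed task can also be pulled programmatically rather than read from the console. This is a minimal sketch, assuming you have the run ID of the individual failed task (in a multi-task workflow, each task has its own run ID):

import os

import requests

# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set; the run ID is a
# hypothetical placeholder for a single failed task run.
host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]
task_run_id = 456

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/get-output",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": task_run_id},
)
resp.raise_for_status()
output = resp.json()

# For a failed run, the response typically includes the error message and,
# where available, a traceback.
print(output.get("error"))
print(output.get("error_trace"))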
...