Azure Data Factory Cookbook

By: Dmitry Anoshin, Dmitry Foshin, Roman Storchak, Xenia Ireton

Overview of this book

Azure Data Factory (ADF) is a modern data integration tool available on Microsoft Azure. This Azure Data Factory Cookbook helps you get up and running by showing you how to create and execute your first job in ADF. You'll learn how to branch and chain activities, create custom activities, and schedule pipelines. This book will help you to discover the benefits of cloud data warehousing, Azure Synapse Analytics, and Azure Data Lake Gen2 Storage, which are frequently used for big data analytics. With practical recipes, you'll learn how to actively engage with analytical tools from Azure Data Services and leverage your on-premises infrastructure with cloud-native tools to get relevant business insights. As you advance, you'll be able to integrate the most commonly used Azure services into ADF and understand how they can be useful in designing ETL pipelines. The book will take you through the common errors that you may encounter while working with ADF and show you how to use the Azure portal to monitor pipelines. You'll also learn to interpret error messages and resolve problems in connectors and data flows with the debugging capabilities of ADF. By the end of this book, you'll be able to use ADF as the main ETL and orchestration tool for your data warehouse or data platform projects.

Connecting Azure Data Lake to Azure Data Factory and loading data

Moving data is one of the most common tasks a data engineer performs. In this recipe, we will connect Azure Data Factory to external storage (Azure Blob Storage) and move the Chicago Safety Data dataset to the Azure Data Lake Gen2 account that we set up in the previous recipe.
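Under the hood, the Copy Data tool used in this recipe generates a pipeline definition containing a single Copy activity that reads from a Blob source and writes to an ADLS Gen2 sink. The following is a minimal sketch of what that JSON looks like, built with plain Python; the pipeline, activity, and dataset names (`CopyChicagoSafetyData`, `ChicagoSafetySource`, `AdlsGen2Sink`) are illustrative placeholders, not names from the book.

```python
import json


def copy_pipeline(source_dataset: str, sink_dataset: str) -> dict:
    """Sketch an ADF pipeline definition with one Copy activity.

    The dataset reference names are hypothetical; in a real factory they
    must match datasets you have already defined. "BlobSource" and
    "AzureBlobFSSink" are the ADF type names for an Azure Blob Storage
    source and an ADLS Gen2 sink, respectively.
    """
    return {
        "name": "CopyChicagoSafetyData",  # illustrative pipeline name
        "properties": {
            "activities": [
                {
                    "name": "CopyBlobToAdlsGen2",
                    "type": "Copy",
                    "inputs": [
                        {"referenceName": source_dataset,
                         "type": "DatasetReference"}
                    ],
                    "outputs": [
                        {"referenceName": sink_dataset,
                         "type": "DatasetReference"}
                    ],
                    "typeProperties": {
                        "source": {"type": "BlobSource"},
                        "sink": {"type": "AzureBlobFSSink"},
                    },
                }
            ]
        },
    }


pipeline = copy_pipeline("ChicagoSafetySource", "AdlsGen2Sink")
print(json.dumps(pipeline, indent=2))
```

Inspecting the JSON that the UI generates (via the pipeline's "Code" button) against a sketch like this is a useful way to understand what the Copy Data tool is doing on your behalf.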

Getting ready

Make sure you have set up Azure Data Lake Gen2 in the Setting up Azure Data Lake Storage Gen 2 recipe.

The dataset that we are going to use in this recipe, Chicago Safety Data, is available here: https://azure.microsoft.com/en-us/services/open-datasets/catalog/chicago-safety-data/. It is published as part of Azure Open Datasets, a collection of curated public datasets hosted on Azure.

How to do it...

To transfer the dataset from Azure Blob Storage to Azure Data Lake Gen2 with Data Factory, first open the Azure Data Factory UI:

  1. Click + and select Copy Data tool as shown in the following screenshot:

    Figure 4.5 –...