
Hands-On Data Warehousing with Azure Data Factory

By: Christian Cote, Michelle Gutzait, Giuseppe Ciaburro

Overview of this book

ETL is one of the essential techniques in data processing. Given that data is everywhere, ETL will always be a vital process for handling data from different sources. Hands-On Data Warehousing with Azure Data Factory starts with the basic concepts of data warehousing and the ETL process. You will learn how Azure Data Factory and SSIS can be used to understand the key components of an ETL solution. You will go through the different services offered by Azure that can be used by ADF and SSIS, such as Azure Data Lake Analytics, Machine Learning, and Databricks Spark, with the help of practical examples. You will explore how to design and implement ETL hybrid solutions using different integration services with a step-by-step approach. Once you get to grips with all this, you will use Power BI to interact with data coming from different sources in order to reveal valuable insights. By the end of this book, you will not only know how to build your own ETL solutions but also how to address the key challenges that are faced while building them.

Copy data from SQL Server to sales-data


All the steps we took in the previous section were in preparation for the copy activity we're going to create in this section. What we have done manually so far is essentially the same set of steps that the copy activity wizard would have performed for us. Since we're now more familiar with ADF, it's worth doing it manually to better understand each step of the copy activity process.

We'll now add a new pipeline to our factory. We could work from the one we created to hold our Execute SSIS package activity, but since we want to test and debug the new work, it's easier to use a separate pipeline. If we didn't, we would have to execute the entire pipeline every time we wanted to test a single activity. Once we are satisfied with our work, we will copy and paste it back into the main pipeline.
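Under the hood, every pipeline authored in the ADF portal is stored as a JSON document that you can inspect in the Code view. As a rough sketch, and assuming a placeholder name of our own choosing (PL_Copy_Sales_Debug), the new scratch pipeline might start out looking like this, with an empty activities array ready to receive the copy activity:

```json
{
    "name": "PL_Copy_Sales_Debug",
    "properties": {
        "description": "Scratch pipeline used to test the copy activity in isolation",
        "activities": []
    }
}
```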

Drag a Copy activity from the Dataflow section of the Activities pane and name it Copy_ADFFV2Book_Sales_Blob, as shown in the following screenshot:
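If you switch to the Code view at this point, the dragged activity appears as a JSON object inside the pipeline's activities array. The following is only a sketch of how it could look once its source and sink have been configured; the dataset reference names (SqlServerSalesDataset and SalesDataBlobDataset) are placeholders, and the real names will match the datasets you created in the previous section:

```json
{
    "name": "Copy_ADFFV2Book_Sales_Blob",
    "type": "Copy",
    "inputs": [
        { "referenceName": "SqlServerSalesDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "SalesDataBlobDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "SqlSource" },
        "sink": { "type": "BlobSink" }
    }
}
```

The source and sink types (SqlSource and BlobSink) are what ADF uses for SQL Server sources and blob storage sinks; we will set them through the UI in the next steps rather than editing the JSON directly.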

Click on the Source tab and, using the drop-down list, select...