Azure Data Engineering Cookbook

By: Ahmad Osama

Overview of this book

Data engineering is one of the fastest-growing job areas, as data engineers ensure that data is extracted, provisioned, and of the highest quality for analysis. This book uses various Azure services to implement and maintain infrastructure that extracts data from multiple sources, then transforms and loads it for analysis. It takes you through different techniques for performing big data engineering using Microsoft Azure data services. It begins by showing you how Azure Blob storage can be used to store large amounts of unstructured data and how to use it for orchestrating a data workflow. You'll then work with different Cosmos DB APIs and Azure SQL Database. Moving on, you'll discover how to provision an Azure Synapse database and find out how to ingest and analyze data in Azure Synapse. As you advance, you'll cover the design and implementation of batch processing solutions using Azure Data Factory, and understand how to manage, maintain, and secure Azure Data Factory pipelines. You'll also design and implement batch processing solutions using Azure Databricks and then manage and secure Azure Databricks clusters and jobs. In the concluding chapters, you'll learn how to process streaming data using Azure Stream Analytics and Data Explorer. By the end of this Azure book, you'll have gained the knowledge you need to orchestrate batch and real-time ETL workflows in Microsoft Azure.

Copying data from Azure Data Lake Gen2 to an Azure Synapse SQL pool using the copy activity

The copy activity, as the name suggests, is used to copy data quickly from a source to a destination. In this recipe, we'll learn how to use the copy activity to copy data from Azure Data Lake Gen2 to an Azure Synapse SQL pool.
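Under the hood, a copy activity is simply JSON in a Data Factory pipeline that points a source dataset at a sink dataset. The following is a minimal sketch, not part of the book's recipe steps, of what such a definition can look like when published with the Az.DataFactory module. The factory name (packtadfdemo), the pipeline name, and the OrdersDataLakeDataset/OrdersSynapseDataset datasets are hypothetical placeholders that would need to exist already:

    # Sketch only: assumes a data factory named packtadfdemo in the packtade resource
    # group and two pre-created datasets for the source (Data Lake Gen2) and the sink
    # (Synapse SQL pool). All of these names are placeholders.
    $pipeline = @{
        name       = "CopyOrdersPipeline"
        properties = @{
            activities = @(
                @{
                    name    = "CopyOrdersToSynapse"
                    type    = "Copy"
                    inputs  = @(@{ referenceName = "OrdersDataLakeDataset"; type = "DatasetReference" })
                    outputs = @(@{ referenceName = "OrdersSynapseDataset"; type = "DatasetReference" })
                    typeProperties = @{
                        source = @{ type = "DelimitedTextSource" }   # reads delimited files from Data Lake Gen2
                        sink   = @{ type = "SqlDWSink" }             # writes to a Synapse SQL pool table
                    }
                }
            )
        }
    }

    # Persist the definition to a file and publish it to the data factory.
    $pipeline | ConvertTo-Json -Depth 10 | Out-File .\CopyOrdersPipeline.json
    Set-AzDataFactoryV2Pipeline -ResourceGroupName packtade -DataFactoryName packtadfdemo `
        -Name CopyOrdersPipeline -DefinitionFile .\CopyOrdersPipeline.json -Force

Once published, the pipeline could be triggered with Invoke-AzDataFactoryV2Pipeline; in this recipe, however, we build the pipeline step by step as described below.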

Getting ready

Before you start, do the following:

  1. Log in to Azure from PowerShell. To do this, execute the Connect-AzAccount command and follow the instructions to log in to Azure (see the sketch after this list).
  2. Open https://portal.azure.com and log in using your Azure credentials.
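
If you work with more than one subscription, it also helps to confirm which subscription is active before running the recipe. A minimal sketch follows; the subscription name is a hypothetical placeholder:

    # Sketch: log in and select the subscription that will hold the recipe's resources.
    # "My Subscription" is a placeholder name.
    Connect-AzAccount
    Set-AzContext -Subscription "My Subscription"
    Get-AzContext   # verify the active account and subscription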

How to do it…

Follow these steps to perform the activity:

  1. The first step is to create a new Azure Data Lake Gen2 storage account and upload the data. To do this, execute the following PowerShell command (a rough sketch of what the script does follows the command):
    .\ADE\azure-data-engineering-cookbook\Chapter04\1_UploadOrderstoDataLake.ps1 -resourcegroupname packtade -storageaccountname packtdatalakestore...
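
The 1_UploadOrderstoDataLake.ps1 script ships with the book's code bundle, so it isn't reproduced here. As a rough sketch of what a script like this typically does (the location, container name, and local file path below are placeholders, not the book's actual values), it creates a Gen2-enabled storage account and uploads the source files:

    # Sketch only -- not the book's actual script.
    # Create a StorageV2 account with the hierarchical namespace enabled (Data Lake Gen2).
    New-AzStorageAccount -ResourceGroupName packtade -Name packtdatalakestore `
        -Location eastus -SkuName Standard_LRS -Kind StorageV2 `
        -EnableHierarchicalNamespace $true

    # Create a container (filesystem) and upload the orders file into it.
    $ctx = (Get-AzStorageAccount -ResourceGroupName packtade -Name packtdatalakestore).Context
    New-AzStorageContainer -Name orders -Context $ctx
    Set-AzStorageBlobContent -File .\Data\orders.csv -Container orders -Blob orders.csv -Context $ctx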