Azure Synapse Analytics Cookbook

By: Gaurav Agarwal, Meenakshi Muralidharan

Overview of this book

As data warehouse management becomes increasingly integral to successful organizations, choosing and running the right solution is more important than ever. Microsoft Azure Synapse is an enterprise-grade, cloud-based data warehousing platform, and this book holds the key to using Synapse to its full potential. If you want the skills and confidence to create a robust enterprise analytical platform, this cookbook is a great place to start. You'll learn and execute enterprise-level deployments on medium-to-large data platforms. Using the step-by-step recipes and accompanying theory covered in this book, you'll understand how to integrate various services with Synapse to make it a robust solution for all your data needs. Whether you're completely new to Azure Synapse or just getting started with it, you'll find the instructions you need to solve any problem you may face, including using Azure services for data visualization as well as for artificial intelligence (AI) and machine learning (ML) solutions. By the end of this Azure book, you'll have the skills you need to implement an enterprise-grade analytical platform, enabling your organization to explore and manage heterogeneous data workloads and employ various data integration services to solve real-time industry problems.

Adding a trigger to a data flow pipeline

In the Moving and transforming data using a data flow recipe, we created a data flow and executed the pipelines manually, also known as on-demand execution, to test their functionality and results. In a production scenario, however, the pipeline needs to run at a specific time, in line with our data loading strategy. We therefore need an automated way to schedule and run these pipelines, which is done using triggers.

The schedule trigger is used to execute ADF pipelines or data flows on a wall-clock schedule.
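The recipe that follows creates the trigger through the Studio UI. For comparison only, the same kind of schedule trigger can also be defined programmatically. The snippet below is a minimal sketch using the azure-mgmt-datafactory Python SDK, assuming an existing factory with a published pipeline; the subscription ID and all resource names (my-rg, my-adf, CopyCustomerData, DailyLoadTrigger) are placeholders, not names from this book.

# Minimal sketch: define and start a daily schedule trigger for an existing,
# published pipeline. All names below are placeholders.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Run the referenced pipeline once a day, starting now, on a wall-clock schedule.
recurrence = ScheduleTriggerRecurrence(
    frequency="Day",
    interval=1,
    start_time=datetime.now(timezone.utc),
    time_zone="UTC",
)
trigger = ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                type="PipelineReference", reference_name="CopyCustomerData"
            ),
            parameters={},
        )
    ],
)

# Publish the trigger definition, then start it. Triggers are created in a
# stopped state and do not fire until they are started.
client.triggers.create_or_update("my-rg", "my-adf", "DailyLoadTrigger", TriggerResource(properties=trigger))
client.triggers.begin_start("my-rg", "my-adf", "DailyLoadTrigger").result()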

Getting ready

Make sure the pipeline demonstrated in the previous recipes has been created and published.

The data flow is executed by calling the existing pipeline, as we did in the previous recipe.
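If you'd like to confirm programmatically that the pipeline has been published before continuing, the following optional sketch (not part of the recipe) lists the pipelines in the factory using the same azure-mgmt-datafactory SDK; the resource group, factory, and pipeline names are placeholders.

# Optional check: list the pipelines published to the factory and confirm the
# one from the previous recipe is present. "my-rg", "my-adf", and
# "CopyCustomerData" are placeholder names.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

published = [p.name for p in client.pipelines.list_by_factory("my-rg", "my-adf")]
print(published)
assert "CopyCustomerData" in published, "Publish the pipeline before adding a trigger"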

How to do it…

Let's begin:

  1. An ADF trigger can be created under the Manage page, by clicking on the Add trigger | New/Edit | Create Trigger option from...