Building the ingestion pipelines
You might recall from previous sections that we decided to create two ingestion pipelines – batch ingestion and streaming ingestion. Each of these pipelines will be built using a different set of Azure services: Azure Data Factory for batch ingestion and Azure Event Hubs Capture for streaming ingestion. So, let's get going, as we still have a long way to go.
Building a batch ingestion pipeline
Before proceeding with the actual creation of the batch pipeline, let me remind you of a key requirement of the Electroniz lakehouse. Previously, Electroniz stated that transactions in the sales database and the online store occur very frequently throughout the day, and they wanted newly created data to reach the lakehouse with a maximum delay of 1 hour.
To satisfy this requirement, we will need to structure the pipeline using the watermark approach. Simply put, a watermark is a column in the source table – typically a last-modified timestamp or an ever-increasing key – that marks how far ingestion has progressed. On each run, the pipeline reads the watermark value saved by the previous run, copies only the rows whose watermark column holds a greater value, and then stores the new high-water mark for the next run.
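In Azure Data Factory, this pattern is commonly assembled from a Lookup activity that reads the old watermark, a Copy activity whose source query filters on it, and a final activity that writes the new value back. As a minimal sketch of the underlying logic – assuming a hypothetical sales table with a last_modified column and a watermark control table; the connection string and object names are placeholders, not actual Electroniz resources – the same three steps look like this in Python:

import pyodbc

# Placeholder connection string – substitute real source database details.
SOURCE_CONN = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>;DATABASE=sales;UID=<user>;PWD=<password>"
)

def ingest_increment(conn_str: str) -> None:
    conn = pyodbc.connect(conn_str)
    try:
        cur = conn.cursor()

        # Step 1: read the watermark stored by the previous run.
        cur.execute("SELECT last_value FROM watermark WHERE table_name = ?", "sales")
        old_mark = cur.fetchone()[0]

        # Step 2: copy only the rows created or modified after that watermark.
        cur.execute(
            "SELECT * FROM sales WHERE last_modified > ? ORDER BY last_modified",
            old_mark,
        )
        rows = cur.fetchall()
        # ... land `rows` in the lake here, for example as files in storage ...

        # Step 3: persist the new high-water mark for the next run.
        new_mark = rows[-1].last_modified if rows else old_mark
        cur.execute(
            "UPDATE watermark SET last_value = ? WHERE table_name = ?",
            new_mark, "sales",
        )
        conn.commit()
    finally:
        conn.close()

Scheduling this logic – or the equivalent Data Factory pipeline – on an hourly trigger meets the 1-hour freshness requirement while moving only the delta on each run, rather than reloading the entire table.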