Learn Azure Synapse Data Explorer

By: Pericles (Peri) Rocha

Overview of this book

Large volumes of data are generated daily from applications, websites, IoT devices, and other free-text, semi-structured data sources. Azure Synapse Data Explorer helps you collect, store, and analyze such data, and work with other analytical engines, such as Apache Spark, to develop advanced data science projects and maximize the value you extract from data. This book offers a comprehensive view of Azure Synapse Data Explorer, exploring not only the core scenarios of Data Explorer but also how it integrates within Azure Synapse. From data ingestion to data visualization and advanced analytics, you’ll learn to take an end-to-end approach to maximize the value of unstructured data and drive powerful insights using data science capabilities. With real-world usage scenarios, you’ll discover how to identify key projects where Azure Synapse Data Explorer can help you achieve your business goals. Throughout the chapters, you'll also find out how to manage big data as part of a software as a service (SaaS) platform, as well as tune, secure, and serve data to end users. By the end of this book, you’ll have mastered the big data life cycle and you'll be able to implement advanced analytical scenarios from raw telemetry and log data.
Table of Contents (19 chapters)

Part 1: Introduction to Azure Synapse Data Explorer
Part 2: Working with Data
Part 3: Managing Azure Synapse Data Explorer

Configuring continuous data export

The three server-side data export mechanisms we have explored so far are quite similar in syntax and in how they operate. They all require someone to start the export task, monitor it, and implement the logic to copy new data as it arrives. Exporting to external tables, however, has one important advantage: you can use continuous data export to keep your external table up to date as new records are inserted into the original table.
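For contrast, a one-off, manual export to an external table looks roughly like the sketch below (FleetTelemetry and FleetExternalTable are placeholder names, not the actual names used in this book); someone has to run it, and run it again whenever new data needs to be copied:

    // Sketch only: a manual, one-time export that must be re-run for new data.
    .export async to table FleetExternalTable
    <| FleetTelemetry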

We’ll continue to build on the table created in the Exporting to external tables section. We have already performed the initial data load; now we want to set up a process that, every two hours, updates the external table with any new records inserted into the fleet data table. Continuous data export lets you configure exactly this kind of process, ensuring that each record in your table is processed exactly once.

Here’s the syntax to create a new continuous data export...
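A minimal sketch of such a command, assuming a source table named FleetTelemetry and an external table named FleetExternalTable from the previous section (both are placeholder names; substitute your own):

    // Sketch only: table and export names are placeholders.
    .create-or-alter continuous-export FleetContinuousExport
      over (FleetTelemetry)
      to table FleetExternalTable
      with (intervalBetweenRuns = 2h)   // run the export every two hours
    <| FleetTelemetry

Once the continuous export exists, commands such as .show continuous-export FleetContinuousExport and .show continuous-export FleetContinuousExport exported-artifacts let you check its status and the artifacts it has written.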