Learn Azure Synapse Data Explorer

By Pericles (Peri) Rocha

Overview of this book

Large volumes of data are generated daily from applications, websites, IoT devices, and other free-text, semi-structured data sources. Azure Synapse Data Explorer helps you collect, store, and analyze such data, and work with other analytical engines, such as Apache Spark, to develop advanced data science projects and maximize the value you extract from data. This book offers a comprehensive view of Azure Synapse Data Explorer, exploring not only the core scenarios of Data Explorer but also how it integrates within Azure Synapse. From data ingestion to data visualization and advanced analytics, you’ll learn to take an end-to-end approach to maximize the value of unstructured data and drive powerful insights using data science capabilities. With real-world usage scenarios, you’ll discover how to identify key projects where Azure Synapse Data Explorer can help you achieve your business goals. Throughout the chapters, you'll also find out how to manage big data as part of a software as a service (SaaS) platform, as well as tune, secure, and serve data to end users. By the end of this book, you’ll have mastered the big data life cycle and you'll be able to implement advanced analytical scenarios from raw telemetry and log data.
Table of Contents (19 chapters)

Part 1: Introduction to Azure Synapse Data Explorer
Part 2: Working with Data
Part 3: Managing Azure Synapse Data Explorer

Technical requirements

If you haven’t done so yet, make sure you download this book’s material from the GitHub repository at https://github.com/PacktPublishing/Learn-Azure-Synapse-Data-Explorer. You can download the full repository by selecting Code and then Download ZIP, or by cloning the repository with your Git client of choice. This chapter uses a version of the fleet data table, but we will create it and ingest some data as part of this chapter, so you don’t need to create it beforehand.
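If you prefer the command line, cloning the repository looks like this (assuming a Git client is installed):

    git clone https://github.com/PacktPublishing/Learn-Azure-Synapse-Data-Explorer.git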

In this chapter, we will return to data ingestion for a quick example of how to tag extents at ingestion time. Later, we will use this data to move extents around and perform a data purge operation. This ingestion uses the simplified drone telemetry dataset, which needs to be copied to a container in your ADLS Gen2 storage account before we can use it. If you don’t have this file in your storage account yet, make sure you follow the steps provided...
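As a preview of what tagged ingestion looks like, here is a minimal KQL sketch. The FleetTelemetry table name, the storage account, container, and SAS token placeholders, and the drop-by tag value are all hypothetical; the actual names and steps are covered later in the chapter:

    // Ingest the drone telemetry CSV file and tag the extents created by this ingestion.
    // Extents carrying a drop-by: tag can later be selected when moving or dropping extents.
    .ingest into table FleetTelemetry (
        'https://<storage-account>.blob.core.windows.net/<container>/drone-telemetry.csv;<SAS-token>'
    )
    with (format = 'csv', ignoreFirstRecord = true, tags = '["drop-by:drone-telemetry-demo"]')

    // List the table's extents to confirm the tag was applied.
    .show table FleetTelemetry extents

The ignoreFirstRecord property simply skips the CSV header row; the tags property is what attaches the drop-by tag to every extent produced by this ingestion.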