
Limitless Analytics with Azure Synapse

By: Prashant Kumar Mishra

Overview of this book

Azure Synapse Analytics, which Microsoft describes as the next evolution of Azure SQL Data Warehouse, is a limitless analytics service that brings enterprise data warehousing and big data analytics together. With this book, you'll learn how to discover insights from your data effectively using this platform. The book starts with an overview of Azure Synapse Analytics, its architecture, and how it can be used to improve business intelligence and machine learning capabilities. Next, you'll choose and set up the right environment for your business problem. You'll also learn a variety of ways to ingest data from different sources and orchestrate it using the transformation techniques offered by Azure Synapse. Later, you'll explore how to handle both relational and non-relational data using SQL. As you progress, you'll perform real-time streaming and run data analysis operations on your data in various languages, before applying ML techniques to derive accurate and granular insights. Finally, you'll discover how to protect sensitive data in real time using security and privacy features. By the end of this Azure book, you'll be able to build end-to-end analytics solutions while focusing on data prep, data management, data warehousing, and AI tasks.
Table of Contents (20 chapters)

Section 1: The Basics and Key Concepts
Section 2: Data Ingestion and Orchestration
Section 3: Azure Synapse for Data Scientists and Business Analysts
Section 4: Best Practices

Bringing data to your Synapse SQL pool using Copy Data tool

The Copy Data tool makes it easy to bring your data into Azure Synapse. It is similar to using the Copy activity in Azure Data Factory, except that you do not have to spin up a separate service for data ingestion in Azure Synapse. Make sure you have met all of the technical requirements before following these steps:

  1. Click on Copy Data tool, as highlighted in Figure 3.1. This opens a new window where you need to provide the source and destination connection details.
  2. Provide an appropriate name for your pipeline, along with a brief description.
  3. You can choose to run this pipeline only once, or you can schedule it to run regularly. For this example, we are going to schedule our pipeline to run daily.

    Click on Run regularly on schedule and select the Schedule trigger type.

  4. Provide an appropriate value for Start Date (UTC). This is auto-populated with the current date...
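Behind the scenes, the Copy Data tool generates a pipeline definition with a Copy activity and a schedule trigger, much like Azure Data Factory's JSON format. The following Python sketch assembles simplified versions of those two definitions as plain dictionaries so you can see roughly what the wizard produces; all names, dataset references, and the start time are illustrative placeholders, not values from this walkthrough.

```python
# Sketch of the JSON the Copy Data tool produces for a daily-scheduled
# copy pipeline. Every name here (pipeline, datasets, trigger, start time)
# is a hypothetical placeholder for illustration only.
import json

pipeline = {
    "name": "CopyPipeline_Example",  # assumed pipeline name from step 2
    "properties": {
        "description": "Copies source data into the Synapse SQL pool",
        "activities": [
            {
                "name": "Copy_SourceToSqlPool",
                "type": "Copy",  # the Copy activity the tool wraps
                "inputs": [{"referenceName": "SourceDataset",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlPoolDataset",
                             "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "SqlDWSink"},  # dedicated SQL pool sink
                },
            }
        ],
    },
}

# A schedule trigger that runs the pipeline once a day, matching step 3.
trigger = {
    "name": "DailyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                # Start Date (UTC) from step 4; placeholder value
                "startTime": "2021-01-01T00:00:00Z",
            }
        },
        "pipelines": [
            {"pipelineReference": {"referenceName": "CopyPipeline_Example",
                                   "type": "PipelineReference"}}
        ],
    },
}

print(json.dumps(trigger["properties"]["typeProperties"]["recurrence"]))
```

If you later need to manage such pipelines outside the Synapse Studio UI, definitions in this shape can be kept under source control and deployed through your usual release process, rather than being re-created by hand in the wizard.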