ETL with Azure Cookbook
When I first started in the data warehousing business, some 20 years ago, I was asked in an interview to define ETL. It was my first job interview, and I had no clue what the interviewer meant by ETL. Luckily, the interviewer kind of liked me and hired me anyway, telling me that I would know all about ETL soon enough. After many years in the data warehouse business, and more recently as a data engineer, I can say that ETL is what has kept me busy most of the time since then.
ETL stands for Extract, Transform, and Load. It is a data movement technique that has been used, in various forms, since the inception of the first enterprise data warehouses.
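To make the pattern concrete, here is a minimal sketch of the three steps in Python. The file name, table name, and columns are hypothetical, and SQLite stands in for a real warehouse target:

```python
import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from a CSV source file.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: clean up names and cast the amount to a number.
    return [
        {"customer": row["customer"].strip().title(),
         "amount": float(row["amount"])}
        for row in rows
    ]

def load(rows, db_path="warehouse.db"):
    # Load: write the transformed rows into the target table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO sales (customer, amount) VALUES (:customer, :amount)",
        rows,
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```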
Microsoft formalized the ETL concept near the end of the 1990s with a tool called DTS: Data Transformation Services. This ETL tool, aimed at helping database administrators load data into and out of SQL Server, used SQL and ActiveX scripts to move and transform data on-premises.
Microsoft brought its ETL tooling to the cloud with the introduction of Azure Data Factory (ADF). In 2018, Microsoft extensively overhauled ADF to create Azure Data Factory v2, which lets users complete many tasks within ADF that previously required additional software.
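ADF pipelines can be authored not only in the Azure portal but also programmatically. As a minimal sketch, assuming the azure-identity and azure-mgmt-datafactory Python packages and placeholder resource names, deploying a trivial pipeline might look like this:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import PipelineResource, WaitActivity

# Placeholder identifiers; substitute your own subscription,
# resource group, and data factory names.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"

# Authenticate with whatever credential the environment provides
# (CLI login, managed identity, environment variables, ...).
credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# A trivial pipeline with a single Wait activity, just to show the shape.
pipeline = PipelineResource(
    activities=[WaitActivity(name="WaitTenSeconds", wait_time_in_seconds=10)]
)
adf_client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "DemoPipeline", pipeline
)
```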
Another commonly used Azure ETL tool is Azure Databricks. This tool uses Apache Spark as its compute engine, allowing developers to write their transformations in several languages: Python, Scala, R, and SQL. Java can also be used to develop shared components for reuse across many ETL pipelines.
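For instance, a transformation written in Python (PySpark) might look like the following sketch; the source path, column names, and table name are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already available as `spark`;
# building one here keeps the sketch self-contained.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read the raw orders file (hypothetical path and schema).
orders = spark.read.option("header", "true").csv("/mnt/raw/orders.csv")

# Transform: cast types, drop bad rows, and aggregate per customer and day.
daily_totals = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Load: persist the result as a table other pipelines can query.
daily_totals.write.mode("overwrite").saveAsTable("daily_totals")
```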
ETL is a necessary step in any data warehouse or data science project, and it is used in various shapes and forms across IT for tasks such as the following:
In the following chapters, we will explore all these tools and provide recipes that show you how to do ETL in Azure.
In this chapter, we will cover the following recipes: