
Business Intelligence with Databricks SQL

By : Vihag Gupta

Overview of this book

In this new era of data platform system design, data lakes and data warehouses are giving way to the lakehouse – a new type of data platform system that aims to unify all data analytics into a single platform. Databricks, with its Databricks SQL product suite, is the hottest lakehouse platform out there, harnessing the power of Apache Spark™, Delta Lake, and other innovations to enable data warehousing capabilities on the lakehouse with data lake economics.

This book is a comprehensive hands-on guide that helps you explore all the advanced features, use cases, and technology components of Databricks SQL. You’ll start with the lakehouse architecture fundamentals and understand how Databricks SQL fits into it. The book then shows you how to use the platform, from exploring data, executing queries, building reports, and using dashboards through to learning the administrative aspects of the lakehouse – data security, governance, and management of the computational power of the lakehouse. You’ll also delve into the core technology enablers of Databricks SQL – Delta Lake and Photon. Finally, you’ll get hands-on with advanced SQL commands for ingesting data and maintaining the lakehouse.

By the end of this book, you’ll have mastered Databricks SQL and be able to deploy and deliver fast, scalable business intelligence on the lakehouse.
Table of Contents (21 chapters)

Part 1: Databricks SQL on the Lakehouse
Part 2: Internals of Databricks SQL
Part 3: Databricks SQL Commands
Part 4: TPC-DS, Experiments, and Frequently Asked Questions

Working with Delta Lake maintenance commands

Like any system, be it hardware or software, Delta Lake requires periodic maintenance. In this section, we will learn about some of the commands used for different maintenance operations. These commands are primarily relevant to data engineering and data science teams; business intelligence users need not concern themselves with these activities.

Vacuuming your Delta Lake

As we learned in Chapter 8, The Delta Lake, every data insert, update, or delete creates new files. After each such operation, the transaction log of the Delta table is updated to reflect the set of files that constitutes the table’s current, or latest, version. So, while user queries ignore the non-current files, those files still exist in your cloud storage and continue to incur costs. These costs are not excessive, but they can grow over time if left unchecked. This is where the VACUUM command comes in. True to its name, it vacuums...
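To make this concrete, here is a minimal sketch of how VACUUM is typically invoked in Databricks SQL. The table name is a hypothetical placeholder; the DRY RUN clause and the RETAIN clause are standard Delta Lake syntax:

```sql
-- Hypothetical table name used for illustration only.
-- Preview the files that would be deleted, without deleting anything:
VACUUM sales.orders DRY RUN;

-- Remove files no longer referenced by the current table version and
-- older than the retention threshold (the default is 7 days / 168 hours):
VACUUM sales.orders RETAIN 168 HOURS;
```

Running with DRY RUN first is a sensible habit: it lets you confirm which non-current files are eligible for removal before committing to the deletion. Note that lowering the retention window below the default can break time travel to older table versions, so it should be done deliberately.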