Business Intelligence with Databricks SQL

By: Vihag Gupta

Overview of this book

In this new era of data platform system design, data lakes and data warehouses are giving way to the lakehouse – a new type of data platform that aims to unify all data analytics in a single system. Databricks, with its Databricks SQL product suite, is the hottest lakehouse platform out there, harnessing the power of Apache Spark™, Delta Lake, and other innovations to enable data warehousing capabilities on the lakehouse with data lake economics. This book is a comprehensive hands-on guide that helps you explore all the advanced features, use cases, and technology components of Databricks SQL. You’ll start with the fundamentals of the lakehouse architecture and understand how Databricks SQL fits into it. The book then shows you how to use the platform, from exploring data, executing queries, and building reports and dashboards, through to the administrative aspects of the lakehouse – data security, governance, and management of its computational power. You’ll also delve into the core technology enablers of Databricks SQL: Delta Lake and Photon. Finally, you’ll get hands-on with advanced SQL commands for ingesting data and maintaining the lakehouse. By the end of this book, you’ll have mastered Databricks SQL and be able to deploy and deliver fast, scalable business intelligence on the lakehouse.
Table of Contents (21 chapters)

Part 1: Databricks SQL on the Lakehouse
Part 2: Internals of Databricks SQL
Part 3: Databricks SQL Commands
Part 4: TPC-DS, Experiments, and Frequently Asked Questions

Frequently asked questions

Databricks SQL is part of an entirely new product category called the Lakehouse – an alternative to both data lakes and data warehouses. This novelty prompts a lot of interest, and a lot of questions, from prospective customers. I am sure that you will have questions of your own, even after spending time reading this book.

So, here is a list of such questions and their answers, in no particular order.

How does Databricks SQL define small, medium, and large table sizes?

On traditional systems, a table’s size might be defined by its number of rows, the length of its records, or the number of nodes that the table is sharded across.
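On the lakehouse, by contrast, a table’s size ultimately comes down to the Delta files it occupies in object storage, and you can inspect that footprint directly with SQL. Here is a minimal sketch using the standard Delta Lake DESCRIBE DETAIL command; the table name is purely illustrative:

    -- Report a Delta table's physical footprint; the result includes
    -- the numFiles and sizeInBytes columns, among other metadata.
    -- retail.sales.orders is a hypothetical name – substitute your own table.
    DESCRIBE DETAIL retail.sales.orders;

The sizeInBytes and numFiles columns give a quick, physical answer to whether a table is “small” or “large”, independent of row counts or sharding.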

Since the Lakehouse enables big data processing, it can accommodate datasets of all sizes. You do not have to provision compute resources separately for small, medium, and large tables. Tuning the warehouse’s size is easy as well – if queries are running slowly, increase the warehouse...