Snowflake Cookbook

By: Hamid Mahmood Qureshi, Hammad Sharif

Overview of this book

Snowflake is a cloud-based data warehousing platform built from the ground up to perform data management on the cloud. This book introduces you to Snowflake's unique architecture, which places it at the forefront of cloud data warehouses. You'll explore the compute model available with Snowflake, and find out how Snowflake allows extensive scaling through virtual warehouses. You will then learn how to configure a virtual warehouse to optimize cost and performance. Moving on, you'll get to grips with the data ecosystem and discover how Snowflake integrates with other technologies for staging and loading data. As you progress through the chapters, you will leverage Snowflake's capabilities to process a series of SQL statements using tasks to build data pipelines, and find out how you can create modern data solutions and pipelines designed for high performance and scalability. You will also get to grips with creating role hierarchies, adding custom roles, and setting default roles for users before covering advanced topics such as data sharing, cloning, and performance optimization. By the end of this Snowflake book, you will be well versed in Snowflake's architecture for building modern analytical solutions and understand best practices for solving commonly faced problems using practical recipes.

Identifying and reducing unnecessary Fail-safe and Time Travel storage usage

In this recipe, we will learn how to identify tables that are used for ETL-like workloads and therefore do not need Fail-safe and Time Travel protection. Such tables can be reconfigured so that they no longer accrue Fail-safe and Time Travel storage, resulting in lower overall storage costs.
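To see which tables are accruing this storage, you can query Snowflake's ACCOUNT_USAGE schema. The following is a minimal sketch that lists tables whose combined Fail-safe and Time Travel bytes exceed their active storage; the comparison against active storage is an arbitrary threshold chosen only for illustration:

-- List tables where Fail-safe and Time Travel storage outweighs active storage
SELECT table_catalog,
       table_schema,
       table_name,
       active_bytes,
       time_travel_bytes,
       failsafe_bytes
FROM snowflake.account_usage.table_storage_metrics
WHERE failsafe_bytes + time_travel_bytes > active_bytes
ORDER BY failsafe_bytes + time_travel_bytes DESC;

Note that data in ACCOUNT_USAGE views can lag behind real time by an hour or more. Time Travel retention on an existing table can be reduced with ALTER TABLE ... SET DATA_RETENTION_TIME_IN_DAYS = 0, whereas removing the Fail-safe period requires recreating the table as a transient or temporary table.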

Getting ready

You will need to be connected to your Snowflake instance via the web UI or the SnowSQL client to execute this recipe.
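If you are connecting through SnowSQL, a session can be started from the command line as shown below; the account identifier and user name are placeholders for your own values, and you will be prompted for your password:

snowsql -a <account_identifier> -u <username>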

How to do it…

We will simulate a fictitious ETL process that uses an interim table to hold some data. Data from the interim table is processed and aggregated into a target table, and once the target table is loaded, the ETL process deletes the data from the interim table. The purpose is to establish which table type is best suited to such interim ETL tables; a sketch of one appears after the steps. The steps for this recipe are as follows:

  1. We will start by creating a new database and a table that will...
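As a preview of where these steps lead, the following is a minimal sketch of an interim ETL table that accrues neither Fail-safe nor Time Travel storage. The database, schema, and column names are illustrative only, not the ones used in the book:

-- Hypothetical names for illustration only
CREATE DATABASE etl_demo;

-- A transient table has no Fail-safe period; setting
-- DATA_RETENTION_TIME_IN_DAYS to 0 also disables Time Travel
CREATE TRANSIENT TABLE etl_demo.public.staging_orders (
    order_id INTEGER,
    amount   NUMBER(10,2)
)
DATA_RETENTION_TIME_IN_DAYS = 0;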