Data Democratization with Domo

By: Jeff Burtenshaw

Overview of this book

Domo is a power-packed business intelligence (BI) platform that empowers organizations to track, analyze, and activate data in record time at cloud scale and performance.

Data Democratization with Domo begins with an overview of the Domo ecosystem. You'll learn how to get data into the cloud with Domo data connectors and Workbench; profile datasets; use Magic ETL to transform data; work with in-memory data sculpting tools (Data Views and Beast Modes); create, edit, and link card visualizations; and create card drill paths using Domo Analyzer. Next, you'll discover options to distribute content with real-time updates using Domo Embed and digital wallboards. As you advance, you'll understand how to use alerts and webhooks to drive automated actions. You'll also build and deploy a custom app to the Domo Appstore and find out how to code Python apps, use Jupyter Notebooks, and insert R custom models. Furthermore, you'll learn how to use AutoML to automatically evaluate dozens of models for the best fit using SageMaker and produce a predictive model, as well as use Python and the Domo Command Line Interface tool to extend Domo. Finally, you'll learn how to govern and secure the entire Domo platform.

By the end of this book, you'll have gained the skills you need to become a successful Domo master.
Table of Contents (26 chapters)

Section 1: Data Pipelines
Section 2: Presenting the Message
Section 3: Communicating to Win
Section 4: Extending
Section 5: Governing

Introducing the sculpting tools for persistent datasets

In Domo, the sculpting tools are called Magic because they make it simple to transform data. These tools let you create dataflows that output new persistent datasets, that is, datasets stored on physical media. Let's look at the two data transformation tools available for creating persistent datasets:

  • Extract, Transform, and Load (ETL) dataflows use a visual programming interface to define a sequence of operations that, when executed, reads input datasets, applies transformations, and creates or updates output datasets.
  • SQL dataflows chain SQL commands together into a dataflow that, when executed, reads the input datasets, transforms the data according to those commands, and writes output datasets.
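The SQL-dataflow pattern described above (read an input dataset, apply a SQL transform, write an output dataset) can be sketched with a small, self-contained example. This is a conceptual illustration only: it uses Python's standard-library sqlite3 module as a stand-in for Domo's server-side dataflow engine, and the table name, columns, and data are invented for the example, not part of the Domo platform.

```python
import sqlite3

# Hypothetical input dataset: raw sales rows. In Domo, this would be a
# dataset ingested via a connector or Workbench, not a Python list.
sales = [
    ("East", 100.0),
    ("East", 250.0),
    ("West", 175.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", sales)

# The "transform" step: a SQL command that aggregates the input dataset.
# The result plays the role of the dataflow's persistent output dataset.
output = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()

print(output)  # [('East', 350.0), ('West', 175.0)]
conn.close()
```

The same aggregation could be built in Magic ETL as a visual Group By tile instead of a SQL statement; which tool to reach for is exactly the trade-off the guidance table below addresses.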

The following table contains guidance for when to use which tool:

Figure 4.1 – Magic transform persistent tools guidance

Now that we know how to...