DAX Cookbook

By: Greg Deckler

Overview of this book

DAX provides an extra edge by extracting key information from the data that is already present in your model. Filled with examples of practical, real-world calculations geared toward business metrics and key performance indicators, this cookbook features solutions that you can apply for your own business analysis needs. You'll learn to write various DAX expressions and functions to understand how DAX queries work. The book also covers sections on dates, time, and duration to help you deal with working days, time zones, and shifts. You'll then discover how to manipulate text and numbers to create dynamic titles and ranks, and deal with measure totals. Later, you'll explore common business metrics for finance, customers, employees, and projects. The book will also show you how to implement common industry metrics such as days of supply, mean time between failures, order cycle time, and overall equipment effectiveness. In the concluding chapters, you'll learn to apply statistical formulas for covariance, kurtosis, and skewness. Finally, you'll explore advanced DAX patterns for interpolation, inverse aggregators, inverse slicers, and even forecasting with a deseasonalized correlation coefficient. By the end of this book, you'll have the skills you need to use DAX's functionality and flexibility in business intelligence and data analytics.

Calculating Shannon entropy

Shannon entropy, or more generally information entropy, is an important concept in information theory, the field of study concerned with quantifying the information used in communication. In thermodynamics and other fields, entropy generally refers to the disorder or uncertainty within a system. Claude Shannon introduced the concept of information entropy in the late 1940s, and its definition is effectively equivalent to the one used in thermodynamics.

Today, the concept of information entropy has a wide range of uses spanning security, encryption, and even machine learning and artificial intelligence. The mathematical definition of information entropy is as follows:

H(X) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i)

Here, X is a random variable that has n possible outcomes x_i, and P(x_i) is the probability of the outcome x_i. The base b in this formula can be one of several values, but is most commonly 2, which measures entropy in units of bits.
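Before building the recipe in DAX, the formula above can be sketched outside DAX in a few lines of Python; the function name `shannon_entropy` and the coin-flip sample data are illustrative choices, not part of the book's recipe:

```python
import math
from collections import Counter

def shannon_entropy(values, base=2):
    """Compute H(X) = -sum over i of P(x_i) * log_b(P(x_i)),
    estimating each P(x_i) from the observed frequency of x_i."""
    counts = Counter(values)          # occurrences of each outcome x_i
    n = len(values)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

# A fair coin has two equally likely outcomes, so with base 2 the
# entropy is 1 bit; a sequence with only one outcome has 0 entropy.
print(shannon_entropy(["H", "T", "H", "T"]))
print(shannon_entropy(["H", "H", "H", "H"]))
```

Estimating P(x_i) from observed frequencies mirrors what a DAX measure would do by dividing a per-outcome row count by the total row count before applying the logarithm.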