Mastering Tableau

By: David Baldwin

Overview of this book

Tableau has emerged as one of the most popular Business Intelligence solutions in recent times, thanks to its powerful and interactive data visualization capabilities. This book will empower you to become a master of Tableau by exploiting the many new features introduced in Tableau 10.0. You will embark on this exciting journey by getting to know valuable methods of utilizing advanced calculations to solve complex problems, including the creative use of different calculation types such as row-level and aggregate-level calculations. You will discover how almost any data visualization challenge can be met in Tableau by gaining a proper understanding of the tool’s inner workings and creatively exploring possibilities. You’ll be armed with an arsenal of advanced chart types and techniques that enable you to present information to a variety of audiences through clear, efficient, and engaging dashboards. Explanations and examples of efficient and inefficient visualization techniques, well-designed and poorly designed dashboards, and compromise options for when Tableau consumers will not embrace data visualization will build on your understanding of Tableau and how to use it efficiently. By the end of the book, you will be equipped with all the information you need to create effective dashboards and data visualization solutions using Tableau.

Summary


We began this chapter with a discussion of the Performance Recording dashboard. This was important because many of the subsequent exercises utilized the Performance Recording dashboard to examine underlying queries. Next was a discussion of hardware and on-the-fly techniques, where the intent was to communicate hardware considerations for good Tableau performance and, in the absence of optimal hardware, techniques for getting the best possible performance out of any computer.

Then we covered working with data sources, including a section titled Single Data Source > Joining > Blending and another titled Efficiently working with Data Sources. This was followed by a discussion on generating and using extracts as efficiently as possible. By focusing on data sources for these three sections, we learned best practices and what to avoid when working with either remote datasets or extracts.

The next sections, Using filters wisely and Efficient calculations, explored performance implications...
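
As a brief, illustrative sketch of the kind of calculation efficiency the Efficient calculations section deals with (the [Sales] field and the 500 threshold are placeholders, not examples taken from the chapter), compare two Tableau calculated fields that flag high-value rows; boolean results are generally cheaper for Tableau to evaluate than strings:

    // Less efficient: builds a string for every row
    IF [Sales] > 500 THEN 'High' ELSE 'Low' END

    // Generally faster: returns a boolean, which can be aliased to High/Low in the view
    [Sales] > 500

The boolean version keeps the row-level work as light as possible and leaves the labeling to aliases or formatting in the visualization.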