Salesforce Data Architect Certification Guide

By: Aaron Allport

Overview of this book

The Salesforce Data Architect exam is a prerequisite for the Application Architect half of the Salesforce Certified Technical Architect credential. This book offers complete, up-to-date coverage of the Salesforce Data Architect exam so you can take it with confidence. It is written in a clear, succinct way, with self-assessment and practice exam questions covering all the topics necessary to help you pass the exam with ease. You'll understand the theory around Salesforce data modeling, database design, master data management (MDM), Salesforce data management (SDM), and data governance. Performance considerations associated with large data volumes are also covered. You'll get to grips with data migration and the supporting theory needed to achieve Salesforce Data Architect certification. By the end of this Salesforce book, you'll have covered everything you need to know to pass the Salesforce Data Architect certification exam, and you'll have a handy on-the-job desktop reference guide for revisiting the concepts.
Table of Contents (23 chapters)

Section 1: Salesforce Data Architect Theory
Section 2: Salesforce Data Architect Design
Section 3: Applying What We've Learned – Practice Questions and Revision Aids

Consolidating data attributes from multiple sources

When designing an MDM strategy, we need to determine which data attributes constitute the golden record, along with the source of each attribute. Where multiple sources contain the same attribute, it may be necessary to implement a cleansing and de-duplication process to ensure that the same attribute values are preserved throughout the enterprise. Depending on the MDM implementation method being used, it may be necessary to allow any source system to update an attribute on the golden record, which then invokes a process to push that update back down to every source system holding that attribute. Data quality and classification are therefore crucial when implementing the golden record for the first time, so time should be given to a data cleansing, matching, and de-duplication strategy as part of the initial golden record population. It may then be wise to consider locking down the source...
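The consolidation step described above can be sketched in code. The following is a minimal, illustrative Python example of a source-priority survivorship rule: for each attribute, the golden record takes the value from the highest-priority source that holds one. The record shapes, field names, and source priorities here are assumptions for illustration only, not a prescribed Salesforce or MDM-tool API.

```python
from datetime import date

# Hypothetical records describing the same customer in three source systems.
# In practice, these would arrive only after matching/de-duplication has
# established that they refer to the same real-world entity.
records = [
    {"source": "CRM",     "email": "amy@example.com",     "phone": None,
     "last_updated": date(2023, 5, 1)},
    {"source": "Billing", "email": "a.smith@example.com", "phone": "555-0100",
     "last_updated": date(2023, 6, 12)},
    {"source": "Support", "email": None,                  "phone": "555-0199",
     "last_updated": date(2023, 4, 20)},
]

# Assumed survivorship rule: prefer Billing, then CRM, then Support.
SOURCE_PRIORITY = ["Billing", "CRM", "Support"]

def consolidate(records, attributes):
    """Build a golden record by taking each attribute from the
    highest-priority source that has a non-empty value for it."""
    by_source = {r["source"]: r for r in records}
    golden = {}
    for attr in attributes:
        for source in SOURCE_PRIORITY:
            value = by_source.get(source, {}).get(attr)
            if value is not None:
                golden[attr] = value
                break  # highest-priority non-empty value survives
    return golden

golden_record = consolidate(records, ["email", "phone"])
print(golden_record)
# {'email': 'a.smith@example.com', 'phone': '555-0100'}
```

Real implementations typically combine several survivorship rules (source priority, most recently updated, most complete), and the chosen rule can differ per attribute; the single-rule version above is just the simplest starting point.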