Salesforce Data Architecture and Management

By: Ahsan Zafar

Overview of this book

As Salesforce orgs mature over time, data management and integrations are becoming more challenging than ever. Salesforce Data Architecture and Management follows a hands-on approach to managing data and tracking the performance of your Salesforce org. You’ll start by understanding the role and skills required to become a successful data architect. The book focuses on data modeling concepts, how to apply them in Salesforce, and how they relate to objects and fields in Salesforce. You’ll learn the intricacies of managing data in Salesforce, starting from understanding why Salesforce has chosen to optimize for read rather than write operations. After developing a solid foundation, you’ll explore examples and best practices for managing your data. You’ll understand how to manage your master data and discover what the Golden Record is and why it is important for organizations. Next, you'll learn how to align your MDM and CRM strategy with a discussion on Salesforce’s Customer 360 and its key components. You’ll also cover data governance, its multiple facets, and how GDPR compliance can be achieved with Salesforce. Finally, you'll discover Large Data Volumes (LDVs) and best practices for migrating data using APIs. By the end of this book, you’ll be well-versed with data management, data backup, storage, and archiving in Salesforce.
Table of Contents (14 chapters)
Section 1: Data Architecture and Data Management Essentials
Section 2: Salesforce Data Governance and Master Data Management
Section 3: Large Data Volumes (LDVs) and Data Migrations

Types of backup

In this section, we will look at the different types of backup and the use cases for them. There are three types of backup that can be performed:

  • Full backup: The scope of a full backup is all data, including metadata. It provides the peace of mind that everything in your org is backed up. The downside is that, because the entire database is captured, restoring a subset of data can take a lot of time.
  • Incremental backup: This type of backup includes any data that has changed since the last backup was taken. With incremental backups, it is much quicker to retrieve data from a certain point in time. However, a restored incremental backup may not contain all of the related data; to recreate it, a full backup must be restored first, and then all subsequent incremental backups applied in order to reach the point just prior to the data loss. This also depends on the frequency of data backups, as backup jobs...
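The relationship between full and incremental backups described above can be sketched in a few lines of Python. This is a toy, in-memory simulation, not a Salesforce API call: the records and the `SystemModstamp` field name merely mirror how Salesforce timestamps record changes, and all data here is fabricated for illustration.

```python
from datetime import datetime

# Hypothetical in-memory "org" data; Ids and values are illustrative only.
records = [
    {"Id": "001A", "Name": "Acme",    "SystemModstamp": datetime(2023, 1, 5)},
    {"Id": "001B", "Name": "Globex",  "SystemModstamp": datetime(2023, 2, 10)},
    {"Id": "001C", "Name": "Initech", "SystemModstamp": datetime(2023, 3, 1)},
]

def full_backup(recs):
    """A full backup copies every record (in practice, metadata as well)."""
    return list(recs)

def incremental_backup(recs, last_backup_time):
    """An incremental backup keeps only records changed since the last backup."""
    return [r for r in recs if r["SystemModstamp"] > last_backup_time]

def restore(full, incrementals):
    """Restore the full backup first, then replay each incremental in order."""
    state = {r["Id"]: r for r in full}
    for inc in incrementals:
        for r in inc:
            state[r["Id"]] = r  # newer copy of the record wins
    return state
```

Note how `restore` encodes the trade-off from the bullet above: an incremental backup alone is not enough to rebuild the system; it must be layered on top of a full backup, and the more incrementals that have accumulated, the more replay work a restore requires.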