SAP Data Services 4.x Cookbook
Overview of this book

Want to cost-effectively deliver trusted information to all of your crucial business functions? SAP Data Services delivers one enterprise-class solution for data integration, data quality, data profiling, and text data processing. It boosts productivity with a single solution for data quality and data integration, and it also enables you to move, improve, govern, and unlock big data. This book will lead you through the SAP Data Services environment to efficiently develop ETL processes. To begin with, you’ll learn to install, configure, and prepare the ETL development environment. You will then become familiar with the concepts of developing ETL processes with SAP Data Services. Starting from the smallest unit of work, the data flow, the chapters will lead you to the highest organizational unit, the Data Services job, revealing the advanced techniques of ETL design. You will learn to import XML files by creating and implementing real-time jobs. The book will then guide you through the ETL development patterns that enable the most effective performance when extracting, transforming, and loading data. You will also find out how to create validation functions and transforms. Finally, the book will show you the benefits of data quality management with the help of another SAP solution, Information Steward.

Use case example – populating dimension tables


In this recipe, we will build the ETL job to populate two dimension tables in the AdventureWorks_DWH database, DimGeography and DimSalesTerritory, with the data from the operational database AdventureWorks_OLTP.
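Before building the job, it helps to see how the OLTP source tables relate to one another. The following T-SQL query is a minimal sketch against the standard AdventureWorks OLTP schema (your table and column names may differ in a customized copy); it previews the geography and territory rows that will eventually feed DimGeography and DimSalesTerritory:

    -- Preview the source rows behind DimGeography and DimSalesTerritory.
    -- Assumes the standard AdventureWorks OLTP schema; adjust names if
    -- your copy of the database differs.
    SELECT
        a.City,
        sp.Name    AS StateProvinceName,
        cr.Name    AS CountryRegionName,
        st.Name    AS TerritoryName,
        st.[Group] AS TerritoryGroup
    FROM Person.Address AS a
    JOIN Person.StateProvince AS sp
        ON a.StateProvinceID = sp.StateProvinceID
    JOIN Person.CountryRegion AS cr
        ON sp.CountryRegionCode = cr.CountryRegionCode
    JOIN Sales.SalesTerritory AS st
        ON sp.TerritoryID = st.TerritoryID;

Each of these four tables will get its own extraction workflow in the steps that follow.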

Getting ready

For this recipe, you will have to create a new job. Also, create two new schemas in the STAGE database: Extract and Transform. To do this, open SQL Server Management Studio, expand Databases | STAGE | Security | Schemas, right-click on the Schemas folder, and choose the New Schema… option from the context menu. Specify your administrator user account as the schema owner.
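If you prefer scripting to the GUI, the same schemas can be created with T-SQL. This is a minimal sketch; it assumes dbo as the owning account, so substitute your administrator account name if it differs:

    USE STAGE;
    GO
    -- Create the two staging schemas used by this recipe.
    -- The AUTHORIZATION clause sets the schema owner; replace dbo
    -- with your administrator account if needed.
    CREATE SCHEMA Extract AUTHORIZATION dbo;
    GO
    CREATE SCHEMA Transform AUTHORIZATION dbo;
    GO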

How to do it…

  1. In the first step, we will create the extraction processes:

    1. Open the job context and create the WF_extract workflow.

    2. Open the WF_extract workflow in the workspace and create four workflows, one for each source table we extract from the OLTP database: WF_Extract_SalesTerritory, WF_Extract_Address, WF_Extract_StateProvince, WF_Extract_CountryRegion...