SAP Data Services 4.x Cookbook
Overview of this book

Want to cost-effectively deliver trusted information to all of your crucial business functions? SAP Data Services delivers one enterprise-class solution for data integration, data quality, data profiling, and text data processing. It boosts productivity with a single solution for data quality and data integration. SAP Data Services also enables you to move, improve, govern, and unlock big data. This book will lead you through the SAP Data Services environment to efficiently develop ETL processes. To begin with, you'll learn to install, configure, and prepare the ETL development environment. You will become familiar with the concepts of developing ETL processes with SAP Data Services. Starting from the smallest unit of work, the dataflow, the chapters will lead you to the highest organizational unit, the Data Services job, revealing the advanced techniques of ETL design. You will learn to import XML files by creating and implementing real-time jobs. The book will then guide you through the ETL development patterns that enable the most effective performance when extracting, transforming, and loading data. You will also find out how to create validation functions and transforms. Finally, the book will show you the benefits of data quality management with the help of another SAP solution, Information Steward.

Debugging job execution


Here, I will explain the use of the Data Services Interactive Debugger. In this recipe, I will debug the DF_Transform_DimGeography dataflow.

Debugging is the process of defining points in the ETL code (in a dataflow in particular) that you want to monitor closely during job execution. By monitoring closely, I mean actually seeing the rows pass through, or even pausing execution at those points to investigate the current record in more detail.

Those points in the code are called breakpoints, and they are usually placed before and after particular transform objects in order to see the effect a given transformation has on the passing rows.
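In Data Services, breakpoints are set graphically in the Designer, not in code, but the idea of watching a row before and after a transform can be sketched in plain Python. The following is an illustrative analogy only: the `transform` function, the row layout, and the `watch` callback are all hypothetical and stand in for a Data Services transform and the breakpoints placed on either side of it.

```python
# Illustrative analogy: mimic breakpoints placed before and after a
# transform, watching each row as it passes through the pipeline.
# The transform and row structure are hypothetical examples.

def standardize_city(row):
    # Hypothetical transform: normalize the capitalization of a city name
    return {**row, "City": row["City"].title()}

def debug_pipeline(rows, transform, watch):
    """Pass each row through `transform`, invoking `watch` before and
    after it - like breakpoints on both sides of a transform object."""
    for row in rows:
        watch("before", row)       # breakpoint before the transform
        out = transform(row)
        watch("after", out)        # breakpoint after the transform
        yield out

rows = [{"City": "new york"}, {"City": "LONDON"}]
seen = []
result = list(debug_pipeline(rows, standardize_city,
                             lambda point, r: seen.append((point, r["City"]))))
```

Inspecting `seen` shows each row twice, once on each side of the transform, which is exactly the view the Interactive Debugger gives you when breakpoints surround a transform object.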

Getting ready…

The easiest way to debug a specific dataflow is to copy it into a separate test job. Create a new job called Job_Debug and copy DF_Transform_DimGeography into it from the workflow workspace where it is currently located, or just drag and drop the dataflow object into the Job_Debug workspace...