SAP Data Services 4.x Cookbook
Overview of this book

Want to cost-effectively deliver trusted information to all of your crucial business functions? SAP Data Services delivers one enterprise-class solution for data integration, data quality, data profiling, and text data processing. It boosts productivity with a single solution for data quality and data integration. SAP Data Services also enables you to move, improve, govern, and unlock big data. This book will lead you through the SAP Data Services environment to efficiently develop ETL processes. To begin with, you'll learn to install, configure, and prepare the ETL development environment. You will become familiar with the concepts of developing ETL processes with SAP Data Services. Starting from the smallest unit of work, the data flow, the chapters will lead you to the highest organizational unit, the Data Services job, revealing the advanced techniques of ETL design. You will learn to import XML files by creating and implementing real-time jobs. The book will then guide you through the ETL development patterns that enable the most effective performance when extracting, transforming, and loading data. You will also find out how to create validation functions and transforms. Finally, the book will show you the benefits of data quality management with the help of another SAP solution, Information Steward.

Creating variables and parameters

In this recipe, we will extend the functionality of our Hello World dataflow (see the Understanding the Designer tool recipe from Chapter 2, Configuring the Data Services Environment). Along with the first row saying "Hello World!", we will generate a second row containing the name of the Data Services job that produced the greeting.

This example will not only show how variables and parameters are created, but will also introduce one of the built-in Data Services functions.
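The built-in function in question is job_name(), which returns the name of the currently executing job. As a sketch of the kind of script object we will build (the variable name $g_JobName is illustrative, not prescribed by the recipe):

```
# Script object placed before the dataflow in the job workspace.
# job_name() returns the name of the job currently being executed;
# we capture it in a global variable for later use.
$g_JobName = job_name();

# Optionally write the value to the trace log to confirm the assignment.
print('Running job: ' || $g_JobName);
```

The || operator is the Data Services string concatenation operator, and print() writes a message to the job's trace log.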

Getting ready

Launch your Designer tool and open the Job_HelloWorld job created in the previous chapter.

How to do it…

We will parameterize our dataflow so that it can receive, from outside, the name of the job in which it is executed, and create the second row accordingly.
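Inside the dataflow, the mapping expression for the second row can then reference the dataflow's input parameter. As an illustrative sketch (the parameter name $p_JobName is an assumption, not from the recipe):

```
# Example mapping expression in a Query transform output column:
# concatenates a fixed greeting with the job name passed in
# through the dataflow parameter.
'This greeting was generated by job: ' || $p_JobName
```

The parameter itself receives its value at the job level, where the global variable populated by the script is passed into the dataflow call.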

We will also require an extra object in our job, in the form of a script that will be executed before the dataflow and that will initialize our variables before passing their values to the...