Microsoft SQL Server 2012 Integration Services: An Expert Cookbook
Overview of this book

SQL Server Integration Services (SSIS) is a leading tool in the data warehouse industry, used for performing extraction, transformation, and load operations. This book is aligned with the most common methodology associated with SSIS, known as Extract, Transform, and Load (ETL). ETL is responsible for extracting data from several sources, cleansing and customizing it, and loading it into a central repository normally called a Data Warehouse or Data Mart.

Microsoft SQL Server 2012 Integration Services: An Expert Cookbook covers all aspects of SSIS 2012, with plenty of real-world scenarios to help readers understand the use of SSIS in every environment. It is written by two SQL Server MVPs who have in-depth knowledge of SSIS, having worked with it for many years.

The book starts by creating simple data transfer packages with wizards, and then illustrates how to create more complex data transfer packages, troubleshoot packages, build robust SSIS packages, and boost the performance of data consolidation with SSIS. It then covers data flow transformations and advanced transformations for data cleansing, fuzzy transformations, and term extraction in detail. Finally, the book dives deep into making packages dynamic with the help of expressions and variables, and into performance tuning and considerations.

Dynamic data transfer with different data structures


Dynamic data transfer is one of the most challenging tasks in data integration tools, and the challenge is even greater when the structure of the data differs between sources. As you saw in the previous recipe, the SSIS Data Flow Task cannot work with dynamic metadata: if the structure of the columns changes (column names, column data types, or the number of columns), the Data Flow Task will fail validation and stop working.

The good news is that there are alternative ways to transfer data, based on the source and destination, outside the Data Flow Task.

In this recipe, we will see an example of dynamic data transfer from a SQL Server database to CSV flat files, where the source tables may have different data structures. At the end of this recipe, we will discuss other sources and destinations.
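One way this kind of dynamic transfer can be sketched in T-SQL is by shelling out to the bcp utility through xp_cmdshell: because the command is built as a string, the table name can change at runtime without triggering any Data Flow metadata validation. This is only an illustrative sketch, not the recipe's exact package design; the database name, table name, file path, and server name below are hypothetical placeholders.

```sql
-- Build a bcp export command dynamically so the table name can change at runtime.
-- MyDatabase, dbo.MyTable, C:\Export\MyTable.csv, and localhost are placeholders.
DECLARE @TableName sysname        = N'dbo.MyTable';
DECLARE @FilePath  nvarchar(260)  = N'C:\Export\MyTable.csv';
DECLARE @Cmd       nvarchar(4000);

-- bcp flags: -c = character mode, -t, = comma field terminator, -T = trusted connection
SET @Cmd = N'bcp MyDatabase.' + @TableName
         + N' out "' + @FilePath + N'" -c -t, -T -S localhost';

EXEC master..xp_cmdshell @Cmd;
```

Because the query is assembled and executed outside the Data Flow Task, the same statement works regardless of how many columns the source table has.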

Getting ready

xp_cmdshell should be enabled for this recipe. To enable xp_cmdshell, run the following statements in SSMS on the local default instance of the SQL...
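The statements themselves are truncated above; on a default installation, enabling xp_cmdshell is commonly done through sp_configure, roughly as follows. Note that xp_cmdshell is an advanced option, so 'show advanced options' must be switched on first; this is a general sketch, not necessarily the exact script the recipe uses.

```sql
-- xp_cmdshell is an advanced option, so expose advanced options first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Enable xp_cmdshell itself, then apply the change.
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```

Keep in mind that xp_cmdshell runs operating-system commands under the SQL Server service account, so it should only be enabled where that security trade-off is acceptable.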