Microsoft SQL Server 2012 Integration Services: An Expert Cookbook
Overview of this book

SQL Server Integration Services (SSIS) is a leading tool in the data warehouse industry, used for performing extraction, transformation, and load operations. This book is aligned with the most common methodology associated with SSIS, known as Extract, Transform, and Load (ETL); ETL is responsible for the extraction of data from several sources, its cleansing, customization, and loading into a central repository normally called a Data Warehouse or Data Mart. Microsoft SQL Server 2012 Integration Services: An Expert Cookbook covers all aspects of SSIS 2012, with plenty of real-world scenarios to help readers understand how SSIS is used in every environment. It is written by two SQL Server MVPs who have in-depth knowledge of SSIS, having worked with it for many years. The book starts by creating simple data transfer packages with wizards, then illustrates how to create more complex data transfer packages, troubleshoot packages, make SSIS packages robust, and boost the performance of data consolidation with SSIS. It then covers data flow transformations and advanced transformations for data cleansing, fuzzy matching, and term extraction in detail. Finally, the book dives deep into making packages dynamic with the help of expressions and variables, and into performance tuning and related considerations.

Working with flat files in Data Flow


In this recipe, we will demonstrate the use of the Flat File Source component, which is often used in data integration projects. As explained in Chapter 2, Control Flow Tasks, for security reasons the Operational Systems (OS) owners usually prefer to push data to an external location instead of providing direct access to the OS.

Data quality problems exist even in conventional databases such as SQL Server, Oracle, DB2, and so on, so it is easy to imagine the problems that can arise when dealing with flat files. The construction of these files can cause several issues, because the external system that reads each record in the file (for example, the Extract step of the ETL process) needs to know how to split the data into rows and columns. The row delimiter is required in order to separate the rows, whereas the column delimiter is required to split each row into columns. In many cases, however, the column delimiter can also appear in the column content...
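A common remedy when the column delimiter appears inside a value is a text qualifier (typically double quotes), which SSIS exposes as the Text qualifier property of the Flat File Connection Manager. The sketch below illustrates the idea outside SSIS, using Python's standard csv module; the sample data is invented for illustration only:

```python
import csv
import io

# Sample flat file content: the column delimiter (comma) also appears
# inside the Name column, so a text qualifier (double quotes) is needed
# to keep "Smith, John" from being split into two columns.
raw = 'Id,Name,City\n1,"Smith, John",London\n2,"Doe, Jane",Paris\n'

# quotechar plays the same role as the Text qualifier in the
# SSIS Flat File Connection Manager.
reader = csv.reader(io.StringIO(raw), delimiter=',', quotechar='"')
rows = list(reader)

print(rows[1])  # ['1', 'Smith, John', 'London']
```

Without the qualifier, a naive split on the comma would produce four columns for the second row instead of three, which is exactly the kind of flat file quality problem this recipe deals with.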