SQL Server 2017 Integration Services Cookbook

By: Christian Cote, Dejan Sarka, David Peter Hansen, Matija Lah, Samuel Lester, Christo Olivier

Overview of this book

SQL Server Integration Services (SSIS) is a tool that facilitates the extraction, transformation, and loading (ETL) of data, SQL Server coding enhancements, data warehousing, and customization. With the help of the recipes in this book, you'll gain hands-on experience of SSIS 2017 as well as the new 2016 features, along with design and development improvements including SCD, tuning, and customization. You'll start by learning to install and set up SSIS and other SQL Server resources so you can make optimal use of these Business Intelligence tools. We'll then take you through the new features in SSIS 2016/2017 and show how to implement them to build a modern, scalable ETL solution that fits the modern data warehouse. Over the course of the book, you will learn how to design and build SSIS data warehouse packages using SQL Server Data Tools, and how to develop SSIS packages that maintain a data warehouse using the Data Flow and other control flow tasks. Many recipes also demonstrate how to cleanse data and what the end result looks like after applying different transformations. Real-world scenarios are covered as well, including how to handle the various issues you might face when designing your packages. By the end of this book, you'll know all the key concepts needed to perform data integration and transformation, you'll have explored on-premises Big Data integration processes for building a classic data warehouse, and you'll know how to extend the toolbox with custom tasks and transforms.
Table of Contents (18 chapters)
Title Page
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Customer Feedback
Preface

Managing data with Pig Latin


Apache Pig is one of the services available on Big Data clusters such as HDInsight. Its scripting language, Pig Latin, runs scripts that can accept virtually any type of data; "Pigs eat anything," as the creators' mantra states.
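To give a feel for Pig Latin's schema-optional style, here is a minimal sketch; the file paths and field names are hypothetical, not part of this recipe:

```pig
-- Load a comma-delimited file without declaring any schema;
-- Pig treats each line as a tuple of untyped fields.
raw = LOAD 'wasbs:///Import/sample.csv' USING PigStorage(',');

-- The same file can also be loaded with an explicit schema when one is known.
typed = LOAD 'wasbs:///Import/sample.csv'
        USING PigStorage(',')
        AS (order_id:int, amount:double);

-- Store the untyped relation back out, pipe-delimited.
STORE raw INTO 'wasbs:///Export/sample_out' USING PigStorage('|');
```

Because the first LOAD declares no schema, Pig imposes no structure on the input; this is what lets the same script consume text files of almost any shape.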

This recipe simply shows how to call a basic Pig script; no transformations are performed. Its purpose is to demonstrate how an Azure HDInsight Pig Task can be used within SSIS.

Getting ready

This recipe assumes that you have successfully created an HDInsight cluster.

How to do it...

  1. In the StgAggregatedSales.dtsx SSIS package, drag and drop an Azure HDInsight Pig Task onto the control flow. Rename it apt_AggregateData.
  2. Double-click on it to open the Azure HDInsight Pig Task Editor and set the properties as shown in the following screenshot:

  3. In the Script property, insert the following code:
-- Load the aggregated sales extract from Azure Blob storage.
SalesExtractsSource = LOAD 'wasbs:///Import/FactOrdersAggregated.txt';
-- Remove the export folder if it already exists.
rmf wasbs:///Export/;
-- Write the data back out, pipe-delimited.
STORE SalesExtractsSource INTO 'wasbs:///Export/' USING PigStorage('|');
  4. The first...