Pentaho 3.2 Data Integration: Beginner's Guide
Overview of this book

Pentaho Data Integration (a.k.a. Kettle) is a full-featured open source ETL (Extract, Transform, and Load) solution. Although PDI is a feature-rich tool, effectively capturing, manipulating, cleansing, transferring, and loading data can get complicated. This book is full of practical examples that will help you take advantage of Pentaho Data Integration's graphical, drag-and-drop design environment. You will quickly get started with Pentaho Data Integration by following the step-by-step guidance in this book. The useful tips in this book will encourage you to exploit the powerful features of Pentaho Data Integration and perform ETL operations with ease.

Starting with the installation of the PDI software, this book will teach you all the key PDI concepts. Each chapter introduces new features, allowing you to gradually get involved with the tool. First, you will learn to work with plain files and to do all kinds of data manipulation. Then, the book gives you a primer on databases and teaches you how to work with databases inside PDI. You will also be introduced to data warehouse concepts and will learn how to load data into a data warehouse. After that, you will learn to implement simple and complex processes.

Once you have learned all the basics, you will build a simple data mart that reinforces all the concepts learned throughout the book.

Time for action – loading the fact table using a range of dates obtained from the command line


Now you will get the range of dates from the command line and load the fact table using that range:

  1. Create a new transformation.

  2. With a Get system info step, get the first two arguments from the command line and name them date_from and date_to.

  3. By using a couple of steps, check that the arguments are not null, have the proper format (yyyy-mm-dd), and are valid dates (one way to implement this check is sketched just after this list).

  4. If something is wrong with the arguments, abort.

  5. If the arguments are valid, use a Set variables step to set two variables named DATE_FROM and DATE_TO.

  6. Save the transformation in the same folder you saved the transformation that loads the fact table.

  7. Test the transformation by providing valid and invalid arguments to see that it works as expected (see the note on running the transformation with Pan after this list).

  8. Create a job and save it in the same folder you saved the job that loads the dimensions.

  9. Drag to the canvas a START and two transformation job entries, and link them one after the other.

  10. Use the first...
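
The book leaves the choice of validation steps in step 3 open. One convenient option is to do the whole check in a single Modified Java Script Value step. The following is a minimal sketch, not the book's exact solution: it assumes the incoming fields are named date_from and date_to as created in step 2, that the step runs with compatibility mode off so the fields arrive as plain strings, and that a flag named date_ok (a hypothetical name) is declared as a new String field in the step's fields grid:

    // Minimal sketch for a Modified Java Script Value step.
    // Assumptions: date_from and date_to arrive as strings (compatibility
    // mode off); date_ok is a hypothetical flag declared in the fields grid.
    var mask = "yyyy-MM-dd";
    var pattern = /^\d{4}-\d{2}-\d{2}$/;
    var date_ok = "N";

    if (date_from != null && date_to != null &&
        pattern.test(date_from) && pattern.test(date_to)) {
        try {
            var d1 = str2date(date_from, mask);
            var d2 = str2date(date_to, mask);
            // Round-trip the parsed dates: this rejects values such as
            // 2009-02-31, which a lenient parser would roll over to a
            // different date instead of failing.
            if ("" + date2str(d1, mask) == "" + date_from &&
                "" + date2str(d2, mask) == "" + date_to) {
                date_ok = "Y";
            }
        } catch (e) {
            // str2date() could not parse the string; date_ok stays "N".
        }
    }

A Filter rows step placed right after can then route the flow: rows where date_ok equals Y continue to the Set variables step of step 5, while the rest go to an Abort step, as required by step 4.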
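
When you run the tests of step 7 from a terminal, remember that with Pan the plain arguments go after the options. Assuming, hypothetically, that you saved the transformation as load_fact_args.ktr, a valid run on a Unix-like system would look like this:

    sh pan.sh -file=load_fact_args.ktr 2009-01-01 2009-03-31

and like this on Windows:

    pan.bat /file:load_fact_args.ktr 2009-01-01 2009-03-31

Then repeat the run with a malformed value such as 2009-13-01, or with the second argument missing, and verify that the transformation aborts. You can run the same tests inside Spoon by typing the values in the arguments grid of the execute window.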