Debugging job execution
Here, I will explain the use of the Data Services Interactive Debugger. In this recipe, I will debug the DF_Transform_DimGeography dataflow.
Debugging is the process of defining points in the ETL code (a dataflow in particular) that you want to monitor closely during job execution. By monitoring closely, I mean actually seeing the rows passing through, or even pausing execution at those points to investigate the current record in more detail.
Those points in the code are called breakpoints, and they are usually placed before and after particular transform objects in order to see the effect a particular transformation has on each passing row.
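To make the idea concrete, here is a minimal sketch (in Python, not Data Services code) of what breakpoints placed around a transform conceptually do: each row is exposed for inspection immediately before and immediately after the transformation. All names here (`transform`, `run_with_breakpoints`, the `City` column) are illustrative inventions, not part of the Data Services API.

```python
def transform(row):
    """Example transform: upper-case the City column."""
    out = dict(row)
    out["City"] = out["City"].upper()
    return out

def run_with_breakpoints(rows, transform, on_break):
    """Pass each row through the transform, invoking on_break before and
    after it, much as breakpoints placed around a transform object let you
    inspect the row pre- and post-transformation."""
    results = []
    for row in rows:
        on_break("before", dict(row))   # breakpoint before the transform
        row = transform(row)
        on_break("after", dict(row))    # breakpoint after the transform
        results.append(row)
    return results

captured = []
rows = [{"City": "Berlin"}, {"City": "Lyon"}]
out = run_with_breakpoints(rows, transform,
                           lambda point, row: captured.append((point, row)))
```

In the Interactive Debugger, the `on_break` callback corresponds to the debugger pausing the job and showing you the current record in the workspace.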
Getting ready
The easiest way to debug a specific dataflow is to copy it into a separate test job. Create a new job called Job_Debug and copy DF_Transform_DimGeography into it from the workflow workspace where it currently resides, or just drag and drop the dataflow object into the Job_Debug workspace...