This post builds on my earlier article
http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html
where I explained the usage of looping logic in SSIS for automated file processing on a daily basis. One of the common issues we face in the above scenario is the ability to handle error files within the loop. The default behavior is the package failing with an error when it encounters a file with errors, but in reality we might want to record the error file details and continue processing the rest of the loop. This blog post gives you an approach that can be used to handle such scenarios.
The package looks as shown below.
The steps involved are:
- A truncate step to clear the processing tables before each run
- A For Each Loop to iterate through the files and, for each one, perform the following activities: capture the audit start info, do the actual data transfer, and capture the audit end info (a sketch of these statements follows this list)
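As a rough sketch of what the truncate and audit steps might run (the table and column names here are assumptions for illustration, not taken from the original package; in the Execute SQL Tasks the ? markers would be bound to package variables such as a hypothetical User::FileName via parameter mapping):

```sql
-- Truncate step: clear the processing (staging) table before each run.
-- dbo.ProcessingTable is a hypothetical name for illustration.
TRUNCATE TABLE dbo.ProcessingTable;

-- Audit start: run before the data flow, once per file in the loop.
INSERT INTO dbo.FileProcessControl (FileName, StartTime, Status)
VALUES (?, GETDATE(), 'In Progress');

-- Audit end: run after the data flow completes successfully.
UPDATE dbo.FileProcessControl
SET EndTime = GETDATE(),
    Status  = 'Success'
WHERE FileName = ? AND Status = 'In Progress';
```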
The source files are as follows
I’ve added files with error information (wrong metadata) to mimic the scenario. I’ve also added an OnError event handler on the data flow task to capture the error file details as follows.
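A minimal sketch of what that handler's Execute SQL Task could run is below, assuming the error details land in the FileProcessControl audit table described next; the ? markers would be mapped to System::ErrorDescription and the loop's current file name variable.

```sql
-- OnError handler: mark the current file's audit row as errored.
-- First ? maps to System::ErrorDescription, second ? to the loop's
-- file name variable (e.g. a hypothetical User::FileName).
UPDATE dbo.FileProcessControl
SET Status      = 'Error',
    EndTime     = GETDATE(),
    ErrorDetail = ?
WHERE FileName = ? AND Status = 'In Progress';
```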
The audit information is captured via Execute SQL Tasks into a table called FileProcessControl.
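For reference, a plausible shape for this table is sketched below; the actual columns in the package may differ.

```sql
-- Hypothetical definition of the audit table used in this post.
CREATE TABLE dbo.FileProcessControl
(
    Id          INT IDENTITY(1,1) PRIMARY KEY,
    FileName    VARCHAR(500)  NOT NULL,
    StartTime   DATETIME      NULL,
    EndTime     DATETIME      NULL,
    Status      VARCHAR(50)   NULL,  -- 'In Progress' / 'Success' / 'Error'
    ErrorDetail VARCHAR(4000) NULL   -- populated by the OnError handler
);
```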
The flat file connection has its metadata set up as follows.
Any file whose metadata differs from the above will go down the error processing path.
The default package behavior of failing when an error file is encountered can be overridden as follows.
In the OnError event handler for the data flow task, open the Variables window and enable the display of system variables. Then change the value of the Propagate variable from True to False as shown below.
This ensures the error does not propagate and cause a package failure. On executing the package, error files will no longer cause the loop to break.
If we check the audit table, we can see that it has captured the error details and continued with the next file.
The result shows the package processing all the files regardless of the intermediate error files, with the status of each captured in the audit table.
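To review the outcome of a run, a simple query like the one below (written against the assumed table shape above) shows each file's status:

```sql
-- One row per processed file, including any captured error detail.
SELECT FileName, StartTime, EndTime, Status, ErrorDetail
FROM dbo.FileProcessControl
ORDER BY StartTime;
```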
I hope this logic can be used in similar scenarios for error handling within loops in SSIS.
There are also other properties you can take advantage of for error processing, like ForceExecutionResult and ForcedExecutionValue, depending on your scenario.
Feel free to get back to me if you need any more info on this. In case you need the full package, drop a comment and I'll share it with you.
Hi,
This is Sawan. I'm looking for interview questions on SQL Server and MSBI so that I can prepare myself for interviews.
Hi,
I have implemented the same thing by setting the Propagate option to False in the event handler. For me, the Data Flow Task status shows as success even for the bad file, and it's not executing the event handler script.
If the Data Flow Task shows success, it means there was no failure to start with, so the OnError event handler will never fire.
Can I ask what, in your view, constitutes a bad file? Why did the data flow task succeed if the file was bad? Do you mean the metadata of the file was OK but it contained some spurious data?