
SSIS Junkie

SSIS: Checkpoint clarification - Recovering failed packages

What are checkpoints?

SQL Server Integration Services (SSIS) contains a fantastic new feature called Checkpoints, which enables us to restart a package from a previous point of failure rather than running it from the start. Details about a package's execution are stored in a checkpoint file, which is read upon the next execution so that the package can determine which task to start from.

Why should we use them?

The benefits of this are obvious:

  • No need to re-run tasks that have already completed - invaluable if these tasks took a long time to complete.
  • We can effectively group together tasks into logical units of work that should complete or fail as one. In fact I would recommend this as a best practice.

How should we use them?

Checkpoints should be designed as part of your ETL process - do not assume that a checkpoint can be dropped into any point in your package and that the package will automatically recover from failure as you would expect. The use of checkpoints should be planned, with checkpoints placed at logical points in the package control-flow.

Here is a "for instance". Your package contains multiple sequence containers that run in parallel. Each container has an Execute SQL Task that truncates a table and then a data-flow that populates that table from a flat file. Each of the sequence containers has an OnSuccess precedence constraint going to a task that logs the fact that all of the data staging has occurred successfully. See the next screenshot:

It wouldn't make sense to put a checkpoint on any of these parallel sequence containers. They execute in parallel so we want them to complete or fail as one logical unit of work so we would put the checkpoint on the "Log Staging Completion" task. This means that:

  1. If any of the file staging routines fail, they effectively all fail. Next time round we run the package and all files will be staged again (without error this time, one would hope). Note that the package has been designed accordingly by ensuring that the destination tables are truncated each time. This ensures that the tables do not contain the same data twice.
  2. If any downstream tasks fail, we won't need to re-stage the files.
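The truncate-then-reload pattern that makes this re-run safe can be sketched outside SSIS. Here is a minimal illustration of the general idempotent-load technique, using Python and an in-memory SQLite table as hypothetical stand-ins for the Execute SQL Task and the data-flow (this is not SSIS itself):

```python
import sqlite3

def stage_file(conn, table, rows):
    """Truncate-then-load: safe to re-run after a failure because the
    destination is emptied first, so data is never loaded twice."""
    cur = conn.cursor()
    cur.execute(f"DELETE FROM {table}")  # stand-in for TRUNCATE TABLE
    cur.executemany(f"INSERT INTO {table} VALUES (?)", [(r,) for r in rows])
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (val TEXT)")

stage_file(conn, "staging", ["a", "b"])  # first run
stage_file(conn, "staging", ["a", "b"])  # re-run after a failure elsewhere
count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 2 - the re-run did not duplicate the data
```

Because the table is truncated at the start of the unit of work, re-running the whole sequence container after a failure leaves the staging tables in the same state as a clean first run.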

How do we configure a package to use checkpoints?

I have recently had some fun and games trying to get checkpoints to work for me, and Books Online (BOL) wasn't particularly helpful, so I thought I'd put the information up here.

Three properties of the package must be set; their effects are fairly self-explanatory:

  • CheckpointFileName - Name and location of the checkpoint file
  • CheckpointUsage - Should the package use a checkpoint file or not
  • SaveCheckpoints - Should the package save checkpoints or not

In my tests I set:

  • CheckpointFileName="c:\MyCheckpoint.txt"
  • CheckpointUsage=IfExists
  • SaveCheckpoints=TRUE
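As I understand it, these settings interact at package start roughly as follows. This is a toy sketch of the decision logic, not SSIS source code; the valid CheckpointUsage values are Never, IfExists and Always:

```python
import os

def resume_point(checkpoint_file, checkpoint_usage):
    """Rough model of where execution starts, based on CheckpointUsage."""
    if checkpoint_usage == "Never":
        return "start"                      # checkpoint file is ignored
    exists = os.path.exists(checkpoint_file)
    if checkpoint_usage == "Always" and not exists:
        # Always demands an existing checkpoint file; a missing one
        # fails the package immediately
        raise FileNotFoundError("checkpoint file required but not found")
    return "resume" if exists else "start"

# With IfExists, a missing file simply means a clean start
print(resume_point("no_such_checkpoint.txt", "IfExists"))  # start
```

This is why IfExists is usually the setting you want: the package starts cleanly when no checkpoint file exists and resumes when one does.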

There is a fourth property called FailPackageOnFailure that can affect behaviour, and BOL is a little ambiguous about its effects. I'll explain...

I tested the behaviour using a simple package containing 3 script tasks executing in series. The first always succeeds, the second always fails, and the third always succeeds.

[You can download this test package from here]

The FailPackageOnFailure property is a property of the package and of each container (including the TaskHost container) in your package. I tried various combinations of settings of this property on both the package and the "Failure" task. Here are the results.

Package.FailPackageOnFailure | "Failure".FailPackageOnFailure | Resultant behaviour
False | False | No checkpoint file is created. Subsequent executions begin at task "Success1".
False | True | The checkpoint file is created. Subsequent executions begin at task "Failure".
True | False | The checkpoint file is created. Subsequent executions begin at task "Success2".
True | True | The checkpoint file is created. Subsequent executions begin at task "Failure".

The first thing to realise: if the FailPackageOnFailure property of both your package and the failing task is set to FALSE, then you won't get a checkpoint file, regardless of the CheckpointFileName, CheckpointUsage and SaveCheckpoints settings.

I especially want to draw your attention to the third row in this table. On first execution the package stopped at task "Failure" (as expected), but on the subsequent execution this task was ignored and execution continued merrily on its way from the next task. Effectively the failed task was ignored. The hard and fast rule is: if you want execution to begin from a task following a failure on that task, set its FailPackageOnFailure property to true. This isn't immediately evident from reading BOL and is the main thing I wanted to draw your attention to here.
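To make the rule concrete, here is a toy Python simulation of my understanding of the restart behaviour. It models only the restart point, not the checkpoint file itself, and the task names mirror the test package:

```python
def run(tasks, fpof, checkpoint=None):
    """tasks: list of (name, succeeds). fpof: set of task names whose
    FailPackageOnFailure=True. Returns (tasks executed, new checkpoint).
    Key detail: a failing task is recorded as complete in the checkpoint
    unless its FailPackageOnFailure is True - which is why a restart can
    skip the failed task entirely (the third row of the table)."""
    completed = set(checkpoint or [])
    executed = []
    for name, succeeds in tasks:
        if name in completed:
            continue                      # restart skips completed tasks
        executed.append(name)
        if succeeds or name not in fpof:
            completed.add(name)
        if not succeeds:
            return executed, completed    # package stops on failure
    return executed, completed

first_run  = [("Success1", True), ("Failure", False), ("Success2", True)]
second_run = [("Success1", True), ("Failure", True),  ("Success2", True)]

# Row 3: "Failure".FailPackageOnFailure=False - restart skips the failed task
_, chk = run(first_run, fpof=set())
executed, _ = run(second_run, fpof=set(), checkpoint=chk)
print(executed)  # ['Success2']

# Rows 2 and 4: "Failure".FailPackageOnFailure=True - restart begins there
_, chk = run(first_run, fpof={"Failure"})
executed, _ = run(second_run, fpof={"Failure"}, checkpoint=chk)
print(executed)  # ['Failure', 'Success2']
```

The simulation reproduces the surprising third row: without FailPackageOnFailure=true on the failing task, the restart carries on from the task after the failure.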

I hope I've managed to demonstrate what I set out to: ensure you understand exactly how the configuration of your package and the tasks within it will affect execution in the event of a failure. If you want execution to restart from a given failed task, set that task's FailPackageOnFailure property to true (the default is false) or you won't get the expected results.

Feel free to download the test package and try for yourself. Checkpoints are wonderfully powerful but must be understood fully before you begin to use them!


P.S. Thanks to Nick Berezansky from the SSIS dev team for the help he has given me in understanding checkpoints.

Published Wednesday, April 20, 2005 9:36 PM by jamie.thomson



prasad said:

The downloaded package works fine. If I try to set FailPackageOnFailure = true on the package, it errors out saying "FailPackageOnFailure property is not applicable to the package container". Please post a follow-up.
July 26, 2005 2:21 PM

jamie.thomson said:

Hi Prasad,
I cannot replicate this problem. Please could you email me (via the Contact link on my blog) so that I can reply with my email address and you can send me the package.

July 28, 2005 9:02 AM

Robert said:

I am attempting to use an expression to specify a checkpoint filename; however, the file does not exist prior to package execution. The package fails with a "cannot find checkpoint file" error.

"\\" + "\\" + @[System::MachineName] + "\\" + @[User::EXECUTION_ENVIRONMENT] + "\\" + "checkpoint" +"\\"+  @[System::PackageName] + "_chkpoint.txt"

I intend to use the expression in a template package which will run in multiple execution environments without having to define a checkpoint file during setup. Any ideas on how to have the package create the file?

February 7, 2007 6:49 PM
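For reference, each "\\" in the SSIS expression above is an escaped single backslash, so the expression evaluates to a UNC path. A quick Python equivalent, with hypothetical stand-in values for the SSIS variables, shows the shape of the result:

```python
# Hypothetical values for the SSIS variables in the expression above
machine_name = "SRV01"        # @[System::MachineName]
environment  = "DEV"          # @[User::EXECUTION_ENVIRONMENT]
package_name = "MyPackage"    # @[System::PackageName]

# "\\" is one literal backslash in both SSIS expressions and Python
path = ("\\" + "\\" + machine_name + "\\" + environment +
        "\\" + "checkpoint" + "\\" + package_name + "_chkpoint.txt")
print(path)  # \\SRV01\DEV\checkpoint\MyPackage_chkpoint.txt
```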

jamie.thomson said:


Make sure you have CheckpointUsage='IfExists'


February 7, 2007 7:11 PM

Robert said:

Thanks! I was using CheckpointUsage= 'Required'. CheckpointUsage='IfExists' creates the checkpoint file when the package runs.

February 7, 2007 8:24 PM

Christophe said:

Hi Jamie,

I'm experiencing the same thing as Prasad since I upgraded to SP2 today. I'm no longer able to set FailPackageOnFailure to True on the package itself...

Never mind. I'm looking at a situation where I want to load heavy files in parallel, and it seems that the checkpoint file is written whenever the first error occurs.

Here's my "for instance". I have two Execute tasks in parallel. One of them fails; the other continues for a while and then ends successfully. I see a checkpoint is created, but on the next run both parallel Execute tasks are run again (and now both fail, of course).

Both Execute tasks have FailPackageOnFailure set to True; the package's is set to False (as in row 2 of your table).

May 29, 2007 3:58 PM

Christophe said:

Apparently there are some checkpoint issues with parallel tasks.


June 13, 2007 8:37 AM

sreenivas said:

Can I restart my package from exactly where it failed (at the row level)?

Scenario: out of 100 rows, 95 were loaded successfully and the load failed at the 96th row. Can I restart the package from the 96th row using checkpoints? Is there any workaround approach for it?

July 17, 2007 7:28 AM

jamie.thomson said:


No, you can't do that.


July 17, 2007 2:44 PM

Jim said:

Jamie -

You make the statement that "If any downstream tasks fail, we won't need to re-stage the files". This is what I'm hoping for, but not what I'm experiencing. I wonder if you can shed some light...

Suppose I had a package where I needed to FTP a file and then load the data into a table. Suppose my package had the following 4 tasks:

1) File system task to create the FTP directory

2) FTP task to download the file

3) Some other task (truncate table for instance)

4) Data Flow to load the staging table from the file downloaded in step 2.

Now, assume that a failure on step 4 should re-run steps 3 and 4, but not 1 and 2. To accomplish this, I would expect to set FailPackageOnFailure to True for tasks 2 and 4, but False on tasks 1 and 3. However, set like this, a failure on step 4 restarts at step 4. How do I make it restart at step 3?

Thanks in advance for any ideas...

- Jim

August 8, 2007 9:52 PM

jamie.thomson said:

August 8, 2007 10:10 PM

Eve said:

I have the same issue as Prasad. When I try to set the package's FailPackageOnFailure to true, I receive an error saying "FailPackageOnFailure property is not applicable to the package container". Any resolution to this?

August 17, 2007 7:54 PM

jamie.thomson said:


There is no resolution. Nor does there need to be - this is quite correct. There is no reason to set FailPackageOnFailure=TRUE on the package container; it needs to be set on tasks and containers.


August 17, 2007 8:24 PM

Louw said:

Hi Jamie,

Were you able to resolve Prasad's issue? I have the same limitation: I cannot set the "FailPackageOnFailure" property to True at the package level. I keep getting an error message.



November 19, 2007 12:44 PM

Dushyant said:

Hi Jamie,

I am trying to use the Required option for the TransactionOption property of the package, and Supported for the rest of the tasks and containers in the package. My requirement is to roll back all previous tasks if any task fails.

If I set TransactionOption to Required on a Sequence Container and Supported on the tasks inside it, will all the tasks that have completed successfully inside it roll back if a subsequent task fails?

I am facing an error stating: "Error: The SSIS Runtime has failed to start the distributed transaction due to error 0x8004D01B "The Transaction Manager is not available.". The DTC transaction failed to start. This could occur because the MSDTC Service is not running." This happens for the Required option only.

January 27, 2008 9:42 AM

nick said:

Hi people, I have crashed on a slightly different problem. I have a sequence container with "FailPackageOnFailure=true" set on it. In this container there are 2 tasks, the first preceding the second. Both tasks have "FailParentOnFailure=true" set. Both tasks are the same and their purpose is to drop table A.

1) I run the package and it fails, because there is no table to drop.

2) I create the table manually and run the package again.

3) I see that the first task is simply OMITTED and the second task runs.

In general, every time any task in a sequence container fails, next time it is omitted regardless of its status. How can this be fixed? Thanks

March 6, 2008 5:11 PM

pritesh said:

Hi Jamie,

I have to load large files into my databases. These files have 10 million+ records, and I want to know how I can ensure consistency of the load. That is, if the load fails partway through one file, say after 5 million rows, does SSIS automatically roll back all the batches that were successfully committed, or do we have to make some settings so that it manages it? Do tell me the different ways to manage consistency.

April 21, 2008 2:08 PM

jamie.thomson said:


It depends how you are loading the data. If you want all files to fail when one fails then you would be best to load them in a single transaction; hence, try using the Union All component or the MultiFlatFile adapter.

This may help:

Processing data from multiple files all at once



April 21, 2008 2:29 PM

pritesh said:

OK, I think I was not clear in my question. I need to load one file at a time, but I need to do it in such a way that if the load fails midway then the rows that are already written to the table get rolled back.

I am using the OLE DB destination with fast load, and I have 10 million rows per file to load. If I keep the batch size as 10,000 and the load fails after the 5 millionth row, I need to roll back all the batches that were successfully loaded and start over again for that particular file.

April 21, 2008 3:37 PM

jamie.thomson said:

If you are using OLE DB with fast load then all rows get committed or rolled back as a whole (I think), hence you don't need to do anything extra in order to get the behaviour you are after.


April 21, 2008 3:41 PM

Vik said:

Hi Jamie,

---"If you are using OLE DB with fast load then all rows get committed or rolled back as a whole (I think), hence you don't need to do anything extra in order to get the behaviour you are after."

Pritesh has set the Rows Per Batch setting in the OLE DB destination to 10,000, so won't it commit in batches of 10,000?

April 23, 2008 4:27 PM

jamie.thomson said:

Oh, I can never remember the intricacies of this stuff :)

Try it and find out.

April 24, 2008 10:15 AM

Krishna said:


When I use parallel data flow tasks after an SQL task which truncates all the tables in one step, and then reload the tables in parallel once the SQL task succeeds, I get an incorrect number of records in the table. Is there any issue with this approach?

April 25, 2008 8:02 PM

jamie.thomson said:


Are you loading the same table with these 2 data flow tasks?


April 27, 2008 7:38 PM

Krishna said:


Thank you so much for your response.

No, I am loading into 2 different tables. With the SQL task I am truncating those two tables. After the SQL task succeeds, I load these two tables in parallel. Visual Studio shows the correct number of records loaded into the tables, but when I query the tables I get only one record in one table, while in the second table I get the correct results.



April 28, 2008 4:37 PM

jamie.thomson said:


Sorry, but without being there it's impossible to say, I'm afraid.

Use SQL Profiler to see what SQL statements get issued against the destination.


April 29, 2008 11:57 AM

Krishna said:


I have created trace files using SQL Profiler. It is setting the row count with SET ROWCOUNT 1 before the source SQL runs. That's why I am getting only one record in the target table. I used SET ROWCOUNT 0 in the SQL query and I am getting the desired results.

But where in the package or data flow is it setting the ROWCOUNT to 0? I have no clue at all.

When run sequentially I don't see any problems.



April 29, 2008 10:50 PM

Krishna said:

I meant to say

But where in the package or data flow is it setting the ROWCOUNT to 1? I have no clue at all.


April 29, 2008 11:37 PM

Krishna said:


It is working fine in parallel now. I changed the data access mode in the destination from 'Table or view - fast load' to 'Table or view'.



April 29, 2008 11:47 PM

Ric said:


I have an SSIS package that uses a ForEach Loop Container with an Execute Package Task inside. It basically loops through a list of SSIS packages and runs them.

If one of the child packages falls over, the parent package fails.  If it is run again, it starts from the beginning (and not with the erroneous package).

So I have tried to implement Checkpoints...

I have tried all sorts of combinations but cannot get Checkpoints to work.

For the parent package:

CheckpointFileName is set to a text file

CheckpointUsage = IfExists

SaveCheckpoints = true

FailPackageOnFailure = false

The ForEach Loop Container has FailPackageOnFailure = true

But the package never manages to resume, even when the error is fixed.

Is it possible to checkpoint an SSIS package such as this?


May 11, 2008 12:21 PM

jamie.thomson said:

Hi Ric,

Does this help:

"The task host container, which encapsulates a single task, is the smallest atomic unit of work that can be restarted. The Foreach Loop container and a transacted container are also treated as atomic units of work."



May 12, 2008 9:53 AM

ric said:

Thanks Jamie...

Does that mean that you can only restart a package with a checkpoint from the beginning of a foreach loop?


May 13, 2008 2:14 PM

jamie.thomson said:


Yep, afraid so.


May 13, 2008 2:18 PM

Subrat said:

Hi Jamie,

First of all a big thanks for running this useful forum.

My problem:

I have an SSIS package which contains a single Sequence container and, within it, 8 data flow tasks. These data flow tasks insert into 8 tables respectively.

I am unable to implement the functionality that, if any one of the DFTs fails, all the insertions in previous data flow tasks should get rolled back.

For example, if DFT3 fails, then records inserted by DFT1 and DFT2 should get rolled back.

Please tell me how do i do this?



July 10, 2008 9:53 AM

jamie.thomson said:


Try using transactions:

SSIS: Package transactions



July 10, 2008 10:03 AM

Fadi Lteif said:

My problem is that I want to automatically re-run the failed package instead of waiting for me to come in in the morning and re-run it manually from SQL Agent. My load runs at midnight; if it fails, I want it to re-run automatically. How can that be achieved? Thank you

July 24, 2008 4:45 PM

jamie.thomson said:


Have you tried upping the retry count on the job?


July 24, 2008 4:54 PM

Fadi Lteif said:

Where and how do you do that?

July 24, 2008 7:50 PM

Fadi Lteif said:

I think you are referring to SQL Agent Job. I will give that a try along with using checkpoints..  Thanks

July 24, 2008 8:14 PM

Rajesh said:


I created an SSIS package for migrating flat file data to SQL Server 2005. I am reading the files from a shared folder, using a For-Each Loop Container to loop through the files in the folder.

I have 4 data flow tasks in total. In the 1st data flow task I open the flat file supplied by the For-Each Loop container and do some operations inside that data flow (but I am not editing anything in the source file).

Then in the next data flow task I open the same flat file again (assuming that the file I opened in the 1st data flow has been closed in memory, but I am not sure about that).

The actual problem I am facing: the scheduled package fails frequently, but not every time.

If anyone has any suggestions or solutions for the above problem, please help me out.

Thank you in advance.

August 28, 2008 7:43 AM

jamie.thomson said:


You'll have to provide more info (such as an error message).


August 28, 2008 9:09 AM

leo802 said:


I have a question regarding Robert's comment above:

"I am attempting to use an expression to specify a checkpoint filename; however, the file does not exist prior to package execution. The package fails with a "cannot find checkpoint file" error.

"\\" + "\\" + @[System::MachineName] + "\\" + @[User::EXECUTION_ENVIRONMENT] + "\\" + "checkpoint" +"\\"+  @[System::PackageName] + "_chkpoint.txt"

I intend to use the expression in a template package which will run in multiple execution environments without having to define a checkpoint file during setup. Any ideas on how to have the package create the file?"

I can't do this. I have beaten my head against it but can't make the path dynamic.

It seems that the path for CheckpointFileName is resolved when the package starts.

I made a variable which builds a path, based on environment, to a folder on a share.

Like //ABC/Bin/ with Dev, Integ or Prod folders.

I made a path like //ABC/Bin/Integ/checkpoint_Packagename_checkpoint.xml

Whenever I run the package I get the default path which is in the Path variable.

If I make it blank then it puts the file under the root drive, like C:\

What should I do? I don't get it. All variables are configured for expressions.

The package is set to IfExists as well, and the expression for the filename is also OK.

Please, can anyone help?



March 4, 2009 9:12 PM

SQL Server Best Practice » Use Checkpoints for SSIS Package Restartability said:

May 18, 2009 2:44 PM