EMC Consulting Blogs

SSIS Junkie

SSIS: Custom Logging Using Event Handlers

SQL Server Integration Services (SSIS) contains some really useful logging capabilities but, as with most things in SSIS, they are extensible. There are 2 methods of extending the logging capability of SSIS:

  • Build a custom log provider
  • Use event handlers

I am going to demonstrate the second of these methods - using SSIS event handlers to log to a custom logging table.

Step 1 - Create the log table

The script below will create a table into which we are going to log events from our SSIS packages.

CREATE TABLE SSISLog (
  EventID int identity(1,1) NOT NULL
, EventType varchar(20) NOT NULL        --e.g. OnPostExecute, OnError
, PackageName varchar(50) NOT NULL
, TaskName varchar(50) NOT NULL         --Task that raised the event
, EventCode int NULL                    --Some events (e.g. OnError) will provide a code
, EventDescription varchar(1000)        --Some events (e.g. OnInformation) will provide an event description
, PackageDuration int                   --How long the package has been executing for
, ContainerDuration int                 --How long the task that raised the event has been executing for
, InsertCount int                       --How many rows were inserted somewhere by the task
, UpdateCount int                       --How many rows were updated by the task
, DeleteCount int                       --How many rows were deleted by the task
, Host varchar(50)                      --Which machine the package was running on
, constraint PK_SSISLog PRIMARY KEY CLUSTERED (EventID DESC)
)

This table is by no means exhaustive. There is a multitude of other things that we could log should we want to but, for now, this will do for demo purposes.

Some things to notice here. All of the information that we are going to log will come straight out of SSIS. Most of it is provided by system variables; however, in the case of InsertCount, UpdateCount & DeleteCount we are going to have to do a bit of work in the package to get them populated, as you will see later.

Step 2 - Build the logging functionality

As I said, we are going to use event handlers to demonstrate the custom logging capability of SSIS. The event handlers provided with SSIS (N.B. The event handlers themselves are also extensible) are:

  • OnError
  • OnExecStatusChanged
  • OnInformation
  • OnPostExecute
  • OnPostValidate
  • OnPreExecute
  • OnPreValidate
  • OnProgress
  • OnQueryCancel
  • OnTaskFailed
  • OnVariableValueChanged
  • OnWarning

For the purposes of logging, the most important ones are OnPostExecute, OnError & OnWarning, and of these 3 the most important of all is probably OnPostExecute because it enables us to put a message into our log table after each task in our package has completed. Hence, I will use the OnPostExecute event handler for demo purposes.

The relevance of the container hierarchy

It is important to realise the role that the container hierarchy of a SSIS package plays here. Understanding containers is central to understanding SSIS and they are especially relevant to event handlers. An event handler can be attached to any container in the package and it will catch all events raised by that container and any of its child containers. Hence, we can attach an event handler to the package (which is itself a container) and this one event handler will catch all events of that event type raised by every container in the package. This is fantastically powerful because it means we do not have to build event handlers for each task in the package.

Note that it is possible for a container to "opt out" of having its events captured by an event handler so if, say, you had a sequence container for which you didn't see the relevance of capturing events, you can simply switch them off using the sequence container's DisableEventHandlers property.

Similarly, if you wanted only certain events of that sequence container to be captured by an event handler, you could control this using the System::Propagate variable - but I'll leave further discussion of that for another day!

I have implemented the insertions into the log table using an INSERT statement in an Execute SQL Task. Using the venerable property expressions to generate the SQLStatementSource property of the Execute SQL Task at runtime, we can dynamically log the execution duration, host, task that raised the event and a plethora of other useful information at our disposal. Here is a screenshot of our property expression that does this. This is where "the magic" happens, if you like :)
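For those that can't see the screenshot, the expression is along these lines. This is only a sketch: the User::InsertCount, User::UpdateCount and User::DeleteCount variable names, and the use of System::SourceName to pick up the name of the task that raised the event, are assumptions that you should adjust to match your own package:

```
"INSERT INTO SSISLog (EventType, PackageName, TaskName, EventCode, EventDescription,
                      PackageDuration, ContainerDuration, InsertCount, UpdateCount, DeleteCount, Host)
 VALUES ('OnPostExecute', '" + @[System::PackageName] + "', '" + @[System::SourceName] + "', 0, '', "
 + (DT_STR, 6, 1252) DATEDIFF("ss", @[System::StartTime], GETDATE()) + ", "
 + (DT_STR, 6, 1252) DATEDIFF("ss", @[System::ContainerStartTime], GETDATE()) + ", "
 + (DT_STR, 6, 1252) @[User::InsertCount] + ", "
 + (DT_STR, 6, 1252) @[User::UpdateCount] + ", "
 + (DT_STR, 6, 1252) @[User::DeleteCount] + ", '"
 + @[System::MachineName] + "')"
```

Note that the whole expression evaluates to a single string - the value that gets assigned to the SQLStatementSource property.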

Some things that you should note at this point:

  • We are mostly using system variables to get the information we're interested in. In other words, most of what you want is provided for you, you just have to pick it up and use it.
  • We are type casting all numeric values into strings. At first glance that may seem an odd thing to do but remember that all we're building here is a value for the SQLStatementSource property of the Execute SQL Task. This property is of type string. We're concatenating the various parts of our dynamic SQL statement and we want to be concatenating strings, not numbers.
  • PackageDuration and ContainerDuration are generated using the DATEDIFF function which returns the number of seconds between System::StartTime/System::ContainerStartTime and the current datetime returned by GETDATE().
  • The GETDATE() function used herein is an SSIS function, NOT the T-SQL function of the same name.
  • EventCode & EventDescription are populated with a zero and an empty string respectively. This is because they are not relevant for the OnPostExecute event and are more likely to be used with OnError, OnWarning & OnInformation.

For clarity, here is a screenshot showing the property expression for the OnError event.

As you can see, here we are able to use the System::ErrorCode & System::ErrorDescription variables to pick up information about the error that has been raised.
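Again, for those that can't see the screenshot, the OnError expression is a sketch along the following lines - it differs from the OnPostExecute one only in the EventType, EventCode and EventDescription portions (the DT_STR lengths and the use of System::SourceName are assumptions to adjust for your own package):

```
"INSERT INTO SSISLog (EventType, PackageName, TaskName, EventCode, EventDescription,
                      PackageDuration, ContainerDuration, InsertCount, UpdateCount, DeleteCount, Host)
 VALUES ('OnError', '" + @[System::PackageName] + "', '" + @[System::SourceName] + "', "
 + (DT_STR, 15, 1252) @[System::ErrorCode] + ", '" + @[System::ErrorDescription] + "', "
 + (DT_STR, 6, 1252) DATEDIFF("ss", @[System::StartTime], GETDATE()) + ", "
 + (DT_STR, 6, 1252) DATEDIFF("ss", @[System::ContainerStartTime], GETDATE()) + ", 0, 0, 0, '"
 + @[System::MachineName] + "')"
```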

There's one other important thing we need to do in our OnPostExecute event handler. You'll notice we are using 3 variables to track the number of rows processed by our tasks (more on this later). It is important that we reset these values to zero after every task otherwise the next task may report the same values. We do this using a script task. Here's the code to reset them (it's very easy - just don't forget to add the variables to the ReadWriteVariables property of the script task).
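The code amounts to little more than this (a sketch which assumes the three variables live in the default User namespace):

```vb
Public Class ScriptMain
    Public Sub Main()
        'Reset the rowcount variables so that the next task to raise
        'OnPostExecute doesn't report the previous task's values.
        'All three must be listed in the ReadWriteVariables property.
        Dts.Variables("InsertCount").Value = 0
        Dts.Variables("UpdateCount").Value = 0
        Dts.Variables("DeleteCount").Value = 0
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class
```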

Here's what your OnPostExecute event handler should now look like.

 

Step 3 - Let's test it!

OK, we've got our logging mechanism constructed, now let's use it! I've constructed a simple table with some data in it to which we are going to apply some inserts, updates & deletes. Here's the script to create the table and populate it with some data.

CREATE TABLE ExampleCustomerTable (
  CustomerID int identity(1,1)
, [Name] varchar(30)
, Age int
, DateOfBirth datetime
)

GO

INSERT ExampleCustomerTable ([Name], Age, DateOfBirth) VALUES ('Jamie', 26, '19770621')
INSERT ExampleCustomerTable ([Name], Age, DateOfBirth) VALUES ('Helen', 28, '19770520')
INSERT ExampleCustomerTable ([Name], Age, DateOfBirth) VALUES ('Jon', 27, '19770502')
INSERT ExampleCustomerTable ([Name], Age, DateOfBirth) VALUES ('Gerald', 66, '19390215')
INSERT ExampleCustomerTable ([Name], Age, DateOfBirth) VALUES ('Samantha', 33, '19741105')

And here's our data-flow that does some inserts, updates and deletes based on certain criteria (not important what that criteria is for now).

As I intimated earlier, we have to populate the InsertCount, UpdateCount & DeleteCount variables from within our data-flow. That we do using the Rowcount transformation. You can see in the above graphic that we have 3 Rowcount transformations, one for each of the 3 variables.

To test our OnError event handler I needed a task that raised an error. The simplest way of doing that is to have a script task that returns failure instead of success.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime


Public Class ScriptMain
    Public Sub Main()
        Dts.TaskResult = Dts.Results.Failure
    End Sub
End Class

OK then, that's everything. Here are the 2 tasks in our package. Note the captions on the tasks - these are the names of the tasks and you're going to be seeing them very soon.

 

Executing the package gives us the following in our logging table:

And that's it! Things to note about the log table here:

  • We have successfully got our rowcounts in there
  • The name of the task (actually the container) raising each event is in there, as is the machine that I ran it on (CGOJTHOMSON)
  • The error message from our script task got logged successfully
  • The "Raise an error" task raised an OnPostExecute event even though it also raised an OnError event
  • The package itself raised an OnPostExecute event (EventID=4)

[The only problem at the moment is that ContainerDuration is not getting populated but I think that's a separate issue and doesn't affect us here. I'll take it up with the dev team later.]

Let's round up what we've done:

  1. Created a logging table with our own customised schema
  2. Devised a method of logging to this table using the Execute SQL Task, property expressions and event handlers.
  3. Demo'd the use of the logging event handlers by building a simple package containing a data-flow.
  4. Demo'd the use of the Rowcount component to generate custom logging data that isn't available through the system variables.

You can download the package that I've built to demo this from here. The only thing you'll have to do to get it working is change the server name of the "LogDB" connection manager. All the scripts that you need to create the tables and data are somewhere to be found in this blog post. If there are any problems - let me know.

I hope this serves as a useful demo of how to incorporate custom logging into your systems.

-Jamie

N.B. This was built on IDW15, a beta version of SSIS that came out in June 2005.

 

UPDATE, 8th June 2008: In SSIS 2008 we have the ability to capture much richer contextual information than in SSIS 2005. Head for "Using @[System::SourceParentGUID] for custom logging" to find out more.

Published Saturday, June 11, 2005 3:29 PM by jamie.thomson

Comments

 

Gary said:

Jamie,

Thanks for the very well documented example! I started adding this to
my package, and ran into this:

My package has multiple parallel data flows. Using your code verbatim
won't work (fair enough:-) since it would create a race condition with
multiple packages writing to the same variables, all defined at the
global scope.

One way to address this would be to duplicate the variables at a lower
level (in my case at 5 foreach loops), but I suspect that would also
require duplicating 5 event handlers at the same scope level, while the
top-level foreach loops would need their own event handler to log their
execution (but without count variables.) Not very pretty.

A second way would be to instead put each data flow in its own package,
and include your code verbatim. Then add a master package to run the 5
sub packages in parallel. While this has a small drawback of making it
slightly harder to propagate code changes between my 5 data flows, it's
much cleaner and I'll be testing this.

Do you think my mini-analysis makes sense, or are there better ways to
handle event based logging with multiple data flows?

Thanks!
Gary
June 13, 2005 2:02 PM
 

jamie.thomson said:

Gary,
Thanks for trying this out - it's good to know people are finding it useful.

As a matter of fact I came up against the same problem myself whilst running parallel data-flows just last week - and yes, it IS a problem.
There is no obvious "correct" way of getting around it and both the methods you suggested are valid. I don't particularly like the idea of creating new event handlers that aren't at the package level because you lose the nice generic approach - but if it's the best way to do it then so be it.

Having said that, I like the idea of splitting into separate packages because this has the advantage of each data-flow being able to grab more memory than it would otherwise have been able to. Of course, if you're not processing huge amounts of data this isn't really a problem.

If you are going to split into sub-packages, have you considered whether you would be able to build the package just once and then "parameterise" it (using parent package variables) and hence call the same package 5 times with different parameters?

My solution when I faced this problem was to split into sub-packages. I initially had just one package which I called X times but ultimately I had to split into X distinct packages because I was using LOOKUP components and it is not possible to make the LOOKUP query dynamic.


Hope this helps.

Regards
Jamie
June 13, 2005 2:14 PM
 

Perry said:

Jamie (I couldn't respond over on the MSDN forum because of the bug in hotmail/passport leading to redirection failure, too many url redirections on Microsoft's servers),

This looks very promising.

However, I would not look forward to the copy&paste nightmare of copying and pasting this exact INSERT statement across every DTS package in every DTS solution -- the usual problem of widespread copy&paste, that it would discourage making any subsequent changes to it, because of the cost of changing all the copies, especially slowly in the Visual Studio editor.

I'm thinking that some of that problem could be alleviated by encapsulating the statement into a UDF (SQL Server user defined function). The variable access couldn't be so encapsulated unfortunately, but at least the INSERT statement could.

This really goes back to what I assume is still an outstanding problem in SQL Server 2005 SSIS, which is code reuse in expressions -- I don't know any solution for that -- that is, to be able to encapsulate expressions for reuse. I've not yet tried the June CTP though.

June 13, 2005 3:38 PM
 

jamie.thomson said:

Perry,
I couldn't agree more that code-reuse is the single biggest stumbling block at the moment in SSIS - and the situation isn't any better in the June CTP either I'm afraid. I have it on good authority that this is one of the major things they are looking at improving in vNext though.

My solution to this problem (in the scenario of logging) is two-fold
1) Encapsulate the Execute SQL task that does the logging in another package - all you do is pass it the values that it is going to log using parent package configurations. I have got this working and it works very well (except that it is slow when running in the designer with debugging).
2) Have a template package containing all of the pre-built event handlers (whether these are Execute SQL tasks or Execute Package tasks) and build all of your packages on top of this template. The logging is then something you don't even have to think about (theoretically anyway).

-Jamie
June 13, 2005 4:05 PM
 

Gary said:

> have you considered whether you would be able to build the package just
> once and then "parameterise" it (using parent package variables)

No can do - while similar, the 5 flat file flows have different columns,
data types etc.


> 1) Encapsulate the Execute SQL task that does the logging in
another package...

Although I can pass values down to the Executed Package when
invoked by the event handler, I can't reset the variables
(InsertCount...) in the parent package from the called package. Do
you know of any practical way of doing that?
June 14, 2005 12:53 AM
 

jamie.thomson said:

Gary,
You can edit variables in a parent package using the script task.

Granted there may be a problem because you will undoubtedly have a variable called InsertCount in the child package as well, but this can be alleviated by calling it something different or giving it a different namespace (I use the latter approach).
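As a sketch (assuming the parent variable is called InsertCount and the child package doesn't declare a variable of the same name), the script task in the child package can lock and write the parent's variable like so:

```vb
Dim vars As Variables
'At runtime, variable names are resolved up through the package
'hierarchy, so this locks the InsertCount variable that lives
'in the parent package.
Dts.VariableDispenser.LockOneForWrite("InsertCount", vars)
vars("InsertCount").Value = 0
vars.Unlock()
```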

-Jamie
June 14, 2005 10:58 AM
 

Gary said:

Gotcha, thanks! Seeing how very slow the debugging was, I implemented
it differently, and only a few grey hairs later it seems to work
fine:-)


1) Set the SQL insert statement(s) as an expression string in a
variable with global scope, but with EvaluateAsExpression = FALSE.
This is the only place I need to set the SQL statement(s):

OnPostExecute_Sql = "INSERT INTO ..." + @[User::InsertRow] + ...
OnError_Sql = "INSERT INTO ..." + @[System::ErrorCode] + ...

2) Create a variable with global scope to hold a list of the
counting variables that needs resetting to 0 after each event:

Log_Variables_To_Reset = User::Delete_Count,User::Insert_Count...

3) In the parallel containers, create local row count variables
Insert_Count..., i.e. one set of variables for each container.
This keeps the containers separate when they're writing to
variables with otherwise the same name.

4) Create an event handler in one of the parallel containers, i.e.
with scope limited to one of the parallel containers, it has:

a) A local variable in the event handler with EvaluateAsExpression
= TRUE

User::OnPostExecute_Sql_Local

b) "Expand OnPostExecute_Sql" script with a single line (this is the
'trick' that does an extra EVALUATE() on the SQL expression string,
I couldn't find another way of doing it, but this works fine.)

Dts.Variables("User::OnPostExecute_Sql_Local").Expression = _
Dts.Variables("User::OnPostExecute_Sql").Value.ToString()

c) The Execute SQL task has a property expression with

SqlStatementSource = @[User::OnPostExecute_Sql_Local]

d) A script task to reset the appropriate variables:

Dim Vars As String = Dts.Variables("User::Log_Variables_To_Reset") _
.Value.ToString()
Dim VarArr() As String = Split(Vars, ",")
For i As Integer = 0 To VarArr.Length - 1
Dts.Variables(VarArr(i)).Value = 0
Next

5) Now, for any new parallel container, copy&paste the container and
the two event handlers, and create just the local variable
OnPostExecute_Sql_Local with EvaluateAsExpression=TRUE in the event handler,
and you're done - no need to edit any SQL or expressions in the event
handlers, and the containers can run and log in parallel.

6) To change the SQL insert statement across all parallel containers,
add additional variables to log etc, simply edit

OnPostExecute_Sql, OnError_Sql, Log_Variables_To_Reset

in one place in the global scope.


One gotcha: Enabling precompiled scripts in the event handlers
made execution time 5x faster, so that's fairly important to set!

In my case I added/changed some fields to log:

[Date Time] = GETDATE()
[Task ID] = @[System::SourceID]
[Package Duration] / [Container Duration] = (DT_STR, 10, 1252)
(DATEDIFF( "Ms", @[System::StartTime] , GETDATE() )/1000.0)
I.e. make them reals with fractional seconds included
[Read Count] = #imported rows
[Reject Count] = #rejected rows


Any comments or thoughts on all this?

Do you think a custom log provider that picks up row count variables
etc. would be a better choice?


Thanks,

Gary
June 14, 2005 3:17 PM
 

jamie.thomson said:

Gary,
I'm delighted to see someone taking this on, adapting it for their own needs, and basically improving it. The SSIS community will get stronger as more people build on each other's ideas like this. Pity you don't have a blog to share it on really :)

A custom log provider is, I feel, the way to go in the long term and one day I hope to find time to put one together. Mind you, I have come across clients in the past that wouldn't let us use DTS custom tasks because they violated their policy of not having any 3rd party code on their boxes - who knows, they could have the same issue with custom log providers.

Thanks very very much for the feedback.

-Jamie
June 14, 2005 3:29 PM
 

Frans van Bree said:

Hi, I am following this with interest. What I was wondering: isn't there (also) a way to modify the built-in logging functionality?

For example, say I wanted to log the OnVariableValueChanged event and don't need more than what the built-in logging functionality stores in the sysdtslog90 table. EXCEPT: I want it to log the value of the variable that changed in the "text" field, or the name and value concatenated, but not just the variable's name, as is currently the case.

Is there a (simple) way to accomplish this?
June 24, 2005 3:29 PM
 

Jamie Thomson - Life, the universe and SSIS! said:

As some of you may know I've been writing about SSIS on this blog for a lon-n-n-n-n-g time now. Too long...
October 10, 2005 9:07 PM
 

 

 

Jamie Thomson - Life, the universe and SSIS! said:

Some time ago when SSIS was still in the beta stage I started to build my own custom logging procedures...
January 16, 2006 9:26 PM
 

Jamie Thomson - Life, the universe and SSIS! said:

A little tip here for any of you building SSIS packages.
Some of you may know that I'm a big advocate...
January 23, 2006 9:23 AM
 

Michael said:

Thanks for describing the method for capturing events in SSIS.
The unsolved question that I have is related to getting control over event handling in SSIS in general. In my project I am trying to design a package that validates type conversions for flat files. For example: input flat file column is defined in conversion task as date. The package I'm designing should: for all input records that contain uncovertable data (i.e. wrong date format) capture (1) that column name, (2) conversion error description, (3) data for that field.
My problem is with the event control: if I specify fail component for all fields that need conversion validation, the OnError will be thrown for the first bad conversion only; if I specify Ignore, no error events are captured.
I also tried using error output stream (redirect), recording ErrorCode, but it seemed to contain only first error for each problematic data field.
Is there method for creating described functionality?
January 30, 2006 9:58 PM
 

 

Paul de Prado said:

Hi Michael,

I recommend using SSIS to load the flat file data into nvarchar columns first.  Then test for numbers and dates with the T-SQL functions IsDate(col001) and IsNumeric(col002).

e.g.

insert into my_table_errors
select * from my_table where IsDate(col001) = 0

...and...

insert into my_table_errors
select * from my_table where IsNumeric(col002) = 0

If anyone can think of a simpler method that is just as reliable, please respond.
April 5, 2006 4:55 AM
 

jamie.thomson said:

Michael,
You could use a script component to do explicit checks on the data. From there you could have multiple error outputs, one for each check that you are doing. That's effectively doing the same as what Paul suggested except in the pipeline instead.

Whether this is a good approach or not is a different question. The error output is there to tell you that it has discovered problems with the data and then allow you to capture that to some sort of data sink to be examined later.

Interesting problem though. I'd be keen to understand how you progress.

-Jamie
April 5, 2006 9:32 AM
 

Terry Ward said:

I am trying to create a SSIS package that will select specific user info within a table from one database server (A) and insert/update the selected user info in another table on a different database server (B).

Also, would like to use and if/then statement to filter certain user info.

What SSIS task should be used?

How do I start?

Are there any excellent web sites that may assist in this urgent business matter?

Because of internal urgency in finding a solution, I am willing to pay for the service.

Thank you in advance for your assistance.

Terry Ward (847) 597-2812
April 12, 2006 2:28 PM
 

jamie.thomson said:

Terry,
It sounds like you want to decide whether the data should be inserted or updated depending on whether its primary key already exists in the target table. Is that correct? If so, this article will be of interest to you: http://www.sqlis.com/default.aspx?311 because it explains exactly how you go about doing that.
To filter data inside a data-flow you should use the conditional split transformation.

I would recommend that you use the following sites as references:
http://www.sqlis.com
http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=80&SiteID=1
http://blogs.conchango.com/jamiethomson

-Jamie


April 12, 2006 2:49 PM
 

 

philipdesouza said:

Hi Jamie! What a great post!

Has your dev team found out how to make the ContainerDuration component work yet???

This process is truly awesome otherwise... =D

Many thanks,

Philip de Souza

September 14, 2006 1:37 AM
 

jamie.thomson said:

Hi Phil,

Thanks for the comments. I'm glad it's proving useful.

I'm not quite sure why you refer to the SSIS dev team as MY dev team. If only that were true - I could get them to do whatever I wanted :)

Anyway, container duration is a funny one. I went through a lot of pain trying to get functionality added into SSIS that would enable us to log ContainerDuration in this way but unfortunately it just wasn't possible. They nearly (tiny gap between thumb and forefinger) achieved it - but not quite.

In the meantime I devised an alternative method here:

Workaround to the System::ContainerStartTime anomoly - derive the execution duration of a container

(http://blogs.conchango.com/jamiethomson/archive/2006/01/16/2620.aspx)

-Jamie

September 14, 2006 12:19 PM
 

philipdesouza said:

Jamie, I have to hand it to you once more, as that is a lovely little workaround you've devised there!

Only problem I'm having with it though (and here's where my lack of comparative knowledge with SSIS will no doubt shine like a beacon) is getting the

" + (DT_STR, 6, 1252) DATEDIFF( "ss", @[User::HashTable], GETDATE() ) + ", --ContainerDuration

to function, as the variable is of type Object, and it refuses to allow the use of the DATEDIFF function as such. Plus the tweaking of your code to eliminate the message box and save the wiping of the HashTable value until after it's been written to the SSISLog has got me a bit stumped. I'm assuming I should break out the "myHash.Remove(sourceID)" and place it in the Reset Variables script task which follows the SQL execution (which in turn follows your PostExec ContainerDuration script)?

Many thanks again for your time in this....

Sincerely,

Philip

September 20, 2006 3:51 PM
 

jamie.thomson said:

Philip,

You need to extract the appropriate value out of the hash table and then use THAT value in the datediff function. Do not use the hashtable itself.

-Jamie

September 20, 2006 4:06 PM
 

Paul Ibison said:

Hi Jamie,

apart from the methods you mention, if someone just wants to get a custom entry into the log file to monitor a variable etc then there is a very simple method available from the script task:

Dim dataBytes(0) As Byte

Dts.Log("My custom message here", 0, dataBytes)

or from the script transformation:

Dim dataBytes(0) As Byte

Me.Log("My custom message here", 0, dataBytes)

Rgds,

Paul Ibison

January 11, 2007 4:28 PM
 

SSIS Junkie said:

I have just been watching Joy Mundy's webcast " Loading a Kimball method data warehouse using SQL Server

January 16, 2007 6:00 PM
 

SSIS Junkie said:

Back in June 2005 I published what I consider to be one of my more useful blog posts amongst my plethora

January 16, 2007 6:07 PM
 

 

 

Phil K said:

I like the structure of your logging solution.  I'm trying to find a way to get detailed logging information for an error that occurs within a script component.  I'm using a script component to build a custom data source for our Pervasive database. All the outputs from the script component work except one.  (I built multiple SQL queries into the script component and I'm executing data readers to fill and create the output rows.)  I'm getting an error indicating the source data is too large for the output buffer but I can't tell which column is the culprit.  I've re-verified all the source column sizes and types and destination columns within the script component but without success.  Is there a way to see the detailed error information for the failing row within the CreateNewOutputRows method of the script component?  I'm using a try catch block but I can't find the detail I need in the exception message.

thanks

February 14, 2007 10:36 PM
 

jamie.thomson said:

Hi Phil,

Not sure about this but if the error is occurring when you try and put stuff into the pipeline then the info that you are after may be available in the error output of the component.

-Jamie

February 22, 2007 1:22 AM
 

john said:

I'm trying to work through your example and cannot parse the INSERT statement for SqlStatementSource. I can't make it past:

"+ [DT_STR, 6, 1252] DATEDIFF( "ss", @[System::StartTime] , GETDATE() ) + ",

any ideas? The error says the expression might contain an invalid token....

March 21, 2007 7:10 PM
 

jamie.thomson said:

John,

It's hard to say because you haven't posted the full expression; what you have posted looks fine to me. Could you post the full thing?

-Jamie

March 21, 2007 7:26 PM
 

John Johur said:

I'm getting an error about the length of the error description. I've added a SUBSTRING but it seems to ignore it. Do you have any other ideas?

thanks

VALUES (?,?, 'OnError', --EventType

'SPO_X_IMP_NETWORK_PART', --PackageName

'Execute SQL Task Insert into Error log', --Taskname

-1073450752, --EventCode

'The variable System::ErrorDescription contains a string that exceeds the maximum allowed length of 4000 characters.

March 27, 2007 10:04 PM
 

john said:

this works for me...(the strings required a single tick then a double quote )

"INSERT INTO SSISLog(TaskStartTime,SourceFileName,EventType, PackageName, TaskName, EventCode, EventDescription, PackageDuration, ContainerDuration, InsertCount, UpdateCount, DeleteCount, Host)

VALUES(  '" + (DT_STR, 4, 1252) DATEPART("yyyy", @[System::EventHandlerStartTime]) + "-"

+(DT_STR, 4, 1252) DATEPART("mm", @[System::EventHandlerStartTime]) + "-"

+(DT_STR, 4, 1252) DATEPART("dd", @[System::EventHandlerStartTime]) + " "

+(DT_STR, 4, 1252) DATEPART("hh", @[System::EventHandlerStartTime]) + ":"

+(DT_STR, 4, 1252) DATEPART("mi", @[System::EventHandlerStartTime]) + ":"

+(DT_STR, 4, 1252) DATEPART("ss", @[System::EventHandlerStartTime]) +"',

'"+@[User::FTPSourceFileName]+"',

'OnError',

'"+@[System::PackageName]+"',

'"+@[System::TaskName]+"',

"+ (DT_STR,15,1252)@[System::ErrorCode]+",

'"+ @[System::ErrorDescription] +"',

"+ (DT_STR, 6, 1252)DATEDIFF( "ss", @[System::StartTime] , GETDATE() ) + ",  

"+ (DT_STR, 6, 1252)DATEDIFF( "ss", @[System::ContainerStartTime] , GETDATE() ) + ",

411,

2,

3,

'"+ @[System::MachineName] +"')"

April 19, 2007 8:55 PM
 

jamie.thomson said:

John,

Are you running this in an EventHandler? The reason I ask is that System::ContainerStartTime & System::EventHandlerStartTime always return the same thing from within an event handler.

-Jamie

April 19, 2007 9:12 PM
 

john said:

Yes, I am running this from the Event Handlers tab. It is assigned to the package's OnInformation event. This way I can have a historical log of the processes and their outcomes.

Is there a better way to attach a timestamp to the row in my log?

My current problem: the goal of my package is to upload a CSV from an FTP site. I am grabbing a file with the date in its name using a For Loop container and am able to grab the files for the last three days. I'm trying to work out how I can avoid running a file through the process if it has already been uploaded. Perhaps query the DB to see if it exists? I'm looking into using a SQL task but am not sure how to implement the logic, i.e. query the table to see if the file exists; if it does, skip to the next file; if it doesn't, continue with the process...

April 20, 2007 4:15 PM
 

jamie.thomson said:

John,

Assuming you have a list of files you have loaded then this is fairly easy. You will need to use a conditional precedence constraint at some point.

-Jamie

April 20, 2007 5:07 PM
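[For what it's worth, Jamie's suggestion might be sketched like this. The SSISLog table is the one from the article; the User::FileCount variable and the exact WHERE clause are hypothetical:

```sql
-- Execute SQL Task: ResultSet = Single row,
-- result column mapped to an Int32 variable, e.g. User::FileCount
SELECT COUNT(*) AS FileCount
FROM SSISLog
WHERE TaskName = ?   -- parameter-mapped to the variable holding the file name
```

The precedence constraint leading into the load would then use an Expression evaluation of @[User::FileCount] == 0, so a file is only processed if nothing has been logged for it yet.]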
 

john said:

I have an FTP task and am considering using the option "do not overwrite file". The FTP task transfers a file to the "Processing" folder and, after the data flow task is complete, the same file is then transferred to the "Processed" folder.

I'm new to this, so I'm open to different means of defining the logic check to see if the file already exists... Is there a preferred means or best practice?

Do you have a link or sample?

thanks

great blog by the way...it's helped me tremendously!!

April 20, 2007 8:42 PM
 

David Smulders said:

I used this approach to create a template for all packages, and while I am really chuffed with the capabilities I am getting an error when I copy the template to a new package.

The last step in the logging process is to reset the variables, which uses the script component. When I copy the package, the new package cannot find the scripts until I open the components inside the event handler, open the script in the editor and save it again. The step fails, resulting in the values being stuck at their last value until another step uses them.

I cannot put this template into production with this flaw in it, as ETL people will undoubtedly forget to follow the above-described procedure for a couple of the packages they create.

Any ideas how to solve this? Does SSIS store the script in a pre-parsed location with a unique ID? Should I put the script in a variable from which all components read at runtime?

May 3, 2007 4:27 PM
 

jamie.thomson said:

David,

What is the PreCompile property of the script task set to?

Note that it's a script TASK, not a script COMPONENT.

-Jamie

May 3, 2007 5:06 PM
 

David Smulders said:

Thumbs up for the quick reply!

I thought about setting the PreCompile option to false, but then the task will not run, since you must have this option set to True on 64-bit machines. And the server the packages need to run on... is a 64-bit machine.

May 4, 2007 8:56 AM
 

David Smulders said:

A small update:

I replaced the script task with a data flow that uses a database connection and a query (select 1 as dummy where 1=0) to produce 0 rows, which are then run through 3 Row Count components for the variables. An awful solution in my opinion... but it fixes the problem on our 64-bit machine for now.

When I have the time I'll dig into tackling the root problem: 64-bit not being able to compile the code without PreCompile in certain situations. (And determining in which situations the problem occurs at all, for that matter.)

Thanks again for your input...

Regards,

Dave

May 4, 2007 10:51 AM
 

Paul Ibison said:

Jamie - you'll need to escape single quotes in the event description or the TSQL may fail. Probably quite rare, but I have : "SELECT permission denied on object 'tblEmployee', database 'CRMCOMMON', owner 'dbo'." messages which bomb out. Using the following will solve the issue:

REPLACE( @[System::ErrorDescription], "'", "''")

Cheers, Paul

May 9, 2007 5:45 PM
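[To make the fix concrete: embedded in a dynamic statement, Paul's REPLACE would sit roughly like this. This is a sketch with an abbreviated column list, not the exact expression from the article:

```
"INSERT INTO SSISLog (EventType, EventDescription)
VALUES ('OnError', '" + REPLACE(@[System::ErrorDescription], "'", "''") + "')"
```

The REPLACE doubles each single quote, which is SQL Server's escape mechanism for apostrophes inside a string literal.]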
 

jamie.thomson said:

Paul,

Spot on mate :) I've encountered that many a time myself.

The other way around this is to not use expressions and instead use parameters to inject the values - let the OLE DB provider do the work for you. It goes against everything I've ever said about "use expressions to build SQL statements dynamically", but here is one case where it's appropriate to do it "the other way".

-Jamie

May 9, 2007 6:18 PM
 

Robert Harmon said:

Jamie,

Thanks for the great blogs...

In a previous post you mentioned building your own custom log provider. I am interested in what exactly this means. For example, do you no longer want to utilize the built-in event handling, and fire events on your own? Or, when you talk about logging, do you mean the log providers found inside SSIS that write to the sysdtslog table? Here is an interesting blog post from Brian Knight about parsing OnPipelineRowsSent: http://whiteknighttechnology.com/cs/blogs/brian_knight/archive/2006/04/14/146.aspx. This, to me, would be a good way to extend the out-of-the-box logging.

Thanks,

Robert

May 11, 2007 4:15 PM
 

jamie.thomson said:

Robert,

You know the list of log providers that you get when you set up logging in your package(s)? i.e. Text File, SQL Server, Event log etc...

A custom log provider will show up in there.

The benefit of them is that you can capture whatever you want rather than just a subset of what Microsoft has allowed you to capture. For example, you could capture the duration of all your tasks - that's not included in the out-of-the-box providers.

-Jamie

May 11, 2007 5:26 PM
 

Paul Ibison said:

Hi Jamie,

- just a heads-up. I'm using these event handlers for all my logging and noticed that sometimes the error event handler doesn't fire, even though there was an error. This has happened when pre-validation fails (the stored procedure of the data source was missing). The text-file logging picks the 'OnError' event up but not the custom event handler - very odd. Anyway, not sure if there's a more elegant solution, but in this case the custom event handler can be forced to fire if "delay validation" is set to true on the data flow task.

Cheers,

Paul

May 17, 2007 2:28 PM
 

Paul Ibison said:

Me Again! I have found another situation which is also troublesome. I suppose it's obvious really but if the indirect configuration can't connect, then the logging will fail, but therefore the failure to connect isn't itself logged. I'm now recommending my colleagues to use custom logging in tandem with logging to a text file. The custom logging is used for reporting, but if there is no info in the custom log, we start looking at the text file to troubleshoot.

Cheers,

Paul

May 24, 2007 10:06 AM
 

jamie.thomson said:

Hi Paul,

I completely agree. If asked, I always advocate doing some logging to text files alongside any custom logging that you may be doing.

-Jamie

May 24, 2007 12:00 PM
 

David Smulders said:

I think I stumbled over something odd in SSIS using custom logging in the way as described above:

When the packages are run, they log perfectly into a preset table, and everything turns up as expected in the logging table. However, when the same packages are run from a parent package, they suddenly produce event information not only for the steps in the package but also for the steps of the event handler itself, which causes a lot of overhead. (PostExecute fires on the actual insertion of the information, and on the reset of the ins/upd/del counters.)

Any idea why a package called from a parent package suddenly produces events on event handler steps?

May 30, 2007 4:31 PM
 

jamie.thomson said:

David,

Believe it or not, this is actually by design. Event handlers are, themselves, containers and thus produce events. The difference is that the events they throw up are not caught by event handlers in the same package.

That isn't to say there isn't still a problem. Event handler containers do not possess the DisableEventHandlers property and that is a bug in my opinion. The way I get around this is to prefix all tasks in my event handlers with some sort of code (I use two underscores) and then, in the parent package, only log events where @[System::SourceName] does not have that prefix. It's a bit kludgy but it works.

-Jamie

May 30, 2007 5:16 PM
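[Jamie's prefix filter could be expressed on the constraint leading into the parent package's logging task roughly as follows; the two-underscore prefix is his convention and the expression is a sketch:

```
-- true for events raised by 'real' tasks,
-- false for events raised by prefixed event-handler tasks
SUBSTRING(@[System::SourceName], 1, 2) != "__"
```

With the precedence constraint's evaluation set to Expression, events raised by tasks named "__Something" are skipped and never reach the log.]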
 

SSIS Junkie said:

I've been blogging on this site for just over 2 and a half years now and thought now would be a good

June 27, 2007 3:50 AM
 

Nick Jones said:

Hi Jamie,

Thanks for the great article. I notice that my event handlers fire when I execute the package (right click the package in Solution Explorer) but not if I only execute the task (right-click the task itself in the designer). My event handler is OnPostExecute at the Executables level rather than at the Package level so I would expect it to fire. Is this behaviour to be expected?

Cheers

Nick

July 12, 2007 12:13 PM
 

jamie.thomson said:

Nick,

Yes, that's by design.

-Jamie

July 12, 2007 2:43 PM
 

Franck Vergniol said:

Hi Jamie,

Great job that you've done! Many Thanx.

Is there a way at run time in a script to get the container parent id in order to log the container hierarchy ?

Regrds,

Franck

July 17, 2007 3:41 PM
 

jamie.thomson said:

Hi Franck,

No, there isn't. I would love this functionality though so I'm delighted to see someone else asking for it. I've asked for it here:

SSIS: Make container stack available

(https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=207395)

Please clickthrough, vote and most importantly, add a comment.

-Jamie

July 17, 2007 4:20 PM
 

Franck Vergniol said:

Did it Jamie.

Around the same question: the container ID is not unique at project scope, due to code reuse such as templates and copy & paste.

dtutil.exe is great at the package level because it renumbers the package, but unfortunately it does not do the trick for containers. And I've seen no way of changing it manually.

What would be the trick? Find and replace with a text editor on the XML file?

Regards,

Franck

July 18, 2007 1:14 PM
 

jamie.thomson said:

Franck,

Yeah, that might be your only option. Don't forget that Visual Studio can generate GUIDs for you.

Your point about templates is a good one. I have previously submitted a request to have the behaviour changed: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=277249

which I talked about here: http://blogs.conchango.com/jamiethomson/archive/2007/05/15/SSIS_3A00_-New-suggestion-for-templates-in-Katmai.aspx

-Jamie

July 18, 2007 4:44 PM
 

Jason said:

Hi Jamie,

Thanks for the great info on event handling! I got 90% done, but I am still having trouble capturing the ContainerStartTime on tasks that run in parallel.

Do you have an example I can download? In my test environment I copied your sequence container multiple times so they would run in parallel, and this is where my issue with the ContainerStartTime begins. By setting an OnPostExecute_SQL variable to Evaluate Expression = False, and then using an OnPostExecute_SQL_Local variable with Evaluate Expression = True, I am able to accurately capture the Insert, Update and Delete counts. However, the ContainerStartTime taken from the hash table is not working.

If I add the OnPreExecute and OnPostExecute script tasks to the individual tasks that run in parallel, my ContainerStartTime is something like 12/1/1989 00:11:00. The actual value is too big for the DATEDIFF function calculating the duration.

Thanks for your help!

Jason

July 25, 2007 1:09 AM
 

Langston said:

Hi Jamie,

First off, thanks for sharing your wisdom, your advice is invaluable.

I posted a comment and rating at MS for making the container stack available, too.

I have a parent/child package design. While implementing custom logging, when an event occurs the System::SourceName variable stores the container that produces the event from the child package, which is great. But the System::PackageName variable stores the name of the parent package as opposed to the child package whose container fired the event.

August 29, 2007 12:23 AM
 

PedroCGD said:

Jamie,

Your post is very good and I will try to customize to my projects.

Could you tell me where is the download of the example?

Kind regards,

Pedro

August 31, 2007 10:31 AM
 

Kanis said:

My problem is to catch detailed information when the event fires up. Like, I'd want the primary key of the offending row and be able to log it in such a way that we can automatically parse the log file and re-massage the data and try again.

September 27, 2007 2:41 AM
 

Pseudo-Random said:

SSIS provides several log providers, including the Log Provider for SQL Server (my personal favorite). ...

October 10, 2007 5:13 AM
 

SSIS Junkie said:

Ever heard of custom log providers? They are a mechanism within SSIS by which we can write our own log

October 22, 2007 3:52 AM
 

SimonS Blog on SQL Server Stuff said:

Did you know that your package has an identifier in it? Well it does and it can be used to provide automated

November 6, 2007 11:02 AM
 


Satyananda said:

Excellent.

November 13, 2007 12:37 PM
 

John said:

I have been reading through all this and other various sites but no nearer to what I need.

Another handicap I am operating under is that I'm new to SSIS and haven't done any serious coding since COBOL!

I have an Execute SQL Task that contains various SQL commands that insert, update or delete information from tables.

I would like to output - similar to the progress information shown when executing in debug mode - the error from the Execute SQL Task; if, say, the insert into the table fails, then email the information or write it to a text-file log of the job.

How do I do this simply?

thanks

November 21, 2007 10:44 PM
 

kral said:

Thank you very much

November 28, 2007 10:56 AM
 


Pseudo-Random said:

SSIS provides several log providers, including the Log Provider for SQL Server (my personal favorite

January 21, 2008 2:51 AM
 

Gayatri Kulkarni said:

Thanks for the great information on logging.

Is there any way to re-use these event handlers? That is, I want to have OnPostExecute event handlers for a few of the tasks (not all) in my package. Is there any way I can create just one event handler task for logging and use the same one for all the tasks in the package? (I don't want to copy and paste the event handler task into the event handler of each and every task of the package.)

Basically, is there any way to create our own reusable components in SSIS? Not only for logging but in general: how do we create reusable components, like some business logic implemented in a data flow task that is required repeatedly?

Regards,

Gayatri.

February 5, 2008 6:32 AM
 

jamie.thomson said:

Gayatri,

The only way to do it is to put the reusable stuff into its own package and call it using the Execute Package Task.

DO NOT do this for eventhandlers though. The overhead of spinning up the package each time is far too much to bear. Read this: http://blogs.conchango.com/jamiethomson/archive/2005/03/01/SSIS_3A00_-A-warning-about-using-child-packages_2100_.aspx

-Jamie

February 5, 2008 12:59 PM
 

Gayatri Kulkarni said:

Thanks Jamie.

As you said, we can expect something on reusability in the next version of SSIS.

Also, I would like to read more on the memory usage of SSIS - for example:

What's the difference in memory utilization when you write business logic directly in the package versus writing it in a child package?

I have been reading a lot about the performance issues of SSIS transformations these days. Would be great if you could share some information regarding performance etc.

Thanks in advance,

Gayatri.

February 8, 2008 6:33 AM
 

jamie.thomson said:

Gayatri,

Perf in SSIS is a huge area. There's no doubt that the dataflow can be a screamingly fast data movement vehicle but you need to know how to use it properly.

Regarding memory utilization, maybe this will help: http://blogs.conchango.com/jamiethomson/archive/2005/05/29/SSIS_3A00_-Memory-Usage.aspx

-Jamie

February 8, 2008 8:33 AM
 

Gayatri Kulkarni said:

I went through the link along with the related blogs.

How can we make the Pivot transformation dynamic? If we consider the example specified by Ashwini, we have to have a list of all possible products and create that many output columns manually while configuring the task. Is there any way to make it dynamic?

Also, along different lines, I have a scenario where I am reading data from CSV files and want to validate it before putting it into SQL tables. Using SSIS, how can I validate the data types of the data? I know I can implement it using a script component; however, I couldn't find any way to implement it using other transformations in SSIS.

Regards,

Gayatri.

February 14, 2008 10:40 AM
 

jamie.thomson said:

Gayatri,

No, the pivot cannot be dynamic. If the list of products is finite then you will have to have a column for each product.

Regarding data type validation: the question is too general, you'd have to give a specific example. Please could you offer that example elsewhere (like the SSIS forum on MSDN), as your questions don't really relate to the subject of the blog entry to which you are replying.

Sorry I can't be of more help.

Thanks

Jamie

February 15, 2008 1:34 AM
 

Bertyrandr said:

Hi Jamie,

I've got an issue concerning ErrorCode/ErrorDescription ...

For example, when my package ends with success, the ErrorCode returned at the OnInformation event handler level is:

0x4001100B

1073811467

DTS_I_COMMITTINGTRANSACTION

Committing distributed transaction started by this container.

Instead of :

0x40013001

1073819649

DTS_MSG_PACKAGESUCCESS

Package "__" finished successfully.

And, it's always the same scenario, I don't get the real final message.

cf http://forums.microsoft.com/TechNet/ShowPost.aspx?PostID=2868942&SiteID=17&mode=1

Do you know why and how to get it ?

Thanks a lot. Your blog is such an SSIS treasure.

Cheers,

Bertrandr

February 19, 2008 2:42 PM
 

jamie.thomson said:

Bertrand,

Sorry, that one's a mystery to me. If you find anything out please reply and let me know.

-Jamie

February 19, 2008 2:57 PM
 

Bertrandr said:

Snif ;-)

I'll temporarily hard-code the error code passed to my sp for package success or failure.

But I'll still keep in mind to do it dynamically, and will let you know if I find out how.

Bertrand

February 19, 2008 3:18 PM
 

Bertrandr said:

Jamie,

I finally found a way to do what I need.

Thanks to J. Welch and his blog (http://agilebi.com/cs/blogs/jwelch/archive/2008/01/15/handling-multiple-errors-in-ssis-revisited.aspx).

I found that on OnError, the really useful ErrorCode and ErrorDescription are the first ones raised.

For example:

[Data Conversion [70]] Error: Data conversion failed while converting column ...

is more useful than

[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED

So I used JWelch's piece of code to store all the ErrorCodes and ErrorDescriptions in Object-type variables, and then assign the first value of each to variables that I pass as parameters to my stored procedure for logging.

Here's the code :

Variables:

errorMessages, errorID: both of type Object

L_STATUS_VALUE: Int32

L_LOAD_MESSAGE: String

Script Task:

ReadOnlyVariables: System::ErrorCode, System::ErrorDescription

ReadWriteVariables: User::errorMessages, User::errorID, User::L_STATUS_VALUE, User::L_LOAD_MESSAGE

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    Public Sub Main()

        Dim status As Collections.ArrayList
        Dim messages As Collections.ArrayList

        Try
            status = CType(Dts.Variables("User::errorID").Value, Collections.ArrayList)
            messages = CType(Dts.Variables("User::errorMessages").Value, Collections.ArrayList)
        Catch ex As Exception
            status = New Collections.ArrayList()
            messages = New Collections.ArrayList()
        End Try

        status.Add(Dts.Variables("System::ErrorCode").Value.ToString())
        Dts.Variables("User::errorID").Value = status

        messages.Add(Dts.Variables("System::ErrorDescription").Value.ToString())
        Dts.Variables("User::errorMessages").Value = messages

        Dts.Variables("User::L_STATUS_VALUE").Value = CInt(status(0))
        Dts.Variables("User::L_LOAD_MESSAGE").Value = CStr(messages(0).ToString())

        Dts.TaskResult = Dts.Results.Success

    End Sub

End Class

Hope it helps.

Cheers,

Bertrand

February 20, 2008 3:11 PM
 

vijay said:

hi, your original blog was very informative, but when I tried it I got a syntax error when executing the Execute SQL Task. I created a small table with columns for error description, error code and start date. When I insert the error description using the system variables I get the syntax error at the start of the VALUES ( clause of the INSERT statement.

Please tell me if this can be overcome.

Thanks

Vijay

March 6, 2008 11:34 AM
 

Subah said:

Hello,

Thanks for the informative blog. I was able to successfully implement logging in my project with the exception of one glitch on the 'OnError' event. One of my tasks executes a statement like - UPDATE TABLE1 SET COL1 = '123' WHERE DESC = 'ABC'

When this task fails, the ErrorDescription looks like - Executing the query "UPDATE TABLE1 SET COL1 = 123 WHERE DESC = 'ABC' " .

I am having trouble inserting the above message into a table since, adding single quotes '' around the message gives out an error since the values for DESC is surrounded by quotes as well. Unfortunately, I can only add the escape characters surrounding the ErrorDescription but not in between the message itself. Any thoughts would be appreciated.

Thanks,

Subah.

March 13, 2008 3:14 PM
 

jamie.thomson said:

Hi Subah,

Yeah, I've come across this problem in the past but can't quite remember how I solved it. I think I just used the REPLACE function to either strip out all the apostrophes or replace them with two apostrophes which is SQL Server's escape mechanism for apostrophes. Can't remember which I did to be honest - but it worked.

Another option is to not build the SQL statement dynamically and instead use parameters (standard Execute SQL Task functionality) to insert your ErrorDescription into your logging table.

Hope that helps.

-Jamie

March 13, 2008 3:25 PM
 

Subah said:

Hi Jamie,

Thanks for the quick response. I was looking at the double-apostrophes option; however, that was not of much help, since I am unable to manipulate the text coming out of the description.

On the second option just to confirm my understanding - your suggestion is to build the ErrorDescription instead of using the system variable, correct?

Thanks again,

Subah.

March 13, 2008 3:30 PM
 

Subah said:

Just to add: the problem I am facing is that I am not able to represent the text coming out of the ErrorDescription as a 'proper' string, which prevents me from using any string operations.

Thanks.

March 13, 2008 3:34 PM
 

jamie.thomson said:

Subah,

"I was looking at the double apostrophes option, however that was not of much help since I am unable to manipulate the text coming out of the description. "

Not sure I understand. You're inserting this into some sort of error log table right? So you are trying to dynamically build a string like this:

INSERT LogTable (ErrorMessage) VALUES ('<your_error_message>')

right?

So your expression would look like this:

"INSERT LogTable (ErrorMessage) VALUES ('" + REPLACE(@[System::ErrorDescription], "'", "") + "')"

================================================

"your suggestion is to build the ErrorDescription instead of using the system variable, correct?"

No, the Execute SQL Task allows you to use placeholders for parameters like so:

INSERT LogTable (ErrorMessage) VALUES ( ? )

and then you would assign @[System::ErrorDescription] to the question mark placeholder. That's standard functionality in the Execute SQL Task.

-Jamie

March 13, 2008 4:09 PM
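[Spelled out, the parameterised version needs no expression at all. The Execute SQL Task's SQLStatement stays static; column names here follow the log table from the article:

```sql
INSERT INTO SSISLog (EventType, EventDescription)
VALUES ('OnError', ?)
```

On the Parameter Mapping page, map System::ErrorDescription (Direction = Input, with parameter name 0 for an OLE DB connection) to the placeholder. The provider does the quoting, so embedded apostrophes can no longer break the statement.]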
 

Subah said:

Hi Jamie,

In my case, the REPLACE function does not seem to work, because REPLACE takes in a 'String' value; however, the text generated from @[System::ErrorDescription] is not being treated as text.

I tried the Execute SQL task option and am getting an error - "An error occurred while extracting the result into a variable of type (DBTYPE_I4)"

I am currently trying to capture the message into a variable using a script task and am battling the "A deadlock was detected while trying to lock variables" error message when trying to write the value to a variable.

Appreciate your response. Will keep posted if I manage to find a solution.

Thanks,

Subah.

March 13, 2008 5:22 PM
 

Subah said:

Hi Jamie,

Finally the approach worked. For some reason it is unable to do a REPLACE on the variable at runtime just before inserting the data. The approach consistently failed when I used the Execute SQL Task and supplied the @System::ErrorDescription variable, or when I used the Expression Builder.

I used a script task to capture @System::ErrorDescription into a variable defined within the script, say ED, and wrote Replace(ED, " ' ", " ") into a user variable, User::ErrorDesc. In my INSERT statement I used User::ErrorDesc instead of directly using System::ErrorDescription. This worked!

Thanks for your help.

Subah.

March 13, 2008 5:45 PM
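[For reference, the script-task workaround Subah describes amounts to something like the following (VB.NET, SSIS 2005 script task). User::ErrorDesc is Subah's variable; this sketch shows the doubled-apostrophe replacement rather than a blank:

```
' ReadOnlyVariables:  System::ErrorDescription
' ReadWriteVariables: User::ErrorDesc
Public Sub Main()
    Dim ed As String = CStr(Dts.Variables("System::ErrorDescription").Value)
    ' double each apostrophe so the value is safe inside a T-SQL string literal
    Dts.Variables("User::ErrorDesc").Value = ed.Replace("'", "''")
    Dts.TaskResult = Dts.Results.Success
End Sub
```

The downstream INSERT expression then references @[User::ErrorDesc] instead of the raw system variable.]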
 

DB_guy said:

Hi Jamie,

Is there a way to redirect failed rows from within the Script task and continue processing the rest of the records without stopping?

Thanks

May 8, 2008 7:39 PM
 

jamie.thomson said:

May 9, 2008 9:53 AM
 

Charley said:

Hi Jamie,

Thanks a lot for the great solution, it does solve my problem. But I have an issue with the size of the dynamic SQL in the expression of the SQLStatementSource property of an Execute SQL Task. In my case it's about 587 characters long. When I try to Evaluate Expression, it causes a truncation error. I worked around it by simplifying the SQL statement to reduce it to 367 characters. Is there a size limitation for the SSIS expression editor or the SQLStatementSource property?

Thanks,

Charley

May 21, 2008 4:10 PM
 

jamie.thomson said:

Charley,

Yes, there is a limitation. The result of the expression has a maximum length of 4000 characters.

I don't know of any limitation on the length of an expression (note the subtle difference here).

-Jamie

May 22, 2008 9:57 AM
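[That 4000-character cap on expression results is also the likely culprit in the truncation question near the top of this thread. One way to stay under it is to cap the longest contributor inside the expression that builds the statement, e.g.:

```
SUBSTRING(@[System::ErrorDescription], 1, 1000)
```

The SUBSTRING has to be applied to the variable within the SSIS expression itself; a SUBSTRING written into the generated T-SQL arrives too late, because the limit applies to the expression's result before it ever reaches SQL Server.]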
 

chung said:

This example is great and is very helpful.

May 27, 2008 9:22 PM
 

SSIS Junkie said:

In my blog entry Custom Logging Using Event Handlers from June 2005 I described an easy technique for

June 8, 2008 12:42 PM
 

Asaf said:

I have read your post with great interest and was wondering if you have a solution for the following to build into your logging mechanism: "How could we measure the amount of data, in bytes, read from the source and transferred to the destination for each package execution?"

June 11, 2008 10:05 AM
 

jamie.thomson said:

Asaf,

There's no way to get that info I'm afraid. You can catch the number of rows using a rowcount component and multiply that by the width of a row to work out the maximum number of bytes that would have been sent through and that probably gives you a good indication. Note that this is the MAXIMUM, it doesn't allow for (e.g.) DT_STR/DT_WSTR columns that are not filled up.

-Jamie

June 11, 2008 10:10 AM
 

ml2008 said:

Hi, Could someone please help me with the following issue:

I have a Windows service which calls a file-based SSIS package. In this package I have 3 tasks under the Control Flow tab which run in the sequence they are connected in. The first task just deletes some records; the second task reads data from a flat file and a rowcount updates a global variable, let's say InputRowCount; the third task is just a script task that does something else. Once the package has run fully, I try to access this global variable in the Windows service through the package object that was used to call the package. I don't get the updated value from this variable. But if I remove the first task from the Control Flow tab, so that the task that reads the file and updates the rowcount becomes first, I can get the proper value from this variable. Also, I want to mention that I pass a PackageEvents object to the Execute method to get the events for logging. The package works fine when run through DtsExec and in BIDS. Could someone please help?

June 13, 2008 8:38 PM
 

Albert said:

Hi,

I implemented custom logging using event handlers in my package and everything works fine, but when I use a checkpoint file to attain restartability of the package, the event handlers fail to work. It kind of makes sense, because event handlers are containers within SSIS, and the checkpoint file treats them like any other container and marks them as done during the first run.

I was wondering if there is a work around for this issue

July 23, 2008 4:19 PM
 

jamie.thomson said:

Albert,

I don't know of a workaround. Annoying isn't it? I think its a bug personally.

-Jamie

July 23, 2008 4:27 PM
 

Albert said:

Thanks for the response Jamie, and yes, it's annoying. Is there a way we can edit the checkpoint file and manually delete the event handler part?

July 23, 2008 4:36 PM
 

jamie.thomson said:

Albert,

No, because I don't think that is the problem. It is simply that event handlers don't fire if the package is started from a checkpoint, regardless of whether that event handler fired on the previous execution or not.

It wouldn't make sense to record event handler completion in a checkpoint file anyway, because some of them (particularly those scoped to the package) will fire many times during package execution.

Hence... big fat hairy bug in my opinion!!

-Jamie

July 23, 2008 4:42 PM
 

Philis Jopapa said:

Sweet... this was a very good basic explanation. I needed a logging function and, thanks to the magic of Google and SSIS Junkie, it's all working perfectly and quickly. They do want to log the transaction/record numbers and I don't have that piece just yet (I need a good way to count inserts from a flat file?) but that shouldn't be too hard!

August 1, 2008 5:36 PM
 

Zoz said:

I implemented something similar, but I'd like also to log information when I cancel the package execution. The thing is there is no event handler "OnPackageCancel", so I cannot write in the table when I cancel the package execution. Does anybody have an idea?

Thanks!!

August 5, 2008 4:02 PM
 

Arvind said:

Jamie,

    This blog was really really a nice one for the starters like me. I have a doubt here. Is there a way to capture the component which fails in the data flow task and log it. For example, in my control flow I have a data flow task and under this data flow, I have multiple data flow components (couple of look up's , a conditional split etc). Lets say one of the lookup fails and throws an error. Is there a possibility to capture which component (I want to capture component name) under the data flow task fails and log it? If so please let me know how.

Expecting your valuable answer.

August 5, 2008 7:51 PM
 

jamie.thomson said:

Arvind,

This information is usually captured in the standard log providers. Are you not seeing it there?

-Jamie

August 6, 2008 11:11 AM
 

jamie.thomson said:

Zoz,

I'm clutching at straws here but would the OnExecStatusChanged event handler (http://msdn.microsoft.com/en-us/library/ms140223.aspx) be of any use?

-jamie

August 6, 2008 1:02 PM
 

Zoz said:

Hello Jamie,

Thanks for your answer. I tried what you suggested, but I got strange behavior: the event always executes... I set the scope of the event to the whole package. Maybe that's wrong...

Did you ever see a similar case?

Thanks a lot for your help!

August 6, 2008 9:33 PM
 

Arvind said:

Jamie,

   Thanks for your information. I do see the data flow component name in the log file, but I'm maintaining a separate log table in my DB where I would like to fetch the name of the failed data flow component and insert it into the table while handling the errors in the event handler. Is there a way to get this? I've been turning my head around for the past 2 days but couldn't get a clue!

Eagerly awaiting your response.

Thanks,

Arvind

August 7, 2008 5:55 AM
 

jamie.thomson said:

Zoz,

Sorry no, that doesn't ring any bells.

-Jamie

August 8, 2008 9:17 AM
 

jamie.thomson said:

Arvind,

Sorry, it's not available as a separate field. You'd have to parse it out of the error message.

-Jamie

August 8, 2008 9:17 AM
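
[Editor's note] One way to do the parsing Jamie describes, after the fact, is in T-SQL against the SSISLog table from the original post. This is only a sketch, not part of the original post: it assumes the OnError description logged into EventDescription begins with text of the form component "<name>" (<id>) ..., which is a common but not guaranteed shape for data flow error messages.

```sql
-- Hypothetical sketch: pull the failing component name out of the logged
-- OnError description, assuming it starts with: component "<name>" (<id>) ...
SELECT
    EventDescription,
    CASE
        WHEN EventDescription LIKE 'component "%"%'
        THEN SUBSTRING(EventDescription,
                       12,                                           -- skip the 11 chars of 'component "'
                       CHARINDEX('"', EventDescription, 12) - 12)    -- up to the closing quote
    END AS FailedComponentName
FROM SSISLog
WHERE EventType = 'OnError';
```

If the messages in your environment have a different prefix, the literal and offsets would need adjusting accordingly.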
 

Phil said:

Hi,

Can someone please help me with this? I am new to SSIS and I'm trying to create a custom destination component in SSIS (2005) which uses multithreading to process millions of rows in parallel. I am not able to get the connection to the database from my component and subsequently insert rows into a table. Any help would be appreciated.

October 29, 2008 10:32 AM
 

Itay said:

Thanks. I learned a new thing today

November 13, 2008 1:59 PM
 

sabarinathan said:

Hi everybody,

can anybody help me sort out this error? Please go through the error message below.

TITLE: Expression Builder
------------------------------
Expression cannot be evaluated.

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%u00ae+Visual+Studio%u00ae+2005&ProdVer=8.0.50727.42&EvtSrc=Microsoft.DataTransformationServices.Controls.TaskUIFramework.TaskUIFrameworkSR&EvtID=FailToEvaluateExpression&LinkId=20476
------------------------------
ADDITIONAL INFORMATION:

Parsing the expression ""INSERT INTO SSISLog(EventType, PackageName, TaskName, EventCode, EventDescription, PackageDuration, ContainerDuration, InsertCount, UpdateCount, DeleteCount, Host)
VALUES (
'OnError',
"' + @[System::PackageName] + '",
"' + @[System::SourceName] + '",
0,
",
" + [DT_STR, 6,1252] DATEDIFF("ss", @[System::StartTime], GETDATE()) + ",
" + [DT_STR, 6,1252] DATEDIFF("ss", @[System::ContainerStartTime], GETDATE()) + ",
" + [DT_STR, 4,1252] @[User::Insertcount] + ",
" + [DT_STR, 4,1252] @[User::UpdateCount] + ",
" + [DT_STR, 4,1252] @[User::DeleteCount] + ",
" ' + @[System::MachineName] + ' "
)
"" failed. The single quotation mark at line number "4", character number "3", was not expected.

(Microsoft.DataTransformationServices.Controls)

Thanks In Advance!

Regards,

Sabari

November 14, 2008 6:22 AM
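
[Editor's note] The parser error above is caused by the quoting: in the SSIS expression language the string literal is delimited by double quotes, so the pattern must be '" + @[Variable] + "' (single quote inside the literal, then close the literal, then concatenate), not "' + @[Variable] + '" as pasted; casts are also written with parentheses, (DT_STR, 6, 1252), not square brackets. A hedged sketch of a corrected expression, reusing the table, variable names, and placeholder values (0 for EventCode, an empty string for EventDescription) from the comment above:

```
"INSERT INTO SSISLog (EventType, PackageName, TaskName, EventCode, EventDescription,
    PackageDuration, ContainerDuration, InsertCount, UpdateCount, DeleteCount, Host)
 VALUES ('OnError', '" + @[System::PackageName] + "', '" + @[System::SourceName] + "', 0, '', "
 + (DT_STR, 6, 1252)DATEDIFF("ss", @[System::StartTime], GETDATE()) + ", "
 + (DT_STR, 6, 1252)DATEDIFF("ss", @[System::ContainerStartTime], GETDATE()) + ", "
 + (DT_STR, 4, 1252)@[User::Insertcount] + ", "
 + (DT_STR, 4, 1252)@[User::UpdateCount] + ", "
 + (DT_STR, 4, 1252)@[User::DeleteCount] + ", '"
 + @[System::MachineName] + "')"
```

Note that values interpolated into the SQL this way are not escaped, so a package or machine name containing a single quote would still break the statement at execution time.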
 


AaronLowe.net said:

SSIS Blog Post Thank you

January 27, 2009 2:32 AM
 
