I am currently using the SSIS logging feature in my SSIS package. I have defined a destination log file, and each time the package is executed, that day's log entries are appended to the file.
I'm trying to figure out how best to keep the log file name static (it gets emailed out, and my email client looks for a particular log file name) while keeping only today's log information in it and appending everything older to a history log file, or something like that.
Has anyone tried something similar, or does anyone have ideas on how best to accomplish this?
I recently read the Project REAL ETL design best practices whitepaper. I, too, want to do custom logging as I do today, and also use SSIS logging. The paper recommended using the variable system::PackageExecutionId to tie the two logging methods together.
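For what it's worth, the kind of query this would enable, as a sketch only: it assumes a hypothetical custom log table named etl_CustomLog whose ExecutionID column stores that same execution GUID, and the built-in SQL Server log provider writing to sysdtslog90 (whose executionid column holds the GUID):

-- Line up custom log entries with the built-in SSIS log rows
-- for the same package execution.
SELECT c.LogDate,
       c.CustomMessage,
       s.event,
       s.source,
       s.message
FROM dbo.etl_CustomLog AS c          -- hypothetical custom logging table
JOIN dbo.sysdtslog90 AS s
  ON s.executionid = c.ExecutionID   -- the shared execution GUID
ORDER BY s.starttime;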
We have never built a data warehouse. We are not even sure what features are required to do this. Nonetheless, our developers are exploring two ETL tools: SSIS and Sunopsis.
They tend to like Sunopsis because it has been around a few years, has the equivalent of source-code modules (libraries) in the tool, has a highly granular security system, and has 500 companies using it, and therefore must have survived several ETL projects. (We don't know whether, or in how many, real-world projects SSIS has been used.) Sunopsis is Java based and uses its own authentication scheme (no support for Windows Active Directory yet. Bummer).
To the best of your knowledge, is SSIS, out of the box, missing any of the standard functionalities needed to populate the datastore that will be used as the basis of our warehouse? (I don't even know precisely what all these functionalities are, but my boss is worried we will find out somewhere down the road. While he recognizes that we can write our own code in .NET, he prefers things "out of the box". They may not be fast, but they tend to work right.)
I just implemented logging on my package, and the type of logging is to a text file. Any ideas on where I should store my log folder/file, so that when I build and deploy, the folder moves along with it? Do I have to do that manually? How would that work? What if I save the log file outside of my project folder? Where is the log folder/file path specified? Thanks.
Logging in SSIS, compared to DTS, is more complex to set up because there are so many events. Which events should I choose if I need something similar to DTS package logging, i.e. I simply want to see which tasks executed at what time, whether they failed or succeeded, and, if they failed, what the error was?
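In other words, assuming I pointed the SQL Server log provider at a database (it writes to sysdtslog90) and enabled just OnPreExecute, OnPostExecute, OnError and OnTaskFailed, I imagine a DTS-style view of a run would look roughly like this sketch:

-- One row per task event: when it started/finished, plus the error
-- text when something failed.
SELECT source,      -- task (or package) name
       event,       -- OnPreExecute / OnPostExecute / OnError / OnTaskFailed
       starttime,
       message      -- error description on OnError rows
FROM dbo.sysdtslog90
WHERE event IN ('OnPreExecute', 'OnPostExecute', 'OnError', 'OnTaskFailed')
ORDER BY starttime;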
I have migrated a DTS package to SSIS. One of the ActiveX Scripts contains the following code, and I have a question about changing it.
What is the equivalent way of reading the LogFileName in SSIS? I set up logging in SSIS, but didn't find any option that lets me refer to the log file the way it is done in the following DTS code.
It would be a great help if somebody could assist me with this.
Does anyone know if there is a way to make the package log only errors? I only have errors checked, but it still logs the package start and end events. Is there a way to turn this off?
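In the meantime I'm filtering on the reading side. If the log goes to the SQL Server provider (sysdtslog90), a view along these lines, just a sketch, hides those rows, but I'd rather they were never written at all:

-- Reading-side workaround: hide the start/end rows that SSIS
-- appears to write regardless of the event filter.
CREATE VIEW dbo.vw_PackageErrorLog
AS
SELECT source, sourceid, executionid, starttime, message
FROM dbo.sysdtslog90
WHERE event NOT IN ('PackageStart', 'PackageEnd');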
What would need to be done in order to capture the time that an SSIS package ran and completed? I have had a thread about this before, here: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1310953&SiteID=1
But I have been busy and have been on and off with this issue. I really need to find some simple way to show, in a report using SSRS, the "as-of" date when the data was pulled in. Is using an Execute SQL Task going to be that method?
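What I have in mind, and please correct me if there's a better way, is an Execute SQL Task at the start of the package writing to a small audit table that the SSRS report can read. A sketch, with a hypothetical table; on an OLE DB connection the ? markers would be mapped on the task's Parameter Mapping page to System::PackageName and System::StartTime:

-- Hypothetical audit table the report reads its "as-of" date from.
CREATE TABLE dbo.etl_LoadAudit
(
    PackageName  nvarchar(200) NOT NULL,
    LoadStarted  datetime      NOT NULL,
    LoadFinished datetime      NULL
);

-- SQLStatement for the Execute SQL Task at the start of the package:
INSERT INTO dbo.etl_LoadAudit (PackageName, LoadStarted)
VALUES (?, ?);

The SSRS dataset would then just take MAX(LoadStarted) for the package in question as the as-of date.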
Second time around with this one. I'm running my SQL Server Agent with a domain admin account and everything works well. I then switch to a domain user and the package fails. The Windows application log contains this:
Package "Package1" failed.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
So I go to the SQL Server Agent jobs and select the job I'm trying to run. I then go to Job Step Properties and, under the logging tab, choose "SSIS log provider for Text files". What is supposed to go in the Configuration String?
It will let me type a text file path in there or choose from one of my "3" data sources in the package. But since I'm developing this on another machine, the choice to log to the file system makes no sense.
Please do not refer me to the KB article (http://support.microsoft.com/?kbid=918760); I've read and re-read it. All I want to do is find out why the package is failing.
Hi, which SSIS logging option should I check to log the success of each component? In the log file, I'd like to see which component SSIS is currently executing and which component(s) have passed execution. Thanks.
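For what it's worth, if the log went to the SQL Server provider (sysdtslog90) instead of a file, with OnPreExecute and OnPostExecute enabled, I imagine the "currently executing" part could be read off like this, just a sketch:

-- Components that have started but not yet finished: an OnPreExecute
-- row with no matching OnPostExecute row yet.
SELECT pre.source, pre.starttime
FROM dbo.sysdtslog90 AS pre
WHERE pre.event = 'OnPreExecute'
  AND NOT EXISTS (SELECT 1
                  FROM dbo.sysdtslog90 AS post
                  WHERE post.executionid = pre.executionid
                    AND post.sourceid    = pre.sourceid
                    AND post.event       = 'OnPostExecute');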
I have written a custom task which writes the details of an error to the Windows event log.
I use the OnError event to invoke this task.
To simulate the error I try to convert a string into a number. I see that multiple messages, as many as six, with different error codes are logged to the Windows event log.
They vary from something tangible, like "number conversion failed", to obscure things like "thread was being aborted".
Is it normal to see multiple messages for one type of error?
Can I control this, so that only one message is sent to the Windows event log?
(The reason is that we have monitoring software which may raise multiple admin alerts for the same issue.)
I'm trying to use the information logged by SSIS packages to the database (the sysdtslog90 table) to track errors. Unfortunately, when the errors are written to the log they are associated with a source and sourceid, not with the package and package ID. Sometimes the source is the package, but often it is a process within the package. All I need is some way to associate each error with a package: a package name/ID on each row in the log. This seems like it ought to be a pretty basic need, but I can't figure it out.
After executing a package, I found that the values in the "executionid" column are the only ones that are unique. Can we use this to determine which package was run? We are trying to architect a solution that would allow us to determine how long a package ran, whether it ran into warnings or errors, etc. We could easily accomplish this by having our own table and using global variables within packages to insert into / update it. Appreciate any help.
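The pattern I'm contemplating, sketched below, relies on the fact that (as far as I can tell) the PackageStart row of each execution carries the package name in its source column, so a self-join on executionid would stamp every log row with its package:

-- Stamp every log row with the name of the package that produced it.
SELECT p.source AS package_name,   -- PackageStart row's source = package name
       l.source AS task_name,
       l.event,
       l.starttime,
       l.message
FROM dbo.sysdtslog90 AS l
JOIN dbo.sysdtslog90 AS p
  ON p.executionid = l.executionid
 AND p.event = 'PackageStart'
ORDER BY l.starttime;

Package duration would then just be the gap between the PackageStart and PackageEnd rows for a given executionid.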
Can anyone tell me the best way in SSIS to log performance at the control-flow level, i.e. per task in my control flow, and which performance characteristics it is possible to log?
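The simplest thing I can think of is enabling just OnPreExecute and OnPostExecute on the SQL Server log provider and pairing the rows up afterwards. A sketch, which assumes each task runs once per execution (tasks inside loops would need extra handling):

-- Elapsed time per task: pair each task's OnPreExecute row with its
-- OnPostExecute row within the same execution.
SELECT pre.source                                  AS task_name,
       pre.starttime                               AS task_started,
       post.starttime                              AS task_finished,
       DATEDIFF(ss, pre.starttime, post.starttime) AS duration_seconds
FROM dbo.sysdtslog90 AS pre
JOIN dbo.sysdtslog90 AS post
  ON post.executionid = pre.executionid
 AND post.sourceid    = pre.sourceid
 AND post.event       = 'OnPostExecute'
WHERE pre.event = 'OnPreExecute'
ORDER BY pre.starttime;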
I am having a problem where duplicate log statements are being written to a log file (as defined by a log provider).
I believe that this is because, in the logging dialog box, I have ticked the checkbox next to a child task to override the logging functionality. I need to do this because it is a Script Task and I want to capture "ScriptTaskLogEntry" events (something that I cannot do at the parent level). However, by doing this I seem to get the script events written at the parent level as well as at the Script Task level.
Is there any way of avoiding this, but still capturing the log events from the script task?
Another issue that is possibly linked is that I am getting an error from the log provider:
The SSIS logging provider "SSIS log provider for Text files" failed with error code 0x800700EA ((null)). This indicates a logging error attributable to the specified log provider.
Could this be because the parent and child task are both attempting to write to the same log provider?
I am currently working on writing a progress log for my SSIS packages. So far I am able to write a new log entry and update it using OnProgress and OnError event handlers. I'd like to take it one step further: whenever the package ends, whether cancelled or finished normally, I'd like to write COMPLETED_ABNORMALLY (when cancelled) or COMPLETED_NORMALLY (on a normal finish) to my logging table. I'm not sure where to begin with this process. I'd like to use a simple method and event handler.
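What I've sketched so far, and I'd welcome corrections, is a pair of Execute SQL Task statements against a hypothetical etl_ProgressLog table: the first run from the package's OnError event handler (or a failure-precedence path), the second from the last task on the success path, with ? mapped to System::ExecutionInstanceGUID. I realise a hard cancellation may not fire either one, so rows left without a status might have to be swept to COMPLETED_ABNORMALLY afterwards.

-- Abnormal end: run from the OnError event handler
-- (or a failure-precedence Execute SQL Task).
UPDATE dbo.etl_ProgressLog
SET Status  = 'COMPLETED_ABNORMALLY',
    EndTime = GETDATE()
WHERE ExecutionID = ?;

-- Normal end: run from the final task on the success path.
UPDATE dbo.etl_ProgressLog
SET Status  = 'COMPLETED_NORMALLY',
    EndTime = GETDATE()
WHERE ExecutionID = ?;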
Just like in DTS, where we can add an error file so that if the package fails we can see what caused the failure, do we have anything like an error logging file in SSIS? I greatly appreciate your help on this. Thanks!!
After logging is configured in an SSIS package, it seems that after each execution the output is appended to the log file (we are talking about the log provider for text files in this case). As a result the file just keeps growing. I would like to overwrite the old information with each run, but I can't find where to configure this. Does anybody know?
I have enabled logging in my package and am using a SQL table to capture it. I have defined a connection for it and selected it in the logging options. With logging enabled, I use package configurations to store the value of the property LoggingMode = 1 (meaning enabled) in the configurations table. But when I close and reopen the package, it fails to enable logging. Even though I have stored the LoggingMode value in the configurations table, logging does not get enabled. Please help me solve this.
The workaround I have tried is explicitly declaring a variable to store the logging-mode value and using it in the package's expressions to set LoggingMode; this variable is saved in the configuration table. That way works, but I want to know why it does not work when the LoggingMode value is read directly from the configuration entries.
Has anyone attempted (with success) to capture the SQL command text from SSIS packages at runtime for logging? What approach was used?
Project REAL used a stored procedure to execute all SQL statements. This seemed rather poor design (formulating a string to pass to the stored procedure just to log the SQL command text), and especially problematic because the stored procedure would have to live on the source system.
I have been trying to store my SQL in variables set to evaluate as expressions, hoping to use an Execute SQL Task in the OnPreExecute event handler of each task, source, transform, or destination whose SQL command text I need to log.
The problem is that, depending on how the expression is formulated (coding parameter markers versus replacing the markers with values), I might only get the pre-replacement version of the SQL command text rather than the final, parameter-replaced text. I'm also not sure this design would work with destinations.
We have a package that has a connection called Load_DimItem.trc. We don't need this logging enabled for this package anymore. However, if I delete the connection and delete the log provider (SSIS log provider for SQL Profiler), I get errors when trying to close the package after debugging: "Cannot detach from one or more processes. [3172] The object invoked has disconnected from its clients."
I am trying to use a Conditional Split task so that I can check for specific fields. If the value doesn't exist, I pipe the records to a Derived Column task, where I add an error. I then try to send these records to a flat file destination so that I can keep track of them. However, when I execute the SSIS data flow task I get the following:
[Log Invalid Records [5496]] Warning: The process cannot access the file because it is being used by another process.
This file isn't being used by any other process as far as I can tell; the only process using it is the SSIS task trying to write to it.
If anyone has any ideas, I would really, really appreciate it.
Has anyone come up with, or determined, a generic way to capture and log indicative information within a data flow in SSIS, e.g. the number of rows selected from the source, transformed, rejected, loaded, various timestamps around these events, etc.? I am trying to avoid having to build a custom solution for each of the packages I will have (of which there will be dozens). Ideally, I'd like some sort of generic component (such as a custom transformation) that hides the implementation details and provides a generic interface to the package.
It is not too difficult to achieve something similar on the control flow level, but once you get into data flows things get complicated.
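For context, the nearest I've got is Row Count transformations writing to package variables, which an Execute SQL Task flushes to a generic audit table after each data flow. A sketch with hypothetical names; on an OLE DB connection the ? markers would be mapped to System::ExecutionInstanceGUID, System::TaskName, and the user variables the Row Count transforms populate:

-- Hypothetical audit table, one row per data-flow execution.
CREATE TABLE dbo.etl_DataFlowAudit
(
    ExecutionID  uniqueidentifier NOT NULL,
    DataFlowName nvarchar(200)    NOT NULL,
    RowsSelected int              NULL,
    RowsRejected int              NULL,
    RowsLoaded   int              NULL,
    LoggedAt     datetime         NOT NULL DEFAULT GETDATE()
);

-- SQLStatement for the Execute SQL Task that runs after the data flow:
INSERT INTO dbo.etl_DataFlowAudit
    (ExecutionID, DataFlowName, RowsSelected, RowsRejected, RowsLoaded)
VALUES (?, ?, ?, ?, ?);

But that still means wiring up variables and a task per data flow, which is exactly the per-package plumbing I'm hoping a generic component could hide.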
I am having the same problems as those in another post. The SSIS package works fine when executed in BIDS and through the Execute Package Utility, but it doesn't work when executed as a step in a job.
The other problem is that the logging also doesn't work when I try executing it as a job, so I have no clue what to do without knowing what the error is. When I run the job, it simply says the step has failed.
I have tried most of the solutions posted on other websites, most of them involving proxies with credentials, but haven't hit on a solution. I would love to get any input on what to do.
I am logging all the tasks in my SSIS package to SQL Server. For each task I am logging at least the OnPreExecute, OnPostExecute, and OnError events. For Script Tasks I have custom logging, and I am logging the ScriptTaskLogEntry event too.
When I run the package manually from BIDS, the logging works great! But when I try to run the package from a job or from the command line, the number of events that gets logged is greatly reduced. For example, when run manually I get 104 records in the log table, but when run from the command line I get only 23 records. Most of the custom logging messages from the Script Task do appear; it's the pre- and post-execute events that are skipped. Any idea why?
Here is the command line from the job. I also use the same command line with "dtexec" from the command prompt.
I am using the "SSIS Log Provider for SQL Server" to log events to a table for "OnError" and "OnPostExecute" events of a package. This works as expected and provides a nice clean output on the execution steps of the package.
I am curious as to why I do not see any detail for the tasks that fall under the "Data Flow" section of the package, though. For instance, on my "Control Flow" tab I added a "Data Flow" task that simply loads a few tables from a source server to a destination server. However, nothing is shown in the logging output, just that a Data Flow task was initiated. And when configuring this logging under SSIS --> Logging, in the checkbox area on the left you cannot "drill into" the data flow's steps.
Is there a reason why there is no detailed logging for Data Flow tasks? Would getting to that require me to create a custom log provider?