Problem Executing A Package And Tasks With A Certain TransactionOption Property. I Need Help.
Nov 27, 2007
I have a Package and a DataFlow Task.
The Package has TransactionOption=Required.
The DataFlow Task has an OLE DB Source and an OLE DB Destination.
The DataFlow Task has TransactionOption=Supported.
The package executes on a Workstation and DataSources for the OLE DB Source and the OLE DB Destination are on a Server.
After the package was launched, the following error messages were shown:
[OLE DB Destination [43]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DWH_Destination" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
[DTS.Pipeline] Error: component "OLE DB Destination" (43) failed the pre-execute phase and returned error code 0xC020801C.
[Connection manager "DWH_Destination"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 "The transaction manager has disabled its support for remote/network transactions.".
[Connection manager "DWH_Destination"] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8004D024.
If I set TransactionOption=NotSupported in the DataFlow Task then the package executes successfully.
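For reference, here is a hedged object-model sketch (not from the post) of the configuration being described, just to pin down where the two TransactionOption values sit; the 0x8004D024 error above is MSDTC refusing to enlist the remote connections, as its own message says, not a problem with these property values themselves.

```csharp
// A sketch (assumed names), reproducing the described setup with the SSIS
// object model, to show where the two TransactionOption values live.
using Microsoft.SqlServer.Dts.Runtime;

class TransactionOptionDemo
{
    static void Main()
    {
        Package package = new Package();
        package.TransactionOption = DTSTransactionOption.Required;    // package starts a DTC transaction

        // "STOCK:PipelineTask" is the stock moniker for the Data Flow Task.
        TaskHost dataFlow = (TaskHost)package.Executables.Add("STOCK:PipelineTask");
        dataFlow.TransactionOption = DTSTransactionOption.Supported;  // task enlists in that transaction

        // With Supported, each OLE DB connection must enlist via MSDTC; if
        // network DTC access is disabled on either machine, AcquireConnection
        // fails with 0x8004D024, exactly as in the errors quoted above.
    }
}
```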
I can't find information on how you are supposed to handle the TransactionOption setting inside a custom task.
I have a custom task that I developed, and it basically calls a COM+ object that writes data to a database. When I have the task inside a container that has TransactionOption set to Required, and my custom task is set to Supported, if one of the X items fails to execute in my custom task I tell my task to fail the parent, which I thought would roll back everything. But it does not.
Is there someplace that I need to write rollback code in custom SSIS tasks? If there is I can't find any mention of it anywhere. Any examples out there on how to build custom SSIS tasks that support the TransactionOption parameter?
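One relevant detail the documentation is quiet about: the runtime hands a custom task the container's DTC transaction as the transaction argument of Task.Execute, and the task is expected to pass it to AcquireConnection rather than write its own rollback code. A hedged sketch (the connection name and work are hypothetical placeholders):

```csharp
// Sketch of how a custom task receives and uses the ambient transaction.
using Microsoft.SqlServer.Dts.Runtime;

public class MyCustomTask : Task
{
    public override DTSExecResult Execute(Connections connections,
        VariableDispenser variableDispenser, IDTSComponentEvents componentEvents,
        IDTSLogging log, object transaction)
    {
        // AcquireConnection(transaction) enlists the connection in the container's
        // DTC transaction. Work done outside this connection (e.g. a COM+ object
        // opening its own connection) may not be covered unless it enlists in the
        // same distributed transaction itself.
        ConnectionManager cm = connections["MyConnection"]; // hypothetical name
        object rawConnection = cm.AcquireConnection(transaction);

        try
        {
            // ... do the work on rawConnection ...
            return DTSExecResult.Success;
        }
        catch
        {
            // Returning Failure lets the runtime abort the distributed transaction.
            return DTSExecResult.Failure;
        }
        finally
        {
            cm.ReleaseConnection(rawConnection);
        }
    }
}
```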
I have a package which consists of 3 Execute SQL Tasks -
1) Drop old database & Restore a new one 2) Run DDL 3) Run DML
My task was to put all of these in a transaction. I started by setting the "TransactionOption" property for this package to "Required", and each child task was also set to the same transaction option of "Required". However, after running the package, it errored out saying Task 1 cannot be in a transaction (which is logical), so I removed the Required transaction property from the first task and kept it on for the remaining 2 tasks. In addition, I have also set up a failover strategy, wherein if these tasks were to fail, the package should restore the previous working copy of the backup. To test this scenario, I deliberately created an error in the DML task; logically only this task should roll back, but instead it rolls back the entire transaction, and to my horror the failover step is not executed either.
Is there something which I am not doing correctly?
I have some "Execute T-SQL Statement Tasks" in a package. I would like to run this same package on another SQL Server without having to change it on the other server. Since the server name can be given when setting up the connection, I think if I leave the server name out then the package could run on any server? Is my assumption correct?
I am trying to programmatically execute a package that contains an Execute SQL Task component bound to a variable for its "SqlStatementSource" property (via an expression). The variable is of type String and contains a simple value of "SELECT 1". The Execute SQL Task contains an expression that sets the SqlStatementSource property to the value of this variable.
The package runs fine when I execute it via dtexec or BIDS, but when I attempt to run it via the object model, I receive the following error message:
The result of the expression ""@[User::Sql]"" on property "SqlStatementSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
I did a search on this forum and noticed quite a few threads about this same issue, but no explanation/solution. We have quite a few packages that have dynamically constructed SQL statements for Execute SQL Tasks, and they are all failing to run via the object model. Is there something that I am missing?
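For comparison, this is the minimal object-model run being described (the path is a placeholder); per the post, this is the code path that raises the expression error, while dtexec and BIDS succeed.

```csharp
// Minimal programmatic load-and-execute via the SSIS runtime object model.
using Microsoft.SqlServer.Dts.Runtime;

class RunPackage
{
    static void Main()
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null); // placeholder path

        DTSExecResult result = pkg.Execute();

        // Dump any errors collected during execution, such as the
        // "cannot be written to the property" expression error above.
        foreach (DtsError err in pkg.Errors)
            System.Console.WriteLine(err.Description);
    }
}
```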
I want to use transactions to protect my tasks. I have ten tasks that need to be completed in a single package. If any of the tasks fail - I need the entire process to rollback.
I have 5 execute sql tasks to truncate groups of tables in my destination database.
After each of the single truncate tasks - I use data flow tasks to copy the data from the source to the destination db.
Both dbs are on the same server.
I am basically copying the entire db - with the exception of a few tables.
Package.TransactionOption = Required, and every task's TransactionOption is set to Supported.
Several questions. It seems to hang on the first data flow task - the output window stops, etc.
If I set Package.TransactionOption = Supported, the package will execute. If I create an error on the third truncate task, all previous tasks complete; the package fails on step 3 and the truncation for this step is not committed.
I am partially there. Can anyone help point me in the right direction? I need the entire process to commit if all tasks are successful or rollback if any fail.
Hi, I have a problem in SSIS with passing variable values while executing a package. I'll give the details below:
I opened Microsoft SQL Server Management Studio and made a connection to Integration Services. To execute the package I right-clicked it and clicked Run Package, then clicked Execute, and the package executed successfully.
The problem is that if I try to Set Values, the package throws the error: DTExec: Could not set ProcessMode value to M.
Basically I could not understand in which format I should pass the Variables. What I tried is listed below:
ProcessMode;M
\Package.Variables[User::ProcessMode].Value;M
\Package.Variables[ProcessMode].Value;M
But every time I got errors.
And then I tried from the command line: DTEXEC /DTS "\MSDB\Load_Order" /SERVER SERVERNAME /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING V /SET \Package.Variables[ProcessMode].Value;M
The first time, the process ran successfully, and it changed ProcessMode to M as well. But after that, it stopped changing the ProcessMode value to M.
Please help with this. I have also tried many examples from sites, but could not get a proper solution.
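In case it helps, here is a hedged sketch of an alternative that avoids the /SET syntax entirely: load the package from msdb through the object model, set the variable in code, and execute. The msdb path and server name are placeholders derived from the command line above.

```csharp
// Sketch: set a package variable programmatically instead of via dtexec /SET.
using Microsoft.SqlServer.Dts.Runtime;

class SetVariableAndRun
{
    static void Main()
    {
        Application app = new Application();
        // Load from msdb on the server; path and server name are placeholders.
        Package pkg = app.LoadFromSqlServer(@"\Load_Order", "SERVERNAME", null, null, null);

        // Assign the value directly, then run.
        pkg.Variables["User::ProcessMode"].Value = "M";
        DTSExecResult result = pkg.Execute();
    }
}
```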
Hi, I have a table with queries. I need to execute those queries and pass the results into a variable, then use that variable/result to execute other queries to make business decisions. EXAMPLE:
TASK #1: TABLE-A has the queries below:
SELECT COUNT(*) FROM employee
SELECT COUNT(*) FROM managers
I want to execute those queries and store the results in @counts. How do I execute all the queries in TABLE-A and pass the result to a variable?
TASK #2: Then I have another task (maybe a SQL task) which uses the value of @counts to make a decision: if @counts > 1 then pass; if @counts < 1 then fail.
How can I do that?
I am still new to SSIS and not very familiar with variables. Any advice would be appreciated.
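A sketch of one way to wire up TASK #1 (SSIS 2005's Script Task uses VB.NET; the same logic is shown here in C# with ADO.NET, and the QueryText column, connection string, and User::counts variable name are assumptions, not from the post):

```csharp
// Sketch: run every query stored in TABLE-A and accumulate the scalar results.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class RunQueriesIntoVariable
{
    static void Main()
    {
        // Placeholder connection string.
        string connectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";
        int counts = 0;

        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Read the stored query texts first (assumes a QueryText column).
            List<string> queries = new List<string>();
            using (SqlCommand read = new SqlCommand("SELECT QueryText FROM [TABLE-A]", conn))
            using (SqlDataReader rdr = read.ExecuteReader())
                while (rdr.Read())
                    queries.Add(rdr.GetString(0));

            // Execute each stored query and sum the COUNT(*) results.
            foreach (string sql in queries)
                using (SqlCommand cmd = new SqlCommand(sql, conn))
                    counts += (int)cmd.ExecuteScalar();
        }

        // In a Script Task this would instead go to the package variable, e.g.
        // Dts.Variables["User::counts"].Value = counts; a precedence constraint
        // expression like @counts > 1 then drives the pass/fail branch in TASK #2.
        Console.WriteLine(counts);
    }
}
```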
I have a simple Error row redirection (from an OLEDB Command) to redirect all rows in error to a Transform script and thereafter to a Flat file destination. This is via the red arrow (DF path) output from the OLEDB Command.
I don't understand why this leg executes even when there are no rows in error. Zero-byte flat files get written out when there are no errors.
How come? Why would a path with a red arrow execute even when there are no errors?
Part 2: When I introduce some errors in the data to cause an integrity violation, and I hook up an OnError event handler, it is never raised, even though the error rows are successfully redirected and written out to the flat file destination.
So what constitutes an error for a Data Flow Task? Does an error raised by SQL Server for an integrity violation bubble up as an error in the SSIS package?
Maybe I'm missing something, but I can't find how to run multiple tasks in sequence while in Visual Studio debug mode. In DTS design mode I grew accustomed to right-clicking tasks one at a time, but in SSIS the additional step of having to exit debugging mode after every task gets old after a while.
There must be a way to start execution at a certain task and have the package continue all the way to some other specified task. It would also be nice to have every task in a Group execute in sequence and stop (even if connections continue beyond the group). I could even settle for repeatedly clicking the Continue button in Debug mode, but it's always grayed out when the current task is finished!
I have two servers, A for dev and B for production.
On server A I developed a project containing a SSIS package using SQL Server business intelligence development studio. The package runs fine from the BIDS and also when I save it to SQLServer itself and run it as a scheduled job using the SQLServer Agent.
All ready to roll out to Server B I thought, so I then saved the .dtsx file to a shared network drive.
On Server B, I created an empty project with the same name as it had on Server A. I then imported the .dtsx file into the project using Project > Add Existing Item.
The package appeared to import OK, but I now cannot execute any of the data flow tasks in isolation. If I right-click on them, there is no option to 'Execute Task' as there should be; it is not greyed out, it's simply not there at all.
Also, if I attempt to debug the whole package I get a message saying 'This document is opened by another project'.
Can anyone help with this as my deployment to live isn't going very well to say the least!
Both server A + B are 32-bit 2005 std edition SP1 on W2003 Server std edition SP1.
In my ETL project, I need to extract raw data using WinZip, and the DOS command I use is: c:\program files\WinZip\winzip32.exe -min -e c:\rawdata\test.zip c:\rawdata, where -min is for minimize and -e is for extract, c:\rawdata\test.zip is the source file and c:\rawdata is the destination. It works fine as a DOS command.
I configured the Execute Process Task with the following parameters:
RequireFullFileName: TRUE
Executable: c:\program files\WinZip\winzip32.exe
Arguments: -min -e
WorkingDirectory: c:\program files\WinZip
StandardInputVariable: User::gsRawFile
StandardOutputVariable: User::gsDestDir
where User::gsRawFile = c:\rawdata\test.zip and User::gsDestDir = c:\rawdata
I added tasks to refresh two tables (delete from, insert into, update) to an SSIS project. I have them running from the WinXP scheduler. The issue:
In dev, the tasks integrate and execute successfully from the scheduler.
In prod, I can right-click and execute each of the three tasks without any problem, but the same tasks cause my project to fail when executed from the scheduler.
Questions: Any ideas about what I am failing to see? How do I get meaningful log messages from the tasks that are failing?
Thanks for your ideas...
Installed Edition: IDE Standard
SQL Server Analysis Services Microsoft SQL Server Analysis Services Designer Version 9.00.1399.00
SQL Server Integration Services Microsoft SQL Server Integration Services Designer Version 9.00.1399.00
SQL Server Reporting Services Microsoft SQL Server Reporting Services Designers Version 9.00.1399.00
Microsoft Visual Studio 2005 Version 8.0.50727.42 (RTM.050727-4200) Microsoft .NET Framework Version 2.0.50727
Variable 1 is created as Int32 and variable 2 is created as DateTime.
When I execute the SQL task, I get the error:
[Execute SQL Task] Error: Executing the query "Exec mysp 'table1',OUTPUT,OUTPUT" failed with the following error: "Error converting data type nvarchar to int.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
What am I missing? I tried changing the data types and adding the input variable as a variable in the mapping as well. Nothing seems to work. Any ideas, please?
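For what it's worth, with an OLE DB connection the Execute SQL Task statement is usually written with ? markers, e.g. EXEC mysp ?, ? OUTPUT, ? OUTPUT, with the types set under Parameter Mapping. To illustrate the type plumbing outside SSIS, here is a hedged ADO.NET sketch (the parameter names and connection string are assumptions): each OUTPUT parameter is declared with the type the procedure expects, since passing string-typed values where an int is expected can produce exactly this conversion error.

```csharp
// Sketch: calling a proc with one input and two typed OUTPUT parameters.
using System;
using System.Data;
using System.Data.SqlClient;

class OutputParamDemo
{
    static void Main()
    {
        using (var conn = new SqlConnection("Data Source=.;Integrated Security=SSPI")) // placeholder
        using (var cmd = new SqlCommand("mysp", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@TableName", "table1"); // assumed parameter name

            var p1 = cmd.Parameters.Add("@Count", SqlDbType.Int);        // matches the Int32 variable
            p1.Direction = ParameterDirection.Output;
            var p2 = cmd.Parameters.Add("@LoadDate", SqlDbType.DateTime); // matches the DateTime variable
            p2.Direction = ParameterDirection.Output;

            conn.Open();
            cmd.ExecuteNonQuery();
            Console.WriteLine("{0}, {1}", p1.Value, p2.Value);
        }
    }
}
```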
Hi All, I have the following issue: I want to make the Arguments property of the Execute Process Task dynamic, but I haven't been able to do so. I have posted several times; hopefully I will get a solution today.

I am using the following argument to execute the WinZip program:

Argument: /e C:\FTP\File1.zip C:\FTP\Unzipped

This argument works, but since File1.zip changes every time (it will be File2.zip, then File3.zip, and so on), I have to change it manually on the Execute Process Task each time, whereas I want it to change by itself. To be clear: /e is the WinZip switch to extract a file, C:\FTP\File1.zip is where the zipped file is, and C:\FTP\Unzipped is where I dump the unzipped files. As you can see, the file name changes every run, so how do I make it dynamic? For example, if I execute File1.zip today, then tomorrow it should execute File2.zip automatically, then File3.zip, etc. Please help, I need your input badly. Thank you in advance.
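One possible approach (a sketch, not a confirmed fix; the variable name and object-model setup are assumptions): attach a property expression to the task's Arguments property so the argument string is rebuilt from a variable on every run, and let a Foreach Loop or a small script advance that variable to the next file name.

```csharp
// Sketch: make the Execute Process Task's Arguments property expression-driven.
using Microsoft.SqlServer.Dts.Runtime;

class DynamicArguments
{
    static void Main()
    {
        Package package = new Package();
        package.Variables.Add("ZipFile", false, "User", @"C:\FTP\File1.zip");

        TaskHost winzip = (TaskHost)package.Executables.Add("STOCK:ExecuteProcessTask");
        winzip.Properties["Executable"].SetValue(winzip, @"C:\Program Files\WinZip\winzip32.exe");

        // The expression is re-evaluated each execution, so pointing
        // User::ZipFile at File2.zip, File3.zip, ... is all that's needed.
        winzip.SetExpression("Arguments",
            "\"/e \" + @[User::ZipFile] + \" C:\\\\FTP\\\\Unzipped\"");
    }
}
```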
I'm looking for a way to refer to a package variable within any Transact-SQL code included in either an Execute SQL or Execute T-SQL task. If this can be done, I need to know the technique to use - whether it's something similar to a parameter placeholder question mark or something else.
FYI - I've been able to successfully execute Transact-SQL statements within the Execute SQL task, so I don't think the Execute T-SQL task is even necessary for this purpose.
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set the parameter in the master package's ExecutePackageTask to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
I have developed a big SSIS package to extract data from flat files (200+ data flows).
The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files. I'm running these validations in parallel, so that when a file arrives it enters the "validation process" and extraction of the file starts.
When I run the SSIS package from BIDS it works the way I designed it, but when I run it on the server, the tables loaded during the process only become "available" when the SSIS package ends. It is imperative that, during the process, when a table receives new data it becomes ready, rather than only becoming available when the SSIS package finishes.
I have attached a .jpeg illustrating this.
It is important for the tables to be available so that the stored procedures (outside the SSIS package) that depend on some of the tables start working before the SSIS package ends.
I have done a search and have read some of the posts, but am left more confused than before. I am fairly new to SSIS. Here is my situation and what i am trying to accomplish.
I have a package that has a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails and I re-run it, it runs the last task as well as all the other tasks. What I am looking to accomplish is for a re-run of the package to run only the SQL tasks that failed, not the previously successful tasks.
I think the best way would be via disabling tasks on successful completion of a task, where it writes the name of the SQL task to a temp table, but I am skeptical.
Can anyone point me in a direction to help me accomplish what I am looking for, please?
We want to develop an error-handling process that will log errors to multiple destinations (event log, text files, or a SQL database) depending on a variable set in the package. We also want this error-handling process to be initiated by every task in the package on error.
Is this possible? Can the same event handler be called from multiple tasks in the package? Also, in the event handler, can we call another package which actually does the error handling? That way we have only one place to change our error-handling process if required.
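For the "one handler for all tasks" part, a hedged sketch of the shape this could take (the called package path is a placeholder, and the wiring of the Execute Package Task is abbreviated): an OnError handler defined at package scope fires for errors raised by any task beneath it, and the handler can host an Execute Package Task that calls a shared error-handling package.

```csharp
// Sketch: one package-level OnError handler calling a shared handler package.
using Microsoft.SqlServer.Dts.Runtime;

class CentralErrorHandler
{
    static void Main()
    {
        Package package = new Package();

        // A package-scope OnError handler covers errors from every task below it.
        DtsEventHandler onError = (DtsEventHandler)package.EventHandlers.Add("OnError");

        // Inside the handler, call the dedicated error-handling package
        // (a connection to its location still needs to be configured).
        TaskHost callHandler = (TaskHost)onError.Executables.Add("STOCK:ExecutePackageTask");
        callHandler.Properties["PackageName"].SetValue(callHandler, @"\ErrorHandling"); // placeholder
    }
}
```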
This may sound a little anal-retentive, but I have a number of SSIS packages where, when I open them, the first thing I have to do is scroll to the left or up to get to where the tasks are displayed. Even if I move the tasks right or down, they still end up in that initial position. This happens even if I use auto arrange.
Is there a way for me to set the package so it has a consistent point or display at which it opens?
I am having a hard time setting the executable path for an Execute Process Task in SSIS. I have a variable that is initialized at package startup which holds the path to an executable in Windows. When I set the "Executable" property in an expression, I get a warning that the path for the executable is not set. One workaround was to initialize the variable with a bogus path in the hope that the "correct" value would be written at run time. NO LUCK. I still get the error, and I cannot run the package until I put in a static path.
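The post doesn't confirm a fix, but a commonly suggested workaround is to keep the expression and set DelayValidation on the task, so validation waits until the variable has its real value. A minimal object-model sketch (the variable name and setup are placeholders):

```csharp
// Sketch: expression-driven Executable path plus delayed validation.
using Microsoft.SqlServer.Dts.Runtime;

class DelayValidationDemo
{
    static void Main()
    {
        Package package = new Package();
        package.Variables.Add("ExePath", false, "User", string.Empty);

        TaskHost process = (TaskHost)package.Executables.Add("STOCK:ExecuteProcessTask");
        process.SetExpression("Executable", "@[User::ExePath]");

        // Without this, validation evaluates the expression while the variable
        // is still empty and reports that the executable path is not set.
        process.DelayValidation = true;
    }
}
```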
I have several sequence containers in one package that fire off Execute Package Tasks. I would like each of the sequence containers to start at the same time when the job starts running. However, when I set them up to do that, I get an error that a variable cannot be read because it is locked. I have the variables set up as read-only, so I'm not sure why they are being locked. When I run the package with each sequence container firing off after the previous one ends, it runs fine.
I'm trying to do some custom SSIS logging using event handlers, similar to the ideas provided by Jamie Thomson in the past. My problem is that when I use System::SourceID as one of the items to be logged, I can't match up the SourceID to any of the GUIDs that are displayed in the property window for the various tasks in my package.
Where is this sourceID coming from and how can I track it down?
I'm trying to keep track of the ETL process by inserting/updating a row in one table for each package that finishes in my ETL process while it executes. So far, I created a Script Task that increments a variable (counter) by one and then opens a connection to my database and inserts/updates my table. What I want to see is Step 1/30, Step 2/30 and so on. Right now I can display Step 1, Step 2, but how can I get the overall number of tasks within a package?
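A sketch for getting that total (the package path is a placeholder, and note this counts containers as well as tasks, so filter on TaskHost if only tasks should count): walk the package's container hierarchy once and count every executable. IDTSSequence is implemented by the package itself, Sequence containers, and the loop containers, so the recursion covers all of them.

```csharp
// Sketch: count every executable in a package, descending into containers.
using Microsoft.SqlServer.Dts.Runtime;

class TaskCounter
{
    static int CountExecutables(IDTSSequence container)
    {
        int count = 0;
        foreach (Executable child in container.Executables)
        {
            count++; // the task or container itself
            IDTSSequence nested = child as IDTSSequence;
            if (nested != null)
                count += CountExecutables(nested); // descend into the container
        }
        return count;
    }

    static void Main()
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\MyETL.dtsx", null); // placeholder path
        System.Console.WriteLine("Total executables: " + CountExecutables(pkg));
    }
}
```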
I have a Foreach Loop that populates a SQL Server table from a set of flat files. I run the flat-file import via a DTS package embedded in an Execute DTS 2000 Package Task. I want to pass the source file name fetched by the Foreach Loop and assign it to a global variable in the DTS package. How can this be done?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package, as it is performing rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes showing success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package Task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and is not available to an event handler. It also looks like it is not safe to put a SQL task in as the next item to go look at the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package Task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure that the DTS package has completed before I start any other tasks that will check the SQL 2000 log of its execution?