Running Out Of Disk Space During SSIS Package Execution
Oct 3, 2006
Hi all,
I'm running out of disk space when running an SSIS package. Is there any way to select where temp files are saved during package execution?
We've got an internal database that replicates with another database server for our website.
Not all tables are replicated, some use merge and the others are snapshot based and published regularly to the public website facing server.
However, there's a lot of data (well, large textual data) that's being transferred and it seems to be generating massive log files that continue to grow and grow.
I'm fairly new to administering a SQL Server box, so I was wondering if anyone can tell me the best way to keep it under control. I've heard it's possible to truncate the logs, effectively deleting any data that has already been processed by the subscribing servers, etc.?
As I said, I'm very much new to this and would really appreciate some guidance, if only to the right part of the SQL Server Books Online :)
Thanks,
Paul
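A hedged sketch of the usual first checks for a runaway log: see how full each log actually is, and, if the publication database is in FULL recovery and the log has never been backed up, schedule regular log backups so the inactive portion can be reused. The database name and backup path below are placeholders; the transaction log and BACKUP LOG topics in Books Online cover the details.
-- How full is each transaction log right now?
DBCC SQLPERF(LOGSPACE)
-- Back up the log so inactive portions can be reused; name and path are placeholders
BACKUP LOG YourDatabase
TO DISK = 'D:\Backups\YourDatabase_log.trn'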
Hello,
I am testing my SSIS package, but I have run into a disk space issue (the C: drive is over 100 GB):
Error: Date Time
Code: 0xC004704A
Source: xxxxDTS.Pipeline
Description: The buffer manager cannot extend the file "C:DTSxxxF.tmp" to length xxxxxx. There was insufficient disk space.
End Error
Error: Date Time
Code: 0x80070070
Source: xxxxDTS.Pipeline
Description: There is not enough space on the disk.
etc....
How can I solve the problem?
Is there any way to use a different path for the .tmp files?
Thanks,
any help will be much appreciated.
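For reference, a hedged note on the temp-file questions above: the data flow spills buffers to the locations given by the Data Flow task's BufferTempStoragePath and BLOBTempStoragePath properties, which default to the TEMP/TMP directory of the executing account. They can be pointed at a larger drive in the package itself, through a configuration, or overridden at run time with dtexec /SET. A rough sketch follows, assuming the task is literally named "Data Flow Task" and that D:\SSISTemp exists; the exact property-path syntax is worth checking against the dtexec documentation.
dtexec /F "C:\MyPackage.dtsx" /SET "\Package\Data Flow Task.Properties[BufferTempStoragePath];D:\SSISTemp" /SET "\Package\Data Flow Task.Properties[BLOBTempStoragePath];D:\SSISTemp"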
Hey, I've a few jobs which call SSIS packages. If I run the SSIS package, it runs fine but if I try to run the job which calls this package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work. All of them fail.
Thank you
Tej
-- Initialize Control Mechanism
DECLARE @Drive TINYINT,
        @SQL VARCHAR(100)

SET @Drive = 97

-- Setup Staging Area
DECLARE @Drives TABLE
        (
            Drive CHAR(1),
            Info VARCHAR(80)
        )

WHILE @Drive <= 122
    BEGIN
        SET @SQL = 'EXEC XP_CMDSHELL ''fsutil volume diskfree ' + CHAR(@Drive) + ':'''

        INSERT @Drives
                (
                    Info
                )
        EXEC (@SQL)

        UPDATE @Drives
        SET    Drive = CHAR(@Drive)
        WHERE  Drive IS NULL

        SET @Drive = @Drive + 1
    END

-- Show the expected output
SELECT   Drive,
         SUM(CASE WHEN Info LIKE 'Total # of bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS TotalBytes,
         SUM(CASE WHEN Info LIKE 'Total # of free bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS FreeBytes,
         SUM(CASE WHEN Info LIKE 'Total # of avail free bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS AvailFreeBytes
FROM     (
             SELECT Drive,
                    Info
             FROM   @Drives
             WHERE  Info LIKE 'Total # of %'
         ) AS d
GROUP BY Drive
ORDER BY Drive
E 12°55'05.25"
N 56°04'39.16"
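As an aside, if only the free space per drive is needed rather than totals, a shorter (if undocumented) alternative to the script above is:
EXEC master..xp_fixeddrives
-- returns one row per fixed drive with its free space in MB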
Hi,
Has anyone monitored the execution of SSIS packages with MOM? Are there extreme benefits over just utilizing the built in execution and event logs, as well as the Windows Event Viewer?
What is the recommended way to monitor SSIS execution?
Thanks,
- Joel
While repeatedly testing a package that deletes/inserts into several tables, over the course of several days my package, which took 45 minutes to load 1700 XML files, began to take over 6 hours. It turned out to be an I/O bottleneck: the Avg. Disk Queue Length was around 200 and I was incurring many PAGEIOLATCH_EX waits. My dev machine uses a single local disk, no RAID, so I had no options there, but I ran the maintenance wizard to recreate indexes/statistics and defragged the hard drive, and regained my original 45-minute time. I guess I'll have to put a maintenance plan together to do this nightly.
-Kory
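For reference, a minimal sketch of the kind of nightly maintenance described above; dbo.StagingTable is a placeholder for whichever tables churn the most, and an offline ALTER INDEX ... REBUILD will lock the table while it runs.
-- Rebuild all indexes on the heavily churned table, then refresh column statistics
ALTER INDEX ALL ON dbo.StagingTable REBUILD
UPDATE STATISTICS dbo.StagingTable WITH FULLSCAN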
Hi All,
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Win 2003 SP2) and the process has been very frustrating, to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
Other times I get this error message:
.NET Runtime version 2.0.50727.1433 - Fatal Execution Engine Error (79FFEE24) (80131506)
And still other times
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
Does anyone have any other suggestions to try?
Thanks.
Hi there
We have an SSIS setup which runs as follows:
The master package has a configuration file, specifying the connect strings
The master package passes these connect-strings to the child packages in a variable
Both the master package and the child packages have connection managers, set up to use localhost. This is done deliberately to be able to test the packages on individual development PCs.
We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed, they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw):
Case 1:
The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2:
When we run exactly the same thing, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3:
When we run exactly the same thing, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4:
When we then stop the SQL Server on the default instance, the package fails again, this time with
Error - "timeout when connecting to database xxx"
Info - "package is preparing to get connection string from parent ..."
and then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
Thanx,
/Nils M - Copenhagen
I am getting an error while running SSIS package:
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Could not allocate space for object 'dbo.SORT temporary run storage: '...' in database 'tempdb' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup."
The destination is SQL 2000. The package does not create any large temp tables.
Apart from setting the MaxInsertCommitSize property on the destination in the data flow, is there anything else I could do?
Where do I find this MaxInsertCommitSize option?
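A hedged pointer on both questions: in the OLE DB Destination editor, "Maximum insert commit size" only appears when the data access mode is "Table or view - fast load" (the underlying component property is FastLoadMaxInsertCommitSize). Separately, since the error says tempdb's PRIMARY filegroup is full, it is worth checking tempdb's size and autogrowth on the server reporting the error. A sketch, where 'tempdev' is the usual logical name of tempdb's data file but should be verified with sp_helpfile first, and the sizes are placeholders:
USE tempdb
EXEC sp_helpfile
-- Enlarge the data file and give it room to grow
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 256MB)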
I would like to see if someone can help me out.
Scenario:
1.- The SSIS package executes tasks on a SQL Server 2000 database
2.- Execution takes place using Business Intelligence Studio
Question:
1.- How can I track that the SQL 2000 tasks took place using an SSIS package?
Thanks
Hi all,
What is the best way to execute SSIS packages? I have no problem using the dtexec command, but I want it to run every night. Is this done through SQL Server (SQL Server Agent, under Jobs)? Should the step type be Operating System (CmdExec) or Transact-SQL? How exactly is the command written there?
Thanks in advance.
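A minimal sketch of the SQL Server Agent route, assuming a file-system package at a placeholder path; the step can be CmdExec calling dtexec (shown here), or on 2005 the dedicated SQL Server Integration Services Package step type.
USE msdb
GO
EXEC dbo.sp_add_job @job_name = N'Nightly SSIS load'
EXEC dbo.sp_add_jobstep
    @job_name  = N'Nightly SSIS load',
    @step_name = N'Run package',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /F "D:\Packages\NightlyLoad.dtsx"'   -- placeholder path
EXEC dbo.sp_add_jobschedule
    @job_name  = N'Nightly SSIS load',
    @name      = N'Every night at 02:00',
    @freq_type = 4,          -- daily
    @freq_interval = 1,
    @active_start_time = 20000   -- 02:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Nightly SSIS load'
GO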
Hi all,
I am Hazara. I am trying to call an SSIS package from a web service, but the package.Execute() method is returning 'Failure'. However, I am able to execute the same package from a normal .NET project and it works fine (using the same code that I have used in the web service).
I have also tried to execute it through a stored procedure, for which I first created a .dll in C# (which works perfectly) and then registered this .dll in SQL Server 2005 using the following command:
CREATE ASSEMBLY asmPackageExecuter FROM 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\PackageExecuter.dll'
WITH PERMISSION_SET = UNSAFE
GO
Now, on calling the method of the .dll (which is responsible for executing the package), I am getting the DTSExecResult as 'Success', but data is not getting transferred from one table to the other as was expected from the package.
Please help me. I have searched everywhere on the net but didn't find any solution.
I want to execute the package only through a web service or stored procedure.
Thanks
Hazara.
Running an SSIS package with the SSIS runtime components and SQL Server 2000...
Is it possible to run SSIS packages that point to SQL Server 2000 servers
without installing SQL Server 2005?
Can we just install the SSIS runtime and run the packages?
Please explain with links if possible.
thanks a lot
Hi Friends,
I need help from you.
I am working on SSIS packages for ETL purposes.
The version of SQL Server I am using is SQL Server 2005.
In brief, the current ETL works as follows.
In the ODS database I have 2 tables, Table_A and Table_B, which get loaded from another 2 staging tables, A and B.
Using these 2 tables, data is loaded into a target table, Trg_A.
The ETL packages are executed from stored procedures, which create a job within the stored procedure.
The loading of the target table is a little tricky.
The loading of Table_A is implemented in one SSIS package,
and the loading of Table_B is implemented in another SSIS package.
In the target table there are two columns which get updated as and when each table is loaded.
So, the first time I run the package which is responsible for loading Table_A, it loads values into Table_A and, once done, updates col1 in the target table.
That happens once the execution of Package1 is complete.
Then I kick off the second SSIS package, which loads the data into Table_B and updates the target table's col2.
Now, the actual problem I am facing is this:
For loading Table_A and updating col1 in the target table, I will be receiving more than 5 Excel files every month, on a weekly basis.
I cannot gather all the files and run them using a For Loop counter.
So presently I am loading one Excel file per week.
The same goes for loading Table_B.
For a given week, if I execute both packages (the one which loads Table_A and updates Trg(col1), and the one which loads Table_B and updates Trg(col2)), then I get a deadlock error and the entire ETL gets messed up.
Now my requirement is this: even though the 2 packages are run in parallel, there can be a difference of a few milliseconds between their start times in the Job Monitor.
I need to implement a queuing mechanism which takes care of running the packages sequentially rather than in parallel, i.e. I need to ensure only one SSIS package is running in the Job Monitor; only after successful execution of one package should the second package start its execution.
If we can implement such a queuing mechanism, then my problem is solved.
I need some suggestions on implementing this queuing mechanism programmatically using the SQL Server job-related metadata tables,
or else, is there a server parameter or initialization parameter which can be set at the database level that satisfies my requirement?
Any suggestions would be greatly appreciated.
Looking forward to your sincere comments on this.
Thanks in advance.
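One hedged way to sketch the queuing check against the Agent metadata, assuming placeholder job names 'Load_Table_A' and 'Load_Table_B': before starting the second job, look in msdb for any load job that has started but not yet finished, and only call sp_start_job when nothing is running. For robustness you would also restrict sysjobactivity to the most recent Agent session in msdb.dbo.syssessions.
DECLARE @running INT
SELECT @running = COUNT(*)
FROM msdb.dbo.sysjobactivity AS ja
INNER JOIN msdb.dbo.sysjobs AS j ON j.job_id = ja.job_id
WHERE j.name IN (N'Load_Table_A', N'Load_Table_B')   -- placeholder job names
AND ja.start_execution_date IS NOT NULL
AND ja.stop_execution_date IS NULL                   -- started but not yet finished
IF @running = 0
EXEC msdb.dbo.sp_start_job @job_name = N'Load_Table_B'
ELSE
PRINT 'Another load job is still running; try again later.'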
Hi All,
This is not a problem, but I want to share my attempt at package execution from C# code.
I just want to make sure whether this is the right way to do it or not.
I need to upload some processed text files into a table using SSIS packages. I am calling these packages at runtime for the different source text files passed to them.
I first created the package on my machine and deployed the packages to SQL Server using the default protection level. So when I try to execute it from Integration Services it won't work, throwing an exception in the AcquireConnection() call, because the sensitive information stored inside the package is not available on that machine.
In C#
Now I am loading this package using LoadFromSqlServer().
I am creating a connection manager object for each source and destination type and then setting all sensitive information from my solution's config file.
I set the protection level of the package and the available connection managers to DontSaveSensitive.
Using this method, I am able to execute any package created on any machine with the default protection level.
Can anyone tell me the negative aspects of this approach?
Thanks
I am wondering something: once we've created a job that executes a package at a given time interval, does that package get recompiled each time the job spins up and executes the package? Or is the package compiled once, and then that compiled code is executed on each run after the first?
What I'm seeing is this: I have a package that reads data from flat text files and then dumps that data into the database. The package takes 3 minutes to execute when running on a single file, but when it's looping through ~50 files it takes ~30 minutes to execute, that is, less than a minute per file. Why is this?
Hopefully I'm just forgetting something and not setting a checkbox or radio button somewhere. The job is set up as an SSIS job, not as a command line job.
Thanks in advance for any help you can give me.
Wayne E. Pfeffer
Sr. Systems Analyst
Hutchinson Technology Inc.
I am currently experiencing a 30 second delay when starting an SSIS package from a query window or stored procedure in SQL 2005 Management Studio, using xp_cmdshell and dtexec.
When I run the package in BI Dev the execution results state an elapsed time of 4.82 sec; at a command prompt using dtexec the elapsed time is 3.48 sec; from MStudio the elapsed time is 33.86 sec. This test was run using the same configuration and databases. For the MStudio run, if I look at the DTS log file I'm creating or the PC Application log, it states the package doesn't actually start until 31 sec after the execute button is pressed. I've tried executing the package as both a SQL package and a file package without any difference in elapsed times. I have also set DelayValidation = True for every Task, ConnectionManager and the package itself.
When I look at the package log, one difference I see is that Management Studio executes using 'NT AUTHORITY\SYSTEM'; BI Dev and the cmd prompt use the local user '[Server]\Administrator', which in this case is the administrator. From this I have to believe it is some kind of user rights problem. I think SQL or the OS is waiting for something and, after it times out at 30 sec, it allows the package to run. If this is the case, I'm not sure what it might be or how to find it.
I also tried making an xp_cmdshell_proxy_account with admin rights, but this didn't seem to work either. I've included the query code below. Any ideas, help or solutions are greatly appreciated.
DECLARE @cmd varchar(250)
DECLARE @Result INT
SET @Result = 0
--SET @cmd = 'dtexec /F "C: empDP2000 DataTransfer.dtsx" /DE ttalg /REP EW'
SET @cmd = 'dtexec /SQL "DP2000 DataTransfer" /DE ttalg /REP EW'
EXEC @Result = xp_cmdshell @cmd
SELECT @result
While executing the SSIS package from Visual Studio, it runs. If we execute it from Integration Services --> Stored Packages --> MSDB --> package name, the package gets executed.
But when scheduled through jobs it gives the following error in history
"Execution as user. <user name > The command line parameters are invalid. the step failed"
command line looks like this "
/DTS "MSDBMaintenance PlansPackage1-HYUNDAI" /SERVER tvmwindev02 /CONNECTION "10.10.1.52.upsframis";"Data Source=10.10.1.52;Initial Catalog=upsframis;Provider=SQLOLEDB.1;Integrated Security=SSPI;Auto Translate=False;" /CONNECTION DestinationConnectionOLEDB;"Data Source=mscoe.db.ustri.com;Initial Catalog=HISNA_POC;Provider=SQLOLEDB;Integrated Security=SSPI;Auto Translate=false;" /CONNECTION "sample.db2test";"Data Source=sample;User ID=db2test;Provider=IBMDADB2.1;Location=10.10.1.55;" /CONNECTION SourceConnectionOLEDB;"Data Source=SAMPLE;User ID=db2test;Provider=IBMDADB2.1;Persist Security Info=True;Location=10.10.1.55;Extended Properties="""";" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
"
What could be the reason?
Deleep.P
Hi,
I have an SSIS package running through my SQL Server 2005 scheduled job every hour. It is a simple package to pull some data from a table and transfer it into a text file. It had been running smoothly for a long time, but today it is not executing through my job and is giving me the following error:
Executed as user: GYRODATAGyroDBA. ...n 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 1:13:27 PM Error: 2008-04-10 13:13:28.18 Code: 0xC001401E Source: Package1 Connection manager "DestinationConnectionFlatFile" Description: The file name "\Gyrow2kefw001PURCHASESPOOLsupplier.txt" specified in the connection was not valid. End Error Error: 2008-04-10 13:13:28.18 Code: 0xC001401D Source: Package1 Description: Connection "DestinationConnectionFlatFile" failed validation. End Error Progress: 2008-04-10 13:13:28.20 Source: Data Flow Task Validating: 0% complete End Progress Progress: 2008-04-10 13:13:28.34 Source: Data Flow Task Validating: 50% complete End Progress Progress: 2008-04-10 13:13:28.34 Source: Data Flow Task Validating: 100% complete End Progress DTExec: The package execution returned DTSER_FAILURE (1). Started: 1:13:27 PM Finis... The package execution fa... The step failed.
If I try to execute the same package through Business Intelligence Development Studio, it works fine.
Does anyone have any idea what the problem is?
masroor
Hello everybody!
I have a question - is it possible to visualize the execution of an SSIS package when it is being run from SQL Agent? "Visualize" means showing the data flow "live" - similar to the visualization provided by BI Dev.Studio when you run a package there, with coloured boxes, blinking, etc.
I searched the Web but found nothing - neither MS-related nor utilities from third parties. Is it possible in any way?
Thanks in advance,
Andrey.
P.S. Parsing log files is an option, but we would like to first try something less "painful" and more universal...
Hello,
I have two packages which have a parent-child relationship.
Package1 calls Package2; Package2 downloads the input files from a remote server using the COZYROC SFTP Task, then Package1 continues.
It works fine in BIDS and as a SQL Agent job on the "DEV" server, but it does not work when I deployed the packages and their config files and created a SQL Agent job on the "QA" server.
The Error is:
Description:
The connection type "SSH" specified for connection manager "LG-AUS" is not recognized as a valid connectionmanager type.
This error is returned when an attempt is made to create a connection manager for an unknown connect
ion type. Check the spelling in the connection type name.
End Error
Error: 2008-04-23 05:33:57.26
Code: 0xC0010018
Source:
Description:
Error loading value "<DTS:ConnectionManager xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="DelayValidation">0</DTS:Property><DTS:Property DTS:Name="ObjectName">SFTP-CMS</DTS:Property><DTS:Property DTS:Name="DTSID">{49D115FA-B208-4BFC-928D-7CC0964E743A}</DT" from node "DTS:ConnectionManager".
End Error
Error: 2008-04-23 05:33:57.29
Code: 0xC00220DE
Source: EPT Calling LG_Inbound
Description:
Error 0xC0010014 while loading package file "C:QATestLG-SFTPInbound.dtsx". One or more errors occurred. There should be more specific errors preceding this one that explain the details of the errors. This message is used as a return value from functions that encounter errors.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 5:33:55 AM
Finished: 5:33:57 AM
Elapsed: 1.359 seconds
Please give the solution ASAP.
Thanks
Thiru
Hi guys,
I am basically from the COGNOS domain, but now I am trying to learn SQL Server 2005.
Can anyone tell me how to automate or schedule a package that has been created in SSIS?
The package contains a data flow task.
I have gotten stuck on this.
Thanks in advance
Lalitha
Keysoft Solutions
Hello,
I want to know the detailed execution flow of an SSIS package (e.g. validation -> expression evaluation -> execution, etc.).
Where can I get detailed information? Any references (links)?
Thanks in advance.
-Omkar.
I would like to know how long my package should take to run if I have 2 million records to be inserted into my database.
It's taking hours (around 5 hours or more)... is that the way it is? I used OLE DB destinations.
I am transferring from a text file to a SQL Server database.
And one more thing I want to know:
Is it better to have several data flows, where in one I check whether the incoming records are valid, and have my insert or update logic in the other?
Will this increase the speed?
Hi everybody,
I currently need help with an issue for which I can't find the solution.
I developed a package, and when I run it directly from Integration Services it runs and ends correctly: it deletes/creates tables in the db on the server the package is running on and copies the data from the other db.
I should mention that I use two DBs in different domains as source and destination.
When I schedule it using the SQL scheduler, it fails, reporting the following error:
SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DBname" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Can anyone of you give me a clue about the issue I have?
Thanks!
Hello friends!
I have a question regarding execution of an SSIS package through a stored procedure.
I have created an SSIS package which extracts data from a database and puts it into various text files. Here I am using two global variables: one is the department ID and the other is the path where I want to place my text files per department. When I run it through the command prompt it works fine, but now I want that dtsx package to run from a stored procedure with the same input parameters.
When I searched online I found this solution:
Declare @FilePath varchar(2000)
Declare @Filename varchar(1000)
Declare @cmd varchar(2000)
set @FilePath = 'C:setupSSIS PackagesSSIS Package File Extract DataSSIS Package File Extract Datain'
set @Filename = 'DataExtract.dtsx'
select @cmd = 'DTExec /F "' + @FilePath + @Filename + '"'
print @cmd
exec master..xp_cmdshell @cmd
But when I execute it I get errors like:
Source: {8A27E8DF-051B-4F6B-9538-85BED1F161D8}
Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted.
End Error
Error: 2007-02-22 11:31:37.32
Code: 0xC0011002
Source: {8A27E8DF-051B-4F6B-9538-85BED1F161D8}
Description: Failed to open package file "C:setupSSIS PackagesSSIS Package File Extract DataSSIS Package File Extract DatainDataExtract.dtsx" due to error 0x80070003 "The system cannot find the path specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
End Error
Could not load package "C:setupSSIS PackagesSSIS Package File Extract DataSSIS Package File Extract DatainDataExtract.dtsx" because of error 0xC0011002.
Description: Failed to open package file "C:setupSSIS PackagesSSIS Package File Extract DataSSIS Package File Extract DatainDataExtract.dtsx" due to error 0x80070003 "The system cannot find the path specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
Also, I do not understand where I should pass the two input parameters which I use as SSIS package variables.
Please help me out
Thanks
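On the question of passing the two package variables: dtexec accepts /SET overrides that can be appended to the same command string. A hedged sketch building on the snippet above, where the variable names User::DepartmentID and User::FilePath and all of the values are placeholders for whatever the package actually declares; the exact /SET property-path form is worth confirming against the dtexec documentation.
Declare @FilePath varchar(2000)
Declare @Filename varchar(1000)
Declare @DeptID varchar(10)
Declare @OutPath varchar(500)
Declare @cmd varchar(4000)
set @FilePath = 'C:\Packages\'          -- placeholder folder
set @Filename = 'DataExtract.dtsx'
set @DeptID = '42'                      -- placeholder value
set @OutPath = 'D:\Extracts\Dept42'     -- placeholder value
select @cmd = 'DTExec /F "' + @FilePath + @Filename + '"'
+ ' /SET \Package.Variables[User::DepartmentID].Value;' + @DeptID
+ ' /SET "\Package.Variables[User::FilePath].Value;' + @OutPath + '"'
print @cmd
exec master..xp_cmdshell @cmd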
Hi,
I am using SSIS packages for transferring data from a legacy SQL Server to a new one.
The database has around 700 Tables. Out of which, data from 20 tables need to be transfered.
I have created 20 data flows for that.
The SSIS package is executed using a utility program that I've written in C#. I capture all the events and use them for some post-processing. I look at the status of each of the data flows (whether failed, complete, etc.) and take decisions appropriately.
There should be a provision built into the application (my utility) that allows users to suspend the execution of the package and restart it later.
I cannot use the package's suspend function here, as suspend functionality in my case means that the user could even reboot the computer. He may then restart the execution of the package from the last logical point.
My initial thinking was that I would (somehow) store the number of records transferred to the target for a data flow, and later, when the user restarts the execution, I would change the SQL query in the data flow to start from the records beyond the ones I've saved.
But how do I get the row number? T-SQL supports ROW_NUMBER(), but it seems that it can't be used in the WHERE clause.
Does anyone know how to deal with this problem?
Is it possible to save the current execution state of a package in a file and restart it at a later point (even after a reboot)?
Cheers
Deep GZ.
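On the row-number question specifically: ROW_NUMBER() cannot be referenced directly in a WHERE clause, but it can be filtered from a CTE or derived table, which is enough to resume a transfer after the last row already copied. A minimal sketch, assuming a hypothetical source table dbo.SourceTable with a stable ordering key OrderKey and a counter @LastRow persisted by the utility:
DECLARE @LastRow BIGINT
SET @LastRow = 150000   -- placeholder: value persisted by the utility
;WITH numbered AS
(
SELECT ROW_NUMBER() OVER (ORDER BY OrderKey) AS rn, *
FROM dbo.SourceTable
)
SELECT *
FROM numbered
WHERE rn > @LastRow
ORDER BY rn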
Hello
I am trying to call the same package with different starting parameters using asynchronous method calls.
My code is
using System.Windows.Forms;
using Microsoft.SqlServer.Dts.Runtime;

namespace ReconHost
{
    public partial class HostContainer : Form
    {
        public delegate void InvokePackageHandler(
            Microsoft.SqlServer.Dts.Runtime.Application app,
            int var);

        Microsoft.SqlServer.Dts.Runtime.Application app;

        public HostContainer()
        {
            InitializeComponent();

            for (int i = 0; i < 2; i++)
            {
                app = new Microsoft.SqlServer.Dts.Runtime.Application();
                InvokePackageHandler asyncInvokePackageHandler = InvokePackage;
                asyncInvokePackageHandler.BeginInvoke(app, i, null, null);
            }
        }

        public void InvokePackage(Microsoft.SqlServer.Dts.Runtime.Application app, int var)
        {
            Package pkg = app.LoadPackage(@"c:GRSGRSGRSTest_Async.dtsx", null);
            pkg.Variables["intVar"].Value = var;
            pkg.Execute();
            pkg = null;
        }
    }
}
If I set the for loop to loop just once, the package is executed fine. Any more than once and only one instance of the package is executed.
Does anyone know what I am doing wrong? I know it is possible to execute the two package instances simultaneously because you can use DTexecUI.exe on two clients to do this.
Thanks in advance
Tomo
Hi,
I have a hierarchy of packages.
There are three levels.
The first package (say L1) calls two child packages (say L2_1 and L2_2).
The parameters are passed to L2_1 from L1, and it executes completely and successfully.
In the next step, L1 calls L2_2 without any parameters, and the package fails with an error saying that execution of L2_2 failed.
When I execute L2_2 individually, it executes successfully.
L2_2 itself calls 5 other packages (at level 3).
I am not able to identify the possible reason for the failure.
Hi,
I have a DTS package migrated from SQL Server 2000 to a SQL Server 2005 clustered server using the Migration Wizard without any problem.
I have created a new job on SQL Server 2005 and one of the steps involves executing the SSIS package. It keeps failing with the error message "package execution failed". I have logged in here as a domain administrator (and also as a local administrator).
I followed Article ID 918760 but it did not help.
I need this to be resolved asap.
Any quick help, much appreciated.
Thx
Murali
When I view the execution/progress results for a package, they show in order of the name of the task. Is there any way to see this tree view in order of execution?
Thanks,
Casey Smith
MCT
Hi there,
I have been trying to schedule a package I designed to run off-hours, but have been unable to do so. Here is the strange issue:
1. I was able to fully run and complete the package through VSS. My package has three steps and I have on-completion arrows from one step to the next... When I run it, it does what I expect in the SSIS designer.
2. I was able to deploy and run the actual package by double-clicking it on the file system, and it runs successfully through the Execute Package Utility.
So my package loops through a file folder and inserts records from within the folder, through an Execute SQL task, into a SQL table, etc. Typically it takes a while to run.
However, when I put the package under SQL Agent and run it as a job (by setting up the SQL job to call a package on the file system), it runs and completes within a few seconds... Somehow it didn't give me an error, but it doesn't seem like it actually runs the package.
Does the execution of the package somehow send a success message back prematurely to the SQL job? Why does it not run successfully?
Is there any configuration setting that I am missing here? How can I find out whether it actually runs the package, and why it returns so quickly and claims success, yet didn't do anything?
Please help! thanks,
Jon