Integration Services :: How To Execute Multiple Packages In Parallel
Oct 14, 2008
I'm pretty new to SSIS but I've managed to cobble together a number of individual packages to refresh SQL tables from a 3rd-party database.
Now, what I'd like to do is have a single package that I can use to invoke each of the individual ones. Since it will run on a quad-core machine, I'd like to invoke them so that they run in parallel.
We are building a data load application where parameters are stored in a table, and there are multiple packages for each load. There is an IsChecked column; only when it is 1 should the child package execute. I created a master package in which I have an Execute SQL Task that stores its result in a variable, and based on that result the child package should execute. In the Execute SQL Task I set the result set to Full result set, and I am getting the error below.
[Execute SQL Task] Error: Executing the query "SELECT isnull(ID ,0) AS ID FROM DataLoadParameter..." failed with the following error: "The type of the value (DBNull) being assigned to variable "User::LoadValue" differs from the current variable type (Int32). Variables may not change type during execution. Variable types are strict, except for variables of type Object.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
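One thing worth noting: a Full result set has to be captured in an Object-typed variable rather than an Int32 one such as User::LoadValue, and the rows can then be read inside a Script Task, guarding against NULLs. A minimal sketch of that Script Task's Main(), assuming the result set is mapped to an Object variable I have called User::LoadResultSet (all names are illustrative only, not the poster's actual design):

```csharp
using System;
using System.Data;
using System.Data.OleDb;

public void Main()
{
    // The Full result set lands in the Object variable as an ADO recordset;
    // OleDbDataAdapter.Fill shreds it into a DataTable we can loop over.
    var table = new DataTable();
    new OleDbDataAdapter().Fill(table, Dts.Variables["User::LoadResultSet"].Value);

    bool fireAgain = false;
    foreach (DataRow row in table.Rows)
    {
        // NULL IDs (the DBNull that caused the original error) are skipped here.
        if (row["ID"] == DBNull.Value) continue;

        int id = Convert.ToInt32(row["ID"]);
        Dts.Events.FireInformation(0, "Loader", "Will run child package for ID " + id, "", 0, ref fireAgain);
        // ... set a flag variable here so the Execute Package Task runs for this id ...
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

Alternatively, with ResultSet set to Single row, the query should be written so it always returns a non-NULL integer (for example ISNULL(MAX(ID), 0)) before being mapped to the Int32 variable.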
I have a stored procedure. It can be executed like this:
exec test @p = 1; exec test @p = 2; ... exec test @p = n; -- n can be in the hundreds.
I want the stored procedure executed in parallel, not in sequence; that is, the calls above should all run at the same time. If I knew the number in advance, say 3, I could create 3 different Execute SQL Tasks and they would run in parallel.
However, n is not static. It comes from a table.
How can I execute stored procedures in PARALLEL and DYNAMICALLY?
I am thinking about using a Script Task. In the script I would get the value of n and the list of p values from the table, then run a loop. In the loop I would create a thread, and in the thread execute the stored procedure: exec test @p = p. That way the exec test calls might run in parallel, but I am not sure whether it works.
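A minimal sketch of that Script Task idea, assuming a Script Task targeting .NET 4 or later so Parallel.ForEach is available (an SSIS 2008 script task would need explicit Thread objects instead), and assuming a hypothetical dbo.ParameterList table plus a User::ConnString variable holding the connection string:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;

public void Main()
{
    string connStr = Dts.Variables["User::ConnString"].Value.ToString();
    var parameters = new List<int>();

    // Read the list of @p values from the (hypothetical) parameter table.
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("SELECT p FROM dbo.ParameterList", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
            while (reader.Read())
                parameters.Add(reader.GetInt32(0));
    }

    // One connection per worker; each call runs exec test @p = <value> concurrently.
    Parallel.ForEach(parameters, p =>
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("exec dbo.test @p = @p", conn))
        {
            cmd.Parameters.AddWithValue("@p", p);
            cmd.CommandTimeout = 0;   // the procedure may run for a long time
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    });

    Dts.TaskResult = (int)ScriptResults.Success;
}
```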
We manage some SSIS servers that have only SSIS and the SSIS tools installed on them, not the SQL Server database engine.
SSIS packages and configuration files are deployed on a NAS. We run the SSIS packages through DTEXEC by logging in to the server.
We want to allow developers to run their packages on the server themselves, but at the same time we don't want to give them direct access to the server, i.e. we do not want to add them to the Remote Desktop Users list in the server properties. We want to allow them to run their packages on the server remotely.
One way we could think of is PowerShell remoting, and we are working on that, but is there any other way, or an existing tool, for this?
I have created a Foreach Loop container to call all the packages in a folder, as below, and also created a variable.
Then I added an Execute Package Task inside the Foreach Loop container, selected File System as the location, and in the connection pointed at the package name currently being enumerated. Finally, in the connection's properties I added an expression using the variable I created and mapped in the Foreach Loop container. I referred to the link below.
[URL] ....
All the packages run, but the loop does not end: once all the packages have executed, it reruns them and keeps the process running. How do I stop it once all the packages have executed?
I have 5-6 tables and want to check whether I'm using any of them in my packages, and I have 100 packages. What is the shortest way to search for these tables across the packages?
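Since .dtsx files are plain XML, one low-tech option is to scan the deployment folder for the table names. A rough console sketch, with the folder path and table names as placeholders:

```csharp
using System;
using System.IO;

class FindTableReferences
{
    static void Main()
    {
        string packageFolder = @"\\server\ssis\Packages";        // assumed deployment folder
        string[] tables = { "dbo.Customer", "dbo.Orders" };       // tables to look for

        foreach (string file in Directory.GetFiles(packageFolder, "*.dtsx", SearchOption.AllDirectories))
        {
            string xml = File.ReadAllText(file);
            foreach (string table in tables)
                if (xml.IndexOf(table, StringComparison.OrdinalIgnoreCase) >= 0)
                    Console.WriteLine("{0} references {1}", Path.GetFileName(file), table);
        }
    }
}
```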
I have a MASTER package that runs multiple packages using Execute Package Tasks in SSIS. When I run this MASTER package from a SQL Server Agent job in SSMS, I get the error below.
What are the steps to run this, and what information am I missing to run the master package?
ERROR: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state". You may not be authorized to access this information.
This error occurs when there is a cryptographic error. Verify that the correct key is available.
After hitting limitations in the SQL CLR world that bar us from invoking COM objects, we are forced to use Windows services to read the messages off the Service Broker queues. Unfortunately we lose the auto-activation feature of the queues, but we can still read messages and perform the SQL work under one transaction.
We are going to attempt to take N messages simultaneously from the queue, through N instances of a Windows service. If the messages sent to the queue are one message per conversation, will we be able to have N readers take messages off simultaneously?
Thank you very much,
Lubomir
P.S. If anyone has a better approach to obtaining the messages in "out of SQL" code, or to invoking external code libraries (not assemblies stored in SQL Server), that would be extremely nice to hear. I have thought about invoking a web service through the CLR, but that is probably too much overhead; MSMQ seems much more appealing than a web service.
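For what it's worth, a single reader instance can be as simple as the loop below. Service Broker locks by conversation group, so with one message per conversation, N copies of this loop should be able to RECEIVE concurrently without blocking each other. This is only a sketch: the connection string and queue name are placeholders, and error handling is omitted.

```csharp
using System.Data.SqlClient;

class QueueReader
{
    // Assumed connection string and queue name; adjust to the real broker database.
    const string ConnStr = "Server=.;Database=BrokerDb;Integrated Security=SSPI";

    static void Main()
    {
        while (true)
        {
            using (var conn = new SqlConnection(ConnStr))
            {
                conn.Open();
                using (SqlTransaction tx = conn.BeginTransaction())
                {
                    byte[] body = null;

                    // Block for up to 5 seconds waiting for the next message.
                    using (var cmd = new SqlCommand(
                        @"WAITFOR (RECEIVE TOP (1) conversation_handle, message_type_name, message_body
                           FROM dbo.TargetQueue), TIMEOUT 5000;", conn, tx))
                    using (SqlDataReader rdr = cmd.ExecuteReader())
                    {
                        if (rdr.Read() && !rdr.IsDBNull(2))
                            body = (byte[])rdr[2];
                    }

                    if (body != null)
                    {
                        // ... perform the SQL work for this message on the same connection
                        //     and transaction, so the RECEIVE and the work commit together ...
                    }

                    tx.Commit();
                }
            }
        }
    }
}
```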
I have a scheduled job that errors out on the first step when I attempt to run it. There are 3 packages that I created, and I can run them all manually just fine.
I get this error when I try to run the job: -----------------------------------------------
An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80... The package execution fa... The step failed.,00:00:01,0,0,,,,0
I have 200-plus packages that need to be flexible in how they are run. For example, one end user may choose to run packages 1, 2, 3 and the next end user may choose to run packages 2, 3, 7, etc. Prior to running a package, I set an "instance id" inside the group of packages so I can tie them all together in the log file: I know that packages 1, 2, 3 were all run as a group, and that's distinct from packages 2, 3, 7 that were run in a different group.
Initially I embarked on a scenario where I had a queue table that loaded up the packages to be run, and then a little C# app that read the queue, generated the "instance id" and ran all the packages (either through dtexec.exe or Microsoft.SqlServer.Dts.Runtime). But now I wonder whether using a master package with the Execute Package Task is the way to go. My 200+ packages are all independent and run from a single config file, and it seems as though going the parent-package route will destroy some of that independence, because I'll then be relying on parent package variables.
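For reference, the console-app route described above can stay quite small. A hedged sketch, assuming packages on disk and a User::InstanceId variable declared in each package for logging (names and paths are placeholders):

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunPackageGroup
{
    static void Main(string[] args)
    {
        string instanceId = Guid.NewGuid().ToString();   // ties the group together in the log file
        var app = new Application();

        foreach (string packagePath in args)              // e.g. the .dtsx paths for packages 1, 2, 3
        {
            Package pkg = app.LoadPackage(packagePath, null);

            // Assumes each package declares a User::InstanceId variable.
            if (pkg.Variables.Contains("InstanceId"))
                pkg.Variables["InstanceId"].Value = instanceId;

            DTSExecResult result = pkg.Execute();
            Console.WriteLine("{0}: {1}", packagePath, result);
        }
    }
}
```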
Need some help... When we tried to run multiple packages one after the other from a Windows service, the first one succeeds but the later ones throw the error below: "The script threw an exception: The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there. A deadlock was detected while trying to lock variables "System::PackageName, User::BusinessDate, User::Environment, User::PortfolioName" for read access. A lock could not be acquired after 16 attempts and timed out."
Later, we tried to create separate AppDomains for each package and execute via a console application, but ended up with the error below (the expressions in question are defined on the OnError event): "The result of the expression "@[User::ReportErrorFrom]" on property "FromLine" cannot be written to the property. The expression was evaluated, but cannot be set on the property. The result of the expression ""Error At :" + @[System::SourceName] + "" + "Error Description : " + @[System::ErrorDescription] + "" " on property "MessageSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property."
Finally, we tried to spawn a separate process (System.Diagnostics.Process) for each package. This seems to work but takes a very long time: a package that normally takes 2 minutes is taking 60 minutes.
We also tried creating an SSIS package that executes the multiple packages, but only the first package gets executed, and the second one throws the error below (the variable it is trying to lock belongs to the first package): "Failed to lock variable "UniqueInstrumentsQuery1" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".
Please help us with some workaround for this. Thanking you in advance,
I have a fuzzy lookup in an Integration Services package that does not seem to run. I am pulling data from a table in SQL Server 2008 R2 and comparing the results to data from another table in SQL Server (same database and instance), using a fuzzy lookup to find match similarities between the data sets. When my data flow task reaches the fuzzy lookup, a DOS box pops up for a second and then my package finishes with a message of "Finished. Cancelled". The last message in my execution results is: "Information: Execute phase is beginning". Again, there are no Excel files being processed or used in this package. I've tried running my package in both 32-bit and 64-bit mode.
What is the best way of documenting SSIS packages? I am not interested in purchasing any software; just a general template that covers the key information that should be defined and documented for any SSIS package.
I am currently moving everything from SQL Server 2005 SP2 to SQL Server 2012. I have a method for getting users, logins, roles and SQL jobs, but I also have to copy all of the SSIS packages from 2005 to 2012. I know I can go to the 2012 SQL Server, click on the MSDB folder and choose Import; however, this only lets me import one package at a time, and I have 95 packages. Is there a way to get them all from the 2005 SQL Server to the 2012 SQL Server in one shot? I am not a SQL developer nor am I a DBA, but I have been assigned this task.
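One option worth checking is a small program against the SSIS runtime API that walks the source msdb store and saves each package to the target server (dtutil /COPY can do much the same, one package at a time, from a batch file). A hedged sketch: server names are placeholders, Windows authentication is assumed (null user/password), and only the root msdb folder is handled, so subfolders would need recursion.

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;

class CopyMsdbPackages
{
    static void Main()
    {
        const string sourceServer = "OldSql2005";   // assumed source instance
        const string targetServer = "NewSql2012";   // assumed target instance
        var app = new Application();

        // Enumerate packages stored in the root msdb folder on the source server.
        foreach (PackageInfo info in app.GetPackageInfos(@"\", sourceServer, null, null))
        {
            if (info.Flags != DTSPackageInfoFlags.Package)
                continue;                            // skip subfolders in this simple sketch

            Package pkg = app.LoadFromSqlServer(@"\" + info.Name, sourceServer, null, null, null);
            app.SaveToSqlServer(pkg, null, targetServer, null, null);
            Console.WriteLine("Copied " + info.Name);
        }
    }
}
```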
I am creating an SSIS package with a Conditional Split. The condition is SUBSTRING(EnglishProductName,1,1) == "A". The package executes successfully, but data does not move from the Conditional Split transformation to the OLE DB destinations. It is not showing any error.
I have a requirement where I have around 15 different flat files; the filenames are fixed but the folder path can change (I think I should use a variable for the folder path). The data from these 15 files should go to their respective tables in the database.
Do I need to create a separate data flow task for each file, or a separate package? In addition, for example: while importing product data into the product table, if a product ID already exists we need to ignore it and load only the new records.
I have around 500 packages (SQL 2005) deployed to the file system, and all these packages run on a daily basis via SQL Agent. Now I need to migrate all 500 packages to SQL Server 2008 R2.
There is no inventory to track which package belongs to which team, or any other information.
So I need a way to query each package's connection string details, along with the respective database. Is there any method?
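A quick way to build that inventory is to load each .dtsx with the SSIS runtime API and dump its connection managers. A sketch, with the folder path as a placeholder; packages protected with a password would need Application.PackagePassword set first.

```csharp
using System;
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;

class ConnectionInventory
{
    static void Main()
    {
        var app = new Application();

        foreach (string file in Directory.GetFiles(@"D:\SSIS\Packages", "*.dtsx"))
        {
            Package pkg = app.LoadPackage(file, null);

            // One line per connection manager: package, name, connection string.
            foreach (ConnectionManager cm in pkg.Connections)
                Console.WriteLine("{0}\t{1}\t{2}", Path.GetFileName(file), cm.Name, cm.ConnectionString);
        }
    }
}
```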
Would any expert here please give me guidance about which data mining tasks can be automated and scheduled via Integration Services packages? Also, if we automate the tasks, can we automatically save the results of the tasks somewhere? For example, if we automate assessing the accuracy of a mining model, we will want to know that accuracy later, so we need to save the results of the automated runs. Is it possible to do this?
Thanks a lot in advance for any guidance and help for this.
I have several packages within sequence containers inside one main dtsx package with a checkpoint configuration, and when I run it some succeed and some don't. The problem is that when I rerun it, the checkpoint doesn't seem to work, because some of the successful packages are rerun as well (and not skipped as they should be). In other words, the process does not resume at the point of failure.
It seems that packages that finish after the failure point (and succeed) are not registered in the checkpoint file, so when I rerun the main package these successful packages are rerun too.
I have seen a number of posts regarding parallel development of SSIS packages and need some further information.
So far we have been developing SSIS packages along a single development stream and therefore have managed to avoid parallel development of our packages.
However, due to business pressures we will soon have multiple project streams running in parallel, and therefore multiple code branches; as part of that we will definitely need to work on the same SSIS packages in parallel. Judging from your post above and some testing we have done, this is going to be a nightmare, as we cannot merge the code. We can put processes in place to try to mitigate this, but there are bound to be issues along the way.
Do you know whether this problem is going to be fixed? We are now using Team Foundation Server, but presumably the merge algorithm used is the same as or similar to that of VSS, and therefore very flaky?
However, not only are we having problems with merging the XML files, we also use Script Tasks within the packages, which are precompiled: the DTSX files contain the binary objects associated with the script source code. If two developers change the same Script Task in isolated branches, the binary is not recompiled, because the merge software does not recognise this object.
Do you know whether these issues have been identified and are going to be fixed to be in line with the rest of Microsoft's configuration management principles for parallel development?
I am facing some problems while using For Loop containers to execute 7-10 packages in parallel.
The main package has 7 For Loop containers, say F1-F7.
Each For Loop container has 2 tasks:
T1 ==> execute child package C1
T2 ==> execute delay task Delay1.
The idea is to run child packages C1-C7 in parallel, delay for some time, and then run them again, since they are inside the For Loop containers.
I am facing some problems:
1. The execution of tasks T1-T7 is not guaranteed. SSIS picks any 6 of T1-T7 at random to start with; 6 is the maximum it processes, whereas I have more than that. Can I change this setting?
2. It is not guaranteed that if, say, task T1 of For Loop F1 is executed, the subsequent Delay task within that For Loop will be executed next. Typically what happens is that it starts with T1-T6 (T7 on hold), then executes the delay for T1-T5 and passes control to T7 without going into the delay for T6. This is not the intended execution.
What I want is: execute T1-T7, delay until the next execution, and start again.
Is there a way to change data stored in an image data type column? I want to make a change to some deployed SQL 2008 SSIS packages. I have a T-SQL SELECT that searches the packages for a string, but I would also like to be able to change a string. I have googled it but cannot find anything.
This is for SSIS 2008 R2. I am generating flat files successfully with a datetime stamp (filename_yyyymmddhhmm). Now I need to append a MAX(FILEDATE) value from the file instead. I have a query to do this, but am not sure about two things:
1) Is it advisable to put the query in a Script Task (with a database connection and so forth), or is it better to put it in an Execute SQL Task? I am thinking the latter but am not 100% sure.
2) How would I pass the result of this query (yyyymm) into the filename? The filename format would be filename_yyyymm. I am assuming that I would need to pass the result into a variable/expression, but am not quite sure of the steps involved.
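For what it's worth, a sketch of option 1: a Script Task Main() that runs the MAX(FILEDATE) query and puts the yyyymm value into a string variable. The variable, table, and column names are assumptions, and the query assumes at least one non-NULL FILEDATE row.

```csharp
using System.Data.SqlClient;

public void Main()
{
    string connStr = Dts.Variables["User::ConnString"].Value.ToString();

    using (var conn = new SqlConnection(connStr))
    // Style 112 gives yyyymmdd; char(6) truncates that to yyyymm.
    using (var cmd = new SqlCommand("SELECT CONVERT(char(6), MAX(FILEDATE), 112) FROM dbo.SourceFile", conn))
    {
        conn.Open();
        // e.g. "201401", stored for use in the filename expression.
        Dts.Variables["User::MaxFileDate"].Value = (string)cmd.ExecuteScalar();
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The Execute SQL Task route works the same way: set ResultSet to Single row, map the value to the same string variable, and then put an expression such as @[User::Folder] + "filename_" + @[User::MaxFileDate] + ".txt" on the flat file connection manager's ConnectionString property.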
I have an SSIS package which calls a command line app. When run in BIDS, it executes normally: the command line app is passed the arguments and does what it needs to do. When called as a SQL Agent job (by the agent, or by me) it fails when calling the app, giving an exit code of 2 (which is an exception trapped by a try-catch). The SQL Agent service is running under my user (it's a test environment). The argument passed (from the log) is valid, and when I run it against the app directly it produces the appropriate output. I can't for the life of me figure out what's going wrong. The app is passed a path and a password as arguments, and applies the password to the file using interop.
How do I execute an SSIS package from the command prompt, passing a configuration file to it and setting logging to the SSIS log provider for SQL Server, writing all those options on the command line?
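A hedged sketch of the dtexec call, shown here wrapped in a small console program (the Arguments string is exactly what would follow dtexec at a prompt): /F points at the package file, /ConfigFile at the configuration file, and /L adds a log provider. The DTS.LogProviderSQLServer ProgID and the connection-manager name after the semicolon are assumptions to verify against the dtexec documentation; the paths are placeholders.

```csharp
using System;
using System.Diagnostics;

class RunWithDtexec
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            // dtexec must be on the PATH (or use its full path).
            FileName = "dtexec",
            Arguments = "/F \"D:\\SSIS\\LoadCustomers.dtsx\" " +
                        "/ConfigFile \"D:\\SSIS\\LoadCustomers.dtsConfig\" " +
                        "/L \"DTS.LogProviderSQLServer;LoggingDbConnection\"",
            UseShellExecute = false
        };

        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
            Console.WriteLine("dtexec exit code: " + p.ExitCode);
        }
    }
}
```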