I have two data flow tasks: one validates the import file and one processes the import file if the validation passed. The validation runs and pushes the three row types to three different Recordset destinations. When I enter the processing data flow task, I have three parallel trees, each processing one of the recordsets saved in the previous task. I'm using a Script Component to generate the rows, which are then sorted and merged with the production database to find existing records. Based on this, I split to either an OLE DB Command (running an UPDATE command) or an OLE DB Destination (to simply insert the records).
In this particular case, all records are being updated and nothing new is being inserted. Two of the three trees complete the sort but hang on the merge, split, and OLE DB Command components. The other does the same but also hangs on the split.
In another case, I truncated each destination table before running the package and the package runs fine.
Are toes being stepped on in the data flow task and causing a deadlock?
Update: I removed the sort transformation and sorted the rows before pushing them to the recordsets and I still get the same results.
I have done a search and read some of the posts, but I am left more confused than before. I am fairly new to SSIS. Here is my situation and what I am trying to accomplish.
I have a package with a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails and I re-run it, it runs the failed task as well as all the other tasks. What I am looking to accomplish is for the re-run to execute only the SQL tasks that failed, and not the previously successful tasks.
I think the best way would be to disable tasks on successful completion, writing the name of each completed SQL task to a temp table, but I am skeptical.
Can anyone point me in a direction that will help me accomplish this, please?
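A minimal sketch of the tracking-table pattern being described, assuming a hypothetical dbo.CompletedTasks table and hypothetical step names:

    using System.Data.SqlClient;

    class RestartableSteps
    {
        static void RunStep(SqlConnection conn, string stepName, string stepSql)
        {
            // Skip the step if a previous run already recorded it as complete.
            using (var check = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.CompletedTasks WHERE TaskName = @name", conn))
            {
                check.Parameters.AddWithValue("@name", stepName);
                if ((int)check.ExecuteScalar() > 0) return;
            }

            using (var work = new SqlCommand(stepSql, conn))
                work.ExecuteNonQuery();

            // Record success so a re-run will skip this step.
            using (var mark = new SqlCommand(
                "INSERT INTO dbo.CompletedTasks (TaskName) VALUES (@name)", conn))
            {
                mark.Parameters.AddWithValue("@name", stepName);
                mark.ExecuteNonQuery();
            }
        }
    }

Within SSIS itself, the same idea is usually wired up with an Execute SQL Task that reads the tracking table into variables, plus expressions on the precedence constraints that skip already-completed tasks.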
I have a package that does some file transformation (Text, XML, and Excel) based on a variable value. This package is called by a parent package, which calls it in parallel through Script Tasks. So there are three parallel Script Tasks, and all variables are local to each Script Task.
In the Script Task I am assigning a value to the child package variable using the following code:
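A minimal sketch of that assignment, shown in C# for illustration (SSIS 2005 Script Tasks use VB.NET, but the object model calls are the same; the package path and variable name are hypothetical):

    using Microsoft.SqlServer.Dts.Runtime;

    public class RunChild
    {
        public void Main()
        {
            Application app = new Application();
            Package child = app.LoadPackage(@"C:\Packages\TransformFile.dtsx", null);

            // Assign this thread's value to the child package variable.
            child.Variables["FileType"].Value = "XML";

            DTSExecResult result = child.Execute();
        }
    }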
In my application code I am trying to invoke multiple threads, where each thread loads an instance of the same SSIS package, initializes the package variables with different values, and executes the instances in parallel. In each thread, after the package execution has completed successfully, I read that instance's SSIS package variables to get result information from that instance's run.
When I load the same package in different threads using the LoadFromSqlServer() method, does the code create multiple instances of the SSIS package and load a distinct instance in each thread? Will the Package Execution ID be different for the different instances? Are the package-level variables instance-safe?
Can we execute multiple instances of the same SSIS package simultaneously? If yes, how? If no, what is the work-around to simulate such functionality?
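From what I understand of the object model, each call to LoadFromSqlServer returns its own Package object, so each thread should get a distinct instance with its own variable values. A minimal sketch of the pattern (server name, package path, and variable names are assumptions):

    using System.Threading;
    using Microsoft.SqlServer.Dts.Runtime;

    class ParallelPackageRunner
    {
        static void RunInstance(string fileType)
        {
            // Each thread loads its own Package object, so variable values and
            // execution state are not shared between threads.
            Application app = new Application();
            Package pkg = app.LoadFromSqlServer(@"\TransformFile", "myServer", null, null, null);

            pkg.Variables["FileType"].Value = fileType;   // per-instance input
            DTSExecResult result = pkg.Execute();

            // Read result information back from this instance's variables.
            object rowsProcessed = pkg.Variables["RowsProcessed"].Value;
        }

        static void Main()
        {
            foreach (string fileType in new[] { "Text", "XML", "Excel" })
            {
                string ft = fileType;   // local copy for the closure
                new Thread(() => RunInstance(ft)).Start();
            }
        }
    }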
Is this the only way to do it in SSIS? http://sqljunkies.com/WebLog/ashvinis/archive/2005/06/15/15829.aspx For some reason I figured that SSIS would have this kind of stuff built into it; it seems like functionality many people would use.
I've made a query like the one in MSDN:

    SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA "CIM_DirectoryContainsFile" AND TargetInstance.GroupComponent = "Win32_Directory.Name=\"e:\\\\temp\""

I have 20 similar tasks watching different folders, but when there are too many tasks in parallel, it doesn't work anymore. I changed the number of executables to 128 (in the general properties of the package, to test) but it doesn't seem to work.
I don't understand why it works when there are only 1 or 2 tasks (6 seems to be the maximum) and not when there are more than 6.
Could you help me with this issue?
Configuration: Windows Server 2003, SQL Server 2005, SSIS, SQL Server Agent
I am having problems with my DTS package hanging when run through a scheduled SQL Server Agent job. If I execute the job manually, it runs fine. If I run the package manually, it runs fine. Of course, there is no way to actually tell where it is hanging. Any ideas?
I have a very simple SSIS package that executes a .bat file. Here's the actual file it executes:
@Echo Off
c:
F:
cd ReportingServicesScripts
rs -i NoteBlankSnapshots.rss -S http://10.90.160.13/ReportServerTest
EXIT
The .bat file executes a Reporting Services .rss script using the rs.exe utility. When I run the command from the command prompt, it runs just fine. When I execute the .bat file manually, it runs and completes. When I execute the actual package, it also completes. But when I schedule the package as a job, it just hangs...it never finishes.
The owner of the job is an administrator in SQL Server. I have the SQL Server Agent configured to interact with the desktop - although my .bat file requires no input from the user.
I've created other jobs that just execute plain old SQL using the same owner, and these jobs complete just fine.
I have a SQL 2000 SP4 server running on Windows 2003 SP2. About 6 months ago I started experiencing problems editing existing DTS packages. At that time it was just one package. (The first time I experienced the problem, I tested and could edit every package except that one.)
But now it seems like anytime we experience import problems, I cannot edit the package so that I can verify that the import file is good. (So far the initial problem is always with the import file and once we figure out what that is and resolve it the package runs ok again.)
The problem is I'm planning on migrating to SQL 2005 and since I can't import DTS packages, I'm going to have to rewrite them as Integration Service Projects. If I can't edit the workflow tasks in DTS, it will be time consuming figuring out what the actual import file is. Also, most of these imports are coming from our AS400 which we are phasing out so I have to rewrite these to work with the new ERP package we are going to as we bring on each division.
I have not installed any post SQL 2000 SP4 hotfixes. So if anyone is aware of a hotfix that resolves this issue, please let me know.
Dear folks, I have a package that calls WinZip to extract files (command-line usage) in an Execute Process task. The package runs fine if I am logged in to the server; otherwise it hangs on the WinZip task. The package is stored on the server (as opposed to the file system) and is run under a proxy account using the SQL Server Agent. I tried adding folder and WinZip32.exe permissions for the domain user the proxy account was created under, to no avail. Any ideas? Thanks for your help!
When I set the TransactionOption property to Required, my package just hangs when I try to execute it. The status bar says "Validating" followed by the name of the first destination data flow component (whatever that happens to be). I've let it sit for long periods and nothing happens. Any suggestions?
Hello everyone, I am seeing odd behavior from an SSIS package that transfers data from one database to another. With some of the larger databases, the service just stops working (0% utilization) when it is almost done. I don't get errors, and the job stays in the "executing" state.
This doesn't always happen, but it will happen once the package has been run many times over the course of the week. I was wondering if you could point me in a direction for finding the source of these hangs. Right now I am leaning towards a memory issue (or other miscellaneous hardware problem), but I would like to rule out software first.
I have an SSIS package that executes in about 1 minute 20 seconds from Visual Studio on my local machine. While it executes, my machine is somewhat unresponsive.
When I deploy the package to the database server -- the very same database server that I am accessing from my local machine -- the package executes but eventually hangs. It appears to be running out of memory, and I usually have to kill the process to get the machine to respond. While it's hanging, the machine is unresponsive to all users. The hardware (including memory) is identical between my local development box and the server.
How should I troubleshoot this? I've tried deploying the package to MSDB, file system, running from dtexec, and running from dtexecui. This is very frustrating!
I have a number of packages that I have moved from an old server. Each package was scheduled with a SQL Agent job. On the old server everything ran fine. All of the packages run fine from VS, from DTEXECUI and I have tried one from the command line with DTEXEC and it worked.
When I run from the SQL Agent job, I don't get a failure, the package just hangs. I let one of the agent jobs sit for an hour with no progress. The package typically takes about 15 minutes to complete.
Below is the output from my package log up to the point that it hangs:
We are on SQL Server 2000 SP2. I have 3 DTS packages that run successfully every day. We need to change them, as the source-side tables are going to be changed pretty soon. When I go into the designer view, Enterprise Manager hangs when I do any of the following:
1. Click on Properties for the transformation task
2. Click on Disconnected Edit
3. Click on Properties of connection 1 (IBM DB2/400 source); the properties window pops up; change the userid/pwd and click OK
We have a problem with Visual Studio. It hangs when I use the "Execute Package" option. New packages run correctly, but the packages I've already built no longer execute... Any ideas to get things on track again?
When I push my SSIS packages up to my production server (which has a different data source than my development environment) and I try to open a package there, it takes forever to validate all the steps of the package, because it's trying to validate against a data source that isn't there, and each element being validated just waits to time out. This is exceptionally annoying.
Is there a way to turn off this validation 'feature'?
We're experiencing a problem where our SSIS packages will intermittently hang. There are no log errors or events in the event viewer. It happens whether the package is executed from the SQL Job Agent or run from BIDS. When running from BIDS it appears to hang inside one of the data flows (several parallel pipes with sorts, merge joins, etc.). It appears to hang in multiple pipes within the data flow component. The problem is reproducible: we just kill it and re-run, and it hangs in the same places.
Now here's the odd thing: if we simply open and close some of the components in the pipeline after the place where it hangs, a subsequent run will go further in the pipeline before hanging. If we open and close all the components after the point where it initially hung, the data flow will run fine from there on out. By "open and close" I mean no changes are made; we simply double-click the component, like a merge join, then click Close.
To me this does not seem like a memory problem; more likely something is wrong with the metadata, and opening and closing a component somehow alters the metadata to "right it."
This seems to occur intermittently after we make modifications to the package. It's as if, after any modification, even one unrelated to the data flow, you have to go through and open and close every component in the package to ensure it will work. Again, no errors or warnings are fired.
Help! I am using a Script Transformation to output a new column as an image [DT_IMAGE] field to store a serialized object. The VB sample code is shown below.
The package always runs fine on my development machine but halts on other machines at AddBlobData after a certain number of rows have been processed. I am stuck here. Does anyone have any suggestions?
What I need is to read data from multiple tables in one database and write it into a single table in another database. To preserve all the column data, I use the input columns to construct a new object, serialize it, and store the serialized data in the destination table. (The object and its serialization function come from a C# DLL.)
Dim b As BusinessLicense = New BusinessLicense()
b.ApprovalDate = Row.approvaldate
b.BusinessId = Row.busid
b.BusinessName = Row.busname
b.NaicsCode = Row.naicscode
b.NaicsDescription = Row.naicsdescr
b.OwnerName = Row.ownername
b.Phone = Row.phone
b.Pkey = Row.pkey
b.RenewalDate = Row.renewaldate
b.StartDate = Row.startdate
b.Suite = Row.suite

Row.serializedobject.AddBlobData(Serializer.Serialize(b)) ' ---- This is the blocking line
Row.infoType = BusinessLicense.TYPE
Both machines are XP SP2, with standard SQL Server 2005 (9.00.1399.06).
I am working on SQL Server 7.0. Every weekend we re-index some tables. I want to know if it is possible to run the re-indexing of the tables in parallel, so that I can save time.
Our database is about 80 GB, and one table is around 22 GB. Rebuilding the index on this table takes a long time, and while it runs we are unable to index the other tables.
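One way to approach it: each rebuild runs on its own connection, so rebuilds of different tables can proceed concurrently. A sketch of the idea (table names and connection string are hypothetical; DBCC DBREINDEX is the rebuild command on SQL Server 7.0):

    using System.Data.SqlClient;
    using System.Threading;

    class ParallelReindex
    {
        static void Rebuild(string table)
        {
            // Each thread gets its own connection, so the two DBCC DBREINDEX
            // commands run concurrently on the server.
            using (var conn = new SqlConnection("Server=myServer;Database=myDb;Integrated Security=true"))
            {
                conn.Open();
                using (var cmd = new SqlCommand($"DBCC DBREINDEX ('{table}')", conn))
                {
                    cmd.CommandTimeout = 0;  // index rebuilds can run for a long time
                    cmd.ExecuteNonQuery();
                }
            }
        }

        static void Main()
        {
            var t1 = new Thread(() => Rebuild("dbo.BigTable"));
            var t2 = new Thread(() => Rebuild("dbo.OtherTable"));
            t1.Start(); t2.Start();
            t1.Join(); t2.Join();
        }
    }

In practice the same effect is often achieved with two SQL Agent jobs scheduled at the same time; whether it actually saves time depends more on disk I/O than on CPU.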
Hi, we currently use a Database Maintenance Plan to back up our SQL Server 2000 databases. I notice that the databases are backed up one after the other.
I would like to know how to run the backups in parallel rather than sequentially. Does this depend on the number of CPUs?
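A maintenance plan runs its backups serially; issuing each BACKUP DATABASE on its own connection makes them parallel. The number of CPUs matters much less than disk throughput, since backups are I/O-bound. A hypothetical sketch (database names, paths, and connection string are assumptions):

    using System.Data.SqlClient;
    using System.Threading.Tasks;

    class ParallelBackups
    {
        static void Main()
        {
            string[] dbs = { "Sales", "Inventory", "HR" };  // hypothetical names

            // One connection per database: each BACKUP runs concurrently.
            Parallel.ForEach(dbs, db =>
            {
                using (var conn = new SqlConnection("Server=myServer;Integrated Security=true"))
                {
                    conn.Open();
                    string sql = $"BACKUP DATABASE [{db}] TO DISK = 'E:\\Backups\\{db}.bak'";
                    using (var cmd = new SqlCommand(sql, conn) { CommandTimeout = 0 })
                        cmd.ExecuteNonQuery();
                }
            });
        }
    }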
I created a package to download 4 FTP files at once. I set MaxConcurrentExecutables for the SSIS package to 4, so in BIDS it downloads 4 files at the same time.
However, when I started the job I noticed that only 3 files were downloaded at a time (looking at the temp files in the download directory).
Solution: sure enough, after digging around for a while, I found that in the step properties for the SSIS package there is an Execution tab, and "Maximum Concurrent Executables" was -1 (which for some reason defaults to 3 concurrent processes, even on our dual-CPU server). After changing that value to 4, all 4 files downloaded in parallel.
Is there any way to run a stored procedure in parallel with another one? For example: I have a stored procedure that sends an email. I then scan a table and send any unsent emails. I do not want the second part to slow the response to the user.
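One common workaround, sketched below with hypothetical procedure and job names: put the slow "scan and send unsent emails" step in a SQL Agent job and start it with msdb.dbo.sp_start_job, which returns as soon as the job is queued, so the user-facing call is not slowed down.

    using System.Data;
    using System.Data.SqlClient;

    class FireAndForget
    {
        static void SendEmailAndQueueScan(SqlConnection conn)  // conn is already open
        {
            // Fast part: runs synchronously as part of the user's request.
            using (var send = new SqlCommand("dbo.usp_SendEmail", conn)
                   { CommandType = CommandType.StoredProcedure })
                send.ExecuteNonQuery();

            // Slow part: sp_start_job returns once the Agent job is queued,
            // so scanning and sending unsent emails runs in parallel.
            using (var job = new SqlCommand("msdb.dbo.sp_start_job", conn)
                   { CommandType = CommandType.StoredProcedure })
            {
                job.Parameters.AddWithValue("@job_name", "Send unsent emails");
                job.ExecuteNonQuery();
            }
        }
    }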
Assuming I have a line, is there a function I can call to create a parallel line at a given distance away? I.e., with the line below, I would want to draw a parallel line to the one output.
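I'm not aware of a built-in function for this, but the math is straightforward: shift both endpoints along the line's unit normal. A minimal sketch, assuming plain 2D points and no particular graphics API:

    using System;

    struct Point { public double X, Y; public Point(double x, double y) { X = x; Y = y; } }

    static class Offset
    {
        // Returns the endpoints of a line parallel to segment (a, b),
        // shifted by 'distance' along the segment's unit normal.
        public static (Point, Point) ParallelLine(Point a, Point b, double distance)
        {
            double dx = b.X - a.X, dy = b.Y - a.Y;
            double len = Math.Sqrt(dx * dx + dy * dy);

            // Unit normal: the direction vector rotated 90 degrees.
            double nx = -dy / len, ny = dx / len;

            return (new Point(a.X + nx * distance, a.Y + ny * distance),
                    new Point(b.X + nx * distance, b.Y + ny * distance));
        }
    }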
Hi, I need to place the results of two different queries in the same result table, parallel to each other. So if the result of the first query is

    1 12
    2 34
    3 45

and the second query is

    1 34
    2 44
    3 98

the results should be displayed as

    1 12 34
    2 34 44
    3 45 98

If a UNION is done for both queries, we get the results in rows. How can the above be done? Thanks in advance, vivekian
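A join on the shared key column, rather than a UNION, lines the two value columns up side by side. A sketch with hypothetical table and column names, using a FULL OUTER JOIN so rows missing from either result still appear:

    using System.Data.SqlClient;

    class SideBySide
    {
        // Joining on the key column places the two values in the same row
        // instead of stacking them the way UNION does.
        const string Query = @"
            SELECT COALESCE(a.id, b.id) AS id, a.val AS val1, b.val AS val2
            FROM   (SELECT id, val FROM dbo.FirstResult)  AS a
            FULL OUTER JOIN
                   (SELECT id, val FROM dbo.SecondResult) AS b
              ON a.id = b.id
            ORDER BY id;";

        static void Main()
        {
            using (var conn = new SqlConnection("Server=myServer;Database=myDb;Integrated Security=true"))
            {
                conn.Open();
                using (var cmd = new SqlCommand(Query, conn))
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        System.Console.WriteLine($"{reader["id"]} {reader["val1"]} {reader["val2"]}");
            }
        }
    }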
Hi, all. I'm writing test cases in C# for a few methods that make changes to a database. To prevent making permanent changes I used BeginTransaction/Rollback, and everything was good. But this doesn't work if the tested method has BeginTransaction/Rollback code itself. An error appears in NUnit: System.InvalidOperationException : SqlConnection does not support parallel transactions. Does anybody know how to solve this problem?
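One common approach, sketched below: wrap the test in a System.Transactions.TransactionScope and let the tested code's connection enlist in that ambient transaction; disposing the scope without calling Complete() rolls everything back. This assumes the tested method can rely on the ambient transaction instead of calling SqlConnection.BeginTransaction itself (the class under test here is hypothetical):

    using System.Transactions;
    using NUnit.Framework;

    [TestFixture]
    public class DatabaseTests
    {
        [Test]
        public void UpdateCustomer_WritesRow()
        {
            // Everything inside the scope enlists in one ambient transaction.
            using (var scope = new TransactionScope())
            {
                var repo = new CustomerRepository();   // hypothetical class under test
                repo.UpdateCustomer(42, "New Name");

                Assert.AreEqual("New Name", repo.GetCustomerName(42));

                // No scope.Complete() call: disposing the scope rolls back
                // all changes made during the test.
            }
        }
    }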
I have several packages within sequence containers inside one main .dtsx package with a checkpoint configuration, and when I run it some succeed and some don't. The problem is that when I rerun it, the checkpoint doesn't seem to work, because some of the successful packages are rerun as well (and not skipped as they should be). In other words, the process does not begin at the point of failure.
It seems that packages that finish after the failure point (and succeed) are not registered in the checkpoint file, so when I rerun the main package these succeeded packages are rerun too.