I have several sequence containers in one package that fire off Execute Package tasks. I would like each of the sequence containers to start at the same time when the job starts running. However, when I set them up to do that, I get an error that the variable cannot be read because it is locked. I have the variables set up as read-only, so I'm not sure why they are being locked. When I run the package with each sequence container firing off after the previous one ends, it runs fine.
I have a job that downloads 4 files once a month from the same server. The files are sizable and I need to download them in less than 5 hours total. In SQL Server 2000 I used an ActiveX script to generate the FTP script and then execute it; all four files downloaded at the same time in 4 different tasks with no issues.
I rewrote the process in 2005 so that it uses the Integration Services FTP task, but when all 4 FTP tasks kick off they all fail... instantly. They initially shared the same FTP connection manager, so I created a different one for each, and still got the same result.
The error is one that relates to changing directories... Now if I just run one of the tasks it runs fine; it is only when more than one tries to run at once. I ended up putting in 10-second delays between each FTP task kicking off and it works just fine...
Does this sound like a bug??
Also... I am on SQL 2005 Enterprise SP1 on Windows 2003 Enterprise SP1.
I ran into a variety of problems trying to set a script task breakpoint in a package containing multiple script tasks. The debugger apparently treats the breakpoint as if it were set in ALL tasks in the package, not just the one in which it is actually set.
At best this results in hitting breakpoints in scripts where they are not set and at worst the debugger brings up the "Send error report" dialog and quits (while the package continues to run). The latter seems to happen most often when the script with the breakpoint has more lines than an earlier script and the breakpoint is set at a line number that exceeds the number of lines in the earlier script--it bombs when the earlier, shorter script starts.
To get the debugger to work under these circumstances I had to add some nonsense code like

While False
    Dim i As Integer = 0
End While

to every script, at the same line number near the beginning of the script (line 40, for example). I then set a breakpoint on the middle statement in one of the scripts (it doesn't matter which) to cause the debugger to open at runtime. It doesn't hit the breakpoint because the line is never executed; if the breakpoint is set on a line that can be executed in any script in the package, bad things tend to happen. I then add a "Stop" statement to the script that I want to debug. This only works if the debugger is already open, hence the dummy breakpoint above.
This workaround is usable, but I am debugging a package that has quite a few scripts, and having to insert dummy code in all of them at a fixed line number is rather inconvenient. I would really like to see breakpoints work the way one would expect--only in the scripts where they are set. Is there some other, easier way around this problem? Is there at least an easier way to get the debugger to open so that "Stop" will work?
In my application code I am trying to invoke multiple threads, where each thread loads an instance of the same SSIS package, initializes the package variables with different values, and executes its instance in parallel with the others. In each thread, after the package execution has completed successfully, I read that instance's SSIS package variables to get result information from that instance's run.
When I load the same package in different threads using the LoadFromSqlServer() method, does the code create multiple instances of the SSIS package and load a distinct instance in each thread? Will the Package Execution ID be different for the different instances? Are the package-level variables instance-safe?
I'm going crazy with EM's scheduled tasks. I've set up a couple of backups and weekly maintenance plans from EM. After a couple of days I viewed the scheduled tasks from the EM toolbar and found that the tasks have not executed as I scheduled them. The task history does not show any history at all, and clicking refresh does not do any good either. However, if I right-click on the database and choose Restore, I see all the backups that I have scheduled. Is this a bug?
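In case it helps to compare against what EM displays, the job outcomes can also be read straight from the msdb history tables; a minimal sketch (standard SQL Server 2000 Agent tables):

SELECT j.name AS job_name,
       h.run_date,
       h.run_time,
       h.run_status          -- 1 = succeeded, 0 = failed
FROM msdb.dbo.sysjobs AS j
INNER JOIN msdb.dbo.sysjobhistory AS h
        ON h.job_id = j.job_id
WHERE h.step_id = 0          -- step 0 holds the overall job outcome
ORDER BY h.run_date DESC, h.run_time DESC

If this returns no rows for the backup jobs, the jobs really are not running, rather than EM merely failing to display their history.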
In previous versions of SQL Server you were able to execute single steps within a DTS package. It doesn't seem to be possible in Visual Studio. Does anybody know otherwise?
I have a SQL Server 2000 instance running on a Windows Server 2003 box with 4 processors. SQL Server is configured to use all 4 processors, and use all available processors for parallelism.
I have created a simple DTS package which has 2 "execute external process" tasks with no precedence constraints between them. There are no connections required or defined for the two tasks (sequential processing is forced on tasks sharing connections). The DTS package properties have the "limit the number of tasks to execute in parallel" set to 4.
However, despite the above configuration, the two steps are never executed in parallel, but always sequentially.
Does anyone have any ideas as to why these tasks are not being executed in parallel?
I have a For Each loop that contains about 20 data flow tasks (simple data extractions). I notice that when I run the package it only runs up to 4 data flow tasks at a time; the others have to wait until one of the first 4 finishes.
I was wondering if there's a way to change the limit on how many data flow tasks can run at a time. Is there a property somewhere?
I know this will be stressful for the server, but the server is well equipped with CPU power and memory, so performance will not be an issue.
I have developed a big SSIS package to extract data from flat files (200+ data flows).
The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files. I run these validations in parallel, so that when a file arrives it enters the "validation process" and extraction of the file starts.
When I run the SSIS package from BIDS it works the way I designed it, but when I run it on the server, the tables that are loaded during the process only become "available" when the SSIS package ends. It is imperative that throughout the process, when a table receives new data, it becomes ready immediately, and not just when the SSIS package finishes...
I have attached a .jpeg illustrating this.
It is important for the tables to be available so that the stored procedures (outside the SSIS package) that depend on some of the tables can start working before the SSIS package ends.
I have done a search and read some of the posts, but am left more confused than before. I am fairly new to SSIS. Here is my situation and what I am trying to accomplish.
I have a package with a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails, when I re-run the package it runs the last task as well as all the other tasks. What I am looking to accomplish is that when the package is re-run, only the SQL tasks that failed are run, not the previously successful tasks.
I think the best way would be to disable tasks on successful completion, where each task writes its name to a temp table, but I am skeptical.
Can anyone point me in a direction to help me accomplish what I am looking for, please?
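If the temp-table idea above were pursued, a minimal sketch of the bookkeeping it implies might look like this (all table, column, and parameter names are hypothetical):

-- Hypothetical table recording which tasks completed in a given run
CREATE TABLE dbo.CompletedTasks
(
    RunId    int      NOT NULL,  -- identifies one logical run of the package
    TaskName sysname  NOT NULL,  -- written by each Execute SQL Task on success
    DoneAt   datetime NOT NULL DEFAULT GETDATE(),
    PRIMARY KEY (RunId, TaskName)
)

-- Each task logs itself after succeeding:
INSERT INTO dbo.CompletedTasks (RunId, TaskName) VALUES (@RunId, @TaskName)

-- On re-run, each task checks whether it already completed; the count can
-- feed an expression on the task's Disable property:
SELECT COUNT(*) FROM dbo.CompletedTasks
WHERE RunId = @RunId AND TaskName = @TaskName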
We want to develop an error-handling process that will log errors to multiple destinations (event log, text files, or a SQL database) depending on a variable set in the package. We also want this error-handling process to be initiated by all the tasks in the package on error.
Is this possible? Can the same event handler be called from multiple tasks in the package? Also, from the event handler can we call another package which actually does the error handling? That way we have only one place to change our error-handling process if required.
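For the SQL database destination, a minimal sketch of a central logging procedure that an OnError event handler could call through an Execute SQL Task (the table and procedure names are hypothetical; the System::* variables are standard SSIS system variables available in the OnError scope):

CREATE TABLE dbo.PackageErrorLog
(
    LogId       int IDENTITY(1,1) PRIMARY KEY,
    PackageName nvarchar(256) NOT NULL,
    SourceName  nvarchar(256) NULL,
    ErrorCode   int           NULL,
    ErrorText   nvarchar(max) NULL,
    LoggedAt    datetime      NOT NULL DEFAULT GETDATE()
)
GO
CREATE PROCEDURE dbo.LogPackageError
    @PackageName nvarchar(256),
    @SourceName  nvarchar(256),
    @ErrorCode   int,
    @ErrorText   nvarchar(max)
AS
BEGIN
    -- The event handler maps System::PackageName, System::SourceName,
    -- System::ErrorCode and System::ErrorDescription to these parameters.
    INSERT INTO dbo.PackageErrorLog (PackageName, SourceName, ErrorCode, ErrorText)
    VALUES (@PackageName, @SourceName, @ErrorCode, @ErrorText)
END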
This may sound a little anal-retentive, but I have a number of SSIS packages where, when I open them, the first thing I have to do is scroll to the left or up to get to where the tasks are displayed. Even if I move the tasks right or down, they still end up in that initial position. This happens even if I use auto-arrange.
Is there a way for me to set the package so it has a consistent point or display at which it opens?
Hi friends. I wonder why Boolean values set on the Disable property using expressions do not work as expected (though precedence constraints work fine). I can find no way to disable an individual task.
I have a Boolean variable V (False by default at design time) for a task T, and I disable T based on a condition (if A != 'a') at runtime.
Now, the first time I run the package, the task gets disabled (it works fine given the condition). However, T stays permanently disabled even after we stop debug mode, which is not the expected behavior. On subsequent runs it won't work at all.
Is it true that a task disabled at design time cannot be enabled at runtime via expressions?
If things are working fine in your case, please let me know what settings are required.
I have put together the stored procedure below to carry out update, delete, and insert queries in one visit. The code has been pieced together from the pages listed at the bottom. I pass the procedure three XML strings, and after testing it a few times it seems to work fine. I'm fairly new to stored procedures though, so I was hoping someone could answer these questions:
1. Is this an acceptable way to do this? Can you foresee any problems?
2. I want to make this an 'all-or-nothing' event, i.e. if any part of the procedure fails, it must all fail. How would I achieve that?
3. I want to know in my calling code what the result is. I've used output parameters before, but I'm unsure how to combine one with 2 above.
Sorry this is a long script, but I've removed most of the column names and values to shorten it. Thanks in advance.

CREATE PROCEDURE amendPageRecords
    @PagesToUpdate xml,
    @PagesToDelete xml,
    @PagesToInsert xml
AS
BEGIN
    -- UPDATING RECORDS
    DECLARE @UpdateTable TABLE (PageID int, PageType varchar(10))

    INSERT INTO @UpdateTable (PageID, PageType)
    SELECT PageID   = UpdatePages.Item.value('@PageID', 'int'),
           PageType = UpdatePages.Item.value('@PageType', 'varchar(10)')
    FROM @PagesToUpdate.nodes('Pages/Page') AS UpdatePages(Item)

    UPDATE page
    SET page.page_type = UP.PageType
    FROM page
    INNER JOIN @UpdateTable UP ON page.page_id = UP.PageID

    -- DELETING RECORDS
    DECLARE @DeleteTable TABLE (PageID int)

    INSERT INTO @DeleteTable (PageID)
    SELECT PageID = DeletePages.Item.value('@PageID', 'int')
    FROM @PagesToDelete.nodes('Pages/Page') AS DeletePages(Item)

    DELETE FROM page
    FROM page
    INNER JOIN @DeleteTable DP ON page.page_id = DP.PageID

    -- INSERTING RECORDS
    DECLARE @InsertTable TABLE (SiteID int, PageType varchar(10))

    INSERT INTO @InsertTable (SiteID, PageType)
    SELECT SiteID   = InsertPages.Item.value('@SiteID', 'int'),
           PageType = InsertPages.Item.value('@PageType', 'varchar(10)')
    FROM @PagesToInsert.nodes('Pages/Page') AS InsertPages(Item)

    INSERT INTO page
    SELECT SiteID, PageType
    FROM @InsertTable
END
GO

Code taken from:
http://weblogs.asp.net/jgalloway/archive/2007/02/16/passing-lists-to-sql-server-2005-with-xml-parameters.aspx
http://www.eggheadcafe.com/articles/20030627c.asp
http://www.sommarskog.se/arrays-in-sql-2005.html
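On questions 2 and 3, one common pattern in SQL Server 2005 is to wrap the body of the procedure in a transaction inside TRY/CATCH and report the outcome through an output parameter. A minimal sketch (the @Result parameter is just an illustration):

CREATE PROCEDURE amendPageRecords
    @PagesToUpdate xml,
    @PagesToDelete xml,
    @PagesToInsert xml,
    @Result int OUTPUT            -- e.g. 0 = success, otherwise the error number
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRY
        BEGIN TRANSACTION

        -- ... the update, delete, and insert sections shown above go here ...

        COMMIT TRANSACTION
        SET @Result = 0
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION   -- all-or-nothing: nothing is kept on failure
        SET @Result = ERROR_NUMBER()
    END CATCH
END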
I'm trying to do some custom SSIS logging using event handlers, similar to the ideas provided by Jamie Thomson in the past. My problem is that when I use System::SourceID as one of the items to be logged, I can't match the SourceID up with any of the GUIDs displayed in the Properties window for the various tasks in my package.
Where is this SourceID coming from, and how can I track it down?
I'm trying to keep track of the ETL process by inserting/updating a row in one table for each package that finishes while my ETL process is executing. So far, I have created a Script Task that increments a variable (counter) by one and then opens a connection to my database and inserts/updates my table. What I want to see is Step 1/30, Step 2/30, and so on. Right now I can display Step 1, Step 2, but how can I get the overall number of tasks within a package?
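A hedged sketch of the insert/update the Script Task could issue against a hypothetical progress table; this assumes the total is supplied from a package variable, since obtaining it automatically is exactly the open question:

-- Hypothetical progress table
CREATE TABLE dbo.EtlProgress
(
    RunId       int      NOT NULL PRIMARY KEY,
    CurrentStep int      NOT NULL,
    TotalSteps  int      NOT NULL,
    UpdatedAt   datetime NOT NULL DEFAULT GETDATE()
)

-- Issued after each package finishes (@RunId, @Step, @Total come from variables)
UPDATE dbo.EtlProgress
SET CurrentStep = @Step, TotalSteps = @Total, UpdatedAt = GETDATE()
WHERE RunId = @RunId

IF @@ROWCOUNT = 0
    INSERT INTO dbo.EtlProgress (RunId, CurrentStep, TotalSteps)
    VALUES (@RunId, @Step, @Total)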
I have a Package and a DataFlow Task. The Package has TransactionOption=Required. The DataFlow Task has an OLE DB Source and an OLE DB Destination, and TransactionOption=Supported. The package executes on a workstation, and the data sources for the OLE DB Source and the OLE DB Destination are on a server.
After the package was launched, the following error messages appeared:
[OLE DB Destination [43]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DWH_Destination" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. [DTS.Pipeline] Error: component "OLE DB Destination" (43) failed the pre-execute phase and returned error code 0xC020801C.
[Connection manager "DWH_Destination"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 "The transaction manager has disabled its support for remote/network transactions.". [Connection manager "DWH_Destination"] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8004D024.
If I set TransactionOption=NotSupported on the DataFlow Task, then the package executes successfully.
I have some "Execute T-SQL Statement" tasks in a package. I would like to run this same package on another SQL Server without having to change it on the other server. Since the server name can be given when setting up the connection, I think that if I leave the server name out, the package could run on any server. Is my assumption correct?
I am in the process of moving from 32-bit SQL Server 2005 Enterprise (9.0.3054) to 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Windows 2003 SP2), and the process has been very frustrating to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point and then just stops. (The package takes about 17 seconds to run normally.) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times, and I have also re-installed the service pack on the SQL Server (9.0.3054), but that did not help.
The master package has a configuration file specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers set up to use localhost. This is done deliberately, so that we can test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production; all differences will be in the config files (which are pretty fixed, they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw):
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx" Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same thing, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same thing again against the dkms5253uedw instance, but now with exactly the same databases defined on the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package fails again, this time with
Error - "timeout when connecting to database xxx" Info - "package is preparing to get connection string from parent ..."
and then continues as in the first case.
From all this we conclude that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging, etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same result (error).
So we are desperately hoping that someone can help us solve this problem.
Hi all, I am creating a DTS package to export files from one database to another. I tried to search for ways to execute the files and found out that I need to add a reference to Microsoft.SqlServer.ManagedDTS. However, I cannot find this reference in my reference list. Do I have any other alternatives for running this file?
I am trying to programmatically execute a package that contains an Execute SQL Task component bound to a variable for its "SqlStatementSource" property (via an expression). The variable is of type String and contains a simple value of "SELECT 1". The Execute SQL Task contains an expression that sets the SqlStatementSource property to the value of this variable.
The package runs fine when I execute it via dtexec or BIDS, but when I attempt to run it via the object model, I receive the following error message:
The result of the expression ""@[User::Sql]"" on property "SqlStatementSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
I did a search on this forum and noticed quite a few threads about this same issue, but no explanation/solution. We have quite a few packages that have dynamically constructed SQL statements for Execute SQL Tasks, and they are all failing to run via the object model. Is there something that I am missing?
I've made a query like the one in MSDN (SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA "CIM_DirectoryContainsFile" AND TargetInstance.GroupComponent= "Win32_Directory.Name="e:\\temp""). I have 20 similar tasks watching different folders, but when there are too many tasks running in parallel, it doesn't work anymore. I changed the number of executables to 128 (in the general properties of the package, just to test) but it doesn't seem to help.
I don't understand why it works when there are only 1 or 2 tasks (6 seems to be the maximum) and not when there are more than 6.
Could you help me with this issue?
Configuration : Windows Server 2003, SQL Server 2005, SSIS, Sql Server Agent
I have come across a situation where there are 10 tasks. The second task in the flow is a Script Task which disables all further tasks based on a condition. I thought the logic would be better if we force-terminated the package successfully at this stage itself. How can this be done?
I have a scenario where, given an incident (for example, a helpdesk request), I need to get the time involved given the status of the incident as it moves through a queue. Each incident has a unique incidentID and can have any number of status events, each with a statusID (1 to many). For example:
status1 (S1) - incident opened -- the time here is captured as the start time
S2 - misc. status - the time to capture is the time between the start time and the new status time, i.e. 2 minutes
S3 - misc. status - the time to capture is the time between the start time and S3, i.e. 3 minutes, and the time between S2 and S3, i.e. 1 minute
... can include any number of statuses -->> Sx ... ending with a closing time
I'd like to be able to capture the time between any 2 status levels between the start and closing times.
I'm figuring that using cursors would be the best way to go, but I realize that there are always better ways to do things. Listed below is the SQL I plan to use. If anyone has a better way to accomplish the desired goal, please let me know. The code should be able to run under either SQL Server 2000 or 2005. Thanks in advance.
SAMPLE SQL:
DECLARE @ticketnumber varchar(20), @opendate int, @closedate int

SELECT ticketnumber,
       MIN(opendate)  AS opendate,
       MAX(closedate) AS closedate
INTO temp
FROM mastertable
GROUP BY ticketnumber

DECLARE MyCursor CURSOR FOR
    SELECT * FROM temp ORDER BY 1

OPEN MyCursor
FETCH NEXT FROM MyCursor INTO @ticketnumber, @opendate, @closedate
WHILE (@@FETCH_STATUS = 0)
BEGIN
    SELECT SUM(.....)
    FROM mastertable
    WHERE ticketnumber = @ticketnumber
      AND ?field? = 'EX'

    FETCH NEXT FROM MyCursor INTO @ticketnumber, @opendate, @closedate
END
CLOSE MyCursor
DEALLOCATE MyCursor
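As a possible set-based alternative to the cursor, a self-join can compute the elapsed time between every pair of statuses of the same incident in one pass, and it runs on both SQL Server 2000 and 2005. This is only a sketch, assuming a hypothetical statusevents table with incidentID, statusID, and a statustime datetime column:

-- Elapsed minutes between every pair of statuses for each incident
SELECT a.incidentID,
       a.statusID AS from_status,
       b.statusID AS to_status,
       DATEDIFF(minute, a.statustime, b.statustime) AS elapsed_minutes
FROM statusevents AS a
INNER JOIN statusevents AS b
        ON b.incidentID = a.incidentID
       AND b.statustime > a.statustime
ORDER BY a.incidentID, a.statustime, b.statustime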
I am receiving an error in my master package, which executes a number of other packages. The individual packages work fine when executed by themselves; however, I am getting the following error when I attempt to execute them from another package:
Error: Failed to acquire connection "conneciton". Connection may not be configured correctly or you may not have the right permissions on this connection.