We have been working with SSIS for a while and we have not found a solution or a reason for this. We have a master package that calls 10 packages in sequential order (as shown below). If we execute each one of the packages separately, they run in less than 2 minutes, but when we call them through the master package the execution time starts increasing as follows: Child 1 (2 min), Child 2 (3 min), Child 3 (4 min), Child 4 (6 min), Child 5 (7 min), and so on. The Execute Package Task has ExecuteOutOfProcess = false (when we set it to true it takes even longer to execute; it was creating a dtsHost.exe process for each child, which always remained in memory after the package finished executing). Can someone please provide a solution or a workaround for this? Any help would be appreciated.
I'm running a master package executing 8 child packages.
Each package contains the same connection managers and each package is stored within the MSDB database. The master package executes the packages stored in the MSDB using the 'execute package task'.
When running the master package, from either MSDB or Visual Studio directly, the odd thing is that some tables get filled, but after package execution I notice each table is empty. There is no rollback procedure specified within the package, and each package executes successfully because of error row handling. Does anybody have any hints on how to solve this one?
I have a parent package which executes 14 child packages in parallel, which on average take ~10 seconds each to complete when I execute the parent package using BIDS or DTEXEC.
However, if I run the parent package using SQL Management Studio (Integration Services > Stored Packages > MSDB > Right Click > Run Package) each package takes in excess of 10 minutes to run, getting progressively slower as each package starts.
Surely the package is executing in exactly the same way as under BIDS/DTEXEC, just with a different UI?
2 Execute SQL Tasks, one Loop container, 2 Data Flow Tasks, 1 Foreach Loop container, and 1 FTP Task. The Data Flow Tasks have 1 OLE DB source, 1 flat file source, 1 Row Count transformation, 1 Recordset destination, and 1 OLE DB destination.
When I load the package into BIDS it takes 125 MB of memory and then everything is slow: the Properties panel slides in and out slowly, the package objects are not painted properly, and making changes and running the package take a lot of time.
Am I doing anything wrong here? Why is it consuming so much memory?
I have a SQL query to search for fields in a rather big view. If I execute the query in SQL Server Enterprise Manager, the results are displayed in less than 6 seconds. However, if I execute it using ASP.NET, it takes very long (more than 2 minutes).
The query is a simple one like "SELECT * FROM myview WHERE name LIKE '%Microsoft%'". And the code I use to execute it in ASP.NET is:
Dim dsRtn As DataSet
Dim objConnection As OleDbConnection
Try
    objConnection = GetOleDbConnection()
    objConnection.Open()
    ' Fill pulls the entire result set into the DataSet
    Dim objDataAdapter As New OleDbDataAdapter(strSearch, objConnection)
    Dim objDataSet As New DataSet()
    objDataAdapter.Fill(objDataSet, strTableName)
    dsRtn = objDataSet
Catch ex As Exception
    dsRtn = Nothing
Finally
    If objConnection.State = ConnectionState.Open Then
        objConnection.Close()
    End If
End Try
Where strSearch is the sql search string.
I don't have any problem using such code for other queries.
Could somebody suggest the cause of the problem and how to solve it? Thanks!
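One difference worth noting (an assumption, not a diagnosis from the post alone): Enterprise Manager starts painting rows as they stream in, while OleDbDataAdapter.Fill materializes the entire result set into the DataSet before returning, so a wide view can look fast in one and slow in the other. A hedged first experiment is to cut down what gets pulled:

-- Pull only the columns the page actually needs, and cap the row
-- count (the column list here is illustrative; myview is from the post).
SELECT TOP 100 name, id
FROM myview
WHERE name LIKE '%Microsoft%'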
I have a query where I am connecting to eight different tables using joins. As I join each additional table, execution gets slower. Even on my local server it takes nearly 2 to 3 minutes to execute the query. How can I increase the speed of execution of my query?
Scenario 1: A sproc executed on the local server against local tables that used to take 40 seconds to run now takes 30 minutes.
- No blocking locks.
- Sometimes "NOP" appears in the command column when sp_who2 is run.
- Perfmon shows nothing out of the ordinary when looking at server resources (memory, processors, etc.), and there have been NO configuration changes.
- Occasional lost packets (every 10th) with ping -t.
- I flushed the procedure cache and rebooted the server.

Scenario 2: A sproc executed on another server, accessing the Scenario 1 server's tables via a linked server, runs with no problems in 30 seconds.

SQL Server 2000 SP3a.
We have a quick query regarding SQL performance. We have SQL Server 2000 (32-bit) and SQL Server 2005 (64-bit) as two separate instances on a DB server. We were analysing the execution times for the same stored procedure on both instances:

1. Through Remote Desktop on the actual DB server
2. Through Query Analyser on my local machine

The results were as follows (SP execution time in seconds):

1. Through Remote Desktop on the actual DB server

Iteration | SQL 2000 | SQL 2005
1         | 28       | 5
2         | 27       | 3
3         | 27       | 3
4         | 40       | 3
5         | 38       | 3
Average   | 32       | 3

2. Through Query Analyser on my local machine

Iteration | SQL 2000 | SQL 2005
1         | 37       | 96
2         | 32       | 77
3         | 35       | 84
4         | 27       | 79
5         | 43       | 91
Average   | 35       | 85

Could you please shed some light on why case 2 is slow, and any suggestions to improve it? Thanks in advance!
I have a SQL Server 2005 Std. Ed. 64-bit installation. There is one instance supporting a single production database. I have a CLR udf. This udf uses the XMLDocument object to retrieve XML from a URL. When the CLR udf is executed, there seems to be an initial slow response time. Subsequent response times are very fast. If the CLR udf is not called for a few minutes and then called, the slowdown appears again.
Is there something happening behind the scenes with compilation or something like that which could cause this slowdown?
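One plausible explanation (an assumption; the post doesn't confirm it) is that the CLR AppDomain hosting the assembly is unloaded after a few idle minutes (for example under memory pressure), so the next call pays the AppDomain and assembly load cost again. A quick check:

-- A recently reset creation_time right after a slow call suggests
-- the AppDomain is being unloaded and reloaded between calls.
SELECT appdomain_name, creation_time, state
FROM sys.dm_clr_appdomains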
The master package has a configuration file specifying the connection strings. The master package passes these connection strings to the child packages in a variable. Both the master package and the child packages have connection managers set up to use localhost. This is done deliberately, to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw).
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package fails again, this time with:
Error - "timeout when connecting to database xxx"
Info - "package is preparing to get connection string from parent ..."
And then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
I have a big problem with slow execution of a stored procedure in SQL Server 2005, and I really don't understand the reason. I have a database with a large table (about 400 million rows) and a simple stored procedure to get data from that table (one SELECT statement selecting the time and value columns).
Strange thing is that if I call that stored procedure from .net application (native SqlDataProvider) it takes about 6 seconds to execute but if I call the same procedure with the same parameters from within SQL Server Management Studio it takes only 25 milliseconds to execute!
I've noticed that from .NET the procedure is called as an RPC with binary data, while in Management Studio a SQL script is executed, so I copied/pasted the script from Management Studio into the .NET code, and again the same thing happens (6 seconds from .NET and 25 ms from Management Studio). I traced the executions with SQL Profiler and everything seems to be identical for both applications, except that it takes much longer for the .NET application.
Both SQL Server Management Studio and .net application are on the same machine and SQL Server is on another.
This is the query that when executed in Management Studio takes 25ms:
At first I thought that Management Studio somehow caches results, but even if I change the parameters of the stored procedure, it always takes less than 30 ms to execute there. I really don't understand this. Please help!
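A pattern often suggested for this SSMS-fast/app-slow symptom (an assumption about the cause, not something the post confirms) is a parameter-sniffed plan combined with differing session SET options: SSMS defaults ARITHABORT ON while SqlClient leaves it OFF, so the two callers compile and cache separate plans. A hedged workaround sketch, with hypothetical object names:

-- Recompile the statement per call so a plan sniffed for one
-- parameter range is not reused for a very different one.
CREATE PROCEDURE dbo.GetTimeSeries
    @from DATETIME,
    @to   DATETIME
AS
    SELECT [time], [value]
    FROM dbo.BigTable
    WHERE [time] BETWEEN @from AND @to
    OPTION (RECOMPILE)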
Hi All,

I have a table that currently contains approx. 8 million records. I'm running a SELECT query against this table that in some circumstances is either very quick (i.e. results returned in Query Analyzer almost instantaneously), or very slow (i.e. 30 to 40 seconds to return results), and I'm trying to work out how to improve performance. Essentially the query I'm running is nothing more complex than:

SELECT TOP 1 * FROM Table1 WHERE tier=n ORDER BY member_id

[tier] is a smallint column with a non-clustered, non-unique index on it. [member_id] is a numeric column with a clustered, unique index on it.

When I supply a [tier] value of 1, it returns results instantaneously. I have no idea if this is meaningful, but the tier = 1 records were loaded first into the table, and comprise approximately 5 million records. When I supply a [tier] value of 2, the results take 30 to 40 seconds. tier = 2 records were loaded second, and comprise approximately 3 million records.

I've tried running an execution plan, and while I'm no expert, it appears to me that the index on tier isn't being used, even if I use tier = CAST(2 AS SMALLINT). I'm wondering if anyone can give me ANY advice on how to get any better performance out of this SELECT statement? Also, out of curiosity, can a disk defragment have a positive impact on SELECT query performance?

Any help very much appreciated!

Much warmth,
Murray
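A hedged sketch of one experiment (the index name below is hypothetical): because the clustered key is member_id, the non-clustered index on tier stores its entries for any one tier value in member_id order, so forcing that index should let TOP 1 stop at the first matching row instead of scanning the clustered index:

SELECT TOP 1 *
FROM Table1 WITH (INDEX(ix_table1_tier))   -- hypothetical index name
WHERE tier = 2
ORDER BY member_id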
I have some VB.NET code that starts a transaction and after that executes a lot of queries one by one. Somehow, when I take out the transaction part, my queries get executed in around 10 minutes. With the transaction in place, it takes more than 30 minutes on one query and then I get a timeout. I have checked sp_lock for my process id and I've noticed there are a lot of exclusive locks on different objects. Using sp_who I could not see any deadlocks. I even tried to set the isolation level to READ UNCOMMITTED and still have the same problem. As I said, once I execute my queries without being in a transaction, everything works great. Can you help me find out the problem?
Hey. I've got a problem and I think I know the answer, but I still want to confirm. We are using SQL 2000 and SSRS 2000. The problem is, we have custom reports which a customer can build and run, and I wonder how one can write sp's for that. The way it's written right now is a dynamic SELECT clause, then a dynamic FROM, a dynamic WHERE, and a dynamic GROUP BY, all appended together and run via an EXECUTE command. I know it's dynamic SQL, and execution plans and such will hurt me, but some of these reports take forever. Is there anything that can be done to speed these reports up? And if the SELECT will be dynamic and the WHERE will be dynamic, does it make sense to even use an sp? Is it ever going to reuse the same execution plan? When I run DBCC MEMORYSTATUS, the procedure cache takes up most of the memory. Does the use of dynamic SQL explain that?
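One commonly suggested mitigation (a sketch, assuming the report's shape varies less than its filter values) is to keep the dynamic SQL but pass the values through sp_executesql, so runs that share the same statement text can reuse one cached plan instead of compiling a new ad-hoc statement each time:

-- Dynamic shape, parameterized values (all object names illustrative).
DECLARE @selectList NVARCHAR(500), @fromDate DATETIME, @toDate DATETIME
SET @selectList = N'CustomerID, COUNT(*) AS Orders'
SET @fromDate = '20000101'
SET @toDate   = '20000201'

DECLARE @sql NVARCHAR(4000)
SET @sql = N'SELECT ' + @selectList +
           N' FROM dbo.Orders' +
           N' WHERE OrderDate >= @from AND OrderDate < @to' +
           N' GROUP BY CustomerID'

-- Identical @sql text plus typed parameters let the plan be reused.
EXEC sp_executesql @sql,
     N'@from DATETIME, @to DATETIME',
     @from = @fromDate, @to = @toDate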
I created a package which passes some information (through parameters) to its child package.
I need to do some processing in the parent package based on the execution status of the child package, i.e.
if the child fails, one operation; and if the child succeeds, another operation.
To determine the execution status of the child package I am using two different precedence constraints: one constraint has the value "Success" and the other has the value "Failure".
My problem is that when the child package executes successfully, the constraint with value = "Success" works properly, but when the child fails, the constraint with value "Failure" does not work.
Dear friends,

I have a problem with a simple select statement and I don't know why it is happening. I have 2 tables, Fees and FeesDataRoles. Fees holds all the fees, and FeesDataRoles is a middle table between the Fees and Roles tables, so each fee can have multiple roles and a role can have many fees. Now I have a select statement:

SELECT *
FROM Fees
INNER JOIN FeesDataRoles ON Fees.FeeID = FeesDataRoles.FeeID
WHERE (FeesDataRoles.DataRoleID = @DataRoleID)
  AND (FeesDataRoles.RecordStatus = 1)
  AND (FeesDataRoles.ValidFrom >= GETDATE())
  AND (FeesDataRoles.ValidTo <= GETDATE() OR FeesDataRoles.ValidTo IS NULL)

It shouldn't take that long to execute this procedure, but surprisingly, sometimes when I insert a value into the table and then execute this stored procedure, it does not show the data just added. Very strange! I ran the procedure 5 times after inserting an item, and nearly 1 out of 5 does not return the right result (it does not include the recently inserted rows). Anyone have any idea?

I used the Tuning Advisor, no suggestion. I changed the clustered index in FeesDataRoles from FeesDataRoleID (the primary key of the table) to DataRoleID to increase performance; still it happens sometimes. Is my WHERE clause so costly that it causes this problem?

Please help. I really appreciate your help.

Regards,
Mehdi
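One thing worth double-checking (an observation about the posted predicate, not a confirmed diagnosis): as written, ValidFrom >= GETDATE() and ValidTo <= GETDATE() can only both hold for rows whose window is inverted or whose ValidTo is NULL, so a row inserted with ValidFrom set to the insert time drops out as soon as the clock ticks past it, which would explain intermittently missing fresh rows. If the intent is "fees valid right now", the usual window test looks like this:

SELECT *
FROM Fees
INNER JOIN FeesDataRoles ON Fees.FeeID = FeesDataRoles.FeeID
WHERE FeesDataRoles.DataRoleID = @DataRoleID
  AND FeesDataRoles.RecordStatus = 1
  AND FeesDataRoles.ValidFrom <= GETDATE()   -- window has started
  AND (FeesDataRoles.ValidTo >= GETDATE()    -- and has not yet ended
       OR FeesDataRoles.ValidTo IS NULL)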
Hi, I have created an SSIS master package. It contains two child packages. The master package runs fine, but when I call the master package from a SQL Server Agent job, the following error comes up:
An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'test1'."
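A common cause (an assumption; the post only shows the error) is that the Agent job step runs under the Agent service account instead of the account the package was tested with, or, since 'test1' looks like a SQL login stored in a connection string, that the package's ProtectionLevel (e.g. EncryptSensitiveWithUserKey) stripped the password when the job runs as a different user. For the first case, a hedged SQL 2005 sketch is to run the step under a proxy (all names and the identity are illustrative):

-- Create a credential and an SSIS proxy so the job step can run
-- under a specific Windows account.
CREATE CREDENTIAL ETLCredential
    WITH IDENTITY = 'DOMAIN\etl_user', SECRET = 'etl_user_password'

EXEC msdb.dbo.sp_add_proxy
    @proxy_name = 'SSISProxy', @credential_name = 'ETLCredential'
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = 'SSISProxy', @subsystem_name = 'SSIS'
-- Then pick SSISProxy as "Run as" on the job step.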
SSIS packages are running slow on startup in our 64-bit production environment; it looks like it is taking a lot of time (around 2 minutes) to validate each package. DelayValidation is false (the default); changing it to true didn't help. Any fix for it?
We execute the packages using the dtexec command.
We have job packages which load tens of child packages in sequence using the Execute Package Task, taking hours to complete. The same packages are working fine in the 64-bit dev environment.
I have already created a package which loads a text file to a database. I want to execute that package based on the availability of the file, using Visual Basic, Perl, or VBScript, whichever is easier. Please advise. Thanks.
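If staying inside SQL Server is acceptable (an alternative to the scripting languages asked about, using the undocumented xp_fileexist procedure), a hedged sketch is an Agent job step that polls for the file and starts the load job; the path and job name are made up:

-- Poll for the feed file; if present, kick off the job that runs the package.
DECLARE @exists INT
EXEC master.dbo.xp_fileexist 'C:\feeds\input.txt', @exists OUTPUT
IF @exists = 1
    EXEC msdb.dbo.sp_start_job @job_name = 'LoadTextFile'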
I wrote a stored procedure that calls a DTS package that imports a file into a SQL Server table, then executes some commands on that table. How can I tell that the DTS package has completed, so that I can continue with the next command? Here is my code:
-- Run the DTS package
EXEC @Result = msdb..sp_start_job @job_name = 'MyJob'
WAITFOR DELAY '00:00:02'

-- Call the table that the package created
SELECT * FROM <imported_table> WHERE ...
I had to insert a WAITFOR statement to make this work. Any other suggestions as to how to know when to continue?
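Since sp_start_job returns as soon as the job is queued, a fixed WAITFOR is fragile. A hedged sketch of a polling alternative (it assumes the job is named 'MyJob' and writes at most one outcome row per day; tighten the run_date/run_time check otherwise):

-- Start the job, then poll msdb job history until the job-outcome row
-- (step_id = 0) for today appears.
DECLARE @job_id UNIQUEIDENTIFIER
SELECT @job_id = job_id FROM msdb.dbo.sysjobs WHERE name = 'MyJob'

EXEC msdb..sp_start_job @job_name = 'MyJob'

WHILE NOT EXISTS (
    SELECT 1
    FROM msdb.dbo.sysjobhistory
    WHERE job_id = @job_id
      AND step_id = 0   -- outcome row, written only when the job finishes
      AND run_date >= CONVERT(INT, CONVERT(CHAR(8), GETDATE(), 112)))
BEGIN
    WAITFOR DELAY '00:00:02'
END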
Is there a method to either: A) kick off a DTS package directly from a stored procedure without shelling out to the DTSRun facility, or B) kick off a scheduled job containing a DTS task from a stored procedure?
I need help from you. I am working on SSIS packages for ETL purposes. The version of SQL Server I am using is SQL Server 2005.
In brief, the current ETL works as follows.
In the ODS database I have 2 tables, i.e. Table_A & Table_B, which get loaded from another 2 staging tables, A & B. Using the data in these 2 tables, a target table, i.e. Trg_A, is then loaded.
The ETL packages are executed by stored procedures, by creating a job within the stored procedure.
The loading of the trg table is a little tricky. Before that, the loading of Table_A is implemented in a single SSIS package, and the loading of Table_B is implemented in another SSIS package.
In the trg table there are two columns which get updated as and when each table is loaded. So, for the first time, if I run the package which is responsible for loading Table_A, it loads values into Table_A and, once done, updates one column (col1) in the target table.
Once the execution of Package1 is complete, I kick off the second SSIS package, which loads the data into Table_B and updates the trg table's other column (col2).
Now the actual problem I am facing is this:
For loading Table_A and updating col1 in the Trg table, I will be receiving more than 5 Excel files every month, on a weekly basis. I cannot even gather all the files and run them using a For-Loop counter, so presently I am loading one Excel file per week.
Similarly for the loading of Table_B.
If, for a given week, I execute both packages (the one which loads Table_A and updates Trg(col1), and the one which loads Table_B and updates Trg(col2)), then I get a deadlock error and the entire ETL gets messed up.
Now my requirement is: even though the 2 packages are started in parallel, there can be a few milliseconds' difference between their start times in the Job Monitor. I need to implement a queuing mechanism which takes care of running the packages in a sequential manner rather than in parallel, i.e. I need to ensure only one SSIS package is running in the Job Monitor. Only after the successful execution of one package should the second package start its execution.
If we can implement such a queuing mechanism, then my problem is solved.
I need some suggestions on implementing this queuing mechanism in a programmatic approach using the SQL Server job-related metadata tables, or alternatively: is there a server parameter or initialization parameter which can be set at the database level that satisfies my requirement?
Any suggestions would be greatly appreciated. Looking for sincere comments in this regard.
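One way to get that serialization without touching the Agent metadata tables (a sketch, assuming both packages can run Execute SQL Tasks against the same server) is an application lock: each package takes the same named lock before touching Trg_A, so whichever package starts second simply waits.

-- Run at the start of each package (e.g., in an Execute SQL Task).
-- The resource name is arbitrary but must match in both packages.
DECLARE @rc INT
EXEC @rc = sp_getapplock
        @Resource    = 'ETL_Trg_A_Load',
        @LockMode    = 'Exclusive',
        @LockOwner   = 'Session',
        @LockTimeout = 600000          -- wait up to 10 minutes
IF @rc < 0
    RAISERROR('Could not acquire the ETL lock', 16, 1)

-- ... load Table_A or Table_B and update Trg_A here ...

EXEC sp_releaseapplock @Resource = 'ETL_Trg_A_Load', @LockOwner = 'Session'

Note that the acquire and release must run on the same connection, so if they live in separate Execute SQL Tasks, set RetainSameConnection = True on the connection manager.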
I have a master package which calls two packages one after another.
1) I will be passing parameters to the master package using dtexec with the /Set option to set variables in the master package. I would like to know how the child packages can access these variables.
I have a SSIS Project thats been running fine for months up until yesterday.
There's a master package that calls other packages, and when I run it now in Visual Studio, I get a message box after each of its child packages runs. The message states:
TITLE: Microsoft Visual Studio
------------------------------
The designer window cannot be closed while a package is running.
Stop the debugger before attempting to close the window.
------------------------------
BUTTONS: OK
------------------------------
This seems to be causing issues, because VS seems to hang on my machine at some point when running master packages, and I think it might be due to the message box.
I have a package that reads the contents of 11 Excel files into various tables. Opening this package in the designer, or with DTExecUI, is extremely slow. In both cases, when I open the package it takes over 10 minutes to do anything. Visual Studio gives the "Visual Studio is Busy" message for 10 minutes, and DTExecUI just hangs. DTExecUI actually hangs twice, once when opening the package and a second time when clicking "Execute" (totalling over 20 minutes). It seems like no matter how I try to run the package, it will always hang for 10 minutes before running, with no status message or anything. Once it runs, it completes quickly with no errors.
The various tasks in the package are fairly simple, most being Source > Data Conversion > Destination.
I have a package that has 12 data pump tasks all executing in parallel.
It is transferring raw data from an AS400 DW to a MSSQLSvr Staging area.
Each pump task, on completion, assigns values to a set of global variables, then passes these as parameters to a sproc which inserts them into a table.
This seems to work for 4 or 5 of the pump tasks, but the rest of the rows in the table are all the same, because the remaining pump tasks all execute before the sprocs.
Is there a way to make sure that an entire set of job steps completes before starting another set of steps, while still keeping them running in parallel?
I had wondered if there was a way to use the PumpComplete phase of each pump step to fire off the sproc, but I can't see how you execute the step.
I want to finish the execution of a DTS package from an ActiveX task. If a condition is met, the package should continue as normal. If not, I want to finish the package without any error, just without executing the next tasks. Do you have any idea?
I need a user to pass an input to a package from a VB form, and then want to show progress/errors in the form.
I have been able to use Application.LoadFromSqlServer, and I then set pkg.Variables("MyVariable").Value to the required value before calling Execute.
Works fine so far.
I don't know how to show progress. The DTSExecResult just returns a cryptic Success/Failure status message.
Is there a way to log the errors, if any, to my Windows form? I have seen samples for console apps (see below), but I would prefer to show the errors in my Windows form, in a text box or something, appending each error result to the text.
TIA
Kartik
Code Snippet
Class EventListener
    Inherits DefaultEvents

    ' Illustrative addition: buffer errors so the form can display them.
    Public Errors As New System.Text.StringBuilder()

    Public Overrides Function OnError(ByVal source As Microsoft.SqlServer.Dts.Runtime.DtsObject, _
            ByVal errorCode As Integer, ByVal subComponent As String, ByVal description As String, _
            ByVal helpFile As String, ByVal helpContext As Integer, _
            ByVal idofInterfaceWithError As String) As Boolean
        Errors.AppendLine(String.Format("Error in {0}/{1}: {2}", source, subComponent, description))
        Return False ' as in the SDK console sample
    End Function
End Class
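A hedged usage note (assuming the five-argument Execute overload of Microsoft.SqlServer.Dts.Runtime.Package): pass the listener as the events argument, e.g. pkg.Execute(Nothing, Nothing, listener, Nothing, Nothing), and copy listener.Errors.ToString() into your TextBox when Execute returns. OnError fires on the executing thread, so if you want live updates rather than a post-run dump, marshal each append to the UI thread with Control.Invoke.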
I have basic knowledge of SSIS and would appreciate it if someone could suggest a way (with an example if possible) to solve this problem.
I have an SSIS package that requires execution of another package, let's say B. This I am able to do using the 'Execute Package' task. Now, what I really want to do is check whether a particular table 'A' has rows that satisfy a given condition. In other words, I wish to execute package 'B' only if table 'A' has rows. How can I do this?
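A minimal sketch of the usual pattern (variable, table, and column names are illustrative): put an Execute SQL Task before the Execute Package Task, store the count in an SSIS variable (set ResultSet to Single row and map the column to, say, User::RowCnt), and put an expression such as @[User::RowCnt] > 0 on the precedence constraint between the two tasks:

-- Query for the Execute SQL Task; map column RowCnt to User::RowCnt.
SELECT COUNT(*) AS RowCnt
FROM dbo.TableA
WHERE SomeColumn = 'SomeValue'   -- your condition here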