Need your advice on this. I have a DTS package which checks two dates and executes tasks when the dates do not match. The problem I am facing now is that I cannot make the next step start only after the previous step has completed. When the DTS package is executed, all the steps complete almost at the same time. See the attached DTS package below.
In the diagram I have labelled five steps, A ~ E; each step needs the finished product of the previous step to produce a correct result in its own step. I can't schedule each step to run at a different time, because the DTS package kicks off based on a file that comes in, and each step doesn't have a fixed processing time.
I have tried using both On Success and On Completion, and both options start the next step immediately rather than waiting for the previous job to complete or succeed. I guess this is because the work is handed off to an external command, so the step finishes as soon as the command is launched. Is there a way to add some delay between the tasks?
Please advise. Thank you.
Each of the steps has something like the code below (refreshing an Excel file with a built-in macro). I cannot build all the macros into one file and run them from a single main Excel workbook:
DECLARE @MainUpdate datetime, @TempUpdate datetime   -- assuming the compared columns are datetime

select @MainUpdate = Main_Update_CET from APMEAPV_Compare
select @TempUpdate = Temp_Update_CET from APMEAPV_Compare
--select @MainUpdate, @TempUpdate

if @MainUpdate <> @TempUpdate
begin
    DECLARE @commandK varchar(1000)
    SET @commandK = 'Start Excel.exe "D:\Daily_Status_Report_EDWH\EDWH_Runbook_BTS.xls"'
    exec master..xp_cmdshell @commandK, No_Output
end
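A hedged aside for anyone hitting the same thing: cmd's Start launches Excel asynchronously, so xp_cmdshell returns as soon as Excel is spawned, which would explain all steps "finishing" at once. A minimal T-SQL sketch of two alternatives (the path is reused from the post above; the five-minute delay is an arbitrary example):

-- 1) Drop the asynchronous "Start" so xp_cmdshell blocks until Excel exits:
DECLARE @commandK varchar(1000)
SET @commandK = 'Excel.exe "D:\Daily_Status_Report_EDWH\EDWH_Runbook_BTS.xls"'
exec master..xp_cmdshell @commandK, No_Output

-- 2) Or put a fixed pause between tasks:
WAITFOR DELAY '00:05:00'   -- five minutes; tune to the slowest step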
I am getting an error in replication between SQL Server 2005 and SQL Express when "Connecting to Subscriber". The detailed error message is given below. Do we need to increase the logintimeout for "Connecting to Subscriber"? How can we increase it?
Message
2007-10-15 06:37:58.398 Startup Delay: 8503 (msecs)
2007-10-15 06:38:06.898 Connecting to Distributor 'ACR-MANGO'
2007-10-15 06:38:06.976 Initializing
2007-10-15 06:38:06.976 Parameter values obtained from agent profile: -bcpbatchsize 2147473647 -commitbatchsize 100 -commitbatchthreshold 1000 -historyverboselevel 1 -keepalivemessageinterval 300 -logintimeout 15 -maxbcpthreads 1 -maxdeliveredtransactions 0 -pollinginterval 5000 -querytimeout 1800 -skiperrors -transactionsperhistory 100
2007-10-15 06:38:06.991 Connecting to Subscriber 'ACR-ANJILI\SQLEXPRESS'
2007-10-15 06:38:46.133 Agent message code 20084. The process could not connect to Subscriber 'ACR-ANJILI\SQLEXPRESS'.
2007-10-15 06:38:46.148 Category: NULL Source: Microsoft SQL Native Client Number: 08001 Message: Unable to complete login process due to delay in opening server connection
2007-10-15 06:38:46.148 The agent failed with a 'Retry' status. Try to run the agent at a later time.
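On the "how can we increase it" part, a hedged sketch: the agent reads -logintimeout from its profile, so one option is a custom agent profile in Replication Monitor; another is appending the switch to the agent's job step command on the distributor. Everything in angle brackets below is a placeholder, and the step's existing switches must be preserved:

-- Raise the login timeout on the replication agent's job step:
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'<distribution agent job name>',
    @step_id  = 2,
    @command  = N'<existing agent command line> -LoginTimeout 60';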
I am currently experiencing a 30 second delay when starting an SSIS package from a query window or stored procedure in SQL 2005 Management Studio, using xp_cmdshell and dtexec.
When I run the package in BI Dev the execution results state an elapsed time of 4.82 sec; at a command prompt using dtexec the elapsed time is 3.48 sec; from MStudio the elapsed time is 33.86 sec. This test was run using the same configuration and databases. For the MStudio run, if I look at the DTS log file I'm creating or the PC Application log, it states the package doesn't actually start until 31 sec after the execute button is pressed. I've tried executing the package as both a SQL package and a file package without any difference in elapsed times. I have also set DelayValidation = True for every Task, ConnectionManager and the package itself.
When I look at the package log, one difference I see is that Management Studio executes using 'NT AUTHORITY\SYSTEM', while BI Dev and the cmd prompt use the local user '[Server]\Administrator', which in this case is the administrator. From this I have to believe it is some kind of user rights problem. I think SQL or the OS is waiting for something, and after it times out at 30 sec it allows the package to run. If this is the case I'm not sure what it might be or how to find it.
I also tried making an xp_cmdshell_proxy_account with admin rights but this didn't seem to work either. I've included the query code below. Any ideas, help or solutions are greatly appreciated.
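For reference (the poster's actual query wasn't reproduced here), the usual proxy-account setup looks like the hedged sketch below; the account name and password are placeholders:

-- Create the proxy credential that xp_cmdshell uses for non-sysadmin callers:
EXEC sp_xp_cmdshell_proxy_account 'MYSERVER\Administrator', 'StrongP@ssw0rd';

-- To remove it again:
-- EXEC sp_xp_cmdshell_proxy_account NULL;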
All my SSIS packages consistently take 1 minute to start when they are scheduled via the SQL Agent.
For example, a package runs in 20 seconds in the BI Studio. I transfer it to the SQL server and store it in MSDB. I then run it manually from SSMS - Integration Services and it takes 20 seconds. I then schedule it via a job in the SQL Agent and it takes 1 minute 20 seconds.
I can see from some simple logging that there is consistently a delay of 1 minute between the job starting and the package starting.
I have also switched every occurrence possible of DelayValidation to TRUE in all my packages and tasks. All this did was reduce the package run time from 27 seconds to 20 - the 1 minute delay still exists.
This happens on all my packages on all my servers. Any ideas.....?
I wonder if anybody can shed any light on this problem. I have a SQL Agent job which has three steps, each step runs an SSIS package.
The job is scheduled to start at 11.00 pm, which it does successfully. However, it has been taking between 2 and 3 hours to run, which is way longer than it should.
When I've looked at the logging, I've found that although the job starts at 11.00 pm, the first package (in job step 1) does not start executing until about 11.30. It finishes in about 5 minutes, then there is about an hour's delay before the second package (in job step 2) starts. This finishes in about 10 minutes, then there is another hour's delay before the third package (in job step 3) starts.
I've tried configuring the steps as SSIS jobs, and also as cmd jobs using dtexec; both exhibit the same behaviour.
Any ideas about what could be causing this delay? The packages are stored in msdb on the same server as the SQL Agent job, if that makes any difference.
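One hedged way to pin down where the hours go is to read the step timings straight from the Agent history tables (the job name is a placeholder):

-- run_duration is encoded as HHMMSS; step_id 0 is the overall job outcome row
SELECT j.name, h.step_id, h.step_name, h.run_date, h.run_time, h.run_duration
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = N'Nightly ETL'
ORDER BY h.run_date DESC, h.run_time DESC;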
I receive this error message occasionally in my ETL - usually when trying to pull data from our source system:
An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Login timeout expired".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Unable to complete login process due to delay in login response".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: Timeout error [258].".
There is a setting in the connection manager, under the All tab, Initialization, Connect Timeout. It is set to zero, which is the default; judging by the timestamps in my error log, the effective timeout is 15 seconds. If I change this to a higher value (60 seconds or so), should this fix the error?
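Raising it is worth a try. For reference, a hedged example of the same setting expressed as a connection string keyword (server and database names are placeholders):

Data Source=MYSERVER;Initial Catalog=SourceDB;Provider=SQLNCLI.1;Integrated Security=SSPI;Connect Timeout=60;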
There is a SQL Agent job that starts a package (from the file system, using a cmd command). Usually the job takes 8-10 minutes, but sometimes it gets stuck for a long time (1+ hour).
The dtexec process can be found with procmon, but it seems to be doing nothing at all (and the package is not logging the start of execution to file). After the long wait it just runs the package quickly.
I've moved the package to the SSIS catalog to try to get more detailed logging, but with no luck.
The job starts at 1 PM, but package execution starts at 1:49 PM, without any messages about the gap in the SSISDB log.
At first I thought it might be a long-validation problem, but when the package does execute, the validation messages are there and they run quickly.
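Since the package is now in the SSIS catalog, a hedged check is to compare when each execution was created with when it actually started, using the standard catalog views:

-- A long gap between created_time and start_time points at the pre-execution
-- phase rather than package validation:
SELECT e.execution_id, e.package_name, e.created_time, e.start_time, e.end_time
FROM SSISDB.catalog.executions AS e
ORDER BY e.execution_id DESC;

-- Any messages logged inside the gap:
SELECT m.operation_id, m.message_time, m.message
FROM SSISDB.catalog.operation_messages AS m
ORDER BY m.message_time DESC;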
Hi, in my application I have two packages, a parent package and a child package. The parent package executes the child package using an Execute Package Task. The "Execute Out Of Process" property of the Execute Package Task is set to TRUE, meaning the child package runs in a separate process, not in the parent package's process. This was working fine, but at one particular client location it is failing with the error "not able to load child package". To me it seems some setting on the server is restricting the creation of a separate process for child package execution. When the "Execute Out Of Process" property of the Execute Package Task is set to FALSE, it works fine.
Can anyone help with what could cause this failure when the property is set to TRUE?
I expect to create a trigger to post updated data from GoldMine, hosted in MS-SQL, to the appropriate tables in my migration MS-SQL database, mirroring the destination PICK data tables.
Then, start an ActiveX DTS package to migrate the data via a PICK DSN to data tables in a PICK database.
Currently the dba has been able to use VB6.0 with ADO to push data into PICK. He was also able to do something similar using MS-Access.
However, PICK (RainingData) is of the opinion that he must write a PICK server-side Basic (RealBasic) insert script to receive the data from a VB6.0 application triggered by MS-SQL.
I think that I could skip the Basic script and go direct with ADO in DTS, as he has done before with VB6.0 and a user form.
Can I have the trigger start the DTS or should I just schedule it to run as often as necessary to update the PICK database?
FYI, this is a one-way data flow into PICK.
TIA
Anyone within the L.A., CA area who has experience with PICK and MS-SQL can get some well-paid consulting hours. I'm just the GoldMine GMT who's been enlisted to get the job done, but I would appreciate an expert with PICK joining the project.
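On the trigger-versus-schedule question, both are possible; a hedged sketch of the trigger approach is below (all object, server and package names are placeholders). Because xp_cmdshell blocks inside the trigger's transaction, a scheduled package that polls a staging table is usually the safer choice for anything but trivial volumes.

-- Launch a DTS package whenever the mirrored staging table changes
-- (dtsrun switches: /S server, /N package name, /E trusted connection):
CREATE TRIGGER trg_PushToPICK ON dbo.GoldMine_Stage
AFTER INSERT, UPDATE
AS
BEGIN
    EXEC master..xp_cmdshell 'dtsrun /S MYSERVER /N MigrateToPICK /E', NO_OUTPUT
END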
I am unable to call a package with a cube processing task... it will not execute. I have even tried simply calling a package to process FoodMart on my own machine, and it will not run. The same package executes fine when run manually.
My SSIS solution has about a hundred packages, and from time to time I have to edit a package. I understand I could use the 'Build' command to compile only the updated package, as opposed to 'Rebuild', which recompiles all of the packages.
Nevertheless, in both cases SSIS opens all of the packages in the design environment before compilation. My packages are saved in SourceSafe, and that process takes quite a long time. I was wondering whether there is any way to compile only the updated package without any of the other packages being opened during the Build/Rebuild process? For example, we can use dtutil to deploy only updated packages without running the Package Installation Wizard.
I have several DTS packages that take data from SQL Server and export it to an Access DB located on the network. Basically, one Execute Process Task within my DTS package takes the Access DB and zips it, while another such task copies the zipped DB and pastes it to another location on the network. All this works fine.
This export process happens once a month, so each month I have to manually add a datestamp to the end of the Access DB file that's being exported, to distinguish it from the prior month's export. For example, this month's export file would be named AccessDB_20040727.mdb and next month's would be AccessDB_20040820.mdb (the date is determined by the date the output is exported). AccessDB.mdb is the default name of the export DB, and the datestamp is added to the end of the file name depending on the date the export was run. As I said, I can do this manually each month and it works fine.
I want to know, however, if there is a way to automatically supply the datestamp to the Execute Process Task's Parameters text box. The following is what I have right now in the Parameters box:
\\ghf1-ndc8-sql\l$\production\output\lob\200406\LOB_CF_forDrilldown_All_20040727.zip \\ghf1-ndc8-sql\l$\production\output\lob\200406\LOB_CF_forDrilldown_All.mdb
I want to take the above text in the Parameters box and replace it with something like:
\\ghf1-ndc8-sql\l$\production\output\lob\200406\FileName_RunDate.zip \\ghf1-ndc8-sql\l$\production\output\lob\200406\LOB_CF_forDrilldown_All.mdb
Where FileName_RunDate is a variable/placeholder for the name of the output, stamped with the date the output is exported on.
There are three different Execute Process Tasks within each of my DTS packages, so it's a time-consuming job to have to manually add a datestamp to each package every month when the data is exported.
Does anyone know if what I am asking is doable? Can I use a variable in the Parameters box of each Execute Process Task's properties and supply the current datestamp value to it prior to executing the package each month? If so, what are the ways? How would I do it?
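It is doable. As a hedged sketch, the stamp itself is easy to build in T-SQL; inside DTS the value would then typically be pushed into the task via a Dynamic Properties Task or an ActiveX script that assigns the Execute Process Task's ProcessCommandLine property (the file name below is illustrative):

-- Build this month's stamped file name (style 112 gives YYYYMMDD):
DECLARE @stamp char(8), @zipname varchar(200)
SET @stamp   = CONVERT(char(8), GETDATE(), 112)            -- e.g. 20040727
SET @zipname = 'LOB_CF_forDrilldown_All_' + @stamp + '.zip'
SELECT @zipname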
When I start creating a new DTS package and I choose the Analysis Services Processing Task icon, I only have the option of working with the local Microsoft Analysis Server. How can I choose a remote server's Analysis Server in order to process its database? Thank you.
Trying to run an SSIS package from a SQL job; the package itself has a step that calls a process task that runs a batch file. The syntax I have in the process task is the following:
Executable: c:\windows\system32\cmd.exe
Arguments: /C e:\SungardPTA\encryptfile.bat
Working Directory: e:\sungardpta
I keep getting the following in my log:
PackageStart,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:11 PM,1/29/2008 1:59:11 PM,0,0x,Beginning of package execution.
OnError,MIMKEIMC11N,MI\trustserviceadmin,EncryptFiles,{FCF5B653-CC05-4183-981B-F5EF4906DD09},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,-1073573551,0x,In Executing "c:\windows\system32\cmd.exe" "/C e:\SungardPTA\encryptfile.bat" at "e:\sungardpta", The process exit code was "1" while the expected was "0".
OnError,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,-1073573551,0x,In Executing "c:\windows\system32\cmd.exe" "/C e:\SungardPTA\encryptfile.bat" at "e:\sungardpta", The process exit code was "1" while the expected was "0".
OnTaskFailed,MIMKEIMC11N,MI\trustserviceadmin,EncryptFiles,{FCF5B653-CC05-4183-981B-F5EF4906DD09},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,0,0x,(null)
PackageEnd,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,1,0x,End of package execution.
Has anyone ever used an Execute Package Task to call a child package, with the Execute Package Task's ExecuteOutOfProcess = True? Unless the account it runs under is an Administrator on the box, it fails for me with: Error 0x80070005 while loading package file "C:\Program Files\Microsoft SQL Server\90\DTS\Packages\ETL\Fact_SalesTransaction_Tracking.dtsx". Access is denied.
This is eating up hours and hours of my time, time we can't afford. Is anyone able to successfully call a child package out of process?
I'm trying to run a task that executes a script file (cmd). When I run it within BIDS with my own user (a domain admin) it works. When I start a cmd prompt and try to run the cmd file directly from the network location where it sits, it works too (both with my own rights and with the SQL Server Agent user).
Now when I try to run it from SSMS > Agent Jobs > Job and run the job, it never completes. I'm not getting any error message either; it just keeps on running at that step. It seems like a rights issue, but the account running the SQL Server Agent is able to execute the cmd file directly from the command prompt.
There are no errors in any error logs anywhere and no error is displayed...
PS. I'm running the job step as an Integration Services package.
I have a master package, which executes child packages that are located on a SQL Server. The child packages execute other child packages, which are also located on the SQL Server.
Everything works fine when I execute in process. But when I set the parameter in the master package's Execute Package Task to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
I was trying to extract data from the source server using an OLE DB Source and a SQL Server Destination when I encountered this error:
"Transaction (Process ID 135) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
What must be done so that even if the table being queried is locked, I won't experience any deadlock?
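A couple of hedged options, depending on whether dirty reads are acceptable for the extract (table, column and database names are placeholders):

-- Read past locks, accepting uncommitted rows:
SELECT col1, col2
FROM dbo.SourceTable WITH (NOLOCK);

-- Or let readers see the last committed version instead of blocking
-- (a database-level change; test before enabling in production):
ALTER DATABASE SourceDB SET READ_COMMITTED_SNAPSHOT ON;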
Hello all, I am running into an interesting scenario on my desktop. I'm running developer edition on Windows XP Professional (9.00.3042.00 SP2 Developer Edition). OS is autopatched via corporate policy and I saw some patches go in last week. This machine is also a hand-me-down so I don't have a clean install of the databases on the machine but I am local admin.
So, starting last week after a forced remote reboot (also a policy), I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. Friday, however, I know I shut my machine down nicely, and this morning when I booted up I was in the same state I was in last Wednesday. 7 of the 18 databases on my machine came up with:
FCB::Open: Operating system error 32(The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation.
and it also logs
FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32(The process cannot access the file because it is being used by another process.).
I've caught references to the auto close feature being a possible culprit; no dice, as the databases in question have it set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone. As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing's set to read only or archive, as I've caught on other forums as possible gremlins. I have sufficient disk space and the databases are set for unrestricted growth.
Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING it would make more sense to me than the hit-or-miss type of thing I'm experiencing now.
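One hedged place to start after a reboot, before going file by file, is checking exactly which databases failed to come online:

-- Databases that are not ONLINE after startup (SQL 2005 catalog view):
SELECT name, state_desc, user_access_desc
FROM sys.databases
WHERE state_desc <> 'ONLINE';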
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.
What I'm trying to do is convert this cmd, which works, into an Execute Process Task: D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log - the above DOS cmd works 100%.
However, when I use the Execute Process Task I get this error: [Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".
There are two package variables: User::gsPreplogInput = ex.log and User::gsPreplogOutput = out.log.
How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I tried placing the entire command in a batch file and specifying the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried specifying the exe directly in the Executable property, with the arguments being the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file, when run from the command line, works perfectly and unzips the file, so there is absolutely no problem with the command. I believe it is just the setup of the variables in the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
I am designing a utility which will keep two similar databases in sync. In other words, copying the new data from db1 to db2 and updating the old data from db1 to db2.
For this I am making use of the 'tablediff' utility, which, when provided with the server name, database and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task and the process parameters I am providing are:
The customer.bat file will have the following code: tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
The .sql file will be generated at: C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: the Execute Process Task is working fine, i.e., the tables are being compared correctly and the .sql file is being generated as desired. But the task itself is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".
Some of you may suggest just setting ForceExecutionResult = Success (in fact, this is what I am doing now just to get the program working), but this is not what I want.
I'm pulling data from an Oracle db and loading it into MS-SQL 2008. For the data type checks during the load process, what are my options to ensure the data being processed won't fail? That is, I want to verify the data against the target types first, and if the format is valid, load the row into the destination table; otherwise mark it with an error flag and push it into an errors table - all this at the row level. One way I can think of is to load into a staging table, then get the source and destination tables' column data types, compare them and proceed.
Or should I just try loading the data directly, and if it fails, try troubleshooting (which could be a difficult task, as I wouldn't know what caused the error...)?
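A hedged sketch of the staging-table route on SQL 2008 (all names are placeholders; ISDATE/ISNUMERIC are approximate checks, not exact type validation):

-- Rows that convert cleanly go to the destination...
INSERT INTO dbo.Target (OrderDate, Amount)
SELECT CONVERT(datetime, s.OrderDate), CONVERT(decimal(18,2), s.Amount)
FROM dbo.Staging AS s
WHERE ISDATE(s.OrderDate) = 1 AND ISNUMERIC(s.Amount) = 1;

-- ...everything else is flagged into an errors table for review:
INSERT INTO dbo.LoadErrors (OrderDate, Amount, ErrorFlag)
SELECT s.OrderDate, s.Amount, 'bad type'
FROM dbo.Staging AS s
WHERE ISDATE(s.OrderDate) = 0 OR ISNUMERIC(s.Amount) = 0;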
Env: SQL Server 2000. It actually waited for 50 seconds instead of 5. Tried WAITFOR DELAY '000:00:005' and WAITFOR DELAY '000:00:5' respectively, and got the same behavior. Bug, or...? TIA
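For what it's worth, the documented time format is hh:mm:ss with two digits per part; the hedged sketch below sticks to that form, which avoids the odd parsing above:

WAITFOR DELAY '00:00:05'   -- five seconds, unambiguous two-digit form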
I am parsing a file, and along the flow I use a Conditional Split. One path of the split feeds the primary table (with IDENTITY values). The rest of the paths feed tables with a FOREIGN KEY to the primary table.
It seems that SSIS is trying to insert the rows at the same time (which makes sense), but this is causing a problem for the secondary tables and their FK constraints, since the primary table rows are not yet written.
Is there a way to delay the secondary tables until the primary table is done?
(I guess one way is to run through the file twice... once for the primary table and another for the rest but that seems wasteful to me...)
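Besides the two-pass approach, another hedged option is to leave the constraint off during the load and re-check it afterwards (constraint and table names are placeholders):

-- Before the data flow: stop enforcing the FK on the child table
ALTER TABLE dbo.ChildTable NOCHECK CONSTRAINT FK_Child_Parent;

-- ...run the package...

-- Afterwards: re-enable and validate the rows loaded in the meantime
ALTER TABLE dbo.ChildTable WITH CHECK CHECK CONSTRAINT FK_Child_Parent;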
I am using SQL 2005 SP1 and merge replication on a database. One of the tables is used for an audit trail and has a dynamic filter applied so that it doesn't replicate every audit trail record to every subscriber.
Our SPs tend to insert records into the audit trail table when someone inserts a new product (for example). The problem is that just recently the insert of new products has been taking more than 2 seconds, which is slow compared to how it was 2 months ago.
Using Profiler, I have found that it is the insert into the audit trail table that is taking all the time, and that it is taking so long because of something replication is doing. From Profiler I have found that the following statement is the culprit. It is something replication is doing, but why it takes so long I don't know:
select count(*)
from [dbo].[MSmerge_repl_view_000CC979122E4C88AF27FE08CDCC84EB_B5F96F71937D4D9A949DEECFE540D0C4] [AUDIT_TRAIL_DETAIL] with (rowlock)
where [RowGUID] in
    (select [AUDIT_TRAIL_DETAIL].[RowGUID]
     from inserted [AUDIT_TRAIL_HEADER],
          [dbo].[MSmerge_repl_view_000CC979122E4C88AF27FE08CDCC84EB_B5F96F71937D4D9A949DEECFE540D0C4] [AUDIT_TRAIL_DETAIL] with (rowlock)
     where (AUDIT_TRAIL_HEADER.ID = AUDIT_TRAIL_DETAIL.FKAuditTrailHeaderID))
The AUDIT_TRAIL_DETAIL table currently has 1.1 million rows in it.
Can anyone give me any clues as to what I should do to help improve the performance once again? Should I stop filtering on this table?
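Hard to say without the plan, but the culprit query joins on FKAuditTrailHeaderID and fetches RowGUID, so one hedged experiment is an index shaped for exactly that access path (the index name is arbitrary):

CREATE NONCLUSTERED INDEX IX_AuditDetail_FKHeader
ON dbo.AUDIT_TRAIL_DETAIL (FKAuditTrailHeaderID)
INCLUDE (RowGUID);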
It works well, but sometimes I get a long delay in connecting and reading data. Is there any way to solve the problem?
For more information: when it is working well, I can connect to the database and get all the information I need in 0.1 sec; when it is slow, the same action may take 20 sec.
I am having this table locking issue that I need to start paying attention to, as it's getting more frequent.
The problem is that the data in the tables is live finance data that needs to be changed and viewed almost in real time, so what I have picked up so far is that using table hints may not be a good idea.
I have a guy at work telling me that introducing a data access layer is the only way to solve this. I am not convinced, but I haven't enough knowledge to back my own feeling up (ASP system, not .NET).
Hi, I created a stored procedure that runs as a service in SQL Server, for as long as the server is up. It queries a table, does some work on the rows one by one, and after each row is done, deletes it from the table. If there is no data, it goes to 'sleep': waitfor delay ('00:00:03').
Does anyone know whether that WAITFOR command releases the CPU, or whether it just counts and uses the CPU? Eyal.
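WAITFOR DELAY suspends the session rather than spinning. A hedged way to see this for yourself on SQL 2005 and later is to watch the request while it sleeps:

-- The sleeping session reports a WAITFOR wait type and its cpu_time stays flat:
SELECT session_id, status, wait_type, cpu_time
FROM sys.dm_exec_requests
WHERE wait_type = 'WAITFOR';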
I've been following the newsgroups, and the consensus seemed to be that 7.0 would be released around November. However, I spoke to a Microsoft partner last week who told me the release date would be sometime in the second quarter of 1999. Does anyone know whether this is true or untrue?