I have been working with this for about a month now, with no similar problems to date. Today I am trying to introduce 4 configuration flags that control whether optional ETL stage feeds are executed. I did this by adding a do-nothing Script task; the precedence constraint leaving it checks the boolean variable flag. The first package executes fine, but control never returns from there. The precedence constraint has nothing fancy on it either. The package simply does not run anything further, make any more conditional checks, or run the common completion tasks. It just seems to think it is done.
The optional branches all fire Execute Package tasks. One thing that might be tripping it up is that I attempt to run one package twice, each time with a different parent package variable value so that it uses a different destination database for each run. Shouldn't that be OK to do?
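For reference, a minimal sketch of the gating setup described above (the variable name is hypothetical, not from the actual package):

    On the precedence constraint leaving the do-nothing task:
        Evaluation operation: Expression and Constraint
        Value:                Success
        Expression:           @[User::RunOptionalFeed1] == true

If the expression evaluates false, the downstream Execute Package task is skipped; note that with plain AND-ed constraints, a task fed only by constraints that never fire will not execute, and neither will anything downstream of it, which is one way common completion tasks can silently get skipped.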
I need to execute a DTS package that has a couple of steps, one of which is an Execute Process task that simply calls an .exe file I made that sends an email to warn the user.
What happens is that the Execute Process task launches my .exe file but doesn't wait for it to complete; it fires the next task right after and finally closes.
Is there any way to make a "while statement" that waits until my .exe application finishes?
I want to finish the execution of a DTS package from an ActiveX task. If a condition is OK, the package should continue as normal. If not, I want to finish the package without any error, just without executing the next tasks. Do you have any idea?
I searched, and all I found is questions, not answers.
Maybe it's a silly question, but I really can't find any documentation / posts about this.
I have a scheduled job in SQL Agent that executes an SSIS package every minute. As a result, the Application log in Event Viewer fills up very quickly.
I tried using the "/REPORTING E" option, with no luck.
I tried enabling logging on the package and then selecting only the OnError event; no luck.
No error, but export to Excel does not finish. When the report has 2 pages with 500 rows in total, exporting to Excel is not a problem. If it has 100 pages and 5,000 rows, exporting to Excel does not end: it returns no error, but the process does not finish either. What might the problem be?
I'm pulling my hair out. After several attempts I got sp_OAMethod to execute without error. Unfortunately the DTS package isn't executing. It also isn't returning any error. What could I be doing wrong? Any help would be appreciated. This is the code:

    DECLARE @hr int, @oPKG int, @src varchar(255), @desc varchar(255)

    EXEC @hr = sp_OACreate 'DTS.Package', @oPKG OUTPUT
    IF @hr <> 0
    BEGIN
        EXEC sp_OAGetErrorInfo @oPKG, @src OUT, @desc OUT
        SELECT hr = CONVERT(varbinary(4), @hr), Source = @src, Description = @desc
        RAISERROR (@desc, 16, 1)
        RETURN
    END

    EXEC @hr = sp_OAMethod @oPKG, 'LoadFromSQLServer', NULL,
        @ServerName = 'CAMDEV 0', @PackageName = 'TestPkg', @Flags = 256
    IF @hr <> 0
    BEGIN
        EXEC sp_OAGetErrorInfo @oPKG, @src OUT, @desc OUT
        SELECT hr = CONVERT(varbinary(4), @hr), Source = @src, Description = @desc
        RAISERROR (@desc, 16, 1)
        RETURN
    END

    -- Execute the pkg
    EXEC @hr = sp_OAMethod @oPKG, 'Execute'
    IF @hr <> 0
    BEGIN
        EXEC sp_OAGetErrorInfo @oPKG, @src OUT, @desc OUT
        SELECT hr = CONVERT(varbinary(4), @hr), Source = @src, Description = @desc
        PRINT @desc
        RAISERROR (@desc, 16, 1)
        RETURN
    END
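A hedged aside on the snippet above: with DTS OLE automation, the 'Execute' call can return 0 even when individual steps inside the package fail, because the HRESULT only reflects whether the COM call itself succeeded, so a clean @hr here doesn't prove the steps ran. The package object should also be released when done. A minimal sketch, reusing the variables declared above:

    -- Release the COM object created by sp_OACreate
    EXEC @hr = sp_OADestroy @oPKG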
I've installed SP2 on my server. The package worked on the server before the installation of SP2; now with SP2 it doesn't work anymore. In VS2005 on my computer the package works both before and after the installation of SP2.
It gives an error on the execution of a SQL task against an Oracle server:
Error: Executing the query "insert into cube_content values (trim(?), trim(?), trim(?), trim(?), trim(?), sysdate)" failed with the following error: "ORA-01401: inserted value too large for column". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have created a package that does a file search on an AS400 box using an ActiveX script and a UNC path. When I run it locally, it's fine. When I run it on the server, it fails. The login setup for the SQL Server Agent service and the job is the same, and both have admin rights.
In addition, I also have another package pointing to the same path, but its job is to create a text file at the UNC path. It works even when I schedule it.
Can someone please help me solve this problem?
Pump file data into SQL Server
Move file to "archive" directory (File System task)
Delete file (File System task)
End loop
Unmap Drive (batch file)
The map/unmap code is in a batch file:
Map: c:\windows\system32\net use \\10.10.10.10\ShareName MyPassword /USER:MyUserName /YES
Unmap: c:\windows\system32\net use \\10.10.10.10\ShareName /DELETE /YES
Here are the results when running this package:
1. Running in BIDS on a separate workstation: everything OK.
2. Running on the server by right-clicking the package in Integration Services (SSMS) and choosing "Run": everything OK.
3. Running as a job with SQL Agent: the package succeeds, but no action was taken on the files; the files in "ShareName" are still there, so no data was pumped into SQL Server.
Now, the difference is that the SQL Agent job runs using a domain account proxy. I'm not sure how that would affect things, though -- I have the tasks in the package set to fail the package if they fail, and they are not failing, so the drives are being mapped OK.
The computer with the share is non-domain, but that shouldn't matter -- I am specifying the local username and password in the batch file as you can see, and it works from the workstation in BIDS on a separate machine, and on the server too, as long as I don't run it as a job. The batch file sits on both the server and the local workstation at the same local path.
Any idea why the files aren't actioned when run as a job?
I have several DTS packages saved 'locally' to the SQL Server. I want to duplicate a package so that I can make some changes and then replace the original. I certainly don't want to rebuild the entire package from scratch. So, I open the original package, go to the 'Package' menu, choose 'Save As', give it a new name, and press OK. No errors, all appears well; the title bar even shows the new name of the package. But when I close the package and go to the 'local' package list, the new package name doesn't appear in the list. Refresh, exit SEM, reboot -- it doesn't show up. I even looked in the MSDB table where packages are supposed to be stored (at least the name / package ID / etc.), and it doesn't show there either. Tried from several client machines.
OS: Windows 2000 Server (advanced) SP2 SQL: SQL 2000 Server (no SP's)
I have many jobs on SQL 2005 and all work but one. This one writes to an Access DB on the same server as SQL. The package works fine, but when executed in the context of the SQL Agent job, it fails.
Jobs that write to a text file work fine. The Access DB has no password required. By the way, the same job worked fine in SQL 2000.
I have an SSIS package which just doesn't seem to run when executed as a SQL job, and I keep getting this error:
"The command line parameters are invalid. The step failed."
I read some comments in forums suggesting I verify the command line for the SQL job, since there is a known bug in the command line for SQL jobs.
But that didn't seem to resolve it, and the reason could be one of the variable values that I am trying to set.
In this package, one of the variables that I am trying to set is the connection string, and my command line looks like this:
When I try to run this from the command line, I get the error:
Argument " package.variables[User::MetaDataConnectionString].Value;Data Source=[SVRNAME];Initial Catalog=[DBNAME];Integrated Security=True;"" for option "set" is not valid.
I think the issue is in the /SET parameter, where dtexec seems to be interpreting the semicolons in the connection string as part of its own syntax (which I tried to escape by putting them in quotes, but it seems to strip them off).
Has anyone else encountered this issue? Is there any other escape character that I should be using?
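For illustration, a hedged sketch of the form Books Online suggests for /SET when the value itself contains semicolons -- quote the value (the file, server, and database names here are placeholders):

    dtexec /FILE MyPackage.dtsx /SET \Package.Variables[User::MetaDataConnectionString].Properties[Value];"Data Source=SVRNAME;Initial Catalog=DBNAME;Integrated Security=True"

When the same thing runs inside a SQL Agent job step, the embedded quotes themselves may need escaping with backslashes (\"), which could explain them being stripped.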
I'm stuck with this and I have no idea how to solve it. I'm trying to migrate a DTS 2000 package from BIDS and I get this message:
This wizard will close because it encountered the following error:
Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index (mscorlib)
I go to Migrate DTS 2000 Package and select my current SQL 2000 production server (it has almost 600 DTS packages, although I don't think that is a problem at all).
The wizard recognizes my server without problems, and then I pick a folder to save them to, but on the next step the aforementioned message appears.
I have an SSIS package made up of SQL tasks and data flows. The data flows connect to an Oracle database using the Native OLE DB\Oracle Provider for OLE DB (10g). This is the first package dealing with Oracle that runs on the server. I can execute the package manually by right-clicking it and choosing 'Run Package' while logged in remotely to the server, but it gets hung up and does nothing if I run it as a job; I always have to stop the job. I can disable everything but the data flows in the package, and the job completes and runs fine.
I have a simple parent package that uses an Execute Package task to call a simple child package. The child package has a Data Flow task that I disabled. When running the child package by itself, the Data Flow task is bypassed. When running the package via the parent, the Data Flow task executes.
The second issue: when you disable the package configurations in the child, the parent doesn't recognize that they're disabled and tries to load the configurations anyway.
It's almost like there are some settings in a child that get ignored by the parent??
I had just installed SQL 2005 Developer Edition on my laptop and got an error message when I tried to create a package using the BI IDE. I received the same error using the VS2005 IDE, but the project was created regardless, without any packages. When I tried to create a new package in the project, I received the same error again, this time with an option to view the error details.
Following is the text of the error details:
TITLE: Microsoft Visual Studio ------------------------------
I am trying to execute an SSIS package from an MS Access 2003 database; it imports a table from the Access database into a target table in SQL 2005. I saved the package in SQL 2005 and tested it out. If I run it from the Management Studio console with Run -> Execute ... everything works just fine. However, if I try to run it using the command "Exec master.dbo.xp_cmdshell 'DTExec /SER DATAFORCE /DTS SQL2005TestPackage /CHECKPOINTING OFF /REPORTING V'", the execution always fails when the Access database is open (shared mode). The connection manager looks like this: "Data Source=E:\Test.mdb;Provider=Microsoft.Jet.OLEDB.4.0;Persist Security Info=False;Jet OLEDB:Global Bulk Transactions=1". The error is listed below:
Code: 0xC0202009 Source: NewPackage Connection manager "SourceConnectionOLEDB" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not use ''; file already in use.".
Has someone managed to successfully pass a variable from a parent package to a child package? I've tried a zillion permutations and I can't get it to work. The strange thing is that I was able to do this successfully with pre-RTM builds. Basically, what I am trying to do is:
The parent package has a variable, e.g. ExecutionID, which I set using a script to System::ExecutionInstanceGUID. I verified that the variable is set correctly by dumping it to a SQL Server table. I created a child package variable with the same name. In the child package, I've created a parent package configuration that points to the ExecutionID variable. I am trying to read the variable in a Derived Column task in which I have a column linked to @ExecutionID. This doesn't work. Step-by-step instructions from someone who managed to conquer this will be greatly appreciated.
Oh, I also didn't have any luck hitting a breakpoint in a script task inside a child package, with either in- or out-of-process execution.
I have a package with a custom log provider, which runs in BIDS. However, when I deploy the package onto SQL Server and run it on the deployed machine, it fails:
"failed to decrypt protected XML node DTS:Password...key not valid for use in specified state..."
Now this is definitely to do with the custom log: if I take it out and redeploy, I can run it on the deployed server and also run it within a job. I have entered the custom log provider library (plus other required DLLs) in the GAC on the deployed machine, but I'm clearly missing something.
This problem is a bit weird but I'm just wondering if anybody else experienced this.
I have a package with File System tasks (copying .dtsx files, actually). Basically, the package copies other packages to a pre-defined destination. Thing is, it only works if one of the packages it is configured to copy has some sort of sensitive data (e.g., a connection string with a password); otherwise it reports a success message on execution but doesn't actually do anything. I've checked ForcedExecutionResult, and it is set to None, for that matter.
Just wondering if anybody else experienced this problem and of course if there's a way to solve it.
I am having a problem importing data into SQL 7 from any type of source. I go through the whole import process with no problem, but when I click the Finish button to start the import, nothing at all happens. Enterprise Manager and the DTS wizard just hang, and I must use Ctrl+Alt+Delete to end the program. Can anyone give me any suggestions as to what might be happening? Big thanks in advance; I've been working on this for days.
I have a VoIP phone system using SQL on the back end. I am trying to get a trigger to fire and email me when a certain number has been dialed.
CREATE TRIGGER trg_Emergency_Calls ON dbo.CallDetailRecord FOR INSERT
AS
IF @@ROWCOUNT = 0 RETURN -- no rows affected, exit
IF (SELECT finalcalledpartynumber FROM inserted) = '95593684'
BEGIN
    --RAISERROR ('Call Stored Procedure Here',16,10)
    EXEC WEB_SRVR03.master.dbo.sp_SMTPMail @body = 'This is a test Email'
END
RETURN
GO
The problem is that the actual email SP is on another server and has to be that way. The trigger runs each time, but if the IF statement is true, the trigger hangs and never completes. Watching the other server (WEB_SRVR03), there is never a request to execute sp_SMTPMail. I have been trying to troubleshoot this with Profiler, but I never see any locks or anything that would give me a problem. Also, the insert statement that caused the trigger to fire never finishes, so the record isn't written to the DB. If anyone has any suggestions I would appreciate it. Thanks
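A hedged sketch of a common decoupling pattern that may help frame this (table and column names are hypothetical, not from the post): the trigger runs inside the INSERT's transaction, so calling a linked server from it drags the remote call into that transaction. Writing to a local queue table instead, and letting a scheduled job make the remote sp_SMTPMail call, keeps the trigger entirely local:

    -- Local queue table the trigger writes to (hypothetical names)
    CREATE TABLE dbo.EmergencyCallQueue
    (
        id            int IDENTITY(1,1) PRIMARY KEY,
        dialed_number varchar(32) NOT NULL,
        queued_at     datetime NOT NULL DEFAULT GETDATE(),
        processed     bit NOT NULL DEFAULT (0)
    )
    GO
    -- Inside the trigger, the remote EXEC would be replaced with a local,
    -- multi-row-safe insert:
    --   INSERT dbo.EmergencyCallQueue (dialed_number)
    --   SELECT finalcalledpartynumber FROM inserted
    --   WHERE finalcalledpartynumber = '95593684'
    -- A SQL Agent job then polls the queue, calls
    -- WEB_SRVR03.master.dbo.sp_SMTPMail for unprocessed rows outside the
    -- insert's transaction, and sets processed = 1.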
I have a big problem and I'm not able to find any hint on the net.
I have a Windows 2000 PC, VS2005, IIS 5, and SQL Server 2005 (Developer Edition).
I created an SSIS package (a query against the DB whose result is loaded into an Excel file) that works fine.
I imported the .dtsx file into my "Stored Packages".
I would like to load and run the package programmatically in a remote scenario using web services.
I created a solution with a web service and a web page that invokes the web service.
When my code executes: Microsoft.SqlServer.Dts.Runtime.Application.LoadFromDtsServer(packagePath, ".", Nothing)
I get the error: Microsoft.SqlServer.Dts.Runtime.DtsRuntimeException: The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails.
The error message doesn't help much, and there is nothing on the web to give me any advice...
I have a stored procedure which ideally should run when a customer logs in. The procedure checks the available stock and creates a temp table of the information, which allows many other queries in the site to run a lot faster (due to no joins). The query has taken as much as 30 seconds to run at login (lots of records and a half-dozen joins), causing a timeout for the web application.
I want the procedure to run as it is, but I don't want the login method to be dependent on the process. I tried cmd.BeginExecuteNonQuery() in .NET (cmd = SqlCommand), which doesn't do what I want; it just lets me run heaps of queries at the same time.
Can anyone help me get this procedure to run without holding up the web application? I'm not sure whether I need to do this in .NET, or whether I can get .NET to run a batch file or something, but someone must have had a similar problem. Please help.
I have a scheduled job that runs daily at 4 AM. The job imports data from the client side from text files and puts the data in our SQL table. It takes around an hour. The problem is that it doesn't stop after the import process completes. After an hour I can see the data imported into my SQL table, so that's fine, but the job keeps running. I tried observing that job's SPID in Activity Monitor in SQL 2005, but after an hour I can't even see that SPID, yet the job still runs. It's weird. When I then stop the job manually, it stops saying the job completed successfully. That step is the last step, and it uses Windows cmd. My understanding is that the job step doesn't realize it has finished. What should I do here? Any ideas are appreciated. We run the same job for other servers and it's fine there.
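A small diagnostic sketch (an addition here; the job name is a placeholder): asking Agent directly what state it thinks the job is in can confirm whether the cmd step is still "executing" after the import finishes:

    -- current_execution_status in the output: 1 = executing, 4 = idle
    EXEC msdb.dbo.sp_help_job @job_name = N'DailyClientImport'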
I have a database A that includes five tables and more than 1,500,000 rows, and there is a replica database of A. At first there is no data in either DB. When I finish inserting data into A, the replica DB still seems to be working; the log file size keeps changing. How can I know whether replication has finished or not? How long will it take to replicate 1,500,000 rows of data?
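One way to watch this, assuming transactional replication (an addition here, not from the post): on the publisher, sp_replcounters reports per published database how many replicated transactions are still waiting to move to the distributor, plus throughput and latency; when the pending count reaches zero and stays there, the publisher side has drained:

    -- Run on the publisher
    EXEC sp_replcounters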
My server has a DTS package whose job is failing, saying: "Error string: The specified DTS Package ('Name = 'ERP_Mater_Encript_Data_Move'; ID.VersionID = {[not specified]}.{T0I37w32-AB51-1SA4-9495-AcE6SSS4a321}') does not exist"
I have some code that I inherited that I'm having an issue with. It includes a class for database functionality. At one point a call to a function, snippet below, results in a SQL error ("can't insert NULL into column MyColumn"). What is amazing and frustrating me is that it just blows right by the error! I thought it would raise the alarm bells: "hey! I got a SQL error." I only see the error if I step through the code and look at the exception that's caught. I tried removing the catch statement entirely, thinking that would at least cause an unhandled exception error, but no go. How do I raise a big red flag to the user when SQL errors happen? And how could it not be doing that automatically? This is for an intranet site, so I really don't care if they see ugly errors or not.

    try
    {
        sqlcommand = new SqlCommand();
        sqlcommand.Connection = DBInterface.DBConnection;
        sqlcommand.CommandText = strSQL;
        sqlcommand.CommandType = CommandType.Text;
        nRowsAffected = sqlcommand.ExecuteNonQuery();
    }
    catch (Exception ex)
    {
        // The SqlException from ExecuteNonQuery is swallowed here and -1 is
        // returned, so no error ever surfaces to the caller or the page.
        return -1;
    }
"SSIS 2012 Catalog doesn't have option to give read access to SSIS Catalog to view package run reports" ... Any luck allowing power developers / operators access to READ the SQL 2012 SSIS Execution Reports without granting them SSIS_Admin or Sysadmin?
According to this link posted back in 2011 (with Microsoft's feedback in Nov 2011: "We're closing this issue as 'Won't Fix.' At this point the bug does not meet our bar for resolving prior to SQL Server 2012 RTM. As we approach the SQL Server 2012 release the bar for making code changes gets progressively higher.") URL.... Regarding permissions to the SSIS Catalog, here are the findings. We can give access in three ways:
1. READ Access – We can provide a user db_datareader access. With this the user can see the objects within the SSIS catalog database, but cannot see the reports.
2. SSIS_ADMIN – Add the user to this database role in SSISDB. With this the user can view the reports, but it also gives them privileges to modify catalog information, which is not expected. We can add it using the script below:
EXEC sp_addrolemember 'ssis_admin', 'REDMOND\PAIntelAnalyst'
3. SYSADMIN – Add the user to this server role. This will make the user an admin on the SQL Server, which is not intended.
Is there any method available that gives read-only access to see SSIS catalog package execution reports, without granting modify access to the catalog?
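A hedged partial workaround (an addition here; the user name is the placeholder from above and the project ID is hypothetical): SSISDB exposes catalog.grant_permission, which can grant READ on an individual project. The built-in execution reports are driven by catalog views that are row-filtered by these permissions, so a user with READ on a project can see that project's executions without being in ssis_admin; whether it lights up every report is not guaranteed.

    USE SSISDB
    GO
    DECLARE @principal_id int =
        (SELECT principal_id FROM sys.database_principals
         WHERE name = N'REDMOND\PAIntelAnalyst');  -- placeholder user

    EXEC catalog.grant_permission
         @object_type     = 2,             -- 2 = project
         @object_id       = 1,             -- project_id from catalog.projects
         @principal_id    = @principal_id,
         @permission_type = 1;             -- 1 = READ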
Here's my dilemma: I want to run a stored procedure that starts another stored procedure running, but does not wait for that procedure to complete execution.
The first stored procedure should return immediately, leaving the other procedure to complete in the background. Is there any way to do this?
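A hedged sketch of one common approach (the job and procedure names are placeholders): wrap the long-running procedure in a SQL Agent job and start it with sp_start_job, which returns as soon as the job is queued rather than waiting for it to finish:

    -- Returns immediately; the Agent job runs dbo.LongRunningProc
    -- asynchronously in the background.
    EXEC msdb.dbo.sp_start_job @job_name = N'Run LongRunningProc';

Service Broker activation is the heavier-weight alternative when the call needs parameters or transactional queuing.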
We recently upgraded our SQL version from SQL 2008 R2 to SQL 2014. As such, the compatibility level changed to SQL 2014 (120).
We have several queries that used to run fine that now take forever to bring back results. There are no errors (which surprised me); they just take way too long now. Plus, they seem to be causing high I/O and CPU.
If I change the compatibility level back to SQL 2008, these queries run fine.
Query with the SQL 2008 compatibility level: finishes in 2 minutes. Query with the SQL 2014 compatibility level: finishes in 3 hours 22 minutes.
Same exact query, same server; the only thing changed was the compatibility level.
What do I look for in the queries that could be causing this? (They look fine, but obviously I'm missing something here.)
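A hedged pointer (an addition here): compatibility level 120 also switches on SQL 2014's new cardinality estimator, which is a common cause of exactly this kind of regression. One way to test a single query under the legacy estimator while leaving the database at 120 is trace flag 9481 (the query below is a placeholder):

    -- Runs just this statement under the pre-2014 cardinality estimator
    SELECT *
    FROM dbo.MyLargeTable
    OPTION (QUERYTRACEON 9481);

Note that QUERYTRACEON requires sysadmin, so for application logins it is usually applied through a plan guide. Comparing estimated versus actual row counts in the two plans should show where the new estimator goes wrong.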