Execute SQL Tasks Failing For Duplicate Syntax Between DEV And Production DB
Mar 28, 2007
Newbie here...be patient with me!
I added tasks to refresh two tables (delete from, insert into, update) to an SSIS project. I have them running from the Windows XP scheduler. The issue:
In dev, the tasks integrate and execute successfully from the scheduler.
In prod, I can right-click and execute each of the three tasks without any problem, but the same tasks cause my project to fail when executed from the scheduler.
Questions:
Any ideas about what I am failing to see?
How do I get meaningful log messages from the tasks that are failing?
Thanks for your ideas...
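One way to get meaningful messages out of a scheduled run is to enable the package's SSIS log provider for SQL Server (SSIS menu > Logging), select the OnError and OnTaskFailed events, and point it at a database the scheduled run can reach. After a failure you can then query the log table the provider creates; a minimal sketch, assuming the default dbo.sysdtslog90 table:

SELECT source, starttime, message
FROM dbo.sysdtslog90
WHERE event IN ('OnError', 'OnTaskFailed')
ORDER BY starttime DESC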
Installed Edition: IDE Standard
SQL Server Analysis Services
Microsoft SQL Server Analysis Services Designer
Version 9.00.1399.00
SQL Server Integration Services
Microsoft SQL Server Integration Services Designer
Version 9.00.1399.00
SQL Server Reporting Services
Microsoft SQL Server Reporting Services Designers
Version 9.00.1399.00
Microsoft Visual Studio 2005
Version 8.0.50727.42 (RTM.050727-4200)
Microsoft .NET Framework
Version 2.0.50727
In SSIS version 9.00.2047.00 I have built a few packages using Script tasks. These packages work well in development.
When I try to run them using DTUtil, the system displays the following message in the Package Execution Progress window: 'Error: The task "reading registry" cannot run on this edition of Integration Services. It requires a higher level edition.'
Emptying the Script task of its variables and script doesn't make a difference. I did not find an explanation in BOL. Concerning Script tasks, I only found:
'The Script task uses VSA as its engine for writing and running scripts. To run a script, you must have VSA installed on the computers where the package runs.' But since I even get these messages when I try to run on the machine where I built the package, I guess this can't be the problem.
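For what it's worth, this error is often reported when the executing machine has only the client tools or workstation components installed, rather than a licensed full Integration Services installation, so the edition check fails even though the package was built there. A quick sanity check of what the server itself reports:

SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion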
We have a number of SQL 6.5 servers, all of which run many scheduled tasks. However, on one of them, all tasks which use CmdExec to run batch files fail after 2-3 seconds. The history returns 'No message', and the error logs just indicate that the tasks have failed.
Identical tasks run on servers with what appear to be identical configurations and setups.
Hi, I have a strange problem with SSIS packages. (Brief description: the packages select some data from DBs, write it to a CSV file, and the CSV file is then copied and renamed into a folder named after the date.) I have 5 packages scheduled to run. These jobs run perfectly when test-scheduled during the day (so it's not a user permissions problem). However, it seems the first package to run at night will fail. The reason I say 'first' is the following: I had package A scheduled at 11:20 PM and package B at 11:30 PM; package B always succeeded but package A always failed. I would test A during the day and it would run fine (the job would run successfully, as well as just executing the package manually).
Then I changed B's time to 11:50 PM, and it succeeds and A fails, without changing the packages themselves.
This rules out the possibility of a DB backup causing the problem (a package that always succeeded at 11:30 now fails at the same time).
I was thinking that maybe the failure happened because the date folder hadn't been created yet when the first package ran, but when I test-ran the job this morning it succeeded... and today's folder didn't exist either!
I have a strange problem with a scheduled task failing with the following:
"Unable to send completion notification email to operator with email name '' for task 2780, 'Scheduled Update'"
This job is the same across several servers, and it runs fine on the other servers. This is a bit frustrating... I cannot seem to find the difference that is causing the problem.
The funny thing is that I am not using SQLMail or anything to notify anyone, regardless of whether the job succeeds or fails.
Hi, I have a table with queries. I need to execute those queries and pass the results into a variable, then use that variable/result to execute other queries to make business decisions.
EXAMPLE: TASK #1: Table A has the queries below:
select count(*) from employee
select count(*) from managers
I want to execute those queries and store the results in @counts. How do I execute all the queries in Table A and pass the result to a variable?
TASK #2: Then I have another task (maybe a SQL task) which uses the value of @counts to make a decision: if @counts > 1 then pass; if @counts < 1 then fail.
How can I do that?
I am still new to SSIS and not very familiar with variables. Any advice would be appreciated.
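One way to keep this inside a single Execute SQL task is to do the whole loop in T-SQL. A minimal sketch, assuming a table dbo.TableA with an nvarchar column QueryText whose rows are single-value COUNT queries (all object names here are hypothetical):

DECLARE @sql nvarchar(4000), @counts int
DECLARE @results TABLE (cnt int)
DECLARE qcur CURSOR FOR SELECT QueryText FROM dbo.TableA
OPEN qcur
FETCH NEXT FROM qcur INTO @sql
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO @results EXEC (@sql)   -- capture each query's scalar result
    FETCH NEXT FROM qcur INTO @sql
END
CLOSE qcur
DEALLOCATE qcur
SELECT @counts = SUM(cnt) FROM @results
SELECT CASE WHEN @counts > 1 THEN 'pass' ELSE 'fail' END AS outcome

The final SELECT can be captured into an SSIS variable with ResultSet = Single row, and a precedence constraint expression can then branch on that variable for the pass/fail decision.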
I have a simple Error row redirection (from an OLEDB Command) to redirect all rows in error to a Transform script and thereafter to a Flat file destination. This is via the red arrow (DF path) output from the OLEDB Command.
I don't understand why this leg executes even when there are no rows in error. Zero-byte flat files get written out when there are no errors.
How come? Why would a path with a red arrow execute even when there are no errors?
Part 2: When I introduce some errors in the data to cause an integrity violation, and I hook up an OnError event handler, it is never raised, even though the error rows are successfully redirected and written out to the flat file destination.
So what constitutes an error for a Data Flow task? Does an error raised by SQL Server for an integrity violation bubble up as an error in the SSIS package?
Maybe I'm missing something, but I can't find how to run multiple tasks in sequence while in Visual Studio debug mode. In DTS design mode I grew accustomed to right-clicking tasks one at a time, but in SSIS the additional step of having to exit debugging mode after every task gets old after a while.
There must be a way to start execution at a certain task and have the package continue all the way to some other specified task. It would also be nice to have every task in a Group execute in sequence and stop (even if connections continue beyond the group). I could even settle for repeatedly clicking the Continue button in Debug mode, but it's always grayed out when the current task is finished!
I have two servers, A for dev and B for production.
On server A I developed a project containing an SSIS package using SQL Server Business Intelligence Development Studio. The package runs fine from BIDS, and also when I save it to SQL Server itself and run it as a scheduled job using SQL Server Agent.
All ready to roll out to Server B I thought, so I then saved the .dtsx file to a shared network drive.
On Server B, I created an empty project with the same name as it had on Server A. I then imported the .dtsx file into the project using Project > Add Existing Item.
The package appeared to import ok but I now cannot execute any of the data flow tasks in isolation. If I right click on them, there is no option to 'execute task' as there should be, it is not greyed out, it's not there at all.
Also, if I attempt to debug the whole package I get a message saying 'This document is opened by another project'.
Can anyone help with this as my deployment to live isn't going very well to say the least!
Both server A + B are 32-bit 2005 std edition SP1 on W2003 Server std edition SP1.
In my ETL project, I need to extract raw data using WinZip, and the DOS command I use is: c:\program files\Winzip\winzip32.exe -min -e c:\rawdata\test.zip c:\rawdata, where -min is for minimize, -e is for extract, c:\rawdata\test.zip is the source file, and c:\rawdata is the destination. It works fine as a DOS command.
I configured the Execute Process task with the following parameters:
RequireFullFileName : TRUE
Executable : c:\program files\Winzip\winzip32.exe
Arguments : -min -e
WorkingDirectory : c:\program files\Winzip
StandardInputVariable : User::gsRawFile
StandardOutputVariable : User::gsDestDir
where User::gsRawFile = c:\rawdata\test.zip and User::gsDestDir = c:\rawdata.
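WinZip doesn't read its file names from standard input, so the source and destination most likely belong in Arguments rather than StandardInputVariable. A sketch, assuming a property expression on the task's Arguments property built from the variables in the post (SSIS expression syntax):

"-min -e " + @[User::gsRawFile] + " " + @[User::gsDestDir]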
I have recently created several DTS Packages and scheduled them to run nightly. The Packages complete succesfully when executed from the DTS Designer. However, when they are executed by a scheduled job, the following error results.... DTSRun: Loading... DTSRun: Executing... Error: -2147220499 (800403ED); Provider Error: 0 (0) Error string: No Steps have been defined for the transformation Package. Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts.hlp Help context: 700. Process Exit Code 1. The step failed.
These packages exist and are running via scheduled jobs on another server. Does anyone know what is causing this error? Thanks!
ALTER PROCEDURE [Audit].[spBatchPackage_OnExtract]
@Execution_Idx int
,@Batch_Idx int
,@Source nvarchar(1024)
,@SourceId uniqueidentifier
,@Destination nvarchar(1024)
,@StartTime DATETIME
,@FileSpecification nvarchar(2)
When I execute the SQL Task, I get this error:
[Execute SQL Task] Error: Executing the query "[Audit].[spBatchPackage_OnExtract] ?, ?, ?, ?, ?, ?, ?" failed with the following error: "Invalid character value for cast specification". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Any help on clearing up this error would be appreciated!
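"Invalid character value for cast specification" often points at the Parameter Mapping tab rather than the procedure itself; with OLE DB, the uniqueidentifier and datetime parameters are the usual suspects (for example a GUID mapped as a plain string variable). One way to isolate it is to call the procedure directly in SSMS with correctly typed variables; if this succeeds, the problem is in the mapping, not the proc. A sketch with hypothetical test values:

DECLARE @g uniqueidentifier, @t datetime
SET @g = NEWID()
SET @t = GETDATE()
EXEC [Audit].[spBatchPackage_OnExtract]
     @Execution_Idx = 1            -- hypothetical test values throughout
    ,@Batch_Idx = 1
    ,@Source = N'source'
    ,@SourceId = @g
    ,@Destination = N'destination'
    ,@StartTime = @t
    ,@FileSpecification = N'*'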
Hello all. Before I go any further, I have followed http://msdn.microsoft.com/en-us/library/ms188304.aspx as best possible. I am attempting to send mail through a DML trigger. We'll call the database 'DB', and it is owned by a domain account named 'DOMAIN\Acct'. The trigger simply blocks any CUD operations on a table which we'll call 'Tbl', and sends an email. Hence, it looks something like...
CREATE TRIGGER [dbo].[TR_Tbl_BlockChanges] ON [dbo].[Tbl]
WITH EXECUTE AS OWNER
INSTEAD OF INSERT, DELETE, UPDATE
AS
EXEC [msdb].[dbo].[sp_send_dbmail]
    @profile_name = 'AcctMail',
    @recipients = 'foo@bar.com',
    @subject = N'CUD operations not allowed on Tbl',
    @body = N'Blocked'
AcctMail is a valid profile and operates correctly. I have created the DOMAIN\Acct user in msdb, given it the AUTHENTICATE permission, and added it to the DatabaseMailUserRole. When the trigger fires, according to the article, the security context should switch to dbo (DOMAIN\Acct) and the call to the msdb sproc should then succeed. Instead I get the usual:
Msg 229, Level 14, State 5, Procedure sp_send_dbmail, Line 1
The EXECUTE permission was denied on the object 'sp_send_dbmail', database 'msdb', schema 'dbo'.
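One detail the article's recipe depends on is that the impersonated context must be trusted outside the source database: if DB is not marked TRUSTWORTHY (and the trigger is not signed with a certificate), the EXECUTE AS OWNER context carries no weight in msdb and produces exactly this 229 error. A hedged thing to try, noting the security tradeoff (certificate signing is the stricter alternative):

ALTER DATABASE DB SET TRUSTWORTHY ON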
I have a Package and a DataFlow Task. The Package has TransactionOption=Required. The DataFlow Task has an OLE DB Source and an OLE DB Destination. The DataFlow Task has TransactionOption=Supported. The package executes on a Workstation and DataSources for the OLE DB Source and the OLE DB Destination are on a Server.
After the package was launched, the following error messages appeared:
[OLE DB Destination [43]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DWH_Destination" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
[DTS.Pipeline] Error: component "OLE DB Destination" (43) failed the pre-execute phase and returned error code 0xC020801C.
[Connection manager "DWH_Destination"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 "The transaction manager has disabled its support for remote/network transactions.".
[Connection manager "DWH_Destination"] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8004D024.
If I set TransactionOption=NotSupported on the DataFlow task, then the package executes successfully.
I have some "Execute T-SQL Statement Tasks" in a package. I would like to run this same package on another SQL Server without having to change it on the other server. Since the server name can be given when setting up the connection, I think if I leave the server name out then the package could run on any server? Is my assumption correct?
Variable 1 is created as Int32 and variable 2 is created as DateTime.
When I execute the SQL task, I get this error:
[Execute SQL Task] Error: Executing the query "Exec mysp 'table1',OUTPUT,OUTPUT" failed with the following error: "Error converting data type nvarchar to int.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
What am I missing? I tried changing the data types and adding the input value as a variable in the mapping. Nothing seems to work. Any ideas, please?
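With an OLE DB connection manager, the Execute SQL task usually wants parameter markers together with the OUTPUT keyword rather than OUTPUT alone. A sketch, assuming mysp's last two parameters are the outputs:

Exec mysp 'table1', ? OUTPUT, ? OUTPUT

with the Parameter Mapping tab listing parameter name 0 mapped to variable 1 and parameter name 1 mapped to variable 2, both with Direction = Output (LONG and DATE are the usual OLE DB type choices for Int32 and DateTime).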
In SQL Server, using xp_cmdshell we can execute any command or executable file that can be run at a command prompt. My problem is that I am trying to execute OSQL from MSSQL (Query Analyzer) using xp_cmdshell, but it gives an error saying "'osql' is not recognized as an internal or external command, operable program or batch file." This error occurs when the executable file cannot be found, but I am able to run the same thing from the command prompt, so I feel this problem is somehow related to the Windows path settings. If someone can solve this problem or suggest how to set the Windows path, it would be helpful. Waiting for a reply.
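xp_cmdshell runs under the SQL Server service account, whose PATH environment can differ from an interactive session's, so the usual cure is to call osql by its full path. A sketch, assuming a default SQL Server 2000 installation directory (adjust to wherever osql.exe actually lives):

EXEC master..xp_cmdshell '"C:\Program Files\Microsoft SQL Server\80\Tools\Binn\osql.exe" -E -Q "SELECT @@SERVERNAME"'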
I am trying to programmatically execute a package that contains an Execute SQL Task component bound to a variable for its "SqlStatementSource" property (via an expression). The variable is of type String and contains a simple value of "SELECT 1". The Execute SQL Task contains an expression that sets the SqlStatementSource property to the value of this variable.
The package runs fine when I execute it via dtexec or BIDS, but when I attempt to run it via the object model, I receive the following error message:
The result of the expression ""@[User::Sql]"" on property "SqlStatementSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
I did a search on this forum and noticed quite a few threads about this same issue, but no explanation/solution. We have quite a few packages that have dynamically constructed SQL statements for Execute SQL Tasks, and they are all failing to run via the object model. Is there something that I am missing?
I'm running SQL Server 7.0 SP3 and having trouble with DTS.
I have an Execute SQL Task that runs several stored procedures. When one of the stored procedures fails, the Execute SQL Task just terminates without failing.
I found a Knowledge Base article (Q238523) dealing with this situation, but it was supposedly fixed in SP2, and I have SP3! The other suggested workaround, issuing SET NOCOUNT ON, does not always work.
Has anyone else run into this, or have any other suggestions?
I hate the thought of spending days on another workaround in order to get basic DTS functionality to work as it should!
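One interim workaround is to make the task's SQL raise an unmistakable error itself: check each procedure's return status and RAISERROR with severity 16 so the Execute SQL task sees the step fail. A sketch with a hypothetical procedure name:

SET NOCOUNT ON
DECLARE @rc int
EXEC @rc = dbo.usp_MyProc   -- hypothetical procedure
IF @rc <> 0 OR @@ERROR <> 0
    RAISERROR ('usp_MyProc failed', 16, 1)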
I have a ForEach Loop using the Foreach File Enumerator. Within this loop I have a SQL task containing an INSERT statement. When I run the INSERT statement in the query builder, the transaction inserts data into a table as expected.
However, when actually running the process I am getting the error message:
Executing the query "INSERT INTO dbo.TEST_TABLE ..." failed with the following error: "Value does not fall within the expected range.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I currently have the ResultSet set to "None" and have defined the parameter I am using. Where the process seems to choke is on my file_Name variable, while I am trying to insert only part of the file name.
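"Value does not fall within the expected range" from an Execute SQL task often comes from the parameter naming rather than the data: with an OLE DB connection the markers must be question marks and the names on the Parameter Mapping tab must be ordinals (0, 1, ...). A minimal sketch with a hypothetical single-column target, computing the part of the file name into its own variable (via an expression) before the task runs:

INSERT INTO dbo.TEST_TABLE (FileNamePart) VALUES (?)

with the derived variable mapped as Input, VARCHAR, parameter name 0.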
I have an SSIS package which calls a command-line app. When run in BIDS, it executes normally: the command-line app is passed the arguments and does what it needs to do. When called as a SQL Agent job (by the agent, or by me), it fails when calling the app, giving an exit code of 2 (which is an exception trapped by a try-catch). The SQL Agent service is running under my user (it's a test environment). The argument passed (from the log) is valid, and when I run it against the app directly it produces the appropriate output. I can't for the life of me figure out what's going wrong. The app is passed an argument of a path and a password, and applies the password to the file using interop.
I have a flat file containing several SQL statements calling a stored procedure (execute usp_table 'param1', 'param2' out), and it has been imported into a temporary table. This table has a few columns (an identity column, a syntax column, and a status column). The status column is used to identify whether the execution succeeded. My plan is to read this table and execute the syntax row by row; the stored procedure returns a success or failure status through param2, and I have to read this return value to update the status column in the table. I tried an Execute SQL task and threw the return value of the stored procedure into a result set, but I didn't know how to use the result set to update the table. So what task should I use? Thanks in advance.
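If the parameter values can be split into their own columns rather than staying baked into the command text, a single Execute SQL task can run the whole loop in T-SQL. A sketch in which every object name is hypothetical:

DECLARE @id int, @param1 varchar(50), @status varchar(50)
DECLARE cmds CURSOR FOR SELECT IdCol, Param1Col FROM dbo.tmpCommands
OPEN cmds
FETCH NEXT FROM cmds INTO @id, @param1
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.usp_table @param1, @status OUTPUT   -- @status comes back as success/failure
    UPDATE dbo.tmpCommands SET StatusCol = @status WHERE IdCol = @id
    FETCH NEXT FROM cmds INTO @id, @param1
END
CLOSE cmds
DEALLOCATE cmds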
I'm looking for the correct syntax to pull back duplicate vendors based on 6 fields from two different tables. I want to actually see the duplicate vendor information (not just a count). I am able to pull this for one of the tables, something like below:
select *
from VendTable1 a
join ( select firstname, lastname
       from VendTable1
       group by firstname, lastname
       having count(*) > 1 ) b
  on a.firstname = b.firstname
 and a.lastname = b.lastname
I'm running into issues when trying to add the other table with the 4 other fields.
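One hedged way to extend the same pattern across both tables is to UNION them first and then apply the same group/join, with f1 through f6 standing in for the actual six fields:

select v.*
from ( select f1, f2, f3, f4, f5, f6 from VendTable1
       union all
       select f1, f2, f3, f4, f5, f6 from VendTable2 ) v
join ( select f1, f2, f3, f4, f5, f6
       from ( select f1, f2, f3, f4, f5, f6 from VendTable1
              union all
              select f1, f2, f3, f4, f5, f6 from VendTable2 ) u
       group by f1, f2, f3, f4, f5, f6
       having count(*) > 1 ) d
  on v.f1 = d.f1 and v.f2 = d.f2 and v.f3 = d.f3
 and v.f4 = d.f4 and v.f5 = d.f5 and v.f6 = d.f6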
As part of the logging process for data input, I want to update two fields in a logging table. The first is a datetime, derived from looking up the maximum value in another table (the table I've just imported), and the second is an integer - the number of rows captured in a variable during the task.
I can do this in two separate Execute SQL tasks as follows:
Task 1 syntax
DECLARE @maxDate datetime
SELECT @maxDate = MAX(dtLastChangedDate) FROM dbo.tblCancel_RAW

UPDATE dbo.tblLogging
SET PreviousFilterValue = CurrentFilterValue,
    CurrentFilterValue = ISNULL(CAST(@maxDate AS varchar(25)), CurrentFilterValue),
    DateSourceTableLastRead = GETDATE(),
    RowsReturned = -1
WHERE SourceTableName = 'cancel'
Task 2 syntax, with the variable User::rowsimported mapped to parameter 0:
UPDATE dbo.tblLogging SET RowsReturned= ? WHERE SourceTableName = 'cancel'
However I cannot make this work with a single SQL statement such as
DECLARE @maxDate datetime
SELECT @maxDate = MAX(dtLastChangedDate) FROM dbo.tblCancel_RAW

UPDATE dbo.tblLogging
SET PreviousFilterValue = CurrentFilterValue,
    CurrentFilterValue = ISNULL(CAST(@maxDate AS varchar(25)), CurrentFilterValue),
    DateSourceTableLastRead = GETDATE(),
    RowsReturned = ?
WHERE SourceTableName = 'cancel'
because no matter how I try to map the parameter (0, 1, 2, 3, 4, etc.), the task fails.
Is this behaviour by design, is it a bug, or is there something I've missed?
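It may well be a provider limitation rather than a bug: OLE DB has trouble binding ? parameters inside a multi-statement batch that begins with DECLARE. A workaround that keeps everything in one task is to fold the MAX into a subquery so the batch becomes a single statement; a sketch:

UPDATE dbo.tblLogging
SET PreviousFilterValue = CurrentFilterValue,
    CurrentFilterValue = ISNULL(CAST((SELECT MAX(dtLastChangedDate) FROM dbo.tblCancel_RAW) AS varchar(25)), CurrentFilterValue),
    DateSourceTableLastRead = GETDATE(),
    RowsReturned = ?
WHERE SourceTableName = 'cancel'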
relog syntax: relog [path to file] [-f SQL, to convert to SQL format] [-o "SQL:DSN!CounterLog", to write the output]
The name of the file I want the Windows relog utility to process comes from a variable, User::Variable, which is created by a preceding task.
In the "Execute Process Task" there are three parameters I have to use:
Executable : which I think needs to be relog.exe
Arguments : C:\Perflogs\User::Variable -f SQL -o "SQL:PerfCounters!PerfCounters"
WorkingDirectory : C:\Windows\System32
As you can see in the Arguments, it's not going to work but I can't figure out the syntax.
Has anyone got some experience with passing arguments to an Execute Process task based on variables?
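The usual trick is to build the whole Arguments string with a property expression on the Execute Process task (Expressions > Arguments) instead of typing the variable name into the property. A sketch using the names from the post; note the doubled backslashes and escaped quotes required by SSIS expression syntax, and the assumption that User::Variable holds just the file name:

"C:\\Perflogs\\" + @[User::Variable] + " -f SQL -o \"SQL:PerfCounters!PerfCounters\""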
Hi, I have an SSIS package which gives the following error when executed:
Error: 0xC002F210 at Create Linked Server, Execute SQL Task: Executing the query "exec (?)" failed with the following error: "Syntax error or access violation". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. Task failed: Create Linked Server
The package has a single Execute SQL task with the properties listed below :
General Properties
ResultSet : None
ConnectionType : OLE DB
Connection : connected to a local database (DB1)
SQLSourceType : Direct Input
SQLStatement : exec(?)
IsQueryStoredProcedure : False
BypassPrepare : False
Parameter Mapping Properties
Variable Name : User::AddLinkSql
Direction : Input
Data Type : VARCHAR
Parameter Name : 0
'AddLinkSql' is a package-scoped global variable of type String with the value:
Exec sp_AddLinkedServer 'Srv1', '', 'SQLOLEDB.1', @DataSrc='localhost', @catalog='DB1'
When I try to execute the SQL task, it fails with the above error. Also, the SQL statement cannot be parsed, and gives the error "The query failed to parse. Syntax or access violation".
I would like to add that this package was migrated from DTS, where it runs without any error, even though it gives the same parse error message.
I would appreciate it if anybody could help me out of this issue by suggesting where the problem is.
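One hedged workaround: the OLE DB provider generally refuses a ? placed inside exec() like this. Since the whole statement already lives in a variable, the Execute SQL task can run it directly with no parameter at all:

SQLSourceType : Variable
SourceVariable : User::AddLinkSql

and the row on the Parameter Mapping tab is then removed.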
I am using the Import/Export wizard to import data from an ODBC data source. This can only be done by using a query to specify the data to transfer.
When I try to create the tables for the query, I am getting the following error:
Msg 2714, Level 16, State 4, Line 12
There is already an object named 'UserID' in the database.
Msg 1750, Level 16, State 0, Line 12
Could not create constraint. See previous errors.
I have duplicated this error with the following script:
USE [testing]
IF OBJECT_ID ('[testing].[dbo].[users1]', 'U') IS NOT NULL
DROP TABLE [testing].[dbo].[users1]
CREATE TABLE [testing].[dbo].[users1] (
[UserID] bigint NOT NULL,
[Name] nvarchar(25) NULL,
CONSTRAINT [UserID] PRIMARY KEY (UserID)
)
IF OBJECT_ID ('[testing].[dbo].[users2]', 'U') IS NOT NULL
DROP TABLE [testing].[dbo].[users2]
CREATE TABLE [testing].[dbo].[users2] (
[UserID] bigint NOT NULL,
[Name] nvarchar(25) NULL,
CONSTRAINT [UserID] PRIMARY KEY (UserID)
)
IF OBJECT_ID ('[testing].[dbo].[users3]', 'U') IS NOT NULL
DROP TABLE [testing].[dbo].[users3]
CREATE TABLE [testing].[dbo].[users3] (
[UserID] bigint NOT NULL,
[Name] nvarchar(25) NULL,
CONSTRAINT [UserID] PRIMARY KEY (UserID)
)
I have searched on the "2714 duplicate error msg", but have found references to duplicate table names, rather than duplicate field or column names, within a database.
I think that the schema is only allowing a single primary key constraint named 'UserID'.
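That reading matches how SQL Server works: a constraint is a schema-scoped object, so the second CREATE TABLE collides with the constraint named [UserID] created by the first, even though the column name itself can repeat freely. The fix is simply to give each constraint a unique name; a sketch for one of the tables:

CREATE TABLE [testing].[dbo].[users2] (
[UserID] bigint NOT NULL,
[Name] nvarchar(25) NULL,
CONSTRAINT [PK_users2_UserID] PRIMARY KEY (UserID)
)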
I keep receiving the following error whenever I try to call this function to update my database.
The code was working before, all I added was an extra field to update.
Exception Details: System.Data.SqlClient.SqlException: Incorrect syntax near the keyword 'WHERE'
Public Sub MasterList_Update(sender As Object, e As DataListCommandEventArgs)
Dim strProjectName, txtProjectDescription, intProjectID, strProjectState As String
Dim intEstDuration, dtmCreationDate, strCreatedBy, strProjectLead, dtmEstCompletionDate As String
Dim myConnection As New SqlConnection(System.Configuration.ConfigurationSettings.AppSettings("connectionstring"))
Dim cmdSQL As New SqlCommand(strSQL, myConnection)
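Since the code worked before the extra field was added, the classic cause of "Incorrect syntax near the keyword 'WHERE'" is a missing comma in the UPDATE's SET list of the generated strSQL. A hypothetical T-SQL illustration (table and column names invented):

UPDATE Projects
SET ProjectName = @ProjectName,
    ProjectDescription = @ProjectDescription   -- dropping the comma before a newly added column makes the parser fail at WHERE
WHERE ProjectID = @ProjectID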