I have the failure path of many tasks pointing to a single Script Task that logs a failed entry into my database. In this Script Task, I need to be able to determine the task that failed, as I need to write a unique error message depending on the task that failed.
In my VB script, how would I determine which task was the one that failed?
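A minimal sketch of the logging side only, in case it helps frame the question. The ErrorLog table and its columns are hypothetical, and it assumes the name of the failing task has already been captured into an SSIS variable (for example from System::SourceName inside an OnError event handler):

-- Hypothetical logging statement the Script Task might issue; the two
-- variables stand in for values bound from SSIS package variables.
DECLARE @TaskName nvarchar(128), @ErrorMessage nvarchar(4000);
SET @TaskName = N'Load Customers';   -- example value only
SET @ErrorMessage = N'Load Customers failed; see package log for details.';

INSERT INTO dbo.ErrorLog (TaskName, ErrorMessage, LoggedAt)
VALUES (@TaskName, @ErrorMessage, GETDATE());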
Hello - I am trying to determine the last time a stored procedure was executed. Does anyone know how to do this? I'm trying to clean up some databases. Thanks!
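For what it's worth, on SQL Server 2005 and later the plan-cache DMVs expose a last execution time, but only for plans still in cache since the last service restart, so treat it as an approximation. A minimal sketch:

-- Last execution time per cached object (2005+ only, cache-dependent).
SELECT st.dbid,
       st.objectid,
       MAX(qs.last_execution_time) AS last_executed
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.objectid IS NOT NULL          -- statements that belong to procs/functions
GROUP BY st.dbid, st.objectid
ORDER BY last_executed DESC;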
I have modified my workflow to take a conditional branch. The workflow terminates after the branched tasks finish, without continuing to the next task, no matter how I set the condition on the outgoing path. I found I have this problem when a task takes more than one input. Does SSIS have a setting for limiting the inputs to be taken?
-The first task in that package (an Exec SQL Task) retrieves a timestamp value from an external source
-That timestamp value is used in data-flows to pull data that has been changed since then
-At the end of the package the value in the database is updated (a sketch of these two bookend queries follows this list)
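A rough sketch of those two bookend queries, assuming a hypothetical ExtractWatermark table on the external source; in the package the local variables below would really be Execute SQL Task parameter mappings and result sets:

DECLARE @LastExtractTime datetime, @NewExtractTime datetime;
SET @NewExtractTime = GETDATE();

-- First task: read the high-water mark the data-flows filter on.
SELECT @LastExtractTime = LastExtractTime
FROM dbo.ExtractWatermark
WHERE PackageName = 'MyPackage';

-- Final task: advance it only after every data-flow has succeeded.
UPDATE dbo.ExtractWatermark
SET LastExtractTime = @NewExtractTime
WHERE PackageName = 'MyPackage';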
Now...if something fails then on the next execution the checkpoint file will kick in. But this means that the first task (which retrieves the timestamp) will not execute and therefore all the data-flows will be pulling the wrong data.
The only way I can think of to get around this problem is to specify that a task should execute regardless of the presence of a checkpoint file. Unfortunately it seems this cannot be done.
Another option might be to put the timestamp retrieval into an OnPreExecute event handler on the package.
Has anyone got any advice about how I should proceed? Short of a suggestion from elsewhere, I'm going to go with the tactic of using OnPreExecute to retrieve the timestamp.
I am using SQL Server 2005 Integration Services to create new values for my tables. One step I must do is execute a query, and the Script Task that receives its precedence constraint (it is set to Completion) must do different things depending on the query result.
My question is: how can I know the result of the precedence constraint?
I'm having a hard time getting XML data back from a stored procedure executed by an Execute SQL Task.
I'm passing in XML data as a parameter and getting the resulting XML data back as a parameter. The Execute SQL Task is using an ADO connection to do this job. The two parameters (in/out) are of type String and are mapped as strings.
When I execute the task, I get the following error message.
[Execute SQL Task] Error: Executing the query "dbo.PromissorPLEDataUpload" failed with the following error: "The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 2 ("@LogXML"): Data type 0xE7 has an invalid data length or metadata length.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I also tried mapping the parameter as XML type, but that didn't work either.
If anyone knows what's going on or how to fix this problem, please let me know. All I want to do is save the returned XML data from the parameter into a local package variable.
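In case it helps, here is a sketch of one workaround shape; it is an assumed outline, not the actual procedure. The idea is to declare the parameters as nvarchar(max) and convert to xml inside the body, so the ADO connection only ever binds plain strings:

CREATE PROCEDURE dbo.PromissorPLEDataUpload_Sketch   -- hypothetical variant of the real proc
    @InputXML nvarchar(max),
    @LogXML   nvarchar(max) OUTPUT
AS
BEGIN
    DECLARE @doc xml;
    SET @doc = CONVERT(xml, @InputXML);

    -- ... the real processing against @doc would go here ...

    SET @LogXML = CONVERT(nvarchar(max), @doc);
END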
I created a DTS package a while ago and placed it in a job to run once a day (it worked fine for 3 months). Two days ago I changed the sa password and now the job fails with the error (Login failed for user 'sa'.), but it runs fine from DTS!!!
1. My DTS was created with the domain account DomainSVCSQL2000 (sa rights and local admin)
2. The SVCSQL service uses DomainSVCSQL2000 to run
3. The SVCSQL agent uses DomainSVCSQL2000 to run
4. The DTS uses 'osql -E
Where should I look for the reference to sa?
Executed as user: MONTREALsvcsql2000. DTSRun: Loading... Error: -2147217843 (80040E4D); Provider Error: 18456 (4818) Error string: Login failed for user 'sa'. Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0. Process Exit Code 1. The step failed.
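One place to start looking (a sketch, not a definitive answer) is how the job step actually invokes the package; note that with DTS, SQL-authentication credentials can also be saved inside the package's own connection objects, which is another spot a stale 'sa' password can hide:

-- List the job steps and the DTSRun command they execute.
SELECT j.name AS job_name,
       s.step_id,
       s.step_name,
       s.subsystem,
       s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE j.name LIKE '%your job name%';   -- hypothetical filter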
Just wondering whether there is any indicator or system parameter that can tell whether stored procedure A is being executed from inside Query Analyzer or from inside the application itself, so that if it is being executed from Query Analyzer I can block it from being executed / from retrieving sensitive data.
What I want to do is block someone from executing the stored procedure in Query Analyzer and retrieving its sensitive results. Stored procedure A has been granted EXECUTE for the public user, but inside the application a user is shown an access-denied message if they have no rights to use the system, even if they know the public user name and password, because there is a second layer of user validation inside the application.
Inside Query Analyzer, however, there is no way to control execution of stored procedure A once a user knows the public user name and password.
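One commonly suggested approach, sketched below, is to check the client application name inside the procedure. It is only a speed bump, because the application name is supplied by the client connection string and can be spoofed, and the exact strings should be verified with SELECT APP_NAME() from each tool:

CREATE PROCEDURE dbo.ProcA_Guarded    -- hypothetical name for illustration
AS
BEGIN
    -- Refuse calls that do not come from the expected application.
    IF APP_NAME() NOT LIKE 'MyApplication%'   -- hypothetical application name
    BEGIN
        RAISERROR('Access denied.', 16, 1);
        RETURN;
    END

    -- ... the sensitive work would go here ...
    SELECT 1 AS placeholder_result;
END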
Looking forward to replies from the experts here. Thanks in advance.
Note: I hope my explanation clearly describes my current problem.
OK. I give up and need help. Hopefully it's something minor ...
I have a dataflow which returns email addresses to a recordset.
I pass this recordset into a ForEach Loop, configuring the enumerator as a Foreach ADO Enumerator. I also map the email address to a variable with index 0.
I then have an Execute SQL Task which receives this email address as a varchar variable (parameter 0), which I then use in my SQL command to limit the rows returned. I have commented out the WHERE clause and returned all rows regardless of email address to try to troubleshoot this problem. In either event, I then use a result set to store the query result, of type Object and with result name 0.
I then pass this result set into a script variable to start parsing the SQL rows returned, as type Object. (I assume this is the correct way to do this, from other prior posts ...)
The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or because the query doesn't return anything. However, I am certain the query works, as it executes just fine at the command prompt.
My intent is to email the query results to each email address, with the following type of data, by passing the parsed data from the script to a Send Mail Task. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource and define the MessageSourceType as a variable in the mail task.
part number leadtime
x 5
y 9
....
Does anyone have any idea what I might be doing wrong?
Hi, this code inserts the same record twice. I think it is due to the "SELECT SCOPE_IDENTITY()" in the SqlCommand. If I remove the SELECT part, the insert occurs only once. If I remove the line "comd.ExecuteNonQuery()", the insert also occurs only once. Is there something wrong in my code? Thanks, T.

Dim connection As SqlConnection
Dim comd As SqlCommand
Dim connectionstr, sql As String
Dim iden As Integer

connectionstr = ConfigurationManager.ConnectionStrings("econn").ConnectionString.ToString()
connection = New SqlConnection(connectionstr)
comd = New SqlCommand()
comd.Connection = connection
sql = "INSERT INTO table(field,...) VALUES (@fld,...); SELECT SCOPE_IDENTITY()"
comd.CommandText = sql
comd.Parameters.Add("@var1", SqlDbType.NVarChar, 10).Value = txtvnm.Text
...
connection.Open()
iden = Convert.ToInt32(comd.ExecuteScalar())
comd.ExecuteNonQuery()
connection.Close()
I know you can use SQL Profiler to see what SQL code actually executed when you run a sproc, but is there any way to get this information in ASP.NET? After executing a sproc, I'd like to send the SQL code that was sent to my Audit class. Is there any way to retrieve this in ASP.NET itself? Cheers!
I created a simple DTS package which executes a VB standard EXE that simply writes a string to an ASCII file opened as append. SQL Server, the EXE, and the ASCII file are all on the same NT box (mine). If I execute the DTS myself the process works with no problems. When I attempt to execute the DTS via a job, the job hangs with no apparent indication as to what may be the cause of the hang-up. The SQL Server Agent is up and running and set to run under the system account. I have applied SQL Server SP3; the same problem was occurring prior to applying SP3.
Does anybody have any idea? All suggestions are appreciated.
Is there any (easy!) way in which I can see the SQL that has been executed? I'm using stored procs that create and execute other stored procs (via sp_executesql). At the early stages there are often trivial errors in the created procs that cause rather general exception messages that do not give much of a clue as to where the error is. (It's tedious to find these in the debugger.) I was wondering if there's any trace-output type of option I can turn on to see what SQL has been presented for execution.
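One low-tech option, sketched here on the assumption that the generated statement sits in a variable before it is run: echo it (or log it to a table) just before the sp_executesql call, behind a hypothetical @Debug switch.

DECLARE @Debug bit, @sql nvarchar(4000);
SET @Debug = 1;                                     -- hypothetical switch
SET @sql = N'SELECT COUNT(*) FROM sys.objects';     -- stands in for the generated statement

IF @Debug = 1
    PRINT @sql;          -- or INSERT it into a logging table
EXEC sp_executesql @sql;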
Hello, I have a trigger and I want to know the query that raised it, or to retrieve the last query executed by the server. I think it's a hard question, but I know that someone can help me... Thanks
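A sketch of one option, with the caveat that DBCC INPUTBUFFER returns the last batch sent on the connection (which, at the moment a trigger fires, is normally the batch that caused it), not a single statement, and it needs appropriate permissions:

-- Inside the trigger body:
CREATE TABLE #inputbuffer
(
    EventType  nvarchar(30),
    Parameters int,
    EventInfo  nvarchar(4000)
);

INSERT INTO #inputbuffer
    EXEC ('DBCC INPUTBUFFER(@@SPID) WITH NO_INFOMSGS');

SELECT EventInfo FROM #inputbuffer;   -- text of the triggering batch
DROP TABLE #inputbuffer;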
Hello, I have a big problem with a script: some instructions seem not to be executed. I don't understand it. If I execute it line by line it's OK, but the entire block is not. Please explain it to me and help me find the solution:
IF NOT EXISTS ( SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'ACS_ACL' AND COLUMN_NAME = 'ZoneUId' ) BEGIN
ALTER TABLE dbo.ACS_ACL ADD ZoneUId T_UID;
UPDATE dbo.ACS_ACL SET ZoneUId = '1' WHERE ZoneUId IS NULL;
print 'end of script'
END
result:
Msg 207, Level 16, State 1, Line 9 Invalid column name 'ZoneUId'.
the error is on line: UPDATE dbo.ACS_ACL SET ZoneUId = '000000001' WHERE ZoneUId IS NULL;
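A sketch of the likely explanation and one common workaround: the batch is compiled as a whole before anything runs, and while a missing table gets deferred name resolution, a missing column on an existing table does not, so the UPDATE that references ZoneUId fails at compile time even though the ALTER TABLE would have created the column first. Running it line by line works because each statement then becomes its own batch. Pushing the UPDATE into dynamic SQL defers the check until it actually executes:

IF NOT EXISTS ( SELECT * FROM INFORMATION_SCHEMA.COLUMNS
                WHERE TABLE_NAME = 'ACS_ACL' AND COLUMN_NAME = 'ZoneUId' )
BEGIN
    ALTER TABLE dbo.ACS_ACL ADD ZoneUId T_UID;

    -- Compiled only when it runs, after the column exists:
    EXEC (N'UPDATE dbo.ACS_ACL SET ZoneUId = ''1'' WHERE ZoneUId IS NULL;');

    print 'end of script'
END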
I have a program that allows various users to log in to the SQL Server. On the login window there is a dropdown that lists all the databases on the server, and the user can select which database they wish to log in to. To get the list of databases on the server I use a login created for the program, which we will call 'worker'. The program logs in as worker and runs "SELECT * FROM sys.databases". Before I load the login window, I check to make sure the worker login exists and has not been corrupted (in case a user deletes it or changes the password, etc.), and if it has been corrupted or deleted, I recreate the login using "CREATE LOGIN worker WITH PASSWORD = '123'".
The problem occurs after I recreate the worker login. The login is created successfully, but when the login window appears and logs in as worker to get the list of databases, I get an error saying the login failed for worker. If I open the login window again, everything works fine (the worker login isn't created again, as it already exists). Further, if I run the same code but put a breakpoint in and step through everything, it works the first time.
Is there an amount of time that is necessary to wait for the CREATE LOGIN statement to take effect?
I have a SQL 2000 DTS package which executes without errors when run from the DTS design screen or by right-clicking the package and selecting Execute. When I schedule the package to run at a certain time, or highlight the job and select Start, the package comes back with an error saying there was a problem with a transformation. I have tried scheduling the DTS both by right-clicking on it and selecting Schedule, and by creating a new job that uses DTSRun with an operating-system command. I get the same results! Help!
I have a DTS package that is used to dispatch mails to all the customers. When I execute the package manually, by selecting it and then clicking Execute Package from the context menu, it runs to completion successfully. But when I schedule it to execute via a job, I get the following error message:
... DTSRun: Executing... DTSRun OnStart: DTSStep_DTSExecuteSQLTask_2 DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_2 DTSRun OnStart: DTSStep_DTSActiveScriptTask_1 DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1 DTSRun OnStart: DTSStep_DTSExecuteSQLTask_3 DTSRun OnError: DTSStep_DTSExecuteSQLTask_3, Error = -2147220421 (8004043B) Error string: The task reported failure on execution. Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 1100 Error Detail Records: Error: -2147220421 (8004043B); Provider Error: 0 (0) Error string: The task reported failure on execution. Error source: Microsoft Data Transformation Services (DTS) Package Help file: sqldts80.hlp Help context: 1100 Error: -2147467262 (80004002); Provider Error: 0 (0) Error string: No such interface supported Error source: Microsoft OLE DB Provider for SQL Serv... Process Exit Code 1. The step failed.
FYI: this package is on machine7. When I copy the package to my system (machine3) and execute it using a job, it succeeds :rolleyes:
Using dm_exec_query_stats in 2005, I know I can get the number of executions for a particular sql_handle, but is it possible to get the number of executions for a SQL statement in version 7 or 2000? Also, is it possible to get reads/writes/etc. in these earlier versions? Thanks.
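For 2005, here is a minimal sketch of pulling execution counts and I/O per cached statement. There is no equivalent DMV in 7.0 or 2000; there, a Profiler or server-side trace is about the only per-statement option.

SELECT TOP 50
       qs.execution_count,
       qs.total_logical_reads,
       qs.total_logical_writes,
       st.text AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.execution_count DESC;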
Hi! How can I get the percent complete of an executing procedure in MS SQL Server, like in Enterprise Manager and/or dbMgr2k? I use Visual C# and MSDE. Thanks for the help, gregory
I was wondering about the way in which stored procedures are executed. For example, suppose I had a stored procedure (A) that executed a sub stored procedure (B), and two different users were to execute the stored procedure at the same time. What would happen? Would the second user have to wait until the first user had finished (A and B)? Does each user get a copy of the stored procedure, or can one user be running stored procedure (B) whilst the other is running stored procedure (A)?
I have a ssis project that contains a parent package and 2 child packages. The parent package loads data from multiple flat files into a database and then kicks off the 2 child packages using separate execute package tasks.
The child package has a data flow. Within the data flow, data is extracted from a database, transformed using a Script Component, and then loaded into a second database.
The problem I have is that the second child package is not working. It appears as if the data is being extracted fine. However, the Script Component does not seem to be executed, so the columns that are being transformed are not changed and the write to the database fails. When I send the error rows to a database table with all the fields as varchar(200), the write completes but the transformed columns are blank.
Also, if I put a message box or ComponentMetaData.FireInformation in the Script Component, I get no output.
However, when I run this project on my development machine it runs fine, but when I run it on the staging server it gives the problems explained above.
I'm running a report that uses multiple datasets, one of which is basically a script written in SQL. I would like this dataset, let's call it code, to execute before the other ones. How can I ensure that this happens?
I have a field in a table that contains SQL expressions such as: 'Low_Quote_Mortgage_' + REPLACE(CONVERT(VARCHAR, GETDATE(), 10),'-','') + '.txt'
I am trying to create a proc/view to return the data in an evaluated form, i.e. Low_Quote_Mortgage_021108.txt
I tried using a function:
create FUNCTION [dbo].[DynamicFileConversion]
(
    @dynamic_filename VARCHAR(100)
)
RETURNS VARCHAR(100)
AS
BEGIN
    DECLARE @Static_filename VARCHAR(100)
    SET @Static_filename = (@dynamic_filename)
    RETURN @Static_filename
END
SELECT suptype_id, oen.dbo.DynamicFileConversion(REPLACE(dynamic_filename, '%day%', 'GETDATE()')) AS Link FROM abc.dbo.ondemand
That didn't work... The only way I can think of to do this is by first inserting the data into a temp table and then using a cursor to go through each record, linking it back to the table... it just seems really inefficient. Does anyone know a better way?
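One cursor-free sketch, assuming the column really does hold complete SQL expressions as in the example above, that they are trusted (this is plain dynamic SQL, so it is wide open to injection otherwise), and that nvarchar(max) is available: concatenate every row's expression into one UNION ALL statement and execute it once.

DECLARE @sql nvarchar(max);

SELECT @sql = COALESCE(@sql + N' UNION ALL ', N'')
            + N'SELECT ' + CONVERT(nvarchar(20), suptype_id) + N' AS suptype_id, '
            + dynamic_filename + N' AS Link'
FROM abc.dbo.ondemand;

EXEC sp_executesql @sql;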
Greetings all. I hope someone has seen this before; it's driving me up the wall. Here's the scenario: import a table via BULK INSERT. Some of the imported columns need to have their first 6 characters stripped. Write a generic routine to do this. Here's the script:
declare @pattern varchar(10)
set @pattern = 'yn'   --only 2 fields are to be used with this table

declare @cnt int
set @cnt = 0
while @cnt < len(rtrim(@pattern))
begin
    set @cnt = @cnt + 1
    if @cnt = 1 and substring(@pattern,@cnt,1) = 'y'
    begin
        print 'updating f1'
        update TempImport set f1 = substring(f1,6,len(f1)-6)
    end
    if @cnt = 2 and substring(@pattern,@cnt,1) = 'y'
    begin
        print 'updating f2'
        update TempImport set f2 = substring(f2,6,len(f2)-6)
    end
    if @cnt = 3 and substring(@pattern,@cnt,1) = 'y'
    begin
        print 'updating f3'
        update TempImport set f3 = substring(f3,6,len(f3)-6)
    end
end
The result???
updating f1
Server: Msg 207, Level 16, State 1, Line 10
Invalid column name 'f3'.
Server: Msg 207, Level 16, State 1, Line 10
Invalid column name 'f3'.
Server: Msg 207, Level 16, State 1, Line 10
Invalid column name 'f3'.
Okay, ignoring the fact that the loop seems to run 3 passes, the table only contains 2 fields, f1 and f2. Since @pattern is only 2 characters long, this should not be a problem. The trace shows only 'updating f1', yet all of the updates are being executed; the 'if' statements seem to be ignored by the updates. What am I missing?
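A sketch of what may be going on and one way around it: the 'if' only guards execution, but every UPDATE in the batch still has to compile against TempImport, and f3 does not exist in this particular import, hence the Msg 207 errors. Building each UPDATE as dynamic SQL, and checking INFORMATION_SCHEMA first, keeps non-existent columns away from the compiler entirely. The SUBSTRING below follows the stated intent of stripping the first 6 characters:

declare @pattern varchar(10)
set @pattern = 'yn'

declare @cnt int, @col sysname, @sql nvarchar(4000)
set @cnt = 0
while @cnt < len(rtrim(@pattern))
begin
    set @cnt = @cnt + 1
    set @col = 'f' + convert(varchar(2), @cnt)

    if substring(@pattern, @cnt, 1) = 'y'
       and exists ( select * from INFORMATION_SCHEMA.COLUMNS
                    where TABLE_NAME = 'TempImport' and COLUMN_NAME = @col )
    begin
        print 'updating ' + @col
        set @sql = N'update TempImport set ' + @col
                 + N' = substring(' + @col + N', 7, len(' + @col + N'))'
        exec (@sql)
    end
end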
Hi, I'm pondering building a custom connection manager, and within the Validate() method I want to initiate a connection to a SQL Server instance, check something out, and then close the connection at the end of the method. Here's the code I have so far:
public override Microsoft.SqlServer.Dts.Runtime.DTSExecResult Validate(Microsoft.SqlServer.Dts.Runtime.IDTSInfoEvents infoEvents)
{
    DTSExecResult execRes = DTSExecResult.Success;

    if (String.IsNullOrEmpty(_serverName))
    {
        infoEvents.FireError(0, "SqlConnectionManager", "No server name specified", String.Empty, 0);
        execRes = DTSExecResult.Failure;
    }
    else
    {
        //Establish a connection and check that it is pointing to an MDM DB
        SqlConnection sqlConnToValidate = new SqlConnection();
        sqlConnToValidate.ConnectionString = this._connectionString;
        try
        {
            sqlConnToValidate.Open();
            if (!IsMDMDatabase(sqlConnToValidate))
            {
                execRes = DTSExecResult.Failure;
            }
        }
        catch (Exception e)
        {
            infoEvents.FireError(0, "MDMConnectionManager", e.Message, String.Empty, 0);
            execRes = DTSExecResult.Failure;
        }
        finally
        {
            if (sqlConnToValidate.State != ConnectionState.Closed)
                sqlConnToValidate.Close();
        }
    }

    return execRes;
}
I'm worried about the overhead of establishing a connection every time Validate() is called. So the questions are
When does Validate() get called?
Does it get called for every component/task that uses it?
Does anyone suspect that what I'm doing here is necessarily a bad thing, or not?