I have some triggers that insert data into a table located on a remote server. If the remote server happens to be down, I want to continue with the original transaction anyway. Can I use a BEGIN TRY / BEGIN CATCH block to continue with the transaction in the trigger, and also insert the data into a temporary local table if the remote insert fails?
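For what it's worth, a minimal sketch of the pattern described above, assuming SQL Server 2005 or later and entirely hypothetical names (LinkedSrv as the remote server, dbo.RemoteAudit as the remote table, dbo.LocalAuditFallback as the local fallback). Whether the CATCH block can keep the outer transaction alive depends on the error: if XACT_STATE() returns -1 the transaction is already doomed and the fallback insert is not possible either.

CREATE TRIGGER trg_MyTable_AuditInsert ON dbo.MyTable
AFTER INSERT
AS
BEGIN
    BEGIN TRY
        -- attempt the remote insert (hypothetical linked server / table names)
        INSERT INTO LinkedSrv.RemoteDB.dbo.RemoteAudit (Id, CreatedAt)
        SELECT Id, GETDATE() FROM inserted;
    END TRY
    BEGIN CATCH
        -- fall back to a local table only if the transaction is still committable
        IF XACT_STATE() <> -1
            INSERT INTO dbo.LocalAuditFallback (Id, CreatedAt)
            SELECT Id, GETDATE() FROM inserted;
    END CATCH
END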
How can I stop SQL Server from rolling back the transaction after an insert fails? I would like to keep the successfully inserted rows, continue with the next ones after a single one has failed, and then report the ones that failed.
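One common approach, sketched here with hypothetical table names (dbo.Staging as the source, dbo.Target as the destination, dbo.FailedRows for the report) and assuming SQL Server 2005 or later: insert row by row inside TRY/CATCH so that a single failure neither aborts the batch nor rolls back the rows already inserted.

DECLARE @Id int;
DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id FROM dbo.Staging;
OPEN cur;
FETCH NEXT FROM cur INTO @Id;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        -- each INSERT commits on its own (autocommit), so earlier successes are kept
        INSERT INTO dbo.Target (Id)
        SELECT Id FROM dbo.Staging WHERE Id = @Id;
    END TRY
    BEGIN CATCH
        -- record the failed row and keep going
        INSERT INTO dbo.FailedRows (Id, ErrorMessage)
        VALUES (@Id, ERROR_MESSAGE());
    END CATCH
    FETCH NEXT FROM cur INTO @Id;
END
CLOSE cur;
DEALLOCATE cur;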
If an update in a stored procedure fails/errors (as in (a) below), the procedure will not continue with (b). I need the code in (b) to run regardless of whether the previous update was successful. Any ideas?
(a)
if (@Data2 = 6)
begin
    update SCHEDULE set Start_CallBack = getdate() where (Block = @Block)
end

(b)
WHILE @Block_Count > 0
BEGIN
    UPDATE BLOCK SET Status = @Block_Status
END
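A hedged sketch of one way to get that behaviour on SQL Server 2005 or later: wrap (a) in TRY/CATCH so an error there does not abort the procedure, then fall through to (b). The table and variable names are taken from the snippets above; the decrement of @Block_Count is added only so the sketch terminates.

IF (@Data2 = 6)
BEGIN
    BEGIN TRY
        UPDATE SCHEDULE
        SET Start_CallBack = GETDATE()
        WHERE Block = @Block;
    END TRY
    BEGIN CATCH
        -- log or swallow the error so execution continues with (b)
        PRINT ERROR_MESSAGE();
    END CATCH
END

WHILE @Block_Count > 0
BEGIN
    UPDATE BLOCK SET Status = @Block_Status;
    SET @Block_Count = @Block_Count - 1;   -- added here only so the loop ends
END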
We have a stored procedure that failed with: Could not continue scan with NOLOCK due to data movement
We are running with ISOLATION LEVEL READ UNCOMMITTED. There are other jobs running, some of which might be hitting these tables (all using the ROWLOCK hint, though I know that's not guaranteed); however, this stored proc would not be going near the same rows. But even if it were, we'd be happy with either the before-look or the after-look. This needs to be a low-impact job with minimal impact on the other jobs, so we can't take out locks. Is there any hint we can use to do this? For example, can we tell the query to just wait until the data has stopped moving and then try again?
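There is no hint that makes the scan wait for the data to stop moving, but error 601 is generally safe to retry. A minimal sketch of a retry loop (the query against dbo.SomeTable is hypothetical; an alternative such as enabling READ_COMMITTED_SNAPSHOT avoids the error entirely at the cost of tempdb version-store usage):

DECLARE @Retry int, @msg nvarchar(2048);
SET @Retry = 3;
WHILE @Retry > 0
BEGIN
    BEGIN TRY
        SELECT COUNT(*) FROM dbo.SomeTable WITH (NOLOCK);   -- hypothetical low-impact query
        SET @Retry = 0;                                      -- success, stop retrying
    END TRY
    BEGIN CATCH
        IF ERROR_NUMBER() = 601 AND @Retry > 1
        BEGIN
            SET @Retry = @Retry - 1;
            WAITFOR DELAY '00:00:05';                        -- back off before retrying
        END
        ELSE
        BEGIN
            SET @msg = ERROR_MESSAGE();
            SET @Retry = 0;
            RAISERROR(@msg, 16, 1);                          -- different error or out of retries: re-raise
        END
    END CATCH
END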
Does anybody know this error code? I get the error from a JDBC connection occasionally but can't find an explanation. The original message is in German. The number should be correct. The error occurs when a transaction is aborted and the JDBC connection is lost.
'Der Server konnte die Transaktion nicht fortsetzen. Beschreibung: 60000000b8' ("The server could not continue the transaction. Description: 60000000b8")
I have a parent package containing an Execute SQL Task, which runs a script provided by a file connection, and then calls a child package through an Execute Package Task. I have set the parent package's TransactionOption property to "Required" and the child's to "Supported". When I run the package, the child package hangs, but it starts running when I use direct input in the parent package instead of the script file. Can you please help explain why this is happening and what the solution is? I need that script file.
My transaction log backup is failing with the following error. Does anyone know how I can fix it?

(null) Microsoft (R) SQLMaint Utility (Unicode), Version 8.00.194
Copyright (C) Microsoft Corporation
(null) Logged on to SQL Server as 'NT AUTHORITY\SYSTEM' (trusted)
Starting maintenance plan 'Maint Plan - TLogs' on 5/7/2001 8:18:11 AM
Backup can not be performed on database. This sub task is ignored. (null)

Thanks
Our transaction log backup job runs every two hours from 7:00am to 8:59pm. Today, our full backup job and the first transaction log backup job ran successfully. Subsequent transaction log backups have failed with the following error message under Job History:
"The job failed. The Job was invoked by Schedule 13 (Schedule 1). The last step to run was step 1 (Step 1)."
The caveats are that the database is in Full recovery mode and a .TRN file is generated on the server. Also, the SQL Server log says the job ran successfully. Any ideas?
I created a trigger to update another table when a record is inserted. This trigger appears to work about 99% of the time, but occasionally it fails. Here is the code for the trigger; does anyone see anything wrong with it?
CREATE TRIGGER trgUpdateClaimCapture ON ResponseCapture
FOR INSERT
AS
BEGIN
    UPDATE ClaimCapture
    SET ResponseTime = GETDATE()
    FROM Inserted I, ClaimCapture C
    WHERE C.Site = I.Site
      AND C.Batch = I.Batch
      AND C.RxNumber = I.RxNumber
      AND C.ServiceDate = I.ServiceDate
      AND C.ResponseTime IS NULL
END
Okay. I changed the times that the transaction logs are backed up, via the built-in maintenance schedule. It was then that I started to get failures, but only on the transaction log backup for master. The error that I get from the history log is "Backup can not be performed on this database. This sub task is ignored." If I look in the file that is saved to disk:
Starting maintenance plan 'DB Maintenance Plan1' on 09/11/2004 02:30:00 Backup can not be performed on database 'master'. This sub task is ignored.
End of maintenance plan 'DB Maintenance Plan1' on 09/11/2004 02:30:31 SQLMAINT.EXE Process Exit Code: 1 (Failed)
I changed the backup back to its original time, as this was the only change made, but that has not resolved the problem. I am new to SQL and still finding my feet. All the other SQL maintenance plans that I changed are working fine.
I'm running SQL 2005 SP2 and having a problem with a Database Maintenance task created in SSMS doing transaction log backups of all user databases.
When I look at dbo.sysmaintplan_logdetail I see the message:
Database 'Database1' will not be backed up because it does not have its recovery model set to Full or BulkLogged.
Great - that's what we want! Skip the SIMPLE recovery mode databases. However, the next entry is the following error message:
Executing the query "BACKUP LOG [Database1] TO DISK = N'D:\MSSQL$Windows\MSSQL.2\MSSQL\Backup\Database1\Database1_backup_200707231600.trn' WITH NOFORMAT, NOINIT, NAME = N'Database1_backup_20070723160023', SKIP, REWIND, NOUNLOAD, STATS = 10 " failed with the following error: "The statement BACKUP LOG is not allowed while the recovery model is SIMPLE. Use BACKUP DATABASE or change the recovery model using ALTER DATABASE. BACKUP LOG is terminating abnormally.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Which says to me it is trying to back up the same database(s) it just said it was skipping!
What's going on? I see that there is a fix to a similar error message with the cumulative update KB936305 (http://support.microsoft.com/kb/936305) for maintenance cleanup tasks.
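Until a fix like that applies here, one workaround is a plain T-SQL job step that checks the recovery model before issuing BACKUP LOG, so SIMPLE databases are genuinely skipped. A minimal sketch for a single database (the backup path is hypothetical):

IF EXISTS (SELECT 1 FROM sys.databases
           WHERE name = N'Database1'
             AND recovery_model_desc <> 'SIMPLE')
BEGIN
    -- only reached when the database is in FULL or BULK_LOGGED recovery
    BACKUP LOG [Database1]
    TO DISK = N'D:\Backup\Database1.trn'   -- hypothetical path
    WITH NOINIT, SKIP, STATS = 10;
END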
I need a trigger (well, I don't *need* one, but it would be optimal!) but I can't get it to work because it references ntext fields. Is there any alternative? I could write it in laborious code in the application, but I'd rather not! DDL for table and trigger below. TIA, Edward

if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[tblMyTable]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[tblMyTable]
GO
CREATE TABLE [dbo].[tblMyTable] (
    [fldCSID] uniqueidentifier ROWGUIDCOL NOT NULL ,
    [fldSubject] [ntext] COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
    [fldDescription] [ntext] COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
    [fldKBSubject] [ntext] COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
    [fldKBDescription] [ntext] COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
CREATE TRIGGER PrepopulateKBFieldsFromQuery ON dbo.tblMyTable
FOR INSERT
AS
BEGIN
    IF UPDATE(fldKBSubject)
    BEGIN
        UPDATE tblMyTable
        SET fldSubject = i.fldKBSubject
        FROM inserted i INNER JOIN tblMyTable ON i.fldCSID = tblMyTable.fldCSID
    END
    IF UPDATE(fldKBDescription)
    BEGIN
        UPDATE tblMyTable
        SET fldDescription = i.fldKBDescription
        FROM inserted i INNER JOIN tblMyTable ON i.fldCSID = tblMyTable.fldCSID
    END
END
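One alternative worth noting: text/ntext/image columns cannot be referenced in the inserted and deleted tables of an AFTER trigger, but they can be referenced in an INSTEAD OF trigger. Below is a hedged, untested sketch of the same idea rewritten as an INSTEAD OF INSERT trigger. For brevity it drops the IF UPDATE() checks and always copies the KB fields into the display fields; preserving a supplied fldSubject when fldKBSubject is NULL would need extra handling.

-- INSTEAD OF triggers *can* read ntext columns of inserted, unlike AFTER triggers
CREATE TRIGGER PrepopulateKBFieldsFromQuery ON dbo.tblMyTable
INSTEAD OF INSERT
AS
BEGIN
    INSERT INTO dbo.tblMyTable
        (fldCSID, fldSubject, fldDescription, fldKBSubject, fldKBDescription)
    SELECT
        i.fldCSID,
        i.fldKBSubject,        -- pre-populate the subject from the KB subject
        i.fldKBDescription,    -- pre-populate the description from the KB description
        i.fldKBSubject,
        i.fldKBDescription
    FROM inserted i;
END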
SQL Server 2000, C#, ASP.NET and ADO.NET. I have spent 3 days trying to figure out why my ADO.NET transaction executes my stored procedures and runs the .Commit() on the server (I verified this with SQL Profiler), but the data doesn't show up in the database tables and I receive no error from SQL or ASP.NET. What gives? Has anyone else ever heard of such a thing? Please help, before I lose my sanity.
Hello, I have one insert and one update in my insert/update trigger. Is there any problem if I use BEGIN TRANSACTION, ROLLBACK TRANSACTION, or COMMIT TRANSACTION when both the insert and the update do not succeed?
I have a requirement wherein I need to get the current transaction ID from an insert/update trigger. In SQL Server 6.5 I could get it from syslogs based on the current process ID, whereas in version 7 that is no longer possible. I don't know how to get the output of the DBCC LOG command for the current session and assign it to a variable.
Hi, I wasn't sure if this was the correct place for this question since it involves both T-SQL and ODBC, but I thought I would start here. A little system information to start: I'm using SQL 2005 on a Windows 2003 server. An application connects to my database through ODBC and adds records to a table one at a time (no batches).
I have an INSTEAD OF trigger created on a table that tests for the existence of a valid foreign key. If the key doesn't exist in the related table, I want to insert the record into a "bad records" table.
My trigger works perfectly when I add a bogus record to the table directly in the SQL Server Management Studio table view or through an insert query. However, when I run a live test with an application connected via ODBC that is adding records to the table, the trigger fails to catch the errors and will not update the "bad records" table. If a record is legitimate (i.e. has a valid foreign key), then the final bit of trigger code fires without a problem.
Does anyone know of a problem with INSTEAD OF triggers working across multiple tables when connecting through ODBC? It just seems weird that the trigger works perfectly in one situation but not in another.
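For reference, a minimal sketch of the kind of INSTEAD OF trigger being described, with entirely hypothetical names (dbo.Orders whose CustomerID must exist in dbo.Customers, and dbo.BadOrders as the "bad records" table); the real trigger presumably does something equivalent:

CREATE TRIGGER trg_Orders_Insert ON dbo.Orders
INSTEAD OF INSERT
AS
BEGIN
    -- rows whose foreign key exists go into the real table
    INSERT INTO dbo.Orders (OrderID, CustomerID, Amount)
    SELECT i.OrderID, i.CustomerID, i.Amount
    FROM inserted i
    WHERE EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.CustomerID = i.CustomerID);

    -- rows with an unknown foreign key go into the reject table
    INSERT INTO dbo.BadOrders (OrderID, CustomerID, Amount, RejectedAt)
    SELECT i.OrderID, i.CustomerID, i.Amount, GETDATE()
    FROM inserted i
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.CustomerID = i.CustomerID);
END

Written this way the trigger is set-based, so it behaves the same whether the client inserts one row at a time over ODBC or many rows in a single statement.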
I'm updating one column using a trigger, but I am getting the error below.
UPDATE failed because the following SET options have incorrect settings: 'NUMERIC_ROUNDABORT'. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or filtered indexes and/or query notifications and/or XML data type methods and/or spatial index operations.
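For what it's worth, this message usually means the session that issued the UPDATE firing the trigger has one of the required options set incorrectly: for indexes on computed columns and indexed views, NUMERIC_ROUNDABORT must be OFF and ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, CONCAT_NULL_YIELDS_NULL, and QUOTED_IDENTIFIER must be ON. A hedged sketch of the session-level fix (the UPDATE itself is hypothetical); note the options have to be correct on the connection that fires the trigger, since the trigger inherits them:

SET NUMERIC_ROUNDABORT OFF;
SET ANSI_NULLS ON;
SET ANSI_PADDING ON;
SET ANSI_WARNINGS ON;
SET ARITHABORT ON;
SET CONCAT_NULL_YIELDS_NULL ON;
SET QUOTED_IDENTIFIER ON;

-- hypothetical statement that fires the trigger
UPDATE dbo.MyTable SET SomeColumn = 1 WHERE SomeKey = 42;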
We have a rather large environment and have just a couple of boxes out there where we are getting "cannot begin distributed transaction" on inserts and updates, while selects work fine. Inserts and updates work fine outside the BEGIN TRAN / COMMIT, so it's definitely DTC.
We have checked the configuration, and the source box is set to "No authentication required"; the same is true for the destination.
We have verified the credentials running the service and changed them (same problem), and we have uninstalled and re-installed MSDTC per Microsoft's instructions.
We have run all the tools for checking DTC (DTCPing etc.) and followed those procedures, which in the past has typically resolved any DTC issues. Other than swapping out the offending PC for a new one, we are at a loss.
I have created an insert trigger on table A. When I insert a record into table A, the trigger bound to table A fires immediately.

My question is: will the action be considered a single transaction (both the row insertion into the table and the subsequent trigger action), or two different transactions?

And, if it is a single transaction, what will happen if the trigger fails?
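A small way to see this for yourself, using a hypothetical table dbo.TableA: the trigger runs inside the same transaction as the INSERT that fired it, so @@TRANCOUNT is already 1 inside the trigger, and a failure or ROLLBACK there undoes the insert as well.

CREATE TABLE dbo.TableA (Id int);
GO
CREATE TRIGGER trg_TableA_Insert ON dbo.TableA
AFTER INSERT
AS
BEGIN
    -- the INSERT that fired this trigger opened the transaction we are now inside
    PRINT 'Inside trigger, @@TRANCOUNT = ' + CAST(@@TRANCOUNT AS varchar(10));
END
GO
INSERT INTO dbo.TableA (Id) VALUES (1);   -- prints @@TRANCOUNT = 1: one shared transaction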
Over some non-working days I got rid of the old merge replication I had and started a new one. Today 5 of my 6 servers are working normally, but on the 6th one I am getting the following error:
The transaction ended in the trigger. The batch has been aborted. Table [dbo.A...] into which you are trying to insert, update, or delete data is currently being upgraded or initialized for merge replication. On the publisher, data modifications are disallowed until the upgrade completes and the snapshot has successfully run. On the subscriber, data modifications are disallowed until the upgrade completes or the initial snapshot has been successfully applied and it has synchronized with the publisher.
Although this message is descriptive enough, I must say that my publisher says the whole process of synchronizing and initializing data with the subscriber is done. In fact, it has already completed more than 100 replication cycles.
There's a trigger defined on a table that doesn't allow inserting or updating a field if the value inserted already exists in the table. Here's the trigger:
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER TRIGGER [tUI_ID_ISIN] ON [dbo].[T_table]
FOR INSERT, UPDATE
AS
--IF ((SELECT TRIGGER_NESTLEVEL()) = 1)
BEGIN
    declare @ISIN varchar(12)
    set @isin = (select id_isin from inserted)
    if not @isin is null
    begin
        if exists (select 1 from t_titoli, inserted
                   where t_table.cd_isin = @isin
                     and t_table.cd_titolo <> inserted.cd_titolo)
        begin
            Raiserror ('ID ISIN alredy exists!', 16, 1)
            rollback tran
        end
    end
END
it all works fine when the user inserts/updates the record from the front end.
When I run the SSIS that inserts records into the table, I get the following error: The transaction ended in the trigger. The batch has been aborted.
There are no records that should fail the control. In other words, all ISINs are unique.
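One possible cause worth ruling out: if the SSIS data flow inserts more than one row per statement (for example with fast load), the single-row assumption in "set @isin = (select id_isin from inserted)" breaks inside the trigger, and any error raised there surfaces to the client as "The transaction ended in the trigger. The batch has been aborted." A hedged, multi-row-safe rewrite of the same check, assuming t_titoli in the original FROM clause was meant to be T_table:

ALTER TRIGGER [tUI_ID_ISIN] ON [dbo].[T_table]
FOR INSERT, UPDATE
AS
BEGIN
    -- treat inserted as a set instead of pulling a single value into @isin
    IF EXISTS (SELECT 1
               FROM dbo.T_table t
               INNER JOIN inserted i
                       ON t.cd_isin = i.id_isin
                      AND t.cd_titolo <> i.cd_titolo
               WHERE i.id_isin IS NOT NULL)
    BEGIN
        RAISERROR ('ID ISIN already exists!', 16, 1)
        ROLLBACK TRAN
    END
END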
I have a stored procedure in SQL 2005 that queries and updates a linked Oracle server. The SP runs fine from Management Studio, but when it is called by a CLR trigger I get the following error message:
The operation could not be performed because OLE DB provider "OraOLEDB.Oracle" for linked server "ORACLE_LINK" was unable to begin a distributed transaction. (Source: MSSQLServer, Error number: 7391) Get help: http://help/7391
A .NET Framework error occurred during execution of user defined routine or aggregate 'PriorityTrigger': System.Data.SqlClient.SqlException: The operation could not be performed because OLE DB provider "OraOLEDB.Oracle" for linked server "ORACLE_LINK" was unable to begin a distributed transaction. Changed database context to 'pims'. OLE DB provider "OraOLEDB.Oracle" for linked server "ORACLE_LINK" returned message "New transaction cannot enlist in the specified transaction coordinator. ". System.Data.S (Source: MSSQLServer, Error number: 6549) Get help: http://help/6549
I've created an SSIS package that contains a Sequence Container with TransactionOption = Required. Within the container, a number of Execute Package Task components run serially and perform "Upserts" to dimension and fact tables on our production server. The destination db configuration is loaded into each of these packages using an XML configuration file. The structure of these "Upsert" packages is nearly identical, yet some execute correctly and others fail. Those that fail all produce the same error messages.
These messages appear during Pre-Execute
[Insert new dimension record [1627]] Error: The AcquireConnection method call to the connection manager "DW" failed with error code 0xC0202009.
[DTS.Pipeline] Error: component "Insert new dimension record" (1627) failed the pre-execute phase and returned error code 0xC020801C.
... which are followed by
[Connection manager "DW"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction.".
[Connection manager "DW"] Error: An OLE DB error has occurred. Error code: 0x8004D00A.
While still in debug mode, I can check the properties of the "DW" connection and successfully test the connection within the packages that fail.
The same packages run successfully when tested outside the container (i.e. no transaction) or when the configuration file is modified to point the "DW" connection to a development version of the db which is running on the same server as the source database.
I have successfully used DTCtester to verify that transactions from source to destination server are working correctly. Also tried setting DelayValidation = True with no change. I have opened a case with Microsoft and am awaiting a reply so I thought I'd throw a post out here to see if anyone else has encountered this and might have a resolution. Here's some more on the environment:
Source Server:
Windows Server 2003 Enterprise Edition SP1 SQL Server 2005 Enterprise Edition SP0
Destination Server:
Windows Server 2003 Enterprise Edition SP1 SQL Server 2000 Enterprise Edition SP3 (clustered)
Thank you in advance for any feedback you might be able to provide.
I have a C# stored procedure that is being called from a trigger. When I execute it from Management Studio, it works just fine. But when I update a record in the table that has the trigger that calls the SP, I get the error "transaction context in use by another session". I've tried a few of the "fixes" that I found through searching, but so far nothing seems to work. What I've tried so far is:
Removing the transaction from my code; making sure my code is only using one connection; setting XACT_ABORT ON.
In the ECASE table there is a trigger that gets the maximum value of the case_id column in ECASE for the given project, increments it by one, and inserts it into the ECASE table.

When we insert a new record into the ECASE table, this trigger fires and fills in the case_id column value.

When I run with multiple threads, the transaction is rolled back because of the trigger. The reason is that a lock is taken on the project table while getting the maximum value of case_id for the project.
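For context, a sketch of a multi-row-safe version of the pattern being described, with guessed names (dbo.ECASE with project and case_id columns) and assuming SQL Server 2005 or later. The UPDLOCK/HOLDLOCK hints serialize concurrent readers of the same project's MAX(case_id), which trades throughput for correctness:

CREATE TRIGGER trg_ecase_insert ON dbo.ECASE
INSTEAD OF INSERT
AS
BEGIN
    -- MAX + 1 per project; UPDLOCK/HOLDLOCK keeps two sessions from reading the same MAX,
    -- at the cost of blocking (and, under load, possible deadlocks)
    INSERT INTO dbo.ECASE (project, case_id)
    SELECT i.project,
           ISNULL(m.max_id, 0)
           + ROW_NUMBER() OVER (PARTITION BY i.project ORDER BY (SELECT NULL))
    FROM inserted i
    OUTER APPLY (SELECT MAX(e.case_id) AS max_id
                 FROM dbo.ECASE e WITH (UPDLOCK, HOLDLOCK)
                 WHERE e.project = i.project) m;
END

An IDENTITY column or a dedicated counter table updated atomically usually scales better under concurrency than any MAX()-based scheme.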
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thanks.
I have designed an SSIS package for an ETL process. In my package I have to read data from tables and then insert it into another table with the same structure.
To read the data I have written dynamic T-SQL based on some conditions; it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it gives me a timeout error.
I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting the CommandTimeout property to 0.
If I use 0 for CommandTimeout, then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction." Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Does anybody have an idea?