Problem Returning A Timestamp Column Inside A TSQL Transaction
Jan 15, 2007
I cannot manage to fetch the new timestamp value inside a TSQL transaction. I have tried to select @LastChanged both before and after committing the transaction. A TimestampCheck variable is used to get the timestamp value of the custom business object; it is checked against the row being updated to see if they match. If they do, the update begins as a transaction. I send @LastChanged (timestamp) as an InputOutput param, but I also have the same problem sending in a dedicated timestamp param (@NewLastChanged):
select @TimestampCheck = LastChanged from ADD_Address where AddressId = @AddressId

if @TimestampCheck is null
begin
    RAISERROR ('AddressId does not exist in ADD_Address: E002', 16, 1) -- AddressId does not exist.
    return -1
end
else if @TimestampCheck <> @LastChanged
begin
    RAISERROR ('Timestamps do not match up, the record has been changed: E003', 16, 1)
    return -1
end

Begin Tran Address

Update ADD_Address
set StreetNumber = @StreetNumber, AddressLine1 = @AddressLine1, StreetTypeId = @StreetTypeId, AddressLine2 = @AddressLine2, AddressLine3 = @AddressLine3, CityId = @CityId, StateProvidenceId = @StateProvidenceId, ZipCode = @ZipCode, CreateId = @CreateId, CreateDate = @CreateDate
where AddressId = @AddressId

select @error_code = @@ERROR, @AddressId = scope_identity()

if @error_code = 0
begin
    commit tran Address

    select @LastChanged = LastChanged
    from ADD_Address
    where AddressId = @AddressId

    if @LastChanged is null
    begin
        RAISERROR ('LastChanged has returned null in ADD_Address: E004', 16, 1)
        return -1
    end
    if @LastChanged = @TimestampCheck
    begin
        RAISERROR ('LastChanged original value has not changed in ADD_Address: E005', 16, 1)
        return -1
    end
    return 0

I do not have this problem if I do not use a TSQL transaction. Is there a way to capture the new timestamp inside a transaction, or have I missed something?

Thank you,
jspurlin
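If it helps, one thing stands out in the script above: SCOPE_IDENTITY() returns NULL after an UPDATE (it only reflects identity values generated by inserts in the current scope), so the line select @error_code = @@ERROR, @AddressId = scope_identity() wipes out @AddressId, the follow-up SELECT matches no row, and @LastChanged keeps its old value. A minimal sketch of the transaction without that assignment, assuming the same table and parameters as above:

Begin Tran Address

Update ADD_Address
set StreetNumber = @StreetNumber -- ...remaining columns as in the original
where AddressId = @AddressId

if @@ERROR <> 0
begin
    rollback tran Address
    return -1
end

-- @AddressId is left untouched, so the new rowversion is readable here,
-- inside the transaction, even before the commit:
select @LastChanged = LastChanged
from ADD_Address
where AddressId = @AddressId

commit tran Address
return 0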
I'm trying to find out how to determine the server name of the server that a stored proc is running on, something similar to DB_NAME(), i.e. SERVER_NAME().
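For reference, a short sketch of the built-ins that report this; as far as I know, SERVERPROPERTY('ServerName') also tracks server renames, while @@SERVERNAME reflects the name configured via sp_addserver:

SELECT @@SERVERNAME AS ServerNameConfigured,
       SERVERPROPERTY('ServerName') AS ServerNameActual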
I recently found a little error in a stored procedure that was included in a project handed over to me.

The sp was rather simple. It just inserted a record into a table and returned the identity and the timestamp as follows:
IF @@ERROR > 0
BEGIN
    SELECT @int_InterventionID = 0
    RETURN @@ERROR
END
ELSE
BEGIN
    SELECT @int_InterventionIDReturned = MAX(InterventionID) FROM tblIntervention
    SELECT @ts_TimestampReturned = [Timestamp] FROM tblIntervention
        WHERE InterventionID = @int_InterventionIDReturned
    SELECT @int_InterventionID = @int_InterventionIDReturned,
        @ts_Timestamp = @ts_TimestampReturned
    RETURN 0
END
I figured that it should be using @@IDENTITY for the intervention identity rather than MAX(InterventionID), so I changed it to...
IF @@ERROR > 0
BEGIN
    SELECT @int_InterventionID = 0
    RETURN @@ERROR
END
ELSE
BEGIN
    SELECT @int_InterventionIDReturned = @@IDENTITY
    SELECT @ts_TimestampReturned = [Timestamp] FROM tblIntervention
        WHERE InterventionID = @int_InterventionIDReturned
    SELECT @int_InterventionID = @int_InterventionIDReturned,
        @ts_Timestamp = @ts_TimestampReturned
    RETURN 0
END
It returns @int_InterventionIDReturned, but the timestamp now comes back as NULL. Why? How can I ensure that I always get the timestamp of the record it has just inserted?
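A hedged guess at the cause: @@IDENTITY returns the last identity value generated on the connection in any scope, so if tblIntervention has a trigger that inserts into some other table with an identity column, @@IDENTITY will hold that other table's value and the timestamp lookup then finds no row. SCOPE_IDENTITY() is immune to that; a sketch reusing the variables from the procedure above:

-- SCOPE_IDENTITY() only sees identity values generated in the current scope,
-- so inserts performed by triggers cannot skew it the way @@IDENTITY can.
SELECT @int_InterventionIDReturned = SCOPE_IDENTITY()

SELECT @ts_TimestampReturned = [Timestamp]
FROM tblIntervention
WHERE InterventionID = @int_InterventionIDReturned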
For starters, please feel free to move this if it is in the wrong forum.
The issue I have is this. I have been asked to delete all information from a table that was inserted before May 12 this year. The issue is that when the DB was created, whoever designed it neglected to add a timestamp column for the user data table (the one I need to purge). Does SQL, by default, happen to store insert times? Would it be something that might hide in a log file somewhere?
I have scheduled a database backup at 9 pm and 5 transaction log backups. The transaction log backups happen at 11 am, 1 pm, 3 pm and 5 pm (the device does not get initialized) and one at 11 pm where I initialize the backup device. Now I have another server with the same database on it. I restore these backups onto this server by simply copying the files and then using the Restore ...From Device...Add file... option. I am able to restore the DB backup, but when I try to restore the log backups I get this error message:

'Specified file 'G:...' is out of sequence. current time stamp is Feb 9 2000 3:54 pm while dump was from Feb 9 2000 3 pm.'

When I am restoring the logs I start from the 1st backup, but I still get this error. Can someone please let me know what the problem is and how to resolve it? Thanks.
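One possibility, offered tentatively: because the log backups are appended to one device without initializing it, each backup becomes a numbered backup set inside the same file, and each set has to be picked out explicitly with FILE = n and restored in order. A sketch with placeholder paths and names:

-- List the backup sets inside the copied device file first:
RESTORE HEADERONLY FROM DISK = 'G:\Backups\MyDb_log.bak'

-- Then restore in backup-set order, recovering only after the last one:
RESTORE DATABASE MyDb FROM DISK = 'G:\Backups\MyDb_full.bak' WITH NORECOVERY, REPLACE
RESTORE LOG MyDb FROM DISK = 'G:\Backups\MyDb_log.bak' WITH FILE = 1, NORECOVERY
RESTORE LOG MyDb FROM DISK = 'G:\Backups\MyDb_log.bak' WITH FILE = 2, NORECOVERY
RESTORE LOG MyDb FROM DISK = 'G:\Backups\MyDb_log.bak' WITH FILE = 3, NORECOVERY
RESTORE LOG MyDb FROM DISK = 'G:\Backups\MyDb_log.bak' WITH FILE = 4, RECOVERY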
I'm experiencing a weird problem with log shipping in SQL 2005.
I've set up log shipping for a production database between two sites. The standby database is being updated correctly and everything seems to be working as expected except for one detail: the names of the transaction log backups are generated with a UTC timestamp instead of my local time zone.
The data below is extracted from the backup history:
Is it possible to do the following?

Begin Transaction
Insert into some table
select @@identity as someid
insert into relatedtables(foreignkey) values(someid)
Commit
Or does SQL not know the value of @@IDENTITY until the actual transaction commits, thus not allowing the select @@identity as someid? How does one circumvent this issue?
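For what it's worth, the identity value is visible immediately after the INSERT, inside the open transaction; no commit is needed first. The snag in the pseudocode above is that someid has to be a variable. A sketch with placeholder table and column names:

BEGIN TRANSACTION

INSERT INTO SomeTable (SomeColumn) VALUES ('x')

DECLARE @someid int
SET @someid = SCOPE_IDENTITY()   -- visible right away; safer than @@IDENTITY around triggers

INSERT INTO RelatedTable (ForeignKey) VALUES (@someid)

COMMIT TRANSACTION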
Hi to all, probably I'm just doing something stupid, but I would like you to tell me that (if it is so) and point out the solution. Here is the thing: I have an sp in which I call other sps. The only problem is that the name of the inner sp is built dynamically and executed over sp_executesql:

create procedure major_sp
@prm_outer_1 varchar(1),
@prm_outer_2 varchar(2)
as
-- some coding
set @nvar_stmtStr = N'exec @int_exRetCode = test_sp_' + @prm_outer_1 + @prm_outer_2
set @nvar_stmtStr = @nvar_stmtStr + ' @prm_1, @prm_2, @prm_3, @prm_4 output'
set @nvar_prmStr = N'@prm_1 nvarchar(128), ' +
    N'@prm_2 nvarchar(128), ' +
    N'@prm_3 nvarchar(4000), ' +
    N'@int_exRetCode int output, ' +
    N'@prm_4 varchar(64) output'
exec sp_executesql @nvar_stmtStr, @nvar_prmStr, @prm_1, @prm_2, @prm_3,
    @int_exRetCode = @int_exRetCode output, @prm_4 = @prm_4 output

Now the issue: I have transactions inside test_sp_11, let's say, where the 11 is @prm_outer_1 + @prm_outer_2. These procedures exist inside the database but are called dynamically depending on the parameters. When I call the specified sp directly, the rollback transaction works without any problem. Inside these test_sp_xx procedures there is a call to another sp (let's say inside_sp), and a transaction is included there. When it is called over major_sp, the rollback is not performed because of this error:

Server: Msg 6401, Level 16, State 1, Procedure inside_sp, Line 54
Cannot roll back transactio_bubu. No transaction or savepoint of that name was found.

The funniest part is, if there is no error inside, the commit works without any problem! The question is (because I'm almost sure this is an issue): is it possible to have a transaction inside a dynamically called sp over sp_executesql? Is it OK to do that?

Thanks in advance
Matik
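Transactions inside a proc invoked via sp_executesql are allowed; the dynamic call shares the caller's transaction space. The Msg 6401 usually means ROLLBACK TRAN was given a name that is neither the outermost transaction's name nor a savepoint created in the current transaction. A common pattern for an inner proc, sketched with placeholder names, is to roll back only to its own savepoint:

CREATE PROCEDURE inside_sp
AS
DECLARE @startedTran bit
SET @startedTran = 0

IF @@TRANCOUNT = 0
BEGIN
    BEGIN TRAN
    SET @startedTran = 1
END

SAVE TRAN inside_sp_save          -- savepoint local to the current transaction

-- ... the actual work ...

IF @@ERROR <> 0
    ROLLBACK TRAN inside_sp_save  -- undoes only this proc's work
IF @startedTran = 1
    COMMIT TRAN                   -- close the tran this proc opened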
I have a trigger which requires dropping a member from a role and includes user transactions. However, you cannot call sp_droprolemember from within a user transaction. Looking at the code for the procedure gives me a line.

I can find no documentation on SetRoleMember or %%Owner, and attempts to run this line as an exec statement fail with "syntax error near '%'". Nor can I find any way of dropping a member from a role using T-SQL or DDL constructs.
How do I drop a member from a role using T-SQL code and avoiding sp_droprolemember? And where do I find information about the %%Owner construct?
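I could not find %%Owner documented anywhere either, so it is presumably internal and unsafe to rely on. If upgrading is an option, SQL Server 2012 and later expose a documented DDL statement for this (names below are placeholders):

-- Documented replacement for sp_droprolemember on SQL Server 2012+:
ALTER ROLE SomeRole DROP MEMBER SomeUser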
When I execute the following set of statements, only 8 gets inserted into the table instead of 6 and 8.
Create Table BPTest (id int)

Declare @Id Int
Set @Id = 0

While (@Id < 10)
Begin
    begin tran
    Insert into BPTest values (@Id)
    if (@Id > 5)
    begin
        if (@Id % 2 = 0)
        begin
            print 'true'
            print @Id
            commit tran
        end
        else
        begin
            print 'false'
            print @Id
            rollback tran
        end
    end
    Set @Id = @Id + 1
End

Select * from BPTest
drop table BPTest
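The likely reason, sketched below: iterations 0 through 5 each issue BEGIN TRAN without ever committing or rolling back, so @@TRANCOUNT just climbs. The COMMIT at @Id = 6 only decrements the count (nothing becomes durable until it reaches zero), and the ROLLBACK at @Id = 7 unwinds everything back to the outermost BEGIN TRAN, discarding all inserts so far. Only 8, whose BEGIN TRAN / COMMIT pair is balanced at a count of zero, survives; 9 is rolled back again.

-- Demonstration of the nesting behavior:
BEGIN TRAN
BEGIN TRAN
SELECT @@TRANCOUNT   -- 2
COMMIT TRAN          -- count drops to 1; nothing is durable yet
ROLLBACK TRAN        -- unwinds to the outermost BEGIN TRAN
SELECT @@TRANCOUNT   -- 0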
I'm using transactions with my SqlConnection (sqlConnection.BeginTransaction()).

I do an insert into my User table, and then subsequently use the generated identity (via @@identity) to make an insert into another table. The foreign key relationship is dependent on the generated identity. For some reason, with transactions I can't retrieve the insert identity (it always comes back as 1). However, I need the inserts to be under a transaction so that one cannot succeed while the other fails. Does anyone know a way around this problem?
So here's a simplified explanation of what I'm trying to do:
Begin Transaction
Insert into User table
Retrieve inserted identity (userID) via @@Identity
Insert into UserContact table, using userID as the foreign key
Commit Transaction
Because of the transaction, userID is 1, so an insert cannot be made into the UserContact table because of the foreign key constraint. I need this to be a transaction in case one or the other fails. Any ideas?
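A tentative diagnosis: the 1 may not be the identity at all but the rows-affected count (e.g. the return value of ExecuteNonQuery). Selecting the identity in the same batch, on the same connection and transaction, and reading it with ExecuteScalar sidesteps that; a sketch with placeholder names:

INSERT INTO [User] (UserName) VALUES (@UserName)
SELECT SCOPE_IDENTITY() AS userID   -- read this with ExecuteScalar on the same transaction

The UserContact insert can then run with that value inside the same transaction, so the all-or-nothing behavior is kept.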
Is it normal practice to check @@ERROR after a SELECT statement that retrieves data from a table, or should we only check @@ERROR after a DELETE/INSERT/UPDATE type of statement? The SQL statement is inside a transaction.
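For what it's worth, a small sketch of the mechanics: @@ERROR is reset by every statement, so whichever statements are checked (SELECTs included, though much code only bothers for DML and for SELECTs that assign variables), it must be captured immediately. This assumes an open transaction:

DECLARE @err int

SELECT Name FROM SomeTable WHERE Id = 1   -- placeholder query
SET @err = @@ERROR                        -- capture before anything else runs

IF @err <> 0
    ROLLBACK TRAN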
Can we include a Send Mail Task inside a transaction? We intend to recall a sent email in case there is an error in the subsequent task, if that is possible.
We are designing an ETL solution for a BI project using SSIS.
We need to load a dimension table from a source DB into the DW; inside the DW there are two tables for each source dimension table: a current dimension table and a historical dimension table.
The source table includes technical columns that indicate if each record was processed correctly by ETL procedures.
Each row in the source table can be a new record or an existing one. If it's a new record, a corresponding new record should be inserted into the current dimension table of the DW; if it's an existing one, a new record should be inserted into the historical dimension table and the existing record in the current dimension table should be updated.
Furthermore, for each record we need to update the source table with an error code. If the record was processed successfully the code is zero; otherwise it contains an error code.
I tried to use a single data flow task, using the 'OLE DB Command' transformation for managing updates and the 'OLE DB Destination' for managing inserts.
The problem is that some inserts and updates should be executed inside a single transaction, for each record; for example, if the source contains a new record I should insert the record into the current dimension table and update the source record with error code zero if everything goes fine.
Is there some way to group tasks in the data flow pane into a single transaction?
I have some VB.NET code that starts a transaction and then executes a lot of queries one by one. Somehow, when I take out the transaction part, my queries get executed in around 10 minutes. With the transaction in place, one query takes more than 30 minutes and then I get a timeout. I have checked sp_lock for my process id and noticed a lot of exclusive locks on different objects. Using sp_who I could not see any deadlocks. I even tried to set the isolation level to READ UNCOMMITTED and still have the same problem. As I said, once I execute my queries without being in a transaction everything works great. Can you help me find the problem?
BEGIN TRAN
INSERT Z_Test SELECT STATE_CODE FROM View_STATE_CODE
COMMIT
View_STATE_CODE points to a remote SQL Server named PROD. There is an error when I run this query:
Server: Msg 8501, Level 16, State 1, Line 12
MSDTC on server 'PROD' is unavailable.
Server: Msg 7391, Level 16, State 1, Line 12
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction.
OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d01c].
It looks like the remote server is not available inside the local transaction. How can I handle that?
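One workaround, assuming MSDTC cannot simply be enabled on 'PROD': do the distributed read before opening the transaction, staging the rows locally, so the transaction itself touches only local objects and never needs to be promoted to a distributed one.

-- Remote read with no local transaction open, so no MSDTC enlistment:
SELECT STATE_CODE INTO #StateCodes FROM View_STATE_CODE

BEGIN TRAN
INSERT Z_Test SELECT STATE_CODE FROM #StateCodes
COMMIT

DROP TABLE #StateCodes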
Hi guys, I have a temporary table called #CTE with the columns [Account], [Name], [RowID Table Level], [RowID Data Level]. I need to change the column type of [RowID Table Level] and [RowID Data Level] to integer, and set [RowID Table Level] as an identity (index) starting from 1, incrementing by 1 each time. What is the right syntax for SQL Server 2000?
I am trying to solve the question in the link below: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2093921&SiteID=1
Thanks in advance, Aldo.
I have tried the code below, but I am getting a syntax error...
ALTER TABLE #CTE ALTER COLUMN [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
I have also tried:
ALTER TABLE #CTE MODIFY [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
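Both attempts fail because ALTER TABLE ... ALTER COLUMN takes a single column at a time and, more importantly, cannot add an IDENTITY property to an existing column. On SQL Server 2000 one workaround is to rebuild the temp table with SELECT ... INTO and the IDENTITY() function; #CTE2 is a placeholder name:

SELECT [Account],
       [Name],
       IDENTITY(int, 1, 1) AS [RowID Table Level],       -- regenerated as an identity
       CAST([RowID Data Level] AS int) AS [RowID Data Level]
INTO #CTE2
FROM #CTE

DROP TABLE #CTE
-- continue working with #CTE2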
Hi guys, in the query below I need to insert a new column that shows the last document date. For example:

Document Date    Last Date
2006/01/15       2007/11/30
2007/03/25       2007/11/30
2007/11/30       2007/11/30

I need to show the last (max) date for each row. Thanks in advance for any help, Aldo.
SELECT Stock.VALUEDATE AS 'Document Date'
FROM Stock AS Stock
INNER JOIN ACCOUNTS AS Accounts
    LEFT OUTER JOIN ACCSORTNAMES AS AccSortNames
        ON Accounts.SORTGROUP = AccSortNames.ACCSORTCODE
    LEFT OUTER JOIN EXTRANOTES AS ExtraNotes01
        ON Accounts.ACCOUNTKEY = ExtraNotes01.KEF AND ExtraNotes01.NOTEID IN ('17')
    LEFT OUTER JOIN EXTRANOTES AS ExtraNotes02
        ON Accounts.ACCOUNTKEY = ExtraNotes02.KEF AND ExtraNotes02.NOTEID IN ('18')
    ON Stock.ACCOUNTKEY = Accounts.ACCOUNTKEY
INNER JOIN DOCUMENTSDEF AS DocumentsDef
    ON Stock.DOCUMENTID = DocumentsDef.DOCUMENTID
WHERE Accounts.SORTGROUP Between '0' AND '379'
    AND Accounts.SORTGROUP Not Between '100' AND '150'
    AND Stock.DOCUMENTID In ('1', '4', '5', '11', '78', '95')
    AND Stock.STATUS Not In ('0', '2')
    AND Accounts.SORTGROUP Between '0' AND '379'
    AND Accounts.ACCOUNTKEY Like '313%'
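A sketch of one way to get there on SQL Server 2000, shown against a simplified query; the same CROSS JOIN derived table can be bolted onto the full statement above (with the matching WHERE filters inside it if the max should respect them):

SELECT Stock.VALUEDATE AS 'Document Date',
       LastDates.LastDate AS 'Last Date'
FROM Stock AS Stock
CROSS JOIN (SELECT MAX(VALUEDATE) AS LastDate FROM Stock) AS LastDates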
I am getting an error about "Cannot perform a shrinkfile operation inside a user transaction", but I don't have a shrinkfile command in my procedure. Does SQL hang on to that command if it was received earlier in a different procedure?
I have a table with a [Timestamp] field which is a datetime data type. What I need to do is add one hour to the timestamp. Each entry in the [Timestamp] field looks like this: 2008-03-09 16:44:06.313
What is the best way to do this?
(Should I be using UPDATE [Tablename] or ALTER TABLE [Tablename]?)
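UPDATE is the right tool here: ALTER TABLE changes the column's definition, while UPDATE changes the stored values. A minimal sketch, assuming every row should shift:

UPDATE [Tablename]
SET [Timestamp] = DATEADD(hour, 1, [Timestamp])
-- add a WHERE clause if only some rows should be shifted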
ASP.NET application, MS SQL DB, and a table with a timestamp(8) column.
Error: Cannot insert a non-null value into a timestamp column. Use INSERT with a column list or with a default of NULL for the timestamp column.
I'm using an SQLDataAdapter and typed DataSet, inserting a row into the dataset, and calling Update() to send changes to the DB.
The thing that baffles me about receiving this error is that the DB column is nullable, and I get this error whether I attempt to insert NULL or a valid byte array into the column.
(No - I am not trying to insert a datetime into the column.)
The error just doesn't seem too descriptive of the problem I'm having, and I'm quite confused.
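The rule behind the message, as far as I can tell: a timestamp (rowversion) column is always generated by the server and can never be assigned, not even NULL, so it must be absent from the INSERT column list (and marked read-only in the typed DataSet so the adapter stops sending it). A sketch with placeholder names:

-- Name every column except the timestamp one; the server fills it in:
INSERT INTO MyTable (Col1, Col2)
VALUES (@Col1, @Col2)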
I have a question regarding the rowversion / timestamp column. I want to use it to track changes in some tables, and I created a non-clustered index on the timestamp column (with three includes) to get better performance while searching / gathering the data. It works very well, but I experienced a big performance hit on inserts, and my first thought is that it has something to do with this index.

I know that this NCI has to be updated with every insert and so could have an impact on performance, but we have a lot of other indexes as well, which also get updated with every insert. Why does this index / column have such a big impact? Our application uses a lot of dynamic cursors; maybe it has something to do with that?
I've created an SSIS package that contains a Sequence Container with TransactionOption = Required. Within the container, a number of Execute Package Task components run serially and perform "Upserts" to dimension and fact tables on our production server. The destination DB configuration is loaded into each of these packages using an XML configuration file. The structures of these "Upsert" packages are nearly identical, yet some execute correctly and others fail. Those that fail all produce the same error messages.
These messages appear during Pre-Execute
[Insert new dimension record [1627]] Error: The AcquireConnection method call to the connection manager "DW" failed with error code 0xC0202009.
[DTS.Pipeline] Error: component "Insert new dimension record" (1627) failed the pre-execute phase and returned error code 0xC020801C.
... which are followed by
[Connection manager "DW"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction.".
[Connection manager "DW"] Error: An OLE DB error has occurred. Error code: 0x8004D00A.
While still in debug mode, I can check the properties of the "DW" connection and successfully test the connection within the packages that fail.
The same packages run successfully when tested outside the container (i.e. no transaction) or when the configuration file is modified to point the "DW" connection to a development version of the db which is running on the same server as the source database.
I have successfully used DTCtester to verify that transactions from source to destination server are working correctly. Also tried setting DelayValidation = True with no change. I have opened a case with Microsoft and am awaiting a reply so I thought I'd throw a post out here to see if anyone else has encountered this and might have a resolution. Here's some more on the environment:
Source Server:
Windows Server 2003 Enterprise Edition SP1 SQL Server 2005 Enterprise Edition SP0
Destination Server:
Windows Server 2003 Enterprise Edition SP1 SQL Server 2000 Enterprise Edition SP3 (clustered)
Thank you in advance for any feedback you might be able to provide.
I cannot insert CURRENT_TIMESTAMP into a column of type timestamp. I defined a table that has a column of type timestamp, and I could not insert CURRENT_TIMESTAMP data into that column. My statement is below. What is wrong with it? Thanks.

set @cmdS2 = 'Insert Into TABLOLAR Values(' + CHAR(39) + @TABLO + CHAR(39) + ',' + CHAR(39) + @TABLO_ACIKLAMASI + CHAR(39) + ',' + CHAR(39) + CURRENT_TIMESTAMP + CHAR(39) + ',' + CHAR(39) + @KLLNC + CHAR(39) + ')'
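There seem to be two separate issues here, hedged accordingly. If the third column really is the timestamp (rowversion) type, no explicit value can be supplied at all and it must be left out of the insert entirely. If it is actually a datetime, the failure is the string concatenation: CURRENT_TIMESTAMP is a datetime and must be converted before being glued into @cmdS2. A sketch of the latter case:

-- Assumes the third column is datetime, not rowversion:
set @cmdS2 = 'Insert Into TABLOLAR Values('
    + CHAR(39) + @TABLO + CHAR(39) + ','
    + CHAR(39) + @TABLO_ACIKLAMASI + CHAR(39) + ','
    + CHAR(39) + CONVERT(varchar(23), CURRENT_TIMESTAMP, 121) + CHAR(39) + ','
    + CHAR(39) + @KLLNC + CHAR(39) + ')'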
I have been provided with a table where one of the columns is of the timestamp data type. My question is how to insert and update data in this column through my SQL statement. When I run my SQL statement, it gives me an error naming this column.
I'm trying to insert the values from this query into a table, so I can later check the history of memory usage:

SELECT [total_physical_memory_kb] / 1024 AS [Total_Physical_Memory_In_MB],
       [available_physical_memory_kb] / 1024 AS [Available_Physical_Memory_In_MB],
       [total_page_file_kb] / 1024 AS [Total_Page_File_In_MB],
       [available_page_file_kb] / 1024 AS [Available_Page_File_MB],
       [kernel_paged_pool_kb] / 1024 AS [Kernel_Paged_Pool_MB],
       [kernel_nonpaged_pool_kb] / 1024 AS [Kernel_Nonpaged_Pool_MB],
       [system_memory_state_desc] AS [System_Memory_State_Desc]
FROM [master].[sys].[dm_os_sys_memory]
What I'm missing is a way to insert the current timestamp every time I insert into the table. My plan is to use the INSERT INTO command.
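A sketch, assuming a pre-created history table (MemoryUsageHistory and its columns are placeholder names); GETDATE() stamps each insert, and a DEFAULT GETDATE() constraint on the column would be the other common route:

INSERT INTO MemoryUsageHistory (CaptureTime, Total_Physical_Memory_In_MB, Available_Physical_Memory_In_MB)
SELECT GETDATE(),
       [total_physical_memory_kb] / 1024,
       [available_physical_memory_kb] / 1024
FROM [master].[sys].[dm_os_sys_memory]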
I have transactional replication. The publisher DB contains a table called Course with a timestamp column; the column's values are unique within the publisher DB.

The subscriber DB also contains the same copy of the publisher DB's Course table data, but the timestamp column values are different.

So my problem is how to keep these two Course tables identical (same timestamp column values in both tables).

NOTE: The publisher and subscriber DBs reside under two different SQL Server instances.
I am using SSMA 6.0 for DB2. When trying to migrate data from a table with a timestamp column, it fails with the error "Hour, Minute, and Second parameters describe an unrepresentable DateTime." However, I don't see any issues with the source data.