Long Blocked Transaction Logging
May 5, 2007
Does anyone know of a feature in SQL Server that can detect and log transactions which have been blocked unusually long? My idea is:
If a transaction has run for over 30 minutes, then log the following information in the SQL Server log:
1. State of the transaction: Blocked, Running, Sleeping
2. Time it happened
3. Duration
4. Last SQL it ran
5. Which thread it is blocked by, and info about that thread: the last SQL it ran, the state of the thread.
Possibly it could run in SQL Server Agent, or SQL Profiler may have such a feature; I am not sure. Can anyone suggest? Thanks.
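For what it's worth, SQL Server 2005 has a 'blocked process threshold' sp_configure option that fires a Blocked Process Report trace event you can capture in Profiler; on SQL Server 2000, or as a simpler alternative, a SQL Server Agent job could poll sysprocesses every few minutes. A minimal sketch (waittime is in milliseconds); an Agent job would INSERT this into a log table rather than just SELECT it:
SELECT p.spid, p.status, p.blocked AS blocking_spid,
       p.waittime / 60000 AS wait_minutes, p.last_batch, p.cmd
FROM master..sysprocesses p
WHERE p.blocked <> 0
  AND p.waittime > 30 * 60 * 1000 -- blocked for over 30 minutes
The last SQL run by the blocked and blocking spids can then be captured with DBCC INPUTBUFFER(spid).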
View 4 Replies
Jul 20, 2005
I have written a stored procedure to list all tables in which rows, or the table itself, are locked. The only information I am not able to get is the time when the lock occurred. What I want is that if I run the procedure, it should show all locks on a table which are at least 5 (or x) seconds old. This way I can avoid momentary locks on a table which go away after a few seconds. Which table and column of the master database has that information? Thanks.
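SQL Server does not appear to record the time a lock was acquired, so there is no master table/column with exactly that. A hedged approximation is to join syslockinfo to sysprocesses and use last_batch (the start time of the session's last batch) as an upper bound on the lock's age (SQL Server 2000 system tables; the 5-second threshold stands in for x):
SELECT p.spid, l.rsc_dbid, l.rsc_objid, p.last_batch
FROM master..syslockinfo l
JOIN master..sysprocesses p ON p.spid = l.req_spid
WHERE l.rsc_type = 5 -- object/table-level locks
  AND DATEDIFF(second, p.last_batch, GETDATE()) >= 5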
View 3 Replies
Jun 10, 2004
Hi guys,
I have a stored procedure which generates the next sequence number.
It uses the SERIALIZABLE isolation level. The proc looks something like this:
begin
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRANSACTION
Sequence generating statement...
COMMIT TRANSACTION
set @NextSequenceValue = @NextSequenceValue
Return @NextSequenceValue
end
For some reason, when I call the proc with the parameters below to get the next sequence number, it hangs:
declare @NextSequenceValue int
set @NextSequenceValue = 0
exec spGetNextSequence 19, 'LotSequence', @NextSequenceValue output, Null
select @NextSequenceValue as NextSequenceValue
When I query sp_who2 it shows that my process ID is blocked by some other process ID, and when I run DBCC INPUTBUFFER (blockingprocessid), the query of the blocking process ID and my sequence-generation stored proc are not related at all.
Can you help shed some light on why my sequence-generating proc is hanging?
Help is appreciated.
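For illustration, a common cause of this kind of hang is two concurrent callers both taking shared range locks under SERIALIZABLE and then blocking each other when they try to convert them for the write. A single atomic UPDATE that reads and increments the value in one statement avoids the separate read-then-write window; a sketch, assuming a sequence table (dbo.SequenceTable, NextValue, and SequenceName are hypothetical names):
DECLARE @NextSequenceValue int
BEGIN TRANSACTION
UPDATE dbo.SequenceTable WITH (UPDLOCK, HOLDLOCK)
SET @NextSequenceValue = NextValue = NextValue + 1
WHERE SequenceName = 'LotSequence'
COMMIT TRANSACTION
SELECT @NextSequenceValue AS NextSequenceValue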
View 4 Replies
Oct 11, 2006
Hello,
I am trying to execute the following query, but when I do, TABLE1 locks and the query never finishes.
SERVER2 is a linked server.
BEGIN TRAN
INSERT INTO TABLE1
SELECT * FROM SERVER2.DATABASE2.DBO.TABLE2 WHERE TAB_F1 IS NULL
COMMIT TRAN
I have the same configuration on two other computers and there it works OK.
What is the problem?
Thank you!!
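Two hedged things to try: a single INSERT statement is atomic by itself, so the explicit BEGIN TRAN/COMMIT TRAN adds nothing here and may force the statement into a distributed transaction; and OPENQUERY makes the WHERE clause run on SERVER2 instead of pulling the whole remote table across:
INSERT INTO TABLE1
SELECT * FROM OPENQUERY(SERVER2, 'SELECT * FROM DATABASE2.DBO.TABLE2 WHERE TAB_F1 IS NULL')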
View 8 Replies
Oct 20, 2005
Let's say I have version 1 of a database, DB1, and I am creating the second database, DB2.
What I need is a log of all the SQL statements that were used to
change DB1 into DB2. This means recording both what happened in the GUI
and in SQL Query Analyzer.
Is there a way I can do this? I know SQL Server has a transaction log
somewhere. Is there a way to set this to output all the changes made
from a set date on a database into a SQL log file?
Thanks in advance for any help.
Jagdip
View 3 Replies
Jan 29, 2001
If you use the import feature of DTS to import data into various tables, and your database is set up with truncate log on checkpoint set to false, will SQL Server log those transactions?
Thanks,
Veronica
View 1 Replies
Oct 17, 2005
We have a publish job that we are trying to automate; the problem is getting the output back to the app or to a file. Originally we had PRINT statements. This worked great when we manually ran the proc in Query Analyzer and could capture the output; now that we are automating it from an application, I am not sure how to capture these PRINT statements. Ideally I would like to find this out.
The app is doing a try-catch block, so using something like isql.exe will not do the trick; otherwise that is the route we would go.
I tried logging everything to a table, but those inserts get rolled back with XACT_ABORT. What about the xp proc that logs to the event log? I thought of that, but that would make a real mess of the event log with all of our status messages.
Now we are considering using xp_cmdshell to call a batch file that outputs our status text. Is this my best option? I would prefer to capture all of the print statements, so if anyone knows how to do this, that would be preferable!
Thanks!
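For what it's worth, table variables are not rolled back with the surrounding transaction, so status rows collected in one survive an XACT_ABORT rollback and can be copied to a permanent table afterwards. A sketch (dbo.StatusLog is a hypothetical permanent table):
DECLARE @status TABLE (logged_at datetime DEFAULT GETDATE(), msg nvarchar(4000))
BEGIN TRANSACTION
INSERT INTO @status (msg) VALUES (N'step 1 complete')
-- ... work that may fail and roll back ...
ROLLBACK TRANSACTION -- @status keeps its rows
INSERT INTO dbo.StatusLog (logged_at, msg)
SELECT logged_at, msg FROM @status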
View 1 Replies
Jul 20, 2005
Hi
I take one nightly full database backup at 02:00 and back up the transaction log to one backup set every 15 minutes. The commands to do this are as follows, and are set up to run as database jobs:
-- Database Backup
BACKUP DATABASE [ThikosDatabase] TO [DatabaseBackUp] WITH NOINIT, NOUNLOAD, NAME = N'ThikosDatabase backup', NOSKIP, STATS = 10, NOFORMAT
-- Transaction Log Backup Every 15mins.
BACKUP LOG [ThikosDatabase] TO [TransactionLog] WITH NOUNLOAD, NAME = N'TransactionLogBackUp', NOSKIP, STATS = 10, DESCRIPTION = N'BackUp the transaction Log every 15 minutes every day.', NOFORMAT, RETAINDAYS = 0
At the moment the 15-minute transaction log backups are simply being appended one by one to the one backup set. I want the backed-up transaction logs in the backup set to be overwritten after 24 hours so disk usage does not add up. I thought RETAINDAYS would do this, as an expiry date is set at the time the backup is created, so I would have expected the set to be overwritten when that time comes around the next day. (But it doesn't.)
I think it is OK to overwrite the transaction logs in the backup set after 24 hours, as you only need the transaction logs since the last full backup (which happens nightly at 02:00) for recovery.
Does anyone have any ideas?
Many thanks.
Thiko!
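For what it's worth, RETAINDAYS never deletes or overwrites anything by itself; it only protects a backup set from being overwritten by a later backup that uses INIT, and NOINIT always appends regardless. A hedged variation (same device names as above; the timing is assumed) is to let the first log backup after the 02:00 full backup initialize the media, which succeeds once the previous day's sets, written with RETAINDAYS = 1, have expired:
-- first log backup of the day (e.g. 02:15): overwrite yesterday's expired sets
BACKUP LOG [ThikosDatabase] TO [TransactionLog]
WITH INIT, NOSKIP, RETAINDAYS = 1, STATS = 10, NAME = N'TransactionLogBackUp'
-- the remaining backups that day keep appending WITH NOINIT as before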
View 3 Replies
May 14, 2007
We use SQL PE 2000 in our lab to store sampled data (lots of data collected at a high sampling rate) in real time. I know this may be an uncommon use of SQL Server. For this particular application, performance is much more important than data recoverability; in other words, once-in-a-while data loss is acceptable. Actually, we have never used the transaction log for recovery since we started using SQL PE this way a few years ago. Certain transaction logging, such as bulk deletion, is particularly wasteful of the resource. As I understand from reading Books Online and some threads on this forum, the best thing we can do is use the bulk-logged recovery model. I am not sure that helps us at all, because bulk deletion (i.e. DELETE FROM MyTable WHERE ..., which may affect hundreds of thousands of records) is not on the list of minimally logged operations.
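For illustration, one common mitigation is deleting in small batches so no single transaction grows large; a sketch using the SQL Server 2000 SET ROWCOUNT idiom (MyTable and SampleTime are placeholder names):
DECLARE @cutoff datetime
SET @cutoff = DATEADD(day, -14, GETDATE())
SET ROWCOUNT 10000 -- delete at most 10000 rows per statement/transaction
WHILE 1 = 1
BEGIN
    DELETE FROM MyTable WHERE SampleTime < @cutoff
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0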
I am wondering if anyone could share some good advice.
Thanks in advance!
View 4 Replies
May 8, 2007
Hi,
We have a DW project where I would like to disable transaction logging. Any good advice, anyone?
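Transaction logging cannot be turned off in SQL Server; the closest equivalents are the SIMPLE (or BULK_LOGGED) recovery model combined with minimally logged operations such as SELECT INTO, BULK INSERT, and bcp. A minimal sketch (database and table names are made up):
ALTER DATABASE MyDW SET RECOVERY SIMPLE
-- minimally logged load under SIMPLE or BULK_LOGGED recovery:
SELECT * INTO dbo.FactStaging FROM dbo.FactSource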
View 3 Replies
May 9, 2000
Is it possible to disable logging of insert/update/delete operations in the transaction log for certain tables?
I have an application where I have to write temporary tables for calculations and printing, and all the inserts/updates/deletes absolutely don't have to be logged. They get deleted after use anyway.
Thomas Schoch
View 1 Replies
Jul 23, 2005
Hello All,
I'm writing some logs in my DB. I have a separate table containing: ID, Date, Source, Type, ErrorNo, Description. (The actual table structure is not important, which is why I do not post the script.)
I have one procedure, WriteToLog, which I call with parameters from my other procedures, triggers, etc. This procedure is called whenever there is any validation, exception, or information in those procedures and triggers that I would like to log separately, and it writes the record into this table.
But sometimes it is very hard to follow many logs, and I figured out that if I had something like a transaction ID or task ID, it would be much easier. My first idea was to pass a self-generated token (e.g. a date with an additional number) from procedure to procedure. But then I would need to change the call syntax everywhere I call WriteToLog.
Let's say that we have the following situation:
Procedure1:
Action1
Select1
CompareOfValues1
exec WriteToLog
Action2
CompareOfValues2
exec WriteToLog
Action3
exec Procedure2
Procedure2:
Action1
exec WriteToLog
Select1
Insert1
Trigger1Started:
Action1
CompareOfValues1
exec WriteToLog
Action2
Trigger1Exit:
Procedure2Exit:
Procedure1Exit:
Now, I could pass my generated number from SP to SP and so on. But I would like to know: does MS SQL Server have something that identifies a transaction as described above? In that case I would not need to pass any number; I would only need to get this number somehow in the WriteToLog SP and insert it into my log table.
Any suggestions? Thanks in advance.
Mateusz
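One hedged approach that avoids changing the WriteToLog call syntax is SET CONTEXT_INFO: stamp the connection once in the outermost procedure with a self-generated token, and read it back inside WriteToLog. A minimal sketch (works on SQL Server 2000, where the token is read back from sysprocesses; variable names are made up):
-- in the outermost procedure, once per run:
DECLARE @token varbinary(128)
SET @token = CAST(NEWID() AS varbinary(128))
SET CONTEXT_INFO @token
-- inside WriteToLog, with no extra parameter needed:
DECLARE @run_id varbinary(128)
SELECT @run_id = context_info FROM master..sysprocesses WHERE spid = @@SPID
-- insert @run_id into the log table alongside Date, Source, Type, ErrorNo, Description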
View 3 Replies
Jul 20, 2005
Hi, I am still not very proficient in SQL Server, so apologies if the question sounds basic. We have a script to clean out old unwanted data. It basically deletes all rows which are more than 2 weeks old. It deletes data from 33 tables, and the number of rows in each table runs into a few millions. What I see in the script (not written by me :-) ) is that all data is deleted within a single BEGIN TRANSACTION and COMMIT TRANSACTION. As I have a background in Informix, such an action in Informix may result in a "long transaction" problem. Does SQL Server have a similar concept? Also, won't there be a performance problem if all rows are kept locked until they are committed? TIA.
View 3 Replies
Nov 22, 1999
Is there a way to switch off transaction logging for insert and update statements, and not only for SELECT INTO and bulk copy?
View 1 Replies
Mar 18, 2008
Hi y'all, I am doing some searching in the archived threads, but I need to copy a table in a database to a new table in the same database; the new table will just hold test data. There are several million rows in the table, and I want to do the copy without logging the new inserts in the transaction log.
Is there an easy way to do this? I found this in my search efforts so far, but am just wondering if there is an easier/better way to accomplish what I want to do.
BTW, I normally wouldn't care, but the boss is complaining that it is taking too long to do the copy for a different team, so asked if I knew a way to copy data to a new table without logging. I don't, so here I am ;)
Here is what I found so far. The following 3 things need to be done:
1) create table as not logged initially
2) set autocommit=off
3) and activate the not logged initially option
Now the inserts happen without the use of transaction logs
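A caveat: those three steps are DB2 syntax; SQL Server has no NOT LOGGED INITIALLY option. The closest SQL Server equivalent is SELECT INTO, which is minimally logged when the database is in the SIMPLE or BULK_LOGGED recovery model (table names hypothetical):
SELECT * INTO dbo.TestCopy FROM dbo.SourceTable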
View 12 Replies
Sep 19, 2007
I'm curious: what are the considerations for choosing a good transaction retention time? The default SQL Server uses is 0 to 72 hours. With this setting I found that cleanup was taking upwards of 30 minutes (for a process that defaults to run every 10 minutes). I've read that lowering it can improve performance, and also that you don't want this running too long because of deadlock issues between this and the log reader. So how short is too short? Since the system this runs on is under heavy use, I'd like to optimize this as much as possible, which makes me think that the smaller the retention the better; but is something like 1 or 2 hours too short? What are the possible consequences of such a short period of time?
View 2 Replies
Jul 23, 2005
Hi all,
I would like to perform an
INSERT INTO LINKEDSVR.dbo.xyz.abc
SELECT ... FROM dbo.dfg
where LINKEDSVR is a linked server on another machine. Both servers are running SQL Server 2000 and have the DTC running.
When I run this batch from Query Analyzer without explicitly using transactions, it works well (takes about 5 sec); however, when I enclose it in
begin [distributed] tran / commit tran
the query runs forever.
I also tried using the local server as a linked server (loopback), but that did not work either.
Any suggestions?
Thanks,
Jo
View 2 Replies
Dec 6, 2005
I just wanted to post a follow-up to a message I posted some months ago about a long-running transaction that was blocking all other users. The link is below:
http://groups.google.com/group/comp...649bee2002646a2
Using the new row versioning functionality of SQL 2005 completely solved this problem. Books Online says there is a performance impact, but that the better performance of SQL 2005 in general might offset it. So far this seems to be the case. Just posting it here in case anyone else has the problem. The SQL command I had to execute to get everything working properly was:
ALTER DATABASE DBname
SET READ_COMMITTED_SNAPSHOT ON;
View 1 Replies
Jul 20, 2006
Hello,
I seem to be misunderstanding the way transactions work with Service Broker queues. We have developed and deployed a Service Broker application with 5 queues and a Windows service for each queue, on multiple servers (3 currently). Due to a last-minute issue, we had to not use transactions when the services executed a RECEIVE; I am now updating the code base to use transactions and am running into blocking issues. One of the services runs for 90 seconds (spooling to the printer), and all of the servers block on the RECEIVE operation for this queue. I thought that if I was receiving messages from a single conversation, other RECEIVEs against this queue would not block.
Thanks,
Jim Stallings
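Receives on different conversation groups of the same queue should indeed not block each other; the conversation group lock is held from the RECEIVE until the transaction commits. A minimal sketch of the transactional pattern being described (queue name hypothetical):
DECLARE @handle uniqueidentifier, @msg_type sysname, @body varbinary(max)
BEGIN TRANSACTION
WAITFOR (
    RECEIVE TOP (1)
        @handle = conversation_handle,
        @msg_type = message_type_name,
        @body = message_body
    FROM dbo.PrintQueue
), TIMEOUT 5000
-- the 90-second work happens here; only this conversation group stays locked
COMMIT TRANSACTION
If all messages arrive on a single conversation, every RECEIVE competes for the same conversation group lock, which would produce exactly this serialization.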
View 9 Replies
Mar 30, 2007
Hello,
I'm trying to figure out why my transaction log backup is taking up to an hour to complete. I started off with a full recovery model with a Full database back up every Sunday, differential backups every Tuesday/Thursday and log backups every 5 minutes. I would have thought that the log file backups would execute much quicker because I'm backing them up more often.
Here is my backup statement, I'm hoping I've got a wrong option that you can point out to me:
BACKUP LOG [xxxx] TO [LogFilexxxxBackups] WITH NOINIT , NOUNLOAD , NAME = N'xxxx log backup', SKIP , STATS = 10, NOFORMAT
View 1 Replies
Oct 1, 2007
We have a web-based third-party application that has both background processes and user activity requests running in the same database (SQL Server 2005 SP2). The problem is that a background process will start a long-running transaction and hold an exclusive lock on a few rows in a given table (a small table, <100 rows). The web clients need to scan this same table, but when their "select *" statements get to those locked row(s), the web client queries stall waiting for that exclusive lock to be released. This effectively brings the entire web front end to a halt because all clients must hit this table for each user action. I realize that this is the classic lock condition that multiversioning databases like Oracle, PostgreSQL, SQL Server Compact Edition, and other databases do not suffer because they don't use shared read locks like SQL Server. But since we're on SQL Server for this app, what is the way to get around this problem? Modifying the clients to use WITH (NOLOCK) is not an option... there will be major consistency issues unless the clients run in Read Committed or higher. Any ideas? We could tweak this app if needed. Does SQL Server 2008 introduce multiversioning or at least some mechanism to get around this problem? I did not see it mentioned on the Microsoft site, but maybe I missed it. Thanks in advance.
Austin
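For what it's worth, SQL Server 2005 already includes row versioning: SNAPSHOT isolation and, more transparently for existing clients, READ_COMMITTED_SNAPSHOT, under which readers see the last committed row version instead of blocking on the writer's exclusive lock. A minimal sketch (database name hypothetical; the ALTER needs no other active connections in the database):
ALTER DATABASE AppDb SET READ_COMMITTED_SNAPSHOT ON;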
View 6 Replies
Aug 25, 2007
How can I execute a long running transaction using something similar to the fire and forget pattern?
I intend to start the execution of a very long stored proc from within IIS. I would like to execute a sql script that will start the job and return immediately so that it doesn't hold an IIS thread.
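One common fire-and-forget pattern is to wrap the long stored proc in a SQL Server Agent job and start it from the web tier; sp_start_job returns as soon as the job has been asked to start, not when it finishes. A sketch (job name hypothetical):
EXEC msdb.dbo.sp_start_job @job_name = N'RunVeryLongProc'
Service Broker activation is the other common approach: send a message to a queue and let an activated procedure do the long work on its own connection.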
View 3 Replies
May 14, 2014
I know Oracle provides pragma directives to execute autonomous transactions, which I used on Oracle for logging. Now I want to do the same in SQL Server, but unfortunately such pragmas do not exist there. After some searching, I found that I can use a loopback linked server to make autonomous transaction calls.
Say I have Server A and Server B, where Server B is a loopback server for Server A and all my objects exist on Server A, and I want to use Server B for logging only. Should I have the logging tables on Server B, logging procedures on Server A, and call the logging procedures (via EXECUTE) from the application procedures residing on Server A?
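That layout (logging tables and a WriteToLog proc addressed through the loopback name, called from application procs on Server A) is the usual recipe. The piece that makes the call autonomous on SQL Server 2008 and later is turning off transaction promotion for the loopback server, so the remote call is not enlisted in the caller's transaction; a hedged sketch (server, database, and proc names are made up):
EXEC sp_serveroption @server = N'ServerB',
    @optname = N'remote proc transaction promotion', @optvalue = N'false'
-- from a procedure on Server A; this logging call survives a rollback of the caller:
EXEC ServerB.LogDb.dbo.WriteToLog @msg = N'validation failed in step 3'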
View 2 Replies
Jun 18, 2007
Greetings,
I am developing a package on my local workstation. I have defined two logging service providers. One is for SQL Server and the other is for the Windows Event Log. I am using the Dts.Log method in a script task to write log entries.
Logging is working properly with the SQL Server provider and rows are being inserted into the sysdtslog90 table. However, the only events that are being logged in the Windows Event Log are the package start and end events which I believe SSIS is doing automatically anyway.
Is there something I need to do to enable Windows Event Log logging other than defining a log provider and making sure it is checked active? Won't SSIS write to two different logs with one Dts.Log call? Any ideas on what might be going wrong with my approach?
Thanks,
BCB
View 3 Replies
Oct 17, 2007
Hi,
I decided to use the SQL Server log provider to store logging data of all my Integration Services packages. I also created some reports about this data for operating purposes.
A problem occurs in that the name of the executing package is not always written to the log, but only the name of the single task that failed. That is not very useful information for operations, because I see no way to get the name of the package from the information logged in the sysdtslog90 table in the database I defined for SSIS logging.
How do I configure the package to always log the package information into the table, too?
Best regards,
Stefoon
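As a hedged workaround on the reporting side: every execution also writes PackageStart/PackageEnd rows whose source column is the package name, and they share the executionid with the task-level rows, so the package name can be joined back in:
SELECT l.*, p.source AS package_name
FROM dbo.sysdtslog90 AS l
JOIN dbo.sysdtslog90 AS p
  ON p.executionid = l.executionid AND p.event = 'PackageStart'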
View 5 Replies
Sep 12, 2005
I recently read the Project REAL ETL design best practices whitepaper. I, too, want to do custom logging as I do today, and also use SSIS logging. The paper recommended using the variable system::PackageExecutionId to tie the two logging methods together.
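For reference, the built-in system variable whose value matches the executionid column in sysdtslog90 is System::ExecutionInstanceGUID (the whitepaper's name for it may differ). A hedged sketch of the statement an Execute SQL Task could run, with parameters mapped to System::ExecutionInstanceGUID and System::PackageName (the custom log table is hypothetical):
INSERT INTO dbo.CustomEtlLog (execution_id, package_name, logged_at)
VALUES (?, ?, GETDATE())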
View 4 Replies
May 31, 2008
Hi All
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the SET XACT_ABORT, and also tried with SET IMPLICIT_TRANSACTIONS OFF.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thanks.
View 1 Replies
Feb 22, 2007
I have designed an SSIS package for an ETL process. In my package I have to read the data from tables and then insert it into another table with the same structure.
For reading the data I have written dynamic T-SQL based on some conditions; it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.
I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting the CommandTimeout property to 0, but then I get:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
Please help; it's very urgent.
View 3 Replies
Feb 6, 2007
I am getting this error:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Does anybody have an idea?!
View 1 Replies
Dec 22, 2006
I have a sequence container; in this sequence container I have a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these are inside a For Each Loop container. When I run the package it works fine for the first loop, but it gives me an error on the second execution.
The message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
View 8 Replies
Jan 8, 2008
Hi,
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
Thanks in advance.
View 2 Replies
Dec 7, 1999
Hello...
Is it normal in SQL Server 6.5 for a user who is only running a query to block another user who tries to update/add records?
Note: the query is a complex one.
Many Thanks!
View 2 Replies