Lock Blocks
May 24, 2001
Hi, everyone,
Can anyone tell me how to find out who is blocking the table or record? The Lock Blocks counter is extremely high in Performance Monitor.
Regards,
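A minimal sketch of one way to find the blocker on SQL Server 7.0/2000, assuming you can query master: the blocked column of master..sysprocesses (the BlkBy column in sp_who2) holds the SPID of the session doing the blocking.

SELECT spid,
       blocked       AS blocking_spid,
       loginame,
       hostname,
       DB_NAME(dbid) AS dbname,
       waitresource,
       cmd
FROM master..sysprocesses
WHERE blocked <> 0

-- Once you have the blocking SPID, DBCC INPUTBUFFER(<that spid>) shows its last statement.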
I simply made my script task (or any other task) fail.
In my package's error handler I have an Execute SQL task for a stored procedure.
The SQL statement is set in the following expression (which works fine at design time):
"EXEC [dbo].[us_sp_Insert_STG_FEED_EVENT_LOG] @FEED_ID= " + (DT_WSTR,10) @[User::FEED_ID] + ", @FEED_EVENT_LOG_TYPE_ID = 3, @STARTED_ON = '"+(DT_WSTR,30)@[System::StartTime] +"', @ENDED_ON = NULL, @message = 'Package failed. ErrorCode: "+(DT_WSTR,10)@[System::ErrorCode]+" ErrorMsg: "+@[System::ErrorDescription]+"', @FILES_PROCESSED = '" + @[User::t_ProcessedFiles] + "', @PKG_EXECUTION_ID = '" + @[System::ExecutionInstanceGUID] + "'"
From progress:
Error: The Script returned a failure result.
Task SCR REIL Data failed
OnError - Task SQL Insert Error Msg
Error: A deadlock was detected while trying to lock variable "System::ErrorCode, System::ErrorDescription, System::ExecutionInstanceGUID, System::StartTime, User::FEED_ID, User::t_ProcessedFiles" for read access. A lock could not be acquired after 16 attempts and timed out.
Error: The expression ""EXEC [dbo].[us_sp_Insert_STG_FEED_EVENT_LOG] @FEED_ID= " + (DT_WSTR,10) @[User::FEED_ID] + ", @FEED_EVENT_LOG_TYPE_ID = 3, @STARTED_ON = '"+(DT_WSTR,30)@[System::StartTime] +"', @ENDED_ON = NULL, @message = 'Package failed. ErrorCode: "+(DT_WSTR,10)@[System::ErrorCode]+" ErrorMsg: "+@[System::ErrorDescription]+"', @FILES_PROCESSED = '" + @[User::t_ProcessedFiles] + "', @PKG_EXECUTION_ID = '" + @[System::ExecutionInstanceGUID] + "'"" on property "SqlStatementSource" cannot be evaluated. Modify the expression to be valid.
Warning: The Execution method succeeded, but the number of errors raised (4) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
And how did I get 4 errors? I only set my script task result to failure.
Hi
We are facing an acute situation in our web application. The technology is ASP.NET/VB.NET with SQL Server 2000.
Consider a scenario in which User 1 clicks a button that calls a SQL stored procedure. This procedure selects Group A of records from database Page1.
At the same time, User 2 clicks the same button, which calls the same stored procedure. This procedure selects Group B of records from database Page1.
So it's the same Page1 but different sets of records. At this moment, both calls have a shared lock on Page1 inside the procedure.
Now, in call 1, the next statement inside the procedure after selecting Group A of records is an update to those records. As soon as the update statement executes, SQL Server throws a deadlock exception as follows:
Transaction (Process ID 78) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction
We are able to understand why it's happening. It's because Group A and Group B of records are on the same Page1, and both users hold a shared lock on that page, so neither gets the exclusive lock needed to update its records, even though the records are different.
How can I resolve this issue? How can I get a lock on just the rows I want instead of the entire page?
Please advise. Thanks a bunch.
Pankaj
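One hedged approach, sketched below with hypothetical table and column names (Orders, GroupId, Status): take UPDLOCK and ROWLOCK hints on the initial SELECT so each call acquires update locks on only its own rows instead of shared page locks. That removes the shared-to-exclusive lock conversion on the same page, which is the usual source of this kind of deadlock.

DECLARE @GroupId int
SET @GroupId = 1   -- hypothetical parameter

BEGIN TRANSACTION

-- Acquire row-level update locks up front instead of shared page locks.
SELECT *
FROM dbo.Orders WITH (UPDLOCK, ROWLOCK)
WHERE GroupId = @GroupId

UPDATE dbo.Orders
SET Status = 'Processed'
WHERE GroupId = @GroupId

COMMIT TRANSACTION

An update lock is compatible with other sessions' shared reads but not with another update lock on the same row, so two calls touching different rows no longer block or deadlock each other.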
Hi,
I have a set of 2 DTS packages, one of which calls the other by forming a command line (dtexec) using an Execute Process task.
From the parent package -> Execute Process Task ->
dtsexec /F etc... /<pkg variable> = "servername"
Both the parent and the called package have a variable "User::DWServerSQLInstance", which is mapped to the SQL Server connection manager's ServerName property using an expression. The outer package has the above variable and so does the inner, called package (which gets its value through the command line when the outer package calls the inner one).
I "sometimes" get the following error:
OnError,I4,TESTDOMAdministrator,ACDWAggregation,{A1F8E43F-15F1-4685-8C18-6866AB31E62B},{77B2F3C7-6756-46EB-8C01-D880598FB4B3},5/22/2006 5:10:28 PM,5/22/2006 5:10:28 PM,-1073659822,0x,The variable "User::DWServerSQLInstance" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.
Help would be appreciated!
I have seen other posts on this but am not able to relate the solutions to my scenario.
I know blocks and deadlocks are different, but how related are they? It seems like when I get reports of deadlocks I always have blocks, and the blocks grow as time passes.
I need to test a block alert and would like to do so by causing a block on our test server. Does anyone have a script handy that will do this?
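A minimal sketch of one way to force a block on a test server, assuming a throwaway table you create yourself; run the two halves in separate connections.

-- One-time setup (any throwaway table will do).
CREATE TABLE dbo.TestBlock (Id int PRIMARY KEY, Col1 int)
INSERT INTO dbo.TestBlock VALUES (1, 0)

-- Session 1: take an exclusive row lock and hold it by not committing yet.
BEGIN TRANSACTION
UPDATE dbo.TestBlock SET Col1 = Col1 + 1 WHERE Id = 1
WAITFOR DELAY '00:05:00'   -- hold the lock for five minutes, then release
ROLLBACK TRANSACTION

-- Session 2 (separate connection, while session 1 is waiting):
-- this SELECT sits blocked until session 1 commits or rolls back.
SELECT * FROM dbo.TestBlock WHERE Id = 1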
I would like to log blocking to see what causes it.
One hacky way to do it is to run code similar to what populates the blk column in sp_who, on a job every 5 seconds, and store the results of that sp_who-like output whenever any row has a value greater than zero in the blk column.
That way, I can see what was going on at the time there was a problem.
Does anyone have a better strategy for this?
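A minimal sketch of that strategy, assuming a logging table you create yourself: a job step runs this every few seconds and only writes a snapshot when something is actually blocked.

-- One-time setup: hypothetical logging table.
CREATE TABLE dbo.BlockLog (
    CaptureTime  datetime DEFAULT GETDATE(),
    spid         smallint,
    blocked      smallint,
    waitresource nchar(256),
    loginame     nchar(128),
    hostname     nchar(128),
    dbname       sysname NULL,
    cmd          nchar(16)
)

-- Job step (scheduled every 5 seconds): capture blocked sessions and their blockers.
IF EXISTS (SELECT 1 FROM master..sysprocesses WHERE blocked <> 0)
    INSERT INTO dbo.BlockLog (spid, blocked, waitresource, loginame, hostname, dbname, cmd)
    SELECT spid, blocked, waitresource, loginame, hostname, DB_NAME(dbid), cmd
    FROM master..sysprocesses
    WHERE blocked <> 0
       OR spid IN (SELECT blocked FROM master..sysprocesses WHERE blocked <> 0)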
Is it possible to go through the history blocks and change field names? i.e.,
I want to change "PS001_Pump1.RT_Daily" to "PS001_Pump1.Runtime_Daily"
so I can access the historical data through the new tag name.
If so, is this done with an update query?
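If the tag names live in an ordinary table you can reach, a plain UPDATE would do it. The table and column names below are pure guesses at a historian schema, so check the product's documentation before changing anything; some historians key their data on an internal tag ID rather than the name.

UPDATE dbo.TagTable            -- hypothetical table and column
SET TagName = 'PS001_Pump1.Runtime_Daily'
WHERE TagName = 'PS001_Pump1.RT_Daily'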
Hi All,
I have seen a few other people have this error.
The package works fine when run from BIDS, DTExec, or DTExecUI. When I schedule it, I get these random errors (see below).
The main culprit is a variable called "RecordsetFileDIR", which is set using an expression: (@[User::_ROOT] + "RecordSets\")
A number of other variables use this as part of their expressions, and as they all fail, pretty much everything dies.
I have installed SP1 (not the beta) on the server. The package uses config files to set the value of _ROOT.
The error does not always seem to be with this particular variable, though; it is always a variable that uses an expression, but the errors are random. Also, it will run 3 out of 10 times without a problem, and I am the only person on the server at the time.
Any ideas?
Cheers,
Crispin
Error log:
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073659822,0x,The variable "User::RecordsetFileDIR" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073639420,0x,The expression for variable "rsHeaderFile" failed evaluation. There was an error in the expression.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
I need to find a way to send mail in blocks of 100 and 500 at intervals of 120 seconds. Does anyone know of any code for this?
Thanks!
Etymon
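A minimal sketch of one way to do it, assuming SQL Server 2005+ Database Mail and a hypothetical queue table dbo.MailQueue(MailID, Recipient, Sent) that your own process fills; the profile name is also an assumption.

DECLARE @batchSize int, @id int, @recipient varchar(255), @processed int
SET @batchSize = 100   -- or 500 for the larger blocks

WHILE EXISTS (SELECT 1 FROM dbo.MailQueue WHERE Sent = 0)
BEGIN
    SET @processed = 0

    -- Send one block of @batchSize messages.
    WHILE @processed < @batchSize
    BEGIN
        SET @id = NULL
        SELECT TOP 1 @id = MailID, @recipient = Recipient
        FROM dbo.MailQueue
        WHERE Sent = 0
        ORDER BY MailID

        IF @id IS NULL BREAK   -- queue drained mid-block

        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = 'MailProfile',   -- hypothetical profile
             @recipients   = @recipient,
             @subject      = 'Notification',
             @body         = 'Message body goes here.'

        UPDATE dbo.MailQueue SET Sent = 1 WHERE MailID = @id
        SET @processed = @processed + 1
    END

    WAITFOR DELAY '00:02:00'   -- 120-second pause between blocks
END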
Dear All,
We are running SQL Server 2000 and make use of xp_sendmail. For any number of reasons the mail service can run into problems, and it looks like the statement below then never finishes.
EXEC @Status = master..xp_sendmail @recipients=@TOList, @copy_recipients=@CCList, @subject='the subject goes here', @message=@MailText, @no_output=TRUE
Unfortunately the statement is in an update trigger, and hence it blocks the table for any further updates.
My questions are:
Can I achieve a kind of timeout check in my trigger in order to bypass the xp_sendmail call?
In general, sending mail in a trigger may not be a good idea. How can this be solved better?
Any hint is highly welcome.
Regards
Rolf
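One common way around it, sketched with hypothetical names: instead of calling xp_sendmail inside the trigger, have the trigger write a row to a small queue table (fast, never blocked by the mail subsystem) and let a scheduled job drain the queue and call xp_sendmail outside the update path.

-- Hypothetical queue table.
CREATE TABLE dbo.MailOutbox (
    MailID     int IDENTITY(1,1) PRIMARY KEY,
    Recipients varchar(255),
    CCList     varchar(255),
    MailText   varchar(7000),
    QueuedAt   datetime DEFAULT GETDATE(),
    Sent       bit DEFAULT 0
)

-- Inside the trigger: just queue, never send (@TOList, @CCList, @MailText are the
-- variables the trigger already builds).
INSERT INTO dbo.MailOutbox (Recipients, CCList, MailText)
VALUES (@TOList, @CCList, @MailText)

-- Scheduled job, e.g. every minute: send one queued mail and mark it done.
DECLARE @Status int, @To varchar(255), @CC varchar(255), @Msg varchar(7000), @Id int
SELECT TOP 1 @Id = MailID, @To = Recipients, @CC = CCList, @Msg = MailText
FROM dbo.MailOutbox WHERE Sent = 0 ORDER BY MailID

IF @Id IS NOT NULL
BEGIN
    EXEC @Status = master..xp_sendmail
         @recipients = @To,
         @copy_recipients = @CC,
         @subject = 'the subject goes here',
         @message = @Msg,
         @no_output = 'TRUE'

    UPDATE dbo.MailOutbox SET Sent = 1 WHERE MailID = @Id
END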
Hi
I was trying to clean up some conversations in Service Broker and caused a lot of blocking that I seem unable to kill. There was one conversation that I was not able to end, so I wanted to restart the SQL service, but I can't even restart it. I get the following in Event Viewer:
Timeout occurred while waiting for latch: class 'SERVICE_BROKER_TRANSMISSION_INIT', id 00000001A2B03540, type 2, Task 0x0000000000C2EDA8 : 0, waittime 5400, flags 0xa, owning task 0x00000002DEBCA5C8. Continuing to wait.
Has anyone come across this?
thanks
Paul
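Separately from the latch problem, once the instance is back up, one hedged option for a conversation that will not end normally is END CONVERSATION ... WITH CLEANUP, which drops the endpoint and its messages without contacting the far end; use it only as a last resort.

-- Find lingering conversation endpoints.
SELECT conversation_handle, state_desc, far_service
FROM sys.conversation_endpoints
WHERE state_desc <> 'CLOSED'

-- Force one of them closed (here simply the first one found).
DECLARE @handle uniqueidentifier
SELECT TOP 1 @handle = conversation_handle
FROM sys.conversation_endpoints
WHERE state_desc <> 'CLOSED'

END CONVERSATION @handle WITH CLEANUP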
The following two stored procedures work fine. I need to add TRY...CATCH blocks to them. How do I do it? Please modify my stored procedures with TRY and CATCH blocks; in the CATCH block I need to call a procedure called ErrorMsg, which contains the error severity. (A sketch of one possible version follows the two procedures below.)
Procedure 1
-----------
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author: C.R.P. RAJAN
-- Create date: March 26, 2008
-- Description: PROCEDURE FOR IMPLEMENTING SOFT PURGE
-- =============================================
ALTER PROCEDURE [dbo].[SP_SOFTPURGE]
    @POID INT
AS
BEGIN
    UPDATE POMASTER SET FLAG = 1 WHERE POID = @POID
    SELECT * FROM POMASTER WHERE FLAG = 0
END
Procedure 2
-----------
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author: C.R.P. RAJAN
-- Create date: March 26, 2008
-- Description: Stored Procedure for Hard Purge
-- =============================================
ALTER PROCEDURE [dbo].[SP_HARDPURGE]
    @PoId INT
AS
BEGIN TRAN
    INSERT INTO ARCHIVE SELECT * FROM POMASTER WHERE POID = @POID

    IF @@ERROR != 0
    BEGIN
        ROLLBACK TRAN
        RETURN
    END

    DELETE FROM POMASTER WHERE POID = @POID

    IF @@ERROR != 0
    BEGIN
        ROLLBACK TRAN
        RETURN
    END
COMMIT TRAN

SELECT * FROM POMASTER
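A minimal sketch of the hard-purge procedure rewritten with TRY...CATCH (SQL Server 2005 or later); the ErrorMsg call at the end is a guess based on the description above, since its real parameter list is not shown, so adjust it to the actual signature. The soft-purge procedure has no transaction, so wrapping its two statements in the same TRY...CATCH shape is enough.

ALTER PROCEDURE [dbo].[SP_HARDPURGE]
    @PoId INT
AS
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION

        INSERT INTO ARCHIVE SELECT * FROM POMASTER WHERE POID = @PoId
        DELETE FROM POMASTER WHERE POID = @PoId

        COMMIT TRANSACTION
        SELECT * FROM POMASTER
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION

        -- Hypothetical call; match this to the real ErrorMsg parameter list.
        EXEC dbo.ErrorMsg
    END CATCH
END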
When I run reports on a SQL Server 6.5 database that involve the creation of temp tables, the whole of tempdb is locked and all other users are blocked.
The reports involve queries which fetch data into temporary tables, such as:
SELECT c1 , c2, c3 ...
INTO #tmp1
FROM T1
WHERE C1 = xxx
AND .........
When any of these queries runs, it blocks all other operations that use tempdb, such as creation of temporary tables or queries using sorts.
It appears that the query locks the system tables (sysobjects, syscolumns) in tempdb.
However, the above query works fine if it is constructed as,
CREATE TABLE #tmp1 (a1 ..., a2 ...., a3 ..... )
INSERT INTO #tmp1
SELECT c1, c2, c3
FROM T1
WHERE c1 = xxx
AND .........
Could someone please tell me what is actually happening in the first scenario? Is there a way to avoid the blocking of tempdb other than the workaround mentioned above (which means I have to change all my queries)?
Thanks
Nishant
I'm trying to place records into groups of 100, so I need a query that will return records but not based upon an autonumber type field. For instance, if I had a set of records that had been sequentially numbered (by autonumber) with IDs to greater than 3,600 but there were actually only 300 IDs remaining ranging from 1 to 3600 (say the others were deleted), then I would want to use a query to make three groups of 100 of the remaining IDs.
I can use SELECT TOP 100 for the first 100 records. How do I get subsequent groups of 100?
It's probably obvious and I am thinking too hard.
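A minimal sketch of one way to do it on SQL Server 2005 or later, assuming a table dbo.Items with an ID column: number the surviving rows with ROW_NUMBER and derive a group number from that, so the gaps left by deleted IDs don't matter. On SQL Server 2000 the same effect needs the TOP ... NOT IN pattern instead.

-- Assign every remaining row to a group of 100.
SELECT ID,
       (ROW_NUMBER() OVER (ORDER BY ID) - 1) / 100 + 1 AS GroupNo
FROM dbo.Items

-- Pull just the second group of 100.
SELECT ID
FROM (SELECT ID, ROW_NUMBER() OVER (ORDER BY ID) AS rn
      FROM dbo.Items) AS x
WHERE rn BETWEEN 101 AND 200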
Hello,
We use two instances of MS SQL Server 2000 (version 8.00.760) on a 4-processor Windows 2000 (SP4) machine. A DTS package on one instance completely blocks work on the other instance. Both instances have 2 dedicated processors, and the two processors of the blocked instance are idle.
How is this possible? Has anyone seen the same behavior of SQL Server and found a solution for this problem?
Thanks in advance for any answers,
Jürgen Simonsen
I wanted to set up a mechanism that would transfer blocks of records (a few dozen to, in rare cases, a few thousand), with slight modification, from one database to another. It's a sort of custom partial archiving process that would be triggered from a web-based admin application. Records in the target db would be identical except:
-- the primary key in the source table, an identity field, would be just an integer in the target table
-- the target table has an extra field, an integer batch ID supplied by the web application that triggers the process
It's a simple, if not efficient matter to do it within the web application: query the source table, suck the records into memory, and insert them one by one into the target db. This will be an infrequent process which can be done at off-hours, so a bit of inefficiency is not the end of the world. But I wondered if there is a more sensible, orthodox approach:
-- Could this process be done, and done efficiently, as a stored procedure with the batch ID passed as a parameter? (See the sketch below.)
-- Is there any way to do a bulk insert from a recordset or array in memory using ADO and SQL? And if so, is that better than inserting records one by one?
Advice on the best general approach would be appreciated, and I will try to figure out the details.
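On the first bullet: yes, and it can usually be a single set-based INSERT...SELECT rather than row-by-row inserts from the web page. A minimal sketch with hypothetical table, column, and database names (SourceTable, TargetDB.dbo.TargetTable, an ArchiveFlag selection criterion):

CREATE PROCEDURE dbo.ArchiveBatch
    @BatchID int
AS
BEGIN
    SET NOCOUNT ON

    -- Copy the selected rows in one statement; the source identity value becomes an
    -- ordinary integer column in the target, and the batch ID supplied by the web
    -- application is appended to every row.
    INSERT INTO TargetDB.dbo.TargetTable (SourceID, Col1, Col2, BatchID)
    SELECT s.ID, s.Col1, s.Col2, @BatchID
    FROM dbo.SourceTable AS s
    WHERE s.ArchiveFlag = 1
END

Called from the web application as a single command (EXEC dbo.ArchiveBatch @BatchID = 42), this keeps all the data movement on the server instead of pulling the records into memory.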
Hello,
I'm relatively new to SSIS programming and am experiencing a serious problem with a package (6618 KB large) containing 5 data flows. These data flows all start from a DataReader source and flow through multiple transformations. While editing the fifth data flow, the development environment suddenly became unworkably slow. After any modification made to the data flow it takes several minutes before I can continue editing. Meanwhile, in Task Manager the devenv process takes a very large portion (50 to 100%) of the CPU.
Building the package, shutting down SSIS, and even rebooting the PC don't make any difference. When you close the package and reopen it, this takes at least 10 minutes.
We work with Microsoft SQL Server Integration Services Designer
Version 9.00.1399.00.
Does anyone know if there is a limit to the number of transformations in a flow or to the size of a package? Could that be the problem?
Before this problem occurred I was multiplying transformations by copying, renaming, and altering them within a data flow. Could SSIS have some problem with that?
Any suggestion for solving this problem, other than chucking it all away and restarting, will be gratefully accepted.
With kind regards,
Paul Baudouin
Hi:
Somebody told me that by formatting my database disk with 64K NTFS blocks I can get better performance. Is that true? Is there any problem in SQL Server with this block size?
We have been working for a long time on optimizing this script, and the latest modification I made was splitting the script into small blocks and using UNION ALL. This helped a lot, but it still takes some 45 minutes in the prod environment. I'm not able to find anything more in this script to optimize.
I'm trying to set up the query to send email ONLY when it returns records of blocks, and I can't seem to get this going. (A possible pattern is sketched after the snippet below.)
declare @blocks varchar(max)
set @blocks = (SELECT spid,
sp.[status],
loginame [Login],
hostname,
blocked BlkBy,
sd.name DBName,
[Code] ....
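One possible pattern, assuming Database Mail and keeping the column list shortened: build the message text only from blocked sessions, and put the sp_send_dbmail call inside an IF so nothing is sent when there is no blocking. The profile and recipient are placeholders.

DECLARE @blocks varchar(max)

SELECT @blocks = ISNULL(@blocks + CHAR(13) + CHAR(10), '')
               + 'SPID ' + CAST(spid AS varchar(10))
               + ' blocked by ' + CAST(blocked AS varchar(10))
               + ' in ' + DB_NAME(dbid)
FROM master..sysprocesses
WHERE blocked <> 0

IF @blocks IS NOT NULL
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'MailProfile',       -- hypothetical profile
         @recipients   = 'dba@example.com',   -- hypothetical recipient
         @subject      = 'Blocking detected',
         @body         = @blocks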
I have several stored procedures with a similar query. The SELECT clauses are identical, but the FROM and WHERE clauses are different.
Since the SELECT clause is long and complicated (and often changes), I want to place the SELECT clause in a separate file and then have the stored procedures include the block of text for the SELECT clause when they compile.
Is this possible? Looking through the docs and searching online, I don't see any way to do this.
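There is no #include mechanism in T-SQL, so the SELECT list cannot be pulled in from a file at compile time. One hedged workaround, with hypothetical names, is to keep the shared column list in exactly one place (a scalar function here) and build each procedure's statement dynamically:

CREATE FUNCTION dbo.SharedSelectList() RETURNS nvarchar(max)
AS
BEGIN
    -- The long, shared column list lives in exactly one place.
    RETURN N'o.OrderID, o.CustomerID, o.OrderDate'
END
GO

CREATE PROCEDURE dbo.GetRecentOrders
AS
BEGIN
    DECLARE @sql nvarchar(max)
    SET @sql = N'SELECT ' + dbo.SharedSelectList()
             + N' FROM dbo.Orders AS o WHERE o.OrderDate >= DATEADD(day, -7, GETDATE())'
    EXEC sp_executesql @sql
END

The cost is that the procedures no longer get compile-time checking of the column list, which is the usual trade-off with dynamic SQL.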
I have the following SQL that I want to use to log the number of visits from clients. A session is a block of time; for example, any time between 9 and 12 is a morning session, 12 to 5 is an afternoon session, etc. I need to count the number of sessions in a month, not the appointments within each session. Does that make sense? (A sketch of the counting step follows the snippet below.)
select sr.resourceid, st.TimeStart, datename(m,sr.scheduledate) as sesmonth, datename(yy,sr.scheduledate) as year,
(CASE
WHEN DATEPART(hour, st.TimeStart) BETWEEN 0 AND 12 THEN 'MORNING SESSION'
WHEN DATEPART(hour, st.TimeStart) BETWEEN 12 AND 17 THEN 'AFTERNOON SESSION'
WHEN DATEPART(hour, st.TimeStart) BETWEEN 17 AND 24 THEN 'EVENING SESSION'
[Code] ....
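A minimal sketch of the counting step: wrap the snippet in a derived table, use DISTINCT so repeated appointments inside the same resource/day/session collapse to one row, and then count by month. The FROM/JOIN names are hypothetical because they are not shown, and the hour boundaries are made non-overlapping (the original CASE puts 12 and 17 into two ranges).

SELECT sesmonth,
       [year],
       SessionName,
       COUNT(*) AS SessionCount
FROM (
    SELECT DISTINCT
           sr.resourceid,
           CONVERT(varchar(10), sr.scheduledate, 120) AS sesday,
           datename(m, sr.scheduledate)  AS sesmonth,
           datename(yy, sr.scheduledate) AS [year],
           CASE
               WHEN DATEPART(hour, st.TimeStart) BETWEEN 0  AND 11 THEN 'MORNING SESSION'
               WHEN DATEPART(hour, st.TimeStart) BETWEEN 12 AND 16 THEN 'AFTERNOON SESSION'
               ELSE 'EVENING SESSION'
           END AS SessionName
    FROM dbo.ScheduleResource AS sr                               -- hypothetical tables; the
    JOIN dbo.ScheduleTime AS st ON st.ScheduleID = sr.ScheduleID  -- original FROM is not shown
) AS sessions
GROUP BY sesmonth, [year], SessionName
ORDER BY [year], sesmonth, SessionName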
I need help writing SQL code to convert blocks of columns into rows, as in the example below; a possible approach is sketched after the desired output.
Example,
Id A1 A2 B1 B2
1 1 1 1 1
2 2 2 2 2
3 3 3 3 3
into
Id Group Value
1 A 1
1 A 1
1 B 1
1 B 1
2 A 2
2 A 2
2 B 2
2 B 2
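A minimal sketch using UNPIVOT (SQL Server 2005 or later), with a hypothetical table name dbo.SourceTable; note that UNPIVOT silently drops NULL values, so wrap nullable columns in ISNULL first if every row must appear.

SELECT Id,
       LEFT(ColName, 1) AS [Group],
       Value
FROM (SELECT Id, A1, A2, B1, B2 FROM dbo.SourceTable) AS src
UNPIVOT (Value FOR ColName IN (A1, A2, B1, B2)) AS u
ORDER BY Id, [Group], ColName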
Hi. Periodically I need to run a delete statement that deletes old data. The problem is that this can time out over ODBC (via the CDatabase and CRecordset classes in legacy code). Also, while the delete is running, the table it operates on is locked, and my application can't continue to run and operate on rows not affected by the delete.
Are there any workarounds for this? Can the timeout be set in the connection string?
Thanks,
Brian
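The timeout itself is a client-side setting (a statement/connection attribute in ODBC, not part of the DELETE), but the more common fix is to delete in small batches so each statement finishes quickly and holds its locks only briefly. A minimal sketch with hypothetical table and column names, assuming SQL Server 2005 or later; on SQL Server 2000 use SET ROWCOUNT instead of DELETE TOP.

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    -- Delete old data in chunks of 5000 rows per statement.
    DELETE TOP (5000) FROM dbo.History            -- hypothetical table
    WHERE CreatedDate < DATEADD(day, -90, GETDATE())

    SET @rows = @@ROWCOUNT
END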
I've read that Microsoft.ApplicationBlocks.Data for .NET v2 can't be deployed to a mobile app, and my experience bears that out. So I'm wondering if there are application blocks for Windows Mobile 5 that would know how to talk to both SQL Server Mobile and SQL Server 2005. I see OpenNETCF has a port, but as far as I can tell it only addresses SQL Server Mobile, not talking to a remote SQL Server 2005 database. I can use the OpenNETCF version for my SQL Server Mobile requirements, but I'm hoping there is an encapsulation of SqlClient calls for communication with my server db.
thanks
braden
I wanted to set up a mechanism that would transfer blocks of records (a few dozen to, in rare cases, a few thousand), with slight modification, from one database to another. It's a sort of custom partial archiving process that would be triggered from a web-based admin application in plain old ASP (not .NET, alas). Records in the target db would be identical except:
-- the primary key in the source table, an identity field, would be just an integer in the target table
-- the target table has an extra field, an integer batch ID supplied by the web application that triggers the process
It's a simple, if not efficient matter to do it within the web application: query the source table, suck the records into memory, and insert them one by one into the target db. This will be an infrequent process which can be done at off-hours, so a bit of inefficiency is not the end of the world. But I wondered if there is a more sensible, orthodox approach:
-- Could this process be done, and done efficiently, as a stored procedure with the batch ID passed as a parameter?
-- Is there any way to do a bulk insert from a recordset or array in memory using plain ASP, ADO and SQL? And if so, is that better than inserting records one by one?
I realize that the ASP.NET TableAdapter and DataSet features might provide a good solution, but in the short run I can't rewrite the whole application. Advice on the best general approach from an ASP/ADO platform would be appreciated, and I will try to figure out the details.
hi All,
Code:
begin try
    begin transaction trans1
    -- some transaction
    begin try
        begin transaction trans2
        -- some transaction
        -- raise error
        -- commit transaction trans2
    end try
    begin catch
        -- raise error
    end catch
    -- some transaction
    -- commit transaction trans1
end try
begin catch
    -- rollback trans1
end catch
In the above code, when an error is raised in transaction trans2, the immediate (inner) catch is not invoked, whereas the outer catch is. I don't know why. Can anybody help? Thanks.
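For comparison, a minimal self-contained version of the same shape that does reach the inner catch; because T-SQL cannot roll back an inner named transaction by name, the inner unit is written as a savepoint here rather than a second BEGIN TRANSACTION. If the outer catch is the one firing in the real code, the usual causes are an error raised with severity below 11 (TRY...CATCH ignores those) or an error severe enough, such as a compile error or a doomed transaction, to bypass the inner block.

BEGIN TRY
    BEGIN TRANSACTION trans1

    SAVE TRANSACTION trans2            -- savepoint standing in for the inner transaction

    BEGIN TRY
        RAISERROR('inner failure', 16, 1)    -- severity 16: caught by the inner CATCH
    END TRY
    BEGIN CATCH
        PRINT 'inner catch: ' + ERROR_MESSAGE()
        ROLLBACK TRANSACTION trans2    -- undo only the inner work; trans1 stays open
    END CATCH

    COMMIT TRANSACTION trans1
END TRY
BEGIN CATCH
    PRINT 'outer catch: ' + ERROR_MESSAGE()
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION
END CATCH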
I want to display data in blocks. Below are the CREATE and INSERT scripts and the format of the desired output, followed by a sketch of one possible query.
CREATE TABLE [dbo].[Test_AA](
[CounterpartID] [varchar](2000) NULL,
[CounterpartName] [varchar](2000) NULL,
[SATName] [varchar](2000) NULL,
[NameOfBO] [varchar](2000) NULL,
[Code] ....
The result should be like this:
ID:17388
Name:Scottie Berwick
SATName:Scottie Berwick
NameOfBO:
SATAccountNumber:
[Code] ....
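One possible query shape (SQL Server 2008 or later), using only the columns visible in the truncated script: CROSS APPLY (VALUES ...) turns each row into an ordered block of label:value lines, and the remaining columns hidden behind [Code] would follow the same pattern.

SELECT b.Line
FROM dbo.Test_AA AS t
CROSS APPLY (VALUES
    (1, 'ID:'       + ISNULL(t.CounterpartID,   '')),
    (2, 'Name:'     + ISNULL(t.CounterpartName, '')),
    (3, 'SATName:'  + ISNULL(t.SATName,         '')),
    (4, 'NameOfBO:' + ISNULL(t.NameOfBO,        ''))
) AS b (Seq, Line)
ORDER BY t.CounterpartID, b.Seq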
Hi,
We use a database with about 40 related tables. Some tables contain as many as 30,000 records. We use Access 97 as an interface to the database. Recently we have had the problem that when we want to insert a row into one specific table (always the same one), the database blocks.
Details:
- about 10% of the data was inserted by copying from Excel; before this action there was no problem, though there is no evidence that this causes the problem.
- inserting rows via Query Analyzer works fine; inserting via Access causes trouble.
- the tempdb log file has grown to 48 MB.
Does anyone have any ideas about what is going on and what I can do to solve the problem?
TAV,
Jan Willems
We have data that consists of an employee number, a start time and a finish time, similar to the example below:
EMP   STARTTIME            ENDTIME
00001 10-Feb-2012 06:00:00 10-Feb-2012 10:00:00
00002 10-Feb-2012 07:15:00 10-Feb-2012 10:00:00
00003 10-Feb-2012 08:00:00 10-Feb-2012 10:00:00
I am trying to come up with a procedure in SQL that will give me each 15-minute block throughout the day and a count of how many employees are expected to be at work at the start of that 15-minute block. So, given the example above, I would like to return:
10-Feb-2012 00:00:00    0
10-Feb-2012 00:15:00    0
10-Feb-2012 06:00:00    1
10-Feb-2012 06:15:00    1
[code]....
I'm not too worried if the date part is not included in the result as this could be determined elsewhere, but how can I do this grouping/counting?
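A minimal sketch of one way to do the grouping/counting, assuming SQL Server 2005 or later and a hypothetical table dbo.Shifts(EMP, STARTTIME, ENDTIME): generate the 96 quarter-hour block starts for the day, then count the employees whose shift covers each block start.

DECLARE @day datetime
SET @day = '20120210'

;WITH Blocks AS (
    -- 96 fifteen-minute block starts for the chosen day.
    SELECT DATEADD(minute, 15 * (n - 1), @day) AS BlockStart
    FROM (SELECT TOP (96) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
          FROM sys.all_objects) AS nums
)
SELECT b.BlockStart,
       COUNT(s.EMP) AS EmployeesAtWork
FROM Blocks AS b
LEFT JOIN dbo.Shifts AS s
       ON s.STARTTIME <= b.BlockStart
      AND s.ENDTIME   >  b.BlockStart
GROUP BY b.BlockStart
ORDER BY b.BlockStart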
Hi,
I want to lock a table so that others cannot lock it but are still able to read it inside transactions.
The coding I need is something like this:
SET IMPLICIT_TRANSACTIONS ON
BEGIN TRANSACTION
SELECT * FROM table1 WITH (TABLOCK, HOLDLOCK)
UPDATE table2 SET field1 = 'test'
COMMIT TRANSACTION
COMMIT TRANSACTION
I have tried the coding above, but it won't prevent others from locking table1.
So I changed TABLOCK to TABLOCKX to prevent others from locking table1, but that also prevents others from reading table1. How can I lock table1 so that others cannot lock it but are still able to read it?
Thank you for any help
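One hedged option: take an update lock at the table level instead of an exclusive one. Shared locks are compatible with an update lock, so other sessions can still read table1, but nobody else can acquire an update or exclusive table lock until the transaction ends. A minimal sketch:

BEGIN TRANSACTION

-- Table-level update lock: readers get through, writers and other lockers wait.
SELECT *
FROM table1 WITH (TABLOCK, UPDLOCK, HOLDLOCK)

UPDATE table2 SET field1 = 'test'

COMMIT TRANSACTION

Readers using NOLOCK or READ UNCOMMITTED will still get through regardless of the hint, and the update lock is released at COMMIT.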
We have a massive database with an almost equally massive amount of traffic to and from it.
I've been asked to implement sliding window partitioning with 2 partitions, an active and a passive one. I managed to test this on a very small testbed last month.
I have currently moved 97k onto the partition function, leaving another 26k to go.
I'm using the following stored procedure to implement the sliding window:
CREATE PROCEDURE [dbo].[ManageFactSlidingWindow](@pFunction nvarchar(max),@pSchema nvarchar(max),@FG nvarchar(max),@moveDays int)
/*****************************************************************************
PROCEDURE NAME: [ManageFactSlidingWindow]
AUTHOR: Arshad Ali
CREATED: 02/24/2013
DESCRIPTION: This stored procedure manages sliding window for the partitioned table
VERSION HISTORY:
DATE EMAIL Company DESCRIPTION
[Code] .....
When I try to move the partition window even by a single day, I get loads of locks.
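Not the procedure above, just a minimal sketch of the usual sliding-window sequence with hypothetical object names (FactTable, FactTable_Staging, pfFact, psFact). The heavy locking usually comes from SPLIT or MERGE touching a non-empty partition, which physically moves rows under a schema-modification lock; if the boundaries being split or merged are kept empty, each step needs only brief metadata locks.

-- 1. Switch the oldest (passive) partition out to an empty staging table with the
--    same structure on the same filegroup: a metadata-only operation.
ALTER TABLE dbo.FactTable SWITCH PARTITION 1 TO dbo.FactTable_Staging

-- 2. Remove the now-empty boundary at the trailing edge.
ALTER PARTITION FUNCTION pfFact() MERGE RANGE ('20130101')

-- 3. Add a new, still-empty boundary at the leading edge.
ALTER PARTITION SCHEME psFact NEXT USED [PRIMARY]
ALTER PARTITION FUNCTION pfFact() SPLIT RANGE ('20130301')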