I need to lock a database (prevent users from connecting) in order to update it. I already know how to kick everyone out by killing their SPIDs, but I can't figure out how to prevent them from reconnecting.
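A minimal sketch of the usual way to do both at once (kick current users off and stop them reconnecting); 'MyDatabase' is a placeholder name:

-- single-user: rolls back open transactions, disconnects everyone and allows only one connection
ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
-- or, less drastic: only db_owner members, dbcreator and sysadmin may connect
ALTER DATABASE MyDatabase SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;

-- ... perform the update here ...

ALTER DATABASE MyDatabase SET MULTI_USER;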
I am using a Conditional Split, checking to see if a record exists and, if so, updating, else inserting. But this causes a database deadlock. Does anyone have a suggestion?
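One commonly suggested alternative, offered only as a hedged sketch (the table and column names below are invented, not the poster's package): land all the rows in a staging table first, then run the update and the insert as two set-based statements so the two paths never execute against the target table at the same time.

-- update the rows that already exist
UPDATE t
SET    t.SomeCol = s.SomeCol
FROM   dbo.TargetTable  AS t
JOIN   dbo.StagingTable AS s ON s.KeyCol = t.KeyCol;

-- insert the rows that do not exist yet
INSERT INTO dbo.TargetTable (KeyCol, SomeCol)
SELECT s.KeyCol, s.SomeCol
FROM   dbo.StagingTable AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.KeyCol = s.KeyCol);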
Hi, I am working on a project which needs to produce a sequential certificate number. Every time I need a new certificate number, I have to find out what the max number in the database is, and the new certificate number is just max+1. But how can I block another transaction from checking the max certificate number while this transaction is in the middle of writing the new certificate number (max+1) into the database? Does adLockOptimistic work in this case? Here is the code (my database is SQL 2000):
cmdTemp.CommandText = "Select max(certificateNumber) from product_table where certificateNumber <> 8888888"
set cert_info = Server.CreateObject("ADODB.RecordSet")
cert_info.Open cmdTemp, , adOpenKeyset, adLockOptimistic
If Not cert_info.EOF then
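As far as I know, the recordset lock type alone won't serialize the max+1 logic. A hedged sketch of doing it atomically on the server instead (table and column names follow the post; the actual INSERT of the new row is omitted):

BEGIN TRAN;

DECLARE @newCert int;

-- UPDLOCK + HOLDLOCK make a second transaction running this same SELECT wait
-- until this transaction commits, so two callers cannot read the same max value
SELECT @newCert = MAX(certificateNumber) + 1
FROM   product_table WITH (UPDLOCK, HOLDLOCK)
WHERE  certificateNumber <> 8888888;

-- ... INSERT the new row into product_table using @newCert here ...

COMMIT TRAN;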
Hi, I'm taking an Excel spreadsheet (that could have around 30k rows) and processing it in SSIS. I essentially have a flag in one of the spreadsheet cols that indicates whether the record is already in the database or not.
I'm splitting the data using a Conditional Split on this column, and using an OLE DB Destination (Fast Load) to perform the inserts and an OLE DB Command to fire a stored procedure to perform any updates. Both the OLE DB Destination and the stored procedure hit the same table, and the two operations could be executing at the same time as they both appear directly after the Conditional Split, so the OLE DB Destination is set NOT to lock the table.
This seemed to work OK until recently. I've just added 2 triggers to the table in question, which I don't want to fire 30,000 times during the import. As the OLE DB Destination is set to use Fast Load, it doesn't fire the triggers - cool. The update stored procedure disables the trigger before performing its update and re-enables the trigger when finished. Currently this does mean that if you only had updates, the trigger could be enabled/disabled 30,000 times. That sounds kinda bad, but I don't really know if this carries a large overhead or not?
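For context, a rough sketch of the enable/disable pattern described above, with placeholder table, trigger, and column names; because the OLE DB Command fires the procedure once per row, the DISABLE/ENABLE pair also runs once per row:

CREATE PROCEDURE dbo.usp_UpdateRow
    @KeyCol  int,
    @SomeCol varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    -- stop the trigger firing for this statement
    DISABLE TRIGGER trg_MyTrigger ON dbo.MyTable;

    UPDATE dbo.MyTable
    SET    SomeCol = @SomeCol
    WHERE  KeyCol  = @KeyCol;

    -- switch it back on for everything else
    ENABLE TRIGGER trg_MyTrigger ON dbo.MyTable;
END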
If, when importing, you now have both updates and inserts, the whole process locks up. From looking at Activity Monitor, it seems as though the INSERT gets suspended.
Do I have a fundamental problem with how I've structured the Data Flow, or am I just being really stupid in enabling/disabling a trigger that many times, which is probably what's causing the problem?
I open Query Analyzer and in one tab I update a record in a transaction and hold it.
begin tran
update customers set territory = 'x' where customer = 'A00001'
--rollback tran
In a second tab I attempt to update all records in the table
update customers set carrier = ''
Clearly this fails because of the lock placed during the first script, and this is fine.
However, is there a way to get the 2nd script to ignore the locked rows and just update as many as it can? The obvious answer seemed to be the READPAST hint, as follows…
update customers with (READPAST) set carrier = ''
…but this is still blocked by the original lock. I’ve tried combining it with all sorts of other table hints but all seem to get blocked.
The following does work, ignoring the lock and not returning the data
Select * from customers with (READPAST) where customer = 'A00001'
I’ve tried combining this with the update like so…
update customers set carrier = '' from customers with (READPAST) where customer = 'A00001'
..but this is blocked too.
I’m so desperate I even tried moving the update into a cursor and updating one row at a time. Nothing worked. I thought I might be able to do something like this
If (Select count(*) from customers with (READPAST) where customer = 'A00001') > 0 --then perform update
..but this returns a value of 1 even though the following returns no rows.
Select * from customers with (READPAST) where customer = 'A00001'
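One possible workaround, offered only as a sketch (it assumes customer is the table's unique key, and whether the final update still touches the locked row depends on the plan it gets): use READPAST in a plain SELECT to collect the keys of the rows that are not locked, then update only those keys.

DECLARE @unlocked TABLE (customer char(6) PRIMARY KEY);

-- READPAST is honoured here because this is an ordinary SELECT
INSERT INTO @unlocked (customer)
SELECT customer
FROM   customers WITH (READPAST);

-- then touch only the rows whose keys were readable
UPDATE c
SET    carrier = ''
FROM   customers AS c
JOIN   @unlocked AS u ON u.customer = c.customer;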
I need to search for stored procedures in my database in which the queries that update a table contain a WHERE clause that uses a non-primary-key column when updating rows in the table.
For example, if the employee table has empId as its primary key and an UPDATE query uses empName in the WHERE clause to update an employee record, then that SP should be listed. There could be hundreds of tables with their primary keys and thousands of SPs in a database. How can I find the ones where the WHERE clause is using some column other than the primary key?
If there is any other hint or query to identify such statements that lock tables, please suggest it; so far I have only found a few queries that are not using the primary key in the WHERE clause.
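This really needs parsing of the procedure text rather than a simple query, but a crude starting point (a sketch assuming SQL Server 2005 or later, with the employee table and columns from the example hard-coded) is to text-search the module definitions:

-- procedures that contain an UPDATE against the employee table but never
-- mention the primary key column; a rough text filter, not a real parser
SELECT OBJECT_NAME(m.object_id) AS procedure_name
FROM   sys.sql_modules AS m
JOIN   sys.objects     AS o ON o.object_id = m.object_id
WHERE  o.type = 'P'
  AND  m.definition LIKE '%UPDATE%employee%'
  AND  m.definition NOT LIKE '%empId%';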
Hi All,
I have a database that is serving a web site with reasonably high traffic. We're getting errors at certain points where processes are being locked. In particular, one of our people has suggested that an update statement contained within a stored procedure that uses a where condition that only touches on a column that has a clustered primary index on it will still cause a table lock.
So, for example:
UPDATE ORDERS SET
prod = @product,
val = @val
WHERE ordid = @ordid
In this case ordid has a clustered primary index on it.
Can anyone tell me if this would be the case, and if there's a way of ensuring that we are only doing a row lock on the record specified in the where condition?
Many, many thanks in advance!
Much warmth,
Murray
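If it helps, here is the same statement with an explicit row-granularity hint added (a sketch only; the sample variable values are invented, and whether the hint is needed at all depends on the plan and on lock escalation):

DECLARE @product varchar(50), @val int, @ordid int;
SELECT @product = 'sample', @val = 1, @ordid = 42;   -- sample values for illustration

-- ROWLOCK asks the engine for row locks rather than page or table locks
UPDATE ORDERS WITH (ROWLOCK)
SET    prod = @product,
       val  = @val
WHERE  ordid = @ordid;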
We recently implemented merge replication and have been experiencing problems. The replication is between 2 SQL Servers (2005) on the same network, and since we introduced the replication, performance has degraded considerably on the subscriber end.
1) One thing that should be mentioned is that it is a unidirectional setup: the flow of changes is from publisher towards subscriber (only one publisher, which is also the distributor, and one subscriber).
2) Updates are higher in volume than inserts; just one article, let's say "Article1", has up to 2000 updates per day, and I am seeing that dbo.MSmerge_upd_sp_Article1_GUID is taking more CPU time. What should we do?
On the subscriber, database response time is getting slow and I am experiencing a large number of lock timeouts on the application end.
Can anyone also suggest server-level settings for avoiding lock timeouts?
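As far as I know there is no server-wide lock timeout setting; the timeout is per session, so a hedged sketch of checking and raising it for the connections that are timing out would be:

SELECT @@LOCK_TIMEOUT;    -- current value in milliseconds; -1 means wait indefinitely (the default)
SET LOCK_TIMEOUT 30000;   -- this session now waits up to 30 seconds before raising error 1222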
Error: The Script returned a failure result. Task SCR REIL Data failed
OnError - Task SQL Insert Error Msg Error: A deadlock was detected while trying to lock variable "System::ErrorCode, System::ErrorDescription, System::ExecutionInstanceGUID, System::StartTime, User::FEED_ID, User::t_ProcessedFiles" for read access. A lock could not be acquired after 16 attempts and timed out. Error: The expression ""EXEC [dbo].[us_sp_Insert_STG_FEED_EVENT_LOG] @FEED_ID= " + (DT_WSTR,10) @[User::FEED_ID] + ", @FEED_EVENT_LOG_TYPE_ID = 3, @STARTED_ON = '"+(DT_WSTR,30)@[System::StartTime] +"', @ENDED_ON = NULL, @message = 'Package failed. ErrorCode: "+(DT_WSTR,10)@[System::ErrorCode]+" ErrorMsg: "+@[System::ErrorDescription]+"', @FILES_PROCESSED = '" + @[User::t_ProcessedFiles] + "', @PKG_EXECUTION_ID = '" + @[System::ExecutionInstanceGUID] + "'"" on property "SqlStatementSource" cannot be evaluated. Modify the expression to be valid.
Warning: The Execution method succeeded, but the number of errors raised (4) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
And how did I get 4 errors? - I only set my script task result to failure
I have a script that automates some db drop/restore operations, and bringing the database to single-user mode is part of it: ALTER DATABASE ... SET SINGLE_USER WITH ROLLBACK IMMEDIATE. I want this to execute under a login that has restricted privileges, so I've created a login and granted it the dbcreator role + ALTER ANY DATABASE privileges.
Problem: When I run the script against a database with an active/sleeping connection, it fails when using the restricted login: "Msg 5061... ALTER DATABASE failed because a lock could not be placed on database ...". It completes successfully when using a sysadmin login. According to stackoverflow.com the solution is to kill the active/sleeping connections to the database before ALTER-ing it, which works fine, but the questions are:
Why does the "ALTER DATABASE..." statement work under the sysadmin login, but not under a dbcreator one? Does this mean the sysadmin login kills the connections to the target database in the background? Is it possible to grant additional privileges to the restricted login, so the "ALTER DATABASE..." statement won't need preventive killing of the connections?
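For completeness, a sketch of the kill-then-ALTER workaround referred to above, assuming the restricted login has also been given the right to KILL sessions (e.g. ALTER ANY CONNECTION); the database name is a placeholder:

DECLARE @spid smallint;
DECLARE @sql  nvarchar(20);

DECLARE spids CURSOR LOCAL FAST_FORWARD FOR
    SELECT spid
    FROM   master.dbo.sysprocesses
    WHERE  dbid = DB_ID(N'MyDatabase')
      AND  spid <> @@SPID;              -- don't kill our own connection

OPEN spids;
FETCH NEXT FROM spids INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'KILL ' + CAST(@spid AS nvarchar(10));
    EXEC (@sql);
    FETCH NEXT FROM spids INTO @spid;
END
CLOSE spids;
DEALLOCATE spids;

ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;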
An annoying problem has occurred... I want to put a database into offline mode, but it is giving me this error:
"ALTER DATABASE failed because a lock could not be placed on database [database]. Try again later. ALTER DATABASE statement failed. (Microsoft SQL Server, Error: 5061)" ...
I am using SQL Server 6.5 and this database has 50 tables, each with 10,000 records in it, so in short it is a very big database. Very often I get the "database lock" problem. I tried to tune the database but it didn't help. If I upgrade SQL Server 6.5 to SQL Server 7.0, will it help me? If yes, how?
In my system I have a problem. One of the database's tables is not getting opened, and I cannot even alter a stored procedure which is related to the tables of that database. How can I solve this? Subhasish
Hi, we are facing an acute situation in our web application. The technology is ASP.NET/VB.NET with SQL Server 2000.
Consider a scenario in which User 1 clicks a button which calls a SQL stored procedure. This procedure selects Group A of records from database page Page1.
At the same time, User 2 also clicks the same button, which calls the same SQL stored procedure. This procedure selects Group B of records from Page1.
So it's the same Page1 but different sets of records. At this moment, both calls hold a shared lock on Page1 inside the procedure.
Now, in call 1, inside the procedure after selecting Group A of records, the next statement is an update to those records. As soon as the update statement executes, SQL Server throws a deadlock exception as follows:
Transaction (Process ID 78) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction
We are able to understand why it's happening. It's because Group A and Group B of records are on the same Page1, and both users hold a shared lock on Page1. So neither gets the exclusive lock on its records for the update, even though the records are different.
How can I resolve this issue? How can I get a lock on just the wanted rows instead of the entire page?
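A hedged sketch of one common way out of the read-then-update deadlock: take update locks at row granularity on the initial read, so two callers cannot both sit on shared locks for the same page and then block each other's lock conversion (table, column, and variable names below are placeholders):

DECLARE @group int, @newValue varchar(50);
SELECT @group = 1, @newValue = 'x';   -- sample values

BEGIN TRAN;

-- UPDLOCK reserves the rows for this transaction; ROWLOCK asks for row granularity
SELECT *
FROM   dbo.MyTable WITH (UPDLOCK, ROWLOCK)
WHERE  GroupKey = @group;

UPDATE dbo.MyTable
SET    SomeCol = @newValue
WHERE  GroupKey = @group;

COMMIT TRAN;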
How do I lock an entire database? I want an exclusive lock on the db by a user who is the dbo of that database (not the sa user).
The scenario is: we have a web application, and each week we need to do data uploads (with ETL). During this upload, the users accessing the website should not be able to read data. This is why I want a database lock.
Now the catch here is that the application accesses the data using a user, say abc. abc is dbo for that database, and the ETL is also done under the abc login. So will db locking help in this case, given that the website also reads the data as the same abc user?
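Because the web application and the ETL share the same login, connection-level restrictions (SINGLE_USER/RESTRICTED_USER) cannot tell them apart. One alternative worth considering, offered only as a sketch of a different technique: an application lock that the ETL holds for the duration of the upload, which the web application's read procedures would then have to test for (the resource name is arbitrary):

BEGIN TRAN;

-- taken by the ETL; held until the transaction ends
EXEC sp_getapplock @Resource    = 'WeeklyUpload',
                   @LockMode    = 'Exclusive',
                   @LockOwner   = 'Transaction',
                   @LockTimeout = 0;

-- ... ETL work here ...

COMMIT TRAN;   -- releases the application lock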
Hello, could someone clarify this for me? I've been told that I should add lock statements to my database write queries (update, delete, insert). I'm not sure if it's necessary though. The only documentation I've found so far is how to implement a lock statement with threading, but I'm not using the threading namespace. What is best practice, and how would I implement this without threading, or should I be using threading? Thanks in advance!
I am in the process of writing a stored procedure that takes data from a table, within a cursor, and puts that data to the network server via the xp_cmdshell procedure. The xp_cmdshell command is at the end of the cursor in the SP.
I have worked with the admins on getting the proper permissions to execute this procedure and have tested it by running the procedure directly in Query Analyzer and by calling it in Query Analyzer successfully.
However, when I try to call this procedure from another programming environment, once the xp_cmdshell process is called, the procedure locks up and blocks the rest of the process, including the other software programming environment. When I look at the error log, it shows the correct command; as a matter of fact, I have copied the command from the log file into Query Analyzer and run the procedure successfully. So for some reason the statement is not being committed, or there is some kind of threading issue I do not know how to address.
If anyone out there has any suggestions, or has had a similar problem in the past and knows how to address it, it would be greatly appreciated.
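For context, a rough sketch of the pattern being described (a cursor that shells out once per row via xp_cmdshell); the table, column, and command are placeholders, not the poster's actual code:

DECLARE @fileName varchar(255), @cmd varchar(500);

DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM dbo.ExportQueue;

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @fileName;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'copy "' + @fileName + '" \\networkserver\share\';
    -- no_output suppresses the result set xp_cmdshell would otherwise return for every row
    EXEC master..xp_cmdshell @cmd, no_output;
    FETCH NEXT FROM file_cursor INTO @fileName;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;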
I have a question about table locks in SQL Server when inserting data into the same table from multiple processes at the same time. Here are my questions:
1) Is it the default behavior of SQL Server to lock the table while doing an insert?
2) If yes to Q1, how can we explicitly specify that while inserting data?
3) If I have 4 tables and one table has foreign keys from the other 3 tables, in this scenario do I need to use a table lock explicitly, or can I insert records into those tables without it?
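On the first two points, a hedged sketch (table names are placeholders): by default an INSERT takes row/page locks plus intent locks on the table rather than locking the whole table, and a full table lock can be requested explicitly with a hint.

-- request a table-level lock for this insert instead of the default row/page locks
INSERT INTO dbo.TargetTable WITH (TABLOCK) (col1, col2)
SELECT col1, col2
FROM   dbo.StagingTable;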
I was at a conference and had it asserted to me by an Oracle aficionado that Oracle and DB2 handled low-level locks "better" than SQL Server, and that this was likely the cause of SQL Server's relatively slower and more deadlock-prone performance when running the same application. (SQL Server does seem to perform more poorly for this app, a PeopleSoft customer service and billing app.) Is there any significant difference in lock escalation strategies between major databases, and if so, what are the implications? Jeff Roughgarden, MSCD, MCDBA
Hi, I have a set of 2 DTS packages, one of which calls the other by forming a command line (dtexec) using an Execute Process task.
From the parent package -> Execute Process Task -> dtexec /F etc... /<pkg variable> = "servername"
Both the parent and the called package have a variable, "User::DWServerSQLInstance", which is mapped to the SQL Server connection manager's server name property using an expression. The outer package has the above variable and so does the inner called package (which gets its value assigned through the command line when the outer package calls the inner one).
I "sometimes" get the following error:
OnError,I4,TESTDOMAdministrator,ACDWAggregation,{A1F8E43F-15F1-4685-8C18-6866AB31E62B},{77B2F3C7-6756-46EB-8C01-D880598FB4B3},5/22/2006 5:10:28 PM,5/22/2006 5:10:28 PM,-1073659822,0x,The variable "User::DWServerSQLInstance" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.
Help would be appreciated!
I have seen other posts on this but, not able to relate the solution to my scenario.
The package works fine when run from BIDS, DTExec, or dtexecui. When I schedule it, I get these random errors. (See below)
The main culprit is a variable called "RecordsetFileDIR" which is set using an expression. (@[User::_ROOT] + "RecordSets\")
A number of other variables use this as part of their expression and as they all fail, pretty much everything dies.
I have installed SP1 (not the beta) on the server. The package uses config files to set the value of _ROOT.
The error does not always seem to be with this particular variable, though. It is always a variable that uses an expression, but the errors are random. Also, it will run 3 out of 10 times without a problem. I am the only person on the server at the time.
Any ideas?
Cheers,
Crispin
Error log:
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073659822,0x,The variable "User::RecordsetFileDIR" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073639420,0x,The expression for variable "rsHeaderFile" failed evaluation. There was an error in the expression.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.
OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".
I want to lock a table so that others cannot lock it, but are still able to read it inside transactions.
The coding I need is something like this:
set implicit_transactions on
begin transaction
select * from table1 with (tablock, holdlock)
update table2 set field1 = 'test'
commit transaction
commit transaction
I have tried the coding above, but it won't prevent others from locking table1.
So I changed the tablock to tablockx to prevent others from locking table1, but that also prevents others from reading table1. So, how can I lock table1 so that others cannot lock it but are still able to read it?
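It may help to check what the first session actually holds while its transaction is still open; with TABLOCK + HOLDLOCK you would expect an S lock at TAB granularity, which lets other sessions read the table but blocks their modifications. A small sketch:

EXEC sp_lock @@SPID;                    -- locks held by the current session
-- or, on SQL Server 2005 and later:
SELECT resource_type, request_mode, request_status
FROM   sys.dm_tran_locks
WHERE  request_session_id = @@SPID;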
Hi, I need help with how to send an email from Database Mail on row update from a stored procedure that does a multi-row update. I need to send a personal email to every employee, so that each employee whose row is updated gets an email, sent one after another.
I use a function I got on this forum to split the values for the multi-row update.
How do I loop so that for every updated row a single email is sent to that employee ID, one email each?
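A hedged sketch of one way to do it, assuming the split function produces the list of affected employee IDs; the Database Mail profile, table, and column names are placeholders:

DECLARE @updatedIds TABLE (EmployeeID int PRIMARY KEY);
-- in the real procedure this table would be filled from the split function mentioned above

DECLARE @empId int, @email nvarchar(256), @body nvarchar(max);

DECLARE emp_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT e.EmployeeID, e.Email
    FROM   dbo.Employees AS e
    JOIN   @updatedIds   AS u ON u.EmployeeID = e.EmployeeID;

OPEN emp_cursor;
FETCH NEXT FROM emp_cursor INTO @empId, @email;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @body = N'Your record was updated (employee ' + CAST(@empId AS nvarchar(10)) + N').';

    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = N'MailProfile',          -- placeholder Database Mail profile
         @recipients   = @email,
         @subject      = N'Your record was updated',
         @body         = @body;

    FETCH NEXT FROM emp_cursor INTO @empId, @email;
END
CLOSE emp_cursor;
DEALLOCATE emp_cursor;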
Hello, I am trying to update records in my database from Excel data using the VBA editor within Excel. In order to launch a query, I use SQL through ADO as follows:
------------------------------------------------------------
Dim adoConn As ADODB.Connection
Dim adoRs As ADODB.Recordset
Dim sConn As String
Dim sSql As String
Dim sOutput As String

sConn = "DSN=MS Access Database;" & _
    "DBQ=MyDatabasePath;" & _
    "DefaultDir=MyPathDirectory;" & _
    "DriverId=25;FIL=MS Access;MaxBufferSize=2048;PageTimeout=5;" & _
    "PWD=xxxxxx;UID=admin;"

' ID, A, B, C... are my table fields
sSql = "SELECT ID, `A`, B, `C being a date`, D, E, `F`, `H`, I, J, `K`, L" & _
    " FROM MyTblName" & _
    " WHERE (`A`='MyA')" & _
    " AND (`C`>{ts '" & Format(Date, "yyyy-mm-dd hh:mm:ss") & "'})" & _
    " ORDER BY `C` DESC"

Set adoConn = New ADODB.Connection
adoConn.Open sConn

Set adoRs = New ADODB.Recordset
adoRs.Open Source:=sSql, _
    ActiveConnection:=adoConn
adoRs.MoveFirst
Sheets("Sheet1").Range("a2").CopyFromRecordset adoRs

Set adoRs = Nothing
Set adoConn = Nothing
---------------------------------------------------------------
Does anyone know how I can use the UPDATE, DELETE, and INSERT SQL statements in this environment? Copying SQL statements from Access does not work, as I would have to reference the Access object library in my project, which I do not want if I can avoid it. Ideally I would like to use only ADO and SQL. Thank you very much. Nono
I want to retrieve the last update time of the database. Whenever any update, delete, or insert happens to my database, I want to store and later retrieve that time.
I know one way is to create a table that stores a datetime field, with a trigger that updates this field whenever any update, insert, or deletion occurs in the database.
But I don't know exactly how to do the coding for this.
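A hedged sketch of the trigger approach described above, with placeholder table names; a trigger like this would be needed on each table you want tracked, since DML triggers are per-table rather than database-wide:

-- one row per tracked table, holding the time of its last change
CREATE TABLE dbo.LastChange
(
    TableName sysname  NOT NULL PRIMARY KEY,
    ChangedAt datetime NOT NULL
);
GO
-- stand-in for one of the real tables being tracked
CREATE TABLE dbo.MyTable (Id int PRIMARY KEY, SomeCol varchar(50));
GO
CREATE TRIGGER trg_MyTable_LastChange
ON dbo.MyTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.LastChange SET ChangedAt = GETDATE() WHERE TableName = 'MyTable';
    IF @@ROWCOUNT = 0
        INSERT INTO dbo.LastChange (TableName, ChangedAt) VALUES ('MyTable', GETDATE());
END
GO
-- the "last update time of the database" is then simply:
SELECT MAX(ChangedAt) FROM dbo.LastChange;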
We have a SQL database that uses Active Directory with Windows Authentication. Can users who are members of the Active Directory group that has read/write access to the SQL database create ODBC connections to access the database directly and update the data? They don't have individual logins on the server; they are only members of the Active Directory group that has a login.
I had to relocate the database log file and I issued an ALTER DATABASE command, but by mistake I put a space in the file name, as below (the space is at the beginning of the file name). Now I am unable to get the database loaded in SQL Server. The database has 2 replications configured, so deleting and re-attaching the database means the replication would need to be re-configured. Is there an alternative way to issue a command to update the database FILENAME? Not sure if this can be edited in the master database (sys files).
ALTER DATABASE [User_DB] MODIFY FILE (NAME = User_DB_log, FILENAME = 'I:\SQLLogs\ User_DB_log.ldf')
GO
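For what it's worth, MODIFY FILE only changes the catalog entry, so a hedged sketch of the fix is simply to issue it again with the corrected path (assuming 'I:\SQLLogs\User_DB_log.ldf' is the intended location) and then restart the database so the new path is picked up:

ALTER DATABASE [User_DB]
MODIFY FILE (NAME = User_DB_log, FILENAME = 'I:\SQLLogs\User_DB_log.ldf');   -- corrected path
GO
-- the new path is only read when the database starts up again, e.g.:
ALTER DATABASE [User_DB] SET OFFLINE WITH ROLLBACK IMMEDIATE;
GO
ALTER DATABASE [User_DB] SET ONLINE;
GO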