SQL Server 2014 :: Minimizing Locking On Update Statements?
May 4, 2015
I have a main table called Stock for example.
One process (a service) inserts data into this table, and also periodically updates certain fields.
Another process (a SQL Job) updates the table with certain defaults and rules that are unknown to the service - to deal with some calculations and the removal of null values where we can estimate the values, etc.
These 2 processes have started to deadlock each other horribly.
The SQL Job calls one stored procedure that has around 10 statements in it. This stored proc runs every minute. Most of the statements are of the form below - the idea being that once the data has been corrected, the update will not affect those rows again. I guess there are read locks on the selecting part of this query - but usually it updates 0 rows - so I am wondering whether locks are still taken?
UPDATE s
SET equivQty = Qty * ISNULL(p.Factor,4.5) / 4.5
FROM Stock s
LEFT OUTER JOIN Pack p
on s.Product = p.ProductId
AND s.Pack = p.PackId
WHERE ISNULL(equivQty,0) <> Qty * ISNULL(p.Factor,4.5) / 4.5
The deadlocks are always between these statements from the stored procedure and the service updating rows. I can't really see how the deadlocks occur, but they do.
Another suggestion has been to try an EXISTS check before the update, as below:
IF EXISTS ( SELECT 1
            FROM Stock s
            LEFT OUTER JOIN Pack p
                ON s.Product = p.ProductId
                AND s.Pack = p.PackId
            WHERE ISNULL(s.equivQty, 0) <> s.Qty * ISNULL(p.Factor, 4.5) / 4.5 )
BEGIN
    -- UPDATE as before
END
Does this reduce the locking at all? I don't know how to test the theory - I added this code to some of the statements, and it didn't seem to make much difference.
Is there a way to make a process (in my case the stored procedure) give up if it can't acquire the locks, rather than being deadlocked - which leads to job failures, emails, etc.?
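For what it's worth, two session-level settings address exactly this; a hedged sketch (the timeout value and the error handling are illustrative, not from the original job):

-- Prefer this session as the deadlock victim, and give up on lock waits after 5s.
SET DEADLOCK_PRIORITY LOW;
SET LOCK_TIMEOUT 5000;

BEGIN TRY
    UPDATE s
    SET equivQty = Qty * ISNULL(p.Factor, 4.5) / 4.5
    FROM Stock s
    LEFT OUTER JOIN Pack p
        ON s.Product = p.ProductId
        AND s.Pack = p.PackId
    WHERE ISNULL(equivQty, 0) <> Qty * ISNULL(p.Factor, 4.5) / 4.5;
END TRY
BEGIN CATCH
    -- 1222 = lock request timed out; swallow it so the job step does not fail.
    IF ERROR_NUMBER() <> 1222
        RAISERROR('Unexpected error in Stock correction', 16, 1);
END CATCH;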
We are currently trying to filter the updates down to only the last few months of data - to reduce the number of rows even analyzed - as the deadlocking does seem to be affected by the number of rows in the tables.
Hello all, I am working on an Access 2000 project that is used to generate reports on archived sales data. This app uses linked tables pointing to some SQL 2K tables. Some of the queries are large and end up taking a table lock when executed, blocking all other users of the app until the query has finished. I would like queries that only read data to ignore this lock on the table, but I do not know of a way to do this using linked tables in Access 2000. I would use a NOLOCK table hint but need something else for this issue. Any ideas? Thanks
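One hedged possibility: an Access pass-through query sends its text to SQL Server verbatim, so a table hint can be embedded there (which ordinary linked-table queries do not allow). The table and columns below are placeholders:

-- Text of an Access pass-through query; NOLOCK permits dirty reads,
-- so the report may see uncommitted rows.
SELECT SaleID, SaleDate, Amount
FROM dbo.ArchivedSales WITH (NOLOCK)
WHERE SaleDate >= '20020101';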
I have two tables being inserted within one transaction scope. Table one has 10 rows. After the first table's insert statement (not yet committed), if I run a select on the first table from another session, the select is held up until my insert is committed or rolled back; from SSMS it displays the 10 rows only once the transaction scope has finished. My question is: do I need to use a NOLOCK hint in this situation, or is there something wrong with the isolation level? One person says that in this situation the table should not hold up the select while the insert is in the transaction scope.
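Under the default READ COMMITTED isolation level this blocking is expected: the select waits on the exclusive row locks held by the open insert. A hedged alternative to sprinkling NOLOCK everywhere is row versioning, sketched below with a placeholder database name:

-- Readers then see the last committed version instead of blocking.
ALTER DATABASE MyDb SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;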
I would like to know if there is any option to restrict DML statements in SSMS for a user, where the same user should still be able to perform those actions through the application, on a particular database.
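One standard approach - offered as a hedged sketch, with placeholder names and password - is an application role: the user's own login gets no DML rights, and the application activates the role after connecting to gain them, which SSMS users never do:

-- Deny direct DML to the user...
DENY INSERT, UPDATE, DELETE ON SCHEMA::dbo TO [SomeUser];

-- ...and grant it to a role only the application can activate.
CREATE APPLICATION ROLE MyAppRole WITH PASSWORD = 'Str0ng!Passw0rd';
GRANT INSERT, UPDATE, DELETE ON SCHEMA::dbo TO MyAppRole;

-- The application runs this right after connecting:
EXEC sp_setapprole @rolename = 'MyAppRole', @password = 'Str0ng!Passw0rd';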
I have a query with a huge number of CASE statements. Basically I need to shorten this query by getting rid of these hundreds of CASE statements.
Because of the nature of the application I am not allowed to use a function, and I am just wondering if there is a possible way to rewrite this with COALESCE().
SELECT CASE WHEN A.[COL_1] LIKE '%cricket%' THEN 'ck' + ',' ELSE '' END
     + CASE WHEN A.[COL_1] LIKE '%soccer%' THEN 'sc' + ',' ELSE '' END
     + ....
     + CASE WHEN A.[RESIUTIL_DESC] LIKE '%base%ball' THEN 'BB' + ',' ELSE '' END
FROM TableName A
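COALESCE by itself will not collapse independent LIKE tests, but a keyword mapping table can; a hedged sketch with invented keyword rows, assuming for simplicity that all patterns apply to one column (FOR XML PATH is used so it works on older versions):

-- Drive the matching from data instead of hundreds of CASE branches.
CREATE TABLE #Keyword (Pattern varchar(50), Code varchar(5));
INSERT #Keyword VALUES ('%cricket%', 'ck'), ('%soccer%', 'sc'), ('%base%ball', 'BB');

SELECT A.[COL_1],
       (SELECT K.Code + ','
        FROM #Keyword K
        WHERE A.[COL_1] LIKE K.Pattern
        FOR XML PATH('')) AS Codes
FROM TableName A;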
I have a script containing multiple statements that update multiple tables. How can I make sure that either all the statements execute successfully or no changes are applied to the tables (in case one or more errors occur)? I've been searching on the Internet and it seems like I need to use BEGIN TRANSACTION and ROLLBACK.
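That is the right direction; a common template (a hedged sketch - table names and statements are placeholders) wraps everything in one transaction and rolls it all back on any error:

SET XACT_ABORT ON;   -- make most errors doom the whole transaction
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.TableA SET Col1 = 1 WHERE Id = 42;   -- placeholder statements
    UPDATE dbo.TableB SET Col2 = 2 WHERE Id = 42;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION;
    RAISERROR('Update script failed; all changes rolled back.', 16, 1);
END CATCH;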
Fix the below SP, which has a cursor output, with alternate logic that can make the SP work (maybe using a temp table or any other option).
CREATE PROCEDURE Get_IDS
/*
 * SSMA warning messages:
 * O2SS0356: Conversion from NUMBER datatype can cause data loss.
 */
    @p_uid float(53),
    @return_value_argument varchar(8000) OUTPUT
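Since the body of the procedure is not shown, all that can be offered is a hedged sketch of the usual cursor-free pattern for building a delimited output string; the source table and columns here are invented for illustration:

-- Concatenate the matching IDs without a cursor.
DECLARE @ids varchar(8000);
SET @ids = '';

SELECT @ids = @ids + CASE WHEN @ids = '' THEN '' ELSE ',' END
            + CAST(t.Id AS varchar(20))
FROM dbo.SomeTable t          -- hypothetical source table
WHERE t.UserId = @p_uid;      -- hypothetical filter column

SET @return_value_argument = @ids;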
When viewing an estimated query plan for a stored procedure with multiple query statements, two things stand out to me, and I wanted to get confirmation that I'm correct.
1. Under <ParameterList><ColumnReference...>, does the XML attribute "ParameterCompiledValue" represent the value used when the query plan was generated?
2. Does each query statement that makes up the execution plan for the stored procedure have its own execution plan? Meaning, is the stored procedure made up of multiple query plans that could have been generated at a different time from other parts of that stored procedure?
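For what it's worth: ParameterCompiledValue does show the parameter value sniffed at compile time, and since SQL Server 2005 recompilation happens at statement level, so individual statements inside one procedure can indeed carry plans generated at different times. Both can be inspected in the cached plan XML; a hedged sketch with a placeholder procedure name:

SELECT st.text, qp.query_plan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE st.text LIKE '%MyProcName%';   -- each statement appears as its own StmtSimple element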
We have a database on a 2005 box, which we need to keep in sync with one on a 2014 box (until we can turn off the one on 2005). The 2005 database is still being updated with changes that must be applied to the 2014 database, given the nature of the data (medical documents) we need to ensure updates are applied to the 2014 database in very near real time (these changes are - for example - statuses, not the documents themselves).
Cunning plan #1 - ugly, and I'm not at all a fan of triggers - is to use an AFTER UPDATE trigger to run an SP on the remote box via a linked server, in the format below, with a SQL Server login for the linked server that has permission to EXEC the remote proc.
CREATE TRIGGER [dbo].[SourceUpdate] ON [dbo].[SourceTable]
AFTER UPDATE
AS
SET XACT_ABORT ON;
SET NOCOUNT ON;
IF UPDATE(ColumnName)
[Code] ....
However, while the SP can be run against the linked server as a standalone query OK, when run inside the trigger it throws:
OLE DB provider "SQLNCLI" for linked server "WIBBLE" returned message "The transaction manager has disabled its support for remote/network transactions.".
Msg 7391, Level 16, State 2, Procedure TheAfterUpdateTrigger, Line 19
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "WIBBLE" was unable to begin a distributed transaction.
Is it actually possible to call a proc on a remote box via a trigger, and if so, what additional hoops need to be jumped through (like I said, it runs OK when called via SSMS)?
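The cause is transaction promotion: a trigger always runs inside the updating statement's transaction, so the linked-server EXEC tries to escalate it to a distributed (MSDTC) transaction, which the target has disabled. On SQL Server 2008 and later the linked server option below stops that promotion, but it does not exist on 2005, where a common workaround is for the trigger to only queue the change locally and let a frequent Agent job push it across. A hedged sketch with invented names:

-- 2008+ only: let the remote proc run outside the local transaction.
-- EXEC sp_serveroption 'WIBBLE', 'remote proc transaction promotion', 'false';

-- 2005 workaround: queue in the trigger, forward from a job.
CREATE TABLE dbo.SyncQueue
(
    Id       int IDENTITY PRIMARY KEY,
    SourceId int NOT NULL,                       -- assumes SourceTable has an Id key
    QueuedAt datetime NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER dbo.SourceUpdateQueue ON dbo.SourceTable
AFTER UPDATE
AS
SET NOCOUNT ON;
IF UPDATE(ColumnName)
    INSERT dbo.SyncQueue (SourceId)
    SELECT i.Id FROM inserted AS i;
GO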
I have this script in my database, but it always reports 2054 rows affected, and if I actually DO change something it doesn't even notice...
UPDATE a
SET a.[omschrijving] = SP.[omschrijving]
   ,a.[verkoopprijs] = SP.[verkoopprijs]
   ,a.[gewijzigd] = GETDATE()
FROM [artikelen] a
LEFT OUTER JOIN [Hofstede].[dbo].[sparepartsupdate] SP
    ON a.PartNrFabrikant = SP.PartNrFabrikant
WHERE ((A.omschrijving != SP.[omschrijving])
    OR (A.[verkoopprijs] != SP.[verkoopprijs]))
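A hedged observation: with !=, any comparison where either side is NULL evaluates to unknown, so changes to or from NULL are never detected, and the WHERE clause also turns the LEFT JOIN into an effective inner join. One NULL-safe rewrite uses INTERSECT, which treats NULLs as equal:

UPDATE a
SET a.[omschrijving] = SP.[omschrijving]
   ,a.[verkoopprijs] = SP.[verkoopprijs]
   ,a.[gewijzigd] = GETDATE()
FROM [artikelen] a
INNER JOIN [Hofstede].[dbo].[sparepartsupdate] SP
    ON a.PartNrFabrikant = SP.PartNrFabrikant
WHERE NOT EXISTS (SELECT a.omschrijving, a.verkoopprijs
                  INTERSECT
                  SELECT SP.omschrijving, SP.verkoopprijs);  -- NULL-safe "rows differ" test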
I am new to this level of coding in SQL Server 2012, but I am looking to update the TITLE_YEAR field in the temp table with the year the employee is in that title. For example, for employee 11127 the data should look like this:
EMP_ID  Title              DateValue                TITLE_YEAR
3       Senior Consultant  2009-01-01 00:00:00.000  1
3       Director           2010-01-01 00:00:00.000  1
3       Director           2011-01-01 00:00:00.000  2
3       Director           2012-01-01 00:00:00.000  3
3       Director           2013-01-01 00:00:00.000  4
3       Senior Director    2014-01-01 00:00:00.000  1
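A hedged sketch of one way to compute that, with the temp table name assumed from the description (if an employee can return to an earlier title, a gaps-and-islands variant would be needed instead):

WITH Numbered AS
(
    SELECT TITLE_YEAR,
           ROW_NUMBER() OVER (PARTITION BY EMP_ID, Title
                              ORDER BY DateValue) AS rn
    FROM #EmpTitles            -- hypothetical temp table name
)
UPDATE Numbered
SET TITLE_YEAR = rn;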
I have got 4 MS Access database files, each with 3 tables - 12 tables in total - which get updated with new data every evening by an external application; that is, new data gets appended to all 12 of these tables.
I want to have exactly the same 4 databases, with 3 tables each - 12 tables in total - but within MS SQL Server, and then update all 12 of these tables every evening with the corresponding updates from the respective tables in the MS Access databases.
I do not want to update all 12 tables into SQL Server manually every evening; hopefully there is an easier, automatic way to do this.
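One hedged option is a scheduled SQL Agent job that pulls from the Access files with OPENROWSET (this needs the "ad hoc distributed queries" option enabled and the ACE OLE DB provider installed; paths, table and key names below are placeholders). An SSIS package would be the other obvious route.

-- One job step per table: append only the rows not yet copied.
INSERT INTO dbo.Table1 (Id, Col1, Col2)
SELECT a.Id, a.Col1, a.Col2
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'C:\Data\Db1.accdb'; 'Admin'; '',
                'SELECT Id, Col1, Col2 FROM Table1') AS a
WHERE NOT EXISTS (SELECT 1 FROM dbo.Table1 t WHERE t.Id = a.Id);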
I'm working on databases where the statistics of some indexes (tables) change too frequently. Within a minute of updating them manually they show a 10-20% change, and after five minutes over 100% change. The tables are updated very frequently (multiple times a second).
When I run a query reading sys.stats, sys.dm_db_stats_properties and other dynamic views, I see that the statistics were last updated when I did so manually, even though the modification rate has passed the 500 + 20% threshold (the tables have tens of thousands of rows). Auto create and auto update statistics are set to true on all databases, and I don't know why SQL Server does not update them automatically.
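One point worth noting: auto-update is lazy - stale statistics are refreshed only when a query that relies on them is being compiled, not at the moment the threshold is crossed. The modification counter can be watched with something like this hedged sketch (table name is a placeholder):

SELECT s.name, sp.last_updated, sp.rows, sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.MyBusyTable');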
I have a job scheduled that imports a table from an Oracle database. The job runs at 3am and reports success, but for some reason when I query the table to see how many records there are, I see the same row count as the day before (it should increase every day - student enrollment). When I execute the package manually, the table updates fine.
We face a slow-performance issue: the same query takes a long time to execute after we rebuild and reorganize indexes, but after executing the query or procedure 2-3 times, performance becomes faster. I have the following questions:
1. Do we need to update stats after we rebuild and reorganize indexes?
2. Will every query and stored procedure be slow for the first 1-2 executions after we rebuild and reorganize indexes?
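Two facts bear on this: an index rebuild already refreshes that index's statistics with the equivalent of a full scan (a reorganize does not touch statistics at all), and a rebuild invalidates cached plans, so the first executions pay for recompilation and for warming the buffer pool. If column statistics still need refreshing after a reorganize, a hedged sketch with a placeholder table name:

-- Refresh statistics on one table (or run sp_updatestats for the whole database).
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;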
I would like a simple statement to select some rows and then update them, and I would like to use SERIALIZABLE locking - will this do the trick? Below is my T-SQL:
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRAN

SELECT *
FROM patient_medication_dispersal_
WHERE ddate = CONVERT(char(10), GETDATE(), 101)

UPDATE patient_medication_dispersal_
SET rec_status = 1
WHERE ddate = CONVERT(char(10), GETDATE(), 101)
  AND rec_status = 1
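A hedged observation: the batch above never commits, so the serializable range locks are held until the connection does so. An alternative that avoids keeping a transaction open at all is a single atomic UPDATE that returns the rows it touched via OUTPUT:

UPDATE patient_medication_dispersal_
SET rec_status = 1
OUTPUT inserted.*      -- returns the rows exactly as updated
WHERE ddate = CONVERT(char(10), GETDATE(), 101)
  AND rec_status = 1;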
We're creating a program and need some advice about locking. The program will select data for the user to view; the user may or may not then update that set. There is a possibility of another user attempting the same operation at the same time, so we want the first user to lock the data set while the second can only select until the first is complete. We've run some tests using the UPDLOCK hint. It works as expected except for the blocking: if a connection attempts to take another update lock after one connection already holds one, the new connection is blocked. We're looking for a way to let the program know that an update lock already exists before it tries to take a new one. The sp_lock procedure can display all the locks, but you cannot select specific values (i.e. select any update locks on ObjectID = x). Do any of you know another way, or how we can use sp_lock for our purpose?
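On SQL Server 2005 and later the information behind sp_lock is exposed as a view that can be filtered like any other; a hedged sketch (object name is a placeholder):

SELECT l.request_session_id, l.resource_type, l.request_mode, l.request_status
FROM sys.dm_tran_locks AS l
WHERE l.resource_database_id = DB_ID()
  AND l.request_mode = 'U'   -- update locks only
  AND l.resource_associated_entity_id = OBJECT_ID('dbo.MyTable');
-- Note: this matches OBJECT-level locks; row/page locks report a partition (hobt) id instead.

Alternatively, SET LOCK_TIMEOUT 0 makes the second connection fail immediately with error 1222 instead of blocking, which the program can catch.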
I am trying to run a simple update statement that updates around 1 million records on SQL Server 2005. For example:
update customers set CustCode='AAB' where CustType=72
I would like to update the table WITHOUT locking. In this case, there is no need to have "all or nothing" transactions. If it does a partial update and then fails, it's ok to only have half the records updated.
The server is using up a lot of resources creating and releasing the locks, plus users are getting locked out of the records during the update. I know this is by design, but in this case it's OK. I know I can use "SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED" to keep the select statements from being blocked, but there are way too many places that would have to be changed, and there are other updates to this table that do need to be locked.
So here is my question: Is there a way to do a transaction-less update?
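There is no truly transaction-less update - every statement is its own transaction at minimum - but batching keeps each of those transactions small, so locks are short-lived and far less likely to escalate to a table lock; a hedged sketch:

-- Update in small chunks; each iteration commits on its own.
WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) customers
    SET CustCode = 'AAB'
    WHERE CustType = 72
      AND (CustCode <> 'AAB' OR CustCode IS NULL);   -- skip rows already done

    IF @@ROWCOUNT = 0 BREAK;
END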
If you were to do a fresh install, setup would set permissions on the disk so everything just works.
Now, when changing the service account (e.g. to a domain user) using Configuration Manager, does it do the same magic (except perhaps where the database data/log files are on another disk)? Or do you need to trawl through the dozens of folders and assign rights manually?
I am trying to execute a stored procedure to update a table and I am getting "Invalid Object Name". I create a CTE named Darin_Import_With_Key and I am trying to update the table [dbo].[Darin_Address_File]. If I remove one of the update statements it works fine; it just doesn't like trying to execute both. The message I am getting is: Msg 208, Level 16, State 1, Line 58 - Invalid object name 'Darin_Import_With_Key'.
BEGIN
    SET NOCOUNT ON;

    WITH Darin_Import_With_Key AS
    (
        SELECT [pra_id]
              ,[pra_ClientPracID]
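The cause is almost certainly scope: a CTE exists only for the single statement that immediately follows it, so the second UPDATE finds no object by that name. A hedged sketch of the usual workaround - materialize the result once into a temp table that both updates can reference (source, columns and join are invented for illustration):

SELECT [pra_id], [pra_ClientPracID]       -- plus whatever other columns are needed
INTO #Darin_Import_With_Key
FROM dbo.SomeImportSource;                -- hypothetical source

UPDATE d
SET d.SomeColumn = k.[pra_ClientPracID]   -- hypothetical column mapping
FROM [dbo].[Darin_Address_File] AS d
JOIN #Darin_Import_With_Key AS k ON k.[pra_id] = d.[pra_id];

-- ...the second UPDATE can reference #Darin_Import_With_Key in the same way.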
I have a table #Vert with a value column. This data needs to be updated into two channel columns in the #Hori table, based on the channel number in the #Vert table.
CREATE TABLE #Vert (FILTER VARCHAR(3), CHANNEL TINYINT, VALUE TINYINT)

INSERT #Vert VALUES ('ABC', 1, 22), ('ABC', 2, 32), ('BBC', 1, 12),
                    ('BBC', 2, 23), ('CAB', 1, 33), ('CAB', 2, 44)
-- COMBINATION OF FILTER AND CHANNEL IS UNIQUE

CREATE TABLE #Hori (FILTER VARCHAR(3), CHANNEL1 TINYINT, CHANNEL2 TINYINT)

INSERT #Hori VALUES ('ABC', NULL, NULL), ('BBC', NULL, NULL), ('CAB', NULL, NULL)
-- FILTER IS UNIQUE IN #HORI TABLE
One way to achieve this is to write two update statements. After the updates, the output you see is my desired output:
UPDATE H SET CHANNEL1 = VALUE
FROM #Hori H JOIN #Vert V ON V.FILTER = H.FILTER
WHERE V.CHANNEL = 1   -- updates only CHANNEL1

UPDATE H SET CHANNEL2 = VALUE
FROM #Hori H JOIN #Vert V ON V.FILTER = H.FILTER
WHERE V.CHANNEL = 2   -- updates only CHANNEL2

SELECT * FROM #Hori   -- this is the desired output
My channel numbers grow in the #Vert table (1, 2, 3, 4 and so on), and likewise CHANNEL3, CHANNEL4 and so on in the #Hori table, so I cannot keep writing more update statements. One other way is to pivot the #Vert table and do a single update into the #Hori table.
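A hedged sketch of that single pivoted update (the IN list still names each channel, so a truly open-ended channel count would need the statement built with dynamic SQL):

UPDATE H
SET CHANNEL1 = P.[1],
    CHANNEL2 = P.[2]
FROM #Hori H
JOIN (SELECT FILTER, [1], [2]
      FROM (SELECT FILTER, CHANNEL, VALUE FROM #Vert) AS src
      PIVOT (MAX(VALUE) FOR CHANNEL IN ([1], [2])) AS pv) AS P
  ON P.FILTER = H.FILTER;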
I do not insert/update/delete on the view directly.
For every insert/update in table A or B, the values should be inserted/updated in the view respectively, and this insert/update on the view should invoke the trigger.
But I am unable to get this trigger to fire on the view when any insert/update occurs at the base-table level.
The trigger works only if the operation is done directly on the view.
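That is by design: the only triggers a view can carry are INSTEAD OF triggers, and they fire solely for DML issued against the view itself - changes made directly to the base tables never fire them. To react to base-table changes, the triggers have to sit on the base tables; a hedged sketch with placeholder names:

CREATE TRIGGER dbo.trg_TableA_Sync ON dbo.TableA
AFTER INSERT, UPDATE
AS
SET NOCOUNT ON;
-- React to the change here, e.g. log the affected keys.
INSERT dbo.ChangeLog (SourceTable, KeyValue)
SELECT 'TableA', i.Id FROM inserted AS i;   -- hypothetical key column and log table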
We have a web site and my boss wants it to work 24x7: any time one SQL Server goes down, another should take its place. What would be the best way to do this? Can anybody guide me in detail, or refer me to some good step-by-step material?
I am facing a problem related to locking and isolation level on some tables.
My problem is that some tables are frequently updated, and I also want to fire a select query over those tables every second and get only committed records. In the current scenario we start transactions at the READ COMMITTED level for updating records, but the select query cannot get its results because some of the resources are held by those transactions, and after some time it gives a deadlock error.
So I want a solution where both operations run simultaneously and the select gets only the records committed at the time it runs. Please help me solve this problem.
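Row versioning is designed for exactly this: readers see the last committed version of each row without blocking writers or being blocked by them. A hedged sketch with placeholder names:

-- Allow versioned reads, then opt in per session:
ALTER DATABASE MyDb SET ALLOW_SNAPSHOT_ISOLATION ON;

SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT * FROM dbo.FrequentlyUpdatedTable;   -- sees only committed data, never blocks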
I have a DB whose transaction log has grown to 19GB. I have some questions:
1. Is there any way we can minimize the entries written to the transaction log?
2. If I use transactions in my stored procedure, will the transaction entries automatically be removed from the transaction log once COMMIT TRANSACTION is issued?
3. Is there a way to reduce my current transaction log and stop it growing in such a manner in future?
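For what it's worth: logging cannot be switched off, and a commit does not remove log records - under the FULL recovery model log space is made reusable only by log backups (under SIMPLE, by checkpoints). A hedged sketch of the usual sequence, with placeholder names:

-- Back up the log so its space becomes reusable, then shrink the physical file once.
BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn';
DBCC SHRINKFILE (MyDb_log, 1024);   -- logical log file name and target size (MB) assumed

-- Then either schedule regular log backups, or, if point-in-time recovery
-- is not needed: ALTER DATABASE MyDb SET RECOVERY SIMPLE;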
Hi guys! Is there a way to combine these update statements?

Dim update_phase As New SqlCommand( _
    "INSERT INTO TE_shounin_zangyou (syain_No, date_kyou, time_kyou) " & _
    "SELECT syain_No, date_kyou, time_kyou FROM TE_zangyou " & _
    "WHERE [syain_No] = @syain_No", cnn)

Dim update_phase2 As New SqlCommand( _
    "UPDATE TE_shounin_zangyou SET phase = 2, phase_states2 = 06, " & _
    "syounin2_sysd = CONVERT(VARCHAR(10), GETDATE(), 101) " & _
    "WHERE [syain_No] = @syain_No", cnn)
The same table is updated, so I think it would be better to have just one statement. The problem is that the first statement retrieves its values from another table, whereas the update values of the second statement are fixed. Is there a way to combine these two statements? I tried the following, but it does not update. Here's my code:

Dim update_phase As New SqlCommand( _
    "UPDATE TE_shounin_zangyou SET " & _
    "TE_shounin_zangyou.syain_No = TE_zangyou.syain_No, " & _
    "TE_shounin_zangyou.date_kyou = TE_zangyou.date_kyou, " & _
    "TE_shounin_zangyou.time_kyou = TE_zangyou.time_kyou " & _
    "FROM TE_zangyou WHERE TE_zangyou.syain_No = TE_shounin_zangyou.syain_No", cnn)

Please help me. Thanks.
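A hedged observation: the two originals are an INSERT followed by an UPDATE of the same freshly inserted rows, so they can be collapsed into one INSERT ... SELECT that supplies the fixed values as literals (only the SQL text is sketched here):

INSERT INTO TE_shounin_zangyou
    (syain_No, date_kyou, time_kyou, phase, phase_states2, syounin2_sysd)
SELECT syain_No, date_kyou, time_kyou,
       2,                                     -- fixed phase
       06,                                    -- fixed phase_states2
       CONVERT(VARCHAR(10), GETDATE(), 101)   -- today's date
FROM TE_zangyou
WHERE [syain_No] = @syain_No;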
I am creating my company's database and I have a small problem that must be solved.
I have a pictures table:

PicturesTable
-------------
ProductID  int  ForeignKey
Picture    nvarchar(30)
...
(A product can have many pictures & the ProductID is unique for any product, but not for the table of pictures).
What I want to do is somehow write a procedure to: 1) check whether any images (for a ProductID) exist in the table; 2) if they do not exist, add the appropriate images to the table; 3) if the images do exist, update them with the new ones that I have.
What I thought was to just delete all the images from the table for the specific product:
DELETE FROM PicturesTable WHERE ProductID = '10-11'
and then add the appropriate images:

INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic1.gif')
INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic2.gif')
INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic3.gif')
but I do not like this idea much, because if a user tries to read the pictures for that product at the same time I am deleting them, s/he would get nothing. Is there any other way I can do it, please?
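One hedged option is to keep the delete-and-reinsert approach but wrap it in a single transaction, so a reader sees either the old set of pictures or the new set, never the empty gap in between (readers block for the instant the transaction is open rather than getting no rows):

BEGIN TRANSACTION;

DELETE FROM PicturesTable WHERE ProductID = '10-11';

INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic1.gif');
INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic2.gif');
INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic3.gif');

COMMIT TRANSACTION;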
My manager is interested in knowing if there is a way to update our website's SQL database using Excel, similar to importing.
The person who was previously in my position had imported a few hundred new products into the database with an excel spreadsheet.
Now we would like to make updates, such as price changes or similar adjustments, to a number of the products in the database. We could use a web interface, but ours requires us to find each product individually and it takes too much time. I told him that it would probably be necessary to write an SQL statement to update the tables, but we are also interested in maintaining the integrity of the database and are worried about losing data due to a typo. Is it possible to export the DB contents to an Excel file, make changes, and then merge those changes into the existing database? I have tried and failed, so I am wondering if any experienced users could help me out.
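That round trip is workable; a hedged sketch of the import half - read the edited spreadsheet into a staging table, then apply the changes with a joined UPDATE (provider, file path, sheet, table and column names are all placeholders, and "ad hoc distributed queries" must be enabled):

-- Stage the edited spreadsheet, then update the matching products.
SELECT ProductID, Price
INTO #PriceChanges
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\PriceUpdates.xlsx',
                'SELECT ProductID, Price FROM [Sheet1$]');

UPDATE p
SET p.Price = c.Price
FROM dbo.Products AS p
JOIN #PriceChanges AS c ON c.ProductID = p.ProductID;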
Also, is there some kind of phpMyAdmin for MS SQL? A free, open-source alternative would be best.