Let's say I have table A and table B, and another table AB where each row in AB references a row in A and a row in B. Furthermore, I set both relationships to cascade on delete.
Then one user deletes a row from A, which cascades to two rows in AB. But another user has deleted a row in B, which is also trying to delete the same two rows in AB. The first transaction deletes one of the AB rows, the second deletes the other, and then neither transaction can get the other row in AB to delete because it's locked. So this is a deadlock! Is it really that easy to get a deadlock?
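A minimal sketch of the kind of schema being described (the table and column names here are only illustrative):

-- Hypothetical schema: deleting from A or from B cascades into AB.
CREATE TABLE A (AId int PRIMARY KEY);
CREATE TABLE B (BId int PRIMARY KEY);
CREATE TABLE AB (
    AId int NOT NULL REFERENCES A (AId) ON DELETE CASCADE,
    BId int NOT NULL REFERENCES B (BId) ON DELETE CASCADE,
    PRIMARY KEY (AId, BId)
);
-- Session 1: DELETE FROM A WHERE AId = 1;  -- cascade tries to delete two AB rows
-- Session 2: DELETE FROM B WHERE BId = 1;  -- cascade tries to delete the same two AB rows
-- If each cascade locks one AB row first and then waits for the other, the two sessions deadlock.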
I have a table that contains a Text column and is accessed by every page of the site. It is getting frequent deadlocks. Can anybody suggest a good solution, please?
I have two transactions. The first one performs MANY updates to table A and finishes with a commit or rollback (ONLY AT THE END); the second one has a single insert into table A that finishes with a commit or rollback. The problem is that the update process takes a long time to finish, and the insert can be issued while the first process is still running. That's where everything gets locked, because table A is locked and my Java application gets stuck.
Note: when I execute each transaction independently I have no problems.
Is there any way to lock table A completely for the first transaction and release it for the second one?
Could you give me any suggestions on what to do, step by step?
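One possible approach, sketched below with placeholder table and column names: take an exclusive table lock at the start of the long transaction, so the competing insert simply waits for the commit instead of interleaving with the updates.

BEGIN TRANSACTION;
-- The TABLOCKX hint on the first statement takes an exclusive lock on the whole table;
-- it is held until COMMIT or ROLLBACK, so the other session's insert waits instead of interleaving.
UPDATE TableA WITH (TABLOCKX) SET SomeColumn = SomeColumn + 1 WHERE SomeKey = 1;
-- ... the rest of the many updates ...
UPDATE TableA SET SomeColumn = SomeColumn + 1 WHERE SomeKey = 2;
COMMIT TRANSACTION;  -- only now can the second transaction's insert proceed

The trade-off is that the insert now waits for the whole update batch, so this only helps if blocking is acceptable and deadlocks are not.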
Hi there, I was updating a custom error log table in different Data Flow Tasks in a package. The custom log table is used to log the redirected rows that a component sends when it has errors. So I am catching the errors raised by components and putting them all into one table. I think this could be the cause of the following error, but I am not sure. Please help me out: what could be the cause of this error, and how do I prevent it?
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "Transaction (Process ID 71) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
I am building an application based on different WCF services which collect different types of data from a SQL database. I can have up to 10 clients which could, at certain times, access the same data in a read or write operation. I ran a test with only 2 clients, and I was surprised to receive a nice deadlock error.
The error is as follows:
Transaction (Process ID 63) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."
In the stored procedure that I suspect (not sure yet), I was advised to use
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE; but it didn't help.
From what I can see in my application, I have one process that inserts rows into this table while clients read from the same table. But I guess the deadlock needs to be investigated more thoroughly and handled properly anyway.
I am not a database administrator, so I have no idea how to handle such a situation.
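One common mitigation when readers deadlock with a single writing process is row versioning; a hedged sketch (the database name is a placeholder, and the change should be tested, since it alters read-committed behaviour for the whole database):

-- Needs a brief moment of exclusive access to the database to switch on.
ALTER DATABASE MyWcfDatabase SET READ_COMMITTED_SNAPSHOT ON;
-- After this, SELECTs running under the default READ COMMITTED level read the last
-- committed version of each row instead of blocking on (or deadlocking with) the inserts.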
Please help me resolve this deadlock. Server Error in '/' Application.
Transaction (Process ID 69) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Transaction (Process ID 69) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Source Error:
Line 292: da.SelectCommand.Parameters.Add(HierCode3)
Line 293: Dim ds As New DataSet
Line 294: da.Fill(ds, "Employees")
Line 295: Con.Close()
Line 296: Dim dt As DataTable = ds.Tables("Employees")
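A deadlock victim (error 1205) is generally safe to retry, so one hedged option is to wrap the query in retry logic. The sketch below shows the idea server-side in T-SQL for SQL Server 2012 or later; the table, column, and parameter names are made up for illustration.

DECLARE @HierCode3 nvarchar(20) = N'H003';   -- placeholder parameter
DECLARE @retry int = 3;

WHILE @retry > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        SELECT * FROM dbo.Employees WHERE HierCode = @HierCode3;   -- stand-in for the real query
        COMMIT TRANSACTION;
        SET @retry = 0;                     -- success: stop looping
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205 AND @retry > 1
            SET @retry = @retry - 1;        -- deadlock victim: try again
        ELSE
            THROW;                          -- out of retries or a different error: re-raise
    END CATCH;
END;

The same retry pattern can equally be implemented in the VB.NET code around da.Fill by catching SqlException number 1205.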
I have transactional replication which replicates data every minute, and I have a job which runs every 15 minutes and pulls reports from the source DB which is being replicated.
Sometimes, when replication is pushing data at the same time the job is trying to generate reports, the table is locked by replication, the deadlock issue arises, and the job fails.
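One hedged option is to have the report job read under snapshot isolation, so it works from row versions and cannot deadlock with the replication agent's writes. The database and table names below are placeholders:

-- One-time setting on the database the reports read from.
ALTER DATABASE ReportSourceDb SET ALLOW_SNAPSHOT_ISOLATION ON;

-- In the report job, before running the report queries:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
BEGIN TRANSACTION;
SELECT COUNT(*) FROM dbo.SomeReplicatedTable;   -- placeholder for the real report queries
COMMIT TRANSACTION;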
I have an exe application that runs every half hour and does some database operations. At the same time, the web application uses the same DB when users access the site. At this point I am getting the deadlock exception below. What should I do to avoid this? Please guide me.
System.Data.SqlClient.SqlException: Transaction (Process ID 88) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlDataReader.HasMoreRows()
   at System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout)
   at System.Data.SqlClient.SqlDataReader.Read()
   at System.Data.Common.DataAdapter.FillLoadDataRow(SchemaMapping mapping)
   at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)
   at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
   at System.Data.Common.LoadAdapter.FillFromReader(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
   at System.Data.DataTable.Load(IDataReader reader, LoadOption loadOption, FillErrorEventHandler errorHandler)
   at System.Data.DataTable.Load(IDataReader reader)
   at B.DAL.LRQ.Select(String sLoanID)
   at QN.Program.QNo()
I am using a Conditional Split, checking to see if a record exists and, if so, updating it, otherwise inserting it. But this causes a database deadlock. Does anyone have a suggestion?
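A separate "does it exist?" lookup followed by an update or insert lets two sessions interleave between the check and the write. One hedged alternative is to let the database do the check and the write atomically, for example with an update-then-insert pattern in a stored procedure called from an OLE DB Command; the table and column names below are invented.

DECLARE @BusinessKey int = 42, @Payload nvarchar(100) = N'example';   -- placeholder values

BEGIN TRANSACTION;

-- UPDLOCK + HOLDLOCK keeps the key (or the gap where it would be) locked until COMMIT,
-- so another session cannot slip an insert in between the check and our insert.
UPDATE dbo.TargetTable WITH (UPDLOCK, HOLDLOCK)
SET    Payload = @Payload
WHERE  BusinessKey = @BusinessKey;

IF @@ROWCOUNT = 0
    INSERT INTO dbo.TargetTable (BusinessKey, Payload)
    VALUES (@BusinessKey, @Payload);

COMMIT TRANSACTION;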
Are there any useful SQL queries that might be used to identify lists of potential duplicate records in a table?
For example, I have a client database that includes a table dbo.Clients. This table contains various columns which could be used to identify possible duplicate records, such as Surname | Forenames | DateOfBirth | NINumber | PostalCode, etc. The data contained in these columns is not always exactly the same due to differences caused by user data entry, so some records may have missing data in some of the columns and there could be spelling differences too, as in the following examples:
1 | Smith | John Raymond | NULL | NI990946B | SW12 8TQ
2 | Smith | John | 06/03/1967 | NULL | SW12 8TQ
3 | Smith | Jon Raymond | 06/03/1967 | NI 99 09 46 B | SW12 8TQ
The problem is that while it is easy for a human being to review these three entries and conclude that they are most likely the same client entered into the database three times, I cannot find a reliable way of identifying them using a SQL query.
I've considered using some sort of concatenation into a new column, minus white space, and then using a "WHERE column_name LIKE pattern" query, but so far I can't get anything to work well enough. Fuzzy logic, maybe?
The results would produce a grid something like this for the example above:
ID | Surname | Forenames | DuplicateID | DupSurname | DupForenames
1 | Smith | John Raymond | 2 | Smith | John
1 | Smith | John Raymond | 3 | Smith | Jon Raymond
9 | Brown | Peter David | 343 | Brown | Pete D
next batch of duplicates, etc., etc.
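One hedged starting point, assuming a dbo.Clients table with an ID column plus the columns listed above, is a self-join on a normalised postcode and a SOUNDEX comparison of the surname. It will not catch every case, but it produces a grid shaped like the example:

SELECT  c1.ID, c1.Surname, c1.Forenames,
        c2.ID AS DuplicateID, c2.Surname AS DupSurname, c2.Forenames AS DupForenames
FROM    dbo.Clients AS c1
JOIN    dbo.Clients AS c2
        ON  c2.ID > c1.ID                                          -- report each pair once
        AND REPLACE(c1.PostalCode, ' ', '') = REPLACE(c2.PostalCode, ' ', '')
        AND SOUNDEX(c1.Surname) = SOUNDEX(c2.Surname)              -- tolerates Smith / Smyth style variations
        AND LEFT(c1.Forenames, 1) = LEFT(c2.Forenames, 1)          -- crude forename check
ORDER BY c1.ID;

DIFFERENCE() on the forenames, or a dedicated fuzzy-matching step such as SSIS Fuzzy Grouping, can be layered on top when this proves too loose or too strict.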
The following error is encountered when importing a delimited flat file with dates in the format "dd.mm.yyyy":
Error: 0xC02020A1 at Data Flow Task, Source - DCDtest_xpt [1]: Data conversion failed. The data conversion for column "value date" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
This was when I manually built the package.
I get the same error when using the Import/Export wizard
I am even using the "Suggest Types" button and have tried sampling the default number of rows (200?) as well as 2000.
The type it suggests is DT_DATE.
But reading the BOL, this would appear to be the wrong type:
http://msdn2.microsoft.com/en-us/library/ms141036.aspx (obviously the right hand doesn't know what the left hand is doing)
seems to indicate that
DT_DBTIMESTAMP
is the correct value
(I cannot believe that they have different datatypes in SSIS than in SQL. I can't believe for a minute this is for all the hundreds of thousands of Oracle users who obviously switched to SSIS when they saw what a high quality product it is.)
I tried other DT_... values but no dice.
Can anyone help?
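If the package keeps refusing the conversion, one hedged workaround is to import the column as a plain string into a staging table and convert it afterwards in T-SQL, where style 104 explicitly means dd.mm.yyyy; the staging table named below is hypothetical.

-- Illustration of the conversion itself:
DECLARE @raw varchar(10) = '25.12.2006';
SELECT CONVERT(datetime, @raw, 104) AS value_date;   -- style 104 = dd.mm.yyyy

-- Applied to a hypothetical staging table loaded with the raw text:
-- UPDATE dbo.Staging SET value_date_dt = CONVERT(datetime, value_date_text, 104);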
I always thought that Classic ASP was the worst product I've ever worked with from the Microsoft stable, but I was wrong.
I am fed up with having to post on this board (no wonder it is so 'popular').
Talking to peers, reading books, and googling nearly always enables me to figure out a problem with any application I have ever used, but SSIS breaks the mould in sheer crapness and in the weirdness and unfathomability of its cryptic errors.
I want to lock a table so that others cannot lock it, but are still able to read it, inside transactions.
The coding I need is something like this:
set implicit_transactions on
begin transaction
select * from table1 with (tablock, holdlock)
update table2 set field1 = 'test'
commit transaction
commit transaction
I have tried the code above, but it doesn't prevent others from locking table1.
So I changed the TABLOCK to TABLOCKX to prevent others from locking table1. But this also prevents others from reading table1. So, how can I lock table1 so that others cannot lock it but are still able to read it?
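If the real requirement is "other sessions may read, but no other session may enter this critical section", one hedged alternative is an application lock, which serialises cooperating writers without putting any extra locks on table1 at all. The resource name below is made up, and every writer has to request the same name for this to work:

BEGIN TRANSACTION;

EXEC sp_getapplock
     @Resource  = 'table1_writer',
     @LockMode  = 'Exclusive',
     @LockOwner = 'Transaction';

-- Protected work: readers of table1 are unaffected, but any other session
-- calling sp_getapplock with the same resource name waits here.
UPDATE table2 SET field1 = 'test';

COMMIT TRANSACTION;   -- the application lock is released with the transaction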
I have two servers (let's call them sA and sB) connected from sA -> sB via a linked server (i.e. sA pulls data across from sB).
I'm building a temp table full of stock symbols on sA, and then I need to update some values on sA using content on sB. The tables on sB are very large (about 500 million rows), so I don't want to pull anywhere near everything across the linked server. Ordinarily I'd do this by joining to the linked server table, but the target table needs to have NOLOCK on it due to its heavy use.
update t
set someValue = s.SomeValue
from #myTab t
inner join lnk_sB.xref.dbo.Symbols s with (nolock)
    on t.id = s.id
From reading around, I gather that NOLOCK doesn't work across linked servers. It was noted in another SSC article that you could use NOLOCK via an OPENQUERY, but then I can't join to my local temp table, and I end up pulling all 500 million rows across the linked server.
Is there some way I can limit the content pulled from sB using my temp table on sA, but still use NOLOCK?
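One hedged workaround, assuming the id list in #myTab is reasonably small and the linked server allows RPC Out, is to send the ids to sB inside a dynamic query executed on that server; the NOLOCK hint is then applied remotely and only the matching rows come back. The results temp table below is made up, and STRING_AGG needs SQL Server 2017 or later:

-- Build a comma-separated id list from the local temp table.
DECLARE @ids nvarchar(max);
SELECT @ids = STRING_AGG(CONVERT(nvarchar(20), id), ',') FROM #myTab;

DECLARE @sql nvarchar(max) =
    N'SELECT id, SomeValue FROM xref.dbo.Symbols WITH (NOLOCK) WHERE id IN (' + @ids + N');';

CREATE TABLE #remoteSymbols (id int, SomeValue int);   -- adjust types to match the real columns

INSERT INTO #remoteSymbols (id, SomeValue)
EXEC (@sql) AT lnk_sB;                                 -- runs on sB, so the hint is honoured there

UPDATE t
SET    someValue = s.SomeValue
FROM   #myTab t
JOIN   #remoteSymbols s ON s.id = t.id;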
Error: The Script returned a failure result. Task SCR REIL Data failed
OnError - Task SQL Insert Error Msg
Error: A deadlock was detected while trying to lock variable "System::ErrorCode, System::ErrorDescription, System::ExecutionInstanceGUID, System::StartTime, User::FEED_ID, User::t_ProcessedFiles" for read access. A lock could not be acquired after 16 attempts and timed out.
Error: The expression ""EXEC [dbo].[us_sp_Insert_STG_FEED_EVENT_LOG] @FEED_ID= " + (DT_WSTR,10) @[User::FEED_ID] + ", @FEED_EVENT_LOG_TYPE_ID = 3, @STARTED_ON = '"+(DT_WSTR,30)@[System::StartTime] +"', @ENDED_ON = NULL, @message = 'Package failed. ErrorCode: "+(DT_WSTR,10)@[System::ErrorCode]+" ErrorMsg: "+@[System::ErrorDescription]+"', @FILES_PROCESSED = '" + @[User::t_ProcessedFiles] + "', @PKG_EXECUTION_ID = '" + @[System::ExecutionInstanceGUID] + "'"" on property "SqlStatementSource" cannot be evaluated. Modify the expression to be valid.
Warning: The Execution method succeeded, but the number of errors raised (4) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
And how did I get 4 errors? I only set my script task result to failure.
Hi all, this problem is almost driving me crazy; I hope I can get some hints from you guys! OK, here is the situation: I want only one user to be able to modify (update) the data from my page at a time, and if other users connect to my database through the .aspx page at the same time, they should only be able to browse the data until the first user finishes updating. It seems I need to implement some locking in the database, but I am not sure how to do that using ASP.NET! Thanks in advance!
Hi, I have an application that updates some records in SQL tables, and I want to build a web application that updates records in the same database table. My question is: how can I lock the row or table so I don't have concurrency problems? Thanks in advance.
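For both of the last two questions, a common hedged pattern in web applications is optimistic concurrency with a rowversion column, rather than holding database locks across page requests; a sketch with made-up table and column names:

-- One-time schema change (run on its own): give the table a version column
-- that SQL Server maintains automatically.
--   ALTER TABLE dbo.Items ADD RowVer rowversion;

-- When the page loads, read the row together with its current version.
DECLARE @ItemId int = 1, @NewPayload nvarchar(100) = N'edited text', @RowVerAtLoad binary(8);
SELECT @RowVerAtLoad = RowVer FROM dbo.Items WHERE ItemId = @ItemId;

-- When the user saves, update only if nobody else changed the row in the meantime.
UPDATE dbo.Items
SET    Payload = @NewPayload
WHERE  ItemId = @ItemId
  AND  RowVer = @RowVerAtLoad;

IF @@ROWCOUNT = 0
    RAISERROR('The row was changed by another user; reload it and try again.', 16, 1);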
Hello friends, I have a VB application running against a SQL Server DB. The VB application is installed on multiple PCs on the network. When I fetch the same data from all the different PCs simultaneously, it is amazingly fast. But the issue comes when I try to update the same table (but different rows) from the different PCs simultaneously: the time taken is directly proportional to the number of users. I don't understand what the problem could be. Can anyone suggest an approach? Is it somehow related to table / row / page locking, since all the connections are trying to update the same table? I checked the isolation level; it's the default, READ COMMITTED. Kindly suggest.
Hi, if I run an insert statement from Query Analyzer and then try to open the table from Enterprise Manager, it takes a long time to open the table. But this problem disappears when I put the statement inside a Begin/End Transaction block. Any idea why this is happening? Thanks in advance. Taw.
Which lock type or isolation level should I use to be sure that no one will read or write or do anything with the table I'm using? The code block should look something like this:
lock table
read value from table
change value to new_value
update table set value = new_value
release lock
While I'm changing the value, absolutely no one should be able to read from the table.
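A hedged translation of that pseudocode into T-SQL: an exclusive table lock taken inside the transaction blocks both readers and writers until the commit (table and column names are placeholders):

BEGIN TRANSACTION;

DECLARE @value int;
-- TABLOCKX takes an exclusive lock on the whole table; it is held until COMMIT.
SELECT @value = value FROM dbo.SettingsTable WITH (TABLOCKX);

SET @value = @value + 1;                           -- "change value to new_value"
UPDATE dbo.SettingsTable SET value = @value;

COMMIT TRANSACTION;                                -- lock released here

The one caveat is that a reader who explicitly uses NOLOCK / READ UNCOMMITTED can still peek at the data; an exclusive lock cannot stop that.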
I was writing a query using both a left outer join and an inner join, and the query was:
SELECT S.companyname AS supplier, S.country, P.productid, P.productname, P.unitprice, C.categoryname
FROM Production.Suppliers AS S
LEFT OUTER JOIN (Production.Products AS P
    INNER JOIN Production.Categories AS C
[code]....
However, the result that I got was correct. But when I did the same query using a left outer join in both cases,
i.e..
SELECT S.companyname AS supplier, S.country, P.productid, P.productname, P.unitprice, C.categoryname
FROM Production.Suppliers AS S
LEFT OUTER JOIN (Production.Products AS P
    LEFT OUTER JOIN Production.Categories AS C
        ON C.categoryid = P.categoryid)
    ON S.supplierid = P.supplierid
WHERE S.country = N'Japan';
the result I got was the same, i.e.
supplier        country  productid  productname    unitprice  categoryname
Supplier QOVFD  Japan    9          Product AOZBW  97.00      Meat/Poultry
Supplier QOVFD  Japan    10         Product YHXGE  31.00      Seafood
Supplier QOVFD  Japan    74         Product BKAZJ  10.00      Produce
Supplier QWUSF  Japan    13         Product POXFU  6.00       Seafood
Supplier QWUSF  Japan    14         Product PWCJB  23.25      Produce
Supplier QWUSF  Japan    15         Product KSZOI  15.50      Condiments
Supplier XYZ    Japan    NULL       NULL           NULL       NULL
Supplier XYZ    Japan    NULL       NULL           NULL       NULL
So this time I also got the same result. My question is: is there any specific reason to use an inner join when joining the third table, rather than a left outer join?
How can I see which table is locked by a particular process? I know that I can view a particular spid from 'Current Activity', but is there any way I can see which table is at the centre of the problem? I really appreciate your help.
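On SQL Server 2005 and later (and given VIEW SERVER STATE permission), one way is to resolve the locked objects for that spid from the locking DMV; run it in the database that owns the suspect tables. A sketch:

DECLARE @spid int = 53;   -- the session id seen in Current Activity

SELECT  l.request_session_id,
        DB_NAME(l.resource_database_id) AS database_name,
        CASE WHEN l.resource_type = 'OBJECT'
             THEN OBJECT_NAME(l.resource_associated_entity_id)
             ELSE OBJECT_NAME(p.object_id)
        END                             AS table_name,
        l.resource_type, l.request_mode, l.request_status
FROM    sys.dm_tran_locks AS l
LEFT JOIN sys.partitions AS p
        ON p.hobt_id = l.resource_associated_entity_id
WHERE   l.request_session_id = @spid;

On SQL Server 2000, EXEC sp_lock <spid> plus OBJECT_NAME() on the ObjId column from its output gives roughly the same answer.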
Hi, I am working on a project which needs to produce a sequential certificate number. Every time I need a new certificate number, I have to find the max number in the database; the new certificate number is just max+1. But how can I block another transaction from checking the max certificate number while this transaction is in the middle of writing the new certificate number (max+1) into the database? Does adLockOptimistic work in this case? Here is the code (my database is SQL 2000):
cmdTemp.CommandText = "Select max(certificateNumber) from product_table where certificateNumber <> 8888888"
set cert_info = Server.CreateObject("ADODB.RecordSet")
cert_info.Open cmdTemp, , AdOpenKeySet, adLockOptimistic
If Not cert_info.EOF then
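One hedged server-side approach (it works on SQL 2000) is to read and insert the new max inside a single transaction with UPDLOCK and HOLDLOCK, so a second transaction cannot read the old max until the first one has committed; the insert column list below is trimmed to the essentials.

BEGIN TRANSACTION

DECLARE @next int
SELECT  @next = MAX(certificateNumber) + 1
FROM    product_table WITH (UPDLOCK, HOLDLOCK)
WHERE   certificateNumber <> 8888888

INSERT INTO product_table (certificateNumber)   -- plus whatever other columns are required
VALUES (@next)

COMMIT TRANSACTION

adLockOptimistic on the ADO recordset does not give this guarantee by itself; the serialisation has to happen in the database.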
How can I take a table-wide exclusive lock in MS SQL Server?
I found the description, but how can I apply it? The SQL LOCK TABLE <tablename> IN EXCLUSIVE MODE is not working.
Is there any query/method to do this? Please help...
thanks
About Exclusive locks -------------------- Exclusive (X) locks are used for data modification operations, such as UPDATE, INSERT, or DELETE.
Other transactions cannot read or modify data locked with an Exclusive (X) lock. If a Shared (S) lock exists, other transactions cannot acquire an Exclusive (X) lock. --------------------
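SQL Server has no LOCK TABLE statement; the usual equivalent of LOCK TABLE ... IN EXCLUSIVE MODE is a table hint inside an explicit transaction (the table name below is a placeholder):

BEGIN TRANSACTION;
-- Touching the table with TABLOCKX acquires an exclusive table lock, held until COMMIT/ROLLBACK.
SELECT TOP (1) 1 FROM dbo.MyTable WITH (TABLOCKX);
-- ... work that needs the table to yourself ...
COMMIT TRANSACTION;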
I need to update a row but keep a lock on the table (so no one else can update it) while I run some more code. In Oracle, it always locks whatever you update until you hit commit, but SQL Server seems to work the opposite way. How do I tell it not to commit a statement immediately, or how would I explicitly take a lock and then release it later?
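SQL Server actually behaves the same way as Oracle once you use an explicit transaction: the exclusive lock taken by an UPDATE is held until COMMIT or ROLLBACK. The difference is only that SQL Server's default is autocommit, where each standalone statement commits itself. A sketch with placeholder names:

DECLARE @Id int = 1;   -- placeholder key

BEGIN TRANSACTION;     -- or SET IMPLICIT_TRANSACTIONS ON for Oracle-like behaviour

UPDATE dbo.MyTable SET Status = 'busy' WHERE Id = @Id;
-- The updated row stays exclusively locked here, so no one else can update it
-- while the rest of the code runs.

COMMIT TRANSACTION;    -- lock released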