Conditional Split For Insert Or Update Causes Deadlock At Database Level
Aug 28, 2007
Hi
I am using a Conditional Split to check whether a record exists: if it does, I update it; otherwise I insert it. But this causes a deadlock on the database. Does anyone have a suggestion?
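One hedged way out, assuming the deadlock comes from the update and insert branches hitting the same table concurrently, is to land the incoming rows in a staging table and do the upsert as a single serialized T-SQL step, taking an update lock up front so two sessions on the same keys block instead of deadlocking. Table and column names below are invented:

    BEGIN TRAN;

    -- UPDLOCK/HOLDLOCK serializes concurrent upserts on the same keys
    UPDATE t
    SET    t.Payload = s.Payload
    FROM   dbo.Target AS t WITH (UPDLOCK, HOLDLOCK)
    JOIN   dbo.Staging AS s ON s.BusinessKey = t.BusinessKey;

    INSERT INTO dbo.Target (BusinessKey, Payload)
    SELECT s.BusinessKey, s.Payload
    FROM   dbo.Staging AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Target AS t
                       WHERE t.BusinessKey = s.BusinessKey);

    COMMIT TRAN;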
I have a variable called UpdateIDs. I would like to create a conditional split on IDs that have no responses and IDs that have responses. The idea is that there is a whole set of tables that can be updated if the foreign key is among the no-response IDs.
So I have created a data flow. The Conditional Split is there, but there is no "Update Global Variable" destination component.
Is there a way to achieve this?
Basically, this is the logic I am trying to use:
Data flow --> store IDs into a global variable --> data flow that updates/inserts tables and rows where the ID is in the global variable.
Is there another way this can be done? Sorry, I am very new to this, and I would appreciate any help.
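There is no "Update Global Variable" destination in SSIS. A common workaround, sketched with invented names, is to land the qualifying IDs in a staging table in the first data flow (a Recordset Destination filling an Object variable is the in-memory alternative) and then drive the updates with a join rather than an IN list:

    -- After the first data flow has filled dbo.NoResponseIDs,
    -- an Execute SQL Task updates each dependent table by joining:
    UPDATE t
    SET    t.Status = 'NoResponse'
    FROM   dbo.SomeDependentTable AS t
    JOIN   dbo.NoResponseIDs AS n ON n.ID = t.ForeignKeyID;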
How do I lock an entire database? I want an exclusive lock on the DB taken by a user who is the dbo of that database only (not the sa user).
The scenario: we have a web application, and each week we need to do data uploads (via ETL). During this upload, users accessing the website should not be able to read data. This is why I want the database lock.
Now, the catch here is that the application accesses the data using a user, say abc. Abc is dbo for that database, and the ETL is also done under the abc login. So will DB locking help in this case, since the website can also read the data as the abc user?
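If the real requirement is just that nobody else can use the database during the ETL window, one approach is single-user mode, which keys on the number of connections rather than on who is logging in, so it still works when the website and the ETL share the abc login. The database name is a placeholder:

    -- Run from the ETL connection: drops everyone else and blocks reconnects
    ALTER DATABASE MyAppDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    -- ... perform the weekly upload on this one connection ...
    ALTER DATABASE MyAppDb SET MULTI_USER;

RESTRICTED_USER would not help here, since dbo connections (the website's abc login included) are still allowed in.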
I have 2 transactions. The first one makes MANY updates to table A and finishes with a commit or rollback (ONLY AT THE END); the second one does a single insert into table A and finishes with a commit or rollback. The problem is that the update process takes a long time to finish, and the insert can be issued while the first process is still running. That is where everything gets locked up: table A is locked, and my Java application gets stuck.
Note: when I execute each transaction independently, I have no problems.
Is there any way to lock table A completely for the first transaction and release it for the second one?
Could you give me a suggestion of what to do, step by step?
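A sketch of one way to get exactly that behaviour: take an exclusive table lock at the start of the first transaction and hold it to the commit, so the insert transaction simply waits its turn instead of deadlocking partway through:

    BEGIN TRAN;

    -- Acquire an exclusive lock on A for the life of this transaction
    SELECT COUNT(*) FROM dbo.A WITH (TABLOCKX, HOLDLOCK);

    -- ... the many updates to dbo.A ...

    COMMIT TRAN;   -- lock released here; the pending insert proceeds

The cost is that all other access to A blocks for the duration, so this only makes sense if the long transaction really must have the table to itself.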
Hi there. I was updating a custom error log table from several Data Flow Tasks in a package. The custom log table is used to record the rows a component redirects when it hits errors, so I am catching the errors raised by the components and putting them all into one table. I think this could be the cause of the following error, but I am not sure. Please help me out: what could be causing this error, and how do I prevent it?
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "Transaction (Process ID 71) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
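Before changing the package, it is worth capturing the deadlock graph so you can see exactly which two statements and resources are colliding. On SQL Server 2005 a trace flag writes the graph to the error log:

    -- Log full deadlock graphs server-wide
    DBCC TRACEON (1222, -1);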
I am building an application based on different WCF services which collect different types of data from a SQL database. I can have up to 10 clients which could, at certain times, hit the same data with read or write operations. I ran a test with only 2 clients and was surprised to get a deadlock error.
The error is as follows:
Transaction (Process ID 63) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
In the stored procedure I suspect (not sure yet), I was advised to use
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE; but it did not help.
From what I can see in my application, one process inserts rows into this table while clients read from the same table. But I guess the deadlock needs to be investigated more thoroughly and handled properly anyway.
I am not a database administrator, so I have no idea how to handle such a situation.
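For the reader-versus-writer pattern described (one process inserting while clients read the same table), SERIALIZABLE usually makes deadlocks more likely rather than less, because it holds range locks. The usual alternative on SQL Server 2005 and later is row versioning, so readers stop taking the shared locks that collide with the inserts; the database name is a placeholder:

    ALTER DATABASE MyDb SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;

With this on, plain READ COMMITTED readers see the last committed version of a row instead of blocking on, or deadlocking with, the writer.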
Please help me remove this deadlock. Server Error in '/' Application.
Transaction (Process ID 69) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Transaction (Process ID 69) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Source Error:
Line 292: da.SelectCommand.Parameters.Add(HierCode3)
Line 293: Dim ds As New DataSet
Line 294: da.Fill(ds, "Employees")
Line 295: Con.Close()
Line 296: Dim dt As DataTable = ds.Tables("Employees")
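The message itself names the generic client-side remedy: rerun the transaction. A minimal T-SQL sketch of a deadlock retry loop (the comment marks where the victim statement would go; in ADO.NET the same loop is written around the SqlException with error number 1205):

    DECLARE @retry int;
    SET @retry = 3;

    WHILE @retry > 0
    BEGIN
        BEGIN TRY
            BEGIN TRAN;
            -- ... the statement that was chosen as the deadlock victim ...
            COMMIT TRAN;
            BREAK;                              -- success: leave the loop
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRAN;
            SET @retry = @retry - 1;
            IF ERROR_NUMBER() <> 1205 OR @retry = 0
            BEGIN
                RAISERROR ('Giving up after deadlock retries.', 16, 1);
                BREAK;
            END
        END CATCH
    END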
I have transactional replication that replicates data every minute, and a job that runs every 15 minutes and pulls reports from the source DB that is being replicated.
Sometimes, when replication is pushing data at the same time the job is trying to generate reports, the table is locked by the replication push, the deadlock issue arises, and the job fails.
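If the reports can tolerate reading in-flight data, one low-effort option is to keep the report queries from taking the shared locks that collide with the replication push; names below are invented:

    SELECT r.ReportCol1, r.ReportCol2
    FROM   dbo.ReplicatedTable AS r WITH (NOLOCK);   -- dirty reads

The cleaner fix is snapshot isolation on the reporting database, which gives consistent reads without the dirty-read caveat.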
I have an exe application that runs every half hour and performs some database operations. At the same time, the web application uses the same DB when users access the site. At that point I get the deadlock exception below. What should I do to avoid this? Please guide me.
System.Data.SqlClient.SqlException: Transaction (Process ID 88) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlDataReader.HasMoreRows()
   at System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout)
   at System.Data.SqlClient.SqlDataReader.Read()
   at System.Data.Common.DataAdapter.FillLoadDataRow(SchemaMapping mapping)
   at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)
   at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
   at System.Data.Common.LoadAdapter.FillFromReader(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
   at System.Data.DataTable.Load(IDataReader reader, LoadOption loadOption, FillErrorEventHandler errorHandler)
   at System.Data.DataTable.Load(IDataReader reader)
   at B.DAL.LRQ.Select(String sLoanID)
   at QN.Program.QNo()
I have a table that contains a Text column, and it is accessed by every page of the site. It is getting frequent deadlocks. Can anybody suggest a good solution, please?
Let's say I have table A and table B, and another table AB where each row in AB references a row in A and a row in B. Furthermore, I set both relationships to cascade on delete.
Then one user deletes a row from A, which cascades to two rows in AB. But another user has deleted a row in B, which is also trying to delete the same two rows in AB. The first transaction deletes one of the AB rows, the second deletes the other, and then neither transaction can get the other row in AB to delete because it's locked. So this is a deadlock! Is it really that easy to get a deadlock?
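Yes. A timeline sketch of the two sessions (key values invented) shows why, since the two cascades can visit the shared AB rows in opposite orders:

    -- Session 1
    BEGIN TRAN;
    DELETE FROM A WHERE id = 1;   -- cascade starts locking the two AB rows

    -- Session 2, concurrently
    BEGIN TRAN;
    DELETE FROM B WHERE id = 2;   -- its cascade hits the same two AB rows in
                                  -- the opposite order: each session now waits
                                  -- on a row the other holds, and SQL Server
                                  -- kills one as the deadlock victim

Consistent lock ordering, short transactions, or a retry-on-1205 loop are the standard mitigations.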
Hi, I'm taking an Excel spreadsheet (that could have around 30k rows) and processing it in SSIS. I essentially have a flag in one of the spreadsheet columns that indicates whether the record is already in the database or not.
I'm splitting the data with a Conditional Split on this column, using an OLE DB Destination (fast load) to perform the inserts and an OLE DB Command firing a stored procedure to perform the updates. Both the OLE DB Destination and the stored procedure hit the same table, and the two operations can be executing at the same time since they both sit directly after the Conditional Split, so the OLE DB Destination is set NOT to lock the table.
This seemed to work OK until recently. I've just added 2 triggers to the table in question, which I don't want to fire 30,000 times during the import. As the OLE DB Destination is set to use fast load, it doesn't fire the triggers, which is what I want. The update stored procedure disables the trigger before performing its update and re-enables it when finished. Currently this means that if you only had updates, the trigger could be enabled/disabled 30,000 times. That sounds kind of bad, but I don't really know whether it carries a large overhead or not.
Now, when an import contains both updates and inserts, the whole process locks up. From looking at Activity Monitor, it seems the INSERT gets suspended.
Do I have a fundamental problem with how I've structured the data flow, or am I just being really stupid in enabling/disabling a trigger that many times, which is probably what is causing the problem?
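One restructuring that removes both the per-row trigger toggling and the insert/update collision, sketched with invented names: point the update branch of the Conditional Split at a staging table instead of an OLE DB Command, and run one set-based update in an Execute SQL Task after the data flow finishes:

    -- Executes once per import rather than once per row
    ALTER TABLE dbo.Target DISABLE TRIGGER ALL;

    UPDATE t
    SET    t.SomeCol = s.SomeCol
    FROM   dbo.Target AS t
    JOIN   dbo.Staging_Updates AS s ON s.BusinessKey = t.BusinessKey;

    ALTER TABLE dbo.Target ENABLE TRIGGER ALL;

The inserts and updates then never run concurrently against the same table, so nothing is left to suspend.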
Hi, I need to lock a database (prevent users from connecting) in order to update it. I already know how to kick everyone out by their spid, but I can't figure out how to prevent them from reconnecting. Thanks!
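Assuming the updater connects as dbo or sysadmin, restricted (or single) user mode both kicks the current spids and blocks ordinary logins from reconnecting, in one statement; the database name is a placeholder:

    ALTER DATABASE MyDb SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;
    -- ... perform the update ...
    ALTER DATABASE MyDb SET MULTI_USER;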
I do not insert/update/delete on the view directly.
For every insert/update on table A or table B, the values should be inserted/updated through the view accordingly, and that insert/update on the view should invoke the trigger.
But I am unable to get this trigger to fire on the view when an insert/update occurs at the base table level.
The trigger works only when an operation is performed directly against the view.
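That is by design: an INSTEAD OF trigger on a view (the only kind a view can have) fires only for DML issued against the view itself, never when the underlying tables are modified directly. To react to base-table changes, the trigger has to live on the base tables; a minimal sketch (the body is a placeholder):

    CREATE TRIGGER trg_A_AfterChange ON dbo.A
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- propagate or audit the change using the inserted/deleted pseudo-tables
    END;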
For doing this I used a simple Conditional Split task after table 1.
First approach. Output Name: Null Data; Condition: ISNULL(Col2)
I routed the Null Data output to Table3 and the default output to Table2.
Strangely, I see some data in Table3 which is not NULL; that is, Table3 holds rows whose Col2 is not null. I have no clue why it would do that.
Second approach. Output Name: Data; Condition: !(ISNULL(Col2))
I routed the Data output to Table2 and the default output to Table3.
Again, I see some non-NULL data in Table3: rows whose Col2 is not null.
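Hard to diagnose without the data, but a frequent cause of null-bucket surprises is that the pipeline value at the split differs from what the source table appears to hold: empty or whitespace-only strings are not NULL to ISNULL, and a data viewer placed just before the split will show what Col2 actually contains at that point. If blanks should count as missing, a condition like this (assuming Col2 is a string column) treats both the same:

    ISNULL(Col2) ? TRUE : (LEN(LTRIM(RTRIM(Col2))) == 0)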
I have an OLE DB source and destination in a data flow task. I would like to send the records where customer_key is null to an error table, and the rest of the records to a destination table (Customers), using a Conditional Split. How can I do this?
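A sketch of the split, using the SSIS expression language (the output name is arbitrary):

    Output Name: NullKeyRows     Condition: ISNULL(customer_key)

Wire NullKeyRows to the error-table destination and leave the split's default output feeding the Customers destination; every row failing the condition falls through to the default automatically.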
I am using a Conditional Split to evaluate the condition below. It should only send records to my SQL Server database if the PatientZip matches one of the eight below and the PatientCity is not Wichita Falls (you wouldn't believe how badly this is misspelled sometimes). I checked the output table and it has all records for the zip codes below, both matching and non-matching the city name of Wichita Falls. The table should not have entries for records with the city name of Wichita Falls. Do I have the code correct, or could I have missed something?
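Without seeing the exact expression this is only a guess, but the most common mistake with this shape of condition is precedence: in the SSIS expression language && binds tighter than ||, so without parentheses the city test applies only to the last ZIP in the OR chain, and every record matching the other seven passes regardless of city. The ORed ZIP tests need grouping, along these lines (the ZIP values are invented placeholders; all eight go inside the parentheses):

    ([PatientZip] == "76301" || [PatientZip] == "76302")
        && UPPER([PatientCity]) != "WICHITA FALLS"

The UPPER call also guards against case differences, though it will not catch the misspellings; those need a fuzzier rule.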
I want to use a Conditional Split on a column that contains either a 0 or a 1 in order to direct the workflow. In my Conditional Split I have ([colnam]) == 1, but the transformation still grabs all the data in the table whether the value is 1 or 0. What could I be doing wrong?
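The snippet itself is syntactically fine, so two things are worth checking. First, the comparison must match the column's pipeline type; a hedged sketch:

    [colnam] == "1"           (if colnam arrives as a string column)
    (DT_I4)[colnam] == 1      (or cast the string and compare numerically)

Second, only the named output of a Conditional Split filters rows; any path attached to the split's default output still receives every non-matching row, which looks exactly like "grabbing all the data".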
Hi, in my Excel file I have the columns Col1 and Col2. I want to send records to a SQL Server table only if both Col1 and Col2 are not null. For this I am using a Conditional Split with the expression (!ISNULL([Col1])) && (!ISNULL([Col2])) and sending the result to the SQL Server table. But I am not getting any records in the table, even though rows with Col1 and Col2 not null do exist in the Excel file. Is there anything wrong with my expression?
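The expression itself is valid. A common culprit with Excel sources is the Jet driver returning NULL for every value in a column of mixed types, which would make this condition false for all rows even though the worksheet looks populated. Forcing such columns to text with IMEX=1 in the connection string often helps (the file path is a placeholder):

    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\files\input.xls;
    Extended Properties="Excel 8.0;HDR=YES;IMEX=1"

A data viewer on the path between the source and the Conditional Split will confirm whether the values really arrive as NULLs.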
I have an ID column that sometimes contains all digits and sometimes contains other characters. I would like the records with all digits (0-9) to continue downstream in my data flow, and the records that contain characters other than digits to be logged to a table.
This sounds like a job for the Conditional Split transformation, but I don't see a way to easily test for a numeric value. For example, I would like to use something like ISNUMERIC([MyIDField]) for testing the values in my Conditional Split, but I don't see a way to do this.
Do I have to create a Derived Column transformation prior to my Conditional Split that populates a "numeric" flag column for each record, and then test that Derived Column in my Conditional Split? That seems like more work than I would expect for something as simple as testing for a numeric value...
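There is indeed no ISNUMERIC in the SSIS expression language. One pattern that avoids both a script and a separate flag column is to let a cast do the test: add a Derived Column that casts the field to an integer and set that component's error disposition to redirect, so rows that fail the cast flow to the error path (the column name is from the post; the output name is invented):

    Derived Column: NumericID = (DT_I8)[MyIDField]
    Error Output:   Redirect row  -->  send the redirected rows to the logging table

Rows that survive the cast continue downstream, with a usable numeric column as a bonus.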
I have a zip code column that contains xxxxx-xxxx. I want to take the last 4 digits and put them into a different column. I tried SUBSTRING ("ZIP", 6, 4) in a Conditional Split, but it returns an error. Any ideas on how I can split it?
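Two fixes here. Column names in SSIS expressions go in brackets, not quotes ("ZIP" with quotes is a three-character string literal), and SUBSTRING is 1-based, so in xxxxx-xxxx the plus-4 starts at position 7, not 6. Also, since this creates a value rather than routing rows, it belongs in a Derived Column, not a Conditional Split:

    Zip4 = SUBSTRING([ZIP], 7, 4)

(Zip4 is an invented name for the new derived column.)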
I have set up an SSIS package that takes a fixed-width flat file as input and stores it in two SQL Server tables in the same database. The flat file contains two types of records; let's call them Type1 and Type2. The two record types are formatted differently, and the first character determines which type a record is. I used a Conditional Split to send Type1 records down one path and Type2 down the other. On each path I use a Derived Column task to build all the fields and then store to the table with an OLE DB destination. I put any errors that occur (like truncation) into an error table by setting the "Redirect row" disposition instead of "Fail component". This all works well and I have no issues.
The dilemma is as follows. Type1 is essentially a parent record and Type2 is a child; they share a primary key / foreign key relationship field. When processing a Type1 row fails, I want the associated Type2 rows to be redirected to the error table as well, instead of being inserted.
If anyone has suggestions on how this could be done, reference articles, etc... please let me know.
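Since the two paths run independently inside the data flow, one hedged approach is to let both loads finish and then, in an Execute SQL Task after the data flow, sweep out the children whose parent landed in the error table. Table and key names below are invented:

    -- Copy Type2 rows whose parent Type1 row failed, then remove them
    INSERT INTO dbo.Type2_Error
    SELECT c.*
    FROM   dbo.Type2 AS c
    WHERE  EXISTS (SELECT 1 FROM dbo.Type1_Error AS e WHERE e.GroupKey = c.GroupKey);

    DELETE c
    FROM   dbo.Type2 AS c
    WHERE  EXISTS (SELECT 1 FROM dbo.Type1_Error AS e WHERE e.GroupKey = c.GroupKey);

Doing it inside the data flow instead would need a Lookup against the error rows, which is awkward because both streams are still in flight.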
I have been transferring data from a text file to SQL databases.
I have a Conditional Split where I check whether the address has changed for a particular person. If yes, I direct the row to the update branch; otherwise it goes to the default output, which means no change.
When I connect the error output of the Conditional Split to a database or a Union All, a couple of rows are directed to the error output, but I don't understand the reason. How can I find out why they are directed to the error output?
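Rows land on a Conditional Split's error output when the expression itself fails to evaluate for that row, and in an address-changed comparison the usual cause is a NULL on one side. A null-guarded version of the condition, with invented column names:

    (ISNULL([OldAddress]) ? "" : [OldAddress]) != (ISNULL([NewAddress]) ? "" : [NewAddress])

Attaching a data viewer to the error path also helps: the error output carries ErrorCode and ErrorColumn columns that identify the failing column.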
Good Day All, I have an interesting situation that I cannot believe is unique. I have a flat file (ragged right) that contains 5 different record types. Each row in the file identifies the record type in the first character. The layout is something like this:
File Header
Group Header (Contains group id number)
Data Item (Contains group id number)
. . .
Group Footer (DOES NOT CONTAIN GROUP ID NUMBER)
Group Header (Contains group id number)
Data Item (Contains group id number)
. . .
Group Footer (DOES NOT CONTAIN GROUP ID NUMBER)
File Footer
Now, I only want to extract data for ONE of the aforementioned groups; however, I need the group footer as well because it contains control totals for the group. The real problem is that the footers do not carry the group id number they belong to. It is a completely positional thing. Silly, yes, I know, but this particular file layout is an industry standard.
I thought the Conditional Split would be the way to go. Unfortunately, it seems the Conditional Split wants to split the entire data set before passing the results downstream, rather than processing a single row at a time and passing each row downstream before the next one (blocking versus streaming, I think it's called). I could do it in a single god-awful script, but I would rather not have to code the entire thing.
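If a script is to be avoided, a hedged alternative is to stage every line with its file position (loading through a single flat-file source preserves order, and an IDENTITY column on the staging table can capture it) and resolve the positional relationship in T-SQL afterwards. All names below are invented:

    -- Tag every staged line with the group id of the nearest header above it
    SELECT  s.LineNo,
            s.RecordType,
            s.RawText,
            (SELECT TOP (1) h.GroupID
             FROM   dbo.StagedLines AS h
             WHERE  h.RecordType = 'GH'   -- group header rows
               AND  h.LineNo <= s.LineNo
             ORDER BY h.LineNo DESC) AS GroupID
    FROM    dbo.StagedLines AS s;

Filtering that result to the one wanted GroupID keeps the group's footer too, since the footer inherits the id of the header above it.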
If I have 2 input fields in my Conditional Split, how can I compare them for equality? For example, I have 2 IDs and want to see if they match for a PK/FK relationship; if they match, those rows should go to the conditional's output stream. Do I literally write the expression below, or is that not right? Is there a LIKE statement I should be using instead?
[IDName] == [IDName]
Basically I have 2 OLE DB sources coming in, 2 sets of columns, and the table behind each OLE DB source has an ID field that determines the PK/FK relationship. Out of all the records flowing from the OLE DB sources to the Conditional Split, I want to output each set of records where the IDs are equal; after the Conditional Split I could then take those records and write them to another text file, and the process would repeat for the next records in the pipe where the IDs are the same...
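A Conditional Split sees one row at a time from a single input, so it cannot match rows across two OLE DB sources; that is a join. In SSIS it is a Merge Join transformation (inner join on the ID columns, with both inputs sorted on them); the T-SQL equivalent of the desired output, with placeholder names, is:

    SELECT a.*, b.*
    FROM   dbo.TableA AS a
    JOIN   dbo.TableB AS b
      ON   b.IDName = a.IDName;   -- only the rows where the IDs match

The [IDName] == [IDName] expression, by contrast, compares a column to itself within one row and is always true.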