Running UPDATE Statements In Parallel On The Same Table
Feb 23, 2007
Hi all,
Does an UPDATE statement lock the entire table or just the rows that will be affected by the UPDATE?
I ask because -
Can I run UPDATE statements in parallel on the same table and the same column? The reason for doing this is that the table is a large fact table; I plan to execute the same UPDATE statement against different time sections of the data to expedite the processing.
If the UPDATE statement locks the entire table, then I cannot run the UPDATEs in parallel. If the UPDATE statement just locks the rows that will be affected, then maybe I can, because the affected rows will be different for each UPDATE.
Let me know.
Thanks,
Vivek
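For reference, here is a minimal sketch of what such time-sliced updates might look like; the table, column and date ranges are made up for illustration. SQL Server normally takes row or page locks for an UPDATE, but it may escalate to a table lock once a single statement touches enough rows, so very large parallel updates can still block each other; updating each slice in smaller batches is a common way around that.

-- Session 1: updates only the first half of the year (hypothetical names)
UPDATE dbo.FactSales
SET    Amount = Amount * 1.1
WHERE  SaleDate >= '2006-01-01' AND SaleDate < '2006-07-01';

-- Session 2, run from a separate connection at the same time: a disjoint date range
UPDATE dbo.FactSales
SET    Amount = Amount * 1.1
WHERE  SaleDate >= '2006-07-01' AND SaleDate < '2007-01-01';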
View 1 Replies
Aug 5, 2015
I have a table with 8 columns and I need to update data in multiple columns on this table. The table contains 1 million records. A single update was taking too long, so I broke it into multiple update statements and am running those update statements in parallel; each update statement updates a different column.
This approach works, but I am getting a deadlock error.
Transaction (Process ID 65) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
I tried with various lock hints but no success.
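The "communication buffer" wording in that message usually points to an intra-query parallelism (exchange) deadlock rather than a plain row-lock conflict. A minimal sketch of an alternative, with hypothetical table, key and column names: one UPDATE sets all of the columns for a slice of rows and the work is committed in key-range batches, with a MAXDOP hint to take parallelism out of the picture.

DECLARE @batch INT = 50000, @minId BIGINT = 0, @maxId BIGINT;
SELECT @maxId = MAX(Id) FROM dbo.BigTable;

WHILE @minId < @maxId
BEGIN
    UPDATE dbo.BigTable
    SET    Col1 = UPPER(Col1),            -- all column changes in one statement
           Col2 = LTRIM(RTRIM(Col2)),
           Col3 = ISNULL(Col3, 0)
    WHERE  Id > @minId AND Id <= @minId + @batch
    OPTION (MAXDOP 1);                    -- avoids the exchange/parallelism deadlock

    SET @minId = @minId + @batch;
END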
View 4 Replies
View Related
Jul 20, 2005
Hi
I'm using the SQL 2000 table variable to hold 2 different fact sets. I'm declaring the variable @CurrentTable and inserting into it using a SELECT statement with no problems. I'm leaving certain of the columns null in order to later update them with the PK.
The problem is in the UPDATE syntax I'm using:
UPDATE @CurrentTable SET ManagerTitle = (select mgrs.pos_title from mgrs) where mgrs.pos_num = @CurrentTable.MgrPosNum
It is insisting I declare the @CurrentTable variable when I try to use it in the where clause. Is it simply out-of-scope or am I really doing something foolish?
Andrew
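The usual fix is to give the table variable an alias in a FROM clause and join through the alias; referencing @CurrentTable.MgrPosNum directly is what triggers the "must declare the variable" error. A sketch using the names from the post (the join assumes mgrs.pos_num matches MgrPosNum):

UPDATE ct
SET    ct.ManagerTitle = m.pos_title
FROM   @CurrentTable AS ct
JOIN   mgrs AS m
       ON m.pos_num = ct.MgrPosNum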
View 2 Replies
View Related
Apr 13, 1999
One of the developers here wants to run a PB script consisting of 11
inserts to 11 respective tables at the same time. This is on a development
box.
Currently, we are running on MSSQL 6.5, SP4. We have very general parameters set, nothing special. We have max async io set to the default.
The inserts run serially now. What I need to do is run them in parallel.
My questions are:
1) How do I do this -- run in parallel?
2) What parameters do I need to set?
3)What else would I need to do?
Any information you can provide will be greatly appreciated. Thanks.
David Spaisman
View 1 Replies
View Related
Feb 19, 2008
Scenario
I have a package that does some file transformation (Text, XML, and Excel) work based on a variable value. This package is called by a parent package, where I am calling this package in parallel through a Script Task. So there are three parallel Script Tasks, and all variables are local to the Script Task.
In the Script Task I am assigning values to the child package variables using the following code.
' Load the child package and copy every parent variable with the same name into it
' (appLaunch is assumed to be the Dts runtime Application instance declared elsewhere in the task).
packToLaunch = appLaunch.LoadPackage("SSISPackage.dtsx", Nothing)
For iVarCtr = 0 To Dts.Variables.Count - 1
    VarName = Dts.Variables(iVarCtr).Name
    packToLaunch.Variables(VarName).Value = Dts.Variables(VarName).Value
Next
Dts.TaskResult = packToLaunch.Execute()
Problem1
Out of the three packages, some are failing with an error saying the variables were not found in the collection.
To read a variable in the child package I am adding it to the ReadOnlyVariables list of the Script Task.
In some places I am using the following code:
object objValue = null;
Microsoft.SqlServer.Dts.Runtime.Variables variables = null;
if (varDispenser.Contains(variableName))
{
    varDispenser.LockOneForRead(variableName, ref variables);
    objValue = variables[variableName].Value;
    if (variables.Locked) variables.Unlock();
}
return objValue;
Problem2
I have created a hash table in the child package and stored it in a package variable through a Script Task.
When trying to access the variable through a custom task, in some of the threads it's returning System.Object, not System.HashTable.
Both problems work fine on a single-processor machine; the errors only start on a multi-processor 64-bit machine.
Need Urgent Help!!!!!
View 2 Replies
View Related
Aug 23, 2006
Hi
I have a SQL Server 2000 instance running on a Windows Server 2003 box with 4 processors. SQL Server is configured to use all 4 processors, and use all available processors for parallelism.
I have created a simple DTS package which has 2 "execute external process" tasks with no precedence constraints between them. There are no connections required or defined for the two tasks (sequential
processing is forced on tasks sharing connections). The DTS package
properties have the "limit the number of tasks to execute in parallel"
set to 4.
However, despite the above configuration, the two steps are never executed in parallel, but always sequentially.
Does anyone have any ideas as to why these tasks are not being executed in parallel?
Any suggestions welcome.
Thanks.
View 2 Replies
View Related
Feb 13, 2006
In my application code I am trying to invoke multiple threads, where each thread loads an instance of the same SSIS package, initializes the package variables with different values and executes the instances in parallel. In each thread - after the package execution has completed successfully - I read that instance's SSIS package variables to get result information from that instance's run.
When I load the same package in different threads using the LoadFromSqlServer() method:
- does the code create multiple instances of the SSIS package and load a distinct instance in each of the threads?
- Will the Package Execution ID be different for the different instances?
- Are the package-level variables instance-safe?
View 2 Replies
View Related
Oct 8, 2015
I am trying to find out how many jobs were run in parallel on my server in an interval of time. For example: between 1:00 AM and 2:00 AM there were at most 66 jobs running in parallel and at least 4. I am not sure if I can find this out from a system view or whether I need to play with the sysjobhistory view.
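A rough sketch of one way to get at this from msdb job history, assuming SQL Server 2005 or later (for the msdb.dbo.agent_datetime helper): take the step 0 outcome rows, derive start and end times, and count the jobs whose execution window overlaps the interval. This counts jobs that overlap the hour at some point; finding the true maximum running at one instant needs a finer-grained (e.g. per-minute) expansion of the same data.

DECLARE @from DATETIME = '2015-10-08 01:00', @to DATETIME = '2015-10-08 02:00';

SELECT COUNT(DISTINCT h.job_id) AS jobs_overlapping_interval
FROM   msdb.dbo.sysjobhistory AS h
CROSS APPLY (SELECT msdb.dbo.agent_datetime(h.run_date, h.run_time) AS start_time) AS s
CROSS APPLY (SELECT DATEADD(SECOND,
                      (h.run_duration / 10000) * 3600     -- run_duration is stored as HHMMSS
                    + (h.run_duration / 100 % 100) * 60
                    + (h.run_duration % 100), s.start_time) AS end_time) AS e
WHERE  h.step_id = 0                                        -- job outcome rows only
  AND  s.start_time < @to
  AND  e.end_time  > @from;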
View 10 Replies
View Related
Feb 2, 2007
I'm a bit confused about how SSIS handles concurrent package runs.
Let's say I'm running this package and I've got 3 variables set in it:
VARA
VARB
VARC
and by default they are all set to 0
if I run
dtexec /File "C:ControlRoom.dtsx" /SET PackageVersion_Builder.Variables[VARA].Value;1
dtexec /File "C:ControlRoom.dtsx" /SET PackageVersion_Builder.Variables[VARB].Value;1
dtexec /File "C:ControlRoom.dtsx" /SET PackageVersion_Builder.Variables[VARC].Value;1
I'm expecting to run 3 isolated versions of the package, with:
first version
VARA=1
VARB=0
VARC=0
second version
VARA=0
VARB=1
VARC=0
third version
VARA=0
VARB=0
VARC=1
but it doesn't seem to be doing that (the maxconcurrent variable is set to 40 to be on the safe side).
When I run, I get:
first version
VARA=1
VARB=0
VARC=0
second version
VARA=1
VARB=1
VARC=0
third version
VARA=0
VARB=1
VARC=1
any ideas?
Thanks
View 2 Replies
View Related
Mar 27, 2007
I have a system of SSIS packages in which several packages perform the same lookup on the same table. E.g., I have PackageA, PackageB and PackageC all doing a lookup on TableA. All of these packages are spawned by the same PackageD and run frequently. In some cases, there is an issue with concurrency on these lookups. I get the following exception:
"
The ProcessInput method on component "LKP Lookup SecurityID" (6658) failed with error code 0xC004702C. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
"
The hex code of this exception corresponds to the following description: "DTS_E_BUFFERNOTLOCKED. This buffer is not locked and cannot be manipulated." That's as much as I could find on this.
My suspicion is that the SSIS engine somehow figures that the lookup in these distinct packages is the same one and builds a shared version of the lookup table in memory. Then there is some sort of multi-threading issue in accessing this shared memory, which leads to the exception above.
Has anyone experienced this? Can someone shed some light on this?
Thanks a lot
-Alex
View 15 Replies
View Related
Dec 18, 2013
How can I update a table to have a running cumulative sum?
For Example:
Update Table1 Set Cumulative_Sum = Row_Sum + Previous_Row_Sum
It should look something like this:
Row_Sum Cumulative_Sum
1 1
2 3
3 6
4 10
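A sketch of one way to do it, assuming SQL Server 2012 or later and an ordering column (Id here is hypothetical); the windowed SUM is computed in a CTE and written back through it. On older versions the same thing can be done with a correlated subquery, just more slowly.

;WITH cte AS
(
    SELECT Cumulative_Sum,
           SUM(Row_Sum) OVER (ORDER BY Id ROWS UNBOUNDED PRECEDING) AS running_total
    FROM   dbo.Table1
)
UPDATE cte
SET    Cumulative_Sum = running_total;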
View 6 Replies
View Related
Jan 11, 2008
Hi,
I have stumbled on a problem with running a large number of SSIS packages in parallel, using the "dtexec" command from inside an SQL Server job.
I've described the environment, the goal and the problem below. Sorry if it's a bit too long, but I tried to be as clear as possible.
The environment:
Windows Server 2003 Enterprise x64 Edition, SQL Server 2005 32bit Enterprise Edition SP2.
The goal:
We have a large number of text files that we're loading into a staging area of a data warehouse (based on SQL Server 2k5, as said above).
We have one "main" SSIS package that takes a list of files to load from an XML file, loops through that list and for each file in the list starts an SSIS package by using the "dtexec" command. The command is started asynchronously by using the system.diagnostics.process.start() method. This means that a large number of SSIS packages are started in parallel. These packages perform the actual loading (with BULK insert).
I have successfully run the loading process from the command prompt (using the dtexec command to start the main package) a number of times.
In order to move the loading to a production environment and schedule it, we have set up an SQL Server Agent job. We've created a proxy user with the necessary rights (the same user that runs the job from the command prompt) and created the SQL Agent job (there is one step of type "cmdexec" that runs the "main" SSIS package with the "dtexec" command).
If the input XML file for the main package contains a small number of files (for example 10), the SQL Server Agent job works fine - the SSIS packages are started in parallel and they finish work successfully.
The problem:
When the number of the concurrently started SSIS packages gets too big, the packages start to fail. When a large number of SSIS package executions are already taking place, the new dtexec commands fail after 0 seconds of work with an empty error message.
Please bear in mind that the same loading still works perfectly from command prompt on the same server with the same user. It only fails when run from the SQL Agent Job.
I've tried to understand the limit, when do the packages start to fail, and I believe that the threshold is 80 parallel executions (I understand that it might not be desirable to start so many SSIS packages at once, but I'd like to do it despite this).
Additional information:
The dtexec utility provides an error message where the package variables are shown and the fact that the package ran 0 seconds, but the "Message" is empty ("Message: ").
Turning the logging on in all the packages does not provide an error message either, just a lot of run-time information.
The try-catch block around the process.start() script in the main package's script task also does not reveal any errors.
I've increased the "max worker threads" number for the cmdexec subsystem in the msdb.dbo.syssubsystems table to a safely high number and restarted the SQL Server, but this had no effect either.
The request:
Can anyone give ideas what could be the cause of the problem?
If you have any ideas about how to further debug the problem, they are also very welcome.
Thanks in advance!
Eero Ringmäe
View 2 Replies
View Related
Aug 24, 2007
I have a file with about 600,000 lines of insert statement given to me by a developer.
There are basically 5 inserts into different tables for each Product or Item. Each insert MUST run in the order specified and must complete before the next insert runs.
To complicate things further, there are triggers that fire on every insert, and each trigger must complete its transaction before the next insert starts.
What is the best and most efficient way to run the inserts, while ensuring that each statement completes before the next?
I tried using serialization, but it appears some of the transactions overlap and generate errors.
I tried disabling the triggers but ran into other problems.
At this point I am tempted to run each statement manually.
Thanks all for your help and input.
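One thing worth noting: statements sent on a single connection already run strictly one after another, and a trigger completes as part of the statement that fired it, so running the file in one session (for example with sqlcmd) preserves the order by itself. A sketch, with hypothetical tables, of wrapping each item's inserts so an error stops the run instead of letting later statements pile up on a failed one:

SET XACT_ABORT ON;   -- any run-time error rolls back the transaction and aborts the batch
BEGIN TRANSACTION;
    INSERT INTO dbo.Product (ProductId, Name)  VALUES (1001, 'Widget');
    INSERT INTO dbo.Price   (ProductId, Price) VALUES (1001, 9.99);
    INSERT INTO dbo.Stock   (ProductId, Qty)   VALUES (1001, 50);
COMMIT TRANSACTION;
GO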
View 1 Replies
View Related
Nov 5, 2007
My goal is to run a bunch of select statements from different tables in one database and have them all insert into the same columns/table in the new database. Do I need a new data source for each statement, or is there a way to run all the statements in one set, seeing as they all have the same destination? I keep receiving the "SQL statement improperly ended" error when I try.
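If the destination columns line up, one option is a single INSERT ... SELECT with UNION ALL, so only one statement (and one destination) is needed; all names below are hypothetical.

INSERT INTO NewDb.dbo.Target (Code, Description)
SELECT Code,     Description FROM OldDb.dbo.SourceA
UNION ALL
SELECT Code,     Descr       FROM OldDb.dbo.SourceB
UNION ALL
SELECT ItemCode, ItemDesc    FROM OldDb.dbo.SourceC;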
View 5 Replies
View Related
Apr 26, 2004
How to update a column in the table using the data from another column in the same table? Thanks.
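A minimal sketch with hypothetical names; the right-hand side of SET can simply reference another column of the same row.

UPDATE dbo.Customer
SET    BillingEmail = ContactEmail
WHERE  BillingEmail IS NULL;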
View 1 Replies
View Related
Jan 10, 2007
Hi guys! Is there a way to combine these update statements?
Dim update_phase As New SqlCommand("INSERT INTO TE_shounin_zangyou (syain_No,date_kyou,time_kyou) SELECT syain_No,date_kyou,time_kyou FROM TE_zangyou WHERE [syain_No] = @syain_No", cnn)
Dim update_phase2 As New SqlCommand(" UPDATE TE_shounin_zangyou SET " & " phase=2, phase_states2=06,syounin2_sysd=CONVERT(VARCHAR(10),GETDATE(),101) WHERE [syain_No] = @syain_No", cnn)
The same table is updated, so I think it would be better to have just one update statement. But the problem is that the first statement retrieves its values from another table, whereas the update values of the second statement are fixed. Is there a way to combine these two statements? I tried to do so but it does not update. Here's my code...
Dim update_phase As New SqlCommand("UPDATE TE_shounin_zangyou SET TE_shounin_zangyou.syain_No=TE_zangyou.syain_No, TE_shounin_zangyou.date_kyou=TE_zangyou.date_kyou, TE_shounin_zangyou.time_kyou=TE_zangyou.time_kyou FROM TE_zangyou WHERE TE_zangyou.syain_No = TE_shounin_zangyou.syain_No", cnn)
Please help me. Thanks.
Audrey
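One hedged sketch of combining them: because the second statement only writes fixed values to the rows that were just copied, those literals can be supplied directly in the INSERT ... SELECT, assuming phase, phase_states2 and syounin2_sysd are all columns of TE_shounin_zangyou.

INSERT INTO TE_shounin_zangyou
       (syain_No, date_kyou, time_kyou, phase, phase_states2, syounin2_sysd)
SELECT  syain_No, date_kyou, time_kyou, 2, 06, CONVERT(VARCHAR(10), GETDATE(), 101)
FROM    TE_zangyou
WHERE   [syain_No] = @syain_No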
View 1 Replies
View Related
Mar 13, 2008
Hello gang, is it possible to combine SQL update statements? Something like:
UPDATE table_name SET column_name = new_value WHERE column_name = some_value AND SET column_name = new_value WHERE column_name = some_other_value
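Not with two SET/WHERE pairs in one statement, but a single UPDATE with a CASE expression is the usual way to get the same effect; the placeholders below are the ones from the question (with a second new value added for illustration).

UPDATE table_name
SET    column_name = CASE column_name
                          WHEN some_value       THEN new_value
                          WHEN some_other_value THEN other_new_value
                     END
WHERE  column_name IN (some_value, some_other_value);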
View 4 Replies
View Related
May 14, 2002
Hello,
I am creating my company's database and I have a small problem that must be solved.
I have a pictures table:
PicturesTable
-------------
ProductID int ForeignKey
Picture nvarchar(30)
...
(A product can have many pictures & the ProductID is unique for any product,
but not for the table of pictures).
What I want to do is to somehow do a procedure to:
1) Check if any images (for a productID) exists in the table
2) if they do not exist then add the appropriate images into the table
3) if the images exist, then update the images with the new one that I have.
What I thought was to just delete all the images from the table for the
specific product:
DELETE FROM PicturesTable
WHERE ProductID = '10-11'
and then add the appropriate images:
INSERT INTO PicturesTable (ProductID, Picture)
VALUES ('10-11', 'Dir1/Pic1.gif')
INSERT INTO PicturesTable (ProductID, Picture)
VALUES ('10-11', 'Dir1/Pic2.gif')
INSERT INTO PicturesTable (ProductID, Picture)
VALUES ('10-11', 'Dir1/Pic3.gif')
but I do not like this idea a lot, because if a user tried to read the pictures for that product at the same time I was deleting them, s/he would get nothing. Is there any other way that I can do it, please?
I would appreciate it if someone answers me.
Yours, sincerely
Efthymios Kalyviotis
ekalyviotis@comerclub.gr
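A sketch of one way to avoid readers seeing an empty picture list: wrap the DELETE and the INSERTs in a single transaction. Under the default READ COMMITTED isolation a concurrent reader then waits for the transaction to commit instead of seeing the product with no pictures.

BEGIN TRANSACTION;
    DELETE FROM PicturesTable WHERE ProductID = '10-11';
    INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic1.gif');
    INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic2.gif');
    INSERT INTO PicturesTable (ProductID, Picture) VALUES ('10-11', 'Dir1/Pic3.gif');
COMMIT TRANSACTION;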
View 1 Replies
View Related
Mar 29, 2004
My manager is interested in knowing if there is a way to update our website's SQL database using a method with excel, similar to importing.
The person who was previously in my position had imported a few hundred new products into the database with an excel spreadsheet.
Now, we would like to make updates such as price changes or similar adjustments to a number of the products in the database. We could use a web interface, but ours requires us to find each product individually and it takes too much time. I told him that it would probably be necessary to write an SQL statement to update the tables, but we're also interested in maintaining the integrity of the database and are worried about losing data due to a typo. Is it possible to export the db contents to an Excel file, make changes, and then merge those changes into the existing database? I have tried and failed, so I am wondering if any experienced users could help me out.
Also, is there some kind of phpmyadmin for MS SQL? A free, open source alternative would be best.
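One common pattern, sketched here with entirely hypothetical names: export the products to Excel, make the price changes there, import the edited sheet into a staging table (via the import wizard or DTS), and then apply only the rows that actually changed with one joined UPDATE. Keeping the staging step separate also gives a chance to eyeball the data before it touches the live table.

UPDATE p
SET    p.Price = s.Price
FROM   dbo.Products AS p
JOIN   dbo.ProductStaging AS s ON s.ProductCode = p.ProductCode
WHERE  p.Price <> s.Price;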
View 8 Replies
View Related
Jun 5, 2007
Hi,
I have this update statement that works; it updates totalamount to the calculated amount, but I want to update totalamount only when it is not equal to calcamt. I have tried many things but in vain. Can someone please help me?
How do I use CASE statements to update only when totalamount != calcamt?
update c
set totalamount= calcamt
from Prepay c
JOIN
(select sum(amt) as calcamt, payid
from pay
group by payid
)b ON b.payid= c.payid
where c.cust_no='somenum'
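A CASE expression is not actually needed here; adding the comparison to the WHERE clause limits the update to rows where the two amounts differ (if totalamount can be NULL, also allow c.totalamount IS NULL). A sketch based on the statement above:

update c
set totalamount = calcamt
from Prepay c
JOIN
(select sum(amt) as calcamt, payid
 from pay
 group by payid
) b ON b.payid = c.payid
where c.cust_no = 'somenum'
  and c.totalamount <> b.calcamt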
View 4 Replies
View Related
Sep 19, 2006
HI Gurus
I have written one CTE (common table expression) and am trying to use the same CTE with three separate UPDATE statements, which gives me an error saying "Invalid object name". (It works fine when I use it with one UPDATE statement - any one of the three.)
Isn't it possible to use one CTE with multiple UPDATE statements?
Waiting for your reply....
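A CTE is only in scope for the single statement that immediately follows the WITH clause, which is why the second and third UPDATEs report an invalid object name. Either repeat the WITH definition in front of each UPDATE, or materialize the result once and reuse it; a sketch of the latter with hypothetical tables:

SELECT OrderId, SUM(Amount) AS OrderTotal
INTO   #OrderTotals
FROM   dbo.OrderLines
GROUP  BY OrderId;

UPDATE o SET o.Total    = t.OrderTotal FROM dbo.Orders   o JOIN #OrderTotals t ON t.OrderId = o.OrderId;
UPDATE i SET i.Subtotal = t.OrderTotal FROM dbo.Invoices i JOIN #OrderTotals t ON t.OrderId = i.OrderId;

DROP TABLE #OrderTotals;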
View 4 Replies
View Related
Oct 31, 2007
What is the single SQL statement to trim the blank space on either side of the data?
Ex.
Table1 has Name as column.
I have records with blank space on both sides of the Name field.
With one query I want to correct the data (trim the leading and trailing spaces).
How?
SQL Server 2005 SP2.
Thank you,
Smith
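A minimal sketch using the table and column from the post: LTRIM and RTRIM together strip the leading and trailing spaces in one statement. DATALENGTH is used in the WHERE clause because an ordinary string comparison ignores trailing spaces and would skip rows that only have padding at the end.

UPDATE Table1
SET    Name = LTRIM(RTRIM(Name))
WHERE  DATALENGTH(Name) <> DATALENGTH(LTRIM(RTRIM(Name)));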
View 1 Replies
View Related
Nov 2, 2004
I am almost sure I can update various columns in one select/case type statement, but am having problems working out the syntax.
I have a table with transactions, with tran types as the key. In this example, types = A, B, C, D.
In this first example I am inserting the sum of QTY into the t_A column based on tran type = 'A'.
Can I perform a subquery/case to update with the same where clause but for types B, C and D? I also have to insert each sum value for specific lot numbers.
Create table #t_reconcile(
t_lot_number int not null,
t_A float,
t_B float,
t_C float,
t_D float)
insert #t_reconcile (t_lot_number, t_A)
select t.lot_number, sum(t.qty)
from i, t
where i._id = t.event_id
  and i.transaction_type = 'A'
group by t.lot_number
order by t.lot_number
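Yes - one way, sketched here with the tables and columns from the post, is conditional aggregation: a single INSERT ... SELECT fills all four columns at once by wrapping the quantity in a CASE per transaction type.

insert #t_reconcile (t_lot_number, t_A, t_B, t_C, t_D)
select t.lot_number,
       sum(case when i.transaction_type = 'A' then t.qty else 0 end),
       sum(case when i.transaction_type = 'B' then t.qty else 0 end),
       sum(case when i.transaction_type = 'C' then t.qty else 0 end),
       sum(case when i.transaction_type = 'D' then t.qty else 0 end)
from i, t
where i._id = t.event_id
group by t.lot_number
order by t.lot_number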
View 3 Replies
View Related
Mar 18, 2008
I'm working on a query which involves changing the case of a field from mixed case to all lower case. The field exists in multiple tables, so to do this I have multiple update statements. Is there a way to make this more efficient?
See below for example:
update InvestBroker
set BrokerID = lower(BrokerID)
update InvestFill
set BrokerID = lower(BrokerID)
update InvestBrokerAccount
set BrokerID = lower(BrokerID)
update InvestFIXBroker
set BrokerID = lower(BrokerID)
update InvestUploadBrokerFilter
set BrokerID = lower(BrokerID)
update InvestSettleInstructions
set BrokerID = lower(BrokerID)
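The per-table statements are already about as efficient as this gets (each table needs its own UPDATE), but if more tables contain BrokerID than the six listed, the statements can be generated from the catalog rather than maintained by hand; a sketch, whose output should be reviewed before being executed:

SELECT 'UPDATE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) +
       ' SET BrokerID = LOWER(BrokerID);'
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  COLUMN_NAME = 'BrokerID';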
View 2 Replies
View Related
Jun 22, 2015
I get the error:
(0 row(s) affected)
Msg 208, Level 16, State 1, Line 41
Invalid object name 'X_SET_PREOP'.
This is for the following code segment; I am trying to do 2 updates with just one WITH block.
Create table #temp( MPOG_CASE_ID uniqueidentifier, lab_name varchar(100), lab_date datetime, lab_value decimal(19,2) );
with X_SET_PREOP as
(
SELECT
xx= ROW_NUMBER() OVER ( PARTITION BY lab.MPOG_Case_ID, lab.lab_name ORDER BY lab.lab_date DESC ),
lab.MPOG_Case_ID,
lab.lab_name,
lab.lab_value,lab.lab_date
FROM
MPOG_Research..ACRC_427_lab_data lab
[code]....
View 7 Replies
View Related
Jul 20, 2005
Hi,
I am using MS SQL Server 7.0 SP2 on Windows 2000 Server SP4. I have one-to-many tables (TABLE_HEAD and TABLE_DETAILS) which I am going to update by using a stored procedure with UPDATE statements.
But somehow, ONCE IN A WHILE, when executing the stored procedure with about 1000 rows updated, I lose 10-20 records from TABLE_HEAD (it seems like 10-20 records were deleted), while all data rows in TABLE_DETAILS were updated correctly (even the details of the lost rows of TABLE_HEAD).
In the update procedure, I update both part of the primary key and other columns, with a WHERE condition.
Please help, I really don't know why this happens.
Thanks in advance
Nipon Wongtrakul
View 3 Replies
View Related
Apr 8, 2008
Our existing DW's ETL was written in a very complex fashion by the previous team. They use DTS package lookups to read a row in the source SQL Server database and see if that row exists in the target SQL Server database. If the row does not exist, they use ActiveX scripts to INSERT the row in the target SQL Server database. If it exists, they update the row on the target side. How would you do this in SSIS? Apologies if this sounds like a basic question; however, I would have done this via stored procedures or SQL scripts, especially since it involves SQL Servers alone. Appreciate any help.
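In SSIS the usual equivalent is a Lookup transformation whose unmatched rows go to an OLE DB Destination (the inserts) and whose matched rows go to an OLE DB Command or a staging table (the updates). Since both databases are SQL Server, a set-based stored procedure run after staging the source rows is just as valid; a sketch with hypothetical table and column names:

UPDATE t
SET    t.Amount = s.Amount
FROM   dbo.TargetTable  AS t
JOIN   dbo.StagingTable AS s ON s.BusinessKey = t.BusinessKey;

INSERT INTO dbo.TargetTable (BusinessKey, Amount)
SELECT s.BusinessKey, s.Amount
FROM   dbo.StagingTable AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.BusinessKey = s.BusinessKey);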
View 6 Replies
View Related
Aug 17, 2007
Hello. I have a 32-bit SQL 2005 (SP1) server that is my current Production server. I also have a 64-bit SQL 2005 (SP1) server that will become my Production server. I have several SSIS packages that load/refresh data on a nightly basis to a few of my databases from DB2 (MVS). I have the packages set up on the current Production server (32-bit) and all is working well through the Microsoft OLE DB provider for DB2. However, on the 64-bit server, I am experiencing some issues with the SSIS packages failing due to large loads. Loads of tables with 500K or less seem to run without issue (through SQL Agent jobs), but larger table loads are failing.
I do have a linked server set up on the 64-bit server to the 32-bit server, for other processes. And because of this I have lightweight pooling turned off on the 64-bit server (because of distributed querying). Lightweight pooling is turned on on the 32-bit server. Could this be what is causing some of my issue? Since I don't have the lightweight pooling option turned on (on the 64-bit server), am I not getting the proper amount of through-put for my 8 dual core CPU server?
Thanks
Scottye
View 1 Replies
View Related
Jan 31, 2008
I have a SP that has the correct syntax. However when I run my web-app it gives me this error: "Only one expression can be specified in the select list when the subquery is not introduced with EXISTS. "
The procedure takes in three parameters and retrieves 23 values from the DB to display on my form.
Any ideas?
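That error is raised when a subquery that is being used as a single value returns more than one column. A hedged sketch of the pattern that triggers it and one that avoids it (names are hypothetical, since the procedure itself isn't shown):

DECLARE @PersonId INT = 1;

-- Fails: the parenthesised subquery is treated as a scalar but returns two columns.
-- SELECT (SELECT FirstName, LastName FROM dbo.Person WHERE PersonId = @PersonId);

-- Works: join (or use one single-column subquery per value) so the 23 values come back as columns.
SELECT p.FirstName, p.LastName, a.City
FROM   dbo.Person  AS p
JOIN   dbo.Address AS a ON a.PersonId = p.PersonId
WHERE  p.PersonId = @PersonId;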
View 4 Replies
View Related
Mar 28, 2006
I have been posting to the Data Presentation Controls forum for about a month regarding a problem I've been dealing with.
http://forums.asp.net/thread/1223055.aspx
What it boils down to is that on a button click event, I was updating
some records, then re-executing a SELECT statement to get the records
back out and rebind my DataGrids. This was happening too quickly
and the data was not being updated in time before the SELECT was
executed. So my grids would still display "old" data.
How do I get SQL Server to commit the UPDATE before my C# code continues?
View 4 Replies
View Related
Mar 11, 2004
When working from within VB, should I be using INSERT or UPDATE statements, or should I pass the values to a stored proc that does it for me?
thanks
View 14 Replies
View Related
Aug 29, 2007
I searched a bit but didn't get too far in actually solving a case of deadlock in a simple query I have running here. The queries in question are executed under 2 separate transactions (Serializable IsolationLevel) and are shown below. I guess I don't understand how those 2 can deadlock because they are operating on different rows of the table and Serializable should keep them isolated pretty well too. Is it because I'm using the column value inside an update stmt? How should this query be split if that's the case?
This is what the SQL Profiler has to say:
Lock: Deadlock Chain Deadlock Chain SPID = 59
Lock: Deadlock Chain Deadlock Chain SPID = 57
Lock: Deadlock my_user_name
57: UPDATE CreditCard SET Balance = Balance - 200 WHERE (Account = 0 AND CardHolder = 'Foo' AND Balance - 200 >= 0)
59: UPDATE CreditCard SET Balance = Balance - 250 WHERE (Account = 3 AND CardHolder = 'Bar' AND Balance - 250 >= 0)
I also used DBCC TRACEON(1204, 3605, -1) but I don't understand what the SQL log is telling me. Can anyone shed some light on why the above 2 statements sometimes cause the following:
System.Data.SqlClient.SqlException: Transaction (Process ID 57) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async)
at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
I really don't want to retry the update if I don't have to. Table looks like:
Column DataType Length
Account int 4
CardHolder char 64
Balance float 8
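One possible explanation, offered as a guess from the statements alone: if CreditCard has no index on the filter columns, each serializable UPDATE has to scan the whole table and takes range locks well beyond its own row, so the two transactions overlap even though they logically touch different rows. An index that lets each UPDATE seek straight to its own row usually removes that overlap:

CREATE INDEX IX_CreditCard_Account_CardHolder
    ON dbo.CreditCard (Account, CardHolder);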
View 9 Replies
View Related