Parallel Transactions Error
Aug 21, 2007
Hi, all.
I'm writing test cases in C# for a few methods that make changes in the database. To prevent the changes from being kept, I used BeginTransaction/Rollback, and everything was fine. But this doesn't work if the tested method has BeginTransaction/Rollback code itself. An error appears in NUnit: System.InvalidOperationException : SqlConnection does not support parallel transactions.
Does anybody know how to solve the problem?
******************************* test case below ********************
[Test]
[Category("Access")]
public void UpdateGroupUserTable(/*Int32 userID, GroupList dataset*/)
{
    r.BeginTransaction();
    try
    {
        r.UpdateGroupUserTable(userId, gl);
    }
    finally
    {
        r.Rollback();
    }
}
******************************* tested method below ********************
public void UpdateGroupUserTable(Int32 userID, GroupList dataset)
{
    CheckDisposed();
    UpdateCommand.CommandType = CommandType.Text;
    UpdateCommand.Parameters.Clear();
    UpdateCommand.Parameters.Add("@USERID", userID);
    BeginTransaction();
    try
    {
        Adapter.Update(dataset.acl_group);
        Commit();
    }
    catch (SqlException)
    {
        Rollback();
        throw;
    }
}
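One way around this that is often suggested (a minimal sketch only, not the poster's actual data-access class) is to make the repository's transaction methods re-entrant, so only the outermost BeginTransaction opens a real SqlTransaction, inner Commit calls are no-ops, and the test's final Rollback undoes everything - then the test wrapper and the tested method never try to open two transactions on the same SqlConnection. Member names below are illustrative; any commands the repository executes still need their Transaction property set to the active SqlTransaction.

using System.Data.SqlClient;

public class Repository
{
    private readonly SqlConnection connection;
    private SqlTransaction transaction;
    private int depth;   // number of BeginTransaction calls currently open

    public Repository(SqlConnection connection)
    {
        this.connection = connection;
    }

    public void BeginTransaction()
    {
        if (depth == 0)
            transaction = connection.BeginTransaction();   // only the outermost call opens a real transaction
        depth++;
    }

    public void Commit()
    {
        depth--;
        if (depth == 0)
        {
            transaction.Commit();   // inner "commits" do nothing; the outermost caller decides
            transaction = null;
        }
    }

    public void Rollback()
    {
        if (transaction != null)
        {
            transaction.Rollback();   // undo all work regardless of nesting level
            transaction = null;
        }
        depth = 0;
    }
}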
View 5 Replies
Jul 17, 2006
This is my code in VB.NET with a SQL transaction. I am using InsertCommand and UpdateCommand to execute the SQL queries in consecutive transactions, as follows. How can I achieve parallel transactions in SQL?
------------------ start of code ---------------------
Try
    bID = Convert.ToInt32(Session("batchID"))
    strSQL = ""
    strSQL = "Insert into sessiondelayed (batchid,ActualEndDate) values (" & bID & ",'" & Format(d1, "MM/dd/yyyy") & "')"
    sqlCon = New System.Data.SqlClient.SqlConnection(ConfigurationSettings.AppSettings("conString"))
    Dim s1 As String = sqlCon.ConnectionString.ToString
    sqlDaEndDate = New System.Data.SqlClient.SqlDataAdapter("Select * from sessiondelayed", sqlCon)
    dsEndDate = New DataSet
    sqlDaEndDate.Fill(dsEndDate)
    dbcommandBuilder = New SqlClient.SqlCommandBuilder(sqlDaEndDate)
    'sqlCon.BeginTransaction()
    'sqlDaEndDate.InsertCommand.Transaction = tr
    If sqlCon.State = ConnectionState.Closed Then
        sqlCon.Open()
    End If
    sqlDaEndDate.InsertCommand = sqlCon.CreateCommand()
    tr = sqlCon.BeginTransaction(IsolationLevel.ReadCommitted)
    sqlDaEndDate.InsertCommand.Connection = sqlCon
    sqlDaEndDate.InsertCommand.Transaction = tr
    sqlDaEndDate.InsertCommand.CommandText = strSQL
    sqlDaEndDate.InsertCommand.CommandType = CommandType.Text
    sqlDaEndDate.InsertCommand.ExecuteNonQuery()
    tr.Commit()
    sqlDaEndDate.Update(dsEndDate)
    sqlCon.Close()
Catch es As Exception
    Dim s2 As String = es.Message
    If sqlCon.State = ConnectionState.Closed Then
        sqlCon.Open()
    End If
    strSQL = " update SessionDelayed set ActualEndDate= '" & Format(d1, "MM/dd/yyyy") & "' where batchid=" & bID & ""
    sqlDaEndDate.UpdateCommand = sqlCon.CreateCommand()
    tr1 = sqlCon.BeginTransaction(IsolationLevel.ReadCommitted)
    sqlDaEndDate.UpdateCommand.Connection = sqlCon
    sqlDaEndDate.UpdateCommand.Transaction = tr1
    sqlDaEndDate.UpdateCommand.CommandText = strSQL
    sqlDaEndDate.UpdateCommand.CommandType = CommandType.Text
    sqlDaEndDate.UpdateCommand.ExecuteNonQuery()
    tr1.Commit()
    sqlDaEndDate.Update(dsEndDate)
    sqlCon.Close()
End Try
'------------------- End ----------------
View 1 Replies
Mar 27, 2006
Hello,
I have a problem when I try to begin a transaction in SQL. This is the message:
"SqlConnection does not support parallel transactions."
But the connection doesn't have another transaction. What can I do?
Thanks
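For reference, a minimal repro of that message (the connection string is just a placeholder): it is raised the moment BeginTransaction is called while an earlier transaction on the same SqlConnection is still open, which can also happen indirectly when some other code path has already started a transaction on a shared connection.

using System.Data.SqlClient;

class ParallelTransactionRepro
{
    static void Main()
    {
        using (SqlConnection con = new SqlConnection("server=.;database=master;integrated security=sspi"))
        {
            con.Open();
            SqlTransaction t1 = con.BeginTransaction();
            // Second call while t1 is still open throws InvalidOperationException:
            // "SqlConnection does not support parallel transactions."
            SqlTransaction t2 = con.BeginTransaction();
        }
    }
}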
View 7 Replies
Mar 29, 2006
I have three SQL tasks executing in parallel in an Integration Services package.
+-B-+
A-+-C-+-E
+-D-+
It starts with task A; then B, C, and D all execute in parallel; and finally task E runs after BCD are done.
B, C, and D are all Execute SQL tasks, all with the same connection manager. Here is their code:
B) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_B
C) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_C
D) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_D
Each one is setting a binary value to a package variable (using Result Set settings) based on the count of records from different tables.
This works with no problems when I run it against one server (development). But when I switch to the production server, tasks B and D both fail. I've checked to make sure all of the temp tables exist in the database for that connection manager and that all three tasks have the same connection manager - all is okay.
Here's the trickier part. When I'm still pointing to the production server and I run these tasks individually, they are all successful. It is only when they are attempting to run in parallel that they fail.
Here is the Output error:
Error: 0xC002F210 at Process Med?, Execute SQL Task: Executing the query "SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process FROM temp_B" failed with the following error: "Invalid object name 'temp_B'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
What could be causing this? I'm at a loss.
View 5 Replies
Nov 16, 2005
I cannot find any information on this error. It occurs on packages that are writing to the same table using a SQL Server destination. I suppose it would be a good exercise in error handling, but I'd rather avoid it.
View 4 Replies
Mar 9, 2001
I have a DTS package which contains:
- 1 "Execute SQL" task
- 1 "Connection" object
The provider for the connection is SQLOLEDB ("Microsoft OLEDB Provider for SQL Server"), and this works just fine with transactions in ADO etc.
The MDAC version is 2.5, and the SQL Server Client Utils version is 7.0 SP2.
The package properties are set as follows:
- "Use transactions" is on
- "Auto commmit transaction" is on
- "Read committed" isolation level
The execute SQL task has the following workflow properties:
- "Join transaction if present" is on
- "Commit transaction on successful..." is on
- "Rollback transaction on failure" is on
- ("Execute on main package thread" just in case)
When I execute the package (from the designer, or cmd line), I get the following (most informative) error:
"Error Source: Microsoft Data Transformation Services (DTS) Package
Error Description: Unspecified error"
If I change the package properties to remove "Use transactions", it executes just fine.
Any ideas?
TIA
View 2 Replies
Apr 17, 2008
So I have a few data flow tasks that all need to execute successfully before I commit the changes. So I use a few nested sequence containers with the parent set to Required and all of the children set to Supported. This should work, right? Instead I get "The AcquireConnection method call to the connection manager "Databasenamehere" failed with error code 0xC0202009." If I switch the parent back to Supported or NotSupported it will execute fine.
HELP!
Thanks in advance!
View 2 Replies
May 22, 2005
Hi there,
I have decided to move all my transaction handling from asp.net to stored procedures in a SQL Server 2000 database. I know the database is capable of rolling back the transactions just like myTransaction.Rollback() in asp.net. But what about exceptions? In asp.net, I am used to doing the following:
Try
    'execute commands
    myTransaction.Commit()
Catch ex As Exception
    Response.Write(ex.Message)
    myTransaction.Rollback()
End Try

Will the database inform me of any exceptions (and their messages)? Do I need to put anything explicit in my stored procedure other than ROLLBACK TRANSACTION?
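For illustration, here is roughly what I expect the calling side to look like once the procedure owns the transaction - a sketch only, assuming a hypothetical procedure dbo.usp_DoWork that does its own BEGIN TRAN / ROLLBACK and reports failures with RAISERROR. Errors raised at severity 11-16 come back to ADO.NET as a SqlException whose Message carries the RAISERROR text:

using System;
using System.Data;
using System.Data.SqlClient;

public static class SprocCaller
{
    public static void Run(string connectionString)
    {
        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("dbo.usp_DoWork", con))   // hypothetical procedure name
        {
            cmd.CommandType = CommandType.StoredProcedure;
            con.Open();
            try
            {
                cmd.ExecuteNonQuery();
            }
            catch (SqlException ex)
            {
                // ex.Message contains the text passed to RAISERROR inside the procedure
                Console.WriteLine(ex.Message);   // or Response.Write in a page
            }
        }
    }
}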
Any help is greatly appreciated
View 3 Replies
Apr 2, 2007
Greetings,
We are using peer-to-peer transactional replication. When using the default agent profile, the replicator stops processing commands when it encounters an error. It appears to keep trying to apply the command that caused the error and essentially hangs on that command. We have changed our agent profiles to "Continue on data consistency errors" - not the kind of option to give you a warm fuzzy feeling but we just don't know what else to do.
Where are the commands that cause errors stored within the replication environment? Can they be individually viewed, edited, and/or deleted? We would like some alternative to "continue on data consistency errors."
Thanks,
BCB
View 5 Replies
Nov 14, 2007
Hi, I am working on VS2005 with SQL Server 2000. I have used the TransactionScope class. Example reference: http://www.c-sharpcorner.com/UploadFile/mosessaur/TransactionScope04142006103850AM/TransactionScope.aspx The code is given below.

using System.Transactions;

protected void Page_Load(object sender, EventArgs e)
{
    System.Transactions.TransactionOptions transOption = new System.Transactions.TransactionOptions();
    transOption.IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted;
    transOption.Timeout = new TimeSpan(0, 2, 0);
    using (System.Transactions.TransactionScope tranScope = new System.Transactions.TransactionScope(TransactionScopeOption.Required, transOption))
    {
        using (SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["nwConnString"].ConnectionString))
        {
            int i;
            con.Open();
            SqlCommand cmd = new SqlCommand("update products set unitsinstock=100 where productid=1", con);
            i = cmd.ExecuteNonQuery();
            if (i > 0)
            {
                using (SqlConnection conInner = new SqlConnection(ConfigurationManager.ConnectionStrings["pubsConnString"].ConnectionString))
                {
                    conInner.Open();
                    SqlCommand cmdInner = new SqlCommand("update Salary set sal=5000 where eno=1", conInner);
                    i = cmdInner.ExecuteNonQuery();
                    if (i > 0)
                    {
                        tranScope.Complete(); // this statement commits the executed query.
                    }
                }
            }
        }
        // Dispose TransactionScope object, to commit or rollback transaction.
    }
}

It gives an error like:
"The partner transaction manager has disabled its support for remote/network transactions. (Exception from HRESULT: 0x8004D025)"
The databases I have used are the Northwind and pubs databases, which come with SQL Server 2000 by default. Kindly let me know how to proceed further. Thanks in advance, Arun.
View 1 Replies
Mar 22, 2007
I am having a problem getting error rows to redirect between an OLE DB Source and an OLE DB Destination when using transactions. Each time I turn on the transaction control I get an error stating:
"[OLE DB Destination [48]] Error: The input "OLE DB Destination Input" (61) cannot be set to redirect on error using a connection in a transaction."
I get the above Error when using MSDTC. I have the data flow inside of a Sequence Container with the transaction option set to REQUIRED and the Isolation Level set to Serializable. I have tried all the Isolation levels.
I have the error rows piped off to a separate OLE DB Destination. I have also tried using native SQL transactions with Execute SQL tasks to BEGIN, COMMIT or ROLLBACK the transaction. This does not work either. It looks like it works properly when the data flow is successful, but using Profiler I can see SSIS opens up a separate process for the BEGIN and then another one with the Data Flow task. When I intentionally fail the Data Flow, the Rollback always fails. I made sure I had RetainSameConnection turned on for the connection I was using.
I am speculating that the Data Flow does not know what to roll back: the actual rows that succeeded or the error rows that are getting piped off.
I am fairly stumped on this one so any help is appreciated.
Thanks
View 4 Replies
Nov 14, 2006
I'm receiving the below error when trying to implement Execute SQL Task.
"The ROLLBACK TRANSACTION request has no corresponding BEGIN TRANSACTION." This error also happens on COMMIT as well and there is a preceding Execute SQL Task with BEGIN TRANSACTION tranname WITH MARK 'tran'
I know I can change the transaction option property from "supported" to "required" however I want to mark the transaction. I was copying the way Import/Export Wizard does it however I'm unable to figure out why it works and why mine doesn't work.
Anyone know of the reason?
View 1 Replies
Apr 25, 2001
Hi All,
I am working on SQL Server 7.0. Every weekend we go for reindexing of some tables. I want to know if it is possible to run the reindexing of the tables in parallel so that I can save time.
Our database is 80GB in size and one table is around 22GB. Rebuilding the index on this table takes a lot of time and we are unable to index the other tables.
Any solutions/suggestions are mostly appreciated.
Regards,
Mitra
View 2 Replies
Jan 4, 2008
hi,
we currently use the Database Maintenance Plan to do backups for our SQL Server 2000 databases.
I notice that the databases are backed up one after the other.
I would like to know how to run the backups in parallel rather than sequentially.
To do this, is there any dependency on the number of CPUs?
regards
Andrew
View 2 Replies
Oct 22, 2006
I just thought I would share this info:
I created the package to download 4 ftp files at once.
I set MaxConcurrentExecutables for the SSIS package to 4, so in BIDS it downloads 4 files at the same time.
However, when I started the job I noticed that only 3 files were being downloaded at a time (looking at the temp files in the download directory).
Solution:
Sure enough, after digging around for a while - in the step properties for the SSIS package there is an Execution tab, and "Maximum Concurrent Executables" was -1 (which for some reason defaults to 3 concurrent processes, even on our dual-CPU server) - so after changing that value to 4 - tada - all 4 files download in parallel.
View 1 Replies
Oct 30, 2007
Is there any way to run a stored procedure in parallel to another one? i.e. I have a stored procedure that sends an email. I then scan a table and send any unsent emails. I do not want the second part to slow the response to the user.
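If the caller is .NET code, one possible pattern (a sketch only; the procedure name dbo.usp_SendUnsentEmails is made up) is to start the slow scan-and-send procedure asynchronously with ADO.NET's BeginExecuteNonQuery, so the response to the user is not held up; the connection string must include "Asynchronous Processing=true" for this to work. Another common route is to have the first procedure start a SQL Agent job, which also returns immediately.

using System;
using System.Data;
using System.Data.SqlClient;

public static class EmailKickoff
{
    public static void SendPendingInBackground(string connectionString)
    {
        // Asynchronous Processing=true is required for Begin/EndExecuteNonQuery.
        SqlConnection con = new SqlConnection(connectionString + ";Asynchronous Processing=true");
        SqlCommand cmd = new SqlCommand("dbo.usp_SendUnsentEmails", con);   // hypothetical procedure name
        cmd.CommandType = CommandType.StoredProcedure;
        con.Open();

        // Start the call and return immediately; clean up when the server finishes.
        cmd.BeginExecuteNonQuery(delegate(IAsyncResult ar)
        {
            try { cmd.EndExecuteNonQuery(ar); }
            finally { con.Dispose(); }
        }, null);
    }
}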
View 2 Replies
Dec 6, 2001
I'd like to run 4 stored procedures in parallel on 4 different processors on our server.
Does anyone know the syntax to do that? (Could not find it on the web so far).
Anything else I need to do in addition to the specific syntax (i.e. any specific properties to set in Enterprise Manager)?
Your help is greatly appreciated!
Lana
View 1 Replies
Apr 13, 1999
One of the developers here wants to run a PB script consisting of 11 inserts to 11 respective tables at the same time. This is on a development box.
Currently, we are running on MSSQL 6.5, SP4. We have very general parameters set, nothing special. We have max async io set to the default.
The inserts run serially now. What I need to do is run them in parallel.
My questions are:
1) How do I do this -- run in parallel?
2) What parameters do I need to set?
3)What else would I need to do?
Any information you can provide will be greatly appreciated. Thanks.
David Spaisman
View 1 Replies
Mar 19, 2014
Assuming I have a line, is there a function I can call to create a parallel line at a given distance away? i.e., with the below I would want to draw a line parallel to the one output.
DECLARE @line geometry = 'LINESTRING(1 1, 2 2, 3 3, 4 4)'
SELECT @line
View 9 Replies
Jun 14, 2006
Hi, I need to place the results of two different queries in the same result table, parallel to each other. So if the result of the first query is

1 12
2 34
3 45

and the second query is

1 34
2 44
3 98

the results should be displayed as

1 12 34
2 34 44
3 45 98

If a union is done for both the queries, we get the results in rows. How can the above be done? Thanks in advance, vivekian
View 7 Replies
May 13, 2008
I have several packages within sequence containers inside one main dtsx package with a checkpoint configuration, and when I run it some succeed and some don't. The problem is that when I rerun it, the checkpoint doesn't seem to work, because some of the successful packages are rerun as well (and not skipped as they should be). In other words, the process does not begin at the point of failure.
It seems that packages that finish after the failure point (and succeed) are not registered in the checkpoint file, so when I rerun the main package these succeeded packages are rerun too.
Any ideas??
View 1 Replies
May 11, 2007
Hello,
I have created a project to do de-dupification of addresses.
I understand that Fuzzy Grouping will take less time if it has a smaller data volume to process.
My source feed file is sometimes huge. So I am splitting the input into multiple branches based on
the first letter of the city. There are 7 branches in the process.
Source File Feed
|
Split data into 7 groups
|
------------------------------------------------------------------------------------------------------------------------------------------
| | | | | | |
FzGrpg FzGrpg FzGrpg FzGrpg FzGrpg FzGrpg FzGrpg
| | | | | | |
Split Split Split Split Split Split Split
| | | | | | |
------------- -------------- -------------- -------------- -------------- -------------- --------------
| | | | | | | | | | | | | |
<- - - - - - - Write the Canonicals and Dupes from each of these splits into database - - - - - - - - ->
When I designed this I was hoping that each of the Fuzzy Grouping tasks would execute in parallel.
But in reality they are processing one after the other.
Is there any way to make them execute in parallel?
Appreciate your help.
Thanks
KM
View 12 Replies
Feb 24, 2015
We have a monitoring tool that finds the query using the most execution time across all sessions on the server.
I located it, and found it is a data flow task in an SSIS package.
It exports data from a table which holds a large amount of data to another database.
I know it only executes one time, but I see in the monitoring tool that it executes 4 times.
I am wondering: is this because SSIS is doing parallel execution automatically?
We use all default settings, and the server has 4 physical CPUs.
Also, it says the query is slow because it has a wait called PREEMPTIVE_OS_WAITFORSINGLEOBJECT.
I'm not sure what that means.
View 0 Replies
Nov 6, 2015
I'm trying to work out what the ideal CPU count and Max Degree of Parallelism are for a 3rd-party database server. The server has 12 CPUs, 32GB RAM, and all database sizes add up to < 30GB, so they can all fit in memory (I tried to force this by doing a select * from every table). On certain payroll days, the CPU gets maxed out to 100% for a few seconds.
MAXDOP was originally set to the default 0. We later changed it to 8 based on several 'best-practices' articles. However the vendor suggests to change it to 1 (no parallelism), while others suggest changing it to 4, so that one run-away query doesn't hog most of the CPUs.
I'd like to find out how many CPUs are actually being used by queries. There is a Degree of Parallelism event (see URL....). The BinaryData column says:
0x00000000 indicates a serial plan running in serial.
0x01000000 indicates a parallel plan running in serial.
>= 0x02000000 indicates a parallel plan running in parallel.
What does "parallel plan running in serial" mean?
I see a lot of 0x01000000's and a few 0x08000000's in my trace. How can I determine whether one query is hogging CPUs and whether reducing MAXDOP to 4 will work?
View 4 Replies
Jun 2, 2007
I have another post here regarding SQL 2005 running a query 50% slower than on 2000. It was discovered that 2005 runs the query in series whereas 2000 runs it in parallel.
Even with "Cost Threshold For Parallelism" set to a default value – 0, 2005 still executes my query in series. Does anyone know how to force a query to run in parallel in SQL 2005. I specifically want to set it at the database level.
View 12 Replies
Jul 23, 2005
I have a SQL 7 db with a union query (view), and I'm getting the error, "The query processor could not start the necessary thread resources for parallel query execution." This union query has been in place for about two years now with no problems until just now, though I haven't changed anything. Also, I have a local copy of the database on my machine, and the query runs fine. As noted, I haven't changed anything in the query, nor in the SQL settings. There is a network administrator, so it's possible that he may have changed a setting, but I don't know what. The query is reproduced below. Any ideas as to what's going on would be appreciated.
Neil

Main query:

SELECT Tmp.INVCUST, Tmp.SDNBR, Tmp.SDBOOK, Tmp.SDIVCLN,
       Tmp.SDPAID, Tmp.SDPRICE, Tmp.SDCOPIES, Tmp.Location,
       INVTRY.AUTHILL1, Tmp.INVDATE, INVTRY.SaleSrc, INVTRY.HoldInit
FROM (SELECT INVDATE, INVCUST, SDNBR, SDBOOK, SDIVCLN,
             SDPAID, SDPRICE, SDCOPIES, 'P' AS Location
      FROM vwInvoiceDet
      UNION ALL
      SELECT INVDATE, INVCUST, SDNBR, SDBOOK, SDIVCLN,
             SDPAID, SDPRICE, SDCOPIES, 'N' AS Location
      FROM vwInvoiceDetN
      UNION ALL
      SELECT INVDATE, INVCUST, SDNBR, SDBOOK, SDIVCLN,
             SDPAID, SDPRICE, SDCOPIES, 'M' AS Location
      FROM vwInvoiceDetM) Tmp
INNER JOIN dbo.INVTRY ON Tmp.SDBOOK = dbo.INVTRY.[Index]

vwInvoiceDet:

SELECT tabInvoice.INVDATE, tabInvoice.INVCUST,
       SALEDET.SDNBR, SALEDET.SDBOOK, SALEDET.SDINVNUM,
       SALEDET.SDPRICE, SALEDET.SDPAID, SALEDET.SDCOPIES,
       SALEDET.SDIVCLN, tabInvoice.INVNBR, SALEDET.SDID
FROM dbo.tabInvoice
INNER JOIN dbo.SALEDET ON dbo.tabInvoice.INVNBR = dbo.SALEDET.SDNBR

(vwInvoiceDetN and vwInvoiceDetM are similar to vwInvoiceDet.)
View 2 Replies
Dec 9, 2005
SQL Server 2000 SP3A. Last week one of our processes started issuing or suffering deadlock detected errors every 15 minutes or so.
I have read several articles at MS on the subject. I set a couple of startup parameters related to producing deadlock detection information and ran SQL Profiler. I found the SQL statements being issued by the deadlocked statements. In every deadlock the same UPDATE statement appears, however the data values being searched on are different. The best I can tell from trying to query the actual data, each update hits only one or very few rows. No indexed column is updated, so the indexes should not be the source of conflict.
Looking at the query I noticed that the query does not have an available index and Query Analyzer shows that the full table scan is being done in parallel.
My question: Does SQL Server change or modify its locking rules when queries are converted to be run using parallel processing? If so, do you have a reference?
Here are the deadlock entries posted to the error log:

SPID=167
ResType:LockOwner Stype:'OR' Mode: IX SPID:63 ECID:0 Ec:(0x65971510) Value:0x3c577e60 Cost:(0/0)
Input Buf: Language Event: UPDATE Station_Upload set Station_Accept_Status = 'ACC', HeadStatus = 'ACC', LastProcessedSta = '110', HeadPartType = '1' WHERE Part_Serial_No = 'SCH1119323' AND Station = 'H110'

SPID=63
ResType:LockOwner Stype:'OR' Mode: IX SPID:167 ECID:0 Ec:(0x65801510) Value:0x3c27d060 Cost:(0/0)
Input Buf: Language Event: UPDATE Station_Upload set Station_Accept_Status = 'ACC', HeadStatus = 'ACC', LastProcessedSta = '70', HeadPartType = '1' WHERE Part_Serial_No = 'SCH1119060' AND Station = 'H070'

I have suggested adding an index to support the query.
Any ideas?
Thanks -- Mark D Powell --
View 4 Replies
Jun 29, 2007
I have seen a number of posts regarding parallel development of SSIS packages and need some further information.
So far we have been developing SSIS packages along a single development stream and therefore have managed to avoid parallel development of our packages.
However, due to business pressures we will soon have multiple project streams running in parallel, and therefore multiple code branches; as part of that we will definitely need to redevelop the same SSIS packages in parallel. Judging from your post above and some testing we have done, this is going to be a nightmare, as we cannot merge the code. We can put processes in place to try to mitigate this, but there are bound to be issues along the way.
Do you know whether this problem is going to be fixed? We are now using Team Foundation Server, but presumably the merge algorithm used is the same as or similar to that of VSS and therefore very flaky?
However, not only are we having problems with the merging of the XML files, but we also use script tasks within the packages which are precompiled; as the DTSX files contain the binary objects associated with the script source code, if two developers change the same script task in isolated branches the binary is not recompiled, because the merge software does not recognise this object.
Do you know whether these issues have been identified and are going to be fixed to be in line with the rest of Microsoft's configuration management principles of parallel development?
Many thanks.
View 7 Replies
Apr 16, 2007
Hi,
We have a process that builds our data warehouse.
The processes execute SPs in serial order.
Each SP builds separate table.
Each table is build destructive (truncate and Insert Into).
I've tried to change the configuration to run 4 SPs in parallel via SSIS to shorten the update time.
I've noticed two declines in performance:
1. Each SP's execution time is around 50% higher in the parallel execution than in the serial execution. The CPU utilization is the same.
2. On each parallel execution we see a decline in performance of around 5-10% compared to the previous parallel execution.
Do you have any directions to inquire?
Btw - we have Itanium 64-bit x8 with 32GB memory
Thanks,
Assaf
View 3 Replies
Jun 26, 2007
Hello,
I have done a search and have read some of the posts, but am left more confused than before. I am fairly new to SSIS. Here is my situation and what i am trying to accomplish.
I have a package that has a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails, when I re-run the package it will run the last task as well as all the other tasks. What I am looking to accomplish is that when the package is re-run, only the SQL tasks that failed are run, not the previously successful tasks.
I think the best way would be to disable tasks on successful completion, where each task writes its name to a temp table, but I am skeptical.
Can anyone point me in a direction to help me accomplish what I am looking for, please?
Thanks in Advance.
View 4 Replies
May 7, 2006
Hey guys. I'll try to keep this as short as possible.
Was testing out some thresholds with a simple service broker application and ran into something interesting that I'm trying to find a workaround to. I setup a simple service on a single instance, and am transmitting messages to/from the same instance. I assigned a procedure to be activated on the queue, with a max of 20 parallel activations.
The procedure originally was reading messages off the queue by following a process with major steps outlined here:
Start loop until no more messages to process
Call get conv. group with 3000 timeout into variable x
If group retrieved, receive top 1 from queue where conv_group = x
If no group, exit
Process message(s) received (simply insert some of the msg data into user table, then waitfor 1 second to simulate doing some work) for the given group
Loop to get next group and messages
So, with this type of configuration, I sent off a few thousand messages to the queue and monitored the queue count, user table count, and activated tasks count. To my surprise, despite the fact that there were thousands of messages in the queue waiting to be processed, only a single task was processing data, pulling about 1 per second.
I tried over and over with different values in the waitfor, and no additional tasks got spun up until I had a waitfor of 6 seconds or more...then only about 3-4 would get spun up at any time in parallel.
Given the way activation occurs, I thought I'd try things without the call to get conv group, and instead simply use a receive without a where clause...so, I rewrote the activation procedure to skip the call to GET CONV GROUP, and simply perform a receive without any where clause. I also removed the waitfor delay and simply left it to process messages as fast as it could...
This time, I sent over the same number of messages and checked the same counts, this time noticing that within 20 seconds or so, all 20 parallel instances of the activation procedure were processing messages off the queue, and doing so very quickly. Same occured if I put the waitfor delay back in...
The problem is that this type of coding doesn't allow for processing related messages, even though it seems to be much better at allowing parallel activations. I tried to rewrite again using 2 types of receive statements (the first one without a where clause and the next one with a where clause on the conv. group from the first receive, after processing the message) to see if that would allow for better parallel activation; however, that worked about the same as using the GET CONV GROUP call.
So, if I have an application that mostly does not make use of grouping and want to ensure optimal parallel activation when needed, however also handle the times when related messages arrive and need to be processed together, what can I do? Any ideas?
I have test code scripts that can reproduce this behavior if needed. Thanks up front,
View 11 Replies
Jul 25, 2007
I'm working on an SSIS package whose job it is to push data from X tables to X servers. Right now, I have it implemented with nested ForEach loops: a ForEach server loop inside of a ForEach table loop. It works great, but it will take too long because it is serial. I need a way for the inner operation to take place in parallel. In SQL 2000, the way I'm doing this is by spawning a SQLAgent job per server. I could always do this here as well, but I'm interested in finding out if there is a way to do this with a single package.
I'm okay with processing the tables serially, but for each table I'd like to be able to parallelize the push to the servers. I can't do it with multiple data flows because there are N servers being pushed to.
Hope that makes sense.
View 1 Replies