Delete Statements Without Log Filling Up
Aug 13, 2002
I have a database whose data I am splitting by account number: the odd account numbers go into one database and the even account numbers into the other.
This database is very large, and the problem is that running the delete statements is going to fill up the log files. Can I switch the database to the "Simple" recovery model while I am deleting the data? Will this cause a problem? Can I then switch back to "Full" when I have finished?
Has anyone ever done this, and if so, how did it work? Or, better yet, is it even possible?
Thanks,
Dianne
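A minimal sketch of the switch being asked about, assuming SQL 2000 or later (the database name and backup path are hypothetical). Switching to Simple is possible, but it breaks the log backup chain, so take a full or differential backup after switching back to Full; also note that even under Simple a single huge DELETE is one transaction and can still grow the log, so deleting in batches helps either way:
ALTER DATABASE MyBigDb SET RECOVERY SIMPLE;   -- log space is reused at each checkpoint
-- run the large deletes here, ideally in batches
ALTER DATABASE MyBigDb SET RECOVERY FULL;
BACKUP DATABASE MyBigDb TO DISK = 'E:\Backups\MyBigDb.bak';   -- restart the log backup chain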
View 2 Replies
Mar 11, 2008
I would like to run more than 100 delete statements against a database.
Is it possible to put all of the delete statements into one single stored procedure?
Thanks.
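It is possible; a procedure body can hold any number of DELETE statements. A minimal sketch, with hypothetical table names and filters:
CREATE PROCEDURE dbo.usp_RunAllDeletes
AS
BEGIN
    SET NOCOUNT ON;
    -- each statement commits on its own unless wrapped in an explicit transaction
    DELETE FROM dbo.Table001 WHERE CreatedDate < '20080101';
    DELETE FROM dbo.Table002 WHERE CreatedDate < '20080101';
    -- ... the remaining delete statements go here ...
END
GO
EXEC dbo.usp_RunAllDeletes;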
View 1 Replies
Sep 21, 2007
Hi
I was curious whether it's possible to audit DELETE statements in the MS SQL database. I created a procedure (below), but I didn't find any event associated with DELETE statements.
Any help will be greatly appreciated!
Thanks,
Alla
CREATE proc sp_Turn_Audit_On
as
/*****************************************************/
/* Created by: SQL Profiler */
/* Date: 11/15/2006 05:16:40 PM */
/*****************************************************/
-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
declare @StatusMsg varchar(255)
declare @ServerTraceFile varchar(255)
set @ServerTraceFile = 'E:\Program Files\Microsoft SQL Server\MSSQL\Trace\Audit_Info'
set @maxfilesize = 1024
-- Client side File and Table cannot be scripted
-- Set the events
declare @on bit
set @on = 1
exec @rc = sp_trace_create @TraceID OUTPUT, 0, N'\\hostname\dbauditlog\my_dir', @maxfilesize, NULL
print @TraceID
if (@rc != 0) goto error
exec sp_trace_setevent @TraceID, 14, 1, @on
exec sp_trace_setevent @TraceID, 14, 6, @on
exec sp_trace_setevent @TraceID, 14, 9, @on
exec sp_trace_setevent @TraceID, 14, 10, @on
exec sp_trace_setevent @TraceID, 14, 11, @on
exec sp_trace_setevent @TraceID, 14, 12, @on
exec sp_trace_setevent @TraceID, 14, 13, @on
exec sp_trace_setevent @TraceID, 14, 14, @on
exec sp_trace_setevent @TraceID, 14, 16, @on
exec sp_trace_setevent @TraceID, 14, 17, @on
exec sp_trace_setevent @TraceID, 14, 18, @on
-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint
exec sp_trace_setfilter @TraceID, 10, 0, 7, N'SQL Profiler'
-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1
--SELECT @StatusMsg = 'sp_trace_setstatus' + ' Error - ' + @TraceID
-- display trace id for future references
select TraceID=@TraceID
goto noCursor
error:
select ErrorCode=@rc
noCursor:
return
GO
exec sp_procoption N'sp_Turn_Audit_On', N'startup', N'true'
GO
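Event 14 in the script above is Audit Login, which is why no DELETE activity shows up. A hedged alternative is to capture SQL:StmtCompleted (event 41) with its TextData column and filter the text to DELETE statements; the trace file path below is hypothetical:
declare @rc int, @TraceID int, @on bit, @maxfilesize bigint
set @on = 1
set @maxfilesize = 1024
exec @rc = sp_trace_create @TraceID OUTPUT, 0, N'E:\Traces\Delete_Audit', @maxfilesize, NULL
-- SQL:StmtCompleted (41): TextData (1), LoginName (11), SPID (12), StartTime (14)
exec sp_trace_setevent @TraceID, 41, 1, @on
exec sp_trace_setevent @TraceID, 41, 11, @on
exec sp_trace_setevent @TraceID, 41, 12, @on
exec sp_trace_setevent @TraceID, 41, 14, @on
-- keep only statements whose text looks like a DELETE (comparison operator 6 = LIKE)
exec sp_trace_setfilter @TraceID, 1, 0, 6, N'%DELETE%'
exec sp_trace_setstatus @TraceID, 1
select TraceID = @TraceID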
View 3 Replies
Dec 1, 2006
I'm trying to find information and/or articles on what exactly "Do not replicate DELETE statements" does in transactional replication. How does this affect the publisher, distributor, and subscriber? Does information deleted on the publisher not get replicated to the subscriber? In what scenarios would someone use this option on an article?
Sorry, I'm a noob to replication and SQL 2005...
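As far as I understand it, the option sets the article's DELETE delivery format so that delete commands are simply never delivered for that article: rows deleted at the publisher remain in the subscriber's table, which is why it is often used to build an archive subscriber. A hedged sketch of the equivalent script setting (publication and article names are hypothetical):
EXEC sp_changearticle
    @publication = N'MyPublication',
    @article = N'MyTable',
    @property = N'del_cmd',
    @value = N'NONE',                 -- do not replicate DELETE statements
    @force_invalidate_snapshot = 1;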
View 6 Replies
Dec 10, 2002
Hi, I am using SQL 2000 Enterprise Edition. I have a partitioned view based on 8 tables. My selects and inserts are fine, but when I run a delete on the view based on a query on the partitioned column, I get "Transaction (Process ID 149) was deadlocked and has been chosen as a victim".
I looked at the query plan and it was showing a parallel query on all the underlying tables. So I added OPTION (MAXDOP 1) to use only one processor, and the delete worked fine.
Does anybody know why? Can parallel queries create deadlocks? Are there any known problems with deletes on partitioned views?
Same question for updates; I think I have the same problem with updates.
Any help will be useful.
thanks!!
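For reference, a sketch of the hint that worked here (the view and column names are hypothetical); forcing a serial plan avoids the parallel delete that was deadlocking:
DELETE FROM dbo.PartitionedView
WHERE PartitionKey = 200212
OPTION (MAXDOP 1);
The same hint can be appended to the UPDATE statements if they show the same parallel plan.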
View 1 Replies
Nov 20, 2006
In SQL Server 2000/2005 (not CE) I can use the following T-SQL statement to delete orphaned rows from a table:
DELETE GroupsMembers FROM GroupsMembers LEFT OUTER JOIN Groups ON GroupsMembers.GroupID = Groups.ID WHERE Groups.ID IS NULL
SQL Server CE does not seem to support combining the JOIN statement with the DELETE statement. Is this correct? If yes, is there any alternative statement that could be used to accomplish the same thing?
Gerrit
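One alternative that may work, assuming the SQL Server CE version in use supports subqueries, is to express the anti-join as NOT EXISTS (the table and column names are taken from the statement above):
DELETE FROM GroupsMembers
WHERE NOT EXISTS (SELECT 1 FROM Groups WHERE Groups.ID = GroupsMembers.GroupID);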
View 3 Replies
Apr 21, 2007
Hi, I just want you to know that I am very young in the ASP.NET world, so please bear with me. I have been looking for an answer to my problem, but unfortunately I couldn't find one, so I created a user here on www.asp.net just to make this post.
Before I continue, I just want to apologize if there is another post where this question is already answered.
Please look at this screenshot I just took: http://www.bewarmaronsi.com/Capture.JPG
As you can see, the "INSERT, UPDATE, and DELETE Statements" option is disabled, and that's exactly my problem. I tried with an MS Access database and it works perfectly, but when I use an MS SQL database this field gets disabled for some reason.
The MDF file is located in the App_data folder and is called ASPNETDB.
And when I try to add custom SQL statements, it gives me a syntax error near "=", or something like that. I bought the Total Training Set 1 package and it works perfectly in their examples.
I just want to thank you for reading my post, and I hope you have some useful information for me.
By the way, I'm from Sweden, so you have to excuse me if my English is rusty.
Thanks!
PS: Could it be because I'm running Windows Vista?
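One known cause (also mentioned in a later post here) is that the wizard greys out auto-generation when the selected table or view has no primary key it can use for the generated WHERE clauses. A hedged sketch, with hypothetical names, of adding a key to the table being queried:
ALTER TABLE dbo.MyTable
ADD CONSTRAINT PK_MyTable PRIMARY KEY (MyTableId);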
View 4 Replies
Mar 26, 2008
I have a problem using the SqlDataSource. In VS 2005 I drag and drop the SqlDataSource onto my page, then add a GridView control and bind it to the SqlDataSource control. The problem is that it does not generate the INSERT, UPDATE, and DELETE statements; the dialog box is inactive. The screenshots may help. Please help me in this regard. I also tried it with an Access database but had the same problem. Sorry for my poor English! Thanks in advance.
the screenshot links:
http://img139.imagevenue.com/img.php?image=28285_2_122_937lo.JPG
http://img205.imagevenue.com/img.php?image=27550_1_122_203lo.JPG
View 7 Replies
Oct 25, 2004
Auditors want us to track when Insert, Update and Delete failures occur. Is this possible in SQL 2000?
They also want us to track schema changes. Is this possible?
Thanks, Dave
View 5 Replies
Jun 25, 2015
We are not allowed to write a DELETE statement in a function; it gives the error below.
"Invalid use of a side-effecting operator 'DELETE' within a function."
We will get the same error for truncate/drop statements.
What is the reason behind this error?
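The reason is that user-defined functions must be free of side effects: they can be evaluated per row inside SELECT statements, computed columns, and constraints, so anything that changes database state (DELETE, UPDATE, INSERT, TRUNCATE, DROP) is disallowed inside them. A stored procedure is the intended place for such work; a minimal sketch with hypothetical names:
-- this CREATE fails with "Invalid use of a side-effecting operator 'DELETE' within a function."
CREATE FUNCTION dbo.fn_PurgeOld (@cutoff datetime)
RETURNS int
AS
BEGIN
    DELETE FROM dbo.EventLog WHERE LoggedAt < @cutoff;   -- not allowed in a function
    RETURN 0;
END
GO
-- the same logic is fine as a stored procedure
CREATE PROCEDURE dbo.usp_PurgeOld @cutoff datetime
AS
    DELETE FROM dbo.EventLog WHERE LoggedAt < @cutoff;
GO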
View 8 Replies
Dec 4, 2006
I previously asked "What does 'Do not replicate DELETE statements' do?" to make sure my understanding of the option was correct. After finding out it does what I want, which is to let me build an archive db, I've tried several test scenarios to see if it would work. So far I have been unable to stop delete statements from replicating. I'm not sure whether I'm not setting a property correctly or something else is wrong; any guidance would be appreciated.
Here is what I've done.
Created a blank db to be used as a subscriber and created a test db with some random data in a table.
Setup a New Publication on the db with the random data as "Transactional Publication"
Selected the following articles and properties in the publication:
Tables - DELETE Delivery Format = Do not replicate DELETE statements
Views - default values
Stored Procedures - default values
User Functions - default values
Selected the default options for the rest of the New Publication Wizard steps and clicked finish.
Created a Pull Subscription on the new publication that I just created.
Let it initialize.
Then did a select count(*) query on the test table on the publication (18k+ rows) and subscriber (18k+ rows)
Then did a delete t-sql from the test table on the publication.
Then did a select count(*) query on the test table on the publication (0 rows) and subscriber (0 rows).
Now shouldn't the records have been left in place on the subscription db?
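It shouldn't have, assuming the article really kept the "do not replicate DELETE statements" delivery format when the snapshot was generated. One thing worth checking is the article's delete delivery setting as it actually stands; a hedged sketch (publication and article names are hypothetical):
EXEC sp_helparticle
    @publication = N'MyPublication',
    @article = N'MyTable';
-- look at the delete command in the output: it should be NONE rather than a CALL/SQL delivery format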
View 6 Replies
Jun 15, 2015
I am working on an app that is getting quite a few deadlocks due to delete statements. I have turned on the SQL trace flags and pulled the offending delete statements out of the ERRORLOG, and I am trying to match those up with the indexes defined on the table, looking to see if there is anything that can be done strictly from the db side (no app code change) to reduce/eliminate these deadlocks. I have run some tests and played around with RCSI and even disabling lock escalation, but neither improved my results.
What I have done is to search the errorlog file for DELETE FROM Tablename, output the matching lines, normalize the literal values to # or XYZ, open the result in Notepad++, remove trailing whitespace and duplicates, and sort, to come up with the unique list of offending T-SQL statements (a LOT easier to read in a text editor, so I'm sending a screen cap).
Open this url in new tab [URL] ....
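One purely database-side change that sometimes helps with delete deadlocks is making sure each offending DELETE's WHERE clause is supported by an index, so the statement seeks a handful of rows instead of scanning and locking a wide range. A hedged sketch with hypothetical table and column names:
CREATE NONCLUSTERED INDEX IX_Tablename_FilterCol
    ON dbo.Tablename (FilterCol);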
View 7 Replies
Jul 20, 2006
Hi, I'm new to ASP.NET and having a problem configuring the SqlDataSource control. I am using the standard ASP.NET 2.0 "aspnetdb" database to manage user accounts. The problem is this: when using the wizard to configure my SqlDataSource control, the option to auto-generate the Insert/Update/Delete SQL statements is grayed out. I've searched this forum and found that this can be a symptom of no primary keys in the tables. However, there are primary keys (UserId), which is the default schema as generated by asp.net (aspnet_regsql.exe). When I use the wizard, I make the following choices:
How would you like to retrieve data from your database?
-> Select "Specify columns from a table or view"
-> Select the "vw_aspnet_MembershipUsers" view from the "Name:" drop-down list
-> Select "UserId", "Email", "UserName" from "Columns:"
After this, there is still no option to auto-generate I/U/D statements. Any thoughts on why this isn't working? Thanks, Leah.
View 1 Replies
Jul 17, 2015
We are using MS SQL Server 2008. I am running a batch job which deletes records older than 21 days (6-7 million records), but transactions are also going on in the database daily. When the delete runs, all the insert statements get blocked and wait until the delete statement completes. Why does the blocking occur?
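The unbatched delete of 6-7 million rows holds its locks (usually escalating to a table lock) for the duration of the statement, which is what blocks the inserts. A common mitigation is deleting in small batches so locks are released frequently; a minimal sketch with hypothetical table and column names:
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (5000) FROM dbo.Transactions
    WHERE CreatedDate < DATEADD(day, -21, GETDATE());
    SET @rows = @@ROWCOUNT;
END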
View 3 Replies
Apr 23, 2008
I have an SQL data source on my page and I select "Table". On the next screen I pick the fields I want to show. Then I click the "Advanced" button because I want to allow inserts, updates, and deletes, but it's all greyed out and I can't check this option. The UID in the connection string I am connecting under has the correct permissions in SQL Server to do inserts, updates, and deletes too. Does anyone know why it would be greyed out? The ConnectionString property in the aspx code is dynamic, but this shouldn't be the reason because I have used this before with success.
View 2 Replies
Nov 1, 2004
Hello All
I want to fill a drop down list in ASP.NET using C# from a SQL database table using a stored procedure. I have my sproc, but in ASP.NET C# I have no idea how to do this. Can someone give me a good example and, if it's not too much trouble, place comments in the code and give an explanation? I am just learning ASP.NET after moving from Classic ASP. Things are a lot different.
Thank you in advance for all your help
Andrew
View 1 Replies
Mar 21, 2008
I have a ton of data to load into a SQL 2005 database.
I just loaded a bunch of data for a number of tables using bcp, and the last table that my script loaded was an 8 million row table. The next table was a 12 million row table, and about 1 million rows into the bcp a "log full" error occurred. I have the batch size set to 10000 for all bcp commands.
Here is the bcp command that failed:
"C:\Program Files\Microsoft SQL Server\80\Tools\Binn\bcp" billing_data_repository..mtr_rdng_hrly_arc_t in mtr_rdng_hrly_arc_t.dat -c
-b10000 -Sxxxx -T
Here is the last part of the output from the bcp command:
...
10000 rows sent to SQL Server. Total sent: 970000
10000 rows sent to SQL Server. Total sent: 980000
10000 rows sent to SQL Server. Total sent: 990000
SQLState = 37000, NativeError = 9002
Error = [Microsoft][ODBC SQL Server Driver][SQL Server]The transaction log for database 'billing_data_repository' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases
BCP copy in failed
I thought that a commit was issued after every 10000 rows and that this would keep the log from filling up.
The log_reuse_wait_desc column in sys.databases is set to 'LOG_BACKUP' for the database being used.
Does a checkpoint need to be done more often?
Besides breaking up the 12 million row data file into something more manageable, does anyone have a solution?
How can I continue to use my same loading script, and keep the log from filling up?
Thank you.
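Since log_reuse_wait_desc says LOG_BACKUP, the database is in the FULL (or BULK_LOGGED) recovery model and log space is only reused after log backups; the per-batch commits do not free anything on their own. Two hedged options, with hypothetical backup paths: keep FULL and back the log up during the load, or switch to BULK_LOGGED so the bcp batches can be minimally logged (minimal logging also wants a table lock, e.g. the bcp -h "TABLOCK" hint, and an empty or unindexed target table):
-- option 1: run log backups while the load is in progress
BACKUP LOG billing_data_repository TO DISK = 'E:\Backups\billing_log.trn';
-- option 2: minimally logged load, then switch back and re-establish backups
ALTER DATABASE billing_data_repository SET RECOVERY BULK_LOGGED;
-- run the bcp load here
ALTER DATABASE billing_data_repository SET RECOVERY FULL;
BACKUP DATABASE billing_data_repository TO DISK = 'E:\Backups\billing_data_repository.bak';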
View 9 Replies
Feb 27, 2007
I have a class that works fine using the SqlDataReader, but when I try to duplicate the process using a DataSet instead of a SqlDataReader it returns a null value.
This is the code for the method that returns a DataReader:
public SqlDataReader GetOrgID()
{
    Singleton s1 = Singleton.Instance();
    Guid uuid;
    uuid = new Guid(s1.User_id);
    SqlConnection con = new SqlConnection(conString);
    string selectString = "Select OrgID From aspnet_OrgNames Where UserID = @UserID";
    SqlCommand cmd = new SqlCommand(selectString, con);
    cmd.Parameters.Add("@UserID", SqlDbType.UniqueIdentifier, 16).Value = uuid;
    con.Open();
    SqlDataReader dtr = cmd.ExecuteReader(CommandBehavior.CloseConnection);
    return dtr;
}
This is the code trying to accomplish the same thing with a Dataset instead.
public DataSet organID(DataSet dataset)
{
    Singleton s1 = Singleton.Instance();
    Guid uuid;
    uuid = new Guid(s1.User_id);
    string queryString = "Select OrgID From aspnet_OrgNames Where UserID = @UserID";
    SqlConnection con = new SqlConnection(conString);
    SqlCommand cmd = new SqlCommand(queryString, con);
    cmd.Parameters.Add("@UserID", SqlDbType.UniqueIdentifier, 16).Value = uuid;
    SqlDataAdapter adapter = new SqlDataAdapter();
    adapter.SelectCommand = cmd;
    adapter.Fill(dataset);
    return dataset;
}
Assume that conString is set to a valid connection string. The singleton passes the user id in from some code in the code-behind page; this functionality works as well.
So assume that the Guid is a valid entry; I should get back a valid dataset, but it's null.
Additionally, if I change the SQL query to just Select * From aspnet_OrgNames I still get a null value, so I am assuming I am doing something wrong when trying to fill the dataset.
Any help would be appreciated.
View 2 Replies
Jan 14, 2000
MS SQL Enterprise Server, SP5, running under version 6.5.
I have recently been having a problem with the TempDb database
filling up. I originally started the database at 250 Mb but
recently expanded it to 500 Mb.
My last check of the activity on the server during an event
such as this produced the following information.
- Approx. 300 connections to primarily 2 databases.
- 4 active connections:
Connection 1 -SELECT on database 1 with 13,000 records
and a record size of approx. 300 bytes.
Connection 2 -SELECT on database 1 with 13,000 records
and a record size of approx. 300 bytes.
Connection 3 -SELECT on database 2 with 550 records
and a record size of approx. 100 bytes.
Connection 4 -Replication subscriber set at 100 transactions.
My questions are:
1. What processes may cause the TempDb database to fill up?
2. What processes prevent the database from purging?
Any information would be greatly appreciated.
Jim Story
View 2 Replies
Nov 2, 1999
We are having continual problems with our transaction log filling up on one of our major applications.
Does anyone know of a way or tool to read the transaction log? We want to determine what is causing this problem.
Thanks
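Reading the log contents directly isn't really supported, but two commands may narrow down what is holding it: DBCC SQLPERF(LOGSPACE) shows how full each log is, and DBCC OPENTRAN reports the oldest open transaction in a database, which is the most common reason a log cannot be truncated (the database name below is hypothetical):
DBCC SQLPERF(LOGSPACE);
DBCC OPENTRAN ('MyAppDb');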
View 2 Replies
May 4, 1999
Recently, we converted an Access database to SQL server 6.5. One of the processes that runs against the server is
missing a commit causing temporary stored procedures to fill up TEMPDB in the sysobjects table. The only way to
clear up TEMPDB is to stop and start SQL server when the database fills up. I wrote a quick and dirty stored
procedure to delete the offending rows out of the tempdb..sysobjects table; however, the database still registers as
full after the deletes.
Question:
does anyone know of a process/DBCC I can run against the tempdb..sysobjects table to regain the space in TEMPDB
without having to stop and restart SQL Server? I need a temporary solution while the programmer is debugging the
offending code.
Thanks!
TC
View 1 Replies
Dec 5, 1998
hello,
I've got replication set up as publisher/subscriber, basically to sync a primary server with a backup server. My distribution log keeps filling up; for now I've got a perf alert to truncate it at 75% full, but why does it fill up? Its size is about 1.5 GB.
The transaction log for my database will also not release about 500 MB of data. Is there any way to see what is going on?
Thanks
View 1 Replies
Aug 2, 2007
hello, I'd need a little help with filling GridViews. I browsed over about 10 search pages but couldn't find anything that would solve my problem. In my AJAX project I made a testing page and pulled a GridView (GridView1) onto it with a few buttons and textboxes. I need to fill the GridView from code, so my website.aspx.cs looks like this:
protected void Page_Load(object sender, EventArgs e)
{
    string connstr = "Data Source=.;database=teszt;user id=user;password=pass";
    SqlConnection conn = new SqlConnection(connstr);
    SqlCommand comm = new SqlCommand("select * from users", conn);
    conn.Open();
    SqlDataReader reader;
    reader = comm.ExecuteReader();
    if (reader.HasRows)
    {
        GridView1.DataSource = reader;
        GridView1.DataBind();
    }
    reader.Close();
    conn.Close();
    comm.Dispose();
}
So I load the page and there's no GridView on the page at all, nor an error message; the connection and the database/table are fine. Any suggestions on what I am doing wrong? I would also like to know whether there would be any problem with using this on a tab control/tab. Thank you.
View 3 Replies
Apr 9, 2008
hello everyone
I have created a table in SQL Server 2005 named "Departments". In this table the different departments of a telephone (landline) company are to be stored; these departments deal with the complaints their users register with them.
I want to know the names of these different departments and the kinds of complaints assigned to them. For example, if I have a complaint from a user who has a problem with his handset, then that complaint will be assigned to the "Maintenance dept."
As I was never in industry, I need help filling the table.
Please just write me the names of the departments and the nature of the complaints they deal with!
Thanks for the consideration
View 1 Replies
Oct 29, 2004
I have a table that keeps track of click statistics for each one of my dealers. I am creating graphs based on the number of clicks they received in a month, but if they didn't receive any in a certain month then that month is left out. I know I have to do some kind of outer join, but I'm having trouble figuring out exactly how. Here is what I have:
select d.name, right(convert(varchar(25),s.stamp,105),7), isnull(count(1),0)
from tblstats s(nolock)
join tblDealer d(nolock)
on s.dealerid=d.id
where d.id=31
group by right(convert(varchar(25),s.stamp,105),7),d.name
order by 2 desc,3,1
this dealer had no clicks in april so this is what shows up:
joe blow 10-2004 567
joe blow 09-2004 269
joe blow 08-2004 66
joe blow 07-2004 30
joe blow 06-2004 8
joe blow 05-2004 5
joe blow 03-2004 9
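Because the month value comes from tblstats, a month with no clicks has no row to group, so outer joining tblDealer alone cannot bring it back. One hedged approach is to drive the query from the list of months you want on the graph (a derived table here; a permanent calendar table works the same way) and LEFT JOIN the stats onto it; count(s.dealerid) counts only matched rows, so an empty month comes back as 0:
select d.name, m.mon, count(s.dealerid) as clicks
from tblDealer d
cross join (select '03-2004' as mon union all select '04-2004'
            union all select '05-2004' union all select '06-2004'
            union all select '07-2004' union all select '08-2004'
            union all select '09-2004' union all select '10-2004') m
left join tblstats s (nolock)
  on s.dealerid = d.id
 and right(convert(varchar(25), s.stamp, 105), 7) = m.mon
where d.id = 31
group by d.name, m.mon
order by m.mon desc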
View 1 Replies
Oct 2, 2000
I have complex VB code that loads millions of records into a SQL 7 database. I see my tempdb slowly growing until the system runs out of disk space.
What are typical programming bugs that would cause tempdb to fill up,
and how can I identify the offending statements through Profiler?
Any help greatly appreciated!
View 1 Replies
Dec 28, 2006
What is the best way to fix issues with your log files in TempDB when you start to see them causing error messages?
Thanks for your time.
View 2 Replies
Aug 2, 2001
We have a stored procedure that uses temp tables and must gather a lot of data. When we stress test this stored procedure, the tempdb transaction log fills up.
We tried using "Select Into" for our tables. That caused us not to write to the transaction log but it caused problems because it locked up sysobjects.
Is there some other way not to write to the transaction log?
View 1 Replies
Feb 26, 2007
Hello,
I am new to SQL Server and learning lots very quickly! I am experienced at building databases in Access and using VBA in Access and Excel.
I have a time series of 1440 records that may have some gaps in it. I need to check the time series for gaps and then fill these or reject the time series.
The criteria for accepting and rejecting is a user-defined number of time steps from 1 to 10. For example, if the user sets the maximum gap as 5 time steps and a gap has 5 or fewer, then I simply want to linearly interpolate between the two time steps bounding the gap. If the gap is 6 time steps then I will reject the time series.
I have searched the BOL and MSDN for SQL Server and think there must be a solution using PredictTimeSeries in DMX, but I am not quite sure if I can do this. I may be better off simply passing through the time series as a recordset and processing it as I would have done in Access... (I am reluctant to do this as I have on the order of 100 * 5 * 365 time series, growing by 100 each day, and fear it will take quite some time...)
Can anyone help me by pointing me in the right direction please?
Unless there is a way of using PredictTimeSeries on its own, I think the solution is:
Identify whether a record is a valid one or part of a gap (i.e. missing values).
Identify the longest gap and reject or process the data based on this value.
Identify whether a record precedes or succeeds a gap.
For each gap, fill it using linear interpolation.
Thanks,
Alan.
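PredictTimeSeries is aimed at forecasting rather than gap filling, so a plain T-SQL pass may be simpler. A rough sketch of the steps above for one series, assuming a hypothetical table dbo.SeriesStep(step_no int primary key, val float NULL) in which a gap shows up as a NULL value (gaps at the very start or end of the series are not handled here):
declare @MaxGap int; set @MaxGap = 5;
-- reject the series if any gap is wider than @MaxGap
if exists (
    select 1
    from dbo.SeriesStep g
    cross apply (select top 1 step_no, val from dbo.SeriesStep p
                 where p.step_no < g.step_no and p.val is not null
                 order by p.step_no desc) prev
    cross apply (select top 1 step_no, val from dbo.SeriesStep n
                 where n.step_no > g.step_no and n.val is not null
                 order by n.step_no) nxt
    where g.val is null and nxt.step_no - prev.step_no - 1 > @MaxGap)
    raiserror('Series rejected: a gap exceeds the allowed maximum.', 16, 1);
else
    -- linear interpolation between the two non-null values bounding each gap
    update g
    set val = prev.val + (nxt.val - prev.val)
              * (g.step_no - prev.step_no) / (nxt.step_no - prev.step_no)
    from dbo.SeriesStep g
    cross apply (select top 1 step_no, val from dbo.SeriesStep p
                 where p.step_no < g.step_no and p.val is not null
                 order by p.step_no desc) prev
    cross apply (select top 1 step_no, val from dbo.SeriesStep n
                 where n.step_no > g.step_no and n.val is not null
                 order by n.step_no) nxt
    where g.val is null;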
View 3 Replies
Oct 8, 2007
Hello,
In my application I am using identity columns. When some rows are deleted from a table, the identity values do not fill in the gap. I mean, my current identity is 5; that means rows 1 to 5 were inserted sequentially. If I delete the 3rd and 4th rows, the next identity will still be 6. So is there any method to fill the gap between rows?
View 2 Replies
Jul 25, 2007
hi all!
I have 2 instances that communicate via Service Broker.
The conversations are only one way, from the initiator server1 db to the target server2 db.
I also reuse dialog ids in BEGIN DIALOG @dlgId: I save @dlgId from the first run into a table and then retrieve it for each message send. Since the messages are constant, I don't close the dialog at the target for each message.
I'm just wondering why both sys.transmission_queue and sys.conversation_endpoints get a row for each message I send, but the transmission_status in sys.transmission_queue is empty. Also, each conversation_handle and conversation_id is different for each row, and only one row in each sys table has the same conversation_handle as my saved @dlgId.
Just wondering what is going on.
This code runs on the initiator:
DECLARE @dlgId UNIQUEIDENTIFIER
-- each database has one dialog id
SELECT @dlgId = DialogId
FROM dbo.Dialogs
WHERE DbId = DB_ID()
-- Begin the dialog, either with existing or new Id
BEGIN DIALOG @dlgId
FROM SERVICE [//DataSender] -- service on initiator server
TO SERVICE '//DataWriter', -- service on target server
-- Target's Service Broker Id
'83382A22-2830-4B25-B067-15AAC255EB03'
ON CONTRACT [//Contract1]
WITH ENCRYPTION = OFF;
-- Send data
;SEND ON CONVERSATION @dlgId
MESSAGE TYPE [//Message1] (@msg)
Thanx,
Mladen
View 7 Replies
Feb 8, 2008
Hi All,
I have 50 tables.
I'm trying to fill the tables in a dataset using a loop.
There is no problem for the first 25 tables, but at the 26th table it gives the error "Input string was not in a correct format. Couldn't store <value> in 'column_name' Column. Expected type is UInt32."
The column type is varchar(50). I changed it to TEXT, but nothing happened; it accepts numerics but not characters.
Debugging one table at a time, at the 26th table adapter.Fill(dataset) throws the exception.
Please help me.
Regards,
Amit.
View 6 Replies
Sep 4, 2007
I am having an issue with the transaction log growing uncontrolled and filling up the disk. I suspect that transactions are structured incorrectly between the web application that is monitoring a queue and the SQL that is executing the WAITFOR RECEIVE. This method receives large binary objects, so that's the reason for the arguments to the reader. Also, even though it's not the suggested way, we commit every time through to prevent the queue from disabling (which it was doing when we would ROLLBACK; we don't really care if the message is bad, we just want to log it and wait for the next one).
The basic structure is this, which is executed on a separate thread. Am I missing something that could be causing transactions to get into a state where the log grows uncontrollably? Is there a problem with the loop? Should I be doing a ROLLBACK when there is nothing to receive (this is a low-volume queue, so it may not receive a message for a few minutes or more)? If so, where should I be doing this?
private void MonitorUpdateQueue()
{
    const string _sqlEndDialog = @"END CONVERSATION @conversationHandle";
    const string _errorMessageType = @"http://schemas.microsoft.com/SQL/ServiceBroker/Error";
    const string _endDialogMessageType = @"http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog";
    using (_ssbConnection)
    {
        if (_ssbConnection == null) _ssbConnection = new SqlConnection();
        using (_monitorQueueCommand = new SqlCommand("asp_ProcessMessage", _ssbConnection))
        {
            _monitorQueueCommand.CommandType = CommandType.StoredProcedure;
            _monitorQueueCommand.CommandTimeout = 0;
            // one transaction per loop iteration: receive, process, commit
            while (true)
            {
                SqlDataReader reader = null;
                Guid conversationHandle = Guid.Empty;
                Byte[] payloadBytes = { 0 };
                try
                {
                    if (_ssbConnection.State == ConnectionState.Closed)
                    {
                        _ssbConnection.ConnectionString = _ssbConnectionString;
                        _ssbConnection.Open();
                    }
                    Byte[] latestFullUpdate = null;
                    // serializable transaction covering the WAITFOR RECEIVE in asp_ProcessMessage
                    _monitorQueueCommand.Transaction = _ssbConnection.BeginTransaction(IsolationLevel.Serializable);
                    if (_monitorQueueCommand.Transaction != null)
                    {
                        reader = _monitorQueueCommand.ExecuteReader(CommandBehavior.SequentialAccess);
                        if (reader != null)
                        {
                            XmlDocument updateMessage = new XmlDocument();
                            while (reader.Read())
                            {
                                conversationHandle = reader.GetGuid(0);
                                string messageType = reader.GetString(1);
                                payloadBytes = reader.GetSqlBytes(2).Value;
                                if (messageType == _endDialogMessageType || messageType == _errorMessageType)
                                {
                                    // system messages (EndDialog / Error) just end the conversation
                                    SqlCommand cmdEndDialog = new SqlCommand(_sqlEndDialog, _ssbConnection, _monitorQueueCommand.Transaction);
                                    cmdEndDialog.Parameters.AddWithValue("@conversationHandle", conversationHandle);
                                    cmdEndDialog.ExecuteNonQuery();
                                }
                                else
                                {
                                    if (payloadBytes.Length > 0)
                                    {
                                        ProcessUpdateQueue(updateMessage);
                                    }
                                }
                            }
                            reader.Close();
                        }
                        _monitorQueueCommand.Transaction.Commit();
                    }
                }
                catch (Exception ex)
                {
                    if (_monitorQueueCommand.Transaction != null && _monitorQueueCommand.Transaction.Connection != null)
                    {
                        // Dequeue the bad message and end the conversation
                        SqlCommand cmdQuarantineMessage = new SqlCommand("usp_FailMessage", _ssbConnection, _monitorQueueCommand.Transaction);
                        cmdQuarantineMessage.CommandType = CommandType.StoredProcedure;
                        cmdQuarantineMessage.ExecuteNonQuery();
                        if (reader != null && !reader.IsClosed) reader.Close();
                        if (_monitorQueueCommand.Transaction != null) _monitorQueueCommand.Transaction.Commit();
                    }
                    log.Error("Error monitoring queue.", ex);
                }
                finally
                {
                    if (reader != null && !reader.IsClosed) reader.Close();
                    if (_monitorQueueCommand.Transaction != null)
                    {
                        _monitorQueueCommand.Transaction = null;
                    }
                }
            }
        }
    }
}
The stored procedure that is monitoring looks like this:
CREATE PROCEDURE [dbo].[asp_ProcessMessage]
AS
BEGIN
WAITFOR ( RECEIVE TOP ( 1 ) conversation_handle, message_type_name, message_body
FROM [MyQueue] ) ;
END
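One thing that stands out: without a TIMEOUT clause this WAITFOR(RECEIVE) blocks indefinitely, and the serializable transaction opened from the C# code stays open that whole time, so a quiet queue means a long-running open transaction with nothing for the loop to commit. A hedged variation is to let the procedure return periodically so the caller can commit (or roll back) and start a fresh transaction; the 5000 ms value is just an example:
ALTER PROCEDURE [dbo].[asp_ProcessMessage]
AS
BEGIN
    WAITFOR ( RECEIVE TOP ( 1 ) conversation_handle, message_type_name, message_body
    FROM [MyQueue] ), TIMEOUT 5000 ;   -- returns an empty result if nothing arrives within 5 seconds
END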
View 3 Replies