I am getting the following error and I can't find it in the manual.
Failure to send an event notification instance of type 'BROKER_QUEUE_DISABLED' on conversation handle '{5D273374-E84F-DB11-B3BC-0004239AB15C}'. Error Code = '8429'.
I have checked the service name for the notification and it appears to be okay.
Hello, I have some problems handling poison messages.
Messages are exchanged between two databases on the same SQL Server instance. I created an insert trigger that calls the stored procedure that begins the conversation (10-minute timeout) and sends the message to the target queue.
On the target queue I activated a stored procedure, and its main code is:
BEGIN TRANSACTION;

-- @dialog, @messagetypeName, @message_body and @ErrorSave are declared
-- earlier in the procedure; TargetQueue is the queue the procedure is
-- activated on (name assumed, the post does not give it).
WAITFOR (
    RECEIVE TOP (1)
        @dialog          = conversation_handle,
        @messagetypeName = message_type_name,
        @message_body    = message_body
    FROM TargetQueue
), TIMEOUT 5000;

IF (@ErrorSave <> 0)
BEGIN
    INSERT INTO [TestSender].[dbo].[tblErrorXMLMessages] VALUES (); -- column list elided in the original post
END
ELSE IF (@messagetypeName = N'Message')
BEGIN
    EXEC [dbo].[sp_ProcessMessage] @message_body;
END
ELSE IF (@messagetypeName = N'EndOfStream')
BEGIN
    END CONVERSATION @dialog;
END

COMMIT TRANSACTION;
My communication shows some strange behaviour:
if I type
Begin
insert into TriggerTable values(XMLMessage)
insert into TriggerTable values(XMLMessage)
....
insert into TriggerTable values(XMLMessage)
end
Everything works fine. But if I insert 1000 messages coming from another table using cursor logic, after a while the transmission stops because the target queue becomes disabled. I can see my messages stuck in the initiator's sys.transmission_queue, so I think there is some poison message causing 5 rollbacks and disabling the receiver queue.
First of all I would like to isolate the bad message and carry on with the inserts; my application must not stop the conversation or return any error. But if I use the SQL Server debugger, I'm not able to debug the target queue's stored procedure.
I suppose some error happens in the target queue's stored procedure, but how can I find out what it is in the first place?
Maybe it has something to do with the transaction wrapped around the RECEIVE command?
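One way to see what is failing inside an activated procedure, since the debugger won't attach to it, is to trap and log the error yourself. A sketch; the error table and its columns are placeholders:

-- Inside the activation procedure, around the processing call:
BEGIN TRY
    EXEC [dbo].[sp_ProcessMessage] @message_body;
END TRY
BEGIN CATCH
    -- Record what went wrong. Note that a later ROLLBACK would undo this
    -- INSERT too, so either log after rolling back or capture the values
    -- in variables first.
    INSERT INTO [dbo].[tblActivationErrors] (ErrorNumber, ErrorMessage, MessageBody)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), @message_body);
END CATCH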
I know I have to build in some ability to deal with a poison message. I was thinking (along the lines of the article http://msdn2.microsoft.com/en-us/library/ms166137.aspx) that I would write a special stored procedure to handle this situation, but I don't know how I could activate it. Of course I don't want to poll in a WAITFOR loop.
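No polling is needed: SQL Server raises a BROKER_QUEUE_DISABLED event as poison detection disables the queue, and an event notification delivers it as a message to a service of your choosing, so the handler procedure can itself be internally activated. A sketch; the queue and service names are placeholders:

-- Get notified when poison detection disables the queue.
CREATE EVENT NOTIFICATION TargetQueueDisabled
    ON QUEUE dbo.TargetQueue
    FOR BROKER_QUEUE_DISABLED
    TO SERVICE 'PoisonHandlerService', 'current database';

The activation procedure on the queue behind PoisonHandlerService can then log or remove the offending message and re-enable the queue with ALTER QUEUE dbo.TargetQueue WITH STATUS = ON.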
Hi! I have a CLR stored procedure with a distributed transaction in it, and I really need that transaction. When a poison message occurs 5 times my queue is turned off. I've read about handling poison messages in MSDN (save the transaction and roll back part of it), but that works only with a local transaction. What should I do? An important point is that messages should be processed in the right order; I can't receive a message and put it at the end of the queue. I want to keep trying to process the poison message, and I don't want to stop receiving messages on that queue. Thanks.
I'm using Service Broker to parallelize my processes (I know Service Broker was not designed for that purpose); however, it's working quite well.
I use the broker-activated procedure to start procedures which each process a part of the workload. When a procedure fails because of a lock timeout (or, for that matter, for whatever reason), I roll back the transaction, which also rolls back the RECEIVE so the message can be retried at a later time. And this is where my problem lies: if there are 5 sequential rollbacks of messages, poison message detection kicks in and disables the queue, stopping all the processing. :(
Is there a way to disable poison message detection? I have implemented my own stop mechanism through a per-sub-task counter system, so if I could disable poison message detection that would be ideal.
If this is not possible, is there a way to turn the queue back on automatically so that it will continue processing the messages on the queue?
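For what it's worth: in SQL Server 2005 poison message detection cannot be switched off, but a BROKER_QUEUE_DISABLED event notification can drive an activated procedure that simply turns the queue back on; SQL Server 2008 and later added a per-queue switch. A sketch, with a placeholder queue name:

-- Re-enable a queue that poison message detection has disabled:
ALTER QUEUE dbo.WorkQueue WITH STATUS = ON;

-- SQL Server 2008 and later only: disable poison detection per queue.
ALTER QUEUE dbo.WorkQueue WITH POISON_MESSAGE_HANDLING (STATUS = OFF);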
Hi all, I searched everywhere but couldn't find any info on the following error that I'm currently receiving:
"The conversation endpoint is not in a valid state for SEND. The current endpoint state is 'DI'."
I understand that this is due to some problem in my send/receive protocol, but how do I fix it so that I can continue with my testing? Right now I'm forced to drop my entire test database and reinstall every time this message shows up, because I can't send or receive any messages at that point. Is there any way to get rid of it?
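State 'DI' is 'disconnected inbound': the far end has already ended the conversation, so the only verb still allowed on the local endpoint is END CONVERSATION, not SEND. Rather than rebuilding the test database, stale endpoints can be inspected and ended one at a time; a sketch (WITH CLEANUP skips the normal end-dialog protocol, so it is for test cleanup only):

-- See which endpoints are stuck and in what state:
SELECT conversation_handle, state_desc
FROM sys.conversation_endpoints;

-- Forcibly remove one without notifying the peer (test environments only):
DECLARE @handle uniqueidentifier;
SELECT TOP (1) @handle = conversation_handle
FROM sys.conversation_endpoints
WHERE state_desc = 'DISCONNECTED_INBOUND';

END CONVERSATION @handle WITH CLEANUP;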
I am implementing the first Sql Service Broker use case in our Sql 2005 application, so forgive me if this is an obvious question. We will be using a service & queue to handle requests to download a batch of documents from a third party service. The basic workflow is:
A. Web client requests a document download.
B. Web application begins a conversation and sends a message requesting the download.
C. DownloadService stored procedure retrieves the message and attempts the document download.
D. If the download fails, retry every 30 minutes up to max 5 attempts.
Other than (D), this is perfectly straightforward. I will describe the design I've come up with to retry after 30 minutes below. I would appreciate any suggestions on better ways to handle this within SSB.
My solution relies on a small state table:

create table MessageState (
    ConversationHandle uniqueidentifier NOT NULL,
    RetryCount int NOT NULL,
    MessageBody varbinary(max) NOT NULL
)

The web client begins a conversation and submits a message. It does not end the conversation, but rather the InitiatorQueue has an activation procedure to end its side of the conversation after the DownloadService ends them, as described here: http://blogs.msdn.com/remusrusanu/archive/2006/04/06/570578.aspx.
The DownloadService logic, in pseudo-code:
1. Use a typical loop to get conversation groups and receive messages.
2. If message type = DownloadRequest:
   2a. Call a stored procedure to attempt the download.
   2b. If successful, end conversation.
   2c. If not successful, add an entry to MessageState and BEGIN CONVERSATION TIMER with a 30 minute timeout (see the sketch after this list).
3. If message type = 'http://schemas.Microsoft.com/SQL/ServiceBroker/DialogTimer':
   3a. Look up the conversation handle in MessageState.
   3b. Recall the original MessageBody.
   3c. Call a stored procedure to attempt the download.
   3d. If successful, delete the MessageState row and end conversation.
   3e. If not successful and no retries left, delete the MessageState row and end conversation with error.
   3f. If not successful and retries remaining, increment MessageState.RetryCount and BEGIN CONVERSATION TIMER with a 30 minute timeout.
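For reference, steps 2c and 3f come down to saving the state and arming the dialog timer. A sketch; 30 minutes = 1800 seconds, and the two variables are the ones filled in by the RECEIVE of step 1:

-- @conversation_handle and @message_body are populated by the RECEIVE in step 1.
DECLARE @conversation_handle uniqueidentifier, @message_body varbinary(max);

INSERT INTO MessageState (ConversationHandle, RetryCount, MessageBody)
VALUES (@conversation_handle, 1, @message_body);

-- When the timer fires, the queue receives a DialogTimer message on this conversation.
BEGIN CONVERSATION TIMER (@conversation_handle) TIMEOUT = 1800;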
The main downside I see to this solution is a certain lack of transparency. After step (2c) or (3f), "select * from DownloadQueue" will no longer show the pending request, although you can find it in sys.conversation_endpoints.
Is there a better idiom to solve this type of problem in SSB?
Can anyone give me info on how the report processing page works in Reporting Services?
My application makes some pretty heavy queries to the database and I would like to have a message appear on the page whilst the request is being processed.
Something identical to the way reporting services deals with this would be absolutely perfect!
Hello again. I have some poison message detection in place, based on the BOL sample. My problem is that after the 5th message retry my queue goes down; that is, the fifth retry on any message. In actuality, the first message is retried 3 times and is taken off the queue [for real]; the second message comes in and on its second retry, poof, the queue is down.
I thought the poison mechanism worked on a per-message basis. Is there a setting for the queue that I missed? Is my only option to re-enable the queue upon a BROKER_QUEUE_DISABLED event notification?
In the "Example: Detecting a Poison Message" section, it reads: This Transact-SQL example shows a simple, stateless service that includes logic for handling poison messages. Before the stored procedure receives a message, the procedure saves the transaction. When the procedure cannot process a message, the procedure rolls the transaction back to the save point. The partial rollback returns the message to the queue while continuing to hold a lock on the conversation group for the message.
I run Service Broker between 2 SQL Servers. In Profiler on the initiator side I see the following error: 'This message could not be delivered because its message timestamp has expired or is invalid'. For the conversation we use the best practice, i.e. the target closes the conversation. The target side succeeds in closing the conversation, but the initiator still stays in DO (DISCONNECTED_OUTBOUND). What is the reason for the error? What should I do?
I am having trouble specifying a message body that is valid, I mean for the client to send. If I leave it as null then everything is OK, but if I create a MemoryStream and add a line of text, it reports back that the message did not pass validation. I do not understand this and am not sure what to do. I need to send a message based on a code and text, but I do not know the format of the body that is allowed. The code I am referring to comes out of HelloWorld_CLR, because that is what I am modelling my sample after. I call it the same way it calls the return message done in ServiceProc. I need to know the message format, including the body, since this does not seem to work. A sample of the call is below.
// Create a request message body containing a line of text
string Msg = "Hello";
MemoryStream body = new MemoryStream(Encoding.ASCII.GetBytes(Msg));
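A guess, since the message type definition isn't shown: if the type was created with VALIDATION = WELL_FORMED_XML, a NULL body passes but a plain-text body like "Hello" is rejected. Either send well-formed XML (e.g. the bytes of <msg>Hello</msg>) or relax the validation; a T-SQL sketch with a placeholder type name:

-- A type declared like this only accepts well-formed XML bodies:
CREATE MESSAGE TYPE [Request] VALIDATION = WELL_FORMED_XML;

-- To accept any body, including plain text:
ALTER MESSAGE TYPE [Request] VALIDATION = NONE;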
I have a DTS package that imports data from an Oracle database into SQL Server. Doesn't the processing mostly occur on the SQL Server, not on the Oracle database from which the data is being imported? The Oracle database is vendor provided, and they are saying our SQL Server DTS package is killing their server. Any insight is appreciated. Thanks.
I've got a process that creates records in my database based on XML input that I receive. What I am doing is giving this XML to a stored procedure to handle a specific task, then modifying the XML and sending it to the next stored procedure.
For instance, the XML could hold header records with detail records. I would first send the XML to a stored procedure that creates the header records, then update the XML so it knows the identity values of the header records I have just created, and then send the XML to the next stored procedure to create the details for those headers.
All works great and fine, but I have a problem with writing the identity values back to the XML. It seems I can only change one item in the XML at a time and thus need to loop. For many records this really takes a long time.
Here is some sample code of what I'm doing (please excuse any typos, this is a simplified version of the code) :
declare @lvSeq numeric(15)
declare @lvRowNo int
declare @lvNumRows int

insert into myHeaderTable ( recid, recdesc )
select ref.value('@recid', 'nvarchar(25)') recid,
       ref.value('@recdesc', 'nvarchar(250)') recdesc
from @pXML.nodes('//headers/header') R(ref)

select @lvRowNo = 1, @lvNumRows = @pXML.value('count(//headers/header)', 'int')

while (@lvRowNo <= @lvNumRows)
begin
    select @lvSeq = recseq
    from myHeaderTable
    where recid = @pXML.value('(//headers/header[position()=sql:variable("@lvRowNo")]/@recid)[1]', 'nvarchar(25)')

    set @pXML.modify('replace value of (//headers/header[position()=sql:variable("@lvRowNo")]/@recseq)[1] with sql:variable("@lvSeq")')

    select @lvRowNo = @lvRowNo + 1
end
Obviously I am looking for a better way to update the XML with the sequences. The insert takes a second; the loop takes minutes with large XML sets. I guess MSSQL is searching the whole XML to find the item to update.
It would be nice if I didn't have to loop through the XML. One solution I was thinking of is to store the XML in a temporary table with a single record per header item. Then I could do the modify in one go and recreate the XML by simply selecting the contents of the temporary table. I have no idea if this is possible.
So something like this:
select ref.value('@recid', 'nvarchar(25)') recid,
       ref.value('.', 'XML') XMLData -- this gives an error
into #TMP_XML
from @pXML.nodes('//headers/header') R(ref)

insert into myHeaderTable ( recid, recdesc )
select recid, ref.value('@recdesc', 'nvarchar(250)') recdesc
from #TMP_XML CROSS APPLY XMLData.nodes('/header') R(ref)

update #TMP_XML
set XMLData.modify('replace ....')
from myheadertable
where #TMP_XML.recid = myheadertable.recid
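As an aside, the error on the second column comes from value(), which cannot return the xml type; query('.') extracts a subtree instead. But the per-row modify() can be avoided altogether by shredding once and rebuilding the XML with FOR XML, joining the sequences in from the table. A sketch using the names from the post:

-- Shred the headers once into a temp table.
SELECT ref.value('@recid', 'nvarchar(25)')    AS recid,
       ref.value('@recdesc', 'nvarchar(250)') AS recdesc
INTO   #hdr
FROM   @pXML.nodes('//headers/header') R(ref);

INSERT INTO myHeaderTable (recid, recdesc)
SELECT recid, recdesc FROM #hdr;

-- Rebuild the XML in one statement with recseq filled in; no modify() loop.
SET @pXML =
    (SELECT h.recid   AS '@recid',
            h.recdesc AS '@recdesc',
            t.recseq  AS '@recseq'
     FROM   #hdr h
     JOIN   myHeaderTable t ON t.recid = h.recid
     FOR XML PATH('header'), ROOT('headers'), TYPE);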
Hello friends, I need a suggestion. I am currently working on a reporting website that generates reports, and I need to store all the reports in the database.
I usually go with row-wise processing as it can be easily controlled, but the problem is that there will be a lot of reports, an estimated 30,000 rows a month, and I'm not sure whether SQL Server can hold more than 2 billion rows per table.
I will just explain the whole scenario and the tricky problem I'm facing.
We have XML files coming at regular intervals from another source into SQL Server 2000, daily carrying roughly 10,000 to 70,000 records, and a job scheduled to run at regular intervals, applying some filter criteria. The flow goes from a staging table into a secondary table and then finally into the primary table. We designed the DTS package accordingly: take the records from the staging table, put them into the secondary table, and then into the primary table (about 8 tasks are involved). Suppose an XML file arrives at 8:30 am and our DTS package runs at 9:00 am, then 11:00 am, then 1:00 pm, and so on. What I have observed over many days is that after the job runs successfully at 9:00 am, some good data is still pending in the secondary table, not processed into the primary table. But when the job runs again at 11:00 am, it processes those pending good records into the primary table. Sometimes, when I run the job manually from the DTS designer, the good data pending in the secondary table is processed!
My question is: why doesn't this job process all the good records in a single run?
Hello everyone. I need help regarding the following. Given this table:

CREATE TABLE T1 (C1 nvarchar(10), C2 money)
INSERT INTO T1 VALUES ('A', 1)
INSERT INTO T1 VALUES ('B', 2)
INSERT INTO T1 VALUES ('C', 3)

Let's say I have this table on a local server and I want to upload it to a remote server, into a database that contains the same table. The uploading part can be done by another application on the remote server, but what I need is a way to transfer the data in the fastest possible way. What steps do I need to follow?
TIA, Rey Guerrero
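Two standard options, sketched: for large volumes the bcp utility (native-mode export on the local server, import on the remote) is usually fastest; for a small table like this, an insert over a linked server is the simplest. This assumes a linked server named REMOTE has been configured:

-- Push the rows to the remote copy of the table over a linked server.
INSERT INTO REMOTE.TargetDb.dbo.T1 (C1, C2)
SELECT C1, C2 FROM dbo.T1;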
Which component should be used to process a dataset as a whole rather than row by row? I need to process a dataset conditionally (the condition being based on the dataset itself): e.g. if a special row is present in the dataset, the dataset should be processed in a special way. Should I perhaps use one Script Transformation to determine whether the dataset satisfies the condition (and store that result in a variable), and then, based on that condition (variable value), perform the processing or not (using a Conditional Split and a Script Transformation)?
I am having trouble trying to construct the following process in SSIS/SQL 2005:
1. Grab a set of unprocessed rows (ProcessDT = null) in an 'Action' table.
2. For each of these rows, execute multiple stored procedures based on the action type:
   If actiontype = 1:
       exec spAct1a @param1, @param2
       exec spAct1b @param1, @param2, @param3, @param4
   If actiontype = 2:
       exec spAct2a @param1, @param2, @param3
       exec spAct2b @param1, @param2, @param3
   etc....
3. Update ProcessDT so it's not processed again.
4. Repeat until all rows are processed.
Note - all SP parameters are contained in additional columns in the Action table. Basically the Action table is a store for post-event processing of sorts, but it is order dependent, hence the row-by-row processing. And some of my servers are SQL 2000, so Service Broker is not an option (yet).
I first attempted to do this entirely within the control flow, using an ADO recordset and a Foreach Loop container, but I could not figure out how to run conditional process paths based on the ActionTypeID. I then tried to do it within the data flow using an OLE DB source, a Conditional Split, and an OLE DB Command, which almost got me there; the problem is that for each row I need to execute multiple SPs, and the OLE DB Command appears to give me only one.
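Since the processing has to stay order dependent anyway, one fallback worth considering is a plain T-SQL loop run from a single Execute SQL Task instead of the data flow. A sketch, with all table and column names (and parameter types) assumed from the description:

DECLARE @id int, @type int,
        @param1 int, @param2 int, @param3 int, @param4 int;

WHILE EXISTS (SELECT 1 FROM dbo.[Action] WHERE ProcessDT IS NULL)
BEGIN
    -- Oldest unprocessed row first, to keep the order dependency.
    SELECT TOP (1)
        @id = ActionID, @type = ActionTypeID,
        @param1 = Param1, @param2 = Param2, @param3 = Param3, @param4 = Param4
    FROM dbo.[Action]
    WHERE ProcessDT IS NULL
    ORDER BY ActionID;

    IF @type = 1
    BEGIN
        EXEC dbo.spAct1a @param1, @param2;
        EXEC dbo.spAct1b @param1, @param2, @param3, @param4;
    END
    ELSE IF @type = 2
    BEGIN
        EXEC dbo.spAct2a @param1, @param2, @param3;
        EXEC dbo.spAct2b @param1, @param2, @param3;
    END

    -- Mark the row processed so the loop advances.
    UPDATE dbo.[Action] SET ProcessDT = GETDATE() WHERE ActionID = @id;
END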
I'm populating a new table based on information in an existing table. The new table is a list of all "items" and contains a primary key. The old table is a database of receipts where items can appear many times in any order.
I have put together the off-the-shelf components to do this, using a lookup transformation to see if the item is already in the new table. Problem is, because there's so much repetition in the old table I need to process the old table one row at a time. Batch processing is generating errors because the lookup doesn't detect duplicates within the buffer.
I tried setting the "DefaultBufferMaxRows" property of the task to 1, but that doesn't seem to have any effect.
To get the data from the old table, I'm using an OLE DB source. To get the data into the new table, I'm using the OLE DB Command transformation with parameters to execute an INSERT statement.
This is a job I have to do exactly once, so I don't care if I have to run it overnight. I'd rather have a simple, easy to understand but inefficient script so I understand what it's doing completely.
Any help on forcing SSIS to process one row at a time?
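Since this is a one-off job, it may be simpler to skip the data flow entirely and let one set-based statement handle the de-duplication, which sidesteps the lookup-buffer problem. A sketch, with table and column names assumed:

-- Insert each distinct item from the receipts table exactly once.
INSERT INTO dbo.Items (ItemName)
SELECT DISTINCT r.ItemName
FROM dbo.Receipts r
WHERE NOT EXISTS (SELECT 1 FROM dbo.Items i WHERE i.ItemName = r.ItemName);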
I am verifying my reports' processing times. I get the information from the Reporting Services database, from the [ExecutionLogs] table. I have the following information:
[TimeEnd] - time that report generation ends.
[TimeStart] - time that report generation starts.
[TimeDataRetrieval] - amount of time spent running the data sources.
[TimeProcessing] - time spent processing the report.
[TimeRendering] - time spent generating the output format.
If this information is correct, the following statement should be true: [TimeEnd] - [TimeStart] = [TimeDataRetrieval] + [TimeProcessing] + [TimeRendering].
When using the AS processing task with a connection to "an Analysis Services project in this solution", only some processing options are available for processing dimensions. For instance, it is not possible to select "Process Update". Once I change the connection manager to point to the deployed cube database, I can choose from all the options. Is this by design?
I need help from SQL experts with the following problem.
DOCTYPE        DATE   QTY  PRD  LOT
Purchase       1 jan  20+  AA   2007FW
Purchase       4 jan  50+  AA   2007SS
Purchase       9 jan  10+  AA   2007FW
Sale           3 jan  10-  AA
Sale           4 jan  20-  AA
Returned Good  4 feb  10   AA
As you can see, I don't have the LOT code in the sales records, so I must update those records with the following logic:
I have to assign LOT codes to sales in FIFO order. From the table above: on the 3rd of January I find the first sale, of 10; I take the first LOT (in ascending date order), check the quantity on hand (20), consume 10 from that LOT (10 remaining), and update the sale record with the 2007FW LOT code.
Then I find the next sale, of 20. As before, I take the first LOT with quantity on hand to consume; again it's the first record, with only 10 remaining, so I set that LOT's quantity on hand to zero, but I still have 10 more to allocate. So I find the next available LOT, of 50, and consume 10 from it, leaving 40.
It's important to remember that I can sell part of a lot, and I have to track the remaining goods per LOT.
Is it possible to do this in SQL in batch mode? How? If I have to split a sale record across 2 or more LOTs, how can I do that? Can you show me SQL or good hints?
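For what it's worth, FIFO allocation can be done in one set-based statement by giving every purchased and sold unit a running position and intersecting the intervals. A sketch under assumed table names, Purchases(PurchDate, Qty, Lot) and Sales(SaleID, SaleDate, Qty), for a single product with unique purchase dates:

WITH P AS (          -- each lot covers units [RunFrom, RunFrom + Qty)
    SELECT Lot, Qty,
           (SELECT COALESCE(SUM(p2.Qty), 0) FROM Purchases p2
            WHERE p2.PurchDate < p.PurchDate) AS RunFrom
    FROM Purchases p
),
S AS (               -- each sale covers units [RunFrom, RunFrom + Qty)
    SELECT SaleID, Qty,
           (SELECT COALESCE(SUM(s2.Qty), 0) FROM Sales s2
            WHERE s2.SaleID < s.SaleID) AS RunFrom
    FROM Sales s
)
SELECT s.SaleID, p.Lot,
       -- overlap of the two intervals = units this sale takes from this lot
       CASE WHEN s.RunFrom + s.Qty < p.RunFrom + p.Qty
            THEN s.RunFrom + s.Qty ELSE p.RunFrom + p.Qty END
     - CASE WHEN s.RunFrom > p.RunFrom
            THEN s.RunFrom ELSE p.RunFrom END AS QtyFromLot
FROM S s
JOIN P p
  ON s.RunFrom < p.RunFrom + p.Qty
 AND p.RunFrom < s.RunFrom + s.Qty
ORDER BY s.SaleID, p.RunFrom;

A sale that spans two lots simply produces two rows, which also answers the question about splitting a sale record.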
I have 3 cubes in a single SSAS database and these cubes should be processed using the following schedule
Cube 1 - Every Day
Cube 2 - Every Week
Cube 3 - Every Month
Cube 4 - Every Day
The issue I face is that these cubes share dimensions, so I can't do a FULL process of these SHARED dimensions, as it will affect the other cubes.
I can expect additions and deletions to my dimension data, but the structure remains the same. It would be great if someone could suggest how to go about processing the dimensions. I am confused by the number of options (Process Incremental, Process Update, etc.) available for processing them.
I will be creating an SSIS package to automate the processing. One more question: say Cube 2 fails on a day when Cube 1 has already processed successfully earlier the same day; how do I revert to the old state of Cube 2? Does this mean I need to back up the SSAS database before processing each cube?
I am in the midst of designing a new data warehouse system. As we get further into the design of the system, we are realising how complex our ETL is going to be, and that the time it will take to run could be significant, i.e. a few days! Obviously I don't want downtime in my relational system for that long, preventing my users from accessing the data for days at a time each month. So what functionality should I be looking at to maintain a working copy of the data that users can query while I perform database updates, followed by a quick promotion of the updated data to users for querying?
If you can point me in the direction of the right functionality in SQL Server 2005 and possible some relevant white papers that cover this sort of scenario I would be grateful.
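One common pattern, sketched here with placeholder names: keep two copies of each large table and point users at the active one through a synonym, so the switch after a long ETL run is a near-instant metadata change rather than a data move:

-- Users always query dbo.FactSales, which is a synonym.
-- While dbo.FactSales_A serves readers, the ETL loads dbo.FactSales_B;
-- then the synonym is repointed in one short transaction:
BEGIN TRANSACTION;
DROP SYNONYM dbo.FactSales;
CREATE SYNONYM dbo.FactSales FOR dbo.FactSales_B;
COMMIT TRANSACTION;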
Stored procedure (this works):

PROCEDURE dbo.VerifyZipCode
    @CZip nvarchar(5),
    @CZipVerified nvarchar(15) output  -- widened from nvarchar(5); 'Not Valid Zip' would otherwise be truncated
AS
    IF (EXISTS (SELECT ZIPCode FROM ZipCensus11202006 WHERE ZIPCode = @CZip))
        SET @CZipVerified = 'Yes'
    ELSE
        SET @CZipVerified = 'Not Valid Zip'
RETURN

Need help calling the SP and processing its output:
protected void C_Search_btn_Click(object sender, EventArgs e)
{
    SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["localhomeexpoConnectionString2"].ConnectionString);
    SqlCommand cmd = new SqlCommand("VerifyZipCode", con);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@CZip", UserZipCode_tbx.Text);
    . . . how do you set up the output parameter, call the SP and capture the output? . . .
    if (@CZipVerified == "Not Valid Zip")
    {
        TextBox5.Text = "Zip code not valid please re-enter zip";
    }
    else
    {
        // Continue processing page
    }
}
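For what it's worth: on the ADO.NET side the output parameter is added as a SqlParameter with Direction = ParameterDirection.Output, the call is cmd.ExecuteNonQuery() between con.Open() and con.Close(), and the result is then read from cmd.Parameters["@CZipVerified"].Value. The equivalent T-SQL call, as a sketch with a sample zip code:

-- Call the procedure and capture the output parameter.
DECLARE @verified nvarchar(15);
EXEC dbo.VerifyZipCode @CZip = N'90210', @CZipVerified = @verified OUTPUT;
SELECT @verified;   -- 'Yes' or 'Not Valid Zip'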
How would I go about checking incoming e-mails? For example, at a certain e-mail address I would receive e-mails formatted in a certain way. Depending on the content, some scripts need to run, some SQL tables get updated, etc. How can one do this in (ASP) .NET with SQL Server? Has anyone done this kind of thing before?