In terms of scaling out Service Broker to hundreds of instances, would it be better from a performance perspective to have one queue receiving all the messages (obviously with a high value for MAX_QUEUE_READERS), or to create a number of queues and spread the messages across them? Or is there no significant difference?
The one reason I am leaning towards multiple queues is that if poison messages are found, or something else goes wrong with a queue, not all messages are affected. However, creating multiple queues makes the design more complex and requires more administration.
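For concreteness, a minimal sketch of the single-queue option; the queue and procedure names here are made up, not from the post:

-- Option 1: one queue, many parallel readers. SQL Server launches up to
-- MAX_QUEUE_READERS copies of the activation procedure as load increases.
CREATE QUEUE dbo.WorkQueue
    WITH STATUS = ON,
         ACTIVATION (
             STATUS = ON,
             PROCEDURE_NAME = dbo.proc_ProcessMessages,  -- hypothetical name
             MAX_QUEUE_READERS = 50,
             EXECUTE AS OWNER );
-- Option 2 would be several identical queues (WorkQueue1..N), each with a
-- lower MAX_QUEUE_READERS, with senders spreading conversations across them.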
I am developing a Service Broker application; it has 20 queues, and I have one activation procedure which carries out the following actions:
CREATE PROCEDURE proc_ms_Activation
AS
BEGIN
    DECLARE @message_type_name sysname, @message_body varbinary(max),
            @conversation_handle uniqueidentifier;
    WAITFOR (RECEIVE TOP (1) @message_type_name = message_type_name,
             @message_body = message_body, @conversation_handle = conversation_handle
             FROM Queue1), TIMEOUT 5000;
    -- process the message, save statistics, etc.
END
The problem is that when I execute WAITFOR (RECEIVE TOP (1) message_type_name, message_body, conversation_handle FROM Queue1), I have to specify a concrete queue name, for example "... FROM Queue1". I would like to use one procedure for processing all of the queues.
How can I design my application so that one activation procedure processes messages from all of the queues?
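One possible design (a sketch, assuming SQL Server 2005 or later): have the shared activation procedure look up which queue activated the current session in sys.dm_broker_activated_tasks, then build the RECEIVE dynamically:

-- Find the queue that activated this session, then RECEIVE from it.
DECLARE @queue nvarchar(512), @sql nvarchar(max);

SELECT @queue = QUOTENAME(s.name) + N'.' + QUOTENAME(q.name)
FROM sys.dm_broker_activated_tasks AS t
JOIN sys.service_queues AS q ON q.object_id = t.queue_id
JOIN sys.schemas AS s ON s.schema_id = q.schema_id
WHERE t.spid = @@SPID;

SET @sql = N'WAITFOR (RECEIVE TOP (1) message_type_name, message_body, '
         + N'conversation_handle FROM ' + @queue + N'), TIMEOUT 5000;';
EXEC (@sql);  -- returns the received message as a result set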
I have spent days searching the web and forums for an answer to this simple question and cannot find an example.
I have built a Service Broker application on SQL Server 2005. The application puts some XML on an incoming queue, which is basically a few parameters to be used in a query. This queue will then call a stored proc which does some business logic and puts the results, also as XML, on another queue.
I have written a test harness in SQL to put messages on the inbound queue and then some sql to retrieve the returned code from the outbound queue.
What I want to do is be able to convert the SQL which does this into .net code to be used by an application. i.e. write in .net some code to put xml on a queue and then write some .net code to retrieve xml from another queue.
I wouldn't have thought this would be a difficult thing to do, and it must have been done hundreds of times, but I am unable to find anything that simply sends and retrieves XML to and from Service Broker queues...
Thanks for your help... it's really needed. I found some links, but they are really vague, and they are often doing SELECT statements in Service Broker or something like that. I don't want to call any other SQL; I just want to send and receive XML on the queues.
Any example code that does this would be really helpful.
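For what it's worth, the .NET side can just execute short T-SQL batches like the following via SqlCommand; a sketch with placeholder service, contract, queue and message-type names (none of these names are from the original post), with the XML passed in as a parameter:

-- Batch 1: enqueue the request XML (pass @xml as a SqlParameter from .NET).
DECLARE @h uniqueidentifier;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [RequestService] TO SERVICE 'ProcessingService'
    ON CONTRACT [RequestContract] WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE [RequestMessage] (@xml);

-- Batch 2: dequeue the result XML; .NET reads it back as a result set.
WAITFOR (RECEIVE TOP (1) CAST(message_body AS xml) AS result_xml,
         conversation_handle FROM [ResultQueue]), TIMEOUT 5000;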
What is best practice for what this number should be?
I have seen guidance saying to set it to a number greater than 25000, but not from any source I particularly know or trust. (I checked SANS, NIST-CIS and the NSA, but I couldn't find anything.)
I'm just getting around to understanding notification services in sql 2005 and I have been working through some examples. I'm curious as to why there are always two queues and two corresponding services being set up even when both queues/services exist in the same database. Here is my ultimate goal. I want to have triggers put messages on the queue for various stored procedures to handle asynchronously - i.e. table xyz is updated, and I need a stored proc to take the updated values and possibly generate or update rows in another table. When I tried to set up one queue and one service, nothing ever seems to get put on the queue:
Create Message Type TestMessageType Validation = Well_Formed_XML;
Create Contract TestContract (TestMessageType Sent By Any);
Create Queue TestQueue;
Create Service TestService ON Queue TestQueue(TestContract);
Begin Tran;
Declare @DialogHandle UNIQUEIDENTIFIER;
Begin Dialog Conversation @DialogHandle
    From Service TestService
    To Service 'TestService'
    On Contract TestContract;
Send On Conversation @DialogHandle
    Message Type TestMessageType (N'<message>Hello World</message>');
Commit Tran; -- without a commit, the message is never delivered
I'm trying to do a very simple example of sending a message from Initiator queue to Target queue. The result is no messages are delivered.
Here's the code:
DECLARE @conversationHandle UNIQUEIDENTIFIER
BEGIN DIALOG CONVERSATION @conversationHandle
FROM SERVICE GmiInitiatorService
TO SERVICE 'GmiTargetService'
ON CONTRACT GmiContract
WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @conversationHandle MESSAGE TYPE GmiMessage ('test');
END CONVERSATION @conversationHandle;
All three queues are empty (Initiator, Transmission, Target). When I comment out the last line ("end conversation"), the messages get stuck in the Initiator queue.
We can create a queue on a specific filegroup for performance reasons; however, what about sys.transmission_queue, which could have messages backing up when the target is down?
Hi. Got a strange problem. For some reason our web client box times out occasionally, maybe a few times a day. Nothing appears in the logs. What I do know is that the disk queue ramps up high (think top of the graph), the processor queue jumps up, and the TPS drops to 0 (naturally enough!) along with reduced pagefile usage. Usually at this time the TPS is between 20 and 300. Running a pair of mirrored 18 GB SCSI disks for the whole server (yeah, yeah) and a whopping 640 MB of memory. Oh, and a single PIII 1 GHz. SQL 2000 Standard vanilla, Server 2000. What server trace events is it worth my catching to try and get a better idea of what may be causing this? No major jobs (backups etc.) seem to be occurring at these times. Cheers
Hi everyone! I have a very brief question... I have 10 queues in my database and each of them is sent an equal number of messages. There are times when they execute/activate the stored procedures very fast, but there are times when they don't. Does anyone have an idea why this happens?
Thank you very much for taking the time to read my post. :)
We have problems with duplicate messages in Service Broker queues. We have tried the "fire and forget" method.
Scenario: the initiator does SEND and END CONVERSATION, the target does RECEIVE and END CONVERSATION, and a SQL Server Agent job runs every minute doing END CONVERSATION with cleanup against sys.conversation_endpoints.
We are running 5,000-10,000 messages per minute.
When the cleanup job is running we get some duplicate messages in the target queue.
Environment: two separate machines running SQL Server 2005 Standard Edition SP2.
Initiator Machine
CREATE MESSAGE TYPE [TransactionStatisticsSend] AUTHORIZATION [dbo] VALIDATION = WELL_FORMED_XML
CREATE CONTRACT [TransactionStatistics] AUTHORIZATION [dbo] ([TransactionStatisticsSend] SENT BY ANY)
CREATE QUEUE [dbo].[TransactionStatisticsQueueActive] WITH STATUS = ON , RETENTION = OFF ON [PRIMARY]
CREATE SERVICE [TransactionStatisticsServiceActive] AUTHORIZATION [dbo] ON QUEUE [dbo].[TransactionStatisticsQueueActive] ([TransactionStatistics])
CREATE REMOTE SERVICE BINDING [RSB::TransactionStatisticsServiceStat] AUTHORIZATION [dbo] TO SERVICE N'TransactionStatisticsServiceStat' WITH USER = [Proxy::BtsDebitServiceStat] , ANONYMOUS = OFF
ALTER PROCEDURE [dbo].[Bts_SP_TransactionStatPrepare] (@TransferID varchar(30)) AS
DECLARE @RowCount int, @ErrorSave int, @Msg xml, @dialog_handle uniqueidentifier;
BEGIN
    BEGIN TRY
        SET @Msg = (SELECT Regtime, SendPartShortName, RecPartShortName, MsgType
                    FROM btslogactive.dbo.StatTransferlog Tl
                    WHERE TransferID = @TransferID FOR XML RAW);
        BEGIN DIALOG CONVERSATION @dialog_handle
            FROM SERVICE [TransactionStatisticsServiceActive]
            TO SERVICE 'TransactionStatisticsServiceStat'
            ON CONTRACT [TransactionStatistics] WITH ENCRYPTION = OFF;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE [TransactionStatisticsSend] (@Msg);
        END CONVERSATION @dialog_handle;
    END TRY
    BEGIN CATCH
        SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
    END CATCH
END
Target Machine
CREATE MESSAGE TYPE [TransactionStatisticsSend] AUTHORIZATION [dbo] VALIDATION = WELL_FORMED_XML
CREATE CONTRACT [TransactionStatistics] AUTHORIZATION [dbo] ([TransactionStatisticsSend] SENT BY ANY)
CREATE QUEUE [dbo].[TransactionStatisticsQueueStat] WITH STATUS = ON , RETENTION = OFF , ACTIVATION ( STATUS = ON , PROCEDURE_NAME = [dbo].[Bts_SP_TransactionStatUpdateBroker] , MAX_QUEUE_READERS = 1 , EXECUTE AS N'dbo' ) ON [PRIMARY]
CREATE SERVICE [TransactionStatisticsServiceStat] AUTHORIZATION [dbo] ON QUEUE [dbo].[TransactionStatisticsQueueStat] ([TransactionStatistics])
ALTER PROCEDURE [dbo].[Bts_SP_TransactionStatUpdateBroker]
AS
DECLARE @MessageType sysname, @ConversationHandle uniqueidentifier, @MessageBody xml;
DECLARE @RegTime datetime, @SendPartShortName varchar(30), @RecPartShortName varchar(30),
        @MsgType varchar(30), @SeqNo bigint, @RC int; -- @MsgType and @SeqNo are used below but were missing from the declarations
BEGIN TRANSACTION;
WHILE (1 = 1)
BEGIN
    WAITFOR (RECEIVE TOP (1)
                 @MessageType = message_type_name,
                 @MessageBody = message_body,
                 @ConversationHandle = conversation_handle,
                 @SeqNo = message_sequence_number
             FROM [TransactionStatisticsQueueStat]), TIMEOUT 5000;
    IF (@@ROWCOUNT = 0) BEGIN ROLLBACK TRANSACTION; RETURN; END
    IF (@MessageType = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog')
    BEGIN
        END CONVERSATION @ConversationHandle; BREAK;
    END
    ELSE IF (@MessageType = 'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
    BEGIN
        END CONVERSATION @ConversationHandle; BREAK;
    END
    ELSE IF (@MessageType = 'TransactionStatisticsSend')
    BEGIN
        BEGIN TRY
            SET @RegTime = @MessageBody.value('(/row/@Regtime)[1]', 'datetime');
            SET @SendPartShortName = @MessageBody.value('(/row/@SendPartShortName)[1]', 'varchar(30)');
            SET @RecPartShortName = @MessageBody.value('(/row/@RecPartShortName)[1]', 'varchar(30)');
            SET @MsgType = @MessageBody.value('(/row/@MsgType)[1]', 'varchar(30)');
            -- ...
        END TRY
        BEGIN CATCH
            SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
            END CONVERSATION @ConversationHandle; BREAK;
        END CATCH
        -- EXEC @RC = SP_XXXXX
        END CONVERSATION @ConversationHandle;
        BREAK;
    END
END
COMMIT TRANSACTION;
RETURN;
SQL Server Agent job procedure
ALTER PROCEDURE [dbo].[Bts_SP_Del_Conversation_Endpoints]
AS
BEGIN
    DECLARE c_PurgeConversationEndpoints CURSOR FAST_FORWARD FOR
        SELECT conversation_handle
        FROM sys.conversation_endpoints WITH (NOLOCK)
        WHERE is_system = 0 AND [state] = 'CD';  -- CD = closed
    OPEN c_PurgeConversationEndpoints;
    DECLARE @DialogHandle uniqueidentifier;
    FETCH NEXT FROM c_PurgeConversationEndpoints INTO @DialogHandle;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        END CONVERSATION @DialogHandle WITH CLEANUP;
        FETCH NEXT FROM c_PurgeConversationEndpoints INTO @DialogHandle;
    END
    CLOSE c_PurgeConversationEndpoints;
    DEALLOCATE c_PurgeConversationEndpoints;
END
OK, I'm stumped. I have a database with a table that stores documents; each document has a primary key. For years, I've been notifying an external program that a new document has arrived in the DB by firing a trigger on the table. The trigger calls xp_cmdshell, which calls an external program, which enqueues a message on a message queue.
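Since this thread is about Service Broker, a sketch of what such a trigger could do instead of shelling out, with hypothetical table, service, contract, and message-type names (none of them are from the original post):

-- Hypothetical trigger: enqueue a Service Broker message per new document
-- instead of calling xp_cmdshell.
CREATE TRIGGER trg_Documents_Insert ON dbo.Documents
AFTER INSERT
AS
BEGIN
    DECLARE @h uniqueidentifier, @msg xml;
    SET @msg = (SELECT DocumentID FROM inserted FOR XML RAW, TYPE);
    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [NewDocNotifier] TO SERVICE 'NewDocService'
        ON CONTRACT [NewDocContract] WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @h MESSAGE TYPE [NewDocMessage] (@msg);
END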
I have read from a variety of sources that using the "WITH CLEANUP" option on an "END CONVERSATION" statement is bad and unnecessary. (Question #1) Is this true???
My code does not work properly if I don't use the "WITH CLEANUP" option. My code leaves closed conversation records in the queues if I leave out the "WITH CLEANUP" option. The "END CONVERSATION" statement is executing properly and flagging the conversation record as closed but the records don't get deleted. All the messages are going back and forth properly too.
My code is based on the HelloWorld Service Broker sample, which does not use "WITH CLEANUP". When I run the sample scripts everything works great and the conversation records are deleted. However, this sample does not use an activation stored procedure to receive messages and respond with the results. The problem comes up when I copy and paste the receive-messages sample code into an activation stored procedure. It's the same code! (Question #2) Why am I getting different results depending on how the code is executed/activated???
This could be a tough one. I just hope somebody else has seen it too and figured it out. Thanks for the help!
Is there a way for a .NET application to receive a notification when a service broker queue has been updated with a new message? I tried using SqlDependency on an SB queue but I got an "invalid" error in my notification handler.
Such a notification would be much better than having to poll the queue every N seconds.
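One commonly suggested alternative to both polling and SqlDependency (sketched here with a hypothetical queue name) is a blocking WAITFOR (RECEIVE ...) issued from the application's own connection; the call returns as soon as a message arrives or the timeout elapses:

-- Blocks up to 60 seconds; returns immediately when a message arrives.
-- The application loops on this batch instead of polling on a timer.
WAITFOR (
    RECEIVE TOP (1) conversation_handle, message_type_name,
           CAST(message_body AS xml) AS body
    FROM dbo.MyQueue  -- hypothetical queue name
), TIMEOUT 60000;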
I've been experimenting with the new SQL Server Service Broker, and I think I've discovered some interesting behavior. Service Broker relies on "Queues" to store messages that need to be processed. Service Broker operates by sending a message from one Queue (the INITIATOR Queue) to another Queue (the TARGET Queue). A Queue can have an "Activation Stored Procedure" associated with it. This procedure is what actually processes the messages in the Queue.
The first behavior I observed relates to the setting of a Queue's RETENTION parameter. The RETENTION parameter indicates whether or not the Queue will retain a copy of the messages it receives. By default, the parameter's value is "OFF" (meaning it will not retain messages). In the Activation Stored Procedure of my TARGET Queue, I used "sp_send_dbmail" to send an e-mail message. I wanted to capture the "conversation_handle" (a uniqueidentifier that identifies a particular conversation) and include it in the body of the e-mail. I was unable to capture it, because the Queue's RETENTION parameter was "OFF". When I tried to capture the conversation_handle from the INITIATOR queue (whose RETENTION parameter was "ON"), I was successful. The moral of the story is that you apparently need RETENTION = "ON" if you need to capture information from a Queue.
The second behavior I observed relates to a Queue's MAX_QUEUE_READERS setting. This setting allows you to automatically invoke multiple instances of the Activation Stored Procedure. As messages come into the Queue, the Queue starts an additional instance of the Activation Stored Procedure, up to the number you specified in the MAX_QUEUE_READERS setting. This allows parallel processing of the messages in the Queue. There is also a programming technique called a "RECEIVE loop" which is used for processing messages. In the RECEIVE loop, the WAITFOR statement tells the Queue whether it should stay on constant alert for new messages or whether it should time out after a specified interval (the TIMEOUT is given in milliseconds).
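For illustration, a minimal receive loop of the kind described, with a made-up queue name; the TIMEOUT clause is what separates the two modes:

DECLARE @mt sysname, @body varbinary(max), @h uniqueidentifier;
WHILE (1 = 1)
BEGIN
    -- With a TIMEOUT the wait ends after 5 seconds; with no TIMEOUT clause,
    -- WAITFOR blocks indefinitely and the procedure stays resident in memory.
    WAITFOR (RECEIVE TOP (1) @mt = message_type_name, @body = message_body,
             @h = conversation_handle FROM dbo.TargetQueue), TIMEOUT 5000;
    IF (@@ROWCOUNT = 0) BREAK;  -- timed out; exit and let activation re-fire later
    -- ... process the message here ...
END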
If you have the Queue wait indefinitely (by not specifying a TIMEOUT value in the WAITFOR statement) and you have invoked multiple copies of the Activation Stored Procedure, the procedure will remain in memory indefinitely. Therefore, if you make a change to the code of the Activation Stored Procedure, the change will NOT be reflected in the Activation Stored Procedure until you change the STATUS of the Queue. I had changed my procedure so that it would not send an e-mail, but the e-mails kept coming. The e-mails did not stop until I executed an ALTER QUEUE statement. I ran "ALTER QUEUE queue_name WITH STATUS = OFF;" and then I ran "ALTER QUEUE queue_name WITH STATUS = ON;" After that, the changes were reflected in the procedure.
Be aware of this behavior as you design your Queues.
Our current project involves managing hierarchical NTFS folders structured by multi-level customers and the projects associated with them. Essentially, each folder level represents one level of the customer hierarchy or a project root. Both have different subfolders and user access rights based on generic XML templates. The folders reside on file servers across the country and should be accessible in the ordinary way NTFS file shares allow. LAN/intranet, MS AD, Win2003/SQL2005 environment. The folder management system basically has to keep the folder structure in sync with changes in the underlying managing application logic. That involves such operations as creating a new folder with subfolders, copying a folder with its content into another folder branch (which may be on the same or another server and place), deleting a folder and its content, renaming a folder, and applying NTFS access rights to folders and subfolders for users according to generic templates. As all these actions are unreliable and some tasks may take hours to complete, the SSB approach seems to be a viable solution. Some of the tasks involved have to be done within 10 minutes; others (the prolonged ones) have to be scheduled for an overnight run. A Windows service like the ExternalActivator (EA) sample may be used to start the actual NTFS-related tasks. Content transfer involves WMI remoting with the robocopy tool on the target machine (for better network utilization); other related tasks make use of WMI APIs and probably make direct (i.e., synchronous) calls to the remote target file server. At this stage, making atomic executable modules that each do just one functionally isolated task, like DirCopy, DirCreator, DirRename, DirDelete, DirUserAccess, seems like the logical choice. The questions start to arise when planning the SSB queues and adapting the ExternalActivator sample to run these atomic executables. The problem is that if SSB messages contain atomic tasks for these executables, they need to be synchronized in two ways: by execution precedence (create or copy the directory first, then apply users' access rights) and transactionally (only when all related tasks succeed can the appropriate feedback action and event-log writing be taken).
I can imagine the two implementation scenarios below. Case A: create a common queue for directory creation, content transfer and access rights. In that case the ExternalActivator has to be either extended and redesigned, or it has to activate another Activator-Coordinator (middleware) executable that would actually read the queue and, based upon the message type, run the appropriate atomic executable. In that scenario, the queue processing could stall: if the previous task was a long-running content transfer, applying access rights can't be started before the transfer task has finished OK. That in turn will probably require running multiple instances of the atomic tasks and using conversation groups, wouldn't it? What would be the most reliable and simple way to achieve that?
Case B: dedicated queues for each atomic executable module. Neither EA changes nor an extra Activator-Coordinator middleware executable is required. But because the atomic task-oriented queues are not synchronized with each other, the queues' internal conversation groups wouldn't help much... That means if a directory does not yet exist, the userRights module has to wait. But if we are transferring directories from path X to Y, based on what does the userRights module know to wait? With zero asynchronous-design experience I'm lost here... I hope I described the application domain understandably; thanks for hints leading to a working solution!
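One property that may help (a sketch under assumed names, not a full design): Service Broker delivers messages sent on the same conversation in order, exactly once, so the steps for one folder can be sent on a single conversation and processed sequentially by the target:

-- All steps for one folder travel on one conversation, so the target
-- processes CreateDir before ApplyRights, in send order.
DECLARE @h uniqueidentifier;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [FolderTaskSubmitter] TO SERVICE 'FolderTaskWorker'
    ON CONTRACT [FolderTaskContract] WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE [CreateDir]   (N'<dir path="X"/>');
SEND ON CONVERSATION @h MESSAGE TYPE [ApplyRights] (N'<dir path="X"/>');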
When setting up databases for end users, what's the best practice regarding who the dbo should be for each individual database: the user itself or a sysadmin?
Does it really have any importance at all who the owner (as defined by 'dbo') is?
1.- a list of all the terms that start with A%
2.- a list of all the related terms … that belong to terms that start with A%
For number 1, I am doing a select on the Terms table with WHERE term LIKE 'A%'.
For number 2, I am joining both tables and then once again doing a WHERE term LIKE 'A%'.
Would it be more efficient to take the first results and put them in a table variable, and then just do a join with the second table on RelatedTerms.TermID = Terms.TermID?
The number of records that generally come back is between 500 and 1,000.
Which would you consider the better approach? Or maybe there is an even better way?
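For concreteness, a sketch of the two variants, assuming hypothetical Terms(TermID, Term) and RelatedTerms(TermID, RelatedTerm) tables:

-- Variant 1: a single join, filtering both result sets in one statement.
SELECT t.TermID, t.Term, r.RelatedTerm
FROM Terms AS t
JOIN RelatedTerms AS r ON r.TermID = t.TermID
WHERE t.Term LIKE 'A%';

-- Variant 2: stage the matching terms in a table variable, then join.
DECLARE @hits TABLE (TermID int PRIMARY KEY);
INSERT INTO @hits (TermID)
    SELECT TermID FROM Terms WHERE Term LIKE 'A%';
SELECT t.TermID, r.RelatedTerm
FROM @hits AS h
JOIN Terms AS t ON t.TermID = h.TermID
JOIN RelatedTerms AS r ON r.TermID = h.TermID;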
I'm a newbie to SQL, but I want to learn it on my own. Is there any way I can learn SQL? Do I have to download a sample database from the internet? Do I need to have my own server to play with? Hopefully someone can shed some light on this.
Please point me to a web resource from where I can study: 1) writing complex queries, such as those involving HAVING, multi-level nested queries, GROUP BY, and T-SQL functions; 2) joins (a lot of practice); 3) stored procedures, transactions, cursors and triggers (I need some heavy-duty practice). Where can I get some good practice of the above? Also, please recommend a good SQL Server/T-SQL book in the light of the above requirements.
Folks - had a look around Google and, no surprises, never found what I was looking for.
I want to see a real-world, best-practice C# stored procedure for SQL 2005 (Express is what I am using, but I don't mind the SQL edition).
Almost everything I see is a "select * from table", which to be honest was my first stored proc many years ago - everything since has been fairly detailed.
I ask as I am sceptical, after years of trying to STOP building SQL queries in code (as it's hellish!), that the CLR technique really makes any kind of difference.
If someone has found that it HAS, I'd love to hear about it. The thought of:
SqlCommand cmd = new SqlCommand ( "My Whole Stored Proc as Text" );
... doesn't appeal, never mind the potential for debugging syntactical issues and so on.
I was excited by this until it became something I had to do in a real situation, and then I got a little worried. Should I be?
Create a table and name it Salary Information. Add an Employee Name and Salary column to the table. Create a column in the Employee table and name it Salary. Create a trigger that updates the Salary table with the employee's name and salary each time you insert data into the Salary column of the Employee table.
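A sketch of one way to satisfy the exercise, guessing at the intended schema (an Employee(Name, Salary) table and a SalaryInformation table; all names here are assumptions):

CREATE TABLE SalaryInformation (EmployeeName varchar(100), Salary money);

CREATE TRIGGER trg_Employee_Salary ON Employee
AFTER INSERT, UPDATE
AS
BEGIN
    -- Only copy rows when the Salary column was actually touched.
    IF UPDATE(Salary)
        INSERT INTO SalaryInformation (EmployeeName, Salary)
        SELECT Name, Salary FROM inserted;
END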
Hi, my database is growing over 1 GB, and I only have one .mdf to keep it all. Should I use a secondary data file for my data? Can I do that now? Thanks.
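If a secondary file does turn out to be appropriate, it can be added at any time; a sketch with a hypothetical database name and path:

-- Adds a secondary data file (.ndf) to an existing database.
ALTER DATABASE MyDatabase
ADD FILE (
    NAME = MyDatabase_Data2,                     -- logical name (assumed)
    FILENAME = 'D:\Data\MyDatabase_Data2.ndf',   -- path is an assumption
    SIZE = 512MB,
    FILEGROWTH = 256MB
);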
Good morning. I work for a company that sees a lot of people come and go. The one thing I have noticed is that people use their admin accounts to log into SQL and create stored procedures, views and databases. When a user leaves, I am stuck with all these objects that are owned by someone no longer working for the company. So my question to you guys is: what is the best practice to use in creating new objects? Thanks for your guru-ness!
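For the objects left behind, one option on SQL Server 2005 (shown with made-up names) is to transfer ownership to dbo:

-- Transfer a stray object to dbo (SQL Server 2005+ syntax).
ALTER AUTHORIZATION ON OBJECT::dbo.SomeLeaverProc TO dbo;

-- Or reassign an entire schema owned by the departed user:
ALTER AUTHORIZATION ON SCHEMA::SomeLeaverSchema TO dbo;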
Say I have a customer.aspx that allows a user to enter customer data. On customer.aspx, I have dropdownSalesRep, which allows the user to associate a sales rep with the customer. But some customers come to us directly, not through a sales rep, so I want the user to be able to specify "none". Is it best to have a dummy record in my SalesReps table called "none" with an ID of, say, 999, or is there some other better way to deal with this?
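The usual alternative to a dummy row is a nullable foreign key; a sketch with assumed table and column names:

-- NULL in SalesRepID means "no sales rep", without a placeholder row.
CREATE TABLE Customers (
    CustomerID int IDENTITY PRIMARY KEY,
    Name       varchar(100) NOT NULL,
    SalesRepID int NULL REFERENCES SalesReps (SalesRepID)
);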
Hi. We have developed a quite simple ASP.NET web page that fetches a number of pieces of information from a SQL 2005 database. We are having some problems though, because of a firewall that sits between the web server and the SQL server, and I think this is because of bad code on my part. I'm not that experienced yet, so I'm sure that there is much to learn. Usually when I do a query against a SQL database, I do something like this:

Function GO_FormatRecordBy(ByVal intRecordBy As Integer)
    Dim dbQueryString As String
    Dim dbCommand As OleDbCommand
    Dim dbQueryResult As OleDbDataReader
    Dim strName As String
    dbQueryString = "SELECT Name FROM tblRegistrators WHERE tblRegistratorsID = '" & intRecordBy & "'"
    dbCommand = New OleDbCommand(dbQueryString, dbConn)
    dbConn.Open()
    dbQueryResult = dbCommand.ExecuteReader(CommandBehavior.CloseConnection)
    dbQueryResult.Read()
    strName = dbQueryResult("Name") ' read the value before the reader is closed
    dbConn.Close()
    dbCommand = Nothing
    Return strName
End Function

Now, let's say that I have a DataList that I populate with integer values, and I want to "resolve" them from another table; then I call a function like the one above. I guess that this means that I open and close quite a lot of connections against the database server when I have a large table. Is there any better way of doing this? Should one open a database connection globally, in, say, the Global.asa file? Would that be a better approach? When I added the CommandBehavior.CloseConnection to the ExecuteReader statement, I noticed that it was a bit faster, and I think there were fewer connections in the database, so maybe there is more to "closing connections" than what I usually do. Any tips on this? Best regards, Johan Christensson
I recently started developing a web site for a client using storefront.net and ms sql server.
The DB schema of storefront.net has autonumbers as the PKs for the products table (even though the products table contains an additional field for product_number).
So here's my dilemma if you care to read:
I typically develop local, deploy remote (after testing). I have a local SQL server, and then the remote SQL server.
When I'm developing for this project, I'll insert data such as products into the products table (sometimes several times, while I'm working out routines to import data into the products table). This has the effect of creating a unique ID for each product based upon SQL Server's auto-incrementing INTs.
This StoreFront.net (SF.NET) has another table that is a lookup table. For each part number, it has a corresponding categoryID number.
Now, if I have product_ID 1234, and I set the category ID to, say, 10 and get it working on my local box, everything is fine.
Here's where the problem comes in: when I use DTS to transfer the database during remote deployment, each product is inserted into the remote DB's products table and gets a NEW product ID. Same with the categories.
This has the effect of breaking the relationships. (SF.NET has no referential integrity or relationships defined in the DB.) Let's say my product_ID 1234 gets put into the remote copy; it'll get a new product_ID (PK), say 5775. Now my category ID will also get a new value, so my data is no longer related.
I don't know how to handle this situation. The unique IDs generated on my local SQL Server will nearly always be different from those generated on the remote DB.
How do I handle this situation? Any advice, guys?
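One possible workaround (assuming the products table uses an IDENTITY column): copy the rows with their local key values preserved instead of letting the target regenerate them, e.g. with IDENTITY_INSERT. The linked-server path and column names below are placeholders:

-- Preserve the local surrogate keys while copying to the remote DB.
SET IDENTITY_INSERT dbo.Products ON;
INSERT INTO dbo.Products (product_ID, product_number, name /* ... */)
SELECT product_ID, product_number, name /* ... */
FROM LocalServer.StoreDB.dbo.Products;  -- linked-server path is an assumption
SET IDENTITY_INSERT dbo.Products OFF;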
Hi to everyone! I have to create a little auction system that runs on the web. Before starting development, I would like to be sure I'm using best practice. The main concern is avoiding conflicts on database updates with bids, i.e. if a user places his bid, I have to be absolutely sure that his bid is the highest at the moment the database is updated. If not, I have to refuse it. So I ask you: is using a transaction the best way of assuring non-conflicts? And do I have to be careful about some other aspect in the ASP.NET pages, or is there no problem with conflicts at the page level? Thank you very much in advance for any suggestions, and if anyone has something else to say about possible problems with auctions, I'll be glad to hear it!!! ;-)
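For what it's worth, a sketch of one common pattern (with an assumed Auctions table): an atomic conditional UPDATE, so the highest-bid check and the write happen as one statement:

-- Accept the bid only if it is still higher than the current high bid.
-- @@ROWCOUNT = 0 afterwards means the bid lost a race and was refused.
UPDATE dbo.Auctions
SET HighBid = @NewBid, HighBidderID = @BidderID
WHERE AuctionID = @AuctionID
  AND HighBid < @NewBid;

IF @@ROWCOUNT = 0
    RAISERROR('Bid refused: a higher bid already exists.', 16, 1);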
I have around 10 databases, currently residing on different platforms, which make up roughly a terabyte of information. I would like to migrate all of these DBs over so that they are all managed under one instance of SQL Server 2K. In my view this streamlines things a lot and reduces the cost of licensing/hardware.
However, is managing all of these databases on one clustered instance of SQL 2K the best approach from a performance standpoint? Would it be better to separate each database onto its own machine? I am under the impression that, given enough hardware (processors, RAM), using just one instance of SQL 2K Enterprise should be enough to manage this data. Is this correct? Is there an optimal model?
Money is always a concern but in this case, performance is the main objective. The size of the data managed will be growing significantly so the system should be scalable.
My background is as a developer, so I may not have provided enough detail for a good answer. Please ask questions if you need more. I am looking for suggestions on the best way to handle this.
Specifically I would like to know the preferred architecture as well as any suggested hardware.