Creating A Distributed Service Broker Application Without Using Certificates
Aug 14, 2007
Hi all,
I want to create a Service Broker application between two different instances.
Can I achieve both dialog security and transport security without using certificates?
Transport security can be achieved with Windows authentication (both instances are in the same domain),
so how do I achieve dialog security without using certificates?
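In case it helps, here is a minimal sketch of that setup, assuming both instances are in the same domain and that the endpoint name, port, service names and the other instance's service account below are placeholders: transport security comes from a Windows-authenticated endpoint, and with no remote service binding defined and ENCRYPTION = OFF on the dialog, no dialog certificates are needed (the conversation then relies on transport security alone).
Code Snippet
-- On each instance: a Service Broker endpoint authenticated with Windows credentials
-- (hypothetical names; works when both instances are in the same domain)
CREATE ENDPOINT BrokerEndpoint
    STATE = STARTED
    AS TCP (LISTENER_PORT = 4022)
    FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS, ENCRYPTION = REQUIRED);
GO
-- Allow the other instance's service account (placeholder login) to connect
GRANT CONNECT ON ENDPOINT::BrokerEndpoint TO [DOMAIN\OtherInstanceServiceAccount];
GO
-- With no REMOTE SERVICE BINDING for the target service, turning dialog
-- encryption off avoids the need for dialog certificates
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//example/InitiatorService]
    TO SERVICE '//example/TargetService'
    ON CONTRACT [//example/SampleContract]
    WITH ENCRYPTION = OFF;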
We are developing a distributed application based on a tree network topology. Each part of the application knows about only one parent part and all of its child parts. To synchronize data between the parts of this application we use SQL Service Broker.
Every part contains a database; let's name it "SB". This database contains one QUEUE for received and sent messages, a MESSAGE TYPE, a CONTRACT and a SERVICE. Every service has a unique name. There is also one REMOTE SERVICE BINDING and ROUTE for the parent part (service) and one for every child part (service). For each Source - Target link we create a separate conversation_handle:
"BEGIN DIALOG @ConversationHandle FROM SERVICE [Source] TO SERVICE 'Target' ON CONTRACT Contract WITH ENCRYPTION = OFF;"
Everything works fine; messages are sent and received quite well. But if two opposite messages are in flight at the same moment, one of them goes missing with 100 percent probability. Why does this happen? Is duplex not supported in Service Broker? Are we doing something wrong? Has anybody else encountered this problem?
We use RETENTION = ON for the queue so we can analyze messages after they are processed. The body of a message that was sent to several services (in one transaction) is missing in the service from which another message was being sent at that same moment.
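One thing worth checking (it may or may not apply here): if the receive loop pulls messages off the queue without looking at message_type_name, application messages that arrive alongside EndDialog or Error messages can be consumed and silently discarded. A hedged sketch of a type-aware loop, with SBQueue standing in for your queue name:
Code Snippet
DECLARE @handle UNIQUEIDENTIFIER, @msg_type SYSNAME, @body VARBINARY(MAX);
WHILE (1 = 1)
BEGIN
    BEGIN TRANSACTION;
    WAITFOR (RECEIVE TOP (1)
                 @handle   = conversation_handle,
                 @msg_type = message_type_name,
                 @body     = message_body
             FROM SBQueue), TIMEOUT 1000;          -- SBQueue is a placeholder name
    IF (@@ROWCOUNT = 0) BEGIN COMMIT; BREAK; END
    IF (@msg_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
        OR @msg_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
    BEGIN
        END CONVERSATION @handle;                  -- close out system messages
    END
    ELSE
    BEGIN
        PRINT CONVERT(NVARCHAR(MAX), @body);       -- process the application message here
    END
    COMMIT;
END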
I am trying to send messages between different server instances using Service Broker.
For security I am trying to create a certificate. I used makecert.exe and got a certificate and a private key, but when I create the certificate in SQL Server from those files it shows an error.
The code is:
CREATE CERTIFICATE ctfSourceServerMaster
    FROM FILE = 'C:\SourceServer.cer'
    WITH PRIVATE KEY ( FILE = 'C:\SourceServer.pvk', DECRYPTION BY PASSWORD = 'PrivateKeyPassword' )
    ACTIVE FOR BEGIN_DIALOG = ON
GO
I created the files SourceServer.cer and SourceServer.pvk using the makecert.exe tool.
The idea behind creating the certificate ctfSourceServerMaster is to provide transport security.
I am running this script in the master database, but I am still getting an error.
ERROR:----
The certificate, asymmetric key, or private key file does not exist or has invalid format.
Nothing happens and I don't know where to troubleshoot. If I check both queues they are empty. I have altered the queues so that retention is ON.
I have checked the SQL log of both instances and found nothing.
The SQL above executes successfully, so basically I am stuck: I can execute the dialog SQL and set activation on the target queue to OFF, so if a message gets there it should stay in the queue, but both queues are empty and there are no errors. I'm not sure where to even start checking the issue. For all I know the contract could be incorrect, port 4022 could be blocked, or the endpoints might not be working, but I don't know where to find these errors other than the SQL log, and it has no errors.
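A few places that usually show what is going on, sketched below with catalog views only: undelivered outbound messages sit in sys.transmission_queue with a reason in transmission_status, conversation state is visible in sys.conversation_endpoints, and the broker endpoint and routes can be checked from their catalog views.
Code Snippet
-- Outbound messages that have not been delivered yet, with the reason
SELECT conversation_handle, to_service_name, transmission_status
FROM sys.transmission_queue;
-- State of each conversation on the initiator side
SELECT conversation_handle, far_service, state_desc
FROM sys.conversation_endpoints;
-- Broker endpoint and routes actually in effect
SELECT name, state_desc, port FROM sys.tcp_endpoints WHERE type_desc = 'SERVICE_BROKER';
SELECT name, remote_service_name, address FROM sys.routes;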
I'm working with the April CTP of SQL Server and I'm trying to create a proof of concept using Service Broker. I'm struggling with the ABCs of it. If anyone has, or can point me to, a distributed "Hello, World" for Service Broker between SQL Server 2005 and SQL Express instances, it would save me some time and trouble.
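Not distributed, but as a first step a single-database "Hello, World" is often the quickest way to get the ABCs down before adding endpoints and routes between the two instances. A minimal sketch (all names invented, and the database must have Service Broker enabled):
Code Snippet
CREATE MESSAGE TYPE [//hello/Msg] VALIDATION = NONE;
CREATE CONTRACT [//hello/Contract] ([//hello/Msg] SENT BY INITIATOR);
CREATE QUEUE HelloInitiatorQueue;
CREATE QUEUE HelloTargetQueue;
CREATE SERVICE [//hello/Initiator] ON QUEUE HelloInitiatorQueue;
CREATE SERVICE [//hello/Target] ON QUEUE HelloTargetQueue ([//hello/Contract]);
GO
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//hello/Initiator]
    TO SERVICE '//hello/Target'
    ON CONTRACT [//hello/Contract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE [//hello/Msg] (N'Hello, World');
GO
RECEIVE TOP (1) CONVERT(NVARCHAR(MAX), message_body) AS body, conversation_handle
FROM HelloTargetQueue;
GO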
Some of my receiving-service-side applications usually work offline; they are only supposed to connect to the network once or a few times a day to send and receive Service Broker messages. Will my sending-service-side application (permanently connected to the network) keep retrying repeatedly to connect to the receiver? Is retrying like that time consuming? I don't like it trying over and over. Is the retry behavior configurable? Thanks.
Please excuse me if I get some terms wrong - I am not a developer!!
I am working for a company that needs to interface with a customer to query for work. The customer has established Web Services that it uses for external interfacing; these services are passive at their end, in that my clients initiate all the communications using SOAP and receive responses in XML.
There are, as I see it, seven conversations.
Receiving Datasets
1. Request All New Jobs - this will be a POST to an HTTP endpoint with a list of job numbers sent back; these jobs will then need to be retrieved from the same web service.
2. Job Changes - existing jobs that have changed definitions (costs etc.) - similar to the above.
3. Job Changes - Appointment Times
Sending Datasets
1. Completed Jobs - one at a time
2. Cancelled Jobs
3. Appointments (I believe these are responses to 1 and 3 above)
4. Subsequent Jobs - new work derived from initial Job.
I believe the sending datasets will be one way only (i.e. POST).
I have suggested using Service Broker for this application, with a custom app sitting in between Service Broker and the Web Service to handle the HTTP POSTs and GETs, communicating with Service Broker by posting to queues and reading from them.
How do I prevent the initiator Service Broker from hanging if the target Service Broker is not started?
Our setup has two Service Brokers (two databases); sometimes the target needs to be turned off. But in that situation the initiator Service Broker (in fact, the messages are sent from triggers) seems to hang. I want to prevent this and continue operating: the messages should queue and continue to be sent to the target Service Broker when it starts up again. How should I do this?
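For what it's worth, SEND itself does not wait for the target: messages for an unavailable target normally just accumulate in sys.transmission_queue and are retried automatically once the target comes back. If the trigger appears to hang, it may be worth looking at what is actually queued and what the delivery status says:
Code Snippet
-- Messages waiting for the target to come back, with the current delivery status
SELECT to_service_name, enqueue_time, transmission_status
FROM sys.transmission_queue
ORDER BY enqueue_time;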
Hi, I am creating a Service Broker application between two different instances. When I initiate a dialog from the source, my message remains in sys.transmission_queue, but its transmission_status column is empty.
I attached Profiler to both source and target, including all the Service Broker events.
In my source Profiler I am getting an error like: Connection attempt failed with error: '10061 (No connection could be made because the target machine actively refused it.)', with event Broker:Connection.
In the target Profiler the error is: This message could not be delivered because the security context could not be retrieved, with event Broker:Message Undeliverable.
I have checked my port using telnet, both remotely and locally, and both work fine. I am using port 4001 and I have specified that port in the address of the route.
Hi, we are trying to implement Service Broker between SQL Server Express and SQL Server on the same machine and we are having problems with certificates. We create a certificate on SQL Server, back up the certificate to the file system, and then load the certificate into SQL Server Express from the file, and we keep getting the following error: Msg 15208, Level 16, State 1, Line 1. The certificate, asymmetric key, or private key file does not exist or has invalid format.
Following script runs fine on SQL Server.
Code Snippet
use master
Create Master Key Encryption BY Password = '45Gme*3^&fwu';
BACKUP MASTER KEY TO FILE = 'C:\ServiceBrokerPrivateKeyMasterB.pvk'
    ENCRYPTION BY PASSWORD = '45Gme*3^&fwu';
Create Certificate EndPointCertificateC
    WITH Subject = 'C.Server.Local',
    START_DATE = '06/01/2006',
    EXPIRY_DATE = '01/01/2008'
    ACTIVE FOR BEGIN_DIALOG = ON;
BACKUP CERTIFICATE EndPointCertificateC
    TO FILE = 'C:\ServiceBrokerEndPointCertificateC.cer';
Following script runs on SQL Server Express:
Code Snippet
Create Certificate EndPointCertificateC
    From FILE = 'C:\ServiceBrokerEndPointCertificateC.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\ServiceBrokerPrivateKeyMasterB.pvk',
        DECRYPTION BY PASSWORD = '45Gme*3^&fwu'
    );
If we run the script the other way around, it works fine. If we use SQL Server on some other machine, the script works fine. Only on the same machine does it throw this error. We have checked the permissions and everything. Let us know if there is a workaround or what we are doing wrong.
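One detail that may or may not be the cause: the first script backs up the database master key, not the certificate's private key, and the second script then hands that master-key backup to WITH PRIVATE KEY. If the private key is actually needed on the Express side, the usual round trip looks roughly like this (paths and password are placeholders, and the Express service account needs read access to both files); if only the public key is needed, the WITH PRIVATE KEY clause can be dropped entirely:
Code Snippet
-- On the SQL Server instance: export the certificate together with its private key
BACKUP CERTIFICATE EndPointCertificateC
    TO FILE = 'C:\ServiceBroker\EndPointCertificateC.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\ServiceBroker\EndPointCertificateC.pvk',
        ENCRYPTION BY PASSWORD = 'SomeStrongPassword1!'
    );
GO
-- On the SQL Server Express instance: import the pair
CREATE CERTIFICATE EndPointCertificateC
    FROM FILE = 'C:\ServiceBroker\EndPointCertificateC.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\ServiceBroker\EndPointCertificateC.pvk',
        DECRYPTION BY PASSWORD = 'SomeStrongPassword1!'
    );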
Hello, I receive this error: "The SQL Server Service Broker for the current database is not enabled, and as a result query notifications are not supported. Please enable the Service Broker for this database if you wish to use notifications." I attach the database in Management Studio and enable the broker using the script below, but to no avail.
ALTER DATABASE DataName SET ENABLE_BROKER   <<------ successful
SELECT is_broker_enabled FROM sys.databases WHERE name = 'Database name'   <<------ value is 1
Global.asax:
Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
    System.Data.SqlClient.SqlDependency.Start(ConfigurationManager.ConnectionStrings("dataConnectionString1").ConnectionString)
End Sub
Web.config:
<connectionStrings>
    <add name="dataConnectionString1" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\jbp_data.mdf;Integrated Security=True;User Instance=True" providerName="System.Data.SqlClient" />
    <add name="ASPNETDBConnectionString" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\ASPNETDB.MDF;Integrated Security=True;User Instance=True" providerName="System.Data.SqlClient" />
</connectionStrings>
Hope you could help. Cheers, imperialx
I am struggling with the position SSB could take in an SOA. If I would want a broker in the general sense, meaning an intermediary sitting between applications which exchange information through messaging, would SSB be a good candidate? I know Biztalk is probably the primary candidate, but in my scenario I would end up with Biztalk apps with empty orchestrations. Also, I think Biztalk is more expensive to manage. So I am looking for a lightweight broker for a simple SOA targeted at application interoperability, no fancy business processes in sight.
In our current project, we are attempting to use Broker for the middleware for a "queuing" solution that helps in throttling & load-balancing incoming messages. These messages could potentially be either long-running jobs or simply an incoming flood of messages received from our integration web services that are called upon via BizTalk.
For the posting of messages onto the "to do" queue, we are implementing the "fire & forget" pattern that has been discussed here previously and on Remus' blog. This aspect of the design seems to work fine for us. Currently we have this set up so that activation occurs on the target queue, but we don't want to hold a thread open there on the server while waiting for our synchronous calls to various web services on the processing "farm" to complete.
The next evolution of our design is to try to move activation off the primary SQL cluster itself (i.e. activation currently happens on the clustered SQL boxes) and onto the individual processing nodes. For this model, we are looking at using SQL Express on each of the nodes, as has been suggested here on the forums for other similar scenarios.
For resilience to node failures, we do not want to "route" the messages to individual nodes hosting their own queues, but rather have those nodes "read" from the primary queue, do the per-message processing, and perform either a committed EndDialog or EndDialogWithError based on the success of processing each task/message.
To invoke the processing on each of the nodes, we need some mechanism to send a "wake up" and trigger the "reads", since no message is being sent to the node itself that could drive activation from a queue receiving the actual "job". Along the same lines, we are considering having a "wake up" queue on the nodes, where a simple "wake up" message could be sent to all nodes/queues, and activation on those queues would then invoke the processing "loop" for each node.
My question is how to best establish this "wake up" call to each node. I think I've read about a queue that is internal to Broker itself that receives a message when new items are received in any queue. My initial thought is to put activation on that queue and have a procedure that sends the "wake up" to each of the nodes in our processing farm.
I am looking for any input where others have attempted to solve this type of problem with Broker.
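If the internal mechanism you're thinking of is event notifications, one possible shape for the wake-up is a QUEUE_ACTIVATION event notification on the central "to do" queue that delivers to a small notification service, whose activation then pings the processing nodes. A sketch with invented names (TodoQueue stands in for your central queue):
Code Snippet
-- Queue/service that will receive the QUEUE_ACTIVATION events (placeholder names)
CREATE QUEUE WakeUpQueue;
CREATE SERVICE [//poc/WakeUpService] ON QUEUE WakeUpQueue
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
GO
-- Fire an event whenever the central queue has messages and no active reader
CREATE EVENT NOTIFICATION TodoQueueActivation
    ON QUEUE TodoQueue
    FOR QUEUE_ACTIVATION
    TO SERVICE '//poc/WakeUpService', 'current database';
GO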
We are doing a POC for transferring a huge number of messages (millions) from one machine to another. The two approaches we are examining are MSMQ and SQL Service Broker. MSMQ is set up as a remote queue on the target machine, and the source machine takes as little as 1 millisecond to send a message (using a .NET program). However, when testing Service Broker, we find that the time taken to send a message to the queue is significantly higher, around 70 milliseconds. Could you please help us understand why this is happening?
The service broker distributed queues have been set up as per the directions in the posting at http://www.sqlservercentral.com/columnists/sindukuri/2797.asp
The source program (written in .NET) calls a stored procedure on the source machine to write to the SSB queue. When we run SQL Trace, we find that the SP is responsible for 99% of the time taken. Here is our SP that sends the message:
Declare @ConversationHandle uniqueidentifier;
Begin Dialog @ConversationHandle
    From Service SenderService
    To Service 'ReceiverService'
    On Contract SampleContract
    WITH Encryption = off;
SEND ON CONVERSATION @ConversationHandle
    Message Type SenderMessageType (<<XML String>>);
Please let us know if there are any additional settings required in Service Broker to improve its performance, or what other approaches there are for building a distributed SSB application.
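One factor that often dominates per-message cost is beginning a brand-new dialog for every SEND, as the stored procedure above does; reusing an open conversation for many messages amortizes the dialog setup. A hedged sketch, assuming a small helper table that caches one handle per target service (the table and procedure names are invented, and a production version would also need to handle concurrent senders and recycle dialogs periodically):
Code Snippet
-- Hypothetical cache of open conversations, one row per target service
CREATE TABLE dbo.DialogCache (
    far_service SYSNAME PRIMARY KEY,
    handle      UNIQUEIDENTIFIER NOT NULL
);
GO
CREATE PROCEDURE dbo.usp_SendOnCachedDialog @body XML
AS
BEGIN
    DECLARE @h UNIQUEIDENTIFIER;
    SELECT @h = handle FROM dbo.DialogCache WHERE far_service = 'ReceiverService';
    IF @h IS NULL
    BEGIN
        -- Only pay the dialog setup cost the first time
        BEGIN DIALOG CONVERSATION @h
            FROM SERVICE SenderService
            TO SERVICE 'ReceiverService'
            ON CONTRACT SampleContract
            WITH ENCRYPTION = OFF;
        INSERT dbo.DialogCache (far_service, handle) VALUES ('ReceiverService', @h);
    END
    SEND ON CONVERSATION @h MESSAGE TYPE SenderMessageType (@body);
END
GO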
I am trying to send a message between two SQL Server 2005 instances on two different machines. I have checked all my routes and all my objects appear to be set up correctly. However, when running Profiler on the target machine, I receive: "This message has been dropped because the TO service could not be found. Service name: "[tcp://mydomain.com/TARGET/MyService]". Message origin: "Transport"." Below is my activated stored procedure that sends the message to the target service. I am using certificate security. Any help appreciated.
CREATE PROCEDURE [usp_ProcessMessage]
AS
BEGIN
SET NOCOUNT ON;
DECLARE @conversation_handle uniqueidentifier
DECLARE @message_body AS VARBINARY(MAX)
WHILE (1=1)
BEGIN
BEGIN TRANSACTION;
WAITFOR(RECEIVE TOP (1)
@conversation_handle = conversation_handle,
@message_body = message_body
FROM [tcp://mydomain.com/INITIATE/MyQueue]
), TIMEOUT 1000;
IF (@@ROWCOUNT = 0)
BEGIN
COMMIT;
BREAK;
END
END CONVERSATION @conversation_handle
IF @message_body IS NOT NULL
BEGIN
BEGIN DIALOG CONVERSATION @conversation_handle
FROM SERVICE [tcp://mydomain.com/INITIATE/MyService]
TO SERVICE '[tcp://mydomain.com/TARGET/MyService]'
ON CONTRACT [tcp://mydomain.com/INITIATE/MyMessage/v1.0]
WITH ENCRYPTION = ON, LIFETIME = 600;
SEND ON CONVERSATION @conversation_handle
MESSAGE TYPE [tcp://mydomain.com/TARGET/VisitMessage]
(@message_body);
END
COMMIT;
END
END
GO
My endpoints are created like so:
CREATE ENDPOINT MyEndpoint
STATE = STARTED
AS TCP
(
LISTENER_PORT = 4022
)
FOR SERVICE_BROKER (AUTHENTICATION = CERTIFICATE MasterCertificate)
GO
GRANT CONNECT TO CertOwner
GRANT CONNECT ON ENDPOINT::MyEndpoint TO CertOwner
GO
And my routes like so:
GRANT SEND ON SERVICE::[tcp://mydomain.com/INITIATE/MyService] TO CertOwner
GO
CREATE REMOTE SERVICE BINDING [MyCertificateBinding]
TO SERVICE '[tcp://mydomain.com/TARGET/MyService]'
WITH USER = CertOwner -- WITH USER is required by the CREATE REMOTE SERVICE BINDING syntax; CertOwner assumed from the grants above
GO
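"TO service could not be found" at the transport level usually means either that the target instance has no service whose name matches the TO SERVICE string exactly (the comparison is byte for byte and case sensitive, so stray brackets inside the quoted string count), or that the messages are not reaching the right database for lack of a route. For reference, a sketch of the pair of routes normally involved, with invented host names:
Code Snippet
-- On the initiator database: route outgoing messages to the target instance
CREATE ROUTE RouteToTarget
    WITH SERVICE_NAME = 'tcp://mydomain.com/TARGET/MyService',   -- must match the target service name exactly
         ADDRESS = 'TCP://targethost.mydomain.com:4022';
GO
-- On the target database: route acknowledgements and replies back to the initiator
CREATE ROUTE RouteToInitiator
    WITH SERVICE_NAME = 'tcp://mydomain.com/INITIATE/MyService',
         ADDRESS = 'TCP://initiatorhost.mydomain.com:4022';
GO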
Hey guys. I have an application which uses an account named Uaccount1. I've given it db_datareader and db_datawriter permissions on the tables, EXECUTE permission on all stored procedures, REFERENCES permission on the symmetric key, and REFERENCES permission on the certificate. At this point it can't see the certificate. If I grant CONTROL on the certificate to the account, it works perfectly. Is it necessary to do that, or am I missing something? Our security is based on symmetric keys and certificates...
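For what it's worth, the documentation for OPEN SYMMETRIC KEY lists CONTROL on the certificate as the permission needed when a key is decrypted by certificate, so granting CONTROL may simply be unavoidable here. A sketch of the grants being discussed (the key and certificate names are placeholders):
Code Snippet
GRANT VIEW DEFINITION ON SYMMETRIC KEY::MySymmetricKey TO Uaccount1; -- key name assumed
GRANT CONTROL ON CERTIFICATE::MyCertificate TO Uaccount1;            -- certificate name assumed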
The error occurs on the CREATE CERTIFICATE statement:
Msg 15208, Level 16, State 1, Line 1
The certificate, asymmetric key, or private key file does not exist or has invalid format.
I am logged on as a local admin when running the script. The SQL Server service account is also a local admin, and the account I am logged on as is sysadmin in SQL Server. This script works fine in my Dev environment and a separate Pre-Prod environment.
I have created a certificate and have now taken a backup of it with the following code:
backup certificate EncrProdCode to file = 'C:\BackupCertEncrBackup.cer'
    with private key (file = 'C:\BackupCertEncrPrivKeyBackup.pvk', encryption by password = 'EncrPrivKeyBackuppwd')
Now I want to create a certificate IN A DIFFERENT DATABASE from this file. I am using the following statement:
create certificate EncrFromFile from file = 'C:\BackupCertEncrBackup.cer'
    with private key (file = 'C:\BackupCertEncrPrivKeyBackup.pvk', decryption by password = 'EncrPrivKeyBackuppwd')
But it shows this error:
Msg 15232, Level 16, State 1, Line 1 A certificate with name 'EncrFromFile' already exists or this certificate already has been added to the database.
No certificate with this name exists, so why am I getting this error? And how can I use the same certificate in another database? Is there any other way to create a certificate from an existing certificate file?
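As the message text itself hints, error 15232 is also raised when a certificate with the same thumbprint already exists in the target database under a different name; that is easy to check before anything else:
Code Snippet
-- Run in the target database: look for an existing copy of the certificate
-- (same thumbprint, possibly a different name)
SELECT name, subject, start_date, expiry_date, thumbprint
FROM sys.certificates;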
I am trying to implement mirroring using certificate authentication. When I run this query:
create certificate Principal_cert
    with subject = 'Principal certificate',
    start_date = '2007/11/01',
    expiry_date = '2020/11/01';
GO
Create endpoint endpoint_Princ state = started
as tcp(listener_port = 7024, listener_ip = all)
for database_mirroring (authentication=certificate Principal_cert, encryption = disabled, role = all);
GO
I get this error:
Msg 1088, Level 15, State 1, Line 1
Cannot find the object "Principal_cert" because it does not exist or you do not have permissions.
I have also noticed that the error does not occur when I try this in the master database.
So, I have two questions:
1. Why does SQL Server not find a certificate that has just been created?
2. Should the certificate be in the master database or in the database being mirrored?
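For what it's worth, endpoints are server-level objects, and a certificate used for endpoint authentication is expected to live in the master database, which matches what you observed. A sketch of the usual ordering (names and dates kept from the question):
Code Snippet
USE master;
GO
CREATE CERTIFICATE Principal_cert
    WITH SUBJECT = 'Principal certificate',
    START_DATE = '2007/11/01',
    EXPIRY_DATE = '2020/11/01';
GO
CREATE ENDPOINT endpoint_Princ
    STATE = STARTED
    AS TCP (LISTENER_PORT = 7024, LISTENER_IP = ALL)
    FOR DATABASE_MIRRORING (AUTHENTICATION = CERTIFICATE Principal_cert, ENCRYPTION = DISABLED, ROLE = ALL);
GO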
Hi everybody... Thanks in advance for any help you can offer.
This is my problem:
I am using a Web Service Task in Integration Services to access a Web Method; this method is not mine, it is provided by another company. The Web Method is published in a VPN (Virtual Private Network) on a secured channel (HTTPS).
The properties of the Connection Manager that the Web Service Task uses are OK. The Server URL is the address of the .asmx and the client certificate is the correct one. In fact, when I press Test Connection, the message "Test Connection Succeeded" is shown. If I change the certificate, a connection error (403 Forbidden) occurs, which means the client certificate and the address are OK. In addition, I use this certificate to access other methods published by the same company from a Visual Basic application (with WSE) and I don't have problems.
When I execute the DTSX, the "403 Forbidden" error occurs and I can't execute the method. I ran the DTSX from VS2005, from the SQL IDE and from a batch file (.bat), and the same error happened each time.
If I use the File Monitor tool (from Sysinternals, which now belongs to Microsoft), I can see that when I test the connection (in the Connection Manager used by the Web Service Task), the process accesses the certificate's private key located in the RSA folder of my certificate store. But when I run the DTSX (no matter which way), it never reads the private key; it only accesses the certificate repository.
I saw that the code of the DTSX includes the certificate in Base64, but I suspect this does not include the private key (I don't know whether it should).
Well, the fact is that I can't use this Web Service Task. I suspect this is a bug in Integration Services.
I am doing some research to see if the Service Broker technology would help my company with our enterprise application. Here is our scenario: we have a 3-tier system. The first tier needs to contact the second tier asynchronously, so using queues is a good option. However, the processing that needs to happen on the second tier is mostly compute-intensive with few database updates. Is it still worth our time to use Service Broker?
I like the concept of activation that Service Broker provides, but from what I am reading, most of the documentation describes activation as a way to call another stored procedure, and I definitely don't want to do any process-intensive work on the SQL server. So here comes my question...
How would I use a Windows service to listen for the activation event from Service Broker? I could have multiple Windows services watching the same queue (scalable). Would I have to handle collisions myself? If so, I think I would rather keep it simple and just use a plain table as my queue.
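One point that may help: an external Windows service does not need an activation "event" at all; it can simply block on WAITFOR(RECEIVE ...) from ADO.NET, and because RECEIVE locks the conversation group inside its transaction, several services can read the same queue without colliding. A sketch of the T-SQL each worker would run (WorkQueue is a placeholder name):
Code Snippet
-- Each external worker runs this in its own transaction; competing readers
-- are serialized per conversation group by RECEIVE itself.
BEGIN TRANSACTION;
WAITFOR (RECEIVE TOP (1)
             conversation_handle,
             message_type_name,
             message_body
         FROM WorkQueue), TIMEOUT 5000;
-- ...process the returned row here, then COMMIT (or ROLLBACK to put the message back)...
COMMIT;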
Hi to all, I want to study SQL Server Service Broker and have some questions:
1. What is the use of Service Broker?
2. Where would it be used? (With an example.)
3. How do I enable Service Broker? I have SQL Server 2005 but I see no folder like Service Broker.
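On question 3: Service Broker is not a separate component to install; it is part of the database engine and is enabled per database. A quick way to check and enable it (the database name is a placeholder):
Code Snippet
-- Is the broker enabled for this database?
SELECT name, is_broker_enabled FROM sys.databases WHERE name = 'MyDatabase';
-- Enable it (needs exclusive access to the database)
ALTER DATABASE MyDatabase SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;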
I'm having a hard time understanding everything required to create a simple Service Broker example. Can someone please assist? Source code would be ideal, but even a "do this, do that" walkthrough would be helpful.
I am trying to implement Service Broker. I send a message from my application code to the database to execute a specific stored procedure. How do I return the result set obtained by executing the stored procedure to the application?
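One common shape for this, sketched with made-up names: the activated procedure packages the result set as XML and sends it back on the same conversation, and the application then receives the reply from the initiator's queue.
Code Snippet
-- Inside the activated procedure, after running the requested stored procedure
-- (@conversation_handle is the handle the request arrived on):
DECLARE @reply XML;
SET @reply = (SELECT * FROM dbo.SomeResultTable FOR XML PATH('row'), ROOT('rows'), TYPE);  -- placeholder query
SEND ON CONVERSATION @conversation_handle
    MESSAGE TYPE [//app/ResultMessage]   -- reply message type assumed by this sketch
    (@reply);
-- The application then issues RECEIVE (or WAITFOR(RECEIVE ...)) against the
-- initiator queue and deserializes the XML body.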
My Service Broker seems to be broken... The database was restored from another, crashed server, but I have tried the following:
ALTER AUTHORIZATION ON DATABASE::[SPYDERONTHEWEB] TO [SA];
The error I'm getting is:
Service Broker needs to access the master key in the database 'SpyderOnTheWeb'. Error code 25. The master key has to exist and the service master key encryption is required.
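Since the database came from another server, its master key is no longer encrypted by the local service master key, which is what that error is complaining about. The usual repair, assuming you still know the password the master key was created with on the old server:
Code Snippet
USE SpyderOnTheWeb;
GO
OPEN MASTER KEY DECRYPTION BY PASSWORD = 'OriginalMasterKeyPassword';  -- password from the old server (placeholder)
ALTER MASTER KEY ADD ENCRYPTION BY SERVICE MASTER KEY;                 -- re-protect it with this server's service master key
GO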
It would be great to have an update on Microsoft's plans to solve the problem of using Service Broker for remote users who sit behind NAT. Any news will be appreciated.
Hello, I am trying to implement distribution of stock quotes over the LAN (only within the network), showing the live-changing stock quotes on a front end (in a DataGrid) installed on each client's desktop. I receive the stock prices over TCP/IP from the stock exchange, at least 10-15 messages per second. Now I need to distribute this feed to each connected client.
I tried doing it over plain TCP/IP, but in vain. Can we install the SQL Server 2005 client on every client machine and use Service Broker instead of live TCP/IP connections programmatically?
Ideally, can I dump the messages from the stock exchange into each connected client's local database, with each front-end application keeping a record of all the incoming messages? That is, the front end has a notification event and will refresh its DataGrid accordingly...
Is it possible to develop against Service Broker in .NET 1.1 (VS 2003)? I currently have a project developed in .NET 1.1 and I want to add a new method that uses the message-queue concept (using Service Broker in SQL Server 2005 instead of MSMQ); my database is SQL Server 2005.
I am looking at Service Broker as a way to notify multiple clients that data has changed in a table in a shared database. These clients may or may not be online. When there is a change, the notification should fire off a query to refresh the client's local cache. Is this a situation where Service Broker would help me? Can multiple clients receive the notification at different times (some while online, some when they come back online)? Any help on this would be appreciated. From what I read, it seems messages are pulled off the queue once a notification has taken place. Is this correct? If so, can I make it behave differently?
Let's assume this situation: we have an Initiator and a Target. The Target is behind the ISP's NAT and cannot be published externally. So when the Initiator sends a message to the Target, the Target will not be able to establish a backward connection and will not send an acknowledgement. The Initiator will retry and retry...