Number Of Queues Best Practice?

Aug 14, 2006

Hi There

In terms of scaling out Service Broker to hundreds of instances, would it be better from a performance perspective to have one queue that all the messages come into (obviously with a high value for MAX_QUEUE_READERS), or to create a number of queues and spread the messages across them? Or is there no significant difference?

The one reason I am leaning towards multiple queues is that if poison messages are found, or something else goes wrong with a queue, not all messages are affected. However, creating multiple queues makes things more complex and requires more administration.

Is there any general best practice when it comes to this?
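For illustration, the single-queue option usually just means one activated queue with several parallel readers; a rough sketch, with placeholder queue and procedure names:

CREATE QUEUE dbo.WorkQueue
WITH STATUS = ON,
     ACTIVATION (
         STATUS = ON,
         PROCEDURE_NAME = dbo.ProcessWorkMessage,   -- placeholder activation procedure
         MAX_QUEUE_READERS = 20,                    -- number of concurrent readers
         EXECUTE AS OWNER);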

Thank You

View 7 Replies



Many Queues And One Activation Procedure For Those Queues

Sep 22, 2006

Hi, All!

Could you please help me in my problem?

I am developing a Service Broker application. It has 20 queues, and I also have one activation procedure which carries out the following actions:

CREATE PROCEDURE proc_ms_Activation
AS
BEGIN
    WAITFOR (RECEIVE TOP(1) message_type_name, message_body, conversation_handle, message_id FROM Queue1)

    -- process message, save statistics, etc.
END

The problem is that when I execute the command WAITFOR (RECEIVE TOP (1) message_type_name, message_body, conversation_handle, message_id FROM Queue1), I have to specify a concrete queue name, for example "... FROM Queue1". I would like to use one procedure for processing all of the queues.

How can I design my application so that one activation procedure processes messages from all of the queues?
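One possible approach, sketched here with illustrative names only: an activation procedure can look up which queue activated it in sys.dm_broker_activated_tasks and then RECEIVE from that queue via dynamic SQL.

CREATE PROCEDURE proc_ms_Activation_Generic
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @queue nvarchar(600), @sql nvarchar(max);
    DECLARE @mt sysname, @body varbinary(max), @h uniqueidentifier;

    -- find the queue that fired this activation procedure
    SELECT @queue = QUOTENAME(SCHEMA_NAME(q.schema_id)) + N'.' + QUOTENAME(q.name)
    FROM sys.dm_broker_activated_tasks AS t
    JOIN sys.service_queues AS q ON q.object_id = t.queue_id
    WHERE t.spid = @@SPID;

    SET @sql = N'WAITFOR (RECEIVE TOP (1)
                     @mt = message_type_name,
                     @body = message_body,
                     @h = conversation_handle
                 FROM ' + @queue + N'), TIMEOUT 5000;';

    EXEC sp_executesql @sql,
         N'@mt sysname OUTPUT, @body varbinary(max) OUTPUT, @h uniqueidentifier OUTPUT',
         @mt = @mt OUTPUT, @body = @body OUTPUT, @h = @h OUTPUT;

    -- process @mt / @body / @h here, as the original procedure does
END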

Thanks a lot for your help.

View 3 Replies View Related

How To Write .net Code To Place XML Messages On Queues And Retrieve XML Messages From Queues.

Apr 2, 2008

Hello, please help!!

I have spent days searching the web and forums for an answer to this simple question and cannot find an example.

I have built a Service Broker application on SQL Server 2005. The application puts some XML on an incoming queue, which is basically a few parameters to be used in a query. This queue then calls a stored proc which does some business logic and puts the results, also in XML, on another queue.

I have written a test harness in SQL to put messages on the inbound queue and then some sql to retrieve the returned code from the outbound queue.

What I want to do is convert the SQL which does this into .NET code to be used by an application, i.e. write some .NET code to put XML on a queue and then some .NET code to retrieve XML from another queue.

I wouldn't have thought this would be a difficult thing to do, and it must have been done hundreds of times, but I am unable to find anything that simply sends and retrieves XML to and from Service Broker queues.

Thanks for your help, it's really needed. I found some links, but they are really vague and often do SELECT statements in Service Broker or something like that. I don't want to call any SQL, just send and receive XML on the queues.

Any example code that does this would be really helpful.
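For what it's worth, the .NET side usually just executes the same T-SQL the test harness already uses, e.g. via SqlCommand with an xml parameter. A rough sketch of the two batches an application might run (service, contract, message type and queue names are placeholders):

-- batch 1: put XML on the inbound queue (@xml supplied by the application as an xml parameter)
DECLARE @h uniqueidentifier;

BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [RequestService]
    TO SERVICE 'ResponseService'
    ON CONTRACT [QueryContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @h MESSAGE TYPE [QueryRequest] (@xml);

-- batch 2: pick up the reply from the outbound queue and return it as a result set
DECLARE @reply xml, @handle uniqueidentifier;

WAITFOR (
    RECEIVE TOP (1)
        @handle = conversation_handle,
        @reply  = CAST(message_body AS xml)
    FROM dbo.ResponseQueue
), TIMEOUT 5000;

SELECT @reply AS ReplyXml;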

kind regards,
David Weeden
Database Developer

View 2 Replies View Related

Default Number Of Error Logs=6 - What Is Best Practice?

May 8, 2007

What is best practice for what this number should be?



I have seen guidance saying to set it to a number greater than 25000, but not from any source I particularly know or trust. (I checked SANS, NIST-CIS and the NSA, but I couldn't find anything.)
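For reference, the number of retained error logs is normally changed in Management Studio (SQL Server Logs > Configure). The same setting can be scripted through the undocumented xp_instance_regwrite extended procedure; shown here purely as an illustration, with 12 as an arbitrary example value:

EXEC master.dbo.xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'NumErrorLogs',
    REG_DWORD,
    12;   -- keep 12 error logs instead of the default 6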

View 1 Replies View Related

Always Two Queues?

Jul 27, 2006

I'm just getting around to understanding notification services in sql 2005 and I have been working through some examples.  I'm curious as to why there are always two queues and two corresponding services being set up even when both queues/services exist in the same database.  Here is my ultimate goal.  I want to have triggers put messages on the queue for various stored procedures to handle asynchronously - i.e. table xyz is updated, and I need a stored proc to take the updated values and possibly generate or update rows in another table.  When I tried to set up one queue and one service, nothing ever seems to get put on the queue:

Create Message Type TestMessageType Validation = Well_Formed_XML;

Create Contract TestContract (TestMessageType Sent By Any);

Create Queue TestQueue;

Create Service TestService ON Queue TestQueue(TestContract);

Begin Tran;

Declare @DialogHandle UNIQUEIDENTIFIER;

Begin Dialog Conversation @DialogHandle
    From Service TestService
    To Service 'TestService'
    On Contract TestContract;

Send On Conversation @DialogHandle
    Message Type TestMessageType (N'<message>Hello World</message>');

End Conversation @DialogHandle;

Commit Tran;

Select * From dbo.TestQueue;

(nothing is ever returned)
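For comparison, the pattern most samples follow keeps the initiator and target roles on separate queues/services, with activation only on the target; a rough sketch using the same message type and contract as above (the activation procedure name is a placeholder):

Create Queue TestInitiatorQueue;
Create Queue TestTargetQueue
    With Activation (
        Status = On,
        Procedure_Name = dbo.ProcessTestMessage,   -- placeholder activation procedure
        Max_Queue_Readers = 1,
        Execute As Owner);

Create Service TestInitiatorService On Queue TestInitiatorQueue;
Create Service TestTargetService On Queue TestTargetQueue (TestContract);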

 

View 11 Replies View Related

No Messages In The Queues.

Mar 10, 2006

Hello,

I'm trying to do a very simple example of sending a message from Initiator queue to Target queue. The result is no messages are delivered.

Here's the code:

DECLARE @conversationHandle UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @conversationHandle
    FROM SERVICE GmiInitiatorService
    TO SERVICE 'GmiTargetService'
    ON CONTRACT GmiContract
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @conversationHandle MESSAGE TYPE GmiMessage ('test');

END CONVERSATION @conversationHandle;

All three queues are empty (Initiator, Transmission, Target). When I comment out the last line ("end conversation"), the messages get stuck in the Initiator queue.

Please help!!

Thank you.

View 11 Replies View Related

Queues And Filegroups

Apr 19, 2006

Hi

We can create a queue on a filegroup for performance reasons, but what about sys.transmission_queue, which could have messages backing up when the target is down?

What are your thoughts?
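For reference, placing a user-defined queue on its own filegroup is just the ON clause of CREATE QUEUE; a minimal sketch with placeholder names (sys.transmission_queue is an internal table, so as far as I know it offers no such option):

CREATE QUEUE dbo.OutboundQueue
WITH STATUS = ON
ON [SSB_DATA];   -- placeholder filegroup name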

Cheers

View 3 Replies View Related

Timeouts - Disk Queues

Jul 20, 2005

Hi. Got a strange problem. For some reason our web client box times out occasionally, maybe a few times a day. Nothing appears in the logs. What I do know is that the disk queue ramps up very high (think top of the graph), the processor queue jumps up and the tps drops to 0 (naturally enough!) along with reduced pagefile usage. Usually the tps is between 20 and 300. Running a pair of mirrored 18 GB SCSI disks for the whole server (yeah yeah) and a whopping 640 MB of memory. Oh, and a single PIII 1 GHz. SQL 2000 Standard vanilla, Server 2000. What server trace events is it worth my catching to try and get a better idea of what may be causing this? No major jobs (backups etc.) seem to be occurring at these times. Cheers

View 1 Replies View Related

Inconsistent Performance From Queues

Nov 13, 2006

Hi everyone! I have a very brief question. I have 10 queues in my database and each of them is sent an equal number of messages. There are instances where they execute/activate the stored procedures very fast, but there are times when they don't. Does anyone have an idea why this happens?

Thank you very much for taking the time to read my post. :)

View 10 Replies View Related

Duplicate Messages In Target Queues

Jun 20, 2007

Hi

We have problems with duplicate messages in Service Broker queues. We have tried the "fire and forget" method.



Scenario
The initiator does SEND and END CONVERSATION, the target does RECEIVE and END CONVERSATION,
and a SQL Server Agent job running every minute does END CONVERSATION WITH CLEANUP against sys.conversation_endpoints.

We are running 5000 - 10000 messages per minute.

When the cleanup job is running we get some duplicate messages in the target queue.



Environment
Two separate machines running SQL Server 2005 Standard Edition SP2


Initiator Machine


CREATE MESSAGE TYPE [TransactionStatisticsSend] AUTHORIZATION [dbo] VALIDATION = WELL_FORMED_XML

CREATE CONTRACT [TransactionStatistics] AUTHORIZATION [dbo] ([TransactionStatisticsSend] SENT BY ANY)


CREATE QUEUE [dbo].[TransactionStatisticsQueueActive] WITH STATUS = ON , RETENTION = OFF
ON [PRIMARY]

CREATE SERVICE [TransactionStatisticsServiceActive] AUTHORIZATION [dbo] ON QUEUE [dbo].[TransactionStatisticsQueueActive] ([TransactionStatistics])


CREATE ROUTE [Route::TransactionStatisticsServiceStat,0705DBB2-8CBA-43BC-A8FF-774A27F8ABC0] AUTHORIZATION [dbo] WITH SERVICE_NAME = N'TransactionStatisticsServiceStat' , ADDRESS = N'tcp://EBSDBCONFIG1A:4022'


CREATE REMOTE SERVICE BINDING [RSB::TransactionStatisticsServiceStat] AUTHORIZATION [dbo] TO SERVICE N'TransactionStatisticsServiceStat' WITH USER = [Proxy::BtsDebitServiceStat] , ANONYMOUS = OFF


CREATE ENDPOINT [EBSDBCURR1A_BROKER]
AUTHORIZATION [BTSTULLENtsappl]
STATE=STARTED
AS TCP (LISTENER_PORT = 4022, LISTENER_IP = ALL)
FOR SERVICE_BROKER (MESSAGE_FORWARDING = DISABLED
, MESSAGE_FORWARD_SIZE = 10
, AUTHENTICATION = CERTIFICATE [EBSDBCURR1A]
, ENCRYPTION = REQUIRED ALGORITHM RC4)




ALTER PROCEDURE [dbo].[Bts_SP_TransactionStatPrepare]
(@TransferID varchar(30))
AS
declare
@RowCount Int,
@ErrorSave Int,
@Msg xml,
@dialog_handle UNIQUEIDENTIFIER



Begin


BEGIN TRY

set @msg = (Select
                Regtime,
                SendPartShortName,
                RecPartShortName,
                MsgType
            From btslogactive.dbo.StatTransferlog Tl
            where TransferID = @TransferID
            FOR XML RAW)


BEGIN DIALOG CONVERSATION @dialog_handle
FROM SERVICE [TransactionStatisticsServiceActive]
TO SERVICE 'TransactionStatisticsServiceStat'
ON CONTRACT [TransactionStatistics]
WITH ENCRYPTION = OFF ;

SEND ON CONVERSATION @dialog_handle
MESSAGE TYPE [TransactionStatisticsSend]
(@msg) ;
END CONVERSATION @dialog_handle

END TRY
BEGIN CATCH
SELECT
ERROR_NUMBER() as ErrorNumber,
ERROR_MESSAGE() as ErrorMessage;
END CATCH

End






Target Machine

CREATE MESSAGE TYPE [TransactionStatisticsSend] AUTHORIZATION [dbo] VALIDATION = WELL_FORMED_XML


CREATE CONTRACT [TransactionStatistics] AUTHORIZATION [dbo] ([TransactionStatisticsSend] SENT BY ANY)


CREATE QUEUE [dbo].[TransactionStatisticsQueueStat] WITH STATUS = ON , RETENTION = OFF
, ACTIVATION ( STATUS = ON , PROCEDURE_NAME = [dbo].[Bts_SP_TransactionStatUpdateBroker] , MAX_QUEUE_READERS = 1 , EXECUTE AS N'dbo' ) ON [PRIMARY]


CREATE SERVICE [TransactionStatisticsServiceStat] AUTHORIZATION [dbo] ON QUEUE [dbo].[TransactionStatisticsQueueStat] ([TransactionStatistics])

CREATE ROUTE [Route::TransactionStatisticsServiceActive,D8A1A78B-CEAD-4C63-B3B3-3C986D2AB3AA] AUTHORIZATION [dbo] WITH SERVICE_NAME = N'TransactionStatisticsServiceActive' , BROKER_INSTANCE = N'D8A1A78B-CEAD-4C63-B3B3-3C986D2AB3AA' , ADDRESS = N'tcp://EBSDBCURR1A:4022'


CREATE ENDPOINT [EBSDBCONFIG1A_BROKER]
AUTHORIZATION [BTSTULLENtsappl]
STATE=STARTED
AS TCP (LISTENER_PORT = 4022, LISTENER_IP = ALL)
FOR SERVICE_BROKER (MESSAGE_FORWARDING = DISABLED
, MESSAGE_FORWARD_SIZE = 10
, AUTHENTICATION = CERTIFICATE [EBSDBCONFIG1A]
, ENCRYPTION = REQUIRED ALGORITHM RC4)





Activation procedure

ALTER PROCEDURE [dbo].[Bts_SP_TransactionStatUpdateBroker]

AS


DECLARE @MessageType sysname
DECLARE @ConversationHandle uniqueidentifier
DECLARE @MessageBody XML
DECLARE
    @RegTime datetime,
    @SendPartShortName varchar(30),
    @RecPartShortName varchar(30),
    @MsgType varchar(30),
    @SeqNo bigint,
    @RC int



BEGIN TRANSACTION;
WHILE (1=1)
BEGIN
WAITFOR
(
RECEIVE TOP (1)
@MessageType = message_type_name,
@MessageBody = message_body,
@ConversationHandle = conversation_handle,
@SeqNo = message_sequence_number
FROM [TransactionStatisticsQueueStat]
), TIMEOUT 5000;

IF (@@ROWCOUNT = 0)
BEGIN
ROLLBACK TRANSACTION
RETURN
END

IF (@MessageType = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog')
BEGIN
END CONVERSATION @ConversationHandle
BREAK
END
ELSE IF (@MessageType = 'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
BEGIN
END CONVERSATION @ConversationHandle
BREAK
END
ELSE IF (@MessageType = 'TransactionStatisticsSend')
BEGIN
    BEGIN TRY
        SET @RegTime = @MessageBody.value('(/row/@Regtime)[1]', 'datetime')
        SET @SendPartShortName = @MessageBody.value('(/row/@SendPartShortName)[1]', 'varchar(30)')
        SET @RecPartShortName = @MessageBody.value('(/row/@RecPartShortName)[1]', 'varchar(30)')
        SET @MsgType = @MessageBody.value('(/row/@MsgType)[1]', 'varchar(30)')

        -- ...

    END TRY
    BEGIN CATCH
        SELECT
            ERROR_NUMBER() as ErrorNumber,
            ERROR_MESSAGE() as ErrorMessage;
        END CONVERSATION @ConversationHandle
        BREAK
    END CATCH

    -- EXEC @RC = SP_XXXXX

    END CONVERSATION @ConversationHandle
    BREAK
END
END
COMMIT TRANSACTION
RETURN




Sql Server Agent job procedure


ALTER PROCEDURE [dbo].[Bts_SP_Del_Conversation_Endpoints]

AS
begin
DECLARE c_PurgeConversationEndpoints CURSOR FAST_FORWARD
FOR SELECT conversation_handle
FROM sys.conversation_endpoints
with (nolock)
WHERE is_system = 0
AND [State] = 'CD';

OPEN c_PurgeConversationEndpoints;
DECLARE @DialogHandle UNIQUEIDENTIFIER;
FETCH NEXT FROM c_PurgeConversationEndpoints
INTO @DialogHandle;
WHILE @@FETCH_STATUS = 0
BEGIN
END CONVERSATION @DialogHandle WITH CLEANUP;
FETCH NEXT FROM c_PurgeConversationEndpoints
INTO @DialogHandle;
END
CLOSE c_PurgeConversationEndpoints;
DEALLOCATE c_PurgeConversationEndpoints;


end





View 3 Replies View Related

How To Mirror Service Broker Queues

Jan 4, 2007

Hi,
I want to know how it is possible to achieve mirroring for the Service Broker queues in the database.
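For reference, the piece that usually needs attention when the target database is mirrored is the route: CREATE ROUTE accepts a MIRROR_ADDRESS alongside ADDRESS, so messages follow a failover. A rough sketch with placeholder names and a placeholder broker GUID:

CREATE ROUTE RouteToMirroredTarget
WITH SERVICE_NAME    = N'TargetService',
     BROKER_INSTANCE = N'00000000-0000-0000-0000-000000000000',  -- placeholder: target db's service_broker_guid
     ADDRESS         = N'tcp://principal_server:4022',           -- placeholder principal
     MIRROR_ADDRESS  = N'tcp://mirror_server:4022';              -- placeholder mirror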

View 1 Replies View Related


Using Message Queues In Stored Procedures

Oct 11, 2005

OK, I'm stumped. I have a database which has a table that stores documents. Each document has a primary key. For years, I've been notifying an external program that a new document has arrived in the db by firing a trigger on the table. The trigger calls xp_cmdshell, which calls an external program, which enqueues a message on a message queue.
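One alternative, sketched here with illustrative service, contract and message type names: have the trigger post the notification straight onto a Service Broker queue instead of shelling out with xp_cmdshell.

CREATE TRIGGER trg_Documents_Insert ON dbo.Documents
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @h uniqueidentifier;
    DECLARE @msg xml;
    -- DocumentID is assumed to be the primary key column of the documents table
    SET @msg = (SELECT DocumentID FROM inserted FOR XML RAW, ROOT('NewDocuments'), TYPE);

    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [DocumentInitiatorService]      -- placeholder service names
        TO SERVICE 'DocumentTargetService'
        ON CONTRACT [DocumentContract]
        WITH ENCRYPTION = OFF;

    SEND ON CONVERSATION @h MESSAGE TYPE [NewDocumentMessage] (@msg);
END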

View 6 Replies View Related

END CONVERSATION WITH CLEANUP, Is This Bad? Records Pile Up In Queues Without It.

Sep 13, 2006

I have read from a variety of sources that using the "WITH CLEANUP" option on an "END CONVERSATION" statement is bad and unnecessary. (Question #1) Is this true?

My code does not work properly if I don't use the "WITH CLEANUP" option. My code leaves closed conversation records in the queues if I leave out the "WITH CLEANUP" option. The "END CONVERSATION" statement is executing properly and flagging the conversation record as closed, but the records don't get deleted. All the messages are going back and forth properly too.

My code is based on the HelloWorld Service Broker sample, which does not use "WITH CLEANUP". When I run the sample scripts everything works great and the conversation records are deleted. However, this sample does not use an activation stored procedure to receive messages and respond with the results. When I copy and paste the receive-messages sample code into an activation stored procedure is when the problem comes up. It's the same code! (Question #2) Why am I getting different results depending on how the code is executed/activated?
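For reference, the usual activation-style receive loop ends the dialog from the receiving side when the EndDialog or Error message arrives, without WITH CLEANUP; a minimal sketch (the queue name is a placeholder):

DECLARE @h uniqueidentifier, @mt sysname, @body varbinary(max);

WHILE (1 = 1)
BEGIN
    WAITFOR (
        RECEIVE TOP (1)
            @h    = conversation_handle,
            @mt   = message_type_name,
            @body = message_body
        FROM dbo.TargetQueue
    ), TIMEOUT 1000;

    IF @@ROWCOUNT = 0
        BREAK;

    IF @mt IN (N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
               N'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
        END CONVERSATION @h;
    ELSE
    BEGIN
        -- process the application message here, then reply or end as appropriate
        END CONVERSATION @h;
    END
END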

This could be a tough one. I just hope somebody else has seen it too and figured it out. Thanks for the help!

Thanks,
Greg Van Mullem

View 3 Replies View Related

Monitoring Service Broker Queues Through A .NET Process

Jul 19, 2007

Is there a way for a .NET application to receive a notification when a service broker queue has been updated with a new message? I tried using SqlDependency on an SB queue but I got an "invalid" error in my notification handler.



Such a notification would be much better than having to poll the queue every N seconds.
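One common alternative, sketched here with a placeholder queue name: have a .NET worker thread execute a blocking WAITFOR (RECEIVE ...) with a generous CommandTimeout, so the call only returns when a message arrives or the timeout elapses, instead of polling:

DECLARE @h uniqueidentifier, @mt sysname, @body varbinary(max);

WAITFOR (
    RECEIVE TOP (1)
        @h    = conversation_handle,
        @mt   = message_type_name,
        @body = message_body
    FROM dbo.NotificationQueue        -- placeholder queue name
), TIMEOUT 60000;                     -- returns after 60 s even if nothing arrived

SELECT @h AS conversation_handle, @mt AS message_type, @body AS message_body;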



Thanks

View 4 Replies View Related

Interesting Behavior Of Service Broker Queues

Dec 23, 2005

Hello All:

I've been experimenting with the new SQL Server Service Broker, and I think I've discovered some interesting behavior.  Service Broker relies on "Queues" to store messages that need to be processed.  Service Broker operates by sending a message from one Queue (the INITIATOR Quque) to another Queue (the TARGET Queue).  A Queue can have an "Activation Stored Procedure" associated with it.  This procedure is what actually processes the messages in the Queue. 

The first behavior I observed relates to the setting of a Queue's RETENTION parameter.  The RETENTION parameter indicates whether or not the Queue will retain a copy of the messages it receives.  By default, the parameter's value is "OFF" (meaning it will not retain messages).  In the Activation Stored Procedure of my TARGET Queue, I used "sp_send_dbmail" to send an e-mail message.  I wanted to capture the "conversation_handle" (a uniqueidentifier that identifies a particular message) and include it in the body of the e-mail.  I was unable to capture it, because the Queue's RETENTION parameter was "OFF".  When I tried to capture the conversation_handle from the INITIATOR queue (whose RETENTION parameter was "ON"), I was successful.  The moral of the story is you apparently need to have RETENTION = "ON" if you need to capture information from a Queue.

The second behavior I observed relates to the setting of a Queue's MAX_QUEUE_READERS setting.  This setting allows you to automatically invoke multiple instances of the Activation Stored Procedure.  As messages come into the Queue, the Queue creates an additional instance of the Activation Stored Procedure up to the number you specified in the MAX_QUEUE_READERS setting.  This allows parallel processing of messages in the Queue.  There is also a programming technique called a "RECEIVE LOOP" which is used for processing messages.  In the RECEIVE LOOP, you have a parameter called WAITFOR which tells the Queue whether it should stay on constant alert for new messages or whether it should time out after a specified number of seconds.

If you have the Queue wait indefinitely (by not specifying a TIMEOUT value in the WAITFOR statement) and you have invoked multiple copies of the Activation Stored Procedure, the procedure will remain in memory indefinitely.  Therefore, if you make a change to the code of the Activation Stored Procedure, the change will NOT be reflected in the Activation Stored Procedure until you change the STATUS of the Queue.  I had changed my procedure so that it would not send an e-mail, but the e-mails kept coming.  The e-mails did not stop until I executed an ALTER QUEUE statement.  I ran "ALTER QUEUE queue_name WITH STATUS = OFF;" and then I ran "ALTER QUEUE queue_name WITH STATUS = ON;"  After that, the changes were reflected in the procedure.

Be aware of this behavior as you design your Queues.
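For reference, the two settings discussed above are both declared on the queue, and the reload trick is the ALTER QUEUE cycle mentioned in the post; a sketch with placeholder names:

CREATE QUEUE dbo.TargetQueue
WITH STATUS = ON,
     RETENTION = ON,                                   -- keep messages until the conversation ends
     ACTIVATION (
         STATUS = ON,
         PROCEDURE_NAME = dbo.ProcessTargetMessage,    -- placeholder procedure
         MAX_QUEUE_READERS = 5,
         EXECUTE AS OWNER);

-- after editing the activation procedure, bounce the queue so running readers pick up the change
ALTER QUEUE dbo.TargetQueue WITH STATUS = OFF;
ALTER QUEUE dbo.TargetQueue WITH STATUS = ON;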

View 6 Replies View Related

Designing SSB Queues, EA Sample And Unreliable Interdependant NTFS Tasks

Feb 21, 2006

Our current project involves managing NTFS hierarchical folders structured by multi-level customers and their associated projects. Essentially each folder level represents one level of the customer hierarchy or a project root. Both have different subfolders and user access rights based on generic XML templates. The folders reside on file servers across the country and should be accessible in the ordinary way NTFS file shares allow. LAN/Intranet MS AD Win2003 / SQL2005 environment.
The folder management system basically has to keep the folder structure in step with changes in the underlying managing application logic. That involves such operations as creating a new folder with subfolders, copying a folder with its content into another folder branch (which may be on the same or another server and place), deleting a folder/content, renaming a folder, and applying NTFS access rights to folders and subfolders for users according to generic templates. As all these actions are unreliable and some tasks may take hours to complete, the SSB approach seems to be the viable solution. Some tasks involved have to be done within 10 minutes; others (the prolonged ones) have to be scheduled for an overnight run. A Windows service like the EA sample may be used to start the actual NTFS-related tasks. Content transfer involves WMI remoting with the robocopy tool on the target machine (for better network utilization); other related tasks make use of WMI APIs and probably make direct (i.e. synchronous) calls to the remote target file server.
At this stage, making atomic executable modules that each do just one functionally isolated task, like DirCopy, DirCreator, DirRename, DirDelete, DirUserAccess, seems like a somewhat logical choice.
The questions start to arise when planning the SSB queues and adopting the ExternalActivator sample to run these atomic executables. The problem is that if SSB messages contain atomic tasks for these executables, they need to be synchronized in two ways - by execution precedence (create or copy a dir first, then apply users' access rights) and transactionally (only when all related tasks succeed can the appropriate feedback action and event log writing be taken).

I can imagine two implementation scenarios below.
Case A. Create a common queue for directory creation, content transfer and access rights.
In that case the ExternalActivator either has to be extended and re-designed, or it has to activate another Activator-Coordinator (middleware) executable that would actually read the queue and, based upon the message type, run the appropriate atomic executable. In that scenario the queue processing would be stalled: if the previous task was a long-running content transfer, the access rights for the transferred folders can't be applied before the transfer task has finished OK. That in turn will probably require running multiple instances of the atomic tasks and using conversation groups, wouldn't it? What would be the most reliable and simple way to achieve that?

Case B. Dedicated queues for each atomic executable module.
Neither EA changes nor an extra Activator-Coordinator middleware executable would be required. But because the atomic task-oriented queues are not synchronized with each other, the queues' internal conversation groups wouldn't help much... That means that if a directory does not yet exist, the userRights module has to wait. But if we are transferring directories from path X to Y, based on what does the userRights module know to wait? With zero asynchronous design experience I'm lost here...
Hope I described the app domain understandably, thanks for hints leading to the working solution!
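One building block that may help with the ordering question in both cases (a sketch only, with placeholder service, contract and message type names): send the related tasks for one folder on dialogs that share a conversation group, so a single reader receives them in order and other readers are locked out of that group while it is being processed.

DECLARE @group uniqueidentifier, @h uniqueidentifier;
SET @group = NEWID();

BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [FolderTaskInitiator]
    TO SERVICE 'FolderTaskTarget'
    ON CONTRACT [FolderTaskContract]
    WITH RELATED_CONVERSATION_GROUP = @group, ENCRYPTION = OFF;

-- both tasks travel on the same conversation, so they are delivered in order
SEND ON CONVERSATION @h MESSAGE TYPE [DirCreate]     (N'<task path="\\server\share\CustomerA" />');
SEND ON CONVERSATION @h MESSAGE TYPE [DirUserAccess] (N'<task path="\\server\share\CustomerA" />');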

View 8 Replies View Related

The T-SQL Practice

Jun 2, 2004

Hello, everyone:

Does anyone know a good T-SQL practice web site? It would be better if it included solutions. Thanks a lot.

ZYT

View 2 Replies View Related

Best Practice For 'dbo'

Sep 3, 2006

When setting up databases for end users, what's the best practice regarding who is the dbo for each individual database - the user itself or a sysadmin?

Does it really have any importance at all who the owner (as defined by 'dbo') is ?
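For what it's worth, if the decision is to keep databases owned by a sysadmin account rather than the end user, ownership can be switched after the fact; a sketch (the database name is a placeholder):

ALTER AUTHORIZATION ON DATABASE::SalesDb TO sa;   -- SQL 2005 syntax
-- older equivalent, run inside the database:
-- EXEC sp_changedbowner 'sa';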

View 5 Replies View Related

What Is Better Practice

May 7, 2007

(Terms)
TermID, Term
1----- Abc
2----- Arcico
3----- Tunic

and
(RelatedTerms)
TermID, RelatedTermID
1 ------ 3
1------ 2
2------ 4


I want to get the following results

1.- a list of all the terms that start with A%
2.- a list of all the related terms … that belong to terms that start with A%


For number 1 - I am doing a select on Terms table with where term like A%.

For number 2 – I am joining both tables and then once again doing a where term like A%.


Would it be more efficient to take the first results and put them in a table variable, and then just do a join with the second table on RelatedTerms.TermID = Terms.TermID?

The number of records that generally come back is between 500 and 1000.

What would you consider the better approach? Or maybe there is an even better way?
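For comparison, the two options described above might look roughly like this against the tables shown (a sketch, not tuned):

-- option 1: single join, filtering Terms directly
SELECT t.TermID, t.Term, rt.RelatedTermID
FROM dbo.Terms AS t
JOIN dbo.RelatedTerms AS rt ON rt.TermID = t.TermID
WHERE t.Term LIKE 'A%';

-- option 2: stage the A% terms in a table variable first, then join
DECLARE @a TABLE (TermID int PRIMARY KEY);

INSERT INTO @a (TermID)
SELECT TermID FROM dbo.Terms WHERE Term LIKE 'A%';

SELECT a.TermID, rt.RelatedTermID
FROM @a AS a
JOIN dbo.RelatedTerms AS rt ON rt.TermID = a.TermID;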

View 4 Replies View Related

Best Practice?

Aug 14, 2007

Wasn't sure where to ask this question.

Was wondering what everyone is doing in regards to server/db protection?

Do you run your DB's on independent servers?

Do you run your DB's on clustered servers?

Do you run your DB's on redundant hardware with a 3rd-party
application for bit-to-bit data replication?

Is anyone placing the DB's on a SAN?

We're at a crossroads, and looking for a good direction to ensure the DB's are up.

Any insight welcome.
Thanks,
Kerry

View 3 Replies View Related

How I Practice Sql

Dec 11, 2007

hello

I'm a newbie to SQL, but I want to learn it on my own. Is there any way I can learn SQL? Do I have to download a sample database from the internet? Do I need to have my own server to play with? Hopefully someone can shed some light on this.

regards
sutha

View 2 Replies View Related

Need Some Practice

Apr 10, 2006

Please point me to a web resource from where I can study:

1) Writing complex queries such as those involving HAVING, multi-level nested queries, GROUP BY, and T-SQL functions
2) Joins - a lot of practice
3) Stored procedures, transactions, cursors and triggers - I need some heavy-duty practice

Where can I get some good practice of the above? Also, please recommend a good SQL Server/T-SQL book in light of the above requirements.

View 3 Replies View Related

C# CLR Best Practice Example

Jan 11, 2006

Folks - I had a look around Google and, no surprises, never found what I was looking for.

I want to see a real-world, best-practice C# stored procedure for SQL 2005 (Express is what I am using, but I don't mind the SQL edition).

Almost everything I see is a "select * from table", which to be honest was my first stored proc many years ago - everything since has been fairly detailed.

I ask as I am sceptical, after years of trying to STOP building SQL queries in code (as it's hellish!), that the CLR technique really makes any kind of a difference.

If someone has found that it HAS, I'd love to hear about it. The thought of:

SqlCommand cmd = new SqlCommand ( "My Whole Stored Proc as Text" );

... doesn't appeal, never mind the potential for debugging syntactical issues and so on.

I was excited by this, until it became something I had to do in a real situation, and then I got a little worried. Should I be?

View 7 Replies View Related

I Have A Practice!

Nov 2, 2006

Create a table and name it Salary Information. Add an Employee Name and Salary column to the table. Create a column in the Employee table and name it Salary. Create a trigger that updates the Salary table with the employee's name and salary each time you insert data into the Salary column of the Employee table.

I tried but it didn't work!
Please help me!
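One way the exercise could be written, assuming the Employee table has EmployeeName and Salary columns (a sketch only; spaces are dropped from the table and column names):

CREATE TABLE dbo.SalaryInformation
(
    EmployeeName varchar(100),
    Salary       money
);
GO

CREATE TRIGGER trg_Employee_Salary
ON dbo.Employee
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.SalaryInformation (EmployeeName, Salary)
    SELECT i.EmployeeName, i.Salary
    FROM inserted AS i
    WHERE i.Salary IS NOT NULL;   -- only rows where the Salary column was supplied
END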

View 4 Replies View Related

Best Practice Of Using .ndf

Mar 12, 2008

Hi, my database is growing past 1 GB, and I only have one .mdf to hold it all. Should I use a secondary data file for my data? Can I do that now? Thanks.
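Yes, a secondary file can be added to an existing database at any time; a rough sketch with placeholder names, path and sizes:

ALTER DATABASE MyDb ADD FILEGROUP Data2;

ALTER DATABASE MyDb
ADD FILE
(
    NAME = N'MyDb_Data2',
    FILENAME = N'D:\Data\MyDb_Data2.ndf',   -- placeholder path
    SIZE = 512MB,
    FILEGROWTH = 256MB
)
TO FILEGROUP Data2;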

View 5 Replies View Related

New Database: Best Practice

Jan 24, 2007

Good Morning,
I work for a company that sees a lot of people come and go. The one thing I have noticed is that people use their admin accounts to log into SQL and create stored procedures, views and databases. When the user leaves, I am stuck with all these objects that are owned by someone no longer working for the company.
So my question to you guys is: What is the best practice to use in creating new objects?
Thanks for your guru-ness!
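For reference, when someone does leave, their objects and schema can be handed to dbo rather than dropped; a sketch with a placeholder user/schema name (SQL 2005 syntax):

-- hand the departed user's schema (and the objects in it) to dbo
ALTER AUTHORIZATION ON SCHEMA::jsmith TO dbo;

-- or move an individual object into the dbo schema
ALTER SCHEMA dbo TRANSFER jsmith.SomeProcedure;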

View 3 Replies View Related

Best Practice For Lookup

Jul 24, 2007

Say I have a customer.aspx that allows a user to enter customer data.
On customer.aspx I have dropdownSalesRep, which allows the user to associate a sales rep with the customer, but some customers come to us directly and not through a sales rep, so I want the user to be able to specify "none".
Is it best to have a dummy record in my SalesReps table called "none" with an ID of, say, "999", or is there some other better way to deal with this?
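The usual alternative to a dummy row is simply a nullable foreign key; a sketch assuming a SalesReps(SalesRepID, RepName) table:

CREATE TABLE dbo.Customers
(
    CustomerID int IDENTITY PRIMARY KEY,
    Name       varchar(100) NOT NULL,
    SalesRepID int NULL REFERENCES dbo.SalesReps (SalesRepID)   -- NULL means the customer came to us directly
);

-- the dropdown's "none" choice maps to NULL; reports use an outer join
SELECT c.Name, ISNULL(r.RepName, 'None') AS SalesRep
FROM dbo.Customers AS c
LEFT JOIN dbo.SalesReps AS r ON r.SalesRepID = c.SalesRepID;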
 
 

View 5 Replies View Related

Best Practice For SQL Connections And Asp.Net

Aug 7, 2007

Hi.
We have developed a quite simple ASP.NET web page that fetches a number of pieces of information from a SQL 2005 database. We are having some problems though, because of a firewall that is between the web server and the SQL server, and I think this is because of bad code on my part. I'm not that experienced yet, so I'm sure that there is much to learn.
Usually when I do a query against a SQL database, I do something like this:
Function GO_FormatRecordBy(ByVal intRecordBy As Integer)
    Dim dbQueryString As String
    Dim dbCommand As OleDbCommand
    Dim dbQueryResult As OleDbDataReader

    dbQueryString = "SELECT Name FROM tblRegistrators WHERE tblRegistratorsID = '" & intRecordBy & "'"
    dbCommand = New OleDbCommand(dbQueryString, dbConn)
    dbConn.Open()
    dbQueryResult = dbCommand.ExecuteReader(CommandBehavior.CloseConnection)
    dbQueryResult.Read()
    dbConn.Close()
    dbCommand = Nothing

    Return dbQueryResult("Name")
End Function
Now, let's say that I have a DataList that I populate with integer values, and I want to "resolve" the value from another table; then I do a function like the one above. I guess that this means that I open and close quite a lot of connections against the database server when I have a large table. Is there any better way of doing this? Could one open a database connection globally, in let's say the ASA file? Would that be a better approach?
When I added CommandBehavior.CloseConnection to the ExecuteReader statement, I noticed that it was a bit faster, and I think there were fewer connections in the database, so maybe there is more to "closing connections" than I usually do.
Any tips on this?
Best regards,
Johan Christensson

View 6 Replies View Related

Help Me With A Bit Of A 'practice' Issue...

Dec 16, 2003

Ok.

I recently started developing a web site for a client using storefront.net and ms sql server.

the db schema of storefront.net has autonumbers as the PKs for the products table (even though the products table contains an additional field for product_number.)

So here's my dilemma if you care to read:

I typically develop local, deploy remote (after testing). I have a local SQL server, and then the remote SQL server.

When I'm developing for this project, I'll insert data such as products into the products table (sometimes several times while I'm working out routines to import data to the products table). This has the effect of creating a unique ID for each product based upon SQL's auto-incrementing INTs.

This StoreFront.net (SF.NET) has another table that is a lookup table. For each part number, it has a corresponding categoryID number.

Now, if I have product_ID 1234, and I set the category ID to say 10 and get it working on my local box, everything is fine.

Here's where the problem comes in: when I use DTS to transfer the database during remote deployment, each product is inserted into the remote DB's products table and gets a NEW product ID. Same with the categories.

This has the effect of breaking the relationships. (SF.NET has no referential integrity or relationships defined in the db.) Let's say my product_id 1234 gets put into the remote copy; it'll get a new product_ID (PK). Let's say it's now 5775. Now my category ID will also get a new value, so my data is no longer related.

I don't know how to handle this situation. The unique IDs generated on my local SQL will nearly always be different from those generated on the remote db.

How do I handle this situation is my question? Advice, guys?
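One common way around this (a sketch only; the linked-server and column names are placeholders): copy the rows with their original keys by switching IDENTITY_INSERT on during the transfer, so the lookup table still points at the same product IDs on the remote box:

SET IDENTITY_INSERT dbo.Products ON;

INSERT INTO dbo.Products (Product_ID, Product_Number, Description)
SELECT Product_ID, Product_Number, Description
FROM LOCALSRV.StoreFront.dbo.Products;    -- placeholder linked server / source db

SET IDENTITY_INSERT dbo.Products OFF;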

View 3 Replies View Related

Auction Best Practice

Jun 28, 2005

Hi to everyone! I have to create a little auction system that runs on the web. Before starting development, I would like to be sure I am using best practice... The main aspect is to avoid conflicts on database updates with bids, i.e. if a user places his bid I have to be absolutely sure that his bid is the highest at the moment of updating the database. If not, I have to refuse it... So I ask you: is using a transaction the best way of assuring there are no conflicts? And do I have to be careful of some other aspect in the ASP.NET pages, or is there no problem of conflicts at page level? Thank you very much in advance for any suggestion, and if anyone has something else to say about possible problems with auctions I'll be glad to hear it!!!! ;-)
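For illustration, one standard way to guarantee the bid is still the highest at the moment of the update is to make the check part of the UPDATE itself; a sketch with placeholder table and column names:

UPDATE dbo.Auctions
SET    HighBid      = @Bid,
       HighBidderID = @BidderID
WHERE  AuctionID = @AuctionID
  AND  HighBid   < @Bid;          -- only wins if the bid is still the highest

IF @@ROWCOUNT = 0
    RAISERROR('Bid rejected: a higher bid already exists.', 16, 1);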

View 6 Replies View Related

TSQL Practice

Jun 4, 2001

The following works fine in Query Analyzer:
master..xp_cmdshell 'osql -Usa -E -h -w250 -Q"set nocount on exec unallocated" -dTest -oc:\test.txt', no_output

but it does not work from within a stored procedure. The error is:
Server: Msg 170, Level 15, State 1, Procedure test, Line 2
Line 2: Incorrect syntax near 'master'.
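For reference, inside a stored procedure a call to an extended procedure that is not the first statement has to be prefixed with EXEC, which is usually what this particular error means; a sketch:

CREATE PROCEDURE dbo.test
AS
    EXEC master..xp_cmdshell 'osql -Usa -E -h -w250 -Q"set nocount on exec unallocated" -dTest -oc:\test.txt', no_output;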

Thank you.
Ivan

View 1 Replies View Related

Best Practice Question

Aug 20, 2004

I have around 10 databases currently residing on different platforms which make up roughly a terabyte of information. I would like to migrate all of these DBs over so that they are all managed under one instance of SQL Server 2000. In my view this streamlines things a lot and reduces licensing/hardware costs.

However, is managing all of these databases on one clustered instance of SQL 2000 the best approach from a performance standpoint? Would it be better to separate each database onto its own machine? I am under the impression that, given enough hardware (processors, RAM), using just one instance of SQL 2000 Enterprise should be enough to handle the management of this data. Is this correct? Is there an optimal model?

Money is always a concern but in this case, performance is the main objective. The size of the data managed will be growing significantly so the system should be scalable.

My background is as a developer so I may not have provided enough to give a good answer. Please ask questions if you need more detail. I am looking for suggestions on the best way to handle this.

Specifically I would like to know the preferred architecture as well as any suggested hardware.

Thanks in advance

View 3 Replies View Related






