QN Best Practices

Sep 18, 2007

Hi everyone,

I wonder if somebody here could recommend a good article about MS Service Broker. I'm looking for advice and tips on designing applications using SQL Service Broker, mainly QN (Query Notifications): for instance, maintenance routines and common failure scenarios I might run into once my solution is implemented. I have googled for a while, but all I can find are recopied examples of QN.
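For reference, a minimal sketch of the kind of routine health checks people run for QN/Service Broker (assuming SQL Server 2005; the catalog/DMV names are standard, but treat the exact checks as illustrative rather than a recommended maintenance plan):

-- Is Service Broker enabled for the current database?
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name = DB_NAME();

-- Active query-notification subscriptions; orphaned ones accumulate
-- if subscribers disappear without unsubscribing.
SELECT id, database_id, created, timeout
FROM sys.dm_qn_subscriptions;

-- Messages stuck in transit usually point to routing or permission problems.
SELECT TOP (20) to_service_name, enqueue_time, transmission_status
FROM sys.transmission_queue
ORDER BY enqueue_time;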

Thanks

View 3 Replies



SqlDataSource Best Practices

Jun 20, 2008

Hi, I use data presentation controls like GridView and FormView in my application. In many of the webforms I also use multiple datasources, mainly for the purpose of two-way data binding for controls within the data presentation controls. I am concerned about the performance issues this might cause as the number of users on these pages increases. What is the likely performance impact? Once the databind is done and the values are populated in the respective controls, does the database connection of the datasource control get closed, or does it stay open? What are the best practices when implementing datasource controls?

View 2 Replies View Related

SQL 2000 Best Practices

Nov 16, 2001

Hi all. Thanks for taking the time to read this.

I'm looking for documentation on SQL 2K installation tips for a Windows 2000 Member Server platform, as well as best practices for ongoing maintenance.

Real world experience as well as Microsoft propaganda are all welcome.

Thanks again.

View 1 Replies View Related

Best Practices ( Installation )

Dec 7, 2001

Could someone share best SQL 2000 install practices, such as:
1) Should the SQL Server application be on the OS drive or not (pros, cons)?
2) Should the OS be on RAID 1?

etc

Thanks,

View 3 Replies View Related

Best Practices To Optimize A DB?

Sep 1, 2004

What are the best practices to optimize performance when accessing a SQL Server DB?

Commands, maintenance plans...
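For illustration, a minimal sketch of the routine commands a maintenance plan typically covers (the object names are placeholders, not from any particular database):

-- Rebuild indexes on a heavily used table (SQL 2000 syntax).
DBCC DBREINDEX ('dbo.Orders');

-- Refresh statistics so the optimizer sees current data distributions.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Check database integrity.
DBCC CHECKDB ('MyDatabase');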

Thanks

View 1 Replies View Related

DDL Best Practices Question

Jul 20, 2005

I am looking for some examples of how to manage DDL scripts among various versions of a production db and development and testing. I have tried a few things in the past, and it always gets very muddled and cumbersome.

I need to be able to build any version of the database from scratch, BUT I also need to maintain an upgrade path from any version to any later version. So it is not enough to just maintain a master build script, but I don't want to maintain 2 different things (modify the master build scripts AND create a new "ALTER" script for each version change).

I thought I had seen an article somewhere that laid out a process for managing this, but I can't find it now (I thought it was in SQL Server Mag). Does anybody know of this article or have a resource they could point me to that outlines best practices in this area?

Thanks,
Jason Wood, DBA in training.
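Not the article in question, but one common pattern is a schema-version table plus guarded upgrade scripts, so the master build and the ALTER path share the same version numbers. A hedged sketch (table and column names are illustrative):

CREATE TABLE dbo.SchemaVersion
(
    VersionNumber INT      NOT NULL,
    AppliedOn     DATETIME NOT NULL DEFAULT (GETDATE())
);
GO

-- Upgrade script 004: only runs against a database currently at version 3.
IF (SELECT MAX(VersionNumber) FROM dbo.SchemaVersion) = 3
BEGIN
    ALTER TABLE dbo.Customer ADD MiddleName VARCHAR(40) NULL;
    INSERT INTO dbo.SchemaVersion (VersionNumber) VALUES (4);
END
GO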

View 1 Replies View Related

Best Practices For Large DB

Jul 20, 2005

Hi All,

My question is: what are the best practices for administering large DBs? (My coworker is the DB administrator. I'm more of the developer. But slowly being sucked in.) My main concern is that we have some DBs that take approx 3 hrs a night just to rebuild the indexes. I know that with MSSQL 2000, I can use partitioned views to break out the table(s) into smaller databases and tables. But we also have an older server that runs MSSQL 7. Lastly, how do you handle drive space issues? Do you spread out the DB across multiple MDF files on different drives? Thanks in advance.
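On the drive-space point, a minimal sketch of spreading a database over additional files/filegroups on other drives (database, file and table names are placeholders; SQL 2000/2005 syntax):

ALTER DATABASE BigDB ADD FILEGROUP History;
GO

ALTER DATABASE BigDB
ADD FILE
(
    NAME = BigDB_History1,
    FILENAME = 'E:\Data\BigDB_History1.ndf',
    SIZE = 10GB,
    FILEGROWTH = 1GB
)
TO FILEGROUP History;
GO

-- Large or archival tables can then be created (or rebuilt) on the new filegroup.
CREATE TABLE dbo.OrderHistory
(
    OrderID   INT      NOT NULL,
    OrderDate DATETIME NOT NULL
) ON History;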

View 1 Replies View Related

Encryption Best-Practices

May 31, 2006

Please forgive me if I have overlooked a thread that answers this question, but I assure you that I have looked.

I would really appreciate a guide of sorts that would tell me the correct steps to take to properly secure a column in my database. I don't need specifics on how to do each step, I either have those already or can find them myself. In fact, I have already successfully encrypted and decrypted some data. I just want to make sure that I create the right keys and certificates and that I follow best-practices as far as backups and stuff is concerned.
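For reference only (since the how-to is already in hand), a minimal sketch of the usual SQL Server 2005 key chain for column encryption, mainly to show which pieces need backing up; all names and passwords here are placeholders:

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!MasterKeyPassw0rd';

CREATE CERTIFICATE SsnCert WITH SUBJECT = 'Protects the SSN symmetric key';

CREATE SYMMETRIC KEY SsnKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE SsnCert;

-- Back up the certificate (and the service/database master keys) somewhere off the server.
BACKUP CERTIFICATE SsnCert
    TO FILE = 'C:\Backups\SsnCert.cer'
    WITH PRIVATE KEY (FILE = 'C:\Backups\SsnCert.pvk',
                      ENCRYPTION BY PASSWORD = 'Str0ng!PvkPassw0rd');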

Thanks,

Todd Sparks

View 1 Replies View Related

Best Practices Question

Jun 13, 2007

Environment is SQL Server 2005 x64 Enterprise running under Windows Server 2003 x64 Enterprise with four processors and 16GB of ram.

I have 28 data copy routines I would like to add to a SSIS package. They use the Data Reader Source against an ODBC database (InterSystems Cache) and copy the table contents to a SQL 2005 database for reporting needs. The data rows in these 28 routines range from only 100 rows to over 6 million rows, depending on the table. I have tested these individually and they work fine. My question is: is it good practice to have all of these routines in a single package, or can I expect performance degradation?

View 4 Replies View Related

Tracking Changes Best Practices

Nov 5, 2007

I've got a table that has frequent updates to it. I want 100% change tracking on this table though, so we can roll back to any previous version, or just see any changes people make.

Is there a best practice for things like this? Currently, I'm using a trigger on UPDATE to take the previous values and store them in a history table. This keeps track of who changes what, and when. Plus, the most recent data is separate and more performant to access.
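A minimal sketch of that trigger-to-history approach (all table and column names here are placeholders):

CREATE TRIGGER trg_Widget_Audit
ON dbo.Widget
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- "deleted" holds the pre-update values.
    INSERT INTO dbo.WidgetHistory (WidgetID, Name, Price, ChangedBy, ChangedOn)
    SELECT d.WidgetID, d.Name, d.Price, SUSER_SNAME(), GETDATE()
    FROM deleted AS d;
END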

I've also heard about putting an 'IsActive' flag on the main table and any changes that are made just get marked as In-Active and a new record gets added.

Any input?

Thanks!

View 14 Replies View Related

Best Practices For SSIS

Mar 27, 2008

I am new to SSIS, but have done a lot of DTS 2000 development.

What is the consensus for developing SSIS packages? Do you just place objects and change the properties of each object, having multiple objects basically doing the same thing with different properties? Or do you set objects' properties and then change properties by code in scripts, i.e. an Execute SQL task whose connection and SQL statement are set by code in a script? Is this even possible? With Microsoft OOP I assume this is possible.

Script> Set properties of ExecuteSQL > set flow to ExecuteSQL.

Thanks!

View 4 Replies View Related

Upserts Best Practices

May 19, 2007

Is this the only way to do it in SSIS?
http://sqljunkies.com/WebLog/ashvinis/archive/2005/06/15/15829.aspx
For some reason I figured that SSIS would have this kind of thing built in; it seems like functionality that many would use.

View 1 Replies View Related

Best Practices On Maxinsertcommitsize

Mar 23, 2007

Hi,

I wonder if anyone knows what the best setting would be for the 'maxinsertcommitsize' property of the SQL destination task if I want to load 6 million records into a target. Is the best setting 0 (try loading all in one batch), or should I choose a different value, for example 1,000,000 per batch?

Thanks,

Marc

View 4 Replies View Related

Best Practices For SQL Data Access

Nov 1, 2007

Hi,
I am a newbie in the ASP.NET world. I am using a 3-tier application architecture for my web-based application; the database is SQL Server 2000. I have looked at the object and SQL datasource objects, but I think they are not suitable for my requirements, so I am planning to use ADO.NET directly to access data from the database (i.e. creating a connection, then creating commands and executing them).
Now what I am looking for are the best known practices for the above task. I have the following solution in mind; please let me know if I am missing some, or which could be the best approach.


Create one class which will handle all the database requests, so that all the pages and business objects ask that class to do all the db-related work (creating connections, commands and execution).

View 4 Replies View Related

Best Practices Question For Outputting

Sep 19, 2005

Hey guys,

Little bit of a newbie question here... I have a database with about 20 or so tables in a relational model. I am now working on an output scheme and had a quick question regarding best practices for outputting. Would it be best to:

1) Set up a view that basically joins all of these tables together, then bind a DataSet/DataTable to it and output as needed?
2) Set up individual views for each table and run through them?

Thanks for the help!

e...

View 2 Replies View Related

SqlCommand Parameters Best Practices

Feb 23, 2006

I was writing a simple sign-up class the other day and coded a Register method with a SqlCommand and Parameters like this:

            SqlCommand sqlCmd = new SqlCommand("Regester", sqlConn);
            sqlCmd.CommandType = CommandType.StoredProcedure;

            sqlCmd.Parameters.Add("@UserName", SqlDbType.VarChar, 55);
            sqlCmd.Parameters["@UserName"].Value = userName;

            sqlCmd.Parameters.Add("@Firstname", SqlDbType.VarChar, 55);
            sqlCmd.Parameters["@Firstname"].Value = fName;

            etc...

Seems pretty straightforward, right? A senior coder writes his params like this:

            SqlParameter pUserName = new SqlParameter("@UserName", SqlDbType.VarChar, 55);
            pUserName.Value = address2;
            sqlCmd.Parameters.Add(pUserName);

            SqlParameter pFName = new SqlParameter("@FName", SqlDbType.VarChar, 55);
            pFName.Value = city;
            sqlCmd.Parameters.Add(pFName);

            etc...

What is the benefit, if any, of creating a new instance of SqlParameter for each value passed to the SP?

How could I easily test the code for speed and resource use besides tracing?

View 4 Replies View Related

Backup Libraries - Best Practices

May 5, 2005

We are preparing to stand up a server with an HP tape library backup system using Data Protector software.

Our intention is to copy backup files from each of our db servers to the file system of the server with the tape library. There will be two tape rotation schemes: a daily offsite tape, and a monthly offsite rotation for less critical applications. Our systems group isn't happy about that configuration but is willing to implement what we want.

My questions are:
Is anyone out there using one of these, and if so, can you give me a 10,000-foot view of your process?

Additionally, is this the best way to utilize this resource?

Any serious suggestions and comments are greatly welcomed and appreciated.

Sidney Ives
Database Administrator
Sentara Healthcare.

View 1 Replies View Related

Best Practices: Changing Values

Jun 3, 2004

What is the best way to design your tables in cases where field values change?

Example:
CREATE TABLE Product (ProductID INT, Description VARCHAR(32), Price SMALLMONEY...);

CREATE TABLE Purchase (PurchaseID INT, ProductID INT, Quantity INT);

Since prices obviously change over time, I was wondering what the best table schema is to reflect these changes, while still remembering previous price values (for example, for generating reports on previous sales...)

is it better to include a "Price SMALLMONEY" field in the purchases table (which kind of de-normalizes it) or is it better to have a separate ProductPrice table that keeps track of changing prices like so:

CREATE TABLE ProductPrice (ProductID INT, Price SMALLMONEY, CreationDate DATETIME...);

and have the Purchase table reference the ProductPrice table instead of the products table?
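As a sketch of how the second approach resolves the price in effect at purchase time (this assumes a hypothetical PurchaseDate column on Purchase, which the schema above does not show):

SELECT pu.PurchaseID,
       pu.Quantity,
       pp.Price,
       pu.Quantity * pp.Price AS LineTotal
FROM Purchase AS pu
JOIN ProductPrice AS pp
  ON pp.ProductID = pu.ProductID
 AND pp.CreationDate = (SELECT MAX(pp2.CreationDate)
                        FROM ProductPrice AS pp2
                        WHERE pp2.ProductID = pu.ProductID
                          AND pp2.CreationDate <= pu.PurchaseDate);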

I have used both methods in the past, but I wanted to get other people's take on it.

Thanks

View 3 Replies View Related

SQL Server Best Practices Analyzer

Nov 22, 2004

Hey gang, I have a question about authentication. I just loaded SQL Server on my virtual box, along with Microsoft's BPA and the Microsoft security analyzer. I get a funny result: the security scan works, but the BPA scan does not. I also looked at the databases I have access to and noticed a new database schema. If I remove this database, will it have any effect on the present database?

View 1 Replies View Related

Best Practices: GROUP BY Clause

May 20, 2004

I was wondering what the best way is to write a GROUP BY clause when there are many (and time-consuming) operations in the fields being grouped by.

Fictitious example:

SELECT DeptNo, AVG(Salary) FROM Department GROUP BY DeptNo;

This will give me the average salary per department. Let's say, however, that I had 10-15 fields being returned (along with the AVG(Salary)) and some fields even had operations being performed on them. Is it better to create a temporary table to calculate the aggregate per department (or a VIEW) and then perform a JOIN with the rest of the data?

Fictitious example:

SELECT DATENAME(y, StartDate), DATENAME(m, StartDate), DATEPART(d, StartDate), SUBSTRING(DeptName, 1, 10), SomeFunction(SomeField), SomeFunction(SomeField), AVG(Salary)
FROM Department
GROUP BY DATENAME(y, StartDate), DATENAME(m, StartDate), DATEPART(d, StartDate), SUBSTRING(DeptName, 1, 10), SomeFunction(SomeField), SomeFunction(SomeField);

Am I better off writing my query this way or using a JOIN on some temporary table or view?
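For comparison, a hedged sketch of the "pre-aggregate, then join" version using a derived table (a view or temp table works the same way); the expensive AVG is computed once per DeptNo instead of once per wide expression group:

SELECT DATENAME(y, d.StartDate),
       DATENAME(m, d.StartDate),
       DATEPART(d, d.StartDate),
       SUBSTRING(d.DeptName, 1, 10),
       a.AvgSalary
FROM Department AS d
JOIN (SELECT DeptNo, AVG(Salary) AS AvgSalary
      FROM Department
      GROUP BY DeptNo) AS a
  ON a.DeptNo = d.DeptNo
GROUP BY DATENAME(y, d.StartDate),
         DATENAME(m, d.StartDate),
         DATEPART(d, d.StartDate),
         SUBSTRING(d.DeptName, 1, 10),
         a.AvgSalary;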

Thanks

View 14 Replies View Related

Best Practices For Field Sizes

Jul 23, 2005

Can anyone direct me to sources for best practices on field types and sizes to use for commonly used information such as address, names, city, business names....

Thanks, Brian

View 3 Replies View Related

Correct Practices With SQL Server

Sep 5, 2007

I am hoping someone can give some advice on the following things:

I have read a few times about a data access layer in an n-tier application. I am assuming that this should be done using sprocs. Is there an advantage of using sprocs instead of views (in situations where the same thing could be accomplished using either)? Will a sproc run faster than a view? Can anyone share any info?

Are sprocs best suited for data access and to enforce business rules?

I know SQL Server has reserved words that shouldn't be used. I am wondering what the best thing to do is in the following situation: what is the best way to handle storing a customer or client's address? I am working from a book that shows the name of a column as "Address". I have found that with SQL Server 2005 Express this is a reserved word (it is shown in blue in the query window). I want to keep my names short, and I am trying to avoid a name like "StreetAddress". Is my book teaching bad habits?
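For what it's worth, a keyword-coloured name can still be used if it is delimited with brackets; a small illustration (the Customer table here is hypothetical):

CREATE TABLE dbo.Customer
(
    CustomerID INT         NOT NULL PRIMARY KEY,
    [Address]  VARCHAR(60) NULL,
    City       VARCHAR(40) NULL
);

SELECT CustomerID, [Address], City
FROM dbo.Customer;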
...........................................thanks...........................................................

View 3 Replies View Related

BPA Vs. Security Best Practices Paper

Jul 17, 2007





I would like to refer to the following technical article:

SQL Server 2005 Security Best Practices - Operational and Administrative Tasks
http://www.microsoft.com/technet/prodtechnol/sql/2005/sql2005secbestpract.mspx

Among the best practices for SQL Server service accounts on page 8, it is recommended to 'use a separate account for each service'. I created a separate account for each service as advised and assigned each account to the relevant Windows group created for each SQL Server service during SQL setup.

Now when I run the Best Practices Analyzer, its report seems to contradict what the above article says. For example, the BPA report states:

"We recommend that the service SQLBrowser on host MachineName be run under Network Service Account". I get a similar recommendation for the SQLSERVERAGENT account as well. Most importantly, it recommends that MSFTESQL be run under the SQL Server Service Account.

Can any of you shed some light on this?



Thanks,

Asaf

View 8 Replies View Related

SQL Best Practices: Dedicated Hardware

Nov 8, 2007

Can anyone point me to MS Best Practices that advise you against installing non-SQL-related applications on a SQL DB server? I feel it to be common sense but I have been asked to justify this with a direct statement from Microsoft. Help?

View 6 Replies View Related

Date And Time Best Practices?

Feb 20, 2008

Can anyone point me to a SQL Server 2005 best-practices type article regarding storage of datetimes: whether to convert to a universal time, time zones, etc.?

We are writing an SOA-style app and are questioning how to store datetimes once time zones and daylight saving come into play: always store in server time, or use the (possibly very remote) user's time, etc.
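One common approach (a sketch, not the only answer): store UTC in the database and convert to the user's time zone at the presentation layer. The table below is purely illustrative:

CREATE TABLE dbo.AuditEvent
(
    EventID    INT      IDENTITY(1,1) PRIMARY KEY,
    OccurredAt DATETIME NOT NULL DEFAULT (GETUTCDATE())  -- UTC, not server local time
);

INSERT INTO dbo.AuditEvent DEFAULT VALUES;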

Thanks in advance!

Derrick

View 3 Replies View Related

SQL Server, C# .NET 2.0 -- Speed Best Practices

Feb 21, 2008

Greetings,

I'm developing a trading application in C# that processes streaming data that can be very heavy at times. Transactions are occurring, logging information is stored, etc., often at a very rapid pace. Up until recently, I had been storing all of this information in memory in DataSets -- upon a graceful exit, the application would call DataSet.WriteXml() to a file, and then the next time the application was opened, it would call ReadXml() to obtain its last state. This is all great in theory because it is super fast, and there is negligible lag when I add a row to a DataTable that already has 12,000 rows (at a rate of 300 per second); however, if the program were to crash without a chance to write the data to file, then I'm screwed.

My solution is to have the various DataSets bound to a SQL Server database -- I've created strongly typed DataSets and TableAdapters to help aid this process. Because often I'm adding rows VERY quickly and in large numbers to these tables, having an INSERT command execute on the database for EVERY transaction is prohibitively slow.

What I would like is to have some mechanism in place where I only affect the local DataTables on the fly, and then occasionally make calls to TableAdapter.Update (on their respective TableAdapters) during slow periods (or lulls in the message traffic) so that any changes to the in-memory data are persisted to the database. I'm looking for general "best practices" in this regard -- nothing specific, just advice from people who have dealt with this type of application/environment before and might have some tips.

The first thing I thought about doing would be a relatively simple algorithm that, upon receiving a new transaction, sets a timer (for, say 500 ms). When this timer is triggered, it calls the Update command on the DataSet that was updated. If another update comes in before the 500 ms, it first checks to see if there's an active timer for this DataSet, if so, it cancels it, and sets a new timer for 500 ms. This way, if I have a very rapid set of transactions that all occur within a few ms of each other, it will not make any calls to the database during the "peak" of data -- only when there's a 500 ms gap will it make a call.

Does that seem like a viable approach?

Any help/direction is greatly appreciated.

Rick

View 1 Replies View Related

SQL 2005 Schema Best Practices

Jan 3, 2008

Hello,

In general, with the introduction of schemas in 2005, is it considered "bad practice" to tell people to create new tables in the "dbo" schema?

Our product documentation contains a "quick start" guide for users who just want to get the product up and running. We suggest that the customer creates a database for our application. This database is configured with a user, assigned to the 'db_owner' role (we want to keep things as simple as possible), that we use to connect to the database. This database will only be used by our application. In this situation, is it okay to use the "dbo" schema, or should we consider creating another schema for all our tables?
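If a dedicated schema is preferred over dbo, the setup is small; a hedged sketch with placeholder names:

CREATE SCHEMA app AUTHORIZATION dbo;
GO

CREATE TABLE app.Orders
(
    OrderID   INT      NOT NULL PRIMARY KEY,
    OrderDate DATETIME NOT NULL
);
GO

-- Optional: make it the default schema for the application's database user
-- so unqualified names resolve to it (MyAppUser is a placeholder).
ALTER USER MyAppUser WITH DEFAULT_SCHEMA = app;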

Are there any "best practices" for using schemas in SQL 2005?

Thanks,
Brenden.

View 7 Replies View Related

Internal Activation Best Practices?

Jun 1, 2006

I am looking for an example of an SP that shows best practices for internal activation. In BOL, this topic describes the typical pattern for reading messages from a queue. What is the typical pattern for reading messages from a queue using an internally activated SP? Do we still need to loop (considering that the message arrival actually fired the SP)?
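A hedged sketch of an internally activated procedure (the queue, message handling and target table are placeholders). The loop is still there: activation fires the procedure once per burst, and the procedure keeps draining the queue until RECEIVE times out, then exits:

CREATE PROCEDURE dbo.ProcessTargetQueue
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @handle   UNIQUEIDENTIFIER,
            @msg_type NVARCHAR(256),
            @body     VARBINARY(MAX);

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION;

        WAITFOR
        (
            RECEIVE TOP (1)
                @handle   = conversation_handle,
                @msg_type = message_type_name,
                @body     = message_body
            FROM dbo.TargetQueue
        ), TIMEOUT 1000;

        -- Nothing left to read: commit and let the activated instance exit.
        IF @@ROWCOUNT = 0
        BEGIN
            COMMIT TRANSACTION;
            BREAK;
        END

        IF @msg_type IN ('http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
                         'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
            END CONVERSATION @handle;
        ELSE
            INSERT INTO dbo.ProcessedMessages (Body) VALUES (@body);

        COMMIT TRANSACTION;
    END
END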

Any advice provided would be helpful.



Thanks!

View 4 Replies View Related

Best Practices For Accessing A Sql 2005 Db On The SAME Box As IIS 6.0 Serving Asp.net 2.0

May 17, 2007

I am re-posting this from the security forum, where it remains unanswered.

OK, Here's the set up.

I have a Windows 2003 box, soon to have SSL installed

On it is IIS 6.0, SQL 2005 Standard Edition (5Cal user lic)



Soon I'll have a production environment where a web app served by IIS is accessing SQL. I can go into SQL and set up a user account, call it MyAppSQLAcess, code it into the connection string, and lock it down to the tables/db it has access to. Or I can do it with Windows authentication, or a number of other ways. The question is this:

What is the best way for an ASP.NET app served by IIS 6.0 to access data from SQL Server 2005 when they are all on the same box, with respect to speed and security?
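If the dedicated-SQL-account route is taken, a hedged sketch of locking it down (database and object names are placeholders; Windows authentication via the application pool identity is the usual alternative):

CREATE LOGIN MyAppSQLAcess WITH PASSWORD = 'Str0ng!Passw0rd', CHECK_POLICY = ON;
GO

USE MyAppDb;
GO
CREATE USER MyAppSQLAcess FOR LOGIN MyAppSQLAcess;

-- Grant only what the app actually uses instead of db_owner:
GRANT EXECUTE ON SCHEMA::dbo TO MyAppSQLAcess;
GRANT SELECT ON dbo.Products TO MyAppSQLAcess;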



Thanks



Dan

View 2 Replies View Related

Best Practices For Insert/Update/Delete

Jul 24, 2007

For now, doing a small school project, I find doing SPs for INSERT useful, e.g. checking for existing data and not inserting. That might not be the best method; I had advice from here that I can use unique constraints instead. Then what about UPDATE and DELETE: SPs also? Do the pros make SPs for everything? Currently I use dynamically generated SQL from SqlDataSources for update/delete, and some deletes are SPs too...
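A minimal sketch of an insert-or-update procedure guarded by an existence check (table and column names are placeholders; a unique constraint on the key column still belongs there as the real safety net):

CREATE PROCEDURE dbo.SaveStudent
    @StudentEmail VARCHAR(100),
    @StudentName  VARCHAR(60)
AS
BEGIN
    SET NOCOUNT ON;

    IF EXISTS (SELECT 1 FROM dbo.Student WHERE StudentEmail = @StudentEmail)
        UPDATE dbo.Student
        SET StudentName = @StudentName
        WHERE StudentEmail = @StudentEmail;
    ELSE
        INSERT INTO dbo.Student (StudentEmail, StudentName)
        VALUES (@StudentEmail, @StudentName);
END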

View 2 Replies View Related

Best Practices SFTPing Files Between Servers

Sep 28, 2007

Hi - I have a SQL 2005 database that stores weekly time and attendance data for 500 users. I need to send the file on a scheduled basis. I'd also like to have programmatic control over sending the file via my ASP.NET application's admin pages (I have roles and security set up on the site). My question is two-fold: can ASP.NET be scripted to manage a data extraction to a file (I've got to HOPE so), and are there any sFTP functions built into ASP.NET? I know I can do this in SQL 2005 and then have a scheduled event fire off the file via sFTP, but like I said, I'd like to have programmatic control over the file sending. (Of course they have to be sent on SUNDAY!) Thanks in advance for any advice.

View 6 Replies View Related

Accessing Database Data In ASP.NET 2.0 - Best Practices?

Dec 31, 2007

I was wondering if you guys know of a good site that talks about programmatically accessing and displaying data from a SQL Server 05 database in ASP.NET 2.0. I want to have a data adapter in a dataset, but I would like to create my own class file and pull the data from the adapter through code into the class. Is this the best way? I'm wondering about the best practices while learning this new technology. Any articles provided would be appreciated. Thanks!

View 2 Replies View Related

Full-Text Indexing: Best Practices?

Oct 4, 2005

Hey guys,

I'm about to deploy full-text searching to complement ASP.NET/C#.NET. I unfortunately have not had much experience in Full-Text Indexing...

I will be creating catalogs for each table in the database (end-users pretty much require that we are able to search by anything). I see that there is a catalog population schedule... typically, what is the best practice for updating the catalogs? The database entries will most likely be updated frequently, as they contain the end-users' curricula vitae. Would once a day, say midnight, be a sufficient schedule?

And also, as it pertains to best practices, is the method I am applying the best way to do this (a catalog per table)? I have about 30 tables in a relational model... is there a better way to catalog the entire thing instead of table by table?
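For what it's worth, SQL Server 2005 can keep a full-text index current with change tracking instead of a nightly population; a hedged sketch with placeholder names:

CREATE FULLTEXT CATALOG ResumeCatalog;

CREATE FULLTEXT INDEX ON dbo.CurriculumVitae (Summary, WorkHistory)
    KEY INDEX PK_CurriculumVitae      -- must be a single-column, unique, non-nullable index
    ON ResumeCatalog
    WITH CHANGE_TRACKING AUTO;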

Thanks a lot for the help!

Eric...

View 3 Replies View Related






