[Q] High Transaction Load Solution?

Nov 10, 2001

Hey guys,

I originally wrote a post here regarding some info on setting up a cluster. Upon further analysis of the problem with our system, I noted that at particular times we have tremendous amounts of UPDATE, INSERT, DELETE, etc. transactions hitting our database.

I thought originally SQL clustering could solve this problem, but the time and upkeep required to maintain such a configuration might not be feasible and, more importantly, it may not even fix the problem.

Next week I plan on doing some more specific performance monitoring of the database during normal business activity, but my initial suspicion is that there is a tremendous amount of I/O processing due to the high transaction load, which is slowing down the application.
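
If it's useful, per-file I/O counters can also be read straight from the server rather than Perfmon - a sketch, assuming SQL Server 2000 and a hypothetical database name:

DECLARE @db int
SELECT @db = DB_ID('MyAppDB')
-- cumulative reads, writes and I/O stall time per file since the server last started
SELECT FileId, NumberReads, NumberWrites, IoStallMS
FROM ::fn_virtualfilestats(@db, 1)   -- 1 = the primary data file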

I was wondering what you have done to alleviate such problems. One of the solutions I have come up with is to possibly create a Master/Slave SQL Server design where the Slave handles most of the database transactions and then updates the Master DB during a low-load period of the day. How does this sound? Any other ideas would be greatly appreciated...

Thanx

View 1 Replies



Solution Design For High Volume Of Data

Apr 18, 2006

Hi,

I have been asked to design a solution for a client of mine who basically requires the daily analysis and reconciliation of the differences between 2 extremely large text files.

The files are not in an identical format but are both in some form of delimited format (one is CSV, the other is a little more complex). For the sake of this question, let's assume that I can effectively import each file into an MS SQL table.

Each file will have in excess of 100,000 rows each day (new data for each day).

Whilst I know that MS SQL easily has the capacity to store the data, is there a recommended way to tackle the potential problems? (I imagine that performance is important... they will be running the report every day.)

Or is building the solution as simple as importing the data into 2 tables, and then querying the differences and outputting as a report using Crystal?
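
If it helps frame the problem, the core of the reconciliation could indeed be a single query over the two staging tables - a sketch, where all table and column names (FileA, FileB, RecordKey, Amount) are hypothetical:

SELECT COALESCE(a.RecordKey, b.RecordKey) AS RecordKey,
       a.Amount AS AmountA,
       b.Amount AS AmountB
FROM FileA a
FULL OUTER JOIN FileB b ON a.RecordKey = b.RecordKey
WHERE a.RecordKey IS NULL       -- row only in file B
   OR b.RecordKey IS NULL       -- row only in file A
   OR a.Amount <> b.Amount      -- in both, but different

For daily performance at 100,000+ rows, an index on the join key in both staging tables would likely matter more than anything else.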

Any suggestions appreciated.

Thanks

Rael

View 10 Replies View Related

High CPU Load On Service Start For Approx 10 Mins

Mar 12, 2008

This has been driving me crazy.

First noticed it when I went from JIT compiling of my .NET app to pre-compiling via the command prompt, although I can't see that that's got anything to do with it...

My database is around 2gb and I'm using service broker.

If I drop a new version of my app into production, sqlserver.exe goes nuts and eats up the cpu.

Tonight, I stopped IIS and SQL Server, then dropped a new version in. I hadn't re-started IIS at this point... As soon as I started SQL Server, off it went again - eating up the CPU. I quickly launched a trace via the Profiler - no activity showing.

So, I started IIS after a couple of mins (still with the high CPU) and then just waited. After about 10 mins, SQL Server settled down and everything was normal again.

What is it doing when I start the SQL Server service?
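
One way to peek at it next time, besides a trace: ask the server which sessions are consuming CPU and what they last waited on - a sketch, assuming SQL Server 2005:

-- cumulative CPU and I/O per session; system spids often show post-restart work
SELECT spid, status, cmd, lastwaittype, cpu, physical_io
FROM master..sysprocesses
WHERE cpu > 0
ORDER BY cpu DESC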

Si

View 2 Replies View Related

Random SQL Errors In Application During A High-load Almost 'batch' Process. Is This Expected?

Jan 4, 2008

Setup is a common VB.NET application with a SQL Server back end and 5-10 users. Occasionally, from another VB.NET app, I'll need to run a process to upload and insert a handful of records. The records are somewhat large, with 100+ fields that require a handful of SELECT and UPDATE statements, making the server 'busy' for several minutes.

Independently, the Main application and the Upload application work just fine. Problems only occur when users are active in the main application during the 'upload' process. It's almost as if they 'bump' into each other and the server throws errors in either app, even on simple SQL statements. The same tables are being queried, but it doesn't feel like a concurrency issue.


Are these problems expected when running somewhat of a high load?
Any ideas on how to make these guys work together without them fighting?
Shouldn't SQL be able to handle the statements even during the barrage of update queries and requests?
What happens in situations with hundreds or even thousands of users simultaneously accessing?
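
One common mitigation, in case it's useful: break the upload's big UPDATEs into small batches so other sessions get a chance between lock acquisitions - a sketch, assuming SQL Server 2005 or later, with hypothetical table/column names:

DECLARE @rows INT
SET @rows = 1
WHILE @rows > 0
BEGIN
    -- touch at most 500 rows per statement so locks stay short-lived
    UPDATE TOP (500) dbo.UploadStaging
    SET Processed = 1
    WHERE Processed = 0
    SET @rows = @@ROWCOUNT
END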


Thanks for any advice or input...

View 1 Replies View Related

Separate Databases For High/low Transaction Volumes?

Jun 23, 2006

I have an existing database with approx 500,000 rows, accessed by a few hundred users per day creating approx 1,000 new records per day plus typical reporting - relatively low volume stuff for SQL Server.

I'm about to add a process that will be importing data daily from legacy databases and summarizing it for reporting purposes, integrating it with the existing database. This volume of data will be considerably higher, perhaps 100,000+ rows per day, which will be deleted once it has been summarized and the results written to some intermediate tables.

Is there any concern about mixing different levels of volume within one database? As I'll be creating lots of rows daily and then deleting them, I was wondering about fragmentation, transaction logging etc. and whether having this processing in a separate database from the main application would be 'better'.
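
On the create-and-delete churn specifically, one pattern that keeps the transaction log manageable is deleting the imported rows in chunks rather than one statement - a sketch, assuming SQL Server 2005 or later, with hypothetical names:

WHILE 1 = 1
BEGIN
    -- each chunk commits separately, so there is no single huge log transaction
    DELETE TOP (10000) FROM dbo.ImportStaging
    WHERE Summarized = 1
    IF @@ROWCOUNT = 0 BREAK
END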

View 3 Replies View Related

Backup Solution - Question About Transaction Log

Aug 6, 2007

I've just inherited (i.e., our sys admin / DBA left the company) a fairly small SQL Server that's running 7 production databases. Most are quite small, but there are two which are about 40gb each. Traffic is quite low - ~30-40 users at one time doing your basic SELECT / UPDATE / INSERT stuff.

Anyway, I was going through some of the backups jobs and noticed that the transaction logs for each database were absolutely huge (in some cases bigger than the DB itself) which led me to think the log wasn't getting truncated.

Currently, the transaction log for 6 of the DBs is backed up 3 times a day (and the 7th, "mission critical" DB's log every 15 minutes) with the following T-SQL:

BACKUP LOG <database> to <device> WITH NOINIT, NOFORMAT, NOSKIP, NOUNLOAD

5 of the 7 databases get a full backup twice a day, with the 2 larger ones getting a differential, e.g.:

BACKUP DATABASE <database> TO <device> WITH NOINIT, NOUNLOAD, NAME = N'db', NOSKIP, STATS = 10, DESCRIPTION = N'db', NOFORMAT, MEDIANAME = N'db'

DECLARE @i INT
SELECT @i = position FROM msdb..backupset
WHERE database_name = 'db' AND type != 'F'
AND backup_set_id = (SELECT MAX(backup_set_id) FROM msdb..backupset WHERE database_name = 'db')

RESTORE VERIFYONLY FROM <device> WITH FILE = @i

This is then backed up to tape each night.

Looking through the documentation, those WITH commands are largely the default settings so I'm not sure why they're specified explicitly.

If I issue a

BACKUP LOG <database> to <device> WITH INIT, SKIP

then the log does get truncated. However, could someone explain the implications of that for me? As I understand it, INIT will overwrite any existing sets in the device, but considering that it will always back up anything that hasn't been committed, should that be a problem?

Alternatively could someone perhaps explain why the log wasn't getting truncated? It is my understanding that this should happen every time a full backup is completed... which is twice a day. Or does the Transaction log need to be manually shrunk every now and then?
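
For what it's worth, a quick way to check whether the log space is actually being reused (as opposed to the file just staying large) is:

DBCC SQLPERF(LOGSPACE)
-- one row per database: log file size in MB and the percentage currently in use;
-- a huge log that is only a few percent used is being truncated but was never shrunk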

Also, I understand the DECLARE... part in the last part of that DB backup SQL, but is it at all necessary?

Finally, does this backup strategy seem viable? Any thoughts and comments are appreciated!
Matt

View 5 Replies View Related

SQL 2012 :: High Availability Group Transaction Log Shrink

Feb 3, 2015

I have a Customer running a database in a High Availability Group and I am not familiar with the set up... They have a transaction log that quadrupled in size during a data import and update which was generated by an external application. They have limited server space and would like to shrink the log again now as this import / update only happens once a year. The way this has always been dealt with in the past was by shrinking the DB and logs after the update.

Now however, when attempting to do a log or db shrink, an error message is generated which says that the log cannot be shrunk as the DB is in use as part of an Availability Group....

The more I search and read up on this subject, the more it looks like the DB has to be removed from the Availability Group before the log can be shrunk, and then the Availability Group has to be re-created or restored in some way. Is there a simple solution to this conundrum?
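
For reference, the shrink sequence on an ordinary database looks like the sketch below; whether it is permitted on an Availability Group primary in this setup is exactly the open question (MyDB and MyDB_log are hypothetical names):

BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_log.trn'  -- free space inside the log first
DBCC SHRINKFILE (MyDB_log, 1024)                      -- then shrink the file, here to 1 GB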

View 9 Replies View Related

SQL 2012 :: Monitoring Table In High Transaction Database

Nov 4, 2015

I am developing a process to monitor a table in a high transaction database. The process will count the number of rows in the table to verify whether it has changed or is stuck. Because the database has a lot of transactions, I don't want to execute a query against the database too often. Is there another suitable way to accomplish this goal?
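
One common approach, assuming SQL Server 2012 as in the title: read the row count from catalog metadata instead of scanning the table, which is nearly free no matter how busy the table is - a sketch with a hypothetical table name:

SELECT SUM(p.rows) AS approx_row_count
FROM sys.partitions p
WHERE p.object_id = OBJECT_ID('dbo.MyBusyTable')
  AND p.index_id IN (0, 1)   -- heap or clustered index only, to avoid double counting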

View 2 Replies View Related

Transaction Log Load

Apr 9, 1999

Hi,

does anybody know how to restore transaction log dumps (several dumps in one dump device) with TSQL statements?

I tried "Load transaction devicename"; this fires an error message complaining that I have to use the right sequence.
How do I tell the LOAD command which sequence it shall use?

You can use the "LOAD HEADERONLY" command, which gives you all entries from the dump device. How can I use this information?
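
In case it helps, my understanding of the old DUMP/LOAD syntax is that the FILE option picks which dump on the device to apply, in the order they were taken - a sketch with hypothetical database and device names:

LOAD TRANSACTION mydb FROM tranlogdev WITH FILE = 1
LOAD TRANSACTION mydb FROM tranlogdev WITH FILE = 2
-- LOAD HEADERONLY lists every dump on the device; its sequence/position output
-- tells you which FILE number corresponds to which dump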


Thanks

Yorn

View 1 Replies View Related

Database Is In Use-can You Load Transaction Log?

May 18, 1999

I am trying to load a transaction log, but I am getting an error message saying the database is in use. My question is: is it possible to load a transaction log while the database is in use? If yes, how?
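
As far as I know, LOAD requires that no one is using the database, so the usual step is to shut users out first - a sketch using era-appropriate syntax, with hypothetical names:

EXEC sp_dboption 'mydb', 'dbo use only', true
-- kill any remaining user connections, then:
LOAD TRANSACTION mydb FROM tranlogdev WITH FILE = 1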

Thanks
Bharvi

View 1 Replies View Related

Automate Load Of Transaction Dumps

Jul 1, 1999

Hi,

does anybody know how to automate the loading of incremental transaction dumps? The manual way is to use the "load tran DB with FILE = x" statement.

Since this has to be done in the right sequence, and I need to automate this task to keep a second server up to date, I'd like to know if there is a stored procedure
or any other tool which could do the task.
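
One possibility, lacking a stock stored procedure: script a simple loop over the dump positions - a sketch, where the device name and the number of dumps (which you would read from LOAD HEADERONLY) are hypothetical:

DECLARE @i int, @sql varchar(255)
SELECT @i = 1
WHILE @i <= 5   -- number of dumps on the device
BEGIN
    SELECT @sql = 'LOAD TRANSACTION mydb FROM tranlogdev WITH FILE = ' + CONVERT(varchar, @i)
    EXEC (@sql)
    SELECT @i = @i + 1
END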

Thanks for your help

Best Regards

Yorn Ziesche

View 1 Replies View Related

OLE DB Fast Load And Transaction Problem

Dec 14, 2007

We are experiencing problems when using the OLE DB Fast Load option with transactions. We have a sequence container containing a Data Flow Task with an OLE DB source selecting from tab1 LEFT JOIN tab2, inserting into tab2 through an OLE DB Destination with Fast Load.


The setup/troubleshooting is:

We are using TransactionOption=Required on the sequence container holding the Data Flow Task.

OLE DB destination with Fast Load, no table lock, no check constraints.

It doesn't happen on small data volumes - it seems that if everything can be contained in one buffer, the task succeeds.

It is only a problem when the selection and the insertion are on the same table, or a view based on the same table.

I know that we can use the OLE DB destination without Fast Load, but that will perform badly on large data volumes, so it is not an option.
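
One workaround that might be worth testing: keep Fast Load but point the destination at a staging table, so the source and the destination are no longer the same table, then move the rows in a follow-on Execute SQL Task - a sketch with hypothetical names:

-- Execute SQL Task after the Data Flow: move staged rows into the real target
INSERT INTO dbo.tab2 (col1, col2)
SELECT col1, col2
FROM dbo.tab2_staging

TRUNCATE TABLE dbo.tab2_staging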


The error message is as follows:



Code Block
Information: 0x402090DF at dft Upsert Sag, OLE DB Destination [8295]: The final commit for the data insertion has started.
Error: 0xC0202009 at dft Upsert Sag, OLE DB Destination [8295]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "This operation conflicts with another pending operation on this transaction. The operation failed.".

Does anyone have any ideas how to solve this problem?


View 3 Replies View Related

Load SQL 7 Database And Transaction Log Backups To SQL 2000

Jul 20, 2005

Is it possible to load both the SQL 7 database and transaction log backups to SQL 2000? I assume it will perform the upgrade during the load.

Thanks,
James
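
For reference, the restore sequence itself would look like an ordinary restore - a sketch with hypothetical names; my understanding is that the engine performs the upgrade during recovery:

RESTORE DATABASE MyDB FROM DISK = N'D:\Backups\MyDB_sql7.bak' WITH NORECOVERY
RESTORE LOG MyDB FROM DISK = N'D:\Backups\MyDB_sql7_log.trn' WITH RECOVERY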

View 4 Replies View Related

High Availability To High Protection Without Reconfiguring Mirroring

Apr 23, 2007

Hi,

Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure Database Mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance, but not High Protection despite both of them being Synchronous.

If not, what are the recommended steps to reconfigure the mirror once it has already been configured? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop the mirroring and remove the witness, will the High Protection option be available?
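
If I understand the modes correctly, High Protection is simply synchronous mirroring without a witness, which suggests it can be reached with ALTER DATABASE rather than rebuilding the mirror - a sketch, run on the principal, with a hypothetical database name:

ALTER DATABASE MyDB SET WITNESS OFF          -- remove the witness (no automatic failover)
ALTER DATABASE MyDB SET PARTNER SAFETY FULL  -- keep commits synchronous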

Thanks,
J.

View 3 Replies View Related

High Safety Changed To High Performance After Fail Over ?

Mar 6, 2008

Hi There

I realise this is a stupid question, but I cannot really find any confirmation of this in BOL.


If you are running High Safety with automatic failover, when failover occurs, does this automatically change to High Performance mode? Since for failover to occur something has happened to the primary, it will be impossible to commit transactions on the new primary and the mirror synchronously, since one of them is no longer available.


So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?
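
A quick way to see what mode a session has actually ended up in after failover (a sketch, assuming SQL Server 2005 or later):

SELECT DB_NAME(database_id) AS db,
       mirroring_safety_level_desc,   -- FULL = high safety, OFF = high performance
       mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL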

Thanx

View 4 Replies View Related

SOLUTION! - VB.NET 2005 Express And SQL Server 2005 - NOT Saving Updates To DB - SOLUTION!

Nov 30, 2006

VB.NET 2005 Express and SQL Server 2005 Express - NOT saving updates to DB - SOLUTION!

-----------------------------------

The following article is bogus and confusing:

How to: Manage Local Data Files - Setting 'Copy to Output Directory' to 'Do not copy'
http://msdn2.microsoft.com/en-us/library/ms246989.aspx

You must manually copy the database file to the output directory
AFTER setting 'Copy to Output Directory' to 'Do not copy'.

Do not copy: The file is never copied or overwritten by the project system. Because your application creates a dynamic connection string that points to the database file in the output directory, this setting only works for local database files when you manually copy the file yourself.

You must manually copy the database file to the output directory
AFTER setting 'Copy to Output Directory' to 'Do not copy'.

-----------------------------------

The above article is bogus and confusing.

This is ridiculous!

This is the most vague and convoluted bunch of nonsense I've ever come across!

Getting caught out on this issue for the 10th time!
And not being able to find an exact step-by-step solution.

--------------------------

I've tried it and it doesn't work for me.

Please don't try what the article alludes to, as I'm still sorting out exactly what is supposed to be happening.



If you have a step-by-step procedure that reproduces this properly, please PM me.

I would like to test its validity and then update this exact post as a solution rather than just another discussion thread.

Many thanks.



This is the exact procedure I have come up with:

NOTE 1: DO NOT allow VB.net to copy the database into its folders/directories.

NOTE 2: DO NOT hand copy the database to a folder/directory in your project.

Yes, I know it's hard not to do it because you want your project nice and tidy.
I just simply could NOT get it to work.
You should NOT have myData.mdf listed in the Solution Explorer. Ever.

Create a folder for your data following NOTE 2.

Copy your data to that folder. * mine was C:\mydata\myData.mdf



Create a NEW project.

Remove any Data Connections. (no matter what)

Save it.

Data | View Data Sources

Add New Data Source

select NEW CONNECTION (no matter what, do it!)

Select the database. * again mine was C:\mydata\myData.mdf

Answer NO to the question:
Would you like to copy the file to your project and modify the connection?
- NO (no matter what - ANSWER NO! - Absolutely NO)
Then select the tables you want in the DataSet,
and Finish.



To Test ----------

From the Solution Explorer | click the table name drop down arrow | select details
Now Drag the table name onto the form.

The form is then populated with a Navigation control
and matching Labels with corresponding Textboxes for each field in the table.

Save it.

1) Run the app.

Add one database record to the database by pressing the Add(+) icon

Just add some quick junk data that you don't mind getting lost if it doesn't save.

YOU MUST CLICK THE SAVE ICON to save the data you just entered.

Now exit the application.

2) Run the app again.

And verify there is one record already there.

Now add a second database record to the database by pressing the Add (+) icon.

NOW add some quick junk data that you WILL intentionally lose.

*** DO NOT *** press the save icon.

Just Exit the app.

3) Again, Run the app.

Verify that the first record is still there.

Verify that the Second record is NOT there.
It's NOT there because you didn't save the data before exiting the app.

Proving that YOU MUST CLICK THE SAVE ICON to save the data you just entered.

Also proving you must add your own code to catch the changes
and ask the user to save the data before exiting or moving to another record.

As a side note, since vb.net uses detached datasets
(a copy/snapshot of the dataset in memory, NOT directly linked to the database),
the dataset will reflect all changes made while moving around the detached dataset.
YOU MUST REMEMBER TO SUBMIT YOUR CHANGES TO THE DATABASE TO SAVE THEM.
Otherwise, they will simply be discarded without notice.

Whewh!

I hope this saves me some time the next time I want to start a new database project.

Oh, and uh, for anyone else reading this post.

Thanks,
Barry G. Sumpter

Currently working with:
Visual Basic 2005 Express
SQL Server 2005 Express

Developing Windows Forms with
101 Samples for Visual Basic 2005
using the DataGridView thru code
and every development wizard I can find within vb.net
unless otherwise individually stated within a thread.

View 17 Replies View Related

Not Able To Load The Application In Case Of Web Farm/Garden When We Load Data Through Background Thread

Dec 14, 2007

Hi,

Here I will describe my problem.
1. We are loading a large amount of data from the database on a background thread which starts in the Application_Start event in global.asax.cs. The data is later cached for subsequent requests to improve performance.
2. Now when we put the application on a web farm/garden, it is not able to load the application.
3. We are sending the requests to the servers through a router kind of application.
4. This application works fine in a single-server environment.

Please help us.

Ajay Kumar Dwivedi

View 1 Replies View Related

Error 8525: Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

May 31, 2008

Hi All

I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.

If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Qry Analyser.

I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.

-- Clear the replication flags on the remote server (via OPENDATASOURCE) and on the local copies
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN


It's got me stumped, so any ideas gratefully received. Thx

View 1 Replies View Related

Load All Data Without Knowing Old Data Was Loaded The Previous Time???

Apr 27, 2007

I have just done the SSIS example in the tutorial document included when installing SQL 2005 ENT. I have a problem: whenever I run a test, the package loads all the data from the source without checking what is already there (I mean it loads all the data to the destination). I have done it several times and it continues to load everything without checking. That means the data is duplicated when the schedule runs???



I think there should be a parameter or something like that to help the engine load just the new data to the destination. Could you help please?
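
There is no automatic "new rows only" switch in a plain data flow; the usual pattern is to filter the source query on something that identifies new data - a sketch, with hypothetical table and column names:

-- OLE DB source query: load only rows added since the last successful run;
-- the watermark is stored in a control table and mapped to the ? parameter
SELECT OrderID, CustomerID, OrderDate
FROM dbo.Orders
WHERE OrderDate > ?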



Thanks

View 3 Replies View Related

SSIS, Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Feb 22, 2007

I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.

For reading the data I have written dynamic T-SQL based on some conditions, and based on that it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout ERROR.

I have increased and decreased the timeout to catch the error, but it is still there; I have also tried setting the CommandTimeout property to 0.

If I use 0 for CommandTimeout then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

and

Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.

Please help me, it's very urgent.

View 3 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Feb 6, 2007

I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.

Does anybody have an idea?!

View 1 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Dec 22, 2006

I have a sequence container; in it I have a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these are inside a For Each Loop container. When I run the package it works fine on the 1st loop, but it gives me an error on the second execution.

Message is like this:

Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.

View 8 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction. (HELP)

Jan 8, 2008

Hi,

I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.



Thanks in advance.


View 2 Replies View Related

Load A SSIS Package Via Web Service: The Package Failed To Load Due To Error 0xC0011008 Error Loading From XML. WHAT IS THAT?

May 19, 2006

Hello,

I have a big problem and I'm not able to find any hints on the net.

I have a Windows 2000 PC, VS2005, IIS 5 and SQL Server 2005 (dev edition).

I created an SSIS Package (query to DB and the result is loaded into an Excel file) that works fine.

I imported the dtsx file inside my "Stored Packages".

I would like to load and run the package programmatically in a remote scenario using web services.

I created a solution with a web service and a web page that invokes the web service.

When my code executes:
Microsoft.SqlServer.Dts.Runtime.Application.LoadFromDtsServer(packagePath, ".", Nothing)

I got the Error:
Microsoft.SqlServer.Dts.Runtime.DtsRuntimeException: The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails.

The error message doesn't help much, and there is nothing on the web to give me any advice...

Is it an SSIS problem???

Thank you for any help!!

Marina B.



View 10 Replies View Related

SQL Server Admin 2014 :: Restore Lost Transaction From Transaction Log File

Jun 10, 2015

I have a full database backup up to the previous day and the transaction log file containing today's transactions. My database has crashed. I have restored the previous day's full backup, but I am having difficulty restoring today's transactions from today's transaction log. What are the steps to restore the full database backup and one day's transaction log file? Note: there is no differential database backup and no transaction log backup.
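
Assuming the log file itself survived the crash, the usual sequence is: capture the tail of the log, restore the full backup without recovery, then apply the log - a sketch with hypothetical names:

-- 1) Back up today's transactions from the surviving log file
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_tail.trn' WITH NO_TRUNCATE
-- 2) Restore yesterday's full backup, leaving the database ready for log restores
RESTORE DATABASE MyDB FROM DISK = N'D:\Backups\MyDB_full.bak' WITH NORECOVERY
-- 3) Apply the tail of the log and bring the database online
RESTORE LOG MyDB FROM DISK = N'D:\Backups\MyDB_tail.trn' WITH RECOVERY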

View 8 Replies View Related

I Need A Way To Show The Pending Transactions From Transaction Replication In A User Friendly Format.

Jul 11, 2007



I want to list out the pending transactions for transactional replication, by publication.



Help needed.
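
One possible starting point, assuming SQL 2005+ transactional replication: the distribution database has a stored procedure that reports pending commands per subscription - a sketch, where every name below is hypothetical:

EXEC distribution.dbo.sp_replmonitorsubscriptionpendingcmds
    @publisher = N'MyPublisher',
    @publisher_db = N'MyPubDB',
    @publication = N'MyPublication',
    @subscriber = N'MySubscriber',
    @subscriber_db = N'MySubDB',
    @subscription_type = 0   -- 0 = push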

View 1 Replies View Related

TRANSACTIONS In SSIS (error: The ROLLBACK TRANSACTION Request Has No Corresponding BEGIN TRANSACTION)

Nov 14, 2006

I'm receiving the below error when trying to implement Execute SQL Task.

"The ROLLBACK TRANSACTION request has no corresponding BEGIN TRANSACTION." This error also happens on COMMIT as well and there is a preceding Execute SQL Task with BEGIN TRANSACTION tranname WITH MARK 'tran'

I know I can change the TransactionOption property from "Supported" to "Required"; however, I want to mark the transaction. I was copying the way the Import/Export Wizard does it, but I'm unable to figure out why it works there and not here.
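
For reference, the marked-transaction pattern being attempted looks something like the sketch below. One assumption worth checking: BEGIN and COMMIT/ROLLBACK must run on the same connection, which in SSIS usually means setting RetainSameConnection=True on the connection manager.

BEGIN TRANSACTION tranname WITH MARK 'tran'
-- ... work done by the intervening tasks on the same connection ...
COMMIT TRANSACTION tranname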

Anyone know of the reason?

View 1 Replies View Related

Analysis :: Find Amount Distribution Across Different Transaction Types Under Spend Transaction

Jul 27, 2015

I created a calculated measure in the cube, something like this: ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],[Measures].[Transaction Amount]), to get only SPEND transactions. Now I want to slice this measure by the same hierarchy to find the amount distribution across the different transaction types under the SPEND category. But the query behaves as if the measure has no relationship to the hierarchy.

You can think of it as the query below:
WITH
MEMBER [Measures].[SPEND] AS ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],[Measures].[Transaction Amount])
SELECT NON EMPTY {[Measures].[SPEND]} ON 0
,NON EMPTY ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent]) ON 1
FROM [CUBE]

View 6 Replies View Related

How High Will @@Identity Go??

Jul 21, 2000

I need to use a system-generated key and will have over 2 billion items.
I'm considering using a data type like the following.

Decimal(19,0) will give me the large numbers I need without making my index wider than 9 bytes.

Uniqueidentifier is 16 bytes.

Anyone know if this will cause me pain in the future? Specifically, using @@identity after inserts to get the value (I tested it and it seems OK).


CREATE TABLE IdentityTest
(
    MyID DECIMAL(19,0) IDENTITY(1,1),
    MyChar CHAR(1) NOT NULL DEFAULT 'Y'
)
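
For what it's worth, a quick sanity check of the identity behaviour might look like this (a sketch):

INSERT INTO IdentityTest (MyChar) VALUES ('N')
SELECT @@IDENTITY AS NewMyID   -- returns the DECIMAL(19,0) value just generated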


Any ideas welcome.

Dano

View 3 Replies View Related

High Cpu Utilization

Mar 21, 2000

After a fresh install of SQL 6.5 with SP5a (or without), the CPU is running at anywhere from 50%-80%. It is loaded on a PDC, but when I stop the SQL service, CPU utilization drops to 0-2%. When I start the SQL service it's right back up there. Does anyone have a suggestion as to how to fix this, or why the service would be doing this?

Thanks, Christel

View 1 Replies View Related

High-Availability

Mar 2, 2004

Hello everybody,
We are starting a new project:
customer validation with minimum wait time.
How do we ensure high availability?
We have 2 standard servers and in the past used custom log shipping.

Log shipping still requires manual intervention, while the goal is to switch
between servers automatically.

Clustering is a last possible solution.

Could someone recommend any other solutions or products?

Thank you

Alex

View 1 Replies View Related

High Disk I/o

Aug 14, 2002

Hello,

We are experiencing high disk I/O on one of our RAID disk systems. Can someone tell me how I can identify the query, user, or process which is causing this high disk I/O?
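
One crude way to narrow it down to a connection (a sketch; physical_io in sysprocesses is cumulative per spid, so sample it a couple of times and compare):

SELECT spid, loginame, hostname, cmd, physical_io
FROM master..sysprocesses
ORDER BY physical_io DESC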

Thanks,
Brent.

View 1 Replies View Related

High CPU Utilization !!!! Help !!!!!!!

Sep 5, 2002

I am getting high CPU utilization on the SQL Server process (>90%).
However the overall utilization (NT -- entire box) always seems to be under 50%.

Can someone explain why this is happening? The server is a quad; the SQL Server process seems to be using only two CPUs at a time (not the same ones all the time).

Lightweight pooling has been turned on and the maximum worker thread size has been left at the default value (255).

How can I configure SQL Server options to spread the load across all four CPUs??????
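
For reference, CPU affinity is controlled with sp_configure; the default already allows all CPUs, so if only two are ever busy, the cause may be the workload rather than configuration - a sketch:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- 0 (the default) lets SQL Server schedule on every CPU; a bitmask such as 15 pins it to CPUs 0-3
EXEC sp_configure 'affinity mask', 0
RECONFIGURE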

View 2 Replies View Related






