Stress Management
Sep 22, 1998
What exactly is Stress Management?
Any tools for assessing Stress Management?
Hi,
Is there any way to do load testing/stress testing on SQL Server? I want to execute a stored procedure with concurrent users.
Thanks
JK
Can anyone tell me where I can download a stress testing tool for SQL Server 7.0?
Thank You,
Piyush Patel
Anyone have a tool or script to run to test a SQL server under heavy stress?
Anyone know a decent stress tester tool for SQL Server 2000? I need to simulate many concurrent users executing queries...
Regards,
/S
What do you recommend for stress-testing the performance of key stored procedures (they have been identified) for our application? The parameters can be programmatically selected, for example:
Select 'exec my_proc @id = ' + Cast(id As varchar)
From myTable
Where foo = 'bar'
I have the Support Tools Available For Stress Testing & Performance Analysis (http://www.microsoft.com/downloads/details.aspx?FamilyId=5691AB53-893A-4AAF-B4A6-9A8BB9669A8B&displaylang=en) from Microsoft's site and they are pretty good.
Recommendations appreciated, and thanks in advance!
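For what it's worth, the shape of such a harness can be sketched in ordinary Python. Here run_proc is a stand-in (an assumption, not a real driver call) for executing `exec my_proc @id = ...` against the server; only the concurrency and latency bookkeeping is meant seriously:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def run_proc(proc_id):
    """Stand-in for executing 'exec my_proc @id = <proc_id>' via a DB driver."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # simulate query latency
    return proc_id, time.perf_counter() - start

def stress(ids, concurrency=10):
    """Run the stored-proc stand-in for every id at the given concurrency
    and report simple latency statistics."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(run_proc, ids))
    latencies = sorted(t for _, t in results)
    return {
        "calls": len(results),
        "max_latency": latencies[-1],
        "p95_latency": latencies[int(len(latencies) * 0.95) - 1],
    }

if __name__ == "__main__":
    stats = stress(range(200), concurrency=25)
    print(stats["calls"])
```

In a real test the sleep would be replaced by the actual stored-procedure call, with the id list produced by the SELECT above.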
Hi,
I have a new server with 32 GB of RAM installed, and I have user databases on this server. I am using SQL Server 2000 Enterprise Edition and the platform is Windows 2003 Advanced Server, which supports up to 128 GB of memory.
sp_configure 'awe enabled' is set to 1 and at OS level, AWE is enabled as well.
max server memory (MB) is 2147483647
I was doing some stress testing on this server, but memory usage doesn't go beyond 180 MB. Can someone suggest a test for physical RAM?
How can I make sure that application will make full use of available physical memory?
Rgds
Wilson
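One crude way to push the buffer pool past 180 MB is to scan a table larger than the memory you want exercised. As a sketch (the table name, batch count, and use of master..spt_values as a row source are illustrative assumptions, not from the post), a throwaway T-SQL fill script can be generated like this:

```python
def make_fill_script(table="dbo.MemStressPad", row_batches=200):
    """Build a T-SQL script that creates a table of ~8 KB rows (about one
    page per row), fills it in a loop, then scans it so SQL Server pulls
    the pages into the buffer pool."""
    lines = [
        f"CREATE TABLE {table} (id INT IDENTITY PRIMARY KEY, pad CHAR(8000) NOT NULL);",
        "SET NOCOUNT ON;",
        "DECLARE @i INT",
        "SET @i = 0",
        f"WHILE @i < {row_batches}",
        "BEGIN",
        f"    INSERT INTO {table} (pad) SELECT 'x' FROM master..spt_values WHERE type = 'P'",
        "    SET @i = @i + 1",
        "END",
        f"SELECT COUNT(*) FROM {table};  -- scan to warm the buffer pool",
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    print(make_fill_script(row_batches=5))
```

After running the generated script, Perfmon's SQL Server memory counters (rather than Task Manager) should show whether AWE memory is actually being used.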
I am trying to import a tab-delimited data file using BULK INSERT. I have used BCP to create a format file, since the destination table has around 20 columns but the data file has only three.

Here's the problem: the columns I am trying to import comprise ID (an int identity column), Name (a varchar(255) column) and Status (a smallint column). The data file contains identity values for the first column, so I am using the KEEPIDENTITY modifier. The Status column is mandatory, so I have set all rows in the data file to zero for that column. All of the other columns in the destination table either allow NULL or have default values. When I BULK INSERT the file using the format file the identity column is NOT imported and the Status column gets value 3376. The Name column is the only one that gets imported correctly. Here's the format file:

8.0
3
1 SQLINT      0 4 "\t"   1 ID     ""
2 SQLCHAR     0 0 "\t"   2 Name   SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 "\r\n" 4 Status ""

Sorry it's a bit messy. Where is 3376 coming from, and why are my identity values for column ID not being imported?
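Before digging further into the format file, it may be worth sanity-checking that every row of the data file really has three tab-separated fields with integer ID and Status values. A throwaway check along these lines (the function name and sample rows are illustrative, not from the post) could be:

```python
import csv

def check_rows(lines):
    """Validate tab-delimited rows of (ID, Name, Status).
    Returns the 1-based line numbers of any malformed rows."""
    bad = []
    for lineno, row in enumerate(csv.reader(lines, delimiter="\t"), start=1):
        ok = (
            len(row) == 3
            and row[0].strip().lstrip("-").isdigit()   # ID: int identity value
            and row[2].strip().lstrip("-").isdigit()   # Status: smallint
        )
        if not ok:
            bad.append(lineno)
    return bad

if __name__ == "__main__":
    sample = ["1\tAlice\t0", "2\tBob\t0", "a line without tabs"]
    print(check_rows(sample))  # the third line should be flagged
```

If every row checks out, the problem is more likely the field terminators declared in the format file than the data itself.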
Hi all,
We have configured a DR server for our Production Server for Database Mirroring.
But before bringing the DR server live, we will set up mirroring between our DR and TEST servers for stress testing on the new DR server.
So, what is the best way to stress test the DR server before bringing it live?
Thanks.
I am looking to purchase a tool with which I can perform stress tests on SQL 7.0.
Any recommendation will be appreciated.
I am testing different RAID setups in a data warehousing environment, e.g.:
3 x RAID 5 (with four filegroups)
3 x RAID 10 (with four filegroups)
At the moment I have the chance to trial different hardware configurations, and I want to measure the performance differences.
I need to stress test these configs. Are there any tools I can use to do this?
Any loads I can use? What should I measure?
Are there any articles for this environment?
Any help would be great
Pete
I have been asked to perform a performance stress test on a SQL server with new hardware that we are going to be receiving.
How have some of you performed your stress analysis against new or existing hardware?
This hardware that I am going to receive will have to be configured within a high availability environment. I want to take this opportunity to really put a beat down on this server.
Thank you all for your suggestions.
I have a Windows 2003 Server running SQL 2005. The server has 32 GB of memory and I have enabled AWE in SQL. I have also configured the min and max SQL memory as 1 GB and 28 GB, respectively. However, this server currently has very low activity, so I'm not sure whether my AWE-related changes worked. The SQLSERVR.EXE process takes up about 100 MB of memory. Is there any tool or script that I can use to put memory stress on SQL Server to confirm that AWE is really in effect?
Hi,
We have built two testing apps for sending and receiving files across the network reliably, using SQL Express as the database backend. The apps seem to work fine under light load. However, during stress tests we always get the following exception:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
During stress tests, both the sender and receiver run on the same machine. The sender creates file fragments, stores them in the sender database and then sends them out to the network. File fragments are deleted from the sender database when the sender receives acknowledgement from the receiver. On the receiver side, file fragments are stored in the receiver database as they come in from the network. The corresponding file fragments are deleted from the receiver database when a complete file is received.
There is a maximum of about 1500 updates and 1500 deletes per second on the sender database. On the receiver side, the maximum is about 300 updates and 300 deletes per second. Our goal is to send 30 GB of data (it should run for about 10 hrs). As said before, we have never had a successful completed test run; a "timeout" exception is always thrown from the sender app (when it tries to end a transaction). It can happen as early as 1.5 hrs after we start the test. Note that although we are sending 30 GB of data, at any point in time the database shouldn't be too big (it should be well within the 4 GB limit) because we delete file fragments relatively soon.
Next we changed the "Query Wait" setting in the Management Studio Advanced settings from the default "-1" to a very big number, and then we had a successful run sending 30 GB of data.
- First of all, are we not handling SQL Express properly? Is SQL Express able to handle long-running, heavy-load transactions for hours?
- We also noticed that even before we got the timeout exception, the memory usage of sqlservr.exe kept growing. Maybe it doesn't get a chance to clean up internally. If the app hammers SQL Express for hours, I wonder how it handles fragmentation? I assume it needs some sort of defragmentation, otherwise performance will degrade significantly...
- It seems like the Query Wait setting plays an important role here; any guideline on how to pick a reasonable value? Or should we pick a relatively small number and then retry in our app when we get timeout exceptions?
- Is it possible that we are running into some SQL Express resource limits? Any idea how we can tell, other than the VM size of sqlservr.exe?
Any help or suggestions would be greatly appreciated!
Thanks very much
W Wong
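On the retry idea raised above: a small retry-with-backoff wrapper is one common pattern for tolerating transient timeouts. The sketch below is generic Python, not ADO.NET; treating the timeout as a built-in TimeoutError is an assumption standing in for catching SqlException in the real app:

```python
import time

def with_retry(operation, attempts=3, base_delay=0.01):
    """Call operation(); on a timeout, back off exponentially and retry,
    re-raising only after the final attempt fails."""
    for attempt in range(attempts):
        try:
            return operation()
        except TimeoutError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 10 ms, 20 ms, 40 ms...

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise TimeoutError("timeout expired")
        return "committed"
    print(with_retry(flaky))  # succeeds on the third attempt
```

Keeping individual transactions short and retrying them is usually gentler on the engine than raising Query Wait so that everything blocks for longer.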
I am currently working on a project which needs to load over 1,000 XML files. The files are stored across 10 subfolders. I am using a foreach loop with a file enumerator, which is configured at the top of the folder structure and traverses the subfolders. This loops through the files, loads the data and then moves the file to another folder. The package executes fine for a few hundred files but then hangs; this happens after a different number of processed files each time the package is run. While trying to resolve the problem we ran performance counters and noticed that the number of open handles increased significantly just about the time DTExec looked like it had hung, and DTExec also then started taking a lot of the CPU processing time. Has anyone else come across similar situations?
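This isn't SSIS/DTExec itself, but the open-handle concern can be illustrated in plain Python: process each file inside a `with` block so its handle is guaranteed closed before the file is moved, and handles never accumulate. A rough sketch (folder names and the read step are placeholders):

```python
import os
import shutil
import tempfile

def process_folder(src, done):
    """Load each file in src, then move it to done. The with-block
    guarantees the handle is closed before the move, so open handles
    never accumulate across a long run."""
    os.makedirs(done, exist_ok=True)
    moved = 0
    for name in sorted(os.listdir(src)):
        path = os.path.join(src, name)
        with open(path, "rb") as f:   # handle closed on block exit
            _data = f.read()          # stand-in for the real XML load
        shutil.move(path, os.path.join(done, name))
        moved += 1
    return moved

if __name__ == "__main__":
    src, done = tempfile.mkdtemp(), tempfile.mkdtemp()
    for i in range(5):
        with open(os.path.join(src, f"file{i}.xml"), "w") as f:
            f.write("<root/>")
    print(process_folder(src, done))  # 5
```

In the SSIS case, the analogous question is whether each iteration of the foreach loop releases its file handle before the next one opens.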
Is it possible to run both SQL Server Management Studio Express and the full-blown version side by side?
I am developing with the full blown product but would like to test Management Studio Express on the same box.
Is this possible?
Thanks
Eric
I have recently installed SQL Server Management Studio Express but I do not find the Management Tools needed to create scheduled backups and shrink the databases. I was under the impression that this should be included in Management Studio. I use SQL Server 2005 Express for smaller customers who run SQL on a desktop unit. I need a way to back up the data to a server machine for backup purposes. I have uninstalled and reinstalled to no avail.
Don't know why, but I have problems installing MS Management Studio. After a hard drive failure I was forced to reinstall everything. :( The problem is that after installing VS2005 Professional Edition I installed the SQL Server 2005 trial edition, and everything was fine during the installation. When I looked for Management Studio it wasn't there. :( I uninstalled and reinstalled, trying the full installation, but the result was the same: no Management Studio! After uninstalling the SQL Server trial edition I installed Management Studio Express and it works, except that for some reason I cannot browse my XP user folder, which has no password protection. :( Any idea for installing the SQL trial edition including Management Studio? Why can't I browse my XP folder with Management Studio Express? Where can I download Management Studio (the non-Express version)? Regards
I have to read 25 usernames from a single user's row,
then display those 25 profiles to that user.
I assume this is possible with a subquery, right?
Now do I have to make 25 columns, one for each username, or
could it read user1, user2, user3...? And how, please. Thanks much
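One way to look at it: rather than 25 columns (user1 through user25), the usernames could live in a child table with one row per username, which is what a subquery or join would then read. A tiny sketch of converting the wide shape to the normalized one (the column names are made up for illustration):

```python
def explode_wide_row(row):
    """Turn a wide row like {'owner': ..., 'user1': ..., ..., 'user25': ...}
    into normalized (owner, username) pairs, skipping empty slots."""
    owner = row["owner"]
    pairs = []
    for i in range(1, 26):
        name = row.get(f"user{i}")
        if name:
            pairs.append((owner, name))
    return pairs

if __name__ == "__main__":
    wide = {"owner": "jo", "user1": "bob", "user2": "carol"}
    print(explode_wide_row(wide))  # [('jo', 'bob'), ('jo', 'carol')]
```

With the normalized table, fetching the 25 profiles is a single join on the username column instead of 25 separate column references.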
I have looked everywhere I can think of for the answer to this and am completely confused. My IT dept created a SQL database for me on a server. My plan is to create web pages in ASP.NET using Visual Web Developer Express, released this week. I have connected to the database but I can't figure out how I am supposed to create my tables in the database. What simple/obvious thing am I missing here?

I keep trying to download SQL Server Express in hopes that I can use it to manage my database, but it keeps telling me that it won't install because I have previous Beta versions left on my machine. I can't find those either, so am out of luck. I am new to this, so perhaps I am asking all the wrong questions. Does anyone know what I can do to find a useful database management tool that will allow me to add/create the tables? Thanks for any help!
Apologies if this has been covered elsewhere, but can anyone give me an insight into how you manage servers effectively in Enterprise Manager on the MMC?
It seems that there is no refresh facility on the SPID view functions, and it appears impossible to monitor what is actually going on in real time. SPIDs can appear and disappear, but without refreshing the whole server and drilling down to the right area each time, there appears to be no update on the activity on that server.
I have been able to zoom in on non-existent SPIDs, and SPIDs I have set up running massive queries persistently show 'sleeping' unless the whole server is refreshed.
I was less than impressed with the 6.5 version for getting important information on what users were up to quickly, but this appears even worse. I wrote a beefed up version of sp_who2 for 6.5, and on Friday started looking at a version for SQL7 - I was that disconsolate!
If there is someone out there who can effectively manage SPID activity on a server using Enterprise Manager, please tell me how you do it...
Hello!
My login is dbo for a database located on a remote server (SQL Server 7.0). Can I run the stored procedures of that database by accessing it remotely? Also, I have to manage that database from here; what limitations will I have if I am not an SA for that server? Any help is appreciated. I am operating from NT Workstation 4.0.
Can anyone shed more light on the following in SQL 2K
1. Peg lookup tables in memory so that they don't get swapped out
2. Mark tables as readonly (force all queries against it not to issue shared locks)
3. Peg frequently used procedures in memory
A kick in the right direction will be sincerely appreciated
Thanks
Liju
2 more questions
1. How can I move logins from one machine to another?
2. How can I move DTS packages from one machine to another?
Thanks
Liju
Hi
I am moving towards managing SQL Server databases, and I basically come from a command-line background.
Can you please tell me some of the tasks that need to be done using the command line only in SQL Server, i.e. where Management Studio falls short?
regards
db2hrishy
Hi,
I'm new to MS SQL and would like to get your opinion on how it's possible to automatically manage transaction log growth. I've read the following on the topic:
"When Microsoft SQL Server finishes backing up the transaction log, it truncates the inactive portion of the transaction log. This frees up space on the transaction log. SQL Server can reuse this truncated space instead of causing the transaction log to continuously grow and consume more space. The active portion of the transaction log contains transactions that are still running and have not completed yet."
I'm backing up the transaction logs on a daily basis. I believe that all or most portions of the log are inactive by the time the backup starts. Still, one of the logs grew to 130 GB even though I have shrunk it in the past. How can I manage the growth automatically without shrinking the log files manually every time they grow beyond a certain threshold? Also, how can I check whether some of the log portions are active or inactive?
Thanks in advance,
Alla
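As a sketch of the "threshold" idea above (sizes and log names are made up, and in practice more frequent log backups, not shrinking, are the usual long-term fix), a small policy function that decides which logs warrant attention could look like:

```python
def logs_over_threshold(log_sizes_mb, threshold_mb=50_000):
    """Given {log_file_name: size_in_MB}, return the names of log files
    whose size exceeds the threshold and so warrant a log backup
    or further investigation, sorted for stable output."""
    return sorted(name for name, size in log_sizes_mb.items()
                  if size > threshold_mb)

if __name__ == "__main__":
    sizes = {"AppDB_log": 130_000, "SmallDB_log": 800}
    print(logs_over_threshold(sizes))  # ['AppDB_log']
```

The real sizes would come from querying the server (e.g. the sysfiles metadata), with the function's output feeding an alert or a scheduled log backup rather than an automatic shrink.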
VS.NET Professional doesn't let you manage/edit database objects in SQL Server 2000 the way the Enterprise edition does, but it does for MSDE 2000.
Is there a tweak that will let me do this?
I am running Management Studio from my desktop, but I think I need SP2. Is there an SP2 download for Management Studio that isn't Management Studio Express?
Hi,
I need support for the following problem.
It is teamwork. Suppose 25 database developers are developing a database; every one makes modifications (update, delete, insert) to the centralized database. I want every one to have their own copy of the database and make modifications to that instead of the centralized database, the way source control software does (e.g. Visual SourceSafe). Or is there any other way to manage the database in such a scenario?
Hope any one will help.
Thanx in advance.
I like some of the new features in SQL 2005 Management Studio, but WHY do you have to right-click -> Modify to view stored procedure code?
It is SO annoying; in previous versions it was a simple, quick double-click.
To me this is a major blunder!
Anyone else agree with me?
Hello,
I am hoping some of you can direct me to areas I should look into.
First off let me say that everything was fine and worked great up until 3 weeks ago.
Then all of a sudden we began to experience timeouts on a particular query.
The vb.net 2002 code has not changed and is pretty basic stuff to connect to the server and execute the query. The only change I have made to it since then is to add a commandtimeout parameter so it would actually run.
There does not seem to be a specific pattern either to when it runs slow and when it does not.
I can be the only one on the server in the morning (the only one in the office, for that matter) and one day the application will execute the query and return data in under 9 seconds, and the next day it will take a minute and a half.
I can test it during peak network times and it's the same thing: one day fine, the next day performance is slow.
Also, if I run the query straight from the management studio it will run in 2-9 seconds and from .net it takes 20-90 seconds.
Are there server admin jobs I could run to see if tables are deadlocked, if the network is too busy... anything?
Let's say I may possibly use two transactions in a script; the second one will depend on the successful execution of the first one. The following code works. However, I'm wondering if SQL Server 2000 has some internal function like @@transaction_status to indicate the status of the most recent transaction on the connection, the analogue of @@FETCH_STATUS. Then my own error-tracking code could be omitted. Thanks.

-- ENV: SQL Server 2000
-- DDL
create table tblA (col1 smallint, col2 smallint)
create table tblB (col1 smallint, col2 smallint)
create table tblX (col1 char(1), col2 varchar(20))

declare @errorCode tinyint
set @errorCode = 0

begin transaction fTran

insert into tblA values (7,1);
insert into tblB values (8,0);
-- we know this guy will fail
insert into tblX values ('ab','abcdefge')
if (@@error <> 0)
begin
    select @errorCode = 1
end

if (@errorCode = 1)
    rollback transaction fTran
else
    commit transaction fTran

if (@errorCode = 0)
    print 'start second transaction here ...'  -- pseudocode placeholder
else
    print 'fTran failed.';
RETURN