I have been asked to perform a performance stress test on a SQL server with new hardware that we are going to be receiving.
How have some of you performed your stress analysis against new or existing hardware?
This hardware that I am going to receive will have to be configured within a high availability environment. I want to take this opportunity to really put a beat-down on this server.
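For a quick home-grown beat-down, one option is a T-SQL burn loop run from several connections at once, which exercises CPU, tempdb, and disk together; a minimal sketch (the iteration count is arbitrary):

DECLARE @i int
SET @i = 0
WHILE @i < 1000
BEGIN
    -- cross join inflates the row count; SELECT INTO forces tempdb writes
    SELECT CHECKSUM(a.name, b.name) AS noise
    INTO #burn
    FROM sysobjects a CROSS JOIN sysobjects b
    DROP TABLE #burn
    SET @i = @i + 1
END

For disk-focused testing, Microsoft's SQLIOStress (later SQLIOSim) utility generates SQL-Server-like I/O patterns without needing a database at all.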
I have a Windows 2003 server running SQL Server 2005. The server has 32 GB of memory and I have enabled AWE in SQL Server. I have also configured the min and max SQL Server memory as 1 GB and 28 GB, respectively. However, this server currently has very low activity, so I'm not sure whether my AWE-related changes worked. The SQLSERVR.EXE process takes up about 100 MB of memory. Are there any tools or scripts I can use to memory-stress SQL Server to confirm that AWE is really in effect?
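One thing worth knowing first: memory allocated through AWE does not show up in the working set of sqlservr.exe, so Task Manager will under-report it; the SQL Server:Memory Manager - Total Server Memory (KB) counter in Performance Monitor is a better gauge. To force the buffer pool to grow, scan something large and watch that counter climb; a minimal sketch, where dbo.BigTable is a hypothetical large table:

DBCC MEMORYSTATUS   -- snapshot memory usage before the scan
SELECT COUNT(*) FROM dbo.BigTable WITH (INDEX(0))   -- INDEX(0) forces a full scan, pulling pages into cache
DBCC MEMORYSTATUS   -- compare buffer counts after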
I have a new server with 32 GB of RAM installed, and I have user databases on it. I am using SQL Server 2000 Enterprise Edition, and the platform is Windows 2003 Advanced Server, which supports up to 128 GB of memory.
sp_configure 'awe enabled' is set to 1, and AWE is enabled at the OS level as well.
max server memory (MB) is 2147483647.
I was doing some stress tests on this server, but memory usage doesn't go beyond 180 MB. Can someone suggest a test for physical RAM?
How can I make sure that the application will make full use of the available physical memory?
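Two checks that may help before stressing anything: on SQL Server 2000, AWE memory is grabbed at startup (it is not dynamic), the 'awe enabled' run_value only changes after a service restart, and the error log should contain a line like "Address Windowing Extensions enabled" when it took effect; also, Task Manager under-reports AWE memory, so watch the SQL Server:Memory Manager counters instead. A quick verification sketch:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'awe enabled'            -- run_value (not just config_value) must be 1
EXEC sp_configure 'max server memory (MB)' -- with AWE this is usually set explicitly, e.g. 28000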
We have built two testing apps for sending and receiving files across the network reliably, using SQL Express as the database backend. The apps seem to work fine under light load. However, during stress tests we always get the following exception:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
During the stress test, both the sender and receiver run on the same machine. The sender creates file fragments, stores them in the sender database, and then sends them out over the network. File fragments are deleted from the sender database when the sender receives an acknowledgement from the receiver. On the receiver side, file fragments are stored in the receiver database as they come in from the network, and the corresponding fragments are deleted when a complete file has been received.
There is a maximum of about 1500 updates and 1500 deletes per second on the sender database. On the receiver side, the maximum is about 300 updates and 300 deletes per second. Our goal is to send 30 GB of data (the run should take about 10 hours). As mentioned, we have never had a completed test run; a "timeout" exception is always thrown from the sender app (when it tries to end a transaction). It can happen as early as 1.5 hours into the test. Note that although we are sending 30 GB of data, at any point in time the database shouldn't be too big (it should stay well within the 4 GB limit) because we delete file fragments relatively soon.
Next we changed the "Query Wait" setting in the Management Studio Advanced settings from the default of -1 to a very big number, and we then had a successful run of sending 30 GB of data.
- First of all, are we doing something improper in terms of dealing with SQL Express? Is SQL Express able to handle long-running, heavy-load transactions for hours?
- We also noticed that even before we got the timeout exception, the memory usage of sqlservr.exe kept growing. Maybe it doesn't get a chance to clean up internally. If the app hammers SQL Express for hours, I wonder how it handles fragmentation? I assume it needs some sort of defragmentation, otherwise performance will degrade significantly...
- It seems like the Query Wait setting plays an important role here; is there any guideline on how to pick a reasonable value (see the sketch after this list)? Or should we pick a relatively small number and retry in our app when we get timeout exceptions?
- Is it possible that we are running into some SQL Express resource limits? Any idea how we can tell, other than watching the VM size of sqlservr.exe?
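On the Query Wait question: the SSMS setting maps to the server-wide 'query wait' option in sp_configure, measured in seconds; the default of -1 means a query waits 25 times its estimated cost for the memory it needs before timing out. To experiment with it from T-SQL rather than the dialog, a sketch (60 is an arbitrary example value):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query wait', 60   -- seconds to wait for query memory; -1 = 25 x estimated query cost
RECONFIGURE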
Any help or suggestions would be greatly appreciated!
What do you recommend for stress-testing the performance of key stored procedures (they have been identified) for our application? The parameters can be programmatically selected, for example:

Select 'exec my_proc @id = ' + Cast(id As varchar) From myTable Where foo = 'bar'

I have the Support Tools Available For Stress Testing & Performance Analysis (http://www.microsoft.com/downloads/details.aspx?FamilyId=5691AB53-893A-4AAF-B4A6-9A8BB9669A8B&displaylang=en) from Microsoft's site and they are pretty good.
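One pattern that works with the OSTRESS tool in that download: use the generating query to build a script file of EXEC statements, then replay it on many connections at once. A sketch, with hypothetical server/database names and switch values:

-- save the generated EXEC statements to stress.sql (e.g. run with results-to-file in SSMS, or osql -o)
SELECT 'exec my_proc @id = ' + CAST(id AS varchar(10)) FROM myTable WHERE foo = 'bar'

-- then from a command prompt, something like:
-- ostress -Smyserver -E -dMyDb -istress.sql -n10 -r5   (10 connections, 5 iterations each)

Capture a Profiler trace or Performance Monitor counters during the run so you can compare durations, reads, and writes across runs.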
Recommendations appreciated, and thanks in advance!
We are setting up a test lab environment with 100 machines. We want one master testing db that gets replicated to each machine, to run scripted application tests nightly.
My goal is to minimize the amount of work to move this thing to each of the 100 test machines. I am wondering whether we even need SQL installed locally on each machine, or should instead invest in a monster db server holding 100 restored copies of the db, with each test machine pointing at its own copy; or whether I should use db mirroring or something similar to get the master test db onto each of those machines.
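If the central-server route wins, stamping out the 100 copies is easy to automate from one backup; a minimal sketch with hypothetical logical file names and paths:

DECLARE @i int, @n varchar(3), @sql nvarchar(2000)
SET @i = 1
WHILE @i <= 100
BEGIN
    SET @n = CAST(@i AS varchar(3))
    SET @sql = N'RESTORE DATABASE MasterTest' + @n +
               N' FROM DISK = ''E:\Backups\MasterTest.bak''' +
               N' WITH MOVE ''MasterTest_dat'' TO ''E:\Data\MasterTest' + @n + N'.mdf'',' +
               N' MOVE ''MasterTest_log'' TO ''E:\Data\MasterTest' + @n + N'.ldf'', REPLACE'
    EXEC (@sql)
    SET @i = @i + 1
END

Each nightly refresh is then just re-running the loop.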
Now that we have a good programming model in SSIS, the question is whether to write automated unit tests for your packages. Would it generally be a good idea for packages?
Also, if the answer is yes, where can I find more information on how to accomplish that?
Hi everyone. I need to test an SSIS package which will import data from a different database, where the record count is around 5 million. I am planning to test it through C# code as well as manually. The SSIS source consists of 7 tables; the SSIS destination consists of 7 tables. Using C# code I am trying to run the SSIS package through a batch file. I am putting the expected row count and column count in an Excel file and comparing them with the destination tables by running queries through ADO.NET. Am I going the right way? Can anyone suggest the best and most productive way to test an SSIS package? What other things do I need to test? Please add test cases to the list below if you can.
1. Verify all the tables have been imported.
2. Verify all the rows in each table have been imported.
3. Verify all the columns specified in the source query for each table have been imported.
4. Verify all the data has been received without any truncation for each column.
5. Verify the schema at source and destination.
6. Verify the time taken/speed of the data transfer.
7. Check for fields truncated due to a difference in field length at the destination.
Regards, Arif Shareef
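Test cases 1-4 lend themselves to a single comparison query per table; a minimal sketch, assuming a linked server named SRC pointing at the source database and a hypothetical table name:

SELECT 'dbo.Customer' AS table_name,
       (SELECT COUNT(*) FROM SRC.SourceDb.dbo.Customer) AS source_rows,
       (SELECT COUNT(*) FROM dbo.Customer) AS dest_rows

Any row where source_rows <> dest_rows is a failure, which is easy to assert from the C# harness; comparing CHECKSUM_AGG(BINARY_CHECKSUM(*)) on both sides extends the same idea to content-level (truncation) checks.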
I am trying to import a data file, which is tab delimited, using BULK INSERT. I have used BCP to create a format file, since the destination table has around 20 columns but the data file has only three. Here's the problem: the columns I am trying to import comprise ID (an int identity column), Name (a varchar(255) column) and Status (a smallint column). The data file contains identity values for the first column, so I am using the KEEPIDENTITY modifier. The Status column is mandatory, so I have set all rows in the data file to zero for that column. All of the other columns in the destination table either allow NULL or have default values. When I BULK INSERT the file using the format file, the identity columns are NOT imported and the Status column gets the value 3376. The Name column is the only one that gets imported correctly. Here's the format file (sorry, it's a bit messy):

8.0
3
1 SQLINT      0 4 " " 1 ID     ""
2 SQLCHAR     0 0 " " 2 Name   SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 ""  4 Status ""

Where is 3376 coming from, and why are my identity values for column ID not being imported?
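A likely culprit, for what it's worth: SQLINT and SQLSMALLINT in a format file describe native (binary) fields, but a tab-delimited file contains character data, so the engine reads raw bytes instead of parsing digits. 3376 is 0x0D30, which is exactly the characters '0' (0x30) and a carriage return (0x0D) interpreted as a little-endian smallint. A sketch of a format file that treats every field as character data, assuming the three fields map to server columns 1, 2 and 4 and the file uses tab/CRLF terminators (table name and field lengths hypothetical):

8.0
3
1 SQLCHAR 0 12  "\t"   1 ID     ""
2 SQLCHAR 0 255 "\t"   2 Name   SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 7   "\r\n" 4 Status ""

BULK INSERT dbo.MyTable FROM 'C:\data.txt'
WITH (FORMATFILE = 'C:\data.fmt', KEEPIDENTITY)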
I am testing different RAID setups in a data warehousing environment, e.g.:
- 3 x RAID 5 (with four filegroups)
- 3 x RAID 10 (with four filegroups)
At the moment I have the chance to trial different hardware configurations, and I want to measure the performance differences.
I need to stress test these configs. Are there any tools I can use to do this? Any loads I can use? What should I measure? Are there any articles for this environment?
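A crude but repeatable measurement you can run today: clear the buffer cache, time a full scan, and watch the PhysicalDisk counters (Avg. Disk Queue Length, Disk Reads/sec, Avg. Disk sec/Read) in Performance Monitor while it runs. A minimal sketch, where dbo.FactSales is a hypothetical large table:

DBCC DROPCLEANBUFFERS        -- cold cache so the reads actually hit the disks
DECLARE @start datetime
SET @start = GETDATE()
SELECT COUNT(*) FROM dbo.FactSales WITH (INDEX(0))   -- INDEX(0) forces a full scan
SELECT DATEDIFF(ms, @start, GETDATE()) AS elapsed_ms

Run the same script against each RAID layout, several times each, and compare; for a pure-disk benchmark that leaves SQL Server out of it, Microsoft's SQLIO utility is the usual choice.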
I am currently working on a project which needs to load over 1000 XML files. The files are stored across 10 subfolders. I am using a foreach loop with a file enumerator, which is configured at the top of the folder structure and traverses the subfolders. This loops through the files, loads the data, and then moves each file to another folder. The package executes fine for a few hundred files but then hangs; this happens after a different number of processed files each time the package is run. While trying to resolve the problem we ran performance counters and noticed that the number of open handles increased significantly just about the time DTExec appeared to have hung, and DTExec also then started taking a lot of the CPU time. Has anyone else come across similar situations?
I need to restore a test DB from a production backup, but once it is restored I need all the permissions of the SQL logins and Windows AD accounts in the test DB intact, as they were before.
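Database-level permissions travel inside the backup, so the restored test DB will carry production's users; the usual approach is to script out the test-specific permissions beforehand and reapply them after the restore, then re-link any database users that became orphaned because the test server's logins have different SIDs. A sketch using the built-in procedure:

-- list database users whose server login link was broken by the restore
EXEC sp_change_users_login 'Report'

-- re-link one user to the test server's login of the same name (hypothetical name)
EXEC sp_change_users_login 'Update_One', 'appuser', 'appuser'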
How can I test a stored proc that is in my SQL Server Express 2005 database? I don't see anything in VS 2005, nor a Query Analyzer in SSMS Express. Thanks
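In SSMS Express a plain query window fills the Query Analyzer role: you can exercise a proc with EXEC and inspect results, return codes, and output parameters directly. A minimal sketch with a hypothetical procedure:

DECLARE @rc int, @total money
EXEC @rc = dbo.GetOrderTotal @OrderId = 42, @Total = @total OUTPUT
SELECT @rc AS return_code, @total AS total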
Hi! I'm currently having a problem connecting to my SQL Server 6.5 test server. (Not the hottest problem, since it is only the test server, but it sure would be nice to use it.) I was able to connect to it from my desktop (a Win95 machine) up until this morning. I get the following error when I try connecting: "A connection could not be established to PIGLET - DB-Library. Unable to connect: SQL Server is unavailable or does not exist. General network error. Check your documentation."
Does anyone have any idea what this should mean to me? I do have the server running and I am able to perform actions there, but not from my desktop. Any suggestions?
I have a recommended configuration for the production server, but are there any guidelines for determining the test server configuration relative to production? I know the test server need not be as powerful as production, but I am not sure how to decide on its configuration. Any help would be appreciated.
Howdy; I've tried this in the 'tools' area, but that didn't work too well. I suspect I will have to generate T-SQL code and then schedule it as a job. Why I can't just drag and drop with basic desires is beyond me, but THAT probably does exist.
Anyway, here is the problem (this server has many databases, on SQL 2000 SP2):
1. The user only wants me to use Monday morning's full backup, which is good in that it doesn't include transaction logs.
2. Restore that data over top of / into the Development db, which is good: no data to worry about damaging.
3. The user does NOT want me to do this by hand, but wants it scheduled.
OK:
a. I must do a RESTORE FILELISTONLY from... what? master? And if I use the *.bak of production, it has a coded date field in the name, SO I would, I guess, have to generate all sorts of wonderful code to find the date and build a file name. Why? Because FROM DISK = 'F:\MSSQL\BACKUP\DBPRODUCTION_yyyyddmm.BAK' is not going to work with a wildcard. Can I do a file lookup using a 'PRODUCTION' prefix into a variable and then use that, or should I look for the latest file date (remember, there are several database backups here), or...?
Then: how does one schedule such T-SQL? Do I save it to some text file and invoke it using a job scheduler?
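No text file needed: the whole thing can live in a T-SQL job step of a SQL Server Agent job scheduled for Monday mornings. A sketch of the lookup-and-restore part, assuming xp_cmdshell is available and that the date portion of the file name sorts lexically (paths and names hypothetical):

CREATE TABLE #files (fname varchar(255))
INSERT #files EXEC master..xp_cmdshell 'dir /b F:\MSSQL\BACKUP\DBPRODUCTION_*.BAK'

DECLARE @f varchar(255), @sql varchar(1000)
SELECT @f = MAX(fname) FROM #files WHERE fname LIKE 'DBPRODUCTION_%.BAK'   -- newest by name

SET @sql = 'RESTORE DATABASE Development FROM DISK = ''F:\MSSQL\BACKUP\' + @f +
           ''' WITH REPLACE'   -- add MOVE clauses if the dev file paths differ
EXEC (@sql)
DROP TABLE #files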
We have a production SQL 7 server, QA, and Development. From time to time I want to move just the data from the production server to the other two servers, without modifying objects that may have been changed, such as stored procedures and rights. Is there a way, using the SQL tools provided, that we can move just the data? What also happens is that the rights on the objects change, which means my developers no longer have access to the tables for selects in QA, since the changes were overwritten by production, where they do not have those rights.
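One way to refresh data while leaving objects and permissions untouched is a per-table DELETE plus INSERT...SELECT over a linked server (DTS can do the same if you configure it to copy data only, not objects). A minimal sketch for one table, with hypothetical server/table names:

-- on the QA server, with a linked server named PROD
DELETE FROM dbo.Customer

SET IDENTITY_INSERT dbo.Customer ON
INSERT INTO dbo.Customer (id, name)
SELECT id, name FROM PROD.ProdDb.dbo.Customer
SET IDENTITY_INSERT dbo.Customer OFF

Order matters when foreign keys are involved: delete children before parents, load parents before children.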
Hi, I am trying to connect an application to a remote SQL Server database; however, it does not appear to be configured for remote connections. Does anyone know a connection string for a database I can connect to, some sort of learner's database sort of idea? Thank you.
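For reference, a typical SqlClient connection string looks like this (every value below is a placeholder, not a real server):

Server=someserver.example.com,1433;Database=TestDb;User ID=someuser;Password=somepass;

or, for Windows authentication, Integrated Security=SSPI in place of the user/password. If the target is SQL Server 2005, note that remote connections and the TCP/IP protocol are off by default and have to be enabled on the server (via the Surface Area Configuration tool) before any connection string will work.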
I am trying to copy a production db (26.5 GB, with a 3 GB log) from production to a test server. The prod db name is EDD_Cat; the data (.mdf) resides on one logical drive and the log (.ldf) on another. The test server does not have the same physical RAID allocation; the only way I can get that much space is to spread the data across 3 logical drives. I have preallocated a database called EDD_CatT with the same total physical db size. I have not been successful in restoring from a SQL backup device (copied from production) to the new test db. Here are my T-SQL statements and the error:
Restore Database EDD_Catt from Iloc01bkp with File = 2,
    Move 'EDD_Cat_dat' to 'D:\Mssql7\Data\EDD_Cat.mdf', Replace,
    Move 'EDD_Cat_dat' to 'E:\Mssql7\Data\EDD_Cat2.ndf', Replace,
    Move 'EDD_Cat_dat' to 'F:\Mssql7\Data\EDD_Cat3.ndf', Replace,
    Move 'EDD_Cat_log' to 'G:\Mssql7\Data\EDD_Log1.ldf', Replace
start db restore --------------------------- 2001-01-02 12:23:31.610
(1 row(s) affected)
Server: Msg 3257, Level 16, State 1, Line 0
There is insufficient free space on disk volume 'E:' to create the database. The database requires 20447232000 additional free bytes, while only 1732972544 bytes are available.
Server: Msg 3013, Level 16, State 1, Line 0
Backup or restore operation terminating abnormally.
I also tried using EM but basically got the same type of error.
I could do this with SQL 6.5 as long as the db size was the same or larger.
Any advice/suggestions will be greatly appreciated. BOL and the manuals I have only seem to give examples with one file for the data and another for the log; I could not find an example of what I am trying to do.
Thanks much for your time Calvin Matsumoto - State of California
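For context on why this fails: RESTORE can only relocate the files that already exist in the backup, one MOVE per logical file name, and it cannot split a single logical data file across several drives, which is why the whole ~20 GB lands on one volume (here E:, the target of one of the duplicate MOVE clauses). You can confirm which logical files the backup actually contains with:

RESTORE FILELISTONLY FROM Iloc01bkp WITH FILE = 2

If no single volume can hold the data file, the usual workaround is to create the multi-file target database and transfer the data into it (DTS, or bcp out/in) rather than restore over it.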
Can anybody give me an idea, or a script, which can be used to restore a production database to a test database on another server? As I need to do this 3 days a week, I would like to make it automated.
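A sketch of the core script, with hypothetical names and paths; put it in a T-SQL job step of a SQL Server Agent job scheduled for the three days you need:

RESTORE DATABASE TestDb
FROM DISK = '\\prodserver\backups\ProdDb.bak'     -- or a locally copied .bak
WITH MOVE 'ProdDb_dat' TO 'D:\MSSQL\Data\TestDb.mdf',
     MOVE 'ProdDb_log' TO 'E:\MSSQL\Data\TestDb.ldf',
     REPLACE

The logical file names ('ProdDb_dat', 'ProdDb_log' here) come from running RESTORE FILELISTONLY against the backup. Afterwards, fix any orphaned users with sp_change_users_login.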
I was wondering if anyone has a neat (preferably automated) method of creating small testing databases from large production instances. My requirement is to copy the schema and a subset of configuration data from a production database into a test database. The subset of data would be a full copy of a subset of tables, rather than a subset of data within one or more tables. There is a mixture of SQL 2000 and SQL 2005 servers involved in this requirement. I'm familiar with the scripting mechanisms of Enterprise Manager and Management Studio and with DTS packages, sufficient to perform a process like this manually, but I want to productionise and schedule this process so it is performed automatically. I'm sure this must be a commonly performed task, so I'm interested to know if anyone has a "best practice" for this requirement.
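For the data part, one schedulable approach is to drive bcp from T-SQL over a fixed list of tables (with schema deployment scripted separately); a minimal sketch, assuming xp_cmdshell is available and that the table list, servers, and paths are all hypothetical:

DECLARE @t varchar(128), @cmd varchar(500)
DECLARE tabs CURSOR FOR
    SELECT 'ConfigA' UNION ALL SELECT 'ConfigB'   -- the subset of tables to copy
OPEN tabs
FETCH NEXT FROM tabs INTO @t
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp ProdDb.dbo.' + @t + ' out C:\xfer\' + @t + '.dat -n -T -Sprodserver'
    EXEC master..xp_cmdshell @cmd          -- export in native format
    SET @cmd = 'bcp TestDb.dbo.' + @t + ' in C:\xfer\' + @t + '.dat -n -T -E -Stestserver'
    EXEC master..xp_cmdshell @cmd          -- import, keeping identity values (-E)
    FETCH NEXT FROM tabs INTO @t
END
CLOSE tabs
DEALLOCATE tabs

Native format (-n) is upward compatible from SQL 2000 to 2005, and the whole thing schedules cleanly as an Agent job.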
I use SQL Server 2005 Dev Edition and am not new to making databases (I've had plenty of experience, and my dad does the same kind of work).
I am (unfortunately) a university student and for my dissertation I am going to produce a SQL Server database with a strong emphasis on data mining.
Obviously, for the data mining to be useful at all I need to produce loads and loads of test data.
Fair enough, and there are applications which do this, such as EMS Data Gen, but can anyone recommend any other data generation utilities? EMS Data Gen has poor handling of unique attributes, and as I am modelling a car manufacturer this will give me problems when I come to the registration number attribute.
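For the unique columns specifically, you may not need a tool at all: deriving the value from a counter guarantees uniqueness by construction. A minimal sketch, assuming a hypothetical dbo.Car table with a RegistrationNumber column (the format is made up, not a real plate scheme):

DECLARE @i int
SET @i = 0
WHILE @i < 100000
BEGIN
    INSERT INTO dbo.Car (RegistrationNumber)
    VALUES (
        CHAR(65 + (@i / 26000) % 26) +                     -- first letter
        CHAR(65 + (@i / 1000) % 26) +                      -- second letter
        RIGHT('000' + CAST(@i % 1000 AS varchar(3)), 3)    -- digits
    )
    SET @i = @i + 1
END

Every counter value maps to a distinct string for up to 676,000 rows, so there are no collisions to handle.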
Also, why are utilities for SQL Server (and Oracle at that) so expensive? This makes it out of my reach and makes it difficult to build a truly good database that will net me good marks, and demotivates me. :(
Lastly, please feel free to recommend any utilities for SQL Server - such as performance monitors, backup utilities. Anything. But if they are priced utilities, they have to be sensibly priced (<£100), because I cannot yet afford to pay >£1k for such utilities.
I currently have a test SQL Server database and I am trying to refresh it with the production SQL Server data, to get production's current data.
If I copy the production database.mdf and database.ldf files from the production server, replace the test database's database.mdf and database.ldf files with them, and then restart the database, would this give me production's current data? Please advise. If not, could you please advise on how I could get production's current data?
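Copying the .mdf/.ldf files can work, but only if both databases are detached (or the services stopped) first, and it takes production offline while you copy. A backup/restore refresh avoids that; a sketch with hypothetical names and paths:

-- on production
BACKUP DATABASE ProdDb TO DISK = 'C:\Backups\ProdDb.bak' WITH INIT

-- on test, after copying the .bak over (logical names come from RESTORE FILELISTONLY)
RESTORE DATABASE TestDb FROM DISK = 'C:\Backups\ProdDb.bak'
WITH MOVE 'ProdDb_dat' TO 'D:\Data\TestDb.mdf',
     MOVE 'ProdDb_log' TO 'D:\Data\TestDb.ldf',
     REPLACE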
Please advise me about restoring a master database from production to a test server.
The reason for doing this is that I need to test and evaluate some logins in the master database.
I tried to restore master to the test server, but I got errors about user databases that don't exist on the test server. I don't want to restore the user databases; I only need the master database to evaluate user logins.
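If the goal is only to evaluate logins, an alternative to restoring master is to recreate the logins on the test server. Microsoft's sp_help_revlogin script (KB 246133 for SQL 2000; KB 918992 for SQL 2005) generates the commands, preserving SIDs and password hashes. A stripped-down sketch of the idea for standard (non-Windows) logins on SQL 2000:

-- run on production: generates one sp_addlogin call per standard login
-- (sp_help_revlogin does this properly, carrying the password hash and SID across)
SELECT 'EXEC sp_addlogin @loginame = ''' + name + ''''
FROM master..syslogins
WHERE isntname = 0 AND name <> 'sa'

Restoring master onto a different server is possible but fussy: the instance has to be started in single-user mode, and master then references databases, paths, and a server name that don't exist on the test box, which is exactly the class of error you hit.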