I am testing different RAID setups in a data warehousing environment, e.g.:
3 x RAID 5 (with four filegroups)
3 x RAID 10 (with four filegroups)
At the moment I have the chance to trial different hardware configurations.
I want to measure the performance differences.
I need to stress test these configs. Are there any tools I can use to do this?
Any loads I can use? What should I measure?
Are there any articles for this kind of environment?
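For concreteness, one crude load can be generated in T-SQL itself; a sketch only, with a made-up table name and row count (each row fills roughly one 8 KB page, so 100,000 rows writes about 800 MB), watched alongside the PerfMon PhysicalDisk counters (Avg. Disk sec/Read, Avg. Disk sec/Write, Disk Bytes/sec). Microsoft's SQLIO and SQLIOSim utilities are the other obvious candidates for raw disk throughput.

-- Crude write load: each row takes about one 8 KB page, so 100,000 rows ~ 800 MB
CREATE TABLE dbo.IoLoadTest (Id int IDENTITY(1,1) PRIMARY KEY, Payload char(8000) NOT NULL DEFAULT 'x')
DECLARE @i int
SET @i = 0
WHILE @i < 100000
BEGIN
    INSERT INTO dbo.IoLoadTest DEFAULT VALUES
    SET @i = @i + 1
END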
What do you recommend for stress-testing the performance of key stored procedures (they have been identified) for our application? The parameters can be programmatically selected, for example:
Select 'exec my_proc @id = ' + Cast(id As varchar) From myTable Where foo = 'bar'
I have the Support Tools Available for Stress Testing & Performance Analysis (http://www.microsoft.com/downloads/details.aspx?FamilyId=5691AB53-893A-4AAF-B4A6-9A8BB9669A8B&displaylang=en) from Microsoft's site, and they are pretty good.
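A harness built around that pattern might look like the sketch below (my_proc, myTable and foo are from the example above; everything else is an assumption):

-- Build one EXEC statement per qualifying row and run them in sequence;
-- wrap with SET STATISTICS TIME ON or a Profiler trace to capture timings
DECLARE @cmd nvarchar(200)
DECLARE ids CURSOR FAST_FORWARD FOR
    SELECT 'exec my_proc @id = ' + CAST(id AS varchar) FROM myTable WHERE foo = 'bar'
OPEN ids
FETCH NEXT FROM ids INTO @cmd
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @cmd
    FETCH NEXT FROM ids INTO @cmd
END
CLOSE ids
DEALLOCATE ids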
Recommendations appreciated, and thanks in advance!
I am currently working on a project which needs to load over 1,000 XML files. The files are stored across 10 subfolders. I am using a Foreach Loop with a File Enumerator, which is configured at the top of the folder structure and traverses the subfolders. This loops through the files, loads the data and then moves each file to another folder. The package executes fine for a few hundred files but then hangs; this happens after a different number of processed files each time the package is run. While trying to resolve the problem we ran performance counters and noticed that the number of open handles increased significantly just about the time DTExec looked like it had hung, and DTExec also then started taking a lot of the CPU processing time. Has anyone else come across similar situations?
Hi guys, I am looking for a large data warehouse database implemented in SQL Server, to use to run some examples in my thesis. If someone knows where I can find a large data warehouse database (e.g. a sales database), please tell me.
I have been advised to design a multidimensional model based on the operational data model. What I have is: a couple of documents about the application, and the data model of the operational data. Now I have been told to create a data warehouse from which any type of question can be answered. This seems horrendous. Please guide me.
I am re-engineering the data warehouse, and my client is currently using autogenerated keys. Their concern is that after a certain number of keys (I can't remember the figure) SQL Server starts having problems. Does anyone know how I should handle this when I am doing the design? Thanks, any input will be appreciated.
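For reference, the figure for an int IDENTITY column is 2,147,483,647. A sketch of the obvious design-time mitigation, with invented table and column names (bigint raises the ceiling to roughly 9.2 quintillion):

-- Widening the surrogate key from int to bigint removes the practical ceiling
CREATE TABLE dbo.DimCustomer (
    CustomerKey bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL
)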
I am writing a graduation thesis about a clickstream data warehouse with SQL Server. Can anyone give me some samples of this? I just want to learn it, thank you! :)
I am using SQL Server 2005 Integration Services to transfer data from a source database to a data warehouse database. My main problem comes after transferring the data from the main database to the data warehouse: if any record is deleted from the main database, how can we delete the same record in the data warehouse database? Any record I want to delete in the data warehouse is mapped to other tables by foreign keys.
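A sketch of the pattern, with invented names (dw is the warehouse, src a staging copy of the main database; the child fact rows have to go first because of the foreign keys):

-- Remove fact rows whose source rows are gone, then the now-unreferenced dimension rows
DELETE f
FROM dw.dbo.FactSales AS f
WHERE NOT EXISTS (SELECT 1 FROM src.dbo.Sales AS s WHERE s.SalesID = f.SalesID)

DELETE c
FROM dw.dbo.DimCustomer AS c
WHERE NOT EXISTS (SELECT 1 FROM src.dbo.Customer AS s WHERE s.CustomerID = c.CustomerID)
  AND NOT EXISTS (SELECT 1 FROM dw.dbo.FactSales AS f WHERE f.CustomerID = c.CustomerID)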
Need help. I am managing a data warehouse (80 GB database size), and on a daily basis I purge data older than 6 months from a table which has more than 140 million rows. The daily data load performance is degrading. The table has no clustered index (only non-clustered indexes).
I tried dropping and rebuilding the non-clustered indexes; that didn't work.
One way to solve the problem is to drop the non-clustered indexes, bcp out the data, truncate the table, bcp the data back in and rebuild the non-clustered indexes. But this is too risky, and it takes 14 hours just to bcp out the data.
This was not an issue in SQL Server 6.5, because SQL 6.5 always inserted new records at the end of the heap (a heap being a table with non-clustered indexes but no clustered index). In contrast, SQL Server 7.0 first checks for available space in existing pages by using the Page Free Space (PFS) pages, and this is where it is killing the performance.
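One thing that may be worth trying before the bcp route, as a sketch only (LoadDate is an invented column name): clustering the table on the purge column keeps new inserts at the end of the table and turns the daily purge into a contiguous range delete instead of scattered heap-page hunting.

-- One-off: convert the heap into a clustered table keyed on the purge column
CREATE CLUSTERED INDEX IX_BigTable_LoadDate ON dbo.BigTable (LoadDate)

-- The daily purge then deletes a contiguous range of pages
DELETE FROM dbo.BigTable WHERE LoadDate < DATEADD(month, -6, GETDATE())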
I have a new server with 32 GB of RAM installed, and I have user databases on it. I am using SQL Server 2000 Enterprise Edition, and the platform is Windows 2003 Advanced Server, which supports up to 128 GB of memory.
sp_configure 'awe enabled' is set to 1, and AWE is enabled at the OS level as well.
max server memory (MB) is 2147483647 (the default).
I was doing some stress tests on this server but memory usage doesn't go beyond 180 MB. Can someone suggest a test for physical RAM?
How can I make sure that the application will make full use of the available physical memory?
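A sketch of one kind of check (dbo.BigTable is a stand-in for any large table; the scan is there purely to drag pages into the buffer pool). Note that Task Manager understates AWE memory, so the sysperfinfo counters are probably a better gauge than process size:

-- Force a large table through the buffer pool, then look at the memory counters
SELECT COUNT(*) FROM dbo.BigTable WITH (INDEX(0))   -- full table scan

SELECT counter_name, cntr_value / 1024 AS value_mb
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE '%Server Memory%'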
I am trying to import a tab-delimited data file using BULK INSERT. I have used bcp to create a format file, since the destination table has around 20 columns but the data file has only three. Here's the problem: the columns I am trying to import comprise ID (an int identity column), Name (a varchar(255) column) and Status (a smallint column). The data file contains identity values for the first column, so I am using the KEEPIDENTITY modifier. The Status column is mandatory, so I have set all rows in the data file to zero for that column. All of the other columns in the destination table either allow NULL or have default values. When I BULK INSERT the file using the format file, the identity column is NOT imported and the Status column gets the value 3376. The Name column is the only one that gets imported correctly. Here's the format file:

8.0
3
1 SQLINT 0 4 "\t" 1 ID ""
2 SQLCHAR 0 0 "\t" 2 Name SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 "\r\n" 4 Status ""

Sorry, it's a bit messy. Where is 3376 coming from, and why are my identity values for column ID not being imported?
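For completeness, the statement itself looks like this (the paths and table name are placeholders):

BULK INSERT dbo.MyTable
FROM 'C:\data\import.txt'
WITH (FORMATFILE = 'C:\data\import.fmt', KEEPIDENTITY)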
Hi all, I have created fact tables and dimension tables in the data warehouse database, and I created an OLAP cube from those tables. I want to run an SSIS package which populates these fact and dimension tables from the data sources.
I have been asked to perform a performance stress test on a SQL server with new hardware that we are going to be receiving.
How have some of you performed your stress analysis against new or existing hardware?
This hardware that I am going to receive will have to be configured within a high availability environment. I want to take this opportunity to really put a beat down on this server.
I have created a database and an OLAP cube in Analysis Services using SSAS. In SSAS I have used a data source which uses SQL tables to populate the OLAP cube. Now, when I add some more data to my SQL tables and try to deploy the cube, the newly added data is not getting populated into the cube. So I want to run an SSIS package which will import data from the SQL tables into this OLAP cube.
Can you please help me with how to write this SSIS package to import data from SQL tables into the OLAP cube? (Very urgent issue.)
I have a Windows 2003 Server running SQL 2005. The server has 32 GB of memory and I have enabled AWE in SQL. I have also configured the min and max SQL memory as 1 GB and 28 GB, respectively. However, this server currently has very low activity, so I'm not sure whether my AWE-related changes worked. The SQLSERVR.EXE process takes up about 100 MB of memory. Is there any tool or script that I can use to memory-stress SQL to confirm that AWE is really in effect?
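One minimal check (SQL 2005): the memory-clerk DMV exposes AWE allocations directly, so a nonzero total here should mean AWE is actually in use. This is only a confirmation query, not a stress test:

-- Sums AWE allocations across all memory clerks; zero means AWE is not being used
SELECT SUM(awe_allocated_kb) / 1024 AS awe_allocated_mb
FROM sys.dm_os_memory_clerks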
We have built two testing apps for sending and receiving files across the network reliably, using SQL Express as the database backend. The apps seem to work fine under light load. However, during stress tests we always get the following exception:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
During the stress test, both the sender and receiver run on the same machine. The sender creates file fragments, stores them in the sender database and then sends them out to the network. File fragments are deleted from the sender database when the sender receives an acknowledgement from the receiver. On the receiver side, file fragments are stored in the receiver database as they come in from the network. The corresponding file fragments are deleted from the receiver database when a complete file has been received.
There is a maximum of about 1,500 updates and 1,500 deletes per second on the sender database. On the receiver side, the maximum is about 300 updates and 300 deletes per second. Our goal is to send 30 GB of data (the test should run for about 10 hours). As said before, we have never had a good completed test run; a "timeout" exception is always thrown from the sender app (when it tries to end a transaction). It can happen as early as 1.5 hours after we start the test. Note that although we are sending 30 GB of data, at any point in time the database shouldn't be too big (it should be well within the 4 GB limit), because we delete file fragments relatively soon.
Next we changed the "Query Wait" setting (in the Management Studio Advanced server properties) from the default of -1 to a very big number, and then we had a successful run sending 30 GB of data.
- First of all, are we doing something wrong in terms of dealing with SQL Express? Is SQL Express able to handle long-running, heavy-load transactions for hours?
- We also noticed that even before we got the timeout exception, the memory usage of sqlservr.exe kept growing. Maybe it doesn't get a chance to clean up internally. If the app hammers SQL Express for hours, how does it handle fragmentation? I assume it needs some sort of defragmentation, otherwise performance will degrade significantly...
- The Query Wait setting seems to play an important role here; is there any guideline on how to pick a reasonable value (see the sp_configure sketch after this list)? Or should we pick a relatively small number and then retry in our app when we get timeout exceptions?
- Is it possible that we are running into some SQL Express resource limits? Any idea how we can tell, other than watching the VM size of sqlservr.exe?
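For reference, the same setting can be changed server-wide with sp_configure instead of through Management Studio. This is only a sketch; the 30 seconds is an arbitrary illustration, not a recommendation:

-- 'query wait (s)' is an advanced option; the default of -1 means the wait is
-- computed as 25 times the estimated query cost
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query wait (s)', 30   -- seconds a query waits for memory resources
RECONFIGURE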
Any help or suggestions would be greatly appreciated!
Could not load file or assembly 'Microsoft.DataWarehouse, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' - the system cannot find the file specified.
I get this message when I try to do a forecast from the Data Mining - Forecast menu.
Forecast from Table Tools, Analyse, Forecast works perfectly.
I am running Vista and using the DMAddins_SampleData workbook.
I am using SQL Server 2005 Express Edition. I have written a stored procedure that uses a transaction to make sure that it rolls back in case of failure. Is there a way to test that the transaction works? I don't know how to cause an error that would make the transaction roll back and re-raise the error with the details of the exception. Thanks, Laura
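A minimal sketch of one way to force the failure path (dbo.TestTable and the message text are invented; RAISERROR at severity 16 inside the TRY block is enough to land in CATCH, where the error details are still available):

BEGIN TRY
    BEGIN TRANSACTION
    INSERT INTO dbo.TestTable (Id) VALUES (1)
    RAISERROR('Simulated failure for rollback testing', 16, 1)  -- forces the CATCH block
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION
    DECLARE @msg nvarchar(2048)
    SELECT @msg = ERROR_MESSAGE()
    RAISERROR(@msg, 16, 1)  -- re-raise with the details of the exception
END CATCH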
Hello everyone, I was hoping to get some help. I have SQL Server 2000 and IIS 5.1 running, and I'm using VBScript through ASP. I have a table with two key columns that together make a unique key. I am trying to test my table from ASP so that if a duplicate unique key is inserted, an error is shown... Can anyone help me?
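One way to do the check on the SQL side, as a sketch (table, column and procedure names invented; the alternative is to let the insert fail and trap error 2627, the duplicate-key error, in ASP):

CREATE PROCEDURE dbo.InsertIfNew (@KeyA int, @KeyB int)
AS
IF EXISTS (SELECT 1 FROM dbo.MyTable WHERE KeyA = @KeyA AND KeyB = @KeyB)
    RAISERROR('Duplicate key', 16, 1)   -- the ASP page can branch on this error
ELSE
    INSERT INTO dbo.MyTable (KeyA, KeyB) VALUES (@KeyA, @KeyB)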
I want to run remote procs from a local, central proc, but I don't want to crash my proc if the remote NT Server or SQL Server is not available. So I want to test for NT and SQL availability first.
/* I find the server-dbasename name in a lookup table, and then build a dynamic script */
SELECT @DBName = '<SRVRNAME~DBASENAME>'
SELECT @QueryString = N'master..xp_cmdshell "ECHO > \\' + SUBSTRING(@DBName, 1, 6) + '\pipe\sql\query", no_output'
EXEC @Results = sp_executesql @QueryString
SELECT @Results /* <-- testing here.... */
=====================
If the NT / SQL is running, @Results should = 0 (success on the ECHO to the pipe).
If the NT / SQL is NOT running, @Results should = 1 (error on the ECHO to the pipe).
If I test in QA (Query Analyzer), I get the expected @Results values.
If I run the proc, I get @Results = 0 in both cases, and then my proc fails on the remote proc call if the NT / SQL is NOT running.
I don't know how to write DMO in T-SQL to try that approach. I have also tried using odbcping.
Hints? Is there a better way to test whether a remote NT / SQL Server is running?
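One alternative I have been meaning to try: shell out to osql instead of ECHO, since osql actually attempts a SQL login rather than just touching the pipe (REMOTESRV is a placeholder; -l 5 caps the login wait at five seconds):

DECLARE @Results int
EXEC @Results = master.dbo.xp_cmdshell 'osql -S REMOTESRV -E -Q "SELECT 1" -l 5', no_output
SELECT @Results   -- 0 = server reachable, 1 = connection failed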
I have performed all the steps for replication. There are a few problems I am encountering: (i) how do I check whether the replication is working or not? (ii) can replication on the subscribing server automatically create the table which the publishing server is publishing? (iii) please point me to some other forums for SQL Server.
My understanding is that in replication, if I modify the data in one place it should automatically be modified on the subscribing server, but this is not happening (it's not flashing any error message as such). Please tell me everything I need to do to make sure it all starts working.
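For (i), the crudest smoke test is to insert a marker row at the publisher and look for it on the subscriber once the agents have run; a sketch with an invented table (which must be one of the published articles):

-- On the publisher:
INSERT INTO dbo.ReplTest (Id, Note) VALUES (1, 'replication ping')
-- After the snapshot/distribution agents have run, on the subscriber:
SELECT Id, Note FROM dbo.ReplTest WHERE Id = 1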
I'm new to this forum, and new to OLAP testing as well. I would like to learn about OLAP testing and get guidance for SQL Server Analysis Services.
I have been trying to install SQL Server 2000 on my system but it is not going through. Can anybody send me information on how to install SQL Server and how to play with a sample database?
Has anyone ever heard of using Response.Write to test SQL queries in the query designer? Here is the actual suggestion I was given recently:
Response.Write your SQL query. Copy the query to your DB query tool and run the query. Adjust the query as necessary to return the recordset you desire. When you have the proper query, adjust your ASP code.
The query editor/designer doesn't like certain symbols and extraneous code. So how exactly can you use Response.Write with a SQL query without getting errors in the designer?