SQL Server 2005 is installed on a brand new 64-bit server (Windows 2003 x64 Standard Edition, 2.4 GHz AMD Opteron, 2 CPUs, 8.8 GB of RAM). There are barely a few hundred rows of data scattered among a few tables in one database.
SQL Server and SSIS performance grossly degrades overnight, and in the morning everything is slow, including clicking a toolbar selection. It takes 3 seconds to execute a simple SELECT statement against an empty table.
It takes 15-20 seconds to execute an SSIS package that would normally take 2-3 seconds.
But once SQL Server is restarted, everything returns to normal and performance is good all day; then the next day everything is slow again.
Hi, I have SQL Server 2005 Standard Edition on my system and was wondering what the difference is between Standard Edition and Developer Edition, and which one is better. Most of what I do on SQL Server is write sprocs, create tables, etc. Any ideas will be appreciated. Regards, Karen
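As an aside, if you ever need to confirm exactly which edition and build an instance is running, SERVERPROPERTY answers it directly; a quick check:

-- Edition, build number, and service pack level of the connected instance.
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ServicePack;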
I feel like the SSIS encryption model has a serious flaw, especially when linked to SQL Agent jobs.
I have posted, and others have posted, messages about this. Something is plain wrong with SSIS encryption keys and password protection. Also, you do not have the choice not to protect the packages. In my case, protecting packages is completely useless.
I created config files for all my packages' connection passwords.
Now, per our IT policy, I had to change my password again, and of course all packages now return multiple errors when I open them.
Thankfully, the config files did their job and the packages are run anyway by SQL Agent; however, having to manually retype and resave all packages just to clear the errors is a plain hassle. Not to speak of people not using the config files and the correct "Run As" SQL Agent account.
I stress the fact that in a real world production environment all packages are driven by SQL Agent jobs and MUST run automatically.
Here is the error I get after opening a package after changing my password:
Error 1 Error loading Constants05.dtsx: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. c:\projects\ssis packages\ssis constants\Constants05.dtsx 1 1
So why isn't this key automatically adjusted after a Windows NT domain password change?
How can I refresh the key so I don't have to retype all the packages' connection passwords and rebuild and check in everything again?
I do not think the solution is "use an application account whose password never changes when you create your SSIS packages"; however, at this time, it is the only solution I can think of.
How do you guys deal with this problem?
I still do not understand the SSIS security model. I feel it is disconnected from reality and impracticable in a production environment like mine.
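The behavior described is consistent with the default ProtectionLevel, EncryptSensitiveWithUserKey, which ties the encrypted values to the current Windows user key. One workaround - a sketch, with the file name taken from the error above and the password a placeholder - is to re-save the package with a protection level that does not depend on that key, e.g. via dtutil:

REM Re-encrypt with EncryptSensitiveWithPassword (level 2), so decryption
REM depends on a package password instead of the Windows user key.
dtutil /File Constants05.dtsx /Encrypt File;Constants05.dtsx;2;PlaceholderPassword

REM Or strip sensitive values entirely (level 0, DontSaveSensitive) and let
REM the config files supply the passwords at run time.
dtutil /File Constants05.dtsx /Encrypt File;Constants05.dtsx;0

SQL Agent jobs then supply the password (or read the config file) instead of depending on whoever last saved the package.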
We have been experiencing a major issue since upgrading from SQL 2000 to SQL 2005 over the weekend. Starting today, it appears that the performance of SQL Server hits a limit every 15 minutes.
Our configuration is as follows:
Windows Server 2003 x64 Enterprise
SQL Server 2005 x64 Enterprise
HP DL585 with 4 dual core Opterons
32 GB of RAM
2 TB EMC SAN
At first, I thought there was a memory pressure problem, since I had the default max memory set. After changing the max memory to only 25 GB (out of 32 available), the issue went away temporarily. However, after 15-20 minutes, the number of batches/sec dropped in half, and remained at half until I changed the max memory setting again. Over the course of the day, I was able to fix the issue each time just by changing the max memory by 1 MB (from 30,000 to 29,999 and back from 29,999 to 30,000). Each time, the batches/sec counter immediately doubles and remains there for about 15-20 minutes. None of the SQL statements have changed since upgrading.
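For reference, the 1 MB nudge amounts to this (a sketch; 30,000 MB is the current setting from the description above):

-- 'max server memory' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Nudge the setting down 1 MB and back up; each RECONFIGURE forces
-- SQL Server to recompute its memory target.
EXEC sp_configure 'max server memory (MB)', 29999;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 30000;
RECONFIGURE;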
I have found this post, which talks about a similar issue at the end of the thread:
I can't find the 'SQL Server: SSIS Pipeline' performance object in Performance Monitor on a 64-bit SQL Server. I see it on a 32-bit one. Does anybody know why?
Right now I have built an SSIS package to transform data from an external source into a local database server. I schedule it to be processed on that database server (say, Server A). Would there be any performance difference if I moved the SSIS package to be processed on another server (say, Server B)? I'd like to separate the processing because I want to reduce the workload on Server A by moving the SSIS process to Server B. Am I correct?
Hello! I have a very simply structured table:

id | data

where "data" is a varchar(100). This table would contain a lot of rows (~500,000,000), and I want to select all "id" where data = @data. Is it realistic that SQL Server could serve this request on a normal web server within 1 or 2 seconds? Thanks!
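With an index leading on the search column, that kind of equality lookup is a straight index seek and stays fast even at that row count. A minimal sketch (table name assumed from the post):

CREATE TABLE dbo.MyTable (
    id   INT          NOT NULL,
    data VARCHAR(100) NOT NULL
);

-- Seek on "data"; INCLUDE(id) lets the query be answered
-- entirely from the index, with no base-table lookups.
CREATE NONCLUSTERED INDEX IX_MyTable_data
    ON dbo.MyTable (data) INCLUDE (id);

DECLARE @data VARCHAR(100);
SET @data = 'example';
SELECT id FROM dbo.MyTable WHERE data = @data;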
I have recently upgraded from SQL Server 2000 to SQL Server 2005, and now all my queries run infinitely more slowly.
Here is the scenario - I run an extract of a MS SQL Server database at a client site, then recreate the database on our in-house server - but without indexes etc. Then I run various queries in order to create data files that will be used for importing into a global system. When I was running SQL Server 2000, most of the queries ran in less than 10 seconds each, but under SQL Server 2005 they take 3 minutes or more! Does anybody know of any parameters that I need to adjust to fix this problem?
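One common culprit after a 2000-to-2005 move is stale or missing statistics on the freshly rebuilt database. A cheap first step to try after each reload (the database name is hypothetical):

USE MyExtractDb;  -- the rebuilt in-house copy
GO
-- Refresh statistics on every table so the 2005 optimizer
-- has current row counts and value distributions to plan with.
EXEC sp_updatestats;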
Hi, I have Microsoft SQL Server 2005 Enterprise installed on Windows Server 2003, and I am developing a web application for 500 clients. I am wondering whether I will have any performance issues if I put data for all 500 clients into one 'Articles' table and then filter it on client ID, or whether I should make 500 'Articles' tables, one per client, each with a different name, and then change the SqlDataSource for the GridView depending on which client is working. Besides the 'Articles' table I will have another 10 tables, which means 5,500 tables in total with the second approach, versus only 11 tables with the first. So I am asking: is it better to have more tables with less data, or fewer tables with more data? And what are the pros and cons of both approaches? Thanks a lot!
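For what it's worth, the single-table design is the conventional one: with the client ID leading the key, each client's queries seek straight to their own rows no matter how big the table gets. A sketch (column names assumed):

CREATE TABLE dbo.Articles (
    ClientID  INT           NOT NULL,
    ArticleID INT           NOT NULL,
    Title     NVARCHAR(200) NOT NULL,
    CONSTRAINT PK_Articles PRIMARY KEY (ClientID, ArticleID)
);

-- Every query filters on ClientID, so the clustered key
-- touches only the one client's slice of the table.
DECLARE @ClientID INT;
SET @ClientID = 42;
SELECT ArticleID, Title
FROM dbo.Articles
WHERE ClientID = @ClientID;

It also keeps the application down to one set of queries and one SqlDataSource instead of 500 variants.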
I've got SQL Server 2005 Workgroup Edition running and have an Access ADP application which connects to it. However, since upgrading to SQL Server 2005 from 2000, the ADP project runs a lot slower. Yet when I install Express on a machine and connect the ADP project to it on that same machine, it runs just fine. We have also rebuilt all the indexes for the database, but that doesn't fix the problem. Could someone please help?
I have one query which calculates a running total and takes just 6 minutes to run on the production SQL Server 2000 server, but it takes more than 45 minutes to run on QA on a SQL Server 2005 server. The indexes and data are the same on both servers. What other things can we check besides the indexes? Thanks
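Statistics are the usual suspect when identical data and indexes produce different plans on two servers. A quick check on the 2005 (QA) side - sys.stats is new in 2005, and the table name here is hypothetical:

-- Last-updated date for every statistics object on the table;
-- an old date suggests the QA optimizer is planning from stale statistics.
SELECT name AS StatName,
       STATS_DATE(object_id, stats_id) AS LastUpdated
FROM sys.stats
WHERE object_id = OBJECT_ID('dbo.MyRunningTotalTable');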
Does using varchar in SQL Server 2005 significantly affect performance on updates?
Why or why not?
I have seen many SQL Server databases with many varchar columns - in databases other than SQL Server, it is advised not to use varchar because it significantly impacts performance.
I am trying to weigh when it is worth trading space for performance.
I have a table with 40 columns and it contains 4 million records. I was getting the results for one year in 40 seconds. After tuning, it is returning in 24 seconds (what I did was create an index on the ORDER BY fields).
Can you please suggest ways I can increase the performance further?
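If the query only reads a handful of the 40 columns, a covering index may cut this further: SQL Server 2005 can then answer the query from the index alone, never touching the wide base rows. A sketch with assumed table and column names:

-- Key on the ORDER BY column; INCLUDE the selected columns so the
-- index covers the query and no base-table lookups are needed.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate_Covering
ON dbo.Orders (OrderDate)
INCLUDE (CustomerID, Amount, Status);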
I'm not sure I chose the right forum, so any comments on that are also welcome.
We recently changed from SQL Server 2000 to SQL Server 2005. In the beginning all went fine, but now we are struggling with a severe performance problem: suddenly SQL Server 2005 reaches its maximum and is no longer able to work properly -> extremely slow.
I'm wondering whether there are other people / companies / ... sharing this same issue?
A query that was taking 20 seconds and consuming 70% CPU takes only 1 second after setting the Maximum Memory property to 2048 MB - why?
Server:
OS: Microsoft(R) Windows(R) Server 2003, Enterprise Edition, Version 5.2.3790, Service Pack 1, Build 3790
8 GB memory
Two dual-core AMD Opteron 285 2.6 GHz processors
Server is not configured for AWE
Fiber channel connection to EMC CLARiiON - two LUNs: one for MDF, one for LDF

SQL 2005:
SQL 2005 32-bit Standard Edition - SP1 (version 9.0.2047)
Three instances installed on the server - only one instance in use
Binaries and system databases on local mirrored disk
Database file (MDF) on one EMC LUN - dedicated physical drives
Log file (LDF) on one EMC LUN - dedicated physical drives
Query in question:
SELECT TOP 10
    Address.Address1, Address.Address2, Address.City, Address.County,
    Address.State, Address.ZIPCode, Address.Country,
    Client.Name, Quote.Deleted, Client.PrimaryContact, Client.DBA, Client.Type,
    Quote.Status, Quote.LOB, Client.ClientID, Quote.QuoteID, Quote.PolicyNumber,
    Quote.EffectiveDate, Quote.ExpirationDate, Quote.Description, Quote.Description2,
    Quote.DateModified, Quote.DateAccessed, Quote.CurrentPremium, Quote.TransactionDate,
    Quote.CreationDate, Quote.Producer
FROM (Client
    INNER JOIN Address ON Client.ClientID = Address.ClientID)
    INNER JOIN Quote ON Client.ClientID = Quote.ClientID
WHERE (Quote.Deleted = 0) AND (Address.AddressType = 'Mailing')
ORDER BY Client.Name
With the default maximum memory setting (2,147,483,647 MB), the query runs in 20 seconds and consumes over 70% of the CPU.
After changing the maximum memory setting to 2048 MB, the query runs in less than 1 second.
The questions are: What is the best practice for setting the minimum and maximum memory settings for SQL 2005? And what can be monitored to identify the cause of this type of issue - using Profiler, PerfMon, or another tool?
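On the monitoring side, SQL Server 2005 also exposes its PerfMon counters through a DMV, which makes before/after comparisons easy to script; a sketch:

-- The same counters PerfMon shows, queryable from T-SQL; watch
-- throughput and memory pressure around the setting change.
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Batch Requests/sec',
                       'Page life expectancy',
                       'Memory Grants Pending');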
I am looking for a useful SQL Server 2005 performance tuning book. I have been searching for a really good one, as I am going to start a job next month in the financial domain, with SQL Server 2005 performance tuning as one of the requirements, so I am looking for a book which can help me do well at my workplace. Any suggestions and links appreciated in advance.
Does anyone know of any documentation on the performance of partition merge/split? Does the merge or split of a partition cause any locking on the partitioned table? If you were merging or splitting a large volume of data to rebalance your partitioned table, would you potentially lock users out?
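For reference, these are the operations in question; a sketch of the syntax only (function name and boundary values assumed), which says nothing about their locking behavior:

-- Add a boundary value, splitting an existing partition in two.
-- (A SPLIT also requires a NEXT USED filegroup on the partition scheme.)
ALTER PARTITION FUNCTION pfOrderDate() SPLIT RANGE ('2007-01-01');

-- Remove a boundary value, merging two adjacent partitions.
ALTER PARTITION FUNCTION pfOrderDate() MERGE RANGE ('2005-01-01');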
My performance counters for SQL Server 2005 are corrupted. How do I repair them?
Any help would be appreciated. Thanks.
Salyx
Specs: Windows 2003 Standard, AMD x64. SQL Server 2005 x64, 9.00.3042.00, SP2, Standard Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2). This is a new install, so no "upgrade from SQL 2000". This is a production server, so "reboot" is hopefully not part of the suggested repair.
Symptom: Open Performance Monitor. Open Add Counters. Open the "Performance Object" dropdown. Instead of the SQL Server performance counter names, a list of 4-digit numbers appears. Other performance counters, e.g. Processor, work as normal.
Attempted repair 1 - Recovery of system performance counters:
Open Command Prompt
cd \Windows\System32
lodctr /R
This failed to restore the full set of performance counters for an unknown reason.
Attempted repair 2 - Recovery from a backup file from a second host:
I used the performance counter backup file from a second host which has an identical Windows install. This properly restored the system performance counters, but failed to restore the SQL Server ones. This seems odd, because both systems have - as far as I can tell - the same applications installed.
Open Command Prompt
cd \Windows\System32
REM Load backup file from second host
lodctr /R:c:\PerfStringBackup.INI
Attempted repair 3 - Recover SQL Server-specific counters:
Open Command Prompt
cd \Windows\System32
REM Load backup file from second host
lodctr /R:c:\PerfStringBackup.INI
REM Clear and re-load MSSQLServer counters...
unlodctr MSSQLServer
lodctr "/R:C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Binn\sqlctr.ini"
Executing this pretty much wiped most performance counters. Only a small sub-set is now available.
More info: SQL Server 2005 and later SP2 were installed under the administrator account. The MSSQLServer service runs under its own Windows account (permission issues?). I get Event Log entries regarding x86 vs x64 performance counter libraries; these, however, refer to ASP, not SQL Server. I have two (virtually) identical hosts (same install sequence of apps), and the performance counters on the second host work fine. Exctrlst.exe lists the MSSQLSERVER service, but I don't know how to diagnose the details.
We have our SQL databases clustered using MSCS on x64 servers and are planning to apply SP2. During initial tests, we found that around 20-25% of queries perform slower after applying SP2 compared to SP1. Just wanted to know if anyone else has seen the same behavior, and if there are any known patterns / issues with respect to performance for SP2.
I have a SQL Server 2005 database where covering indexes had to be used to improve performance for the heavy amount of retrievals; however, the inserts into the tables are now very slow, of course. Is there any way to improve the performance of the inserts without taking away the indexes?
Would changing locking or partitioning the index help the inserts?
Other databases use a concept of "freespace" that is set up in the beginning - pre-allocating space for inserts - is there anything like this in SQL Server 2005?
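The closest equivalent in SQL Server is the fill factor: it leaves a percentage of each index page empty when the index is built or rebuilt, so inserts have room before page splits set in. A sketch with assumed names:

-- Rebuild the covering index leaving 20% free space on each leaf page;
-- PAD_INDEX = ON applies the same fill factor to the intermediate levels.
ALTER INDEX IX_MyTable_Covering ON dbo.MyTable
REBUILD WITH (FILLFACTOR = 80, PAD_INDEX = ON);

The free space fills up over time, so the rebuild has to be repeated as part of regular index maintenance.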
We have updated to SQL Server 2005, let's say, in a hurry, without thinking or testing. The databases were attached to the new instance of SQL Server 2005. It looked great when I tested it alone, but then a new day came, and once all the users had logged into the system we had a big problem. The response times are very long and users receive timeout errors all the time.
A little background:
The instance of SQL Server 2005 is installed on the same server that 2000 was installed on; 2000 has been uninstalled. It is a Xeon 3.2 GHz with 2 GB RAM and SCSI RAID. Data and logs are on different spindles.
The application is old ASP code, and some parts are not optimized at all. But it worked fine on SQL Server 2000.
What could be the problem?
I really don't want to downgrade to SQL Server 2000.
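For databases attached straight from 2000 into a 2005 instance, a few one-time maintenance steps are commonly recommended before anything else; a sketch, assuming a database named MyDb:

USE MyDb;
GO
-- Correct row and page counts that 2000 may have left inaccurate.
DBCC UPDATEUSAGE (0);
GO
-- Rebuild statistics so the 2005 optimizer plans from fresh data.
EXEC sp_updatestats;
GO
-- Attached databases keep compatibility level 80; move to 90
-- once the application has been verified against it.
EXEC sp_dbcmptlevel 'MyDb', 90;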
Wondering if anyone has any experience with SQL Server Express Edition (SSEXP). We're looking at a mobile sales force type model: a local database on a laptop with no real-time network connection. The users would collect data locally, then connect to the network every few days to replicate the data to a central server. So, questions: Has anyone tried anything similar? How stable/mature is SSEXP? Any other thoughts, alternatives, or gotchas anyone can think of?
I installed the SQL Server 2005 SP2 update 2 rollup on my 64-bit server and the performance has tanked! I installed rollup 3 on some of the instances, but that did not seem to help. I thought it was just a linked server performance issue, but my optimization job started running today on one of the "update 2" instances and so far it's been running about 10 hours longer than it normally does. Rollup 3 fixed our stack dumping issues, but we NEED to have this performance thing fixed! I saw that MS came out with update 4 last week - it doesn't say anything about fixing this, though. Has anyone else experienced this? I'm not necessarily expecting anyone to have a fix, I just want to know I'm looking in the right place before I call MS.
Everything is flowing smoothly with my SQL Server database except one type of retrieval: when the WHERE clause has a range of data values, performance is terrible. I cannot anticipate every range. There are indexes on the table to try to help; however, nothing seems to help. Has anyone had a similar problem? Any suggestions to improve performance?
I'm having an issue with a query I'm running on SQL Server 2005. It's a semi-complex query involving an inline table function and several left outer joins which are joined onto the results of the function call. Two of the left outer joins are then qualified in a WHERE clause of the form WHERE table.Col IS NOT NULL; the idea is that the final result set contains data that has no match in those two tables.
The problem revolves around a WHERE clause in the function and the last left outer join (i.e., one of the ones qualified with WHERE ... IS NOT NULL). When I alter the WHERE clause of the function to further restrict the result set the function returns, the query time shoots up from 1 second to roughly 2-3 minutes. Note that the time the function takes to complete is not affected; the difference in time is purely down to what the query does with the results the function provides. Also note that the change to the WHERE clause produces a subset of the original data; it does not add any more data (it actually restricts the original result set by roughly 1,000 rows).
I can bring the query speed back down again by removing the last left outer join - this join takes one of the columns from the function and joins it to a small table of 924 rows. So it appears that this particular join is the cause of the issue, but only when using the result set generated from the modified function query.
Now, as the thread title alludes, SQL Server 2000 and 2005 handle this differently, or appear to. When I execute this same query on a SQL 2000 machine, there are no apparent time differences, and the data that is returned is as expected. Does anyone have any suggestions as to what might be causing this and how I can fix it? I could simply return the larger result set and use managed code to filter out the rows I don't want; however, I would like to get to the bottom of this, especially if it's going to affect future queries.
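One standard way to pin this down - a sketch of the general technique, not specific to this query - is to capture per-operator plans and timings on both servers and compare where the join strategies diverge:

-- Emit a row per plan operator plus CPU/elapsed times for each run;
-- works on both SQL Server 2000 and 2005, so the plans can be compared.
SET STATISTICS PROFILE ON;
SET STATISTICS TIME ON;
GO
-- ... run the problem query here ...
GO
SET STATISTICS PROFILE OFF;
SET STATISTICS TIME OFF;

If the 2005 plan picks a different join type for the 924-row table (say, a nested loop against an unexpected row estimate), that shows up immediately in the operator list.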
Hello, we currently have our database (MSSQL 2005) on our web server; however, due to increased traffic and business we are now moving the database to its own server. I was wondering if anyone here knew of some good ways to set up/tune Windows Server 2003 and SQL 2005 for best performance. MSSQL will be the only application running on the server, and I want to make sure it is as fast as possible! Thanks in advance!!!
We have recently updated an application from SQL Server CE 2.0 to SQL Server Mobile 2005 and we are seeing a huge decrease in performance. Is this normal? Database queries that used to take 8 or 9 seconds are now around 20 seconds. The database is only about 5 MB, and the two tables in this particular query have 20 rows and 14K rows respectively. The query is basically:
select * from table1 join table2 on table1.myint = table2.myint
myint is the primary key of table2, and I have even created an index on myint for table1. Any ideas?
Hello, I need some major help. I need to build a database using SQL Server for a forum. I am currently using phpBB, but I need that database. I was thinking about it, and it doesn't need to be complicated or anything. I really have no idea where to start, so any help would be welcome. Thank you in advance.
I have a new business, and part of that business includes receiving large amounts of data from time to time. I just found out yesterday that I'm going to be receiving about 1 TB of data from a new client! I'm not set up at all for a data set this large.
I want to use SQL Server as my database. Can I load SQL Server on a desktop PC without having to buy a server? How?
I don't have a clue as to how I need to get set up for this data... hardware or software. Any advice you can give will be outstanding!!!!
Hi, we are using the SQL Server 2005 Full-Text Service. The data is not huge, but it is the kind of data where each record is small and there is a large number of records. There are 35 million records now, with 11 GB of data and about 1.6 GB of FT catalog on the table. This is expected to grow to at least 10 times this size. The issue is that FTS takes a long time to return results when the number of hits (rows) returned from FTS is large; for some searches it takes a very long time, while full-text queries for less common words against the same data and catalog return in a timely manner. The nature of the problem doesn't allow us to return only the top results; we need all the results. So it's not about the size of the data but the number of results returned from FT (as the catalog is inverted). The machine is a dual processor with 4 GB RAM.
I am considering splitting the table, and hence the catalog, and using multiple servers to do full-text searches against smaller catalogs. Is there any other way this issue can be solved?
If splitting is the only way, can you give me an idea of the statistical/standard limit on the number of search results / catalog size at which FTS still gives good results?
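For context, this is the shape of the two query patterns (a sketch with assumed table and column names); the optional top_n_by_rank argument of CONTAINSTABLE is what lets the engine stop early, which this workload cannot use because all results are needed:

-- Unrestricted: every matching key is materialized before the join,
-- so cost grows with the number of hits, not with catalog size.
SELECT t.id, ft.RANK
FROM CONTAINSTABLE(dbo.MyTable, data, 'common_word') AS ft
JOIN dbo.MyTable AS t ON t.id = ft.[KEY];

-- Restricted: the fourth argument (here 1000) returns only the
-- best-ranked hits quickly, but it is a subset, not the full set.
SELECT t.id, ft.RANK
FROM CONTAINSTABLE(dbo.MyTable, data, 'common_word', 1000) AS ft
JOIN dbo.MyTable AS t ON t.id = ft.[KEY];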
I am having major performance issues with Microsoft SQL 2005 x64 Standard Edition on Windows Server 2003 x64. The PC has two quad-core CPUs with 8 GB of RAM and a 500 GB mirrored SCSI (RAID 1) drive system. The database running on the server is about 11 GB. I've run a defrag several times, which helps a little, but I was hoping I could do something else to increase the performance.
I have also found that the bottleneck in the SSIS package is the backup and restore process of the 11 GB database, which takes about 1 hour each way (the backup takes 1 hour and the restore takes 1 hour) when it should take about 11 minutes. Is there anything I can do to make these processes run faster, or to find out why they are taking so long? Any ideas would be a great help.
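A cheap first experiment is to stripe the backup across multiple files and watch its progress, which shows whether the disks or a single output file are the limit (a sketch; the paths and database name are assumptions):

-- Stripe the backup across two files, ideally on separate physical paths;
-- STATS = 10 prints progress every 10% so stalls become visible.
BACKUP DATABASE MyDb
TO DISK = 'D:\Backups\MyDb_1.bak',
   DISK = 'E:\Backups\MyDb_2.bak'
WITH INIT, STATS = 10;

On RAID 1, backing up to the same mirrored pair the data lives on makes the backup compete with itself for I/O; a separate spindle or network target is worth testing.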