SQL7 Memory & Performance
Oct 31, 1999
Just two questions actually. We have built a ColdFusion-based forums package. Currently we have it on two beta sites, and we are using SQL 7 for the database. First, the forums are serving 200-300 people at any given time, about 16,000 unique people a day. SQL 7 seems to stay at around 50% CPU usage on a dual P3 with 512 MB of RAM. Is that normal? It seems like a lot of CPU usage. The other thing is that it takes 500 MB of RAM and just about drains the server of all of its RAM, even though in the memory properties for the SQL Server it is set to a 255 MB maximum. Any insight is appreciated.
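One quick check (a sketch, not a definitive answer): compare the configured value of the memory cap with the value the server is actually running with.

-- SQL 7 syntax: compare config_value and run_value for the memory cap.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO
EXEC sp_configure 'max server memory (MB)'
GO
-- If the running value is still the default, re-apply the cap explicitly:
EXEC sp_configure 'max server memory (MB)', 255
RECONFIGURE WITH OVERRIDE
GO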
View 2 Replies
Jul 9, 2001
I know that SQL 2000 can utilize AWE to address more than 2 GB of application-addressable RAM; however, I'm not sure how much memory SQL 7 can support. BOL hints that SQL 7 does take advantage of the /3GB boot.ini switch in W2K and NT4AS. However, no reference so far illuminates the functional enhancements and possible tradeoffs this change may provide.
Thanks,
Meg
View 1 Replies
Feb 20, 2002
Hello,
I have been trying to find out the maximum memory we can have on a SQL 7 Standard Edition server. I have not been able to find it anywhere...
Any ideas would be greatly appreciated.
TIA
B.
View 1 Replies
Jan 24, 2002
Could anyone tell me how much conventional memory SQL 7 Enterprise on Advanced Server 2000 can have?
The SQL Servers are in an active/active cluster. Currently each node runs with 2 GB of memory, with memory kept available for failover. I am planning to add another 2 GB to each SQL node. Is this possible? Are there any configurations I need to change? Do I need to enable AWE, and if so, how much memory should I allow?
Your help is greatly appreciated.
Jim
View 4 Replies
Feb 7, 2004
I have an SQL 7 server on NT4 SP6, Pentium II500, HW Raid5, 1Gig Ram.
In the db, there's one table, A, with ~100,000 rows.
SELECT col1 from A takes between 15 and 45 secs in Query Analyzer.
SELECT * from A takes 15min+!
So far, I can only compare that to my test system, where SELECT col1 takes less than a second and SELECT * around 15 seconds.
Test System is a P4 1.8, 768MB Ram, Single SCSI disc on Win2K, SQL2000, identical data.
I'd expect the test system to be faster, but not THAT much faster. I suspect the performance problems are hardware-related, but at the moment I have only remote access.
I'd appreciate any hints regarding what to look for, what stupid things I may have done or forgotten or what I could do to track the source of the problem.
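A first step that works over a remote Query Analyzer session is to compare I/O statistics for the two statements and check the table's page density, rather than relying on wall-clock time; a sketch using the table name from the post:

-- Compare logical vs. physical reads and CPU vs. elapsed time for both queries.
SET STATISTICS IO ON
SET STATISTICS TIME ON
SELECT col1 FROM A
SELECT *    FROM A
SET STATISTICS IO OFF
SET STATISTICS TIME OFF
GO

-- Check fragmentation and page density of the table (SQL 7 syntax takes an object id).
DECLARE @id int
SET @id = OBJECT_ID('A')
DBCC SHOWCONTIG (@id)
GO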
TIA, chris
View 13 Replies
Aug 2, 2006
Hi
I did a load testing and found the following observations:
1. The Memory: Pages/sec counter was consistently crossing the limit, going beyond 20.
2. The Target Server Memory was always greater than Total Server Memory
Given the above data, it looks like memory pressure. But I found that Available Memory was always above 200 MB, and Buffer Cache Hit Ratio was close to 99.99%. What could be the reason for this behavior?
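If this is SQL Server 2005 or later, the same counters can be read from T-SQL, which makes it easy to log them during the load test (a sketch):

-- Target vs. Total Server Memory, straight from the DMV (values are in KB).
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM   sys.dm_os_performance_counters
WHERE  object_name LIKE '%Memory Manager%'
  AND  counter_name LIKE '%Server Memory%'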
View 1 Replies
Mar 17, 2008
Hi,
I hope somebody here can help on my problem.
I wrote an MFC application using SQL CE, and it runs perfectly smoothly on Windows CE and Pocket PC 2003.
But when I deploy it to Windows Mobile, the database performance drops dramatically.
I know the drop in performance is due to the I/O speed of the flash memory (the earlier mobile OSes used RAM instead).
Is there any solution or workaround for this problem?
Recently I solved the performance issue on inserts to the DB by using a commit buffer (instead of committing to the .sdf instantly).
But what about SELECT performance? It's too slow; it takes about 3 seconds to select a record from the DB.
Does Microsoft provide any suggestions on SQL CE performance on Windows Mobile?
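One thing that can matter a lot on flash-backed storage is whether the SELECT can seek on an index instead of scanning the table; SQL Server CE supports plain CREATE INDEX, so a hedged sketch (the table and column names here are invented) would be:

-- Hypothetical table: an index on the lookup column lets the engine seek
-- a few pages instead of reading the whole table from flash.
CREATE INDEX IX_Records_RecordKey ON Records (RecordKey);

-- The 3-second lookup then becomes an index seek:
SELECT RecordKey, Payload
FROM   Records
WHERE  RecordKey = @key;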
Thanks
Keith
View 3 Replies
Jan 10, 2008
Hi Experts,
We are having a very serious problem here.
We are running SQL 2005 on a box with 8 CPUs. Task Manager shows CPU usage at 100% and the performance of the server is very poor. The server was rebooted last night and CPU usage is still showing 100%. How can I improve the performance? It is very urgent; can anyone please let me know what is going on and what the solution is?
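Since the instance was just rebooted, the plan cache only contains work done since then, so a DMV query along these lines (a sketch) usually shows which statements are burning the CPU:

-- Top 10 cached statements by total CPU time (SQL Server 2005+).
SELECT TOP 10
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset END
                   - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;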
-Thanks N Regards,
Kanthi.
View 3 Replies
Sep 24, 2007
Hi -
I am facing 2 problems :
PROBLEM 1 :
We have a few packages that run pretty fast on a desktop server with 2 GB of RAM and dual processors (approx 4-5 hours). But the same packages run very, very slowly on another server with 8 CPUs and 12 GB of RAM (they ran for 24 hours without completing).
PROBLEM 2 :
On the desktop server, CPU usage ranges from 40-80% and PF usage is stagnant at 2 GB for the same package. But on the 8-CPU server, CPU usage ranges from 0-10% while PF usage rises from 750 MB to 8 GB.
This has become critical to our application.
TIA,
Shabs
View 4 Replies
May 31, 2007
Hi All,
I'm new at SQL Server; I decided to use it in order to get all the advantages of using VB.NET and SQL. My server runs SQL Server Standard Edition. I'm using a relational DB mostly for complex select queries; every time the server is used it performs 30 or 40 queries at a time, and I have recently realized that the server consumes a lot of memory after one or two days of being up.
Let's say that if I restart SQL Server, memory usage is about 650 MB, but after two days it is around 1.4 GB. I have used SQL Profiler and the tuning advisor, which recommended that I create some indexes. I did create them, but that did not solve the memory problem, although some queries run faster.
My questions are:
Is this memory usage normal?
If not, what should I check to reduce memory usage?
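Growth like this is usually the buffer cache doing its job rather than a leak; if the server is SQL Server 2005 or later, a quick way to see where the memory is going (and whether capping 'max server memory' is worthwhile) is a sketch like this:

-- Buffer pool usage by database (8 KB pages converted to MB).
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS buffer_mb
FROM   sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffer_mb DESC;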
Thanks in advance
George
View 12 Replies
Oct 5, 2006
Hi,
I know the SSIS memory problem has probably been covered quite a bit, but being a newbie with SSIS, I'm not quite sure how to improve the performance of my SSIS package.
Basically, I have a package that loops through all the subdirectories within a specified directory; it then loops through each file in the subdirectory and, with the use of a Data Flow, processes each file (according to its filename) with a Script Component to insert data into a SQL DB.
Each subdirectory has up to 15 different CSV files, but each is less than 5 KB. I have about 100 subdirectories.
When I run the package, it functions properly, but it stalls after a while (no error, just stuck in one Data Flow), and when I checked, the CPU was running at 100%.
I'm not sure how to fix this or improve the memory allocation. I was not expecting any memory problems, as the file sizes are small and the number of rows going into and out of the Script Component is no more than 20.
Any advice? Thanks.
View 1 Replies
Aug 13, 2007
A query that was taking 20 seconds and consuming 70% CPU now takes only 1 second after setting the Maximum Memory property to 2048 MB - why?
Server:
OS Microsoft(R) Windows(R) Server 2003, Enterprise Edition
Version 5.2.3790 Service Pack 1 Build 3790
8 GB memory
Two Dual-core AMD Opteron 285 2.6GHz Processors
Server is not configured for AWE
Fiber channel connection to EMC Clarion - two LUNs - one for MDF, one for LDF
SQL 2005
SQL 2005 32 bit Standard Edition - SP1 (version 9.0.2047)
Three instances installed on server - only one instance in use
Binaries and system databases on local mirrored disk
Database file (MDF) on one EMC LUN - dedicated physical drives
Log file (LDF) on one EMC LUN - dedicated physical drives
Query in question:
SELECT TOP 10
    Address.Address1, Address.Address2, Address.City, Address.County, Address.State,
    Address.ZIPCode, Address.Country, Client.Name, Quote.Deleted, Client.PrimaryContact,
    Client.DBA, Client.Type, Quote.Status, Quote.LOB, Client.ClientID, Quote.QuoteID,
    Quote.PolicyNumber, Quote.EffectiveDate, Quote.ExpirationDate, Quote.Description,
    Quote.Description2, Quote.DateModified, Quote.DateAccessed, Quote.CurrentPremium,
    Quote.TransactionDate, Quote.CreationDate, Quote.Producer
FROM (Client
      INNER JOIN Address ON Client.ClientID = Address.ClientID)
      INNER JOIN Quote ON Client.ClientID = Quote.ClientID
WHERE Quote.Deleted = 0
  AND Address.AddressType = 'Mailing'
ORDER BY Client.Name
Address table - 161,075 rows
Client table - 161,634 rows
Quote table - 59,145 rows
With the default maximum memory setting (2,147,483,647 MB), the query runs in 20 seconds and consumes over 70% of the CPU.
After changing the maximum memory setting to 2048 MB, the query runs in less than 1 second.
Question is:
What is the best practice for setting the minimum and maximum memory settings for SQL 2005?
What can be monitored to identify the cause of this type of issue - Profiler, PerfMon, or another tool?
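As a reference point only (the numbers below are illustrative, not a recommendation), the min/max settings are usually scripted rather than set through the GUI, leaving headroom for the OS and the other instances on the box:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 1024;   -- floor the instance will not release below
EXEC sp_configure 'max server memory (MB)', 2048;   -- cap, as in the test that fixed the query
RECONFIGURE;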
Thanks
View 2 Replies
Jul 5, 2007
Hello,
I'm having a problem with my database server on the network. I'm running Windows 2003 Server Standard Edition with SQL Server 2005 Standard Edition.
The problem is that the server gets stuck and the performance of the whole network is affected. When I use Task Manager to monitor performance, I can see that the sqlservr.exe process is using 1,397,928 K of memory; in Performance Monitor the graphs go crazy and CPU usage grows to 85%.
Can you please let me know if there is something I can do to normalize the server's performance, so that network users can work with the applications fed by this server?
Thanks in advance for your help.
View 6 Replies
Apr 26, 2006
Can someone point me to some good articles, or perhaps directly supply some words of wisdom, regarding wise use of variables within a T-SQL script, from the standpoint of conserving memory and improving execution cost?
For example:
(1) Is it better to use varchars, nvarchars, etc. defined with minimal lengths to support the needs of the script or is it just as efficient to declare all with a length of say 4,000?
(2) I've seen behavior that leads me to believe that when passing a variable as a parameter in a nested procedure call, if the declared types of the parameter and the variable being passed in don't match (i.e. one is numeric(38,10) and the other is int), then implicit type conversions hurt performance. Is this true and how broadly does it apply?
(3) Does the number of variables declared in a script materially impact performance and/or resource utilization?
(4) Is it more efficient to have a series of variable value assignments in a single SELECT statement versus a series of SET statements? Should I always prefer one to the other? Only within a looping construct?
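To make (2) and (4) concrete, here is a small sketch (the procedure name is invented for illustration):

-- (4) One SELECT can assign several variables in a single statement,
--     instead of one SET per variable.
DECLARE @FirstName  varchar(50)
DECLARE @LastName   varchar(50)
DECLARE @OrderTotal numeric(38,10)

SET @OrderTotal = 0                        -- one assignment per SET

SELECT @FirstName  = 'Ada',                -- several assignments in one statement
       @LastName   = 'Lovelace',
       @OrderTotal = @OrderTotal + 19.99

-- (2) A variable whose declared type matches the parameter avoids an implicit
--     numeric(38,10) <-> int conversion (dbo.usp_GetOrders is hypothetical).
DECLARE @CustomerID int
SET @CustomerID = 42
EXEC dbo.usp_GetOrders @CustomerID = @CustomerID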
Thanks,
Shadowraven
View 1 Replies
Apr 22, 2014
Getting the following warning in SSIS - SQL 2012:
[SSIS.Pipeline] Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.
Microsoft had a fix for SQL Server 2008.
How to get around this in SQL 2012?
View 4 Replies
Jul 20, 2005
Fellas!!
This is a very complicated one and it took me a few days to figure out exactly what's going on, but here's the final story:
I have a production environment running on .NET with a SQL Server (2000, SP3). The SQL Server is on a dedicated ProLiant machine with 2 GB RAM (the actual sqlservr.exe process has dynamic memory assignment and can reach up to 1.6 GB RAM). Nothing else is running on that specific computer.
Once SQL Server is started, it hits 300 MB RAM (the minimum set in the server configuration - remember, memory is dynamically acquired).
Then a .NET program requests just about all the data the SQL Server contains (apart from a single table that contains roughly 1.6 million rows and another table that contains about 10,000 rows, all of type IMAGE).
Once all the data is retrieved, the RAM is at about 400 MB. From there on, every update I make to the data on the server causes the RAM to go up by a bit (the updates are done in a transaction which of course is committed at the end). It seems that BLOB updates are the major problem in all of this. For some reason, uploading a blob of size 9 MB causes the RAM to go up by roughly 20 MB, and after commit it goes down 10 MB (a total gain of roughly 10 MB RAM). Eventually the sqlservr.exe process hits its upper limit (1.6 GB) and at that point it starts slowing down.
Some performance checks showed me the SQL Server has a lot of disk activity; it seems to be reading and writing pages of data from/to the disk all the time (which makes the queries much, much slower).
We have a development environment running the exact same code (it is exactly the same in everything except for the amount of data stored in the DB). This does not happen there at all.
I have a few questions:
1. Why is the RAM going up after BLOB updates?
2. Why is the RAM going up at all?
3. How can I tell the DB which tables should remain in RAM at all times (never swapped back to disk)? DBCC PINTABLE does not seem to do the job.
It does not seem to have anything to do with the .NET code.
Thank you very much,
M Yamo.
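For separating "the cache is just growing toward its configured ceiling" from real memory pressure, the same counters perfmon shows are visible from T-SQL on SQL Server 2000 (a sketch):

-- How much memory the instance currently holds vs. how much it wants (values in KB).
SELECT counter_name, cntr_value
FROM   master..sysperfinfo
WHERE  object_name LIKE '%Memory Manager%'
  AND  counter_name LIKE '%Server Memory%'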
View 4 Replies
Nov 17, 2009
I am getting the following warning for my SSIS 2008 package: "Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console." I did look into this warning in SSIS 2008, but didn't find any solution. The package processes data and executes fine, but why do I see this warning? When I run this package on my machine, I see no such warning; it's only when I deploy it to our DEV SSIS server that I get it.
View 7 Replies
Dec 5, 1998
How can I copy a SQL 7 database from one server to another? Please tell me all the ways, because I've tried the obvious and it doesn't work (backup and restore - won't go from one server to another; the DTS Transfer Objects Wizard - throws a whole pile of errors and doesn't do stored procedures and triggers). Thanks for the help in advance. Also, what is the email address to post to the listserv, and where is the FAQ for SQL 7? Most of my listservs tell you where to post right at the bottom of the digest I get - any chance you guys would do that with this list? Thanks again.
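Backup and restore does work across servers as long as the files are relocated during the restore; a sketch (the database and logical file names below are placeholders, and RESTORE FILELISTONLY shows the real ones):

-- On the source server:
BACKUP DATABASE MyDatabase
TO DISK = 'C:\Backups\MyDatabase.bak'

-- Copy the .bak file to the destination server, then:
RESTORE FILELISTONLY
FROM DISK = 'C:\Backups\MyDatabase.bak'        -- lists the logical file names

RESTORE DATABASE MyDatabase
FROM DISK = 'C:\Backups\MyDatabase.bak'
WITH MOVE 'MyDatabase_Data' TO 'D:\MSSQL7\Data\MyDatabase.mdf',
     MOVE 'MyDatabase_Log'  TO 'D:\MSSQL7\Data\MyDatabase.ldf'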
View 6 Replies
Jul 13, 2015
I am looking to test this feature - and the "Transaction Performance Collector" has recommended me a table to port to In-Memory OLTP.
I have now tried the "Table Memory Optimization Advisor" tool.
After a couple of tweaks to the table design, the tool now passes validation, but it will not let me progress to the next step.
Could it be down to not having enough memory? But wouldn't that show up in the advisor?
View 4 Replies
Sep 28, 2007
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------
Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
View 4 Replies
Oct 11, 2007
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad core box with 4 gb of RAM (VMware). OS: Windows 2003 std, SQL Server 2005 std. 3GB is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2,147,483,647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so; by the end of the day it's up around 2 GB. How can I track down what's causing this, and should it even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on it warns about serious degradation of the OS and other programs, in some cases to the point where the settings have to be restored before the server can be brought back up. What are your thoughts on this?
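On the configuration side, the pieces involved are the 'awe enabled' and 'max server memory' options; the values below are examples only, sized to leave headroom for the OS and SSRS, not recommendations:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;                 -- needs Lock Pages in Memory and a service restart
EXEC sp_configure 'max server memory (MB)', 3072;   -- example: leave ~1 GB for the OS and SSRS
RECONFIGURE;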
View 3 Replies
Aug 28, 2015
I have a Windows Server 2012 box with SQL Server 2012 Enterprise. The RAM size is 22 GB, and sometimes SQL Server takes 95% of the memory. My question: how can I reduce the memory usage without killing any process? This is a production server, so there are many background processes running. Also, are there any guides to learning why memory rises so high and how to reduce it?
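SQL Server will normally grow into whatever it is allowed; to see what the 95% is actually being used for before deciding on a 'max server memory' cap, a sketch against the memory-clerk DMV (SQL Server 2012 column names):

-- Memory usage by clerk type, largest first.
SELECT TOP 10
       type,
       SUM(pages_kb) / 1024 AS memory_mb
FROM   sys.dm_os_memory_clerks
GROUP BY type
ORDER BY memory_mb DESC;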
View 10 Replies
Mar 27, 2008
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage so that some memory is left for the OS; this is based on the fact that if max memory is not specified, SQL Server will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL Server service, does the max memory setting cover SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
Thanks,
fshguo.
View 1 Replies
Dec 6, 2006
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory; during execution the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS - no difference. I do have some script components, and I added some code to list the assemblies in the current AppDomain. I see that one particular assembly count increases on every loop: VBAssembly goes up by 6 every time it hits the script component, and along with it the memory climbs. What is this VBAssembly being used for? Is there an update to SQL Server Integration Services that I need?
Thanks! Aaron B.
View 6 Replies
Aug 22, 2007
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. The OS was using about 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB, based on the Idera monitoring software - all memory was allocated between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. System info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now, with more total memory - especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems this 1 GB limit is artificial. If it used 1.6 GB before, why would it not use at least that much now?
thank you
View 4 Replies
Oct 4, 2015
I want to create a lot of indexes in my database for performance, but I need to find the memory usage of those indexes.
How do I find the memory usage of an index in SQL Server?
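Indexes don't reserve memory of their own, but the pages they currently occupy in the buffer pool can be counted; a sketch for the current database (in-row and row-overflow pages only):

SELECT OBJECT_NAME(p.object_id) AS table_name,
       i.name                   AS index_name,
       COUNT(*) * 8 / 1024      AS buffer_mb
FROM   sys.dm_os_buffer_descriptors AS bd
JOIN   sys.allocation_units AS au ON au.allocation_unit_id = bd.allocation_unit_id
JOIN   sys.partitions       AS p  ON p.hobt_id = au.container_id
                                 AND au.type IN (1, 3)     -- IN_ROW_DATA, ROW_OVERFLOW_DATA
JOIN   sys.indexes          AS i  ON i.object_id = p.object_id
                                 AND i.index_id  = p.index_id
WHERE  bd.database_id = DB_ID()
GROUP BY p.object_id, i.name
ORDER BY buffer_mb DESC;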
View 9 Replies
Jun 15, 2015
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had on it, but when I try to remove the filegroup I receive an error.
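For reference, the statements involved look like the sketch below (the database, container, and filegroup names are placeholders). Note that on SQL Server 2014 a memory-optimized filegroup generally cannot be removed once it has been added; dropping and recreating the database is the usual workaround.

ALTER DATABASE MyDb REMOVE FILE MyDb_MemOptContainer   -- placeholder container name
ALTER DATABASE MyDb REMOVE FILEGROUP MemOptFG          -- placeholder filegroup name; this is the step that errors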
View 12 Replies
Jul 17, 2006
Hi,
I am going to install SQL Server 2000 (then SQL 2K5) on a Windows Server 2003 box with 8 GB of RAM; it will be 16 GB in the near future.
I would like to reserve a fixed amount of memory (for the moment, less than 3-4 GB) for SQL Server and the rest for applications (virtualization).
Without AWE enabled, is the max memory for SQL Server 2K5 4 GB, as for SQL Server 2000?
How can I manage and optimize memory with AWE in mind? (Any docs or websites available?)
Thank
View 5 Replies
Feb 10, 2015
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the 'min memory per query' and 'index create memory' options, because I just haven't seen any need to - my typical thought is that if it isn't broke... Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
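For reference, both options can be checked, and put back to their self-configuring defaults, through sp_configure (a sketch; the defaults shown are the documented ones):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)';      -- compare config_value and run_value
EXEC sp_configure 'index create memory (KB)';

-- Documented defaults, if the decision is to revert:
EXEC sp_configure 'min memory per query (KB)', 1024;
EXEC sp_configure 'index create memory (KB)', 0;    -- 0 = self-configuring
RECONFIGURE;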
View 3 Replies
Mar 20, 2008
Hello friends - can anybody explain what it means (the question is the title itself) and how to resolve it?
View 2 Replies
Sep 12, 2004
1. Use the SQL Server Agent service to handle the schedule.
2. Use a .NET Windows service with timers to call SqlClientConnection.
Of the two options above, which would be faster and give better performance?
View 2 Replies
Jun 23, 2006
Hello Everyone,
I have a very complex performance issue with our production database. Here's the scenario: we have a production web server and a development web server, both running SQL Server 2000.
I encountered various performance issues on the production server with a particular query. It would take approximately 22 seconds to return 100 rows - that's about 0.22 seconds per row. Note: I ran the query in single-user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plans and found that they were exactly the same in both cases. Then I took a look at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming that the issue is related to some external hardware factor such as disk space or memory, or to OS or software issues such as service packs or SQL Server configurations.
Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There are 55 GB of free space on the disk.
2. Applied the SQL Server SP4 service pack.
3. Defragmented the primary hard drive.
4. Applied all Windows Server 2003 updates.
Here are the prod server's system specs:
2x Intel Xeon 2.67 GHz
Total physical memory 2 GB, available physical memory 815 MB
Windows Server 2003 SE w/ SP1
Here are the dev server's system specs:
2x Intel Xeon 2.80 GHz
2 GB DDR2-SDRAM
Windows Server 2003 SE w/ SP1
I'm not sure what else to do; the query performance is an order of magnitude different and I can't explain it. To me it is a hardware or operating system related issue.
Any ideas would help me greatly!
Thanks,
Brian T
View 2 Replies
Jul 16, 2000
Hi there,
After installing SP2 we encounter a problem with MMC. When selecting a DB we receive a runtime error, and the DB items/properties are not shown.
Is there anyone with the same problem who knows how to fix this? I searched the MS site for more information but there is nothing on it.
Danny
View 1 Replies