Memory Management (Fixed Memory, AWE)
Jul 17, 2006
Hi,
I am going to install SQL Server 2000 (and later SQL Server 2005) on a Windows Server 2003 machine with 8 GB of RAM, which will be increased to 16 GB in the near future.
I would like to reserve a fixed amount of memory for SQL Server (for the moment less than 3-4 GB) and leave the rest for applications (virtualization).
Without AWE enabled, is the maximum memory for SQL Server 2005 4 GB, as it is for SQL Server 2000?
How can I manage and optimize memory with AWE in mind? (Is there any documentation or website available?)
Thanks
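A minimal sketch of what that cap looks like with sp_configure, assuming the roughly 3 GB figure mentioned above (3072 MB is only an example; on 32-bit builds AWE also needs the Lock Pages in Memory privilege for the service account, and in SQL Server 2000 the AWE change only takes effect after a restart):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Only relevant on 32-bit editions that should address memory above 4 GB
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;
-- Cap the buffer pool at roughly 3 GB (value is in MB)
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;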
View 5 Replies
Jun 8, 2007
Someone just recommended to a customer that they use Fixed instead of Dynamic for the SQL Server memory setting, telling me I was wrong to set it to Dynamic.
I tried to explain: why would you want to give away memory you do not need? He just answered that I was wrong. Can anyone help prove one of us wrong?
Which way is correct? I was told by a very bright person with a strong background in SQL Server development that the word "fixed" was written in lower case as a reminder that you can, but should never, use a fixed memory size, and that you should always use Dynamic (which, by the way, does use a capital "D"...).
Just wondering... Thanks for the help, guys.
This is the Personal Edition running on XP Pro in a point-of-sale environment, with six other computers connecting to the SQL Server on the single back-of-house computer, which has 4 MB of memory.
I told them that the minimum should be set to zero and the max to half the machine's memory...
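As an aside, "dynamic" and a ceiling are not mutually exclusive: SQL Server manages memory dynamically between whatever min and max are set. A sketch of the min-zero / max-half arrangement described above (2048 MB is purely illustrative; pick half of the actual RAM on the box):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 0;     -- no fixed floor
EXEC sp_configure 'max server memory (MB)', 2048;  -- ceiling, e.g. half the machine
RECONFIGURE;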
View 4 Replies
Oct 13, 2015
We have a SQL Server 2014 Enterprise server with two installed instances.
The first instance hosts the SharePoint 2013 DBs and the second instance the Dynamics AX 2012 R3 DBs.
The server has 32 GB of RAM; currently the first instance's process is using 26 GB and the second instance 700 MB.
This means the SharePoint DBs in the first instance are consuming most of the RAM.
My question: is there any way to assign a fixed amount of RAM to a specific instance without affecting the other instance?
For example, 16 GB for the first instance and 16 GB for the second.
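There is no cross-instance governor, but each instance honours its own max server memory setting, so a fixed split can be approximated by capping the instances separately. A sketch, run once in each instance (16384 MB matches the 16 GB example above; in practice you would leave headroom for the OS):
-- Connect to the SharePoint instance and run:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16384;
RECONFIGURE;
-- Then connect to the Dynamics AX instance and repeat with its own limit.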
View 6 Replies
Jan 5, 2006
Hello,
I have a few questions about the behavior of the CLR host within SQL Server 2005. We are using a UDT (call it MyDateTime) created in C# that represents the COM FILETIME type, in order to have single-millisecond resolution. MyDateTime values are stored in the database as binary(8), with the UDT itself being used primarily for display and reporting purposes. I am running performance tests using a prototype (written in C# as well) that runs 20 threads which repeatedly call a stored procedure that accepts two MyDateTime values and queries a table based on their binary string representations. After a certain amount of time (depending on the particular system's resources), threads start to be aborted. Most of the time the reason is "SQL Exception: .NET Framework execution was aborted by escalation policy because of out of memory." Sometimes the appdomain will eventually be unloaded, and if I restart the prototype, the process starts over. Sometimes I have to restart the server before any more CLR processing can occur (no automatic appdomain unload). While the prototype is running, I check the MEMORYCLERK_SQLCLR rows in sys.dm_os_memory_clerks and see the columns for pages and virtual memory ever increasing, until a threshold is hit (on my system, approximately 225 MB of virtual memory committed), resulting in all 20 threads being aborted, one by one, within 30-45 seconds. During that time some of the remaining threads still make successful calls, while others are aborted.
I understand the necessity for the CLR in SQL Server to monitor and abort threads in order to protect the database server itself, as well as the importance of exception handling client-side, but unless the UDT code itself has a leak (I'm fairly confident it doesn't), this behavior confuses me. Only so many stored procedure calls (on my system, approximately 65,000 within an hour) can occur before SQL Server runs out of memory and constantly aborts any thread trying to use the UDT, until it decides to unload the appdomain? Is it entirely up to the client to catch any thread aborts and retry those transactions, and is there any way to facilitate or predict if/when the appdomain is going to unload? Am I missing something about how garbage collection functions within SQL Server, or the CLR itself? Even simple CLR code slowly eats up memory and causes the same results if enough transactions are made. Does a long-running or high-transaction system have to anticipate regular intervention by the escalation policy?
Any insight you could give me would be greatly appreciated. Have a good day.
-Dan
P.S. I'm running the September CTP of SQL Server and the Release Candidate of Visual Studio, based on our current development requirements--I will upgrade when I can.
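For reference, the memory-clerk check described above can be scripted; a rough sketch against the SQL Server 2005 DMV (column names as in that release):
SELECT [type],
       single_pages_kb,               -- pages taken from the buffer pool
       multi_pages_kb,                -- multi-page allocations
       virtual_memory_committed_kb    -- the figure that climbs toward the abort threshold
FROM sys.dm_os_memory_clerks
WHERE [type] = 'MEMORYCLERK_SQLCLR';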
View 6 Replies
Jul 13, 2015
I am looking to test this feature, and the "Transaction Performance Collector" has recommended a table to port to In-Memory OLTP.
I have now tried the "Table Memory Optimization Advisor" tool.
After a couple of tweaks to the table design, the tool now passes validation, but it will not let me progress to the next step.
Could it be down to not having enough memory? But would this not show in the advisor?
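One thing worth double-checking is whether the database already has a MEMORY_OPTIMIZED_DATA filegroup; the migration needs one, and it can be added manually beforehand. Insufficient memory would usually surface as a warning in the advisor's checks rather than a silent stop, though that is only an assumption. A sketch of adding the filegroup (database and path names are made up):
ALTER DATABASE MyDb
    ADD FILEGROUP MyDb_mod CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE MyDb
    ADD FILE (NAME = N'MyDb_mod_dir', FILENAME = N'D:\Data\MyDb_mod_dir')
    TO FILEGROUP MyDb_mod;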
View 4 Replies
Sep 28, 2007
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - I have confirmed that SP1 for VS 2005 is also installed, both locally and on the server.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------
Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
View 4 Replies
Oct 11, 2007
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Standard, SQL Server 2005 Standard. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so. By the end of the day, it's up around 2 GB. How can I track down what's causing this (a starting query is sketched after this list), and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read about it seems to warn of serious degradation of the OS and other programs, in some cases to the point where people had to restore the server's settings before they could bring it back up. What are your thoughts on this?
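For question 3, one way to see what the buffer pool is holding is to count cached pages per database; a sketch (pages are 8 KB, so the result is in MB):
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS buffered_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffered_mb DESC;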
View 3 Replies
Dec 11, 2007
Hi all,
I need to load some tables into memory at startup for performance reasons.
I'm using "select * from <table>", but I have a few questions:
1. How do I pin already-selected data in memory? (DBCC PINTABLE doesn't work in 2005.)
2. How do I get index data into memory? (Is there any documentation on advanced memory management / index data caching?)
3. How do I pin index data in memory? (Otherwise it sounds very bad: table data in fast memory, index data on slow disks. A cache-warming workaround is sketched below.)
Thanks in advance:
Siol En
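Regarding questions 1-3 above: SQL Server 2005 removed DBCC PINTABLE and has no supported way to pin pages, but table and index pages can be pre-warmed at startup simply by reading them, and a generous max server memory setting keeps them cached unless there is memory pressure. A rough sketch (table and index names are hypothetical):
-- Touch the clustered index (use INDEX(0) for a heap)
SELECT COUNT(*) FROM dbo.MyTable WITH (INDEX(1));
-- Touch a specific nonclustered index so its pages are cached too
SELECT COUNT(*) FROM dbo.MyTable WITH (INDEX(IX_MyTable_SomeColumn));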
View 6 Replies
Sep 17, 2007
I've read and noticed that SQL 2005 handles memory differently than 2000. In 2000, if I told a server it had 6 GB to use, it allocated the memory. In 2005 I have one 32-bit server with 6 GB of memory and one 64-bit server with 32 GB. If Target Server Memory is the amount of memory SQL Server would like to have, how does that correspond to Maximum Server Memory? Also, how is Target Server Memory determined?
32-bit
Physical Memory = 8GB
Target Server Memory = 6GB (Willing to consume)
Total Server Memory = 690MB (Currently consuming)
Minimum Server Memory = 2GB
Maximum Server Memory = 6GB
For the 32-bit server the Target Server Memory matches Maximum Server Memory
64-bit
Physical Memory = 32GB
Target Server Memory = 28GB (Willing to consume)
Total Server Memory = 397MB (Currently consuming)
Minimum Server Memory = 4GB
Maximum Server Memory = 30GB
For the 64-bit server the Target Server Memory is less than the Maximum Server Memory.
Lock Pages in Memory is set for the service account. Neither server above has been released to production yet, and only the 32-bit server has any users. In 2000, when SQL Server started I could count on it using about 1.72 GB of memory immediately. Seeing the servers above consume only 690 MB and 397 MB has me concerned. Is this just a case of SQL Server 2005 handling memory better than 2000?
Thanks, Dave
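For reference, both counters can also be read from inside the engine; a sketch using sys.dm_os_performance_counters:
SELECT counter_name,
       cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Memory Manager%'
  AND counter_name LIKE '%Server Memory%';   -- Target and Total Server Memory (KB)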
View 4 Replies
Aug 28, 2015
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how do I reduce its memory usage without killing any process? It's a production server, so there are many background processes running. Also, are there any guides explaining why memory climbs so high and how to reduce it?
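Capping max server memory is the usual answer here; the change takes effect with RECONFIGURE, without restarting the instance or killing sessions, and SQL Server trims its caches gradually. A sketch that leaves several GB for the OS on a 22 GB box (the exact value is a judgment call):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16384;  -- example: 16 GB for SQL Server
RECONFIGURE;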
View 10 Replies
Apr 15, 2008
Hi all, quick question. A while back I developed a website that allowed uploading of photos. At the time, I used ASP with VB code-behind and wrote code to store and retrieve all photos in a SQL database in binary format. I am looking at a new project, very similar, and was wondering if anybody had any idea how this method compares to simply storing the image files on the server and using the database to point to their location. I am wondering how the two methods compare speed-wise and in the hard drive space they consume. Does anybody have any idea on a direct comparison? Any benchmark tests anybody has seen or run?
Thanks all,
Chris
View 2 Replies
Mar 27, 2008
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage so that some memory is left for the OS; this is based on the fact that if max memory is not specified, SQL Server will use whatever memory is available and eventually crash the system.
My question is about a server that has the SSIS and SSAS services installed along with the SQL Server service. Does the max memory setting cover SSIS and SSAS memory usage, or do SSIS and SSAS have to share memory with the OS?
Thanks,
fshguo.
View 1 Replies
Dec 6, 2006
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current appdomain. I can see that one particular assembly count increases on every loop: VBAssembly increases by 6 every time it hits the script component, and the memory climbs along with it. What is this VBAssembly being used for, and is there an update to SQL Server Integration Services that I need?
Thanks! Aaron B.
View 6 Replies
Aug 22, 2007
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. Based on the Idera monitoring software, the OS was allocated 2.3 GB and SQL Server was allocated (and was using all of) 1.6 GB, for a total of approximately 4 GB; all memory was accounted for between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. System info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now that there is more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since this 1 GB limit seems artificial. If it used 1.6 GB before, why would it not use at least that much now?
thank you
View 4 Replies
Oct 4, 2015
I want to create a lot of indexes in my database for performance, but I need to find the memory usage by indexes.
How do I find memory usage by index in SQL Server?
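Indexes are not given a memory quota directly, but you can see how much of each index is currently sitting in the buffer pool; a sketch for the current database (the hobt_id join covers in-row and row-overflow data; LOB allocation units would join on partition_id instead):
SELECT OBJECT_NAME(p.[object_id]) AS table_name,
       i.[name]                   AS index_name,
       COUNT(*) * 8 / 1024        AS buffered_mb    -- 8 KB pages
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au ON au.allocation_unit_id = bd.allocation_unit_id
JOIN sys.partitions       AS p  ON p.hobt_id = au.container_id
JOIN sys.indexes          AS i  ON i.[object_id] = p.[object_id] AND i.index_id = p.index_id
WHERE bd.database_id = DB_ID()
GROUP BY p.[object_id], i.[name]
ORDER BY buffered_mb DESC;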
View 9 Replies
Jun 15, 2015
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had in it, but when I try to remove the filegroup I receive an error.
View 12 Replies
Feb 19, 2008
Hi,
I am getting an error while connecting to the Database Engine in SQL Server Management Studio.
We applied SP2 and restarted the server, but no luck.
Error message: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (mscorlib)
View 3 Replies
Feb 10, 2015
So I started a new job recently and have noticed a few strange configurations. Typically I would never touch the min memory per query or index create memory options, because I just haven't seen any need to; my usual thought is "if it isn't broke...". Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
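To compare those servers against the defaults, the current configured and running values can be listed like this (the default is 1024 for min memory per query and 0, i.e. self-configuring, for index create memory):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)';   -- default 1024
EXEC sp_configure 'index create memory (KB)';    -- default 0 (self-configuring)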
View 3 Replies
Mar 20, 2008
Hello friends, can anybody explain what the question itself means, and how to resolve it?
View 2 Replies
Aug 2, 2006
Hi
I did some load testing and made the following observations:
1. The Memory: Pages/sec counter was crossing the limit, going beyond 20.
2. Target Server Memory was always greater than Total Server Memory.
From the above data it looks like memory pressure, but I found that Available Memory was always above 200 MB and the Buffer Cache Hit Ratio was close to 99.99. What could be the reason for this behavior?
View 1 Replies
Feb 15, 2007
Hi, I have three questions about memory-based bulk copy.
1. Is there a limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)?
For example, how many rows can be inserted in the sample below (i.e., what is the maximum value of nCount)?
for(i=0 ; i<nCount ; i++)
{
pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}
2. In the above code sample, is there a way to insert a prepared array at once directly (the whole BulkData array, rather than a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
// Identify the target table by name.
m_TableID.uName.pwszName = m_wszTableName;
m_TableID.eKind = DBKIND_NAME;
// Request ROWS_PER_BATCH and TABLOCK through the fast-load options property.
DBPROP rgProps[1];
DBPROPSET PropSet[1];
rgProps[0].dwOptions = DBPROPOPTIONS_REQUIRED;
rgProps[0].colid = DB_NULLID;
rgProps[0].vValue.vt = VT_BSTR;
rgProps[0].dwPropertyID = SSPROP_FASTLOADOPTIONS;
rgProps[0].vValue.bstrVal = SysAllocString(L"ROWS_PER_BATCH = 10000,TABLOCK"); // release with VariantClear when done
PropSet[0].rgProperties = rgProps;
PropSet[0].cProperties = 1;
PropSet[0].guidPropertySet = DBPROPSET_SQLSERVERROWSET;
if (m_pIOpenRowset)
{
    // Open an IRowsetFastLoad rowset on the target table with the properties above.
    if (FAILED(m_pIOpenRowset->OpenRowset(NULL, &m_TableID, NULL, IID_IRowsetFastLoad,
                                          1, PropSet, (LPUNKNOWN*)&m_pIRowsetFastLoad)))
    {
        return FALSE;
    }
}
else
{
    return FALSE;
}
View 6 Replies
Nov 13, 2006
If I install SQL 2005 Standard on Windows 2003 Standard, is SQL limited to 4 gigs of physical RAM?
I'm planning a new system that will run SQL 2005 Standard edition on a Windows 2003 Standard platform. The spec calls for 8 GB of RAM. My experience would lead me to suspect I need to install Windows 2003 Enterprise to take advantage of all the installed memory.
View 3 Replies
Feb 9, 2005
We have a box with dual Intel Xeon 3.4GHz processors, and 8GB RAM, OS and Data on separate RAID5 arrays. It runs:
MS Win2k3 Enterprise for the OS,
MS SQLServer 2k Enterprise for DB,
MS SQL Analysis Services for OLAP,
MS SQL Reporting Services for reports distrib,
MS IIS for web hosting,
SPSS OLAPHub for OLAP web UI, and other small stuff...
My question is this: how should the memory be configured? Should we use the OS's /3GB switch, or are there enough apps running here that the OS needs 2 GB to track them? What SQL Server switches would you run? I'm not sure that I want to lock more than 3-4 GB of SQL data pages into memory and starve the other pigs... How could I set this up to keep 2 GB for the OS and yet really use the remaining 6 GB most effectively?
All input is sincerely appreciated.
Robert
View 7 Replies
Mar 26, 2001
I had the drives switched out on my server this weekend and the memory is running at a consistent 100%.
OS: NT Server 4.0 SP6
SQL Server 7.0 SP1
Drives:
From: 60gb To: 210gb
Memory: 1gb (will be increasing to 2gb)
Pagefile: 3gb
Can anyone tell me why, or what would be causing all the memory to be used?
View 1 Replies
Aug 3, 2000
I have been tracking SQL's memory usage over the past few months with Perfmon. I have noticed a trend that, little by little, "% Committed Bytes in Use" creeps up, starting at 8% and eventually climbing as high as 35%. I stop and start the SQL services and the total comes back down to 8%. This happens over a week-and-a-half to two-week period. Is this normal SQL activity?
Thanks for your help!
DOUG
View 1 Replies
Dec 14, 2000
I am going to purchase a new server to run the CATALOG database for Eastman's OpenImage software. Our current server runs SQL 6.5, but it cannot be upgraded to SQL 7.0 (a known bug with this particular HP server), so I am going to buy a new one. My question is where to find documentation recommending how much memory to purchase for this server. I can find scads of documents detailing how much RAM to give SQL Server given a fixed amount of RAM on your server, but nowhere can I find how much RAM to get when buying a new server.
Any tips?
Thanks,
Joe
View 1 Replies
Nov 15, 2000
Hi, I have a Dell server with 4 Xeon processors and 4 GB of RAM, running SQL 7 Enterprise Edition SP2 on Windows 2000 Advanced Server (a dedicated SQL Server box). I configured SQL to dynamically configure memory. I've been monitoring memory usage with Task Manager, and the most SQL will use is up to 2 GB, even under very heavy loads. Why? Everything I read about SQL 7 Enterprise Edition tells me it should be able to take advantage of up to 8 GB of memory, and what I read about Windows 2000 Advanced Server says it supports PAE (Intel's Physical Address Extension). Why isn't it using the other 2 GB, or at least another 1 GB? If it's not able to use it, then my company's investment in the extra 2 GB of RAM is wasted! Please help!
View 2 Replies
May 19, 2000
For a 2 GB machine (running SQL Server only), how much should be allocated to SQL and NT? The old chart and formula Microsoft had went up to 512 MB and was more of a guess.
View 3 Replies
Apr 24, 2003
If I have SQL Server 2000 Enterprise Edition running on Windows 2000 Server, and the server has 4 GB of memory, am I correct that I can set SQL Server to use up to 3 GB of memory without using AWE? I thought I could use the "4GB RAM: /3GB (AWE support is not used)" setting in the boot.ini file.
View 2 Replies
Sep 2, 1999
For a dedicated NT machine for SQL Server with 2.3 GB of RAM, how much memory should I allot to SQL Server?
Should I configure the sql server with 2.3GB-64MB = 1.66 GB of memory?
Thanks.
vic
View 1 Replies
Aug 24, 1999
What would be a good memory setting for SQL Server running by itself on an NT machine with 512 MB of physical memory? Currently the memory parameter is set at 437563, and users are getting a lot of timeout errors when they run their VB app.
View 2 Replies
Nov 12, 1998
If we have a SQL Server with 2 GB of RAM and a database that's only 1.5 GB in size (indexes included), will the entire database be loaded into RAM, making for lightning performance?
Lastly, is the answer different if you're using SQL 7.0 instead of SQL 6.5?
Kevin
View 1 Replies