I need to move each database off of a physical instance and provide the memory and CPU specs each database will need to run on its own VM.
If I have 10 databases on one instance (SQL Server 2005 Standard) and need the memory used by one database, I run the following query...
DECLARE @total_buffer INT;
SELECT @total_buffer = cntr_value
FROM sys.dm_os_performance_counters
WHERE RTRIM([object_name]) LIKE '%Buffer Manager'
AND counter_name = 'Total Pages';
[Code] ....
(Found here [URL] ....)
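For context, the elided remainder of that query, as commonly published, computes each database's share of the buffer pool from sys.dm_os_buffer_descriptors. A sketch of that approach, run in the same batch as the snippet above (not necessarily the exact code behind the [Code] placeholder):

SELECT DB_NAME(database_id) AS database_name,
       COUNT_BIG(*) AS db_buffer_pages,
       COUNT_BIG(*) / 128 AS db_buffer_MB,        -- 8 KB pages, so 128 pages per MB
       CONVERT(DECIMAL(6,3), COUNT_BIG(*) * 100.0 / @total_buffer) AS db_buffer_percent
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY db_buffer_MB DESC;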
Is db_buffer_percent the percentage of the max server memory set for the instance? In other words:
(db_buffer_percent of DB) * (max server memory) = amount of memory needed for that DB
I need to convert my tables to memory-optimized, but every time I try I get an error message saying I need to enable the database for memory-optimized data first.
So, I tried to change it using this code:
ALTER DATABASE Coralreef_ ADD FILEGROUP Coralreef__mod CONTAINS MEMORY_OPTIMIZED_DATA;
When I ran this I got another error:
Msg 10797, Level 15, State 2, Line 1: Only one MEMORY_OPTIMIZED_DATA filegroup is allowed per database.
So, how can I change the database to support memory-optimized tables?
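That 10797 error means the database already has a MEMORY_OPTIMIZED_DATA filegroup, so the ADD FILEGROUP step is already done. What is often still missing is a container (file) inside that filegroup. A minimal sketch, with a hypothetical file name and path; adjust both for your server:

ALTER DATABASE Coralreef_
ADD FILE (NAME = 'Coralreef__mod_file', FILENAME = 'C:\SQLData\Coralreef__mod')  -- hypothetical path
TO FILEGROUP Coralreef__mod;

-- Once the container exists, CREATE TABLE ... WITH (MEMORY_OPTIMIZED = ON) should succeed.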
Does anyone know if there is a rough guide to how much memory SQL Server will normally use relative to the size of the actual databases it is hosting?
For example, I'm working on a server that hosts only about 250-300 MB worth of data (including all of the full-text indexes, etc.), but the sqlservr.exe process is using 1.25 GB of RAM. Also, Page Reads/sec is around 1200, despite there being over 5 GB of RAM available and unused (the page file is about 3.25 GB; total memory in the machine is 8 GB). Below that, w3wp.exe uses only 290 MB and lsass.exe uses only about 225 MB.
The machine is a web/database server that is hosting ASP.NET 2.0, ASP.NET & Classic ASP pages (and the associated databases). For the amount of hardware, the server seems to be responding to requests on the slow side. While some of this is probably due to the legacy code base, there seems to be a noticeable difference in speed after SQL Server is restarted.
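As background: by default SQL Server caches as much data as the OS will give it, regardless of database size, and releases memory only under pressure, so a working set far above the data size is normal rather than a leak. If you want to bound it, the cap is set per instance; a minimal sketch, where the 1024 MB value is purely illustrative:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1024;  -- illustrative cap; size it to leave headroom for IIS and the OS
RECONFIGURE;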
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on the server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle the SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so. By the end of the day, it's up around 2 GB. How can I track down what's causing this, and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on this seems to give a warning about serious OS and other program support degradation, in some cases to the point where they have to restore the settings on the server before they can bring it back up. What are your thoughts on this?
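On question 2, a common pattern on a 4 GB box is to enable AWE and cap SQL Server explicitly, leaving the remainder for the OS and SSRS. This is a sketch only; the 3072 MB figure is an assumption, and the right split depends on how much SSRS actually uses:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;                -- takes effect after a restart; requires Lock Pages in Memory
EXEC sp_configure 'min server memory (MB)', 1024;  -- assumed floor
EXEC sp_configure 'max server memory (MB)', 3072;  -- assumed cap, leaving roughly 1 GB for the OS and SSRS
RECONFIGURE;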
I have seen some odd behaviors when writing a SQL Server Compact Edition database file to an SD memory card or other user-removable storage media.
The following steps reproduce these behaviors:
1. Set your database location to the SD memory card.
2. Create a database and establish a session.
3. Write the first record into the database; this succeeds.
4. Remove the SD card manually from the system.
5. Try to write a second record into the database.
6. You get an error from the "SqlCeWriteRecordProps" API call - this is fine.
7. Put the SD card back into the system.
8. Write the second record into the database again; it succeeds - this is fine.
So far the behavior is as expected. After that:
9. Remove the SD card manually from the system again.
10. Try to write a third record into the database.
11. You would expect an error from the "SqlCeWriteRecordProps" API call, but you don't get one - I don't know why.
12. Then try to write a fourth record into the database.
13. You get no error from the "SqlCeWriteRecordProps" API call, and the return status is success.
14. Now put the SD card back into the system.
15. Then write a fifth record; this time it writes the third, fourth, and fifth records into the database.
Note: I clear (i.e., free) my record structure after every call to "SqlCeWriteRecordProps", so I don't know why it is writing the third, fourth, and fifth records into the database at that point.
Hi all, I was just wondering what the normal memory consumption is when simply making a connection to a database. I have an application that starts, and when it connects to the database it eats up around 1.5 MB. Yes, I keep the connection open for the life of the application (it lives in a kind of global class, and when database reads/writes are performed I just get the open connection object); I don't run any queries against the database here, just connect.
Is this a normal amount of memory to consume on a simple connection to a database?
Hi, I want to manage relational data in memory (insert, delete, return) and manipulate the data via SQL Server 2005 and CLR database objects. The goal is to manage a table that has relatively many inserts and deletes, which takes a lot of IO operations, and I want to reduce that. The data itself is not permanent; each row is deleted from the table after 15-20 minutes. I looked at extended stored procedures and other CLR-enabled objects, and it seems that they all return data from the database itself. How can I return data that resides in memory? Thanks.
I have a Windows Server 2012 box with SQL Server 2012 Enterprise. RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how can I reduce the memory usage without killing any process, given that it's a production server and there are many background processes running? And is there any guide to learning why memory rises so high and how to reduce it?
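One relevant detail: max server memory can be lowered online, with no restart and no processes killed; SQL Server trims its buffer pool gradually to meet the new cap. A sketch, where the 16384 MB figure is a hypothetical cap for a 22 GB box:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16384;  -- hypothetical cap; leave headroom for the OS and other services
RECONFIGURE;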
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage so that some memory is left for the OS. This is based on the fact that if max memory is not specified, SQL Server will use whatever memory is available and can eventually starve the system.
My question is: when a server has the SSIS and SSAS services installed along with the database engine, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components, and I have added some code to list the assemblies in the current AppDomain. I see that one particular assembly count is increasing on every loop: VBAssembly increases by 6 every time execution hits the script component, and along with it the memory climbs. What is this VBAssembly being used for? Is there an update to SQL Server Integration Services that I need?
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. Windows was allocated 2.3 GB and SQL Server was allocated (and was using all of) 1.6 GB, for a total of approximately 4 GB based on Idera monitoring software; all memory was accounted for between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The 'System' info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now, with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems this 1 GB limit is artificial. If it used 1.6 GB before, why would it not use at least that much now?
I've a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had in it, but when I try to remove the filegroup I receive an error.
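For reference, the sequence people usually attempt is below, with hypothetical database, file, and filegroup names. Note that Books Online states that once a MEMORY_OPTIMIZED_DATA filegroup has been added, it can only be removed by dropping the database; the usual way out is to recreate the database without it and migrate the data across.

ALTER DATABASE MyDb REMOVE FILE MyDb_mod_file;   -- removing the container may also be blocked
ALTER DATABASE MyDb REMOVE FILEGROUP MyDb_mod;   -- this is the statement that raises the error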
I have a very small SSAS database, around 35 MB. I opened it in 32-bit Excel and started dragging fields to a pivot table, and it started failing with memory errors. The behavior on the SSAS server was that memory grew very fast up to 8 GB (the VM's total memory), and then the error was reported in Excel.
What might be the issue in such a small database? I would understand in a big database, but not on this one.
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query and index create memory options, because I just haven't seen any need to. My typical thought is that if it isn't broke... Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
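To audit how far those servers deviate from stock, the current run values can be compared against the defaults (1024 KB for min memory per query; 0, meaning self-configuring, for index create memory). A quick sketch:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)';   -- default run value: 1024
EXEC sp_configure 'index create memory (KB)';    -- default run value: 0 (self-configuring)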
I did load testing and found the following observations:
1. Memory: Pages/sec was crossing the commonly cited threshold of 20.
2. The Target Server Memory was always greater than Total Server Memory
Seeing the above data, it seems to be memory pressure. But I found that Available Memory was always above 200 MB, and Buffer Cache Hit Ratio was close to 99.99. What could be the reason for the above behavior?
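For what it's worth, the two server-memory counters can also be read straight from the DMV; Target Server Memory above Total Server Memory just means the instance wants more memory than it has acquired so far, which can occur during normal ramp-up as well as under genuine pressure. A sketch:

SELECT counter_name, cntr_value / 1024 AS size_MB
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');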
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., what is the max value of nCount)?

for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}

2. In the above code sample, is there a method for inserting a prepared array all at once (the BulkData array directly, rather than a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?

BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?

// CoCreateInstance(...);
// Data source
// Create session
If I install SQL 2005 Standard on Windows 2003 Standard, is SQL limited to 4 gigs of physical RAM?
I'm planning a new system that will run SQL 2005 Standard edition on a Windows 2003 Standard platform. The spec calls for 8 GB of RAM. My experience would lead me to suspect I need to install Windows 2003 Enterprise to take advantage of all the installed memory.
We have a box with dual Intel Xeon 3.4GHz processors, and 8GB RAM, OS and Data on separate RAID5 arrays. It runs: MS Win2k3 Enterprise for the OS, MS SQLServer 2k Enterprise for DB, MS SQL Analysis Services for OLAP, MS SQL Reporting Services for reports distrib, MS IIS for web hosting, SPSS OLAPHub for OLAP web UI, and other small stuff...
My question is this: how should the memory be configured? Should we use the OS's /3GB switch, or are there enough apps running here that the OS needs 2 GB to track them? What SQL Server switches would you run? I'm not sure that I want to lock more than 3-4 GB of SQL data pages into memory and starve the other pigs... How could I set this up to keep 2 GB for the OS and yet really use the remaining 6 GB most effectively?
I have been tracking SQL Server's memory usage over the past few months with Perfmon. I have noticed a trend that, little by little, "% Committed Bytes in Use" creeps up, starting at 8% and eventually climbing as high as 35%. When I stop and start the SQL services, the total comes back down to 8%. This happens over a week and a half to two weeks. Is this normal SQL Server activity?
I am going to purchase a new server to run Eastman's OpenImage software's CATALOG database. Our current server runs SQL 6.5, but it cannot be upgraded to SQL 7.0 (a known bug with this particular HP server), so I am going to buy a new one. My question is where to find documentation that recommends how much memory to purchase for this server. I can find scads of documents detailing how much RAM to give SQL given a fixed amount of RAM on your server, but nowhere can I find how much RAM to get when buying a new server.
Hi, I have a Dell server with 4 Xeon processors and 4 GB RAM, running SQL 7 Enterprise Edition SP2 on Win2k Advanced Server (a dedicated SQL server). I configured SQL to dynamically configure memory. I've been monitoring the memory usage using Task Manager, and the most SQL will use is up to 2 GB, even under very heavy loads. Why? Everything I read on SQL 7 Enterprise Edition tells me that it should be able to take advantage of up to 8 GB of memory. Also, the reading I did on Win2k Advanced Server says it supports PAE (Intel's Physical Address Extension). Why isn't it using the other 2 GB, or at least another 1 GB? If it's not able to use it, then my company's investment in the other 2 GB of RAM is wasted! Please help!
For a 2 GB machine (running SQL only), how much should be allocated to SQL and NT? The old chart and formula Microsoft had went up to 512 and was more of a guess.
If I have SQL Server 2000 Enterprise Edition running on Win2000 Server, and the server has 4 GB of memory, am I correct that I can set SQL Server to use up to 3 GB of memory, but not use AWE? I thought I could use the setting "4GB RAM: /3GB (AWE support is not used)" in the boot.ini file.
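For reference, the /3GB switch goes on the OS load line in boot.ini; a sketch, where the ARC path and description are machine-specific:

multi(0)disk(0)rdisk(0)partition(1)\WINNT="Microsoft Windows 2000 Server" /fastdetect /3GB

One caveat worth checking: on Windows 2000 the /3GB switch is honored by Advanced Server and Datacenter Server; on the standard Server edition a 32-bit process is still limited to 2 GB of user-mode address space.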