I know I can run DBCC MEMORYSTATUS to view how much memory SQL Server is using on a 64-bit server, but how do I view how much memory the other processes are consuming?
I do not believe the Perfmon counters are accurate when using AWE or 64-bit memory, at least in terms of memory consumption per process, so how do we see this information? I have a feeling it is not possible.
Here is the problem. We have a 64-bit server with 32GB of memory. SQL Server is configured to use no more than 20GB. It is currently using only 15GB. The OS is reporting 28.3GB used out of 32GB, which means some other processes are consuming 28.3GB - 15GB = 13.3GB. How can I determine what is using the 13.3GB of memory?
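As a starting point, the SQL Server side of the 15GB figure can be confirmed from inside the instance; a minimal sketch, assuming SQL Server 2008 or later where sys.dm_os_process_memory exists (on older builds DBCC MEMORYSTATUS is the fallback):

-- Sketch: SQL Server's own physical memory use, locked pages and utilization (SQL 2008+)
SELECT physical_memory_in_use_kb / 1024  AS physical_memory_mb,
       locked_page_allocations_kb / 1024 AS locked_pages_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;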
The problem I have is that our MSDE SQL Server 2000 (without SP3) memory consumption is constantly increasing. Even though I limited 'max server memory' with sp_configure and RECONFIGURE, it is growing above that limit.
When I shut down the application, SQL Server does not free the memory. I have already checked the database file size (transaction logs) and made sure that autoshrink is enabled.
Does anybody have an idea why SQL Server does not release the memory?
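For reference, the kind of statement described above looks like this; a minimal sketch where the 256 MB cap is only an illustrative value:

-- Sketch: capping MSDE / SQL Server 2000 memory via sp_configure (value illustrative)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 256;
RECONFIGURE;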
Hi to all, on our project we are facing this problem: our users using Report Builder are making reports containing several MB of data (sometimes even 1GB!). Now it happens that when 3-4 users run a report like that on the server, the w3wp.exe process reaches up to 10GB of allocated memory and then the whole Reporting Services instance stops working. We want to limit the number of rows that can be extracted from the SQL Server database using Report Builder. In the report model reference it seems there is no such option (something like "max rowcount"). Is it possible to set this limit on the user that connects to SQL Server (something like: whatever query this user runs, set @@rowcount to 10000)? Thanks. Alberto.
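For context, the statement the question alludes to is the session-level SET ROWCOUNT; a sketch only, since whether it can be enforced automatically per login is exactly the open question (table name is hypothetical):

-- Sketch: session-level row cap; 10000 is the limit mentioned in the post
SET ROWCOUNT 10000;
SELECT * FROM dbo.SomeLargeTable;  -- hypothetical table
SET ROWCOUNT 0;  -- remove the cap again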
We have a SQL Server 2000 that has been working nicely without any issues. Lately we noticed that the amount of memory it is using has increased, and once it took down the web server as the total amount of memory used was 2GB. Due to this I have set Memory Max to 500MB. Now as I look in Task Manager the memory usage is at 530,396K, which is 518MB. Any reason why it would exceed the 500MB?
What we did before was to stop the SQL Server and restart it, and it takes about 2 days until it gets back to +500MB.
I've been working on several packages for the past hour, and after finishing for the night I quickly wanted to check how much memory SQL Server was consuming on my laptop. It was using almost 500MB of memory. It typically hovers around 50-100MB when I'm not doing anything with it. Is this normal?
I have two database servers; call them DB_A and DB_B. I want to establish a linked server to DB_A on DB_B. There is a script on DB_B that queries data from DB_A, but it only executes periodically. Does a linked server consume much memory or CPU when it is not in use?
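For reference, a linked server of this kind would typically be set up with something like the following; a sketch with illustrative names, not taken from the post:

-- Sketch: define DB_A as a linked server on DB_B (names illustrative)
EXEC sp_addlinkedserver
     @server = N'DB_A',
     @srvproduct = N'',
     @provider = N'SQLNCLI',
     @datasrc = N'DB_A';
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'DB_A',
     @useself = N'TRUE';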
When running the standard Memory Consumption report, in the section Buffer Pages Distribution (# Pages) I get the following error:
Unable to retrieve data for this section of the report. Following error occurred. Msg 8114, Level 16, State 1 Error converting data type bigint to int.
I have a web service which I am using in the Web Service Task. The XML response is fed to the XML Source, and I want to write the data from this XML to a flat file. I am using the attached schema definition (.XSD) for the XML. When I use this as the XSD in the XML Source task, in the column definition I see only three columns:
ZoneCode ZoneName AggregatedCZ_Id
My question is: what is AggregatedCZ_Id, and why do I not see the PostCode column? Any help is appreciated.
I got a task to get resource consumption (CPU, storage, etc.) for stored procedures in a certain database. I found a nice script ([dbo].[usp_Worst_TSQL]), but it looks like the script was written for SQL 2005 and we use SQL 2012. How can I get resource consumption details?
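One possible way to pull similar numbers straight from the DMVs available in SQL Server 2012, as a sketch (CPU, reads and elapsed time per cached procedure; it does not cover storage):

-- Sketch: per-procedure resource consumption from cached plan stats (SQL 2008 and later)
SELECT OBJECT_NAME(ps.object_id, ps.database_id) AS procedure_name,
       ps.execution_count,
       ps.total_worker_time   AS total_cpu_microseconds,
       ps.total_logical_reads,
       ps.total_elapsed_time
FROM sys.dm_exec_procedure_stats AS ps
WHERE ps.database_id = DB_ID()
ORDER BY ps.total_worker_time DESC;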
We are planning a shared SQL Server 2005 environment where users can create databases/applications for themselves and/or their departments. With the consideration that there can be multiple SQL Server instances on a box, can each instance limit a user's disk space? Thanks for any enlightenment.
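A hedged sketch of one lever that exists inside an instance: capping each user database's file sizes with MAXSIZE when the database is created (names and sizes are illustrative):

-- Sketch: per-database size cap via MAXSIZE (names/sizes illustrative)
CREATE DATABASE UserDeptDB
ON PRIMARY ( NAME = UserDeptDB_data,
             FILENAME = 'D:\SQLData\UserDeptDB.mdf',
             SIZE = 100MB, MAXSIZE = 500MB, FILEGROWTH = 50MB )
LOG ON     ( NAME = UserDeptDB_log,
             FILENAME = 'D:\SQLData\UserDeptDB.ldf',
             SIZE = 20MB, MAXSIZE = 100MB, FILEGROWTH = 10MB );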
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on the server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------
Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad core box with 4GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the Max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 214748364; even if I don't enable AWE, should this default value be changed? (See the sketch after this list.)
3. It seems that even at idle the SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600MB or so. By the end of the day, it's up around 2GB. How can I track down what's causing this, and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on this seems to give a warning about serious OS and other program support degradation, in some cases to the point where they have to restore the settings on the server before they can bring it back up. What are your thoughts on this?
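For reference on question 2, the statements involved look like this; a sketch only, where the 3072 MB value is purely illustrative for a 4 GB box and not a recommendation:

-- Sketch: enabling AWE and capping max server memory (values illustrative)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;
-- 'awe enabled' only takes effect after the SQL Server service is restarted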
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. RAM size is 22GB. Sometimes SQL Server takes 95% of memory. My question: how can I reduce its memory usage without killing any process, because it is a production server and there are many background processes running? Also, is there any guide to learn why memory rises so high and how to reduce it?
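Before lowering anything, one way to see where the memory inside SQL Server is actually going is the memory clerk DMV; a sketch against the SQL Server 2012 column names:

-- Sketch: top memory clerks by size (pages_kb is the SQL 2012 column name)
SELECT TOP (10)
       type,
       SUM(pages_kb) / 1024 AS memory_mb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY memory_mb DESC;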
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server's memory usage so that some memory is left for the OS; this is based on the fact that if max memory is not specified, SQL will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL Server service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I can see that one particular assembly grows on every loop: VBAssembly increases by 6 every time it hits the script component, and along with it the memory climbs. What is this VBAssembly being used for, and is there an update to SQL Server Integration Services that I need?
SQL Server 2000 is running on Windows Server 2003 with 4GB of memory on the server. The OS was allocated 2.3GB and SQL Server was allocated (and was using all of it) 1.6GB, for a total of approximately 4GB based on the Idera monitoring software, i.e. all memory was allocated between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8GB. Now the Idera monitor shows 1.7GB for the OS and 1.0GB for SQL Server. System info shows 8GB of memory with PAE, so I assume the full 8GB can now be addressed. Why are fewer resources being used now with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems this 1GB limit is artificial. If it used 1.6GB before, why would it not use at least that much now?
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had on it, but when I try to remove the filegroup I receive an error.
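For reference, the statements involved in the attempt look roughly like this (container and filegroup names are placeholders; note that depending on the version, SQL Server may refuse to drop a memory-optimized filegroup even after its tables and container are gone):

-- Sketch: removing the memory-optimized container file, then the filegroup (names illustrative)
ALTER DATABASE MyDb REMOVE FILE MyDb_inmem_container;
ALTER DATABASE MyDb REMOVE FILEGROUP MyDb_inmem_fg;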
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query option or the index create memory option because I just haven't seen any need to. My typical thought is that if it isn't broke... They have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
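For context, a minimal sketch of how these two options could be inspected and set back to their documented defaults (1024 KB for min memory per query, 0 meaning self-configuring for index create memory):

-- Sketch: inspect and reset the two options mentioned above
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)';     -- default is 1024
EXEC sp_configure 'index create memory (KB)';      -- default is 0 (self-configuring)
EXEC sp_configure 'index create memory (KB)', 0;
RECONFIGURE;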
I did load testing and found the following observations:
1. The Memory: Pages/sec counter was crossing the limit, going beyond 20.
2. The Target Server Memory was always greater than Total Server Memory
Seeing the above data, it seems to be memory pressure. But I found that Available Memory was always above 200 MB, and Buffer Cache Hit Ratio was close to 99.99. What could be the reason for this behavior?
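For reference, the two counters from observation 2 can also be read from inside SQL Server; a minimal sketch:

-- Sketch: compare Target vs Total Server Memory from the counter DMV
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');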
I am looking to create a constraint on a table that allows multiple nulls but all non-nulls must be unique. I found the following script http://www.windowsitpro.com/Files/0.../Listing_01.txt that works fine, but the following line: CREATE UNIQUE CLUSTERED INDEX idx1 ON v_multinulls(a) appears to use indexed views. I have run this on a version of SQL Standard edition and this line works fine. I was of the understanding that you could only create indexed views on SQL Enterprise Edition?
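For reference, the approach in that listing is an indexed view over just the non-null rows; a minimal sketch of the idea, where the table is illustrative but the view, index and column names come from the quoted line:

-- Sketch: enforce uniqueness of non-null values via an indexed view
CREATE TABLE dbo.t (a INT NULL, b INT NOT NULL);
GO
CREATE VIEW dbo.v_multinulls
WITH SCHEMABINDING
AS
    SELECT a
    FROM dbo.t
    WHERE a IS NOT NULL;
GO
CREATE UNIQUE CLUSTERED INDEX idx1 ON dbo.v_multinulls(a);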
Write a CREATE VIEW statement that defines a view named Invoice Basic that returns three columns: VendorName, InvoiceNumber, and InvoiceTotal. Then, write a SELECT statement that returns all of the columns in the view, sorted by VendorName, where the first letter of the vendor name is N, O, or P.
This is what I have so far,
CREATE VIEW InvoiceBasic AS
SELECT VendorName, InvoiceNumber, InvoiceTotal
FROM Vendors
JOIN Invoices ON Vendors.VendorID = Invoices.VendorID
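A possible sketch of the second half of the exercise, the SELECT against the view (names taken from the statement above):

SELECT VendorName, InvoiceNumber, InvoiceTotal
FROM InvoiceBasic
WHERE LEFT(VendorName, 1) IN ('N', 'O', 'P')
ORDER BY VendorName;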
I created a query, which makes use of a temp table, and I need the results to be displayed in a View. Unfortunately, Views do not support temp tables, as far as I know, so I put my code in a stored procedure, with the hope I could call it from a View....
I compared a view's query plan with the query plan I get if I run the same statement from the view definition, and the results are different. The view plan is more expensive and runs longer. The view contains 4 inner joins, and statistics are updated for all tables. Any ideas?
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e. the maximum value of nCount)? for (i = 0; i < nCount; i++) { pIFastLoad->InsertRow(hAccessor, (void*)&BulkData); }
2. In the above code sample, is there a method for inserting a prepared array at once directly (the BulkData array, without the for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy option below? BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
------------------------------------------------------- My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
I had given one of our developers CREATE VIEW permissions, but he also wants to modify views that are not owned by him; they are owned by dbo.
I ran a Profiler trace and determined that when he tries to modify a view using the query designer in SQL EM, or right-clicks the view in SQL EM and goes to properties, it performs an ALTER VIEW. It does the same for dbo in a trace (an ALTER VIEW). With both methods he gets a call failed and a permission error saying that he doesn't have CREATE VIEW permissions and that the object is owned by dbo.
If it is doing an ALTER VIEW, how can I set permissions for that, and why does it give a CREATE VIEW error when it's really doing an ALTER VIEW? Very confusing.
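For what it's worth, on SQL Server 2000 the usual way to let a non-owner run ALTER VIEW on dbo-owned views is membership in a database role rather than an object-level grant; a hedged sketch (the user name is illustrative):

-- Sketch: db_ddladmin members can run DDL such as ALTER VIEW on any object in the database
EXEC sp_addrolemember 'db_ddladmin', 'developer_user';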