Any Idea How To Solve SQL Express High Memory Consumption Problem? Thanks
Nov 6, 2007
My SQL Express instance will use 1 GB or more of memory and never release it.
I had the same problem on SQL 2005 Standard; I solved it by adding /3GB to boot.ini and turning on AWE.
But it seems SQL Express doesn't support AWE, so what can I do here?
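Since SQL Express cannot use AWE, the usual alternative is simply to cap the buffer pool so the process stops growing. A minimal sketch, assuming the default instance and that a 512 MB cap suits the workload (pick whatever value leaves enough room for the rest of the machine):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Value is in MB; this limits the buffer pool, which is the bulk of SQL Server's memory use.
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;

Note that this caps the buffer pool only; the sqlservr.exe working set can still sit somewhat above the configured value.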
Hi, the sqlservr.exe process has been consuming a high number of CPU cycles for a long time, and the usage does not seem to be decreasing at all. Can somebody point me in the right direction as to how to identify the bottleneck and resolve it as soon as possible?
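One way to see where the CPU is going, assuming this is SQL Server 2005 or later (these DMVs do not exist on earlier versions), is to list the most expensive statements in the plan cache:

-- Top 10 statements by total CPU consumed since they were cached (total_worker_time is in microseconds).
SELECT TOP 10
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
                   ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;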
I've just faced quite an interesting case. I have two SQL Server 7.0 instances running on two W2K (SP1) Server boxes (Compaq ProLiant 1600 and HP NetServer LC3), both with RAID 5. One of the tasks these servers perform is logging MS Proxy. For convenience, I created a procedure that shows me proxy users, their IP addresses, and the URL head they hit within the last minute and a half. It is as follows:
-----
CREATE proc usp_showusers as
select clientip, clientusername,
       substring(stuff(uri, 1, 7, ''), 1, patindex('%/%', stuff(uri, 1, 7, ''))) short_uri
into #whoweb
from msplog
where logtime > getdate() - 0.001

select clientip, clientusername, short_uri
from #whoweb
group by clientip, clientusername, short_uri
-----
So, when I run this proc on the HP box it executes fine and SQL Server keeps consuming the same amount of memory (approximately 60 MB). When I run it on the Compaq, the proc also executes fine, but memory consumption promptly climbs to 150 MB (before execution it was about 50-60 MB). Within half an hour it comes back down by 90 MB. In addition to SQL Server, the boxes run Lotus Domino R5 and Active Directory. The installed software on them is absolutely identical.
I have a few in-house developed applications (VB based) that access the SQL Server for adding, appending, and creating tables. The applications make their changes through queries generated dynamically at the application level.
My MS SQL Server runs on a PIII / 256 MB RAM / 18 GB HDD.
The problem is that the memory utilization of SQL Server keeps growing constantly. Out of the 512 MB available (256 MB physical + 256 MB virtual), memory utilization reaches a level of 490 MB and stays constant, even though SQL Server itself only shows a utilization of 150 MB.
I suspect that SQL Server is not releasing memory back to the system. Please help me resolve this; the problem may also lie in the applications we developed.
I have a small MSSQL server: total database file size is less than 7 GB, with 2 GB of RAM and 2 CPUs installed. For some reason, when I check Task Manager, the process memory usage is over 1.7 GB. The SQL Server memory setting is on Dynamic, not fixed, and the maximum is 2 GB, which I believe is the default. Anyway, my question is: is over 1.7 GB of memory usage too high? I don't have a lot of transactions going on and CPU usage is very low. I'm wondering whether this is normal or not, and if it's not normal, what causes the memory usage to be so high and how I can adjust it back to normal. Can anyone help me out, or offer any suggestions? Thanks.
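Using 1.7 GB out of 2 GB is not unusual: with dynamic memory, SQL Server keeps data pages cached until the OS or another process asks for the RAM back. A quick way to compare what it currently holds against what it is aiming for (this DMV assumes SQL 2005 or later; on SQL 2000 the same counters are visible in Performance Monitor or master.dbo.sysperfinfo):

-- 'Total' is what SQL Server holds now; 'Target' is what it would like to hold.
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Memory Manager%'
  AND counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)');

If you want the process to stay smaller, capping 'max server memory' as in the first post is the supported way to do it.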
I have a 2003 server with SQL 2005 on it, and sqlservr.exe is using 880 MB of memory and will climb to 1.4 GB. If I reboot the server it goes back to 100 MB and slowly climbs back up. Any ideas? I am not a SQL guy.
Hi, we have a prod server running SQL Server 2000 64-bit. It is a 4-CPU server with 16 GB of RAM, and we have a max memory setting of 15.5 GB for SQL Server. In spite of 15.5 GB being available to SQL Server, it still uses paging file space, a lot. Looking through Task Manager we can see SQL Server using 15.5 GB of memory and 22 GB of virtual memory. I don't understand why it should even be using close to 7 GB of paging space when it has so much memory. How does SQL Server use virtual memory vs. physical memory? Has anyone seen this before? Thanks, GG
We have Windows Server 2008 R2 installed on a VM, and on that we have three SQL instances running. For the past few months we have observed physical memory usage going high. Earlier it was at 86-88%; now it is at 96-97%. We have 16 GB of RAM and 8 CPU cores on the VM. What is the best and ideal configuration so that we can rectify the high physical memory issue?
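With three instances sharing one 16 GB VM, a common approach is to give each instance an explicit memory cap so that the sum of the maximums leaves headroom for Windows. A rough sketch to run separately on each instance; the 4 GB figure below is purely illustrative and should be sized to each instance's actual workload:

-- Run on each instance with a value appropriate to that instance.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 1024;
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;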
How can I keep my thread alive after an out-of-memory exception? That is, I understand that sometimes a server may be unable to satisfy a memory request, but I'm okay with waiting -- I'm not okay with being terminated (think of the reaction to Oliver asking for some more). I would think that, in general, when any application makes a request for a resource that is currently unavailable, but may be available at another time, that application (process/thread/fiber) would be put in a wait queue for that resource. On a high-concurrency system, this could obviously lead to deadlocks; however, I think in the situation I describe below, the killing is overkill.
Discussion & Background:
In my project, I have a SqlFunction, we'll call "SqlDecimal BigFunction()" that will allocate a large chunk of memory (~3MB) and can take anywhere from 20ms to 500ms to complete (on my system, assuming no other processor load). There are also Functions that are used to set control points for BigFunction (implying thread/fiber state -- or, if there is a distinction, Transaction state), which we will call "SqlBoolean SetControlPoint(SqlInt32 x, SqlInt32 y)". The 3MB requirement is constant, regardless of the number of control points. (Incidentally, the actual implementations of these functions are in a referenced assembly)
In low to moderate concurrency (a single hyperthreaded CPU with 20 sessions banging it in a loop), it *usually* does okay. In a higher-concurrency situation (2 hyperthreaded CPUs with 10 sessions stressing this code and 10 other sessions doing regular T-SQL SELECTs) it runs for a long time, but will occasionally throw an out-of-memory exception. (Previously, I was managing my thread state manually with a locked dictionary, an Int32 key, and CreateSession/ReleaseSession calls. When an out-of-memory exception is thrown while the dictionary is locked, I get an AppDomain unload, which is *completely* unacceptable.)
So, I know that sometimes I won't be able to allocate my 3MB (it could be 3KB; it just shows up more readily with a larger allocation request). That doesn't mean my externalAssembly is "misbehaving" or "off-in-the-weeds". It just means the server is loaded right now and can't satisfy my request. One may catch an OutOfMemoryException (perhaps to add additional info about the point of failure), but the thread is already being aborted.
I tried modifying this implementation to use a buffer pool that is allocated on start-up. That worked pretty well (it reduced % Time in GC a bit, too), but it forced my external assembly to be marked as UNSAFE rather than just EXTERNAL_ACCESS because of the synchronization methods used to manage the buffer pool. It also doesn't scale, at least not as it sits; it's just a fixed-size buffer pool. With more processors and less peripheral loading, the extra processors would just be waiting for a buffer. Besides that, I thought there was some escalation policy about "waiting too long", but I may be wrong.
I would like to eliminate the UNSAFE attribute from the primary assembly -- mainly because it "sounds scary", but more realistically, because it is unsafe! Or at least, experience in the field points to synchronization issues being a primary cause of unreliability in systems. Also, the C# lock statement, Mutex, Monitor, etc. call into native code to use the OS for locking. When this happens, SQL Server doesn't really know what you're waiting for and can't take that info into account when scheduling; all it knows is that you're waiting on an OS lock. I thought the hosting API would have allowed the host to optionally implement its own locking primitives, especially a host that runs its own scheduler.
I've looked into constrained execution regions and Chris Brumme's blog entry on hosting. Using them would help ensure some protection, but I think even they do not protect a thread from being unloaded in the face of an OutOfMemoryException (or any asynchronous exception); rather, they allow you to safely clean up unmanaged references and ensure state integrity for the appdomain.
At any rate, this is getting a little long winded. If anyone has any feedback, I'd be delighted to hear it.
Thank you.
-Troy
System Info:
SELECT @@version
Microsoft SQL Server 2005 - 9.00.2047.00 (Intel X86) Apr 14 2006 01:12:25 Copyright (c) 1988-2005 Microsoft Corporation Developer Edition on Windows NT 5.1 (Build 2600: Service Pack 2)
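For what it's worth, when chasing this kind of allocation failure it can help to watch how much memory the in-process CLR host is actually holding. A small sketch for the build above (column names are as they exist in SQL 2005; later versions consolidated them into pages_kb):

-- Memory charged to the SQLCLR memory clerk, in KB.
SELECT type, SUM(single_pages_kb + multi_pages_kb + virtual_memory_committed_kb) AS committed_kb
FROM sys.dm_os_memory_clerks
WHERE type = 'MEMORYCLERK_SQLCLR'
GROUP BY type;

-- AppDomains currently loaded by SQLCLR and their state.
SELECT appdomain_name, state, total_allocated_memory_kb, survived_memory_kb
FROM sys.dm_clr_appdomains;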
We have a system (32 GB RAM, 2 TB hard disk, Windows 7, SQL Server 2008 R2 Enterprise 64-bit). Whenever I run some query on the database (even one whose result is only 50 records), memory utilization in Task Manager is very high (30 GB). How can I control this usage? The memory setting is the default in server properties (min 0 and max 2147483647).
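Most of that 30 GB is the buffer pool caching data pages, which SQL Server will hold onto until the OS or another process needs the RAM; lowering 'max server memory' from the default (as in the earlier posts) is the supported way to rein it in. To see what the cache is actually holding per database, a sketch assuming SQL 2005 or later:

-- Buffer pool usage by database (8 KB pages converted to MB).
SELECT CASE database_id WHEN 32767 THEN 'ResourceDb' ELSE DB_NAME(database_id) END AS database_name,
       COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;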
My database server's memory utilization has been growing fast over the past week. It stayed around 55% for a week, and now it is up to 70% and increasing.
Total OS memory is 32 GB and I have capped SQL Server memory at 29 GB. I don't know what to do.
I have been having issues with our SQL Server for a while now. It seems to run out of memory every few days, and when I look at the memory dump, the MEMORYCLERK_SQLOPTIMIZER clerk seems to take over memory and eventually cause the server to crash.
Here is the SQL version we are using: Microsoft SQL Server 2012 (SP1) - 11.0.3460.0 (X64) Jul 22 2014 15:22:00 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor). It is on a VM on a Windows 2012 server. It has 20 GB of RAM allocated to it and Max Server Memory is set to 16.5 GB.
I have seen MEMORYCLERK_SQLOPTIMIZER grow to about 11 GB at the time of the server crash. Why is that happening? What is causing memoryclerk_sqloptimizer to get so high? I have looked it up and it seems to have to do with ad hoc requests, but is there something I can do to bring that memory down when it gets so high, so that I can prevent a server crash? Do we just need to add more memory, or is there a memory leak somewhere?
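Two things that may help, sketched below on the assumption that the growth really is driven by ad hoc compilation: track the clerk over time so you can see it climbing before a crash, and enable 'optimize for ad hoc workloads' so single-use plans are stubbed instead of fully cached.

-- Watch the optimizer memory clerk over time (SQL 2012 exposes pages_kb).
SELECT type, name, pages_kb / 1024 AS used_mb
FROM sys.dm_os_memory_clerks
WHERE type = 'MEMORYCLERK_SQLOPTIMIZER';

-- Reduce plan-cache bloat from single-use ad hoc statements.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;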
I know I should probably be posting this in the RS section, but I have a Win 2008 R2 server with RS 2012, SSIS, and the database engine installed. I also have a SQL 2005 instance with SQL 2005 SSIS running on the server.
I saw that Reporting Services was consuming 9 GB of RAM a few days ago with no published reports. It's just a default install.
So I investigated the settings and the document: [URL]......
I added this section to the reportserver.config file to restrict memory usage
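The snippet itself isn't shown in the post; for reference, the memory-related elements that rsreportserver.config supports (RS 2008 and later) look roughly like the following. WorkingSetMaximum and WorkingSetMinimum are specified in kilobytes and are not present in the file by default; the values here are purely illustrative:

<MemorySafetyMargin>80</MemorySafetyMargin>
<MemoryThreshold>90</MemoryThreshold>
<WorkingSetMaximum>2097152</WorkingSetMaximum>   <!-- 2 GB cap, illustrative -->
<WorkingSetMinimum>524288</WorkingSetMinimum>    <!-- 512 MB, illustrative -->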
The OS is Windows Server 2012 64-bit edition and SQL Server is 2012 64-bit edition.
RAM installed on both servers is around 65 GB, of which 49 GB is the max server memory allocated to the SQL services on both servers.
The databases related to Reporting Services are also in the Always On availability group.
We have also configured Reporting Services, and both instances are running on their respective servers.
The issue is that on the primary server Reporting Services is using almost 7 GB, while on the secondary it is using 10 GB, even though there are only 5 reports and they are only used within our offices.
What could the issue be, and how do I check why SSRS is using so much memory?
Are there any queries or PerfMon counters I can use?
Reports are used sporadically on the client side.
I have checked memory utilization through Task Manager.
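One place to start, assuming the default ReportServer catalog database name, is the execution log view in the ReportServer database, which shows which reports ran, who ran them, and how much time and data each one consumed; on the OS side, the 'MSRS 2012 Web Service' / 'MSRS 2012 Windows Service' PerfMon objects expose the Report Server's own memory counters.

-- Recent report executions (the ExecutionLog3 view exists in SSRS 2008 R2 and later).
SELECT TOP 50 ItemPath, UserName, TimeStart, TimeEnd,
       TimeDataRetrieval, TimeProcessing, TimeRendering, ByteCount, [RowCount]
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;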
This is within SQL Express 2005 running on Windows Server 2003, using ASP.NET as the interface.
This is the situation: at first I received the following error.
"Failed to generate a user instance of SQL due to a failure in starting the process for the user instance. The connection will be closed."
I looked up this error and it seems to be a flaw/bug in SQL Server Express; the error occurs because of the use of remote login on the Windows 2003 server.
The solution to this was to delete the SQL Server directory that resides at C:\Documents and Settings\USERNAME\Local Settings\Application Data\Microsoft\Microsoft SQL Server Data\SQLEXPRESS.
This indeed solved the "failed to generate a user instance" error once the server was restarted. However, I now get the following response within the ASP.NET pages (which will have been built into those pages):
Your login attempt was not successful. Please try again.
Now, I have checked the simple things (Caps Lock, etc.) and tried to log on to the SQL pages directly on the server, with exactly the same result. The system was built by someone else and has not been modified "intentionally". Just recently I added a few extra databases to the same server location; could this have caused the problem? To my knowledge I haven't changed any settings (at least, I can't recall doing so).
There are one or two extenuating circumstances: just recently the whole file directory was transferred from one HDD to another and then back to its original position. I don't think this altered any data, but possibly there is some built-in security or something. I would most sincerely appreciate any questions or answers about this; if there are things I haven't mentioned that you need to know, just let me know and I will get back to you.
Hi guys, [VS2005, .NET 2.0, SQL Server Express] I could really use some help with this one - as in desperately! I have a couple of database projects which run fine in Debug mode from VS2005, but I can't get them to run outside of it. The problem is that, for some reason, the login to the database file isn't succeeding. I'm getting the following errors:
.NET Project: System.Data.SqlClient.SqlException: Failed to generate a user instance of SQL Server due to a failure in starting the process for the user instance. The connection will be closed.

DotNetNuke 4.0.3 Project: ERROR: Could not connect to database specified in connectionString for SqlDataProvider. (I sometimes get a "The Server Version does not match the Database Version" error.)

Here are my connection strings for the above two scenarios:

<connectionStrings> <add name="EpgDbConnection" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=C:\Projects\Epg\Epg.mdf;Integrated Security=True;Connect Timeout=30;User Instance=True" providerName="System.Data.SqlClient" /></connectionStrings>

<connectionStrings> <add name="SiteSqlServer" connectionString="Data Source=.\SQLExpress;Integrated Security=True;User Instance=True;AttachDBFilename=|DataDirectory|Database.mdf;" providerName="System.Data.SqlClient" /></connectionStrings>

My assumption is that I need to switch from Integrated Security (is that Windows security?) to SQL Server security, but I'm battling to do that effectively. A pointer to a simple step-by-step guide would be fantastic, but, if there isn't such a thing available, I have a number of specific questions:

Question 1: Why does one need to switch from Integrated Security to SQL Server security? If I'm right, I suspect that my connection strings will become:

<connectionStrings> <add name="SiteSqlServer" connectionString="Data Source=(local);user id=gary;password=password;Connect Timeout=30" /> and <connectionStrings> <add name="EpgDbConnection" connectionString="Data Source=(local);user id=gary;password=password;Connect Timeout=30" /> respectively.

Question 2: Are these strings correct? (I receive a login failed error.)

Question 3: How do I go about changing the security setting, using either VS2005 or SQL Server? (I've changed the setting at the SQL Server Express level from Windows Authentication mode to SQL Server and Windows Authentication mode in SSMS, but the authentication method can't seem to be changed at the database file level. 'View connection string' (under database properties) reveals an unalterable "Windows Authentication" authentication method.)

Question 4: I've managed to add a user and password at the SQL Server Express level. What roles should the user have?

I'd be very grateful for any help you could give me. I've tried everything I can find or think of and am not getting anywhere! Thanks very much. Regards, Gary
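For questions 3 and 4: the switch to mixed mode happens at the instance level (as you've already done in SSMS); after that it is a matter of creating a SQL login and mapping it into the database. A minimal sketch, reusing the 'gary' name from the post; db_owner is the simplest role for a development database, while db_datareader/db_datawriter are narrower choices for production:

-- A weak password like this may be rejected by local password policy; it is only shown to match the post.
CREATE LOGIN gary WITH PASSWORD = 'password';
GO
USE Epg;   -- or whatever the attached database is called
CREATE USER gary FOR LOGIN gary;
EXEC sp_addrolemember 'db_owner', 'gary';
GO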
Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure database mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance, but not to High Protection, despite both of them being synchronous.
If not, what are the recommended steps to reconfigure the mirror once it has already been set up? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop mirroring and remove the witness, will the High Protection option become available?
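For what it's worth, High Protection is just synchronous mirroring (SAFETY FULL) without a witness, so it can normally be reached with ALTER DATABASE rather than tearing the session down. A sketch run on the principal (the database name is a placeholder):

-- Remove the witness, so automatic failover (High Availability) no longer applies.
ALTER DATABASE YourDb SET WITNESS OFF;
-- Make sure the session is synchronous; with no witness this is High Protection mode.
ALTER DATABASE YourDb SET PARTNER SAFETY FULL;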
I realise this is a stupid question, but I cannot really find any confirmation of this in BOL.
If you are running High Safety with automatic failover, when a failover occurs does the mode automatically change to High Performance? Since for failover to occur something must have happened to the principal, it would be impossible to commit transactions on the new principal and mirror synchronously, since one of the partners is no longer available.
So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?
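One way to see what actually happens after a failover is to check the mirroring metadata directly on either partner (SQL 2005 and later):

-- Operating mode, safety level, and witness status for every mirrored database.
SELECT DB_NAME(database_id) AS database_name,
       mirroring_role_desc, mirroring_state_desc,
       mirroring_safety_level_desc, mirroring_witness_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;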
We are developing some applications for Sunit computers (in-vehicle computers) which have limited memory. Because the applications display maps, they consume a lot of memory. Because of this, we want to restrict the amount of memory and other resources that SQL Server Express uses to a minimum. I've tried setting 'max server memory', which doesn't help much, as this only limits the SQL engine and not the sqlservr.exe process as a whole. What other settings and restrictions can I apply to limit memory consumption by the process?
We have considered using SQL Server Everywhere, but as it doesn't support ODBC (will there be any ODBC support?), we are not too happy about using it. As of now, we have no experience with OLE DB, and the application is MFC 6.0, so we can't take advantage of ADO.NET.
I have a problem here. I developed a desktop application and I am using SQL 2005 Express as my application database.
I found that after I close my application, SQL 2005 Express does not release its memory. How do I release the memory resources other than restarting the SQL service?
Do you guys have any better recommendation for another database that would be more suitable for a desktop application?
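One option for a single-user desktop database, assuming only your application connects to it, is to mark it AUTO_CLOSE so SQL Express shuts the database down and releases its resources when the last connection closes; note that the sqlservr.exe process itself will still hold some memory unless 'max server memory' is also capped (as in the other posts) or the service is stopped.

-- The database name is a placeholder for your application's database.
ALTER DATABASE MyAppDb SET AUTO_CLOSE ON;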
I am aware that SQL Express has a memory limitation of 1GB. I am trying to determine if this 1GB memory is shared with the Report Server. In other words, if I run a query that requires 600MB of memory, and a report that pushes the Report Server to 600MB of memory, will I exceed the 1GB limit for SQL Express? Or are these two separate processes for memory limitation purposes?
If they are separate processes, are there any limitations to the Report Server as far as performance when using the Express edition of SSRS? If not, would it not make sense to push the Report Server harder than the database when running reports if SQL Express is my target platform? (e.g., do MORE work in the RS, and less in the database itself)
SQL Express Edition is limited to one CPU; what happens if the CPU is a dual-core CPU? Does SQL Express care? Does it see a dual core as a single CPU?
As for memory - stupid question - SQL Express is limited to 1 GB of RAM. If I run it on a server with, say, 4 GB of RAM packed into it, will that affect SQL Express's performance? Will SQL Express even run if there is more than 1 GB available? Or will SQL Express simply ignore the extra RAM with no negative consequences?
I am new here and new to SQL Express. I've searched for my issue, but can't quite find anything close to the problem or how to solve it, if it's even solvable. I am using SQL Express on a PC to connect to the back end of a database. The front-end application (an Access runtime) also runs on the same PC. This PC is on a domain. I think I've tried every combination of protocols, and although connectivity via ODBC is successful, the application can't connect - it gives the "server doesn't exist or access denied" error. When I log on to this computer with the machine logon (not the domain), with SQL Express configured to use shared memory, the application runs just fine. I need to use this database for testing in a non-production environment, but I really hate to log off the domain to run it. Ideas?
I'm hoping that someone might have an answer for me after much net searching.
I have a server (200 GB disk space, dual 3.8 GHz processors, 4 GB memory) that hosts 6 very small SharePoint sites (WSS 2.0), with SQL 2005 Express handling 1 config DB and 7 content DBs. The biggest DB at the moment is only 300 MB and the sites are not actively being used yet; they are only open to a select number of users (+-25 in total) who are using them as reference "areas" at the moment. Each site is running in its own application pool as well.
I find that the sqlservr.exe process increases in memory usage and does not seem to decrease. It gets to the point of 960 MB usage and then the databases cannot be used (SQL 2005 Express's max memory is 1 GB). SQL seems to "close" them down, and the site error being received is "site is not in configuration database". When the SQL service is restarted the memory usage idles around 70-80 MB and the sites are 100% again, but after an hour or two the memory usage is sitting at 560 MB again and doesn't seem to decrease.
Is there a way I can bring this memory usage down?
Hi there, I've been following Vineet's and David's procedures for consuming web services using SQL CLR to a T. I created my web service in C#.NET 2005 and generated my proxy using this command:
I added both files to the project, set Generate Serialization Assembly to On, and compiled it. I then generated a strong name key for the assembly and signed my assembly with that key. Inside my post-build event I added the following script:
"E:DevelopmentMicrosoft Visual Studio 8SDKv2.0Binsgen.exe" /force /compiler:/keyfile:SmsServiceKey.snk /t:StoredProcedures $(TargetDir)$(TargetName).dll
This compiled my assembly plus the XmlSerializers assembly, and added the strong name key to both.
In SQL Server 2005, I enabled CLR, made my DB TRUSTWORTHY, created my first assembly with permission set EXTERNAL_ACCESS, and then the XmlSerializer assembly with permission set SAFE. I created my stored procedure and ran it. When I did, I got this error, which I assumed the XmlSerializer assembly was supposed to solve for me:
System.InvalidOperationException: Cannot load dynamically generated serialization assembly. In some hosting environments assembly load functionality is restricted, consider using pre-generated serializer. Please see inner exception for more information. ---> System.IO.FileLoadException: LoadFrom(), LoadFile(), Load(byte[]) and LoadModule() have been disabled by the host.
I have seen a lot of posts about this error, but none of them has been able to solve my problem.
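For reference, the registration steps described above normally look something like the sketch below (assembly names and paths are placeholders, not the poster's actual ones). A common cause of this particular error is a *.XmlSerializers.dll that no longer matches the main assembly, so it has to be regenerated with sgen and re-created in the database after every rebuild of the proxy assembly.

-- Enable CLR and mark the database trustworthy (needed for EXTERNAL_ACCESS assemblies).
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
ALTER DATABASE MyDb SET TRUSTWORTHY ON;

-- The proxy assembly needs EXTERNAL_ACCESS to call out to the web service;
-- the pre-generated serializer assembly can stay SAFE.
CREATE ASSEMBLY SmsService
FROM 'C:\Deploy\SmsService.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;

CREATE ASSEMBLY [SmsService.XmlSerializers]
FROM 'C:\Deploy\SmsService.XmlSerializers.dll'
WITH PERMISSION_SET = SAFE;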
We've been using SQL Server 2005 for some time now and we've found it to be stable and pretty much bug-free - well done! (As opposed to Oracle 10g, which is bug-ridden.) I'm not sure if this is the right forum to post this in, but it felt like the best place to start. Please move my post if it is not correct.
In the last few weeks we've had a very serious problem: whenever we try to execute our upgrade scripts, the CPU usage spikes to 100%, and then every insert takes more than 0.5 seconds. Needless to say (but I'll say it anyway), this is a huge problem. Some of our scripts have more than 100,000 inserts, so the upgrade process takes "forever".
At this point it might be useful to talk briefly about how we use our SQL Server. We have an application that is configured via the database (we support SQL Server and Oracle), so we use the database for a lot more than just a "dumb" datastore. We do massive amounts of inserts/updates for every version of our application and we add new tables quite often. We have a build system, using TFSBuild, where we do nightly builds. That build drops the current database (example names: MONDAY, TUESDAY, etc.), restores it again from an image, and then upgrades the new database to the most current version of our scripts. On our SQL Server 2005 server we have 4 such databases, so the process is repeated 4 times each night. And since we build our databases from a baseline image and scripts (both stored in source control), we do not need backups of the databases, so no backup plan is running. I'm only mentioning that because we've had a theory that this might be the cause of our problems...
So, does anybody have an idea how we might go about solving this?
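One thing worth trying, sketched under the assumption that the upgrade scripts run each INSERT as its own auto-committed statement: wrapping the inserts in explicit transactions of a few thousand rows removes a transaction-log flush per statement and often cuts both CPU and elapsed time sharply. (Checking for stale statistics or missing indexes in the restored image, and for data/log file autogrow during the run, would be the next things to look at.)

-- Instead of 100,000 individually committed INSERTs, commit in batches.
SET NOCOUNT ON;
BEGIN TRANSACTION;
    INSERT INTO dbo.ConfigTable (Col1, Col2) VALUES ('a', 1);   -- placeholder table and rows
    INSERT INTO dbo.ConfigTable (Col1, Col2) VALUES ('b', 2);
    -- ... a few thousand inserts per batch ...
COMMIT TRANSACTION;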
Hi, I've been taking a look at how to consume events from a package when executing it programmatically.
I've got some code (copied below) that creates a package programmatically, adds a sequence container, then within that adds a script task, and then executes it using the overload of Package.Execute() that takes an IDTSEvents argument.
My class that implements IDTSEvents simply outputs a message to the console for each event type.
The weird thing is, when I execute it, this is the only output I get:
What I find weird is that I don't get information for loads of other event types. I would at least have expected to see some OnPostExecute events.
Anyone know why I don't see all of the events?
Thanks Jamie
Here's the code:
using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Tasks.ScriptTask;

namespace Package_API
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Starting...");

            Package p = new Package();
            p.InteractiveMode = true;
            p.OfflineMode = true;

            // Add a Script Task to the package.
            Sequence s = (Sequence)p.Executables.Add("STOCK:Sequence");
            TaskHost taskH = (TaskHost)s.Executables.Add("STOCK:ScriptTask");

            // Run the package.
            DtsEvents events = new DtsEvents();
            p.Execute(null, null, events, null, null);
            //p.Execute();

            if (p.ExecutionResult == DTSExecResult.Failure || p.ExecutionStatus == DTSExecStatus.Abend)
                Console.WriteLine("Package failed or abended");
            else
                Console.WriteLine("Package ran successfully");

            Console.ReadLine();
        }
    }
}
I have a 2 GHz CPU with 1 GB of RAM. I occasionally see very slow (long-running) queries against a local SQL Server 2005 Express (SP2) database. The issue occurs across different SQL queries, but all of them are rather basic SELECT statements. Perfmon shows that the SQL Server "Memory Grant Queue Wait, Avg ms" counter gets extremely high (25,000+ ms). Perfmon also shows that paging is not occurring, and the system is not under unusual stress. The problem is not reproducible with MSDE.
Has anyone seen this issue, or have any recommendations for a next course of action?
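Those waits mean queries are queuing for workspace memory. One thing to check while a slow query is running, assuming SQL 2005 or later, is which sessions are waiting for (or holding) memory grants:

-- Sessions still waiting for their memory grant show grant_time = NULL.
SELECT session_id, requested_memory_kb, granted_memory_kb, grant_time, wait_time_ms, queue_id
FROM sys.dm_exec_query_memory_grants
ORDER BY wait_time_ms DESC;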
I have a stored procedure using a cursor which sorts data and creates subsets based on matching oid and decision_date columns; within each subset I am trying to order the records and assign the values 1, 2, ... to them. The stored procedure seems to work very well and fast against small tables (during tests), but when used against a table with 200,000 records it takes more than 24 hours... I am looking for help to make it work faster - thanks, guys. Here is the stored procedure (a set-based alternative is sketched after it):
CREATE PROCEDURE ORDERING_TEST_PROCEDURE AS
DROP TABLE REVIEWS_TEST_TABLE
SELECT OID,DECISION_DATE,DECISION_ID,VOTES_REQUIRED, ORDERING INTO REVIEWS_TEST_TABLE FROM DECISION_FLAGS WHERE VOTES_REQUIRED IN ('1','2','3') and FINAL_DECISION_CODE NOT IN ('0800','0810','0840') and DECISION_TYPE_CODE < '0100' ORDER BY OID, DECISION_DATE,VOTES_REQUIRED DESC
CREATE INDEX OID_DECISIONID_DATE_INDEX ON CRIMS.dbo.REVIEWS_TEST_TABLE (DECISION_ID, DECISION_DATE, OID)
CREATE INDEX VOTESREQUIRED_INDEX ON CRIMS.dbo.REVIEWS_TEST_TABLE (VOTES_REQUIRED)
CREATE INDEX DECISIONID_INDEX ON CRIMS.dbo.REVIEWS_TEST_TABLE (DECISION_ID)
-- Variable declarations were not included in the original post; the types below are inferred from usage.
declare @oldoid varchar(50), @olddecision_date varchar(50), @oid varchar(50), @decision_date varchar(50),
        @votes_required varchar(10), @Decision_id varchar(50), @ordering varchar(10), @ordering_count int

set @oldoid = 'space'
set @olddecision_date = 'space'
set @oid = 'space'
set @decision_date = 'space'
set @votes_required = 'space'
set @Decision_id = 'space'
set @ordering = '0'
set @ordering_count = 0

declare review_test_cursor cursor for
    select oid, decision_date, votes_required, ordering, decision_id
    from CRIMS.dbo.reviews_test_table
    order by oid, decision_date, votes_required asc

open review_test_cursor
fetch review_test_cursor into @oid, @decision_date, @votes_required, @ordering, @Decision_id

while (@@fetch_status = 0)
begin
    if @oldoid <> @oid or @olddecision_date <> @decision_date
    begin
        set @oldoid = @oid
        set @olddecision_date = @decision_date
        set @ordering_count = 0
    end

    update reviews_test_table
    set ordering = CAST((@ordering_count + 1) as VARCHAR)
    where decision_id = @Decision_id

    set @ordering_count = @ordering_count + 1

    fetch review_test_cursor into @oid, @decision_date, @votes_required, @ordering, @Decision_id
end

close review_test_cursor
deallocate review_test_cursor
/*********************************/ UPDATE DECISION_FLAGS SET ORDERING = '0'
UPDATE DECISION_FLAGS SET DECISION_FLAGS.ORDERING = TEM.ORDERING FROM DECISION_FLAGS DEC, REVIEWS_TEST_TABLE TEM
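For what it's worth, on SQL Server 2005 or later the whole cursor loop can usually be replaced by a single set-based UPDATE using ROW_NUMBER(). The partitioning and ordering below are inferred from the cursor logic (the counter resets whenever oid or decision_date changes, and rows are numbered in votes_required order), so please verify that they match the intended business rule before relying on it:

-- Set-based equivalent of the cursor: number the rows within each (oid, decision_date) subset.
;WITH ordered AS
(
    SELECT decision_id,
           ROW_NUMBER() OVER (PARTITION BY oid, decision_date
                              ORDER BY votes_required ASC) AS rn
    FROM CRIMS.dbo.reviews_test_table
)
UPDATE t
SET    t.ordering = CAST(o.rn AS VARCHAR)
FROM   CRIMS.dbo.reviews_test_table AS t
JOIN   ordered AS o
  ON   o.decision_id = t.decision_id;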