High Concurrency, Memory Allocations, And AppDomains
May 2, 2006
Hello,
Question:
How can I keep my thread alive after an out-of-memory exception? That is, I understand that sometimes a server may be unable to satisfy a memory request, but I'm okay with waiting -- I'm not okay with being terminated (think of the reaction to Oliver asking for some more). I would think that, in general, when any application makes a request for a resource that is currently unavailable, but may be available at another time, that application (process/thread/fiber) would be put in a wait queue for that resource. On a high-concurrency system, this could obviously lead to deadlocks; however, I think in the situation I describe below, the killing is overkill.
Discussion & Background:
In my project, I have a SqlFunction, which we'll call "SqlDecimal BigFunction()", that allocates a large chunk of memory (~3MB) and can take anywhere from 20ms to 500ms to complete (on my system, assuming no other processor load). There are also functions used to set control points for BigFunction (implying thread/fiber state -- or, if there is a distinction, transaction state), which we will call "SqlBoolean SetControlPoint(SqlInt32 x, SqlInt32 y)". The 3MB requirement is constant, regardless of the number of control points. (Incidentally, the actual implementations of these functions are in a referenced assembly.)
In Code:
////////////////////////////////////////////////////////////////////////////////////////////////////
[ThreadStatic] // UNSAFE
static externalAssembly.MyClass myObj;

[SqlFunction]
public static SqlDecimal BigFunction()
{
    if (myObj == null) return -1;
    // DoSomeWork will do something like: byte[] b = new byte[3 * 1024 * 1024];
    return externalAssembly.DoSomeWork(myObj);
}

[SqlFunction]
public static SqlBoolean SetControlPoint(SqlInt32 x, SqlInt32 y)
{
    if (myObj == null) myObj = new externalAssembly.MyClass();
    myObj.SetPoint(x.Value, y.Value);
    return SqlBoolean.True; // because we can't have a 'void' return type
}
////////////////////////////////////////////////////////////////////////////////////////////////////
In low to moderate concurrency (a single hyperthreaded CPU with 20 sessions banging it in a loop), it *usually* does okay. In a higher-concurrency situation (2 hyperthreaded CPUs with 10 sessions stressing this code and 10 other sessions doing regular T-SQL SELECTs), it runs for a long time but will occasionally throw an out-of-memory exception. (Previously, I was managing my thread state manually with a locked dictionary, an Int32 key, and CreateSession/ReleaseSession calls. When an out-of-memory exception is thrown while the dictionary is locked, I get an AppDomain unload, which is *completely* unacceptable.)
So, I know that sometimes I won't be able to allocate my 3MB (it could be 3KB; it just shows up more readily with a larger allocation request). That doesn't mean my external assembly is "misbehaving" or "off-in-the-weeds". It just means the server is loaded right now and can't satisfy my request. One may catch an OutOfMemoryException (perhaps to add additional info about the point of failure), but the thread is already being aborted.
I tried modifying this implementation to use a buffer pool that is allocated on start-up. That worked pretty well (and reduced % Time in GC a bit), but it forced my external assembly to be marked as UNSAFE rather than just EXTERNAL_ACCESS because of the synchronization methods used to manage the buffer pool. It also doesn't scale, at least not as it sits; it's just a fixed-size buffer pool. With more processors and less peripheral loading, the extra processors would just be waiting for a buffer. Besides that, I thought there was some escalation policy about "waiting too long", but I may be wrong.
I would like to eliminate the UNSAFE attribute from the primary assembly -- mainly because it "sounds scary", but more realistically, because it is unsafe! Or at least, experience in the field points to synchronization issues being a primary cause of unreliability in systems. Also, the C# lock statement, Mutex, Monitor, etc., call into native code to use the OS for locking. When this happens, SQL Server doesn't really know what you're waiting for and can't take that information into account when scheduling; all it knows is that you're waiting on an OS lock. I thought the hosting API would have allowed the host to optionally implement its own locking primitives, especially a host that runs its own scheduler.
I've looked into constrained execution regions and Chris Brumme's blog entry on hosting. Using them would help ensure some protection, but I think even they do not protect a thread from being unloaded in the face of an OutOfMemoryException (or any asynchronous exception); rather, they allow you to safely clean up unmanaged references and ensure state integrity for the AppDomain.
At any rate, this is getting a little long winded. If anyone has any feedback, I'd be delighted to hear it.
Thank you.
-Troy
System Info:
SELECT @@version
Microsoft SQL Server 2005 - 9.00.2047.00 (Intel X86) Apr 14 2006 01:12:25 Copyright (c) 1988-2005 Microsoft Corporation Developer Edition on Windows NT 5.1 (Build 2600: Service Pack 2)
View 5 Replies
Sep 27, 2001
I have a few in-house developed applications (VB-based) that access the SQL Server for adding, appending, and creating tables. The applications make their changes through queries generated dynamically at the application level.
My MS SQL Server runs on a PIII / 256 MB RAM / 18 GB HDD.
The problem is that the memory utilization of SQL Server keeps growing constantly. Out of the 512 MB (256 physical + 256 virtual) available, the memory utilization reaches a level of 490 MB and stays constant, though SQL Server itself shows a utilization of 150 MB.
I suspect that SQL Server is not releasing memory back to the system. Please help in resolving this; the problem may also lie in the applications we developed.
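For reference, SQL Server grows its buffer pool on demand and does not normally hand memory back unless the operating system is under pressure, so the usual mitigation is to cap it explicitly. A minimal sketch, assuming SQL Server 7.0/2000 and that roughly 128 MB should be left for the OS and the VB applications (that figure is an assumption, not a recommendation):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- Cap the buffer pool so it stops growing toward the physical limit of the box
EXEC sp_configure 'max server memory (MB)', 128
RECONFIGURE

The cap takes effect without restarting the service, so you can watch whether the applications still behave before committing to a value.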
Jdindian
View 2 Replies
Sep 25, 2006
hi all
I've got a small MSSQL server: total database file size is less than 7GB, with 2GB of RAM and 2 CPUs installed. For some reason, when I check Task Manager, the process memory usage is over 1.7GB. The SQL Server memory setting is on "dynamic", not fixed, and the maximum is 2GB, which I believe is the default. My question: is over 1.7GB of memory usage too high? I don't have a lot of transactions going on, and CPU usage is very low. I'm wondering if this is normal or not, and if it is not normal, what causes the memory usage to be so high and how can I bring it back to normal? Can anyone help me out, or offer any suggestions? Thanks
View 4 Replies
Jan 29, 2008
I have a 2003 server with SQL 2005 on it, and sqlservr.exe is using 880 MB of memory and will climb to 1.4 GB. If I reboot the server, it goes back to 100 MB and slowly climbs back up. Any ideas? I am not a SQL guy.
View 2 Replies
Jul 23, 2005
Hi, we have a production server running SQL Server 2000 64-bit. It is a 4-CPU server with 16GB of RAM. We have a max memory setting of 15.5GB for SQL Server. In spite of 15GB being available for SQL Server, it still uses paging file space, a lot. Looking through Task Manager, we can see SQL Server using 15.5GB of memory and 22GB of virtual memory. I don't understand why it should even be using close to 7GB of paging space when it has so much memory. How does SQL Server use virtual memory vs. physical memory? Has anyone seen this before? Thanks, GG
View 2 Replies
Sep 10, 2015
We have Windows Server 2008 R2 installed on a VM, with three SQL instances running on it. Over the past few months we have observed physical memory usage going high: earlier it was at 86-88%, now it is 96-97%. We have 16GB of RAM and 8 CPU cores on the VM. What is the best configuration so that we can rectify the high physical memory usage?
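For reference, with several instances sharing one box the usual first step is to see how much each instance is really holding, then give each one an explicit max server memory cap so the three caps together leave headroom for Windows (how to split the 16GB is a judgment call, not something the sketch decides for you). A minimal sketch to run in each instance, assuming they are SQL Server 2008 or later where this DMV exists:

-- How much is this instance actually using, and is the OS signalling pressure?
SELECT physical_memory_in_use_kb / 1024 AS sql_memory_mb,
       memory_utilization_percentage,
       process_physical_memory_low
FROM sys.dm_os_process_memory;

Once you know the per-instance numbers, capping each instance with sp_configure 'max server memory (MB)' (as in the earlier post) keeps the total below the VM's physical RAM.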
View 13 Replies
Apr 28, 2015
We have a system (32GB RAM, 2 TB hard disk, Windows 7, SQL Server 2008 R2 Enterprise 64-bit). It looks like whenever I run a query on the database (even one returning only 50 records), memory utilization in Task Manager is very high (30 GB). How can I control this over-usage? The memory setting is the default in server properties (min 0 and max 2147483647).
View 7 Replies
Mar 5, 2014
My database server's memory utilisation has been growing fast over the past week. It stayed around 55% for a week, and now it is at 70% and increasing.
Total OS memory is 32GB, and I have capped SQL Server memory at 29GB. I don't know what to do.
View 9 Replies
Jun 11, 2015
I have been having issues with our SQL Server for a while now. It seems to run out of memory every few days, and when I look at the memory dump, the MEMORYCLERK_SQLOPTIMIZER clerk seems to take over memory and eventually cause the server to crash.
Here is the SQL version we are using: Microsoft SQL Server 2012 (SP1) - 11.0.3460.0 (X64) Jul 22 2014 15:22:00 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor). It is on a VM on a Windows 2012 server. It has 20GB of RAM allocated to it, and Max Server Memory is set to 16.5GB.
I have seen MEMORYCLERK_SQLOPTIMIZER grow to about 11GB at the time of the server crash. Why is that happening? What is causing the memory clerk to get so high? I have looked it up and it seems related to ad hoc requests, but is there something I can do to bring that memory down when it gets so high, so that I can prevent a server crash? Do we just need to add more memory, or is there a memory leak somewhere?
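For reference, two things that are often looked at in this situation (a hedged sketch, not a diagnosis): the memory-clerk DMV shows which clerk is holding memory at any moment, and because MEMORYCLERK_SQLOPTIMIZER growth is frequently tied to a flood of single-use ad hoc compilations, enabling 'optimize for ad hoc workloads' is a common first mitigation.

-- Which clerks hold the most memory right now? (pages_kb exists from SQL Server 2012 on)
SELECT TOP (10) type, SUM(pages_kb) / 1024 AS memory_mb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY memory_mb DESC;

-- Common mitigation for heavy ad hoc compilation; takes effect without a restart
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;

Whether this actually tames the optimizer clerk depends on the workload, so treat it as an experiment rather than a fix.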
View 2 Replies
Aug 25, 2015
I know I should probably be posting this in the RS section, but I have a Windows 2008 R2 server with RS 2012, SSIS, and the database engine installed. I also have a SQL 2005 instance with SQL 2005 SSIS running on the server.
I saw that Reporting Services was consuming 9GB of RAM a few days ago, with no published reports. It's just a default install.
So I investigated the settings and the document: [URL]......
I added this section to the reportserver.config file to restrict memory usage
<WorkingSetMaximum>512000</WorkingSetMaximum>
<WorkingSetMinimum>128000</WorkingSetMinimum>
Does anyone have an example of altering this config file in a standard build of SQL Server Reporting Services?
View 0 Replies
Nov 6, 2007
My SQL Express instance will use up to 1GB or more of memory and never release it.
I had the same problem on SQL 2005 Standard, which I solved by adding /3GB to boot.ini and turning on AWE,
but it seems SQL Express doesn't support AWE, so what can I do here?
thanks
View 6 Replies
Mar 26, 2015
We have two servers configured in AlwaysOn mode.
Windows is Server 2012 64-bit edition and SQL Server is 2012 64-bit edition.
RAM installed on both servers is around 65GB, of which 49GB is the max server memory allocated to SQL services on both servers.
The databases related to Reporting Services are also in the AlwaysOn group.
We have also configured Reporting Services, and it is running on each respective server.
The issue is that on the primary server Reporting Services is using almost 7GB, while on the secondary it is using 10GB, even though there are only 5 reports and they are used only within our offices.
What is the issue, and how do I check why SSRS is using so much memory?
Any query or perfmon counters?
Reports are run randomly on the client side.
I have checked memory utilization through Task Manager.
View 0 Replies
Jun 8, 2015
I've got a problem: in a few months my IDs (int) will reach the maximum positive value. The only viable solution is to convert the ints to bigints.
There is a story behind this, but I believe it's irrelevant. Simply put, I need to 'upgrade' ints to bigints; let's take it as a fact.
The question is: is there a formula to calculate the increase in disk and RAM consumption? If we presume I'd need to keep the same number of pages in the buffer cache, how can I calculate how much more memory I need?
Just to ease calculation, let's suppose that I have a single table with 1e6 rows and 3 columns:
ID (int), PARENT_ID (int) and NAME (NVARCHAR(30)).
The PK is on ID (let's call it PK_ID) and there are 2 indexes, on PARENT_ID (IX_PARENT) and NAME (IX_NAME).
Now, if I turn the ints into bigints and still want to keep the entire table in cache, how can I calculate how much more memory I'd need?
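For reference, a back-of-the-envelope way to reason about it (the numbers below ignore page headers, fill factor, and per-row overhead, so treat them as rough assumptions): every int that becomes a bigint adds 4 bytes per row in each index that carries that column, including the clustering-key copies stored in the nonclustered indexes.

-- Rough arithmetic for the example table (1,000,000 rows):
-- PK_ID (clustered) rows carry ID + PARENT_ID + NAME: both ints widen -> +8 bytes/row
-- IX_PARENT leaf rows carry PARENT_ID + clustering key ID: both widen -> +8 bytes/row
-- IX_NAME leaf rows carry NAME + clustering key ID: only ID widens    -> +4 bytes/row
-- Total extra: 20 bytes/row, i.e. roughly 19 MB of additional disk and buffer cache
SELECT CAST(1000000 AS bigint) * (8 + 8 + 4) / 1024 / 1024 AS extra_mb_estimate;

The same per-row accounting scales to real tables: add up the widened columns per index, multiply by the row count, and add a margin for the overhead the sketch ignores.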
View 5 Replies
Apr 23, 2007
Hi,
Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure Database Mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance, but not High Protection despite both of them being Synchronous.
If not, what are the recommended steps to configure the mirror once it has already been configured? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop the mirroring and remove the witness, will the High Protection option be available?
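For reference, the difference between the two synchronous modes is only the witness: high availability is high-safety mode with a witness, high protection is high-safety mode without one. So the move usually doesn't require tearing the mirror down; a minimal sketch, assuming a database named YourDb (the name is a placeholder), run on the principal:

-- Make sure the session is in synchronous (full-safety) mode, then drop the witness
ALTER DATABASE YourDb SET PARTNER SAFETY FULL;
ALTER DATABASE YourDb SET WITNESS OFF;

With the witness gone, automatic failover is no longer possible, which is exactly the high-protection behaviour.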
Thanks,
J.
View 3 Replies
Mar 6, 2008
Hi There
I realise this is a stupid question, but I cannot really find any confirmation of this in BOL.
If you are running high safety with automatic failover, when failover occurs, does this automatically change to high-performance mode? Since for failover to occur something has to have happened to the primary, it will be impossible to commit transactions on the new primary and mirror synchronously, since one of them is no longer available.
So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?
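For reference, one way to answer this for your own setup is to look at what the mirroring catalog view reports after a failover, rather than assuming the mode changed; a minimal sketch:

-- Run on the surviving partner after failover
SELECT DB_NAME(database_id) AS db,
       mirroring_role_desc,
       mirroring_state_desc,          -- e.g. DISCONNECTED or SYNCHRONIZING
       mirroring_safety_level_desc    -- FULL = high safety, OFF = high performance
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;

If safety stays at FULL while the old principal is down, the database is simply running exposed (unmirrored) rather than having been switched to high-performance mode.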
Thanx
View 4 Replies
Jul 13, 2015
I am looking to test this feature, and the "Transaction Performance Collector" has recommended a table to port to In-Memory OLTP.
I have now tried the "Table Memory Optimization Advisor" tool.
After a couple of tweaks to the table design, the tool is now passing validation, but it is not allowing me to progress to the next step.
Could it be down to not having enough memory? But wouldn't that show in the advisor?
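For reference, one prerequisite worth checking before blaming memory: the migration cannot proceed unless the database has a MEMORY_OPTIMIZED_DATA filegroup with a container. A minimal sketch, assuming a database named YourDb and a local path (both are placeholders):

ALTER DATABASE YourDb
    ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE YourDb
    ADD FILE (NAME = 'imoltp_container', FILENAME = 'C:\Data\imoltp_container')
    TO FILEGROUP imoltp_fg;

If the filegroup already exists, re-running the advisor and reviewing its validation warnings is the next place to look.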
View 4 Replies
Sep 28, 2007
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------
Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
View 4 Replies
Oct 11, 2007
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad core box with 4 gb of RAM (VMware). OS: Windows 2003 std, SQL Server 2005 std. 3GB is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the 'lock pages in memory' setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle the SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600MB or so. By the end of the day, it's up around 2GB. How can I track down what's causing this, and should this even concern me?
4. The 'lock pages in memory' setting worries me. Everything I've read on this seems to give a warning about serious OS and other program support degradation, in some cases to the point where they have to restore the settings on the server before they can bring it back up. What are your thoughts on this?
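For reference on questions 1 and 2, here is roughly what the AWE configuration looks like on 32-bit SQL Server 2005 (a hedged sketch; the 3072 MB cap is an assumption for a 4GB box with SSRS sharing it, and the SQL Server service account must be granted the 'lock pages in memory' right before AWE does anything):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;
-- 'awe enabled' only takes effect after the SQL Server service is restarted.

Setting an explicit max server memory (rather than the 2147483647 default) is what leaves room for SSRS and the OS, whether or not AWE ends up enabled.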
View 3 Replies
Oct 25, 2006
Hi everybody, I need to understand how concurrency exactly works in ASP.NET. For example, I'm confused about what happens if two users try to access the same record in a table, or even the same variable, at the same time. Does ASP.NET handle this -- I mean by locking one user and letting the other have access -- or is it up to the programmer to write some code to lock shared resources such as the database, objects and variables? If it's up to the programmer to do this task, I'd appreciate it if you could show me an example that clarifies that. Thank you
View 2 Replies
Jul 23, 2005
Following on from a thread I started about "concurrency" (real-time-ish system), I thought I would play about to see if I could easily adapt my data model to take account of potential multi-user write conflicts. So, I would appreciate you checking my logic/reasoning to see if this kind of thing will work. Below I have a stored procedure that will simply delete a given record from a given table. I have appended a "_Written" counter to the columns of the table. Every time the record is written, the counter is incremented. Clients store the current _Written count in their objects and pass this in to any write procedure executed.
The procedure explicitly checks the _Written count within the transaction to see if it agrees with the written count passed in by the client. If it does not, the client throws an error. Note I am explicitly checking the _Written count precisely so I can determine exactly why this operation might fail, rather than checking @@ROWCOUNT after an update.
Thanks.
Robin

CREATE PROCEDURE dbo.proc_DS_Remove_DataSet
    @_In_ID INTEGER,
    @_In_Written INTEGER
AS
DECLARE @Error INTEGER
DECLARE @WRITTEN INTEGER

BEGIN TRANSACTION
SET @Error = @@ERROR

IF @Error = 0
BEGIN
    SELECT @WRITTEN = _Written FROM MyTable WHERE ID = @_In_ID
    SET @Error = @@ERROR
    IF @WRITTEN <> @_In_Written
    BEGIN
        RAISERROR ('10', 16, 1)
        SET @Error = @@ERROR
    END
END

IF @Error = 0
BEGIN
    DELETE FROM MyTable WHERE ID = @_In_ID
    SET @Error = @@ERROR
END

IF @Error = 0
    COMMIT TRANSACTION
ELSE
    ROLLBACK TRANSACTION

RETURN @Error
View 5 Replies
May 27, 2007
Do single commands (or stored procedures) execute concurrently, or are they executed one by one? And how do you take a lock for the duration of a command (or stored procedure)?
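For reference, statements from different sessions do execute concurrently, and each statement takes its own locks implicitly; when you need a lock to span several statements, the usual tools are an explicit transaction with locking hints, or sp_getapplock. A minimal sketch (table, column and key values are placeholders):

BEGIN TRANSACTION;

-- UPDLOCK + HOLDLOCK keep other writers away from this row until the commit
SELECT *
FROM MyTable WITH (UPDLOCK, HOLDLOCK)
WHERE ID = 42;

-- ... work out the new values based on what was read ...

UPDATE MyTable SET SomeColumn = 1 WHERE ID = 42;

COMMIT TRANSACTION;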
View 3 Replies
Aug 28, 2015
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. RAM size is 22GB. Sometimes SQL Server takes 95% of the memory. My question: how do I reduce the memory usage without killing any process, since it's a production server and there are many background processes running? Also, are there any guides to learn why memory rises so high and how to reduce it?
View 10 Replies
Jun 20, 2006
I have a user object that is stored in the session for each user, but what if an administrator updates a certain user and I want to reflect the update to that user if they are logged in? One possible way of solving this: each time the user goes to a page, check the user table and compare the timestamp. That would mean that if 30 users refresh the page, the db would be hit 30 times lol. I don't think that would scale very well. Any ideas on how to solve this?
View 5 Replies
Aug 4, 2006
I have a table where I count how many emails of a given type are sent out each day. This incrementing is wrapped in a sproc that either inserts a new row, or updates the existing row. The column that counts the value is named Count of type INT.
Below is the sproc; it seems like a straightforward thing. However, I'm seeing email counts higher than they should be when there's a high number of concurrent executions of the sproc. I'm pretty sure it's not a problem in the calling code, so I'm wondering about the UPDATE statement, since it updates a column based on the value of the column. I would think this should work since it's wrapped in a SERIALIZABLE transaction; does anybody have further insight?
SQL Server 2005 by the way.
Sean
CREATE PROCEDURE [dbo].[IncrementEmailCounter]
(
    @siteId SMALLINT,
    @messageType VARCHAR(20),
    @day SMALLDATETIME
)
AS
BEGIN
    SET NOCOUNT ON;

    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
    BEGIN TRANSACTION

    IF (SELECT COUNT(*) FROM EmailCount
        WHERE SiteId = @siteId AND MessageType = @messageType AND [Day] = @day) = 0
        INSERT INTO EmailCount (SiteId, MessageType, [Day], [Count])
        VALUES (@siteId, @messageType, @day, 1)
    ELSE
        UPDATE EmailCount SET [Count] = [Count] + 1
        WHERE SiteId = @siteId AND MessageType = @messageType AND [Day] = @day

    COMMIT TRANSACTION
    SET TRANSACTION ISOLATION LEVEL READ COMMITTED
END
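For reference, one variant that is often suggested for this pattern (a hedged sketch, not a diagnosis of the over-counting): update first and only insert when nothing was updated, taking a range-holding hint on the UPDATE so two concurrent callers for a brand-new (site, type, day) cannot both decide to insert. This also tends to produce fewer deadlocks than an IF-EXISTS check under SERIALIZABLE. Using the same parameters as the procedure above:

    BEGIN TRANSACTION

    UPDATE EmailCount WITH (UPDLOCK, SERIALIZABLE)
    SET [Count] = [Count] + 1
    WHERE SiteId = @siteId AND MessageType = @messageType AND [Day] = @day

    IF @@ROWCOUNT = 0
        INSERT INTO EmailCount (SiteId, MessageType, [Day], [Count])
        VALUES (@siteId, @messageType, @day, 1)

    COMMIT TRANSACTION

A unique key on (SiteId, MessageType, [Day]) is the other half of the safety net, since it turns any remaining race into an error rather than a duplicate row.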
View 3 Replies
Feb 17, 2007
I'm wondering whether the following code would work if users are RAPIDLY registering (assumption) WITH the same username.

public bool UsernameExists(string username)
{
    // 'comm' is a SqlCommand on an open connection (set up elsewhere in the class)
    string sql = "SELECT COUNT(*) FROM [users] WHERE username = @username;";
    comm.CommandText = sql;
    comm.Parameters.AddWithValue("@username", username);
    return Convert.ToInt32(comm.ExecuteScalar()) > 0;
}

public bool Signup(User user)
{
    bool usernameExists = UsernameExists(user.Username);
    if (usernameExists) return false;
    // update or insert sql for user etc blah blah
    return true;
}

If two users try to sign up AT THE VERY SAME TIME (DOWN TO THE NANOSECOND), would this technique work? Do I have to wrap it in a transaction or a stored procedure? Thanks.
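For reference, a check-then-insert always has a window between the SELECT and the INSERT, however small, so the usual safety net is to let the database enforce uniqueness and treat the check as a convenience only. A hedged sketch, using the table and column names from the snippet above:

-- The constraint closes the race: the second concurrent INSERT fails instead of duplicating
ALTER TABLE [users]
    ADD CONSTRAINT UQ_users_username UNIQUE (username);

The signup code then catches the duplicate-key error (2627) from its INSERT and reports the username as taken, exactly as it would have if the earlier check had caught it.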
View 8 Replies
Jun 1, 2007
Hi, I'm trying to implement optimistic concurrency in ASP.NET 2.0, but so far it has caused me nothing but problems.
First, when doing an UPDATE, I tried to use the primary key and a timestamp field which I had in SQL Express. VS 2005 generated the stored procedures fine; however, when it came to the actual updating, I think there was a problem with the conversion of the timestamp field when it was being stored in a text box (in a FormView control). So, as a result, that failed. I also checked so many places online and haven't been able to find any examples of code where a timestamp was used successfully in ASP.NET 2.0.
Next, I got rid of the timestamp type (in the SQL Express database), used a datetime instead, and then implemented optimistic concurrency by passing in ALL the values (i.e. all the original values), as proposed in http://www.asp.net/learn/dataaccess/tutorial21vb.aspx?tabid=63 . This works; however, I really do not want to have to pass in ALL these values (i.e. original and new).
Ideally, I would like to be able to use the primary key and the datetime field to handle the optimistic concurrency checks, where only the original values of both those fields are passed back into the stored procedure. Now, I tried this as well, but I kept getting an error that suggests that (for some reason) the FormView or DataSource is passing ALL the values (original and new) into the dataset, as opposed to only the original primary key and datetime fields plus the new set of values.
Can ANYONE offer any help? I really would like not to have to pass in all these values. Thanks in advance!
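For reference, the slimmer version the poster is after usually boils down to an UPDATE that filters on the key plus the original concurrency column, with @@ROWCOUNT deciding whether anyone got there first. A hedged sketch with placeholder table, column and parameter names:

-- Optimistic concurrency with only the key and the original value of the concurrency column
UPDATE Products
SET    Name  = @Name,
       Price = @Price
WHERE  ProductID    = @ProductID
  AND  LastModified = @OriginalLastModified;   -- value read when the row was fetched

IF @@ROWCOUNT = 0
    RAISERROR ('The record was changed or deleted by another user.', 16, 1);

Getting the data-source control to send only the original key and LastModified (rather than originals of every column) is then a matter of its ConflictDetection and OldValuesParameterFormatString settings, which is a separate battle from the T-SQL.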
View 6 Replies
Apr 6, 2005
Hi! I'm building a web application with ASP.NET, and using MS SQL 2000 for my database server.
What should I do to guarantee the integrity of the data in spite of concurrent access? Meaning... how can I make sure that more than one user can update a table at the same time without errors occurring? Do I need to add some code in my aspx file? Do I need to do something to my database? Or do I not have to worry about it at all?
Thank you.
View 1 Replies
Sep 27, 1999
Can anyone help with concurrency issues? We have a small network, and only one person at a time can log into the database.
It was originally written in MS Access and converted to SQL 7.0 with a VB front end.
Thanks,
Rick
View 1 Replies
Feb 2, 1999
We have two processors on which SQL Server is installed.
The SMP concurrency option is -1, and the process priority is 0.
The computer is dedicated to SQL Server.
With the same setup, has anybody faced any problems?
Sometimes the system suddenly hangs and SQL Server takes 99% CPU usage.
Shall I run SQL Executive and SQL Performance Monitor with this setup?
Will it affect the system?
Has anyone run this kind of setup? Please reply.
thanks
Wincy
View 1 Replies
Mar 31, 1999
Hello,
SQL Server 6.5 is installed on a dual-processor computer. Can I make use of both processors by setting SMP concurrency to -1 or 2?
I tried setting these values but failed: the server always runs at setting 1, irrespective of the configured value.
Any suggestions?
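For reference, on 6.5 this option is normally changed with sp_configure rather than through the UI, it is an advanced option, and it only takes effect after the service restarts; that restart requirement may be why the running value appears stuck at 1. A hedged sketch from memory of the 6.5 syntax (worth double-checking against your Books Online):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
EXEC sp_configure 'SMP concurrency', -1
RECONFIGURE WITH OVERRIDE
-- Stop and restart SQL Server, then re-run sp_configure 'SMP concurrency' to check the run value.

A value of -1 lets SQL Server use all processors on a dedicated machine.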
Srini
View 3 Replies
Sep 6, 2004
Is there any way to get the sample below working so that both "threads" are guaranteed to get unique and incrementing values?
I'm suspecting the answer is no. You can use transactions on completely database oriented operations that read/write to a database and complete. But there aren't complete synchronization controls for operations like below that try to return a value to an outside process.
IF OBJECT_ID('SimpleTable') IS NOT NULL
DROP TABLE SimpleTable
CREATE TABLE SimpleTable (
A INTEGER
)
INSERT INTO SimpleTable (A) VALUES (1)
-- Run in one window
DECLARE @value INTEGER
BEGIN TRANSACTION
SELECT TOP 1 @value = A FROM SimpleTable
WAITFOR DELAY '00:00:05'
UPDATE SimpleTable SET A = @value + 1
COMMIT TRANSACTION
SELECT @value
SELECT A FROM SimpleTable
-- Run in a second window
DECLARE @value INTEGER
BEGIN TRANSACTION
SELECT TOP 1 @value = A FROM SimpleTable
UPDATE SimpleTable SET A = @value + 1
COMMIT TRANSACTION
SELECT @value
SELECT A FROM SimpleTable
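For reference, the sample can be made to behave: the reason both windows can read the same value is that the SELECT only takes a shared lock, so neither blocks the other before the UPDATE. Asking for an update lock on the read (or collapsing the read and write into a single UPDATE) serializes the two transactions. A hedged sketch of the UPDLOCK variant, matching the original sample:

-- Run in either window; the second session's SELECT now waits for the first to commit
DECLARE @value INTEGER
BEGIN TRANSACTION
SELECT TOP 1 @value = A FROM SimpleTable WITH (UPDLOCK)
UPDATE SimpleTable SET A = @value + 1
COMMIT TRANSACTION
SELECT @value
SELECT A FROM SimpleTable

So the outside process does get unique, incrementing values back, at the cost of the two sessions running one after the other for that short window.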
View 7 Replies
Oct 5, 2006
Suppose process A is updating record #1 in table T.
By default, can other processes read record #1 while the update is in progress?
If the answer is yes, which value will they see - the old one or the new one?
Thank you in advance.
View 1 Replies
Aug 16, 2006
Did I understand pessimistic access correctly?
When we choose pessimistic access:
- we lock all the rows of the table, right?
- other users can still access the table if they specify read-only mode, meaning their intention is only to read the data and not modify it. Am I right?
Thanks for your help.
View 1 Replies