I am using Visual Web Developer 2005 and SQL Server 2005 Express with VB as the code-behind.
I am using the following code to update the database table:
Dim update As New SqlDataSource()
update.ConnectionString = ConfigurationManager.ConnectionStrings("DatabaseConnectionString").ToString()
update.UpdateCommandType = SqlDataSourceCommandType.Text
update.UpdateCommand = "UPDATE orderdetail SET fromdesignstatus = 2 ,progresspercentage = 15 , fromdesignlink = '" + designlink + "' WHERE order_id =" + ordersid.ToString()
update.Update()
update.Dispose()
update = Nothing
I am using update.Dispose() and update = Nothing to release the memory.
Is it really necessary to use both statements?
If not, which one is enough in my case, and why?
Please help me.
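As a point of comparison, here is a minimal sketch of the same update written with Using blocks and a parameterized command, so disposal happens automatically; it assumes the same "DatabaseConnectionString" entry and the orderdetail columns from your code, with designlink and ordersid holding the values to write.

' Sketch only - requires Imports System.Configuration and Imports System.Data.SqlClient.
' Using guarantees Dispose is called even if an exception is thrown,
' and parameters avoid building the SQL string by hand.
Using conn As New SqlConnection(ConfigurationManager.ConnectionStrings("DatabaseConnectionString").ConnectionString)
    Using cmd As New SqlCommand("UPDATE orderdetail SET fromdesignstatus = 2, progresspercentage = 15, fromdesignlink = @designlink WHERE order_id = @orderid", conn)
        cmd.Parameters.AddWithValue("@designlink", designlink)
        cmd.Parameters.AddWithValue("@orderid", ordersid)
        conn.Open()
        cmd.ExecuteNonQuery()
    End Using
End Using
' After End Using there is nothing left to Dispose; setting the variable to Nothing is optional.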
My application is a VB.NET client-server app with SQL Express on the backend. For some reason both my SQL Server and my application continue to increase their memory usage over time. Every procedure calls the Close method on its SqlConnection and then sets the SqlConnection to Nothing. Is there anything else I should be doing to close the connection and prevent this memory increase?
Has anyone ever seen a situation where sqlservr.exe starts gobbling RAM under load but does not seem to release it (as seen in Mem Usage under Task Manager or the related PerfMon counters)? I am running a test with 4 client applications hammering the server, and when I check the stats, memory is consumed up to the maximum. When I halt the client applications and reduce the processing load to zero, the usage stats still show the SQL engine holding the memory.
I'm running a copy of SQL 7.0 EE on Win2K Advanced Server, using a Compaq 8500 w/ 750MB RAM.
vighnahar writes: "I am using SQL Server 2000 / 2005. If I run any query on SQL Server, it uses some memory for its execution but does not release that memory after the query completes. This is causing a problem in my application, where each user consumes 400 MB of RAM on the SQL Server; after fifteen users have logged in, the server slows down. Is there any way I can release SQL Server's memory? I don't want SQL Server to keep its results etc. in memory, so that I can use this memory for other processing.
I am writing sample code in VB6 to check the memory utilization. After clicking the button you can observe the memory in Task Manager.
Private Sub Command1_Click()
    Dim con As ADODB.Connection
    Dim sSql As String
    Dim Rs As New ADODB.Recordset
    Set con = New ADODB.Connection
    con.Open "Upcrest", "sa", ""
    Rs.Open "select * from sientity", con
    MsgBox Rs.Source
    Rs.Close
    Set Rs = Nothing
    con.Close
    Set con = Nothing
End Sub"
I'm running into a problem where the class I'm running seems to eat up a lot of memory with SQL Server. When it's done running, the memory usage never goes down in Task Manager. I can't figure out where the memory leak might be. Here's the code that is being called. Does anyone see a reason why it would continue to eat memory as it runs and then not release it? Thanks.
using System;
using System.Collections.Generic;
using System.Text;
using System.Data;
using System.Data.SqlClient;
When I close a web form that has a connection to my SQL Server, I do not see the memory/process go away in Task Manager on the SQL Server machine. I am using the "open late, close early" theory of database connections, and I am using the Close method for my database connections. Is there any automated utility that will shut down these processes? I thought that when the user was disconnected from the database, the memory process would automatically shut down.
Given the following code:
System.Data.SqlClient.SqlCommand sc = new System.Data.SqlClient.SqlCommand();
sc.CommandText = "MySP";
sc.CommandType = System.Data.CommandType.StoredProcedure;
sc.Connection = new System.Data.SqlClient.SqlConnection("MyConnectionString");
sc.Connection.Open();
sc.ExecuteNonQuery();
sc.Connection.Close();
sc.Connection.Dispose();
sc.Dispose();
Is the call to sc.Connection.Dispose() necessary?
J
I have an Execute SQL Task that returns a dataset to the variable DfltValData. A data flow follows it with a script component that accesses that dataset as a read-only variable (see code below), and everything is fine. Now, after that, there's another data flow with a script component, with the same code as below, trying to access DfltValData. Here is where the problem is: the DfltValData object does not contain any rows. What's happening, and how do I solve this?
Thanks!
Dim olead As New Data.OleDb.OleDbDataAdapter
Dim dt As New Data.DataTable
Dim row As System.Data.DataRow
I've got an application that uses a WPF GUI, is built in VS2008 final, targets .NET Framework 3.5, and uses a SQL CE 3.5 database for local storage. After a specific WPF window is rebound, the application hangs. Hitting pause in the debugger reveals the offending method is a call to SqlCeConnection.Dispose(). Via Reflector I can see this method calls SqlCeConnection+ObjectLifeTimeTracker.Close(), which in turn calls GC.WaitForPendingFinalizers(). Within WPF there are some objects (specifically TextBox) which need to have their resources freed on the main application thread. The finalizer thread is waiting on the main application thread, which is in turn waiting on the finalizer thread, resulting in a deadlock.
You can see some additional discussion of the topic including our temporary resolution in this thread.
My question is why is there a call to GC.WaitForPendingFinalizers() buried within the SqlCeConnection.Dispose() call tree?
After making several hundred queries against a SQL CE 2.0 database (through NetCF/ADO.NET), I begin getting a SqlCeException: "Not enough storage is available to complete this operation."
Microsoft speaks to this situation in the following hotfix: http://support.microsoft.com/?kbid=827837
When I contacted them to receive the hotfixed ssce20.dll, they described the problem as SqlCeDataReader and SqlCeDataAdapter not releasing their memory resources after they went out of scope. In addition to using the hotfixed binary, they also advised me to call SqlCeEngine.Dispose() after every query to force SQL CE to release resources, as shown in the "finally" block of the code below.
I have a couple of questions about this:
(1) Will this cause a lot of performance overhead for me, especially if my application makes frequent queries using the following code?
(2) Can I cache an instance of SqlCeEngine and call Dispose() on that cached instance repeatedly, so I can avoid having to instantiate a new SqlCeEngine each time?
public void Dispose()
{
    if (this.connection != null)
    {
        this.connection.Dispose();
    }
}
// ...
}
This only happens when I'm calling Application.Exit() and Dispose is called through the destructor of the DatabaseManager class. When I dispose the connection during normal operation, the call works as intended. BTW, I'm using SQL Server Mobile 3.0.5214.0 on a PPC 2003 AKU2 device (a Symbol PPT8846 industrial device).
During a Wise Installation upgrade of our software, we rename a directory that contains our extended stored procedure DLL. We issue a DBCC TSWSQLXP(Free) call before doing the rename, yet SQL Server still "holds on" to that DLL. We install a new version, and the ListDLLs utility (from Sysinternals) lists the new DLL in the correct directory.
However, when trying to remove the renamed directory, it won't let us remove the old DLL because it says it is in use. We can delete the NEW DLL in the NEW directory with no problem.
I have run the DBCC call numerous times and SQL Server STILL won't release the DLL for deletion. The only way to delete the OLD DLL is to stop the SQL Server, delete the DLL, and then restart SQL Server. We do NOT want to do this because there may be other processes running in SQL Server.
When I deleted some 8,000 rows from aspnet_profile, some space should have been released. On the contrary, the DB size increased. Where did the space go, and why did the DB size increase after deleting the records? There are no triggers either.
I thought it might be the log file, but my hosting provider tells me that the DB is set to Simple Recovery, which they say does not utilize a log file, so we cannot shrink it.
Any idea how I can release some space? Does truncating a table release DB space without filling the log?
Please guide me step by step; I am not very thorough with SQL.
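For orientation only, a minimal T-SQL sketch of the commands usually involved here; MyDb is a placeholder database name, and whether your host lets you run these is another matter.

-- Check how the space is distributed (reserved vs. unused) in the database.
USE MyDb;
EXEC sp_spaceused;
-- TRUNCATE TABLE deallocates the table's pages with minimal logging,
-- but the freed pages stay inside the data file until the file is shrunk.
-- TRUNCATE TABLE dbo.aspnet_profile;
-- Shrinking returns unused space in the database files to the operating system.
DBCC SHRINKDATABASE (MyDb);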
I just ended a case with Microsoft because I had an issue with Analysis Services. One of the solutions given by Microsoft was to install Cumulative Update 5 for SQL Server 2005 SP2. Now SQL Server is using all the memory on the server, which runs Windows Server 2003 R2 Enterprise SP2 (8 GB of RAM) with /3GB and /PAE enabled. The weird thing is that none of the SQL Server processes displayed in Task Manager shows more than 200 MB of memory, yet the server has only 500 MB of memory available and 7 GB of PF usage. How did I find out that SQL Server was using the memory? Well, I shut down the service and 6.8 GB of memory became available and the PF usage went down to just 1.8 GB. I turned it on again and got the same situation.
Well, it seems SQL Server is reserving memory or something.
My transaction log size increased to 39 GB; the database is in full recovery mode.
To regain the space, I have done the following: BACKUP LOG DB_NAME WITH TRUNCATE_ONLY and DBCC SHRINKFILE (LOG_FILE_NAME, 500). But I am not able to regain the space on the hard disk.
No transaction log backups to truncate the log file were set up.
Can you please tell me why the space was not released and what I should do further to clean up the space?
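For reference, a sketch of how that sequence is usually written on SQL Server 2005 (note the command is DBCC SHRINKFILE, one word); DB_NAME and LOG_FILE_NAME below are the same placeholders used above.

-- Empty the log without backing it up (SQL Server 2005 syntax;
-- TRUNCATE_ONLY was removed in SQL Server 2008).
BACKUP LOG DB_NAME WITH TRUNCATE_ONLY;
USE DB_NAME;
-- Shrink the log file down to roughly 500 MB.
DBCC SHRINKFILE (LOG_FILE_NAME, 500);
-- If the file does not shrink, check what is keeping the log space in use.
SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = 'DB_NAME';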
When I checked the properties of the statistic, I can see it is on a varchar(3) field, and there are only 3 distinct values in there, all char(1).
The total size of the data in the table, according to the Disk Usage by Top Tables report, is 199,680,712 KB.
So my question is this...
For the UPDATE STATISTICS on this one column WITH FULLSCAN, does SQL Server read the entire table into the buffer pool? If so, and the table has 199,680,712 KB of data, why did the session request 145,705,216 KB?
Or does SQL Server just read the column and the clustered index key into the buffer pool?
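For context, I assume the statement in question looks something like the first line below (the table and statistic names are placeholders); the second query is one rough way to watch how much of the buffer pool the current database occupies while the update runs.

-- Full-scan update of a single-column statistic (names are hypothetical).
UPDATE STATISTICS dbo.BigTable (stat_on_varchar3_column) WITH FULLSCAN;
-- Buffer pool pages held by the current database, expressed in MB (8 KB pages).
SELECT COUNT(*) * 8 / 1024 AS buffer_pool_mb
FROM sys.dm_os_buffer_descriptors
WHERE database_id = DB_ID();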
I have a co-worker whose SQL Server 2005 is exhibiting strange behavior. We have already reinstalled SQL Server 2005 and applied SP2 to see if the behavior stops, but it has not.
Every so often during the day, SQL Server 2005 will start to slow down to the point that my co-worker's queries begin to time out. He turned on Profiler to look at what was going on behind the scenes.
We see sa releasing and acquiring locks to the tune of 180,000 rows in a fifteen-minute span; when this behavior starts, so do his timeouts. He has Reporting Services and Analysis Services installed but not configured. His only connection is to his local database. Occasionally a value like (03000d8f0ecc) appears in the TextData column in Profiler for sa. I read something about Reporting Services using sa for cleanup, but I don't think that is what is happening here.
Does someone have a clue as to what is happening and a way we can prevent the behavior? It is affecting his ability to work on his application.
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Standard, SQL Server 2005 Standard. The /3GB switch is not set but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max memory values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL Server, and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max memory setting by the SSRS memory usage, or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed? (A configuration sketch follows after this list.)
3. It seems that even at idle the SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so; by the end of the day it's up around 2 GB. How can I track down what's causing this, and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on this seems to warn about serious degradation of the OS and other programs, in some cases to the point where they had to restore the settings on the server before they could bring it back up. What are your thoughts on this?
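For questions 1, 2 and 4 above, the relevant knobs are exposed through sp_configure as well as the SSMS dialogs; the sketch below uses illustrative values for a 4 GB box and is not a recommendation for your environment.

-- Make the advanced options visible.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Enable AWE (32-bit SQL Server 2005; also requires the Lock Pages in Memory
-- right for the service account and a restart of the SQL Server service).
EXEC sp_configure 'awe enabled', 1;
-- Cap the buffer pool below physical RAM to leave room for the OS and SSRS
-- (the installed default is 2147483647 MB, i.e. effectively unlimited).
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;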
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. The RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how can I reduce its memory usage without killing any process, since this is a production server and there are many background processes running? And is there any guide for learning why memory climbs so high and how to reduce it?
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server's memory usage and therefore leave some memory for the OS; this is based on the fact that if the max memory is not specified, SQL Server will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL Server service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the memory with the OS?
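As a quick check, the same cap can be read (or set) from T-SQL instead of the SSMS dialog; as far as I know this setting governs only the Database Engine, so SSIS and SSAS are not covered by it, but treat that as my reading rather than a definitive answer.

-- View the configured and running values of the Database Engine memory cap.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';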
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory: during the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS, with no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I can see that one particular assembly count is increasing on every loop: VBAssembly increases by 6 every time it hits the script component, and along with it the memory climbs. What is this VBAssembly being used for, and is there an update to SQL Server Integration Services that I need?
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. Windows 2003 was allocated 2.3 GB and SQL Server was allocated (and was using all of) 1.6 GB, for a total of approximately 4 GB, based on the Idera monitoring software; all memory was allocated between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The 'System' info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now, with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems this 1 GB limit is artificial. If it used 1.6 GB before, why would it not use at least that much now?
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had on it, but when I try to remove the filegroup I receive an error.
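For what it's worth, a sketch of the normal removal order with hypothetical names (database MyDb, container imoltp_file, filegroup imoltp_fg); note that Books Online states a MEMORY_OPTIMIZED_DATA filegroup, once added, can only be removed by dropping the database, which may be exactly the error you are hitting.

-- Normal order for removing a filegroup: drop its container(s) first, then the filegroup.
ALTER DATABASE MyDb REMOVE FILE imoltp_file;
ALTER DATABASE MyDb REMOVE FILEGROUP imoltp_fg;
-- For a MEMORY_OPTIMIZED_DATA filegroup the REMOVE is documented to fail;
-- dropping (or recreating) the database is the supported way to get rid of it.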
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query option or the index create memory option, because I just haven't seen any need to; my typical thought is that if it isn't broke... Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
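For reference, a sketch of how those two options can be inspected and put back to their documented defaults (0 for index create memory, meaning self-configuring, and 1024 KB for min memory per query); the values are the Books Online defaults, not a tuning recommendation.

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Inspect the configured and running values on a given server.
EXEC sp_configure 'index create memory (KB)';
EXEC sp_configure 'min memory per query (KB)';
-- Restore the documented defaults.
EXEC sp_configure 'index create memory (KB)', 0;
EXEC sp_configure 'min memory per query (KB)', 1024;
RECONFIGURE;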
I did a load test and found the following observations:
1. The Memory: Pages/sec counter was crossing the limit, going beyond 20.
2. Target Server Memory was always greater than Total Server Memory.
Seeing the above data, it looks like memory pressure. But I found that Available Memory was always above 200 MB, and the Buffer Cache Hit Ratio was close to 99.99. What could be the reason for the above behavior?
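If it helps to cross-check the counters, both values can be read from the DMVs on SQL Server 2005 and later; Target Server Memory is what the buffer pool would like to have committed, while Total Server Memory is what it currently has.

-- Compare what SQL Server wants (Target) with what it currently holds (Total), in MB.
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');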
Hi~, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., what is the maximum value of nCount)?
for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}
2. In the above code sample, is there no method for inserting a prepared array all at once directly (the BulkData array, without the for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?
BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session