SQL Server 2012 :: How To Capture CPU And Memory Usage For A Stored Procedure
Jan 19, 2015
I have around 100 packages (all packages run at the same time). Each package calls a stored procedure; once the stored procedure execution is completed, the package writes the log information into a log table. How can I capture the CPU and memory usage for the execution of each stored procedure?
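One possible approach (a minimal sketch, not something from the original setup): since each package already writes to a log table, the same step could snapshot sys.dm_exec_procedure_stats, where total_worker_time is CPU time in microseconds; memory is only exposed indirectly (for example via logical reads or sys.dm_exec_query_memory_grants), so treat that part as an approximation. The logging table dbo.ProcStatsLog below is an assumed name.

-- Snapshot cumulative CPU and I/O per stored procedure into an assumed log table
INSERT INTO dbo.ProcStatsLog (LogTime, ProcName, ExecCount, TotalCpuMs, TotalElapsedMs, TotalLogicalReads)
SELECT SYSDATETIME(),
       OBJECT_NAME(ps.object_id, ps.database_id),
       ps.execution_count,
       ps.total_worker_time / 1000,
       ps.total_elapsed_time / 1000,
       ps.total_logical_reads
FROM   sys.dm_exec_procedure_stats AS ps
WHERE  ps.database_id = DB_ID();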
I have been having issues with our SQL Server for a while now. It seems to run out of memory every few days, and when I look at the memory dump, the MEMORYCLERK_SQLOPTIMIZER clerk seems to take over memory and eventually causes the server to crash.
Here is the SQL Server version we are using: Microsoft SQL Server 2012 (SP1) - 11.0.3460.0 (X64) Jul 22 2014 15:22:00 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor). It is a VM on a Windows Server 2012 host. It has 20 GB of RAM allocated to it, and max server memory is set to 16.5 GB.
I have seen MEMORYCLERK_SQLOPTIMIZER grow to about 11 GB at the time of the server crash. Why is that happening? What is causing memoryclerk_sqloptimizer to get so high? I have looked it up and it seems to be related to ad hoc requests, but is there something I can do to bring that memory down when it gets so high, so that I can prevent a server crash? Do we just need to add more memory, or is there a memory leak somewhere?
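A hedged sketch of how to keep an eye on that clerk, plus the option that is usually suggested when the growth is tied to large volumes of single-use ad hoc queries; that option trims the plan cache rather than compile memory itself, so it may or may not apply to this workload.

-- Sample the top memory clerks (run regularly, e.g. from an Agent job, and keep the history)
SELECT TOP (10) type, SUM(pages_kb) / 1024 AS memory_mb
FROM   sys.dm_os_memory_clerks
GROUP BY type
ORDER BY SUM(pages_kb) DESC;

-- Often suggested for ad hoc-heavy workloads: cache only plan stubs for single-use queries
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;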
I know I should probably be posting this in the RS section, but I have a Windows 2008 R2 server with RS 2012, SSIS and the database engine installed. I also have a SQL 2005 instance with SQL 2005 SSIS running on the server.
I saw that Reporting Services was consuming 9 GB of RAM a few days ago with no published reports. It's just a default install.
So I investigated the settings and the document: [URL]......
I added this section to the reportserver.config file to restrict memory usage
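For reference, the memory-governing section of that config file typically looks something like the sketch below; the values are illustrative only (WorkingSetMaximum and WorkingSetMinimum are specified in kilobytes), not necessarily what was used here.

<MemorySafetyMargin>80</MemorySafetyMargin>
<MemoryThreshold>90</MemoryThreshold>
<WorkingSetMaximum>4194304</WorkingSetMaximum>
<WorkingSetMinimum>1048576</WorkingSetMinimum>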
I would like to know if there is a penalty for varchar variables in stored procedures if I declare them varchar(8000) instead of varchar(1000). I have a lot of variables and sometimes the content will be more than 1000 characters. Is memory only allocated for the actual contents, or for the complete declared length?
I am maintaining an application where most of the business rules are in triggers, stored procedures and user defined functions. When a bug arises, it can get very tedious to debug. Today, for example, I wanted to modify a function that was being called by a trigger. The problem is that I don't want to change the function, for fear that it is being called by one of the other SPs or triggers in the database (there are hundreds of them).
Essentially, I need a tool that allows me to view where functions and SPs are being referenced from. At the very least, I'd like to perform a "full text search" in the database objects, so that, let's say I have a function named "fn_doSomething", I can search the schema for this string and get all the places where it appears.
As you can see, I'm in the dark here. I've never worked on a system where all business rules are at the database level. If you know of a tool that does what I describe above, or anything else that would facilitate my life, please let me know! Thanks for your help, Marc
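A rough sketch of that "full text search of the schema" idea in T-SQL, using the function name from the post as the search string: on SQL Server 2005 and later the definitions live in sys.sql_modules; on SQL Server 2000 the equivalent is syscomments, with the caveat that long definitions are split across multiple rows there, so a match can straddle a row boundary.

-- SQL Server 2005+: every object whose definition mentions the function
SELECT OBJECT_NAME(m.object_id) AS object_name, o.type_desc
FROM   sys.sql_modules AS m
JOIN   sys.objects AS o ON o.object_id = m.object_id
WHERE  m.definition LIKE '%fn_doSomething%';

-- SQL Server 2000: the same idea against syscomments
SELECT DISTINCT OBJECT_NAME(id) AS object_name
FROM   syscomments
WHERE  text LIKE '%fn_doSomething%';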
I have a client program that writes 10 records per second to a SQL Server database. I want to compute the CPU usage and the memory usage for the whole program, or the CPU and memory usage for the insert statement in the program.
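If the instance is SQL Server 2005 or later, a hedged server-side approximation is to read sys.dm_exec_query_stats for the cached insert statement and sys.dm_exec_sessions for the program's connection; the table and program names below are placeholders.

-- CPU and reads accumulated by the cached insert statement
SELECT TOP (10)
       qs.execution_count,
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.total_logical_reads,
       st.text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE  st.text LIKE '%INSERT INTO dbo.MyTable%'      -- placeholder table name
ORDER BY qs.total_worker_time DESC;

-- Cumulative CPU (ms) and memory (8 KB pages) for the client program's sessions
SELECT session_id, program_name, cpu_time, memory_usage * 8 AS memory_kb
FROM   sys.dm_exec_sessions
WHERE  program_name = N'MyClientProgram';            -- placeholder program name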
I have a stored procedure that does updates. The sproc is called from an "Execute SQL Task". The sproc returns the number of rows that were updated using this technique:
RETURN @@ROWCOUNT
How do I capture the sproc's return value (number of rows) in a package-level variable? I eventually want to log this value. The sproc is not returning a result set. I'm new to SSIS, so any general guidance would be appreciated.
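One common pattern (a sketch, assuming an OLE DB connection manager; the procedure and variable names are placeholders): set the task's ResultSet to None, use a statement of the form below, and in Parameter Mapping bind an Int32 package variable such as User::RowsUpdated with Direction = ReturnValue and Parameter Name = 0, so the RETURN @@ROWCOUNT value lands in the variable and can be logged later.

-- SQLStatement property of the Execute SQL Task; the leading ? receives the return value
EXEC ? = dbo.usp_MyUpdateProc;   -- placeholder procedure name
-- Parameter Mapping: Variable Name = User::RowsUpdated, Direction = ReturnValue,
-- Data Type = LONG, Parameter Name = 0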
One of the production boxes, which runs only SQL Server, is showing 80% memory usage in the Task Manager memory usage history right now.
We are running SQL Server 2000 Standard Edition SP3 with 2 GB of memory on this box. The server is not on a scheduled reboot at this point.
We saw this behavior on this box last month: after Task Manager had shown 90% memory usage constantly for several days, the server was manually rebooted and memory usage dropped to 35%. Now it's back to 80%.
Our DBA thinks the server should be rebooted on a regular schedule regardless of the memory problem. Our network admin doesn't seem to agree with this; he is not ready to reboot the machine even with this high memory usage.
There is no noticeable difference performance-wise yet.
My questions are: Is it bad that memory usage climbs from a constant 35% to 80-90%, or is it common? Should SQL Server be rebooted immediately to take care of it? Should SQL Server 2000 be rebooted on a regular basis regardless of any problems? Shouldn't SQL Server be releasing memory back to the OS even without rebooting? How do I find out whether the server is actually going through memory problems and what is causing them?
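To see what SQL Server 2000 itself has committed, rather than the reserved address space that Task Manager reports, one quick check is the sysperfinfo counters; if Total and Target stay close to each other and to the configured maximum, the instance is simply holding on to the memory it was allowed, which is normal behaviour rather than a leak.

-- Memory SQL Server has committed vs. the amount it wants
SELECT counter_name, cntr_value / 1024 AS value_mb
FROM   master.dbo.sysperfinfo
WHERE  counter_name LIKE 'Total Server Memory%'
   OR  counter_name LIKE 'Target Server Memory%';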
Currently I have an application that uses SQL 2000. The SQL server service tends to take up as much of the physical memory as possible. The problem is I also have other services relating to this application running that are very important.
What tends to happen after a period of time is that SQL Server takes up all of the physical memory, so that the other services are pushed into the paging file (virtual memory). This causes extremely slow response times over the network, as these other services have to work out of the paging file.
Upgrading the memory is currently not an option :(
I know there is an option to set memory usage for SQL Server, but I am unsure how this would behave in a production environment. What would happen if SQL Server required more memory than was allocated to it?
Can SQL Server release the memory and still operate normally?
I am using SQL Server 7.0 on a Windows NT machine. We have been having problems with SQL Server not releasing the memory after it has utilized it.
Currently it is configured to allow a max of 511MB memory (1024 is total on the machine). I had some advice that my best solution would be to reduce the Max Memory to a lower value (say 400 MB) to help reduce the problem.
Is this not counter-intuitive? Or is this the correct solution?
Hello all, how can I tell how much memory SQL Server is using on a server? On Windows 2000, whenever I go to Task Manager/Processes/Memory Usage, SQL Server seems to be showing 1,744,124 K. On all of my servers, with various sizes and usage of databases, all of them show SQL Server to be using about the same amount of memory. Can someone explain this to me? Shouldn't it use more for larger databases, heavy-hitting databases? Also, I normally check "Dynamically configure SQL Server memory" and put the maximum threshold to a little bit less than the max on the server. The minimum query memory is set to 1024. Is that 1024 a subset of the memory used by SQL Server, or is this additional that can be used? Thanks, Raziq.
I'm wondering about the memory usage of SQL Server Ce and the Compact Framework.
I've heard I can use only 32 MB per process. SQL Server Ce runs in process and my database file is about 60 MB big.
1. Does that mean my application can only use 32 MB, minus the CLR, minus the SQL Server CE runtime, minus the parts of my database file held in memory?
2. Is SQL Server CE swapping parts of my database file into my process memory? If so, is there a way to group some tables so that they are swapped in together to increase performance?
3. Does a big database mean big memory usage in my process memory?
I have a stored procedure, and within it I will be calling another stored procedure. Based on a parameter value I will get the name of the stored procedure to be executed. How can I execute a dynamic stored procedure name inside a stored procedure?
At present it is like: EXECUTE usp_print_list_full @ID, @TNumber, @ErrMsg OUTPUT
I want to do something like: EXECUTE @SpName @ID, @TNumber, @ErrMsg OUTPUT
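T-SQL does allow EXECUTE to take a variable that holds the procedure name, so the second form works as long as @SpName contains a valid (ideally schema-qualified) procedure name. A small sketch with placeholder names; the outer procedure's own parameters (@ReportType, @ID, @TNumber, @ErrMsg) are assumed to exist:

DECLARE @SpName sysname;

-- Choose the procedure name from the incoming parameter (placeholder names)
SET @SpName = CASE @ReportType
                  WHEN 1 THEN N'dbo.usp_print_list_full'
                  WHEN 2 THEN N'dbo.usp_print_list_short'
              END;

-- EXECUTE accepts a variable containing the module name
EXECUTE @SpName @ID, @TNumber, @ErrMsg OUTPUT;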
When a stored PL/SQL procedure in my Oracle database is called from ASP.NET, is it possible to retrieve the OUT parameter from the PL/SQL procedure? For example, I have a simple procedure as below to insert a row into the database. Ideally I would like it to return the parameter named NewId to my ASP.NET server. I'd like to capture this in the VB.NET code.

create or replace procedure WriteName(FirstName in varchar2, LastName in varchar2, NewId out pls_integer) is
  NameId pls_integer;
begin
  select name_seq.nextval into NameId from dual;
  insert into all_names(id, first_name, last_name)
  values(NameId, FirstName, LastName);
  NewId := NameId;
end WriteName;

<asp:SqlDataSource
    ID="SqlDataSaveName"
    runat="server"
    ConnectionString="<%$ ConnectionStrings:ConnectionString %>"
    ProviderName="<%$ ConnectionStrings:ConnectionString.ProviderName %>"
    SelectCommand="WRITENAME"
    SelectCommandType="StoredProcedure">
    <SelectParameters>
        <asp:ControlParameter ControlID="TextBoxFirstName" Name="FIRSTNAME" PropertyName="Text" Type="String" />
        <asp:ControlParameter ControlID="TextBoxLastName" Name="LASTNAME" PropertyName="Text" Type="String" />
    </SelectParameters>
</asp:SqlDataSource>

This is then called in the VB.NET code as below. It is in this section that I would like to capture the PL/SQL OUT parameter NewId returned from Oracle.

SqlDataSaveName.Select(DataSourceSelectArguments.Empty)

If anybody can help me with the code I need to add to the VB.NET section to capture and then use the returned OUT parameter, I'd be very grateful.
I am creating a package that has many SQL tasks. Each task executes a stored procedure. I need to capture any error messages returned by the stored procedures. Eventually, the error messages will be logged so that we can audit the package and know if individual tasks succeeded or failed.
I'm not sure where or how I can access a stored procedure message. What is the best way?
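On SQL Server 2005 or later, one hedged pattern is to have each stored procedure trap its own errors with TRY/CATCH and hand the message back through an OUTPUT parameter, which the Execute SQL Task maps to a package variable for logging; re-raising with RAISERROR in the CATCH block (shown commented out) would additionally make the task itself fail. Procedure and parameter names below are placeholders.

CREATE PROCEDURE dbo.usp_StepOne
    @ErrorMessage nvarchar(4000) OUTPUT   -- placeholder procedure and parameter names
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- ... the real work of this step goes here ...
        SET @ErrorMessage = NULL;
    END TRY
    BEGIN CATCH
        SET @ErrorMessage = ERROR_PROCEDURE() + N': ' + ERROR_MESSAGE();
        -- RAISERROR(@ErrorMessage, 16, 1);   -- uncomment to make the Execute SQL Task fail too
    END CATCH
END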
Hi all, our production server has 4 GB of RAM and is running SQL Server 7.0. Since SQL Server 7.0 Standard Edition can by default use less than 2 GB, our SQL Server is now using only 1.8 GB (leaving the rest for the OS, Windows 2000 Server).
In order for SQL Server to take advantage of more than 2 GB of RAM, it is suggested that boot.ini be modified to include the /3GB switch.
Has anyone seen any issues with doing this? Is it safe to do so on the Standard Edition of SQL Server 7.0?
Good day to all of you. I am facing a SQL Server 2000 memory usage problem. I am using Windows 2000 Server SP4 and SQL Server 2000 SP4. When users run some in-house application software, the memory for sqlservr.exe increases, but when the users log out of the software, sqlservr.exe does not release the memory. I have around 100 users in my company. SQL Server memory keeps increasing until it hits the maximum memory usage on the machine.
May I know how to make SQL Server release memory when a user logs off from the program? Or is my SQL Server corrupted or missing a file?
Hi all, some of my SQL Servers are experiencing high memory usage.
1. How can I detect which process causes the big memory usage and does not release it?
2. Which SQL Server components are in this memory, and what is their usage distribution?
Any help will be appreciated. Thanks, Willie
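Assuming SQL Server 2000, a rough starting point is sysprocesses (memusage is counted in 8 KB pages) together with DBCC MEMORYSTATUS for the component breakdown; bear in mind that most of the memory will usually sit in the buffer pool rather than with any single session.

-- Sessions ranked by their memusage (memory pages currently allocated to each spid)
SELECT spid, loginame, hostname, program_name,
       memusage * 8 AS memory_kb, cpu, physical_io
FROM   master.dbo.sysprocesses
ORDER BY memusage DESC;

-- Breakdown by internal component (buffer pool, procedure cache, connections, ...)
DBCC MEMORYSTATUS;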
I'm new to SQL Server; I decided to use it in order to get all the advantages of using VB.NET and SQL. My server is a SQL Server Standard edition. I'm using a relational DB most of the time for complex select queries; every time the server is used it performs 30 or 40 queries at a time, and I have recently realized that the server consumes a lot of memory after one or two days of being up.
If I restart SQL Server, memory usage is about 650 MB, but after two days memory is 1.4 GB. I have used SQL Profiler and the Tuning Advisor, which recommended that I create some indexes. I did create them, but that did not solve the memory problem, although some queries run faster.
My questions are:
Is this memory usage normal? If not, what should I check in order to reduce memory usage?
We have an application that we currently run on SQL Server 2000 that works by creating a DTS package that it then executes.
Due to performance reasons, we have been considering switching to 2005. Can anyone confirm or clarify the following?
1) SQL Server 2000 caps RAM usage at 2GB, whereas SQL Server 2005 is only limited by the OS - RAM usage is a big current issue for us, so if upgrading to 2005 would solve this it would help a lot. Can anyone confirm my understanding of this? 2) Would using the legacy DTS in SQL Server 2005 take advantage of this RAM difference, or is it running on the old 2000 engine and only able to use the 2GB?
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. The RAM size is 22 GB. Sometimes SQL Server takes 95% of memory. My question: how can I reduce memory usage without killing any process, since it's a production server and there are many background processes running? Also, are there any guides for learning why memory rises so high and how to reduce it?
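To see what the engine actually holds (as opposed to what Task Manager attributes to the process), a hedged first check on SQL Server 2012:

-- How much physical memory the SQL Server process holds and whether it is under pressure
SELECT physical_memory_in_use_kb / 1024 AS sql_memory_mb,
       memory_utilization_percentage,
       process_physical_memory_low
FROM   sys.dm_os_process_memory;

-- If most of this is buffer pool, 'max server memory (MB)' can be lowered online
-- with sp_configure + RECONFIGURE (as in the earlier sketch) without killing anything.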
My stored procedure and code are working, except that I need to capture the return value from the stored procedure and use that value in my code-behind page to indicate that a duplicate record entry was attempted. In my code-behind file (VB), how would I capture the procedure's return value and then display, in the label I have, that a duplicate entry was attempted?

Stored procedure:

CREATE PROCEDURE dbo.usp_InsertNew
    @IDNumber nvarchar(25),
    @ID nvarchar(50),
    @LName varchar(50),
    @FName varchar(50)
AS
DECLARE @myERROR int,      -- local @@ERROR
        @myRowCount int    -- local @@ROWCOUNT
BEGIN
    -- See if a contact with the same name and zip code exists
    IF EXISTS (SELECT * FROM Info WHERE ID = @ID)
    BEGIN
        RETURN 1
    END

    BEGIN TRAN
        INSERT INTO Info (IDNumber, ID, LName, FName)
        VALUES (@IDNumber, @ID, @LName, @FName)

        SELECT @myERROR = @@ERROR, @myRowCount = @@ROWCOUNT
        IF @myERROR != 0 GOTO HANDLE_ERROR
    COMMIT TRAN
    RETURN 0

HANDLE_ERROR:
    ROLLBACK TRAN
    RETURN @myERROR
END
GO

asp.net page:

<asp:SqlDataSource ID="ContactDetailDS" runat="server"
    ConnectionString="<%$ ConnectionStrings:EssPerLisCS %>"
    SelectCommand="SELECT * FROM TABLE_One"
    UpdateCommand="UPDATE TABLE_One WHERE ID = @ID"
    InsertCommand="usp_InsertNew"
    InsertCommandType="StoredProcedure">
    <SelectParameters>
        <asp:ControlParameter ControlID="GridView1" Name="ID" PropertyName="SelectedValue" />
    </SelectParameters>
</asp:SqlDataSource>
I have a stored procedure that dynamically bulk loads several tables from several text files. If I encounter an error bulk loading a table in the stored procedure, all I get is the last error code produced, but if I run the actual bulk load commands through SQL Management Studio, it gives much more usable errors, which can include the column that failed to load. We have tables that exceed 150 columns (don't ask), and having this information cuts troubleshooting load errors from hours down to minutes. Onto my question..., is there any way to capture all of the errors produced by the bulk load from within a stored procedure (see examples below)?
Running this...
BULK INSERT Customers
FROM 'c:\testcustomers.txt'
WITH (TabLock, MaxErrors = 0, ErrorFile = 'c:\testcustomers.txt.err')
Produces this (notice column name at the end of the first error)...
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (CustId).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Running this (similar to code in my stored procedure)...
BEGIN TRY
BULK INSERT Customers
FROM 'c:\testcustomers.txt'
WITH (TabLock, MaxErrors = 0, ErrorFile = 'c:\testcustomers.txt.err')
END TRY
BEGIN CATCH
SELECT
ERROR_NUMBER() AS ErrorNumber,
ERROR_SEVERITY() AS ErrorSeverity,
ERROR_STATE() AS ErrorState,
ERROR_PROCEDURE() AS ErrorProcedure,
ERROR_LINE() AS ErrorLine,
ERROR_MESSAGE() AS ErrorMessage;
END CATCH
Produces something similar to this (which is useless)... ...Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
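Since TRY/CATCH keeps only the last error of the batch, one hedged workaround is to let the ERRORFILE option do the detailed reporting and read those files back in the CATCH block: BULK INSERT writes the rejected rows to the named error file and the per-row diagnostics (including the failing column) to a companion file with an .Error.Txt suffix. The paths follow the example above.

-- Inside the CATCH block, after logging ERROR_MESSAGE():
SELECT BulkColumn AS RejectedRows
FROM   OPENROWSET(BULK 'c:\testcustomers.txt.err', SINGLE_CLOB) AS r;

SELECT BulkColumn AS PerRowDiagnostics
FROM   OPENROWSET(BULK 'c:\testcustomers.txt.err.Error.Txt', SINGLE_CLOB) AS d;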
Hi, last Sunday, 26/12/04 at 20:00, I got this error from one of our applications at the customer site: "There is insufficient system memory to run this query."
Details:
Computer: IBM xSeries 345, dual 2.8 GHz Intel Xeon CPUs, 1,047,952 KB RAM, Windows 2000 Server SP4. Disk space: 4395 MB. Virtual memory: initial size 1536 MB, max size 3072 MB. Registry size: current 14 MB, max 90 MB.
SQL Server 2000 SP3, two production databases: (1) data size 100 MB, log size 15 MB; (2) data size 500 MB, log size 125 MB. All the memory options are configured as default, i.e. dynamically configure memory, with no changes to any sp_configure options.
SQL server is the only major application running on the machine.
Facts: That evening I checked Task Manager and found that SQL Server used 500 MB in Mem Usage and 1200 MB in VM Size. In Query Analyzer I executed the select query that produced the "…insufficient system memory…" error on both databases; on the one with more data it failed with this error, but on the other database it completed successfully. I restarted the SQL Server service to solve it, and as of today, 28/12/04 at 08:00, it uses 590 MB in Mem Usage and 590 MB in VM Size.
I have checked the system with Performance Monitor and didn't find any problems. I use these counters: Memory: Pages/sec, Memory: Available Bytes, Physical Disk: % Disk Time, Processor: % Processor Time, SQL Server Buffer Manager: Buffer Cache Hit Ratio.
We have 8 user connections for one of the databases and 16 for the other one. Our system has run with this configuration for 2 months; I back up the databases with FULL, DIFF and LOG backups. It's not a real-time system and the load on SQL Server is medium, but we use bad queries, such as updating all fields in one statement (please don't ask…).
I know that SQL Server is greedy with memory and releases memory only if another process needs it.
We are working on a release 2.0 mobile solution right now. In our version 1.0 we did not have to worry about memory issues, as our application was the only application running on our target devices (e.g. T-Mobile MDA Compact II Pocket PCs, WM2005). Now we need to share the available memory with others. As our application relies on its SQL Server 2005 Mobile Edition database, we are wondering about the memory usage of that server.
We know that a Pocket PC divides its memory into Storage and Program. Our application uses a 5 MB database and 1.5 MB for DLLs and its exe file. These files reside in storage space when not loaded. When the application starts up, it is loaded into program memory. What happens to the 5 MB database file? Is it loaded into program memory as well? Are only portions of that file loaded? Or is nothing loaded at all?
Does anyone have deeper insight into that server and can answer my questions?
We are looking for some guidance with an issue we have picked up with our implementation of Service Broker here on the ABSA Capital project and I am hoping you can help or point us in the direction of someone.
The architecture we have implemented for Service Broker is to make use of an activation stored procedure on two queues (one SP per queue) to process the messages received. What we have found is that the activation stored procedure runs on a background session, and its CPU time and memory just grow to the point where it brought one of our UAT servers to a grinding halt.
Is there any way we can reduce the memory consumption of the activation stored procedure, or is this one of those things that still needs to be ironed out in Service Broker?
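One thing worth checking (a sketch under the assumption that the activation procedure loops forever): if the procedure only ends when it is killed, its background session keeps accumulating CPU time and cached memory. Ending the loop once the queue has been idle for a while lets activation close the session and start a fresh one when messages arrive again; MAX_QUEUE_READERS on the queue then caps how many such sessions can exist. Queue and variable names below are placeholders.

-- Skeleton of an activation procedure body that exits when the queue goes idle
DECLARE @handle uniqueidentifier,
        @message_type sysname,
        @body varbinary(max);

WHILE (1 = 1)
BEGIN
    WAITFOR (
        RECEIVE TOP (1)
            @handle       = conversation_handle,
            @message_type = message_type_name,
            @body         = message_body
        FROM dbo.TargetQueue                -- placeholder queue name
    ), TIMEOUT 5000;                        -- wait at most 5 seconds for a message

    IF @@ROWCOUNT = 0
        BREAK;                              -- queue idle: exit and release the session

    -- ... process @body here ...

    IF @message_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
        END CONVERSATION @handle;
END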
On SQL 2012 (64bit) I have a CLR stored procedure that calls another, T-SQL stored procedure.
The CLR procedure passes a sizeable amount of data via a user-defined table type, i.e. a table-valued parameter. It passes about 12,000 rows with 3 columns each.
For some reason the call of the procedure is very, very slow. I mean just the call, not the procedure itself.
I changed the procedure to do nothing (return 1 in the first line).
So with all parameters set, calling
command.ExecuteNonQuery()
against a procedure stripped down to
create proc usp_Proc1 @myTable myTable readonly
as
begin
    return 1
end
it takes 8 seconds. I measured all other steps (creating the data table in the CLR, creating the SQL parameter, adding it to the command, executing the stored procedure) and all of them work fine and very fast.
When I trace the procedure call in SQL Profiler, I get a line like this for each row of the data table (12,000):
SP:StmtCompleted -- Encrypted Text.
As I said, it is not the procedure or the creation of the data table that takes so long; it is really only the passing of the data table to the procedure.