SQL Server 2008 :: How To Find Which Queries / Processes Causing Large Memory Paging Rate
Mar 30, 2015
Our monitoring tool shows that our production system periodically experiences a large paging rate, up to 800 memory pages/sec. How can I find out which particular queries, stored procedures, or processes initiate this?
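A reasonable first stop is the plan cache: statements with heavy physical reads are the usual suspects when pages/sec spikes. A minimal sketch against sys.dm_exec_query_stats (available in SQL Server 2008); the TOP 20 cutoff is an arbitrary choice:

-- Top cached statements by physical reads (heavy physical I/O often drives paging)
SELECT TOP 20
    qs.total_physical_reads,
    qs.execution_count,
    qs.total_physical_reads / qs.execution_count AS avg_physical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
          ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_physical_reads DESC;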
View 3 Replies
Apr 22, 2015
I am monitoring our production server, and noticed that periodically we have spikes of Memory Paging Rate (pages/sec).
How can I find the particular queries/stored procedures that are causing this?
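If the spikes can be caught in the act, a snapshot of what is executing at that moment is often more telling than history. A rough sketch to run during a spike:

-- Currently executing requests, heaviest readers first
SELECT r.session_id, r.status, r.reads, r.logical_reads, r.cpu_time,
       st.text AS statement_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE r.session_id > 50 -- skip system sessions
ORDER BY r.reads DESC;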
View 5 Replies
View Related
Dec 3, 2007
We have several 2005 servers with "Maximum server memory" set to 2147483647 MB, which I believe is the default at installation time. I am told that this means "use all the memory there is, including paging." Well, this is nuts, but the servers seem to work fine with this setting no matter how much physical memory they have.
One of our 2005 servers recently started paging like crazy, so I reduced "Maximum server memory" to 6000 and the paging disappeared (server has 8 gig of physical memory) and the server appears happy.
I cannot explain why only this one server has this paging issue and the others do not. Should I be setting "Maximum server memory" on all my servers? Are there other considerations which might cause the server to eat up all the memory? As far as I know no other applications run on this box.
Thanks,
Michael
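Setting the cap explicitly on every server is generally recommended, since by default SQL Server takes whatever it can and squeezes the OS. A minimal sketch of the change made above (6000 MB is the value from this post; the right cap leaves the OS and other processes a comfortable margin):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 6000;
RECONFIGURE;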
View 6 Replies
View Related
Jan 29, 2007
Summary: We host reports off of one server that are accessed and refreshed 24/7/365. On a daily basis, I receive calls about reports taking several minutes to run, IF they run at all (normally these reports take seconds). There has yet to be an actual "error message" associated with this behavior. When I check the server at these times, I find that the w3wp.exe process is taking up about 4 GB and is causing paging. It seems that once it gets to that point, it will kill itself and all goes back to normal. However, it's a slow death and is impacting business.
We're using MSRS 2005. Hosting the reports is the job of this server--no other applications are hitting it. There is a development SQL db on the box, but not in use at this time. The datasources in most cases are stored procedures housed on another box. The web interface (for end users running the report) is hosted on a third server.
Any thoughts?
View 5 Replies
View Related
Apr 22, 2015
How do I find the top expensive queries on SQL Server 2008 Standard Edition?
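The DMVs work fine on Standard Edition. A minimal sketch ranking cached statements by total CPU; change the ORDER BY to total_logical_reads or total_elapsed_time depending on what "expensive" means here:

-- Top cached queries by total CPU time
SELECT TOP 10
    qs.total_worker_time AS total_cpu,
    qs.total_logical_reads,
    qs.execution_count,
    st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;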
View 9 Replies
View Related
Jun 22, 2015
My table includes detailed records, with the total Rate repeated in each record:
CREATE TABLE Table1
(
Providerid varchar (6) NOT NULL,
Providername char (30) NOT NULL,
Clientid varchar (15) NOT NULL,
[code]....
View 1 Replies
View Related
Nov 23, 2015
Is there any easy way to detect whether SQL Server memory is getting paged out? Any DMV query or tips to confirm it is being paged out?
My SQL Server Version is: Microsoft SQL Server 2008 (SP3) - 10.0.5500.0 (X64)
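On 2008 SP3 there are a couple of signals to check. The errorlog message "A significant part of sql server process memory has been paged out" is the classic confirmation, and the DMVs below expose the low-memory flags and the resource monitor notifications. A sketch:

-- Low-memory flags for the SQL Server process
SELECT physical_memory_in_use_kb, page_fault_count,
       process_physical_memory_low, process_virtual_memory_low
FROM sys.dm_os_process_memory;

-- Ring buffer entries recording memory-pressure notifications from Windows
SELECT record
FROM sys.dm_os_ring_buffers
WHERE ring_buffer_type = 'RING_BUFFER_RESOURCE_MONITOR';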
View 5 Replies
View Related
Oct 26, 2015
I have a table and a specific column inside this table. Using sys.dm_db_index_usage_stats, I was able to determine that the table is being updated by some process (stored procedure / SQL job / etc.), but the problem is, I am not sure which process is doing it.
How would I search our SQL Server 2008 database to find any process that manipulates this table / column? (I only care about inserts / updates and deletes, not SELECTs.)
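Short of running a server-side trace filtered on the table name, one low-tech option is a temporary DML trigger that records who touched the table. A rough sketch; the audit table and trigger names are made up, and dbo.MyTable stands in for the real table:

-- Hypothetical audit table
CREATE TABLE dbo.TableChangeAudit
(
    AuditTime datetime NOT NULL DEFAULT GETDATE(),
    ChangeType varchar(10) NOT NULL,
    LoginName sysname NOT NULL,
    HostName varchar(128) NULL,
    ProgramName varchar(128) NULL
);
GO
CREATE TRIGGER trg_WhoTouchedMyTable
ON dbo.MyTable -- the table being investigated
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Log one row per statement, with the caller's identity
    INSERT dbo.TableChangeAudit (ChangeType, LoginName, HostName, ProgramName)
    SELECT CASE WHEN EXISTS (SELECT * FROM inserted) AND EXISTS (SELECT * FROM deleted) THEN 'UPDATE'
                WHEN EXISTS (SELECT * FROM inserted) THEN 'INSERT'
                ELSE 'DELETE' END,
           SUSER_SNAME(), HOST_NAME(), APP_NAME();
END;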
View 8 Replies
View Related
Sep 7, 2006
Hi,
Does anyone know what size the paging file should be for an 8-processor, 16-core Opteron server with 128 GB of memory installed? The server will be running SQL Server 2005 Enterprise and IIS, on top of Windows Server 2003 R2 x64 Enterprise.
Please also give the reason for that size.
Regards,
dong
View 1 Replies
View Related
Jun 4, 2007
All;
We have a Windows App and Web App that share business objects which points to a single database. When a Windows user logs in, an average of 50 processes are created in the first few seconds and never go away. The details window is blank and they all remain sleeping from that point on.
I have stepped through the code to see if there is anything odd going on but most of the processes are created when validating the number of parameters the stored procedure has or the length of the stored procedure name. This translates to 1000-1500 processes on average.
Is this normal? Will it hurt performance? Is there a way to remove them?
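For what it's worth, sleeping SPIDs that stick around are usually just ADO.NET connection pooling at work; they hold a connection, not a running query. A quick sketch to see where they come from (SQL 2005-era, using the old sysprocesses view):

-- Count sleeping sessions per application and login
SELECT program_name, loginame, COUNT(*) AS session_count
FROM master.dbo.sysprocesses
WHERE status = 'sleeping'
GROUP BY program_name, loginame
ORDER BY session_count DESC;

If the counts grow without bound, that points at connections being opened but never closed or disposed in the shared business objects.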
View 1 Replies
View Related
Oct 29, 2015
We have a massive database with an almost equally massive amount of traffic to and from it.
I've been asked to implement sliding window partitioning with 2 partitions, an active and a passive one. I managed to test this on a very small testbed last month.
I have currently moved a 97k table onto the partition function, leaving me another 26k to go.
I'm using the following stored procedure to implement the sliding window:
CREATE PROCEDURE [dbo].[ManageFactSlidingWindow] (@pFunction nvarchar(max), @pSchema nvarchar(max), @FG nvarchar(max), @moveDays int)
/*****************************************************************************
PROCEDURE NAME: [ManageFactSlidingWindow]
AUTHOR: Arshad Ali
CREATED: 02/24/2013
DESCRIPTION: This stored procedure manages sliding window for the partitioned table
VERSION HISTORY:
DATE EMAIL Company DESCRIPTION
[Code] .....
When I try to move the partition window even by a single day, I get loads of locks.
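For reference, the usual order of operations for a two-partition sliding window is sketched below with hypothetical object names and boundary values. SWITCH and MERGE are metadata-only, but each needs a brief schema-modification (Sch-M) lock on the table, so any long-running reader or writer will block them, and that can look like "loads of locks"; a short lock timeout plus retry is the common defense:

-- 1. Switch the expired partition out to an empty staging table with identical structure
ALTER TABLE dbo.FactTable SWITCH PARTITION 1 TO dbo.FactTable_Staging;
-- 2. Remove the now-empty boundary
ALTER PARTITION FUNCTION pfFact() MERGE RANGE ('20150101');
-- 3. Tell the scheme which filegroup the next partition uses
ALTER PARTITION SCHEME psFact NEXT USED [PRIMARY];
-- 4. Add the new boundary for incoming data
ALTER PARTITION FUNCTION pfFact() SPLIT RANGE ('20150201');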
View 0 Replies
View Related
Apr 22, 2015
I want to forecast disk growth for one year. How do I find the database growth rate?
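If full backups run on a schedule, msdb already holds a size-over-time history that can be trended for the one-year forecast. A minimal sketch:

-- Database size trend from full backup history
SELECT database_name,
       CONVERT(date, backup_start_date) AS backup_date,
       backup_size / 1048576.0 AS backup_size_mb
FROM msdb.dbo.backupset
WHERE type = 'D' -- full backups only
ORDER BY database_name, backup_start_date;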
View 4 Replies
View Related
Oct 21, 2007
Here's a question you'll never quit hearing: is there a convenient way to page through large recordsets in SQL Server 2000?
I'm writing some software which, for all intents and purposes, works like a messageboard: users can create threads, leave replies, and so on.
I have about a half million records in a few of my tables, and some of my queries return 1000s of results. I'd prefer not to return 1000s of records all at once, so I don't want to page my records in code; I'd rather page them in SQL Server. Naturally, I want to page replies. However, I don't know of a convenient way to page records in SQL Server 2000.
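SQL Server 2000 has no ROW_NUMBER(), so the usual choices are nested TOP subqueries or, when a unique ordered key exists, seek-based paging that remembers the last key from the previous page. A sketch of the latter with hypothetical table and column names; it also stays fast on deep pages:

-- Hypothetical values; in practice these come from the application
DECLARE @ThreadId int, @LastReplyIdSeen int
SELECT @ThreadId = 42, @LastReplyIdSeen = 1000

-- Next page of 20 replies after the last one shown (ReplyId assumed unique and indexed)
SELECT TOP 20 ReplyId, ThreadId, Body, CreatedAt
FROM Replies
WHERE ThreadId = @ThreadId
  AND ReplyId > @LastReplyIdSeen
ORDER BY ReplyId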
View 1 Replies
View Related
Sep 4, 2007
Hello,
I have an SSIS package that basically inserts a large amount of data into a SQL Server table. The table contains sixty-five columns, and a single load of data can contain two million records.
The 'loads' are split up into several 'daily' flat files. The package uses a ForEachFile loop to process each of the files. As each file is processed, the data from the files is loaded into a SQL Server table (destination).
Apparently, as the package is running, tempdb begins to consume a lot of disk space. The data file for tempdb on this particular server is configured to grow in 50 MB increments with unrestricted file growth. During the last run of the package, the data file grew to 17 GB. I ran the following and got the data file size down to 50 MB:
USE TempDb
GO
DBCC SHRINKFILE(tempdev, 1)
Should I consider incorporating this code as part of the package, or is there something else I should consider to configure the SSIS package so that I don't run into space problems with TempDB?
Thank you for your help!
cdun2
View 2 Replies
View Related
May 29, 2006
Let's say we have more than 100,000 rows in Table1, and we want to view 10 rows at a time; by pressing a NEXT button we will see the next 10 rows.
There are 2 buttons: NEXT and PREVIOUS.
So can anyone tell me how to do that in SQL 2005, and what this is properly called?
I have found code that uses ROW_NUMBER in order to view results between 2 numbers,
for example rows between 10 and 50,
but it is not what I want, so please, I need some help. Thank you.
By Uncle Sam
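For the record, this pattern is usually just called paging (or server-side paging), and NEXT/PREVIOUS buttons simply move a page number that the ROW_NUMBER range is computed from, so the "between 2 numbers" code is closer to the goal than it looks. A sketch for SQL 2005; Table1 is from the post, SomeKeyColumn is a placeholder:

DECLARE @PageNumber int, @PageSize int;
SET @PageNumber = 1; -- NEXT sets @PageNumber + 1, PREVIOUS sets @PageNumber - 1
SET @PageSize = 10;

WITH Numbered AS
(
    SELECT *, ROW_NUMBER() OVER (ORDER BY SomeKeyColumn) AS RowNum -- placeholder sort key
    FROM Table1
)
SELECT *
FROM Numbered
WHERE RowNum BETWEEN (@PageNumber - 1) * @PageSize + 1 AND @PageNumber * @PageSize
ORDER BY RowNum;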
View 10 Replies
View Related
Apr 10, 2008
I'm looking at this article http://www.dotnetjunkies.com/Article/EA868776-D71E-448A-BC23-B64B871F967F.dcik
and it seems like they are selecting the entire Customers table into the temp table, correct?
View 2 Replies
View Related
Jul 18, 2006
I gave up on the Script Tasks and decided to use custom tasks instead. Problem again. My code opens a 400 MB file and reads it line by line using StreamReader. Each line is approximately the same length. For each line, there is some processing, and then the line is written into another file using StreamWriter. I am watching the DtsDebugHost process with Task Manager open, and here is what happens:
Initially it's all good. Then when it has read through the first 150 MB+ of the input file, the memory usage of DtsDebugHost shoots up dramatically, to about 1 GB (both virtual and physical memory). Then the task fails with an OutOfMemoryException. I thought the problem was with my code, but it still happens even if I only read a line from one file and write it to another, without any processing!
When I invoke the same code from an Execute Process task, it's all good; no problems whatsoever.
Any ideas?
View 1 Replies
View Related
Jul 10, 2014
I have a database of employees and pay rates.
One employee has two pay rates for two different jobs:
Job A: Rate $10.00
Job B: Rate $15.00
I will be updating their record so that they only have one job going forward, Job C. I need Job C to equal their HIGHER of the two existing jobs.
I have a select statement to find what the higher rate is. However, I am not sure how I can apply the rate to be the new job's rate. Here's what I used to find the highest rate for one single person:
SELECT max(rate), employeeID
FROM JobsTable
inner join IDTable
on JobsID2 = IDID2
WHERE JobCode in ('JOBA','JOBB')
and EmployeeID = '12345'
GROUP BY EmployeeID
(this returns the employee ID from one table, and the highest rate from Jobs A and B from another table)
I can get it to update to add JobC -- how can I get it to assign the result from the above query to be the rate used for Job C?
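One way, reusing the tables from the post: wrap the max-rate query in a derived table and join it into an UPDATE. A sketch, assuming the Job C rows already exist in JobsTable and that rate and JobCode live there (column placement is guessed from the posted query):

UPDATE j
SET j.rate = mx.MaxRate
FROM JobsTable AS j
INNER JOIN IDTable AS i ON j.JobsID2 = i.IDID2
INNER JOIN
(
    -- The poster's max-rate query, generalized to all employees
    SELECT EmployeeID, MAX(rate) AS MaxRate
    FROM JobsTable
    INNER JOIN IDTable ON JobsID2 = IDID2
    WHERE JobCode IN ('JOBA', 'JOBB')
    GROUP BY EmployeeID
) AS mx ON i.EmployeeID = mx.EmployeeID
WHERE j.JobCode = 'JOBC';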
View 1 Replies
View Related
Oct 3, 2001
In an intranet application using Windows NT, Apache, Tomcat, and SQL Server, the memory used by SQL Server is drastically increasing and finally the system crashes. Nearly 40 people are accessing the system. The hardware configuration is a P2 processor with 393 MB RAM and 2 GB virtual memory. SQL Server, the web server, and the servlet engine are running on the same machine.
Within three hours, SQL Server occupies 200 MB of memory, system performance comes down, and finally the system stops the Tomcat servlet engine.
Anybody have any idea on this? We have nearly 1500 JSP pages, 200 bean files, and 300 tables in SQL Server.
View 2 Replies
View Related
Jun 13, 2001
When I try open a table in Enterprise Manager, I am getting the following error:
The instruction at "0x418561c4" referenced memory at "0x00000034". The memory could not be "read".
Any ideas what this is?
Paul
View 1 Replies
View Related
Nov 1, 2015
I have 4 instances on a box. How can I find which instance is causing memory pressure? Also, how do I find which instance is consuming the most memory?
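DMVs are per-instance, so one approach is to run the same snapshot on each of the 4 instances and compare. A sketch for 2008 and later:

-- Run on each instance; compare in-use memory and low-memory signals
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
       memory_utilization_percentage,
       process_physical_memory_low
FROM sys.dm_os_process_memory;

Outside of SQL, Perfmon gives the same comparison per named instance via the MSSQL$InstanceName:Memory Manager counters (e.g., Total Server Memory (KB)).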
View 3 Replies
View Related
Jul 4, 2007
Hi,
We are running SQL Server 2005 Ent Edition with SP2 on a Windows 2003 Ent. Server SP2 with Intel E6600 Dual core CPU and 4GB of RAM. We have an C# application which perform a large number of calculation that run in a loop. The application first load transactions that needs to be updated and then goes to each one of the rows, query another table get some values and update the transaction.
I have set a limit of 2 GB of RAM for SQL Server, and when I run the application, it performs 5 record updates (the process described above) per second. After roughly 10,000 records, the application slows down to about 1 record per second. I have tried to examine the Activity Monitor, however I can't find anything that might indicate what's causing this.
I have read that there are some known issues with Hyper-Threaded CPUs, however since my CPU is dual-core, I do not know if the issue applies to those CPUs too, and I have no way to disable one core in the BIOS.
The only thing that I have noticed is that if I change the Max Degree of Parallelism when the server slows down (i.e., from 0 to 1 and then back to 0), the server speeds up for another 10,000 record updates and then slows down. Does anyone have an idea of what's causing it? What does the property change do that makes the server speed up again?
If there is no solution for this problem, does anyone know if there is a stored procedure or anything else that can be used programmatically to speed up the server when it slows down? (This is not the optimal solution, however I will use it as a workaround.)
Any advice will be greatly appreciated.
Thanks,
Joe
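One detail that fits the symptoms: changing a server option such as 'max degree of parallelism' with RECONFIGURE flushes the procedure cache, so the toggle is effectively buying a fresh set of query plans. The programmatic equivalent of just that side effect is below; it is a blunt workaround (every plan recompiles afterwards), and the underlying cause is more likely a cached plan that degrades as the loop progresses:

DBCC FREEPROCCACHE; -- clears the plan cache, the same side effect as toggling the MAXDOP setting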
View 3 Replies
View Related
May 6, 2007
As I said above, how do I put sorting + paging in a stored procedure?
My database has approximately 50,000 records, and obviously I can't SELECT all of them and let GridView / DataView do the work, right? Or else it would use too much resources per request. So I intend to do sorting + paging at the database level. It's going to be either hardcoded SQL or stored procedures.
If it's hardcoded SQL, I can just change the SQL statement each time the parameters (startRecord, maxRecords, sortColumns) change. But I don't know what to do in a stored procedure to get the same result. I know how to implement paging in a stored procedure (ROW_NUMBER), but I don't know how to change the ORDER BY clause at runtime in the stored procedure.
Thanks in advance.
PS. In case "ask_Scotty", who replied in my previous post, http://forums.asp.net/thread/1696818.aspx, is reading this, please look at my reply to your answer in the last post. Thank you.
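One common pattern for a runtime ORDER BY inside a stored procedure is one CASE expression per sortable column inside the ROW_NUMBER window (separate CASEs so differing data types don't collide). A sketch with hypothetical table and column names:

CREATE PROCEDURE dbo.GetPage
    @StartRecord int,
    @MaxRecords int,
    @SortColumn varchar(20)
AS
BEGIN
    SET NOCOUNT ON;
    WITH Numbered AS
    (
        SELECT *,
               ROW_NUMBER() OVER (ORDER BY
                   CASE WHEN @SortColumn = 'Name' THEN CustomerName END,
                   CASE WHEN @SortColumn = 'City' THEN City END,
                   CASE WHEN @SortColumn = 'Created' THEN CreatedDate END) AS RowNum
        FROM dbo.Customers -- hypothetical table
    )
    SELECT *
    FROM Numbered
    WHERE RowNum BETWEEN @StartRecord AND @StartRecord + @MaxRecords - 1
    ORDER BY RowNum;
END;

The alternative is building the statement with dynamic SQL and sp_executesql, which can produce better plans per sort column at the cost of more moving parts.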
View 3 Replies
View Related
Jun 14, 2007
Hi, can anyone tell me how I can get implement sorting and paging using a recursive query?
I have a created a stored procedure (bit like the simple example below), but would like to add in sorting and paging using order by and row_number().
DECLARE @CategoryID int
SET @CategoryID = 12;

WITH CTE_Example (CategoryID, CategoryName, ParentID, Depth, RowNum)
AS
(
    SELECT CategoryID, CategoryName, ParentID, 0 AS Depth,
           Row_Number() OVER (ORDER BY CategoryName) AS RowNum
    FROM Categories
    WHERE CategoryID = @CategoryID
    UNION ALL
    SELECT Categories.CategoryID, Categories.CategoryName, Categories.ParentID,
           CTE_Example.Depth + 1 AS Depth,
           Row_Number() OVER (ORDER BY Categories.CategoryName) AS RowNum
    FROM Categories
    JOIN CTE_Example ON Categories.ParentID = CTE_Example.CategoryID
)
SELECT * FROM CTE_Example
WHERE RowNum BETWEEN @Start AND @End
ORDER BY RowNum
I think the problem comes down to the UNION; I'd appreciate it if someone can help.
Many thanks,
Matt
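Right: the two Row_Number() calls each number rows within their own branch of the UNION, so the result never carries one continuous sequence. The usual fix is to number once, in an outer query over the CTE. A sketch (with @Start/@End declared, since the snippet above assumes them):

DECLARE @CategoryID int, @Start int, @End int;
SELECT @CategoryID = 12, @Start = 1, @End = 10;

WITH CTE_Example (CategoryID, CategoryName, ParentID, Depth)
AS
(
    SELECT CategoryID, CategoryName, ParentID, 0 AS Depth
    FROM Categories
    WHERE CategoryID = @CategoryID
    UNION ALL
    SELECT c.CategoryID, c.CategoryName, c.ParentID, e.Depth + 1
    FROM Categories AS c
    JOIN CTE_Example AS e ON c.ParentID = e.CategoryID
),
Numbered AS
(
    -- Number the whole result set once, after the recursion is done
    SELECT *, ROW_NUMBER() OVER (ORDER BY CategoryName) AS RowNum
    FROM CTE_Example
)
SELECT * FROM Numbered
WHERE RowNum BETWEEN @Start AND @End
ORDER BY RowNum;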
View 1 Replies
View Related
May 21, 2007
This morning msmdsrv.exe looked like the following in Task Manager:
Memory Usage: 600,000 K
VM Size: 2,500,000 K
If I understand correctly, that means that a good deal of SSAS memory has been swapped to disk by the OS. Is there any way to prevent this? (I have read about using the Lock Pages In Memory privilege at http://msdn2.microsoft.com/en-us/library/ms179301.aspx which lets SQL prevent the OS from paging sqlservr.exe memory. I don't suppose there's an equivalent SSAS setting.)
In particular, the symptom I'm seeing is that after we finish processing, all that paged memory has to be loaded back into physical memory before the transaction can be committed. Committing the transaction is usually very quick. But when most of the SSAS process memory is paged out, it takes quite a while, during which I can see the Memory Usage number in Task Manager growing. (I don't think it's a blocked transaction commit.)
View 2 Replies
View Related
Jun 30, 2006
Are there any *negative* consequences to SQL Server 2000 Standard paging to disk more often if one reduces the amount of available RAM from 2.0 GB to 1.5 GB to give the OS and other apps enough RAM, other than things possibly being "slower" and whatever wear and tear could happen to the disk drives?
Or is it better to add more RAM per se to bring the server up to 4 GB (Win2k3 Svr SP1)? (It has a 4 GB pagefile.)
Is any information available about judging how much RAM the OS (Windows 2003 Server) should have available? Does IIS require a lot of RAM?
Thank you, Tom
View 1 Replies
View Related
Mar 28, 2007
Currently, we have the following setup...
SQL 2005 Standard, 8 GB RAM, boot.ini switches for /PAE and /3GB, AWE enabled, and the min/max memory settings for a single instance set to 1024 and 7168 respectively. Performance-wise the system is a bit slower than before (with SQL 2000) and there seems to be a lot of paging. The SQL service is also running under LocalSystem. In this case, do I need to grant the local SYSTEM account the "Lock pages in memory" right? I have a feeling there is a misconfiguration somewhere. Can someone point me in the right direction? Thanks.
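One thing worth verifying on the memory side: for a 32-bit instance, 'awe enabled' must actually be set (and the service restarted) before the RAM exposed by /PAE gets used. A quick sketch to check and, if needed, enable it:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled'; -- a run_value of 1 means AWE is in effect
-- EXEC sp_configure 'awe enabled', 1; RECONFIGURE; -- enable, then restart the service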
View 4 Replies
View Related
Apr 24, 2015
Background:
* SQL Server 2008 R2
* Database was created from a third party product. The product writes to the 3 tables that I need to make changes to 24/7 and downtime is not an option. All changes must be done live.
* Database overall size is ~200 GB
* The 3 tables I must update make up ~190 GB of that space.
* Tables have no primary key or ID columns. Therefore, the data is highly fragmented.
* Of the ~190 GB of space allocated for the tables, there is roughly 70 GB of actual data.
* Rows of the table are not guaranteed to be unique. In fact, on one of the tables, tests were ran with a small sample of data and duplicates were very much evident.
What I'm trying to accomplish here is to get an ID column added to the 3 tables and set that ID field as the primary key. Doing so will force the data to become much less fragmented than it is currently and with purging and new inserts, eventually fragmentation will be nearly non-existent.
Problem:
Making table changes on tables this large while data is constantly being added poses many risks and can cause data loss. This was tried on a smaller table than these three and the entire table was lost in the process. Restore from backup was needed to get back to most recent log backup point.
Original Solution:
My original plan was to create a backup of each table and run the script below to migrate the majority of the data temporarily into the new table. I could then update the original table (which now would contain much less data) and then migrate the data back.
CREATE TABLE #temp
(
MsgDate varchar(10)
,MsgTime varchar(8)
,MsgPriority varchar(30)
,MsgHostname varchar(255)
[Code] ....
Original Solution Problem:
The problem with the solution above is that it calls the DELETE function on the original table using the values from the temporary table. When there are duplicate rows, which have not all been inserted into the backup table yet, they will all be removed from the original table because there is nothing unique to separate them out. In my testing, I had 10,000 rows in the original table and ended up with 9,959 rows in the backup table.
Question 1: Is my approach to making these table changes reasonable?
Question 2a: If so, how can I make sure I don't lose data as part of this temporary migration of the data to my backup tables?
Question 2b: If not, what would be a better approach that isn't going to cause disruption to the application that INSERTs data 24/7 and won't have any risk of data loss?
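On question 2a: the duplicate-row loss comes from doing the copy and the delete as two separate statements. A single DELETE with OUTPUT ... INTO moves rows atomically, so each physical row (duplicates included) lands in the backup table exactly once. A sketch in batches; the table names are stand-ins, and the backup table must have the same column layout as the original:

WHILE 1 = 1
BEGIN
    -- Delete a batch and capture exactly those rows in the same statement
    DELETE TOP (10000) FROM dbo.OriginalTable
    OUTPUT deleted.* INTO dbo.BackupTable;
    -- (add a WHERE clause, e.g. on MsgDate, to leave the most recent rows in place)

    IF @@ROWCOUNT = 0 BREAK; -- nothing left to move
END;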
View 9 Replies
View Related
Jul 14, 2015
How many records does a table need to have before it is considered large?
We are getting more deadlocks. We are using the default isolation level. Read and insert statements are blocking each other and causing deadlocks.
I am thinking that purging might reduce the deadlocks.
The table has 15 million records. Is this table considered a large table or not in OLTP systems?
In general, how many records do we need before we consider a table large?
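There's no magic row count; 15 million rows is not large for a well-indexed OLTP table, and deadlocks are driven more by access patterns than by size. For reader/writer deadlocks under the default READ COMMITTED level, one commonly used mitigation is read committed snapshot isolation, so readers see row versions instead of taking shared locks. A sketch (test first; it shifts load onto the tempdb version store, and switching it on needs exclusive use of the database):

ALTER DATABASE YourDatabase SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;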
View 1 Replies
View Related
Sep 19, 2015
I've been having some trouble getting a single-column "varchar(5)" field to reliably use an index seek instead of a table scan. The production table in this case contains 25 million rows. As impressive as it is to scan 25 million rows in 35 seconds, the query should run much faster.
Here's a partial table description:
CREATE TABLE [dbo].[Summaries_MO]
(
[SummaryId] [int] IDENTITY(1,1) NOT NULL,
[zipcode] [char](5) COLLATE Latin1_General_100_BIN2 NOT NULL,
[Golf] [bit] NULL,
[Homeowner] [bit] NULL,
[Code] .....
Typically, this table is accessed with a query that includes:
SELECT ...
FROM SummaryTable
WHERE ixZIP IN (SELECT ZipCode FROM @ZipCodesForMO)
This query insists on using a table scan. I've tried WITH (FORCESEEK) for example, but that just makes the query fail.
As I've investigated this issue I also tried:
SELECT * FROM Summaries WHERE ZipCode IN ('xxxxx', 'xxxxx', 'xxxxx')
When I run this query with 64 or fewer (actual, valid) ZIP codes, the query uses an index seek. But when I give it 65 or more ZIP codes, it uses a table scan.
To summarize, the production query always uses a table scan, and when I specify 65 or more ZIP codes the query also uses a table scan. I'm wondering if the data type of the indexed column (Latin1_General_100_BIN2) is somehow the problem. I'll likely try converting the ZIP codes to an integer to see what happens.
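The 64-to-65 flip is just the optimizer's costing tipping over, not a hard rule. With a table variable the engine also guesses its cardinality, which pushes toward a scan, and a type or collation mismatch on the ZIP column can block a seek outright. Two things worth trying, sketched with the table from the post:

-- Declare the lookup column to match the indexed column exactly
DECLARE @ZipCodesForMO TABLE
(
    ZipCode char(5) COLLATE Latin1_General_100_BIN2 PRIMARY KEY
);

SELECT s.SummaryId, s.zipcode
FROM dbo.Summaries_MO AS s
WHERE s.zipcode IN (SELECT z.ZipCode FROM @ZipCodesForMO AS z)
OPTION (RECOMPILE); -- lets the optimizer see the variable's real row count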
View 9 Replies
View Related
Sep 10, 2007
I have a 2 GHz CPU with 1 GB of RAM. I occasionally see very slow (long) queries against a local SQL Server 2005 Express (SP2) database. The issue occurs against different SQL queries, but all queries are rather basic SELECT statements. Perfmon shows that the SQL Server counter for "Memory Grant Queue Wait Avg MS" gets extremely high (25000+ ms). Perfmon also shows that paging is not occurring, and the system is not under unusual stress. The problem is not reproducible with MSDE.
Has anyone seen this issue, or have any recommendations for a next course of action?
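With 1 GB of RAM, queries are very likely queuing for workspace memory (sorts and hashes) even though there's no paging. sys.dm_exec_query_memory_grants, available in 2005, shows who is waiting and for how much. A sketch:

-- Statements waiting for (or holding) memory grants
SELECT mg.session_id, mg.requested_memory_kb, mg.granted_memory_kb,
       mg.wait_time_ms, st.text AS statement_text
FROM sys.dm_exec_query_memory_grants AS mg
CROSS APPLY sys.dm_exec_sql_text(mg.sql_handle) AS st
ORDER BY mg.requested_memory_kb DESC;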
View 1 Replies
View Related
Feb 5, 2015
Currently our database size is around 350 GB. It will grow up to 1.5 TB.
We have the
auto create statistics option: True,
auto update statistics option: True,
auto update statistics asynchronously option: False
at the database level.
We have a weekly update statistics job that runs for a very long time. It is created through a maintenance plan using the full scan option.
Previously they tested with sampling instead of a full scan, but running with sampling affected the queries.
Is there an option to avoid the long job duration?
If we didn't run the statistics manually, what would happen? How do you maintain statistics with large databases?
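Skipping the manual job entirely would leave things to auto-update, which by default only fires after roughly 20% of a table's rows change, usually far too coarse at this size. A common middle ground is sampled updates for most tables, reserving FULLSCAN for the specific statistics where sampling demonstrably produced bad plans. A per-table sketch with hypothetical names:

-- Sampled update for the bulk of a large table's statistics
UPDATE STATISTICS dbo.BigTable WITH SAMPLE 10 PERCENT;
-- Full scan only for the one statistic that sampling hurt
UPDATE STATISTICS dbo.BigTable (Stat_CriticalColumn) WITH FULLSCAN;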
View 9 Replies
View Related