Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16GB or 32GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32GB of physical memory and make the pagefile 1.5 x 32GB, I have a 48GB pagefile. 10% of this is 4.8GB, which I would hope never to see consumed.

Any thoughts?

Thanks, Dave

View 11 Replies



Pagefile And Memory Advice??

Dec 19, 2006

Hello, I've been reading about this topic, and I've gotten myself more confused, not less. We have a single-processor license for SQL Server 2005 Standard (Xeon 2.8 GHz) with 4 GB RAM on Windows Server 2003 SP1 Standard. I turned on the /3GB switch in boot.ini but not PAE or AWE. Would it be good to have either one?
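
For reference, /PAE goes right next to /3GB on the boot.ini ARC line, while AWE is turned on per instance from T-SQL. A minimal sketch of the AWE side, assuming only SQL Server 2005 (the option also requires the service account to hold the Lock Pages in Memory privilege, and a restart of the instance):

Code Snippet

-- enable AWE for this instance (takes effect after a restart)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'awe enabled', 1
RECONFIGURE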



View 3 Replies View Related

Large Amounts Of Text

Nov 20, 2003

I was wondering what the best solution is for storing large amounts of text in a SQL 2000 field. This text will be entered into a multiline textbox.

E.g., what data type should I use? Should I use BLOBs?
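
In SQL Server 2000, the usual answer for free-form text that can exceed 8,000 characters is the TEXT data type (NTEXT for Unicode); BLOBs (the IMAGE type) are for binary data. A minimal sketch, assuming a hypothetical Notes table:

Code Snippet

-- TEXT holds up to 2^31 - 1 characters; NTEXT is the Unicode equivalent
CREATE TABLE dbo.Notes (
    NoteId INT IDENTITY(1,1) PRIMARY KEY,
    Body NTEXT NULL
)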

Thanks,
Trey

View 4 Replies View Related

Storing Large Amounts Of Data

Mar 31, 2008

Hi,

I was wondering if anyone could help me. I need to store large amounts of data in my database; at present I have the column set to nvarchar(8000). I've looked around and noticed you can use text, which stores up to about 2 billion characters but is slow in displaying the information.
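
If the server is SQL Server 2005 or later, the MAX types are generally the better option: they hold up to 2^31 - 1 bytes like text, but behave like ordinary varchar in queries. A minimal sketch, assuming hypothetical table and column names:

Code Snippet

-- NVARCHAR(MAX) replaces NTEXT from SQL Server 2005 onwards
ALTER TABLE dbo.Articles ALTER COLUMN Body NVARCHAR(MAX)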

Any ideas or pointers in the right direction would be great.

Thanks

View 6 Replies View Related

Inserting Large Amounts Of Data

Sep 8, 2005

Does anyone have ideas on the best way to move large amounts of data between tables? I am doing several simple insert/select statements from a staging table to several holding tables, but because of the volume it is taking an extraordinary amount of time. I considered using cursors but have read that they may not be the best thing for this situation. Any thoughts?
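
One common alternative to a single giant insert/select is to batch by key range, so each statement commits a bounded chunk and the transaction log stays small. A minimal sketch, assuming hypothetical Staging/Holding tables keyed by an integer StagingId:

Code Snippet

DECLARE @BatchSize INT, @CurrentId INT, @MaxId INT
SET @BatchSize = 50000
SET @CurrentId = 0
SELECT @MaxId = MAX(StagingId) FROM dbo.Staging

WHILE @CurrentId < @MaxId
BEGIN
    -- copy one bounded slice per statement
    INSERT INTO dbo.Holding (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.Staging
    WHERE StagingId > @CurrentId AND StagingId <= @CurrentId + @BatchSize

    SET @CurrentId = @CurrentId + @BatchSize
END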

View 4 Replies View Related

Dealing With Large Amounts Of Data

Jul 20, 2005

We are looking to store a large amount of user data that will be changed and accessed daily by a large number of people. We expect around 6-8 million subscribers to our service, with each record being approximately 2000-2500 bytes. The system needs to be running 24/7 and therefore cannot be shut down. What is the best way to implement this? We were thinking of setting up a cluster of servers to hold the information and another cluster to back up the information. Is this practical?

Also, what software is available out there that can distribute query calls across different servers and manage large amounts of query requests?

Thank you in advance.

Ben

View 10 Replies View Related

Problem With Large Data Amounts

Jun 24, 2005

I have a dataset with 300,000 records and I'm getting the following error with MS Reporting Services: "An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.

View 11 Replies View Related

Graphing Large Amounts Of Data

May 8, 2007

Greetings



I need to be able to graph roughly 150 employees per supervisor and their monthly cell phone usage in minutes. I understand that I will need to group this, say one graph for every ten employees, so it doesn't look messy and cluttered. I have read some threads here but they don't seem to work for me.



So again, each supervisor has 100+ subordinates and I need to graph their phone usage by month.



Thoughts?

km

View 2 Replies View Related

Adding Large Amounts Of Text

Mar 20, 2007

This may seem like a simple question, but I have a report/lease agreement I need to put together and wanted to know the simplest way to add large amounts of text. Basically it's all the legal stuff most leases include, amounting to some 14 pages.

Should this be just one long string, or does SSRS have another way to format this?

thanks as always

KM

View 2 Replies View Related

Fastest Way To Do Large Amounts Of Updates

Mar 3, 2006

I was wondering what the fastest way is to UPDATE lots of records. I heard the fastest way to perform lots of inserts is to use SqlCeResultSet. Would this also be the fastest way to update already existing records? If so, is this the fastest way to do it:

1. Create a SqlCeCommand object.
2. Set the CommandText to select the data I want to update
3. Call the command object's ExecuteResultSet method to create a SqlCeResultSet object
4. Call the result set object's Read method to advance to the next record
5. Use the result set object to update the values using the SqlCeResultSet.SetValue method and the Update method.
6. Repeat steps 4 and 5

Also, I was wondering: do I call the SqlCeResultSet.Update method once per row, or just once? And would it be possible and faster to wrap all that in a transaction?

Would parameterized updates be faster?
Any help will be appreciated.

View 3 Replies View Related

Mirroring Large Amounts Of Databases

Mar 6, 2008

I have been looking into mirroring a large number of small databases, approximately 150.



As I understand it, this won't be feasible because of the way mirroring threading works: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=441900&SiteID=1



As I understand it, for every database being mirrored, SQL will ping the mirror every second, causing a network bottleneck?

Also, will the number of threads generated for each mirrored database cause a bottleneck as well?



At the moment our database servers are under very little pressure and, as an estimate, use about 10% of the resources allocated to them (CPU utilization, memory, disk I/O and network). Our server hardware is dual quad-core Xeons with 4-8 GB of memory and a variety of 10k SCSI RAID configurations (RAID 5 or RAID 1+0), running SQL 2005 32-bit.



I've done some calculations on the log file generation rate compared to network bandwidth; there is more than enough network bandwidth.



Has anybody had any luck in mirroring many small databases?



My concern is how much traffic is caused by the pinging of the mirror for each database.

How many threads will the mirroring create, and what is the maximum number of threads SQL can handle?

How much memory will be consumed by each one of these mirroring threads?

View 1 Replies View Related

Problem Inserting Large Amounts Of Text

Aug 12, 2005

I am running into a problem inserting large amounts of text into my table. Everything works well when I test with a few simple words, but when I try a test with a larger amount of text (i.e. 35,000 characters), the appropriate field is left blank. The insert still executes (all the other fields receive their data), but the "Description" field is blank. I have tried this with both "text" and "ntext" data types. I am using a stored procedure with input parameters. As I mentioned, the query goes off flawlessly with small amounts of data (e.g. "Hi there!") but not with the larger amount. I checked, and the ntext field claims to accept 1,073,741,823 bytes of data. Is there some other thing I should consider with large amounts of text?
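
One thing worth checking is the parameter declaration rather than the column: if the stored procedure declares the input as varchar or nvarchar, SQL Server 2000 caps it at 8,000 bytes (4,000 characters for nvarchar) before the value ever reaches the table. A minimal sketch, assuming hypothetical table and procedure names - the parameter itself must be ntext:

Code Snippet

-- an nvarchar parameter would silently cap input at 4,000 characters;
-- text/ntext are allowed as read-only input parameters
CREATE PROCEDURE dbo.InsertItem
    @Title NVARCHAR(200),
    @Description NTEXT
AS
INSERT INTO dbo.Items (Title, Description)
VALUES (@Title, @Description)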

View 6 Replies View Related

Sqlserver 2000- Large Amounts Of Data 1.2 TB

Apr 20, 2001

p.s. my email was incorrect in the last mail.
Hi all,
Is there a SQL 2K thread? I am interested in finding out the largest SQL Server database people have worked with.
We have a 1.2 terabyte DB with about 150-200 million new rows being processed every day. I would like to share some thoughts on this with other people who are working with this much data, and hear what they are doing with it.

bhala
----------------------------------------
Please check us out at: http://www.bivision.org/bivision

View 2 Replies View Related

Migrating Large Amounts Of Data From SQL Server 7.0 To 2000

Jul 20, 2005

I'm in the process of migrating a lot of data (millions of rows, 4GB+ of data) from an older SQL Server 7.0 database to a new SQL Server 2000 machine.

Time is not of the essence; my main concern during the migration is that when I copy in the new data, the new database isn't paralyzed by the amount of bulk copying being done. For this reason, I'm splitting the data into one-month chunks (the data is all timestamped and goes back about 3 years), exporting as CSV, compressing the files, and then importing them on the target server. The reason I'm using CSV is that we may want to also copy this data to other non-SQL Server systems later, and CSV is pretty universal. I'm also copying in this format because the target server is remotely hosted and is not accessible by any method except FTP and Remote Desktop -- no database-to-database copying allowed for security reasons.

My questions:

1) Given all of this, what would be the least intrusive way to copy over all this data? The target server has to remain running and be relatively uninterrupted. One of the issues that goes hand-in-hand with this is indexes: should I copy over all the data first and then create indexes, or allow SQL Server to rebuild indexes as I go?

2) Another option is to make a SQL Server backup of the database from the old server, upload it, mount it, and then copy over the data. I'm worried that this would slow operations down to a crawl, though, which is why I'm taking the piecemeal approach.

Comments, suggestions, raw fish?
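
For the import side, BULK INSERT with a moderate BATCHSIZE is one low-impact option: each batch commits separately, which keeps log growth bounded and limits how long the live server is tied up. A minimal sketch, assuming hypothetical file and table names:

Code Snippet

-- BATCHSIZE commits every 50,000 rows, so a failure only rolls back one batch;
-- loading into a heap first and creating indexes afterwards is usually faster
BULK INSERT dbo.History
FROM 'C:\import\history_2001_01.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 50000
)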

View 2 Replies View Related

SSIS Using MASSIVE Amounts Of Memory

Feb 8, 2008

Hi,

I have a series of SSIS packages, all of which are ultimately executed by a parent package.

I'm consistently getting "OutOfMemory" errors when working with the packages. Closing Visual Studio and re-opening the package(s) solves this temporarily, but only briefly: the OutOfMemory error recurs quickly after re-opening, often after doing nothing more than altering a variable's default value and attempting to save the package.

The average size of the packages in question (.dtsx files) is around 7,000 KB, with the largest being 12,500 KB. The total size of all the solution's packages is ~75,000 KB.

The Processes tab in Task Manager shows a Mem Usage counter for devenv.exe *32 of around 20,000 KB when Visual Studio is first opened. However, when a single ~6,000 KB dtsx file is opened, this counter jumps to over 300,000 KB, and when the entire solution is opened (when the parent package is executed), the Mem Usage counter for devenv.exe *32 is a massive 800,000+ KB!

Is this normal SSIS behaviour or do I have a major problem? Any tips or suggestions as to how to resolve this issue would be gratefully received.

FYI, "SELECT @@VERSION" gives me "Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2) "

My Server is Windows Server 2003 R2 Enterprise x64 SP2 with 8GB of RAM.

Thanks in advance.

Leigh.

View 7 Replies View Related

Best Way To Insert Large Amounts Of Data From A Webform To SQL Server 2005

Oct 21, 2007

Hi

I have a VB.net web page which generates a datatable of values (3 columns and on average about 1000-3000 rows).

What is the best way to get this datatable into SQL Server? I can create a table on SQL Server no problem, but I've found that simply looping through the datatable and doing 1000-3000 insert statements is slow (a few seconds). I'd like to make this as streamlined as possible, so I was wondering whether there is a native way to insert all records in a batch via ADO.net or something.

Any ideas?

Thanks

Ed

View 1 Replies View Related

SQL Server Storing Large Amounts Of Data In Multiple Tables

Jul 20, 2005

Hello,

Currently we have a database, and it is our desire for it to be able to store millions of records. The data in the table can be divided up by client, and it stores nothing but about 7 integers.

| table |
| id | clientId | int1 | int2 | int3 | ... |

Right now, our benchmarks indicate a drastic increase in performance if we divide the data into different tables, for example table_clientA, table_clientB, table_clientC, despite the fact that the tables contain the exact same columns. This, however, does not seem very clean or elegant to me, and rather illogical, since a database exists as a single file on the hard drive.

| table_clientA |
| id | clientId | int1 | int2 | int3 | ... |

| table_clientB |
| id | clientId | int1 | int2 | int3 | ... |

| table_clientC |
| id | clientId | int1 | int2 | int3 | ... |

Is there any way to duplicate this increase in database performance gained by splitting the table, perhaps by using a certain type of index?

Thanks,

Jeff Brubaker
Software Developer
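
In SQL Server 2000, one way to keep the per-client benchmark win without scattering queries across differently named tables is a partitioned view: give each member table a CHECK constraint on clientId, and the optimizer will only touch the relevant partition. A minimal sketch over the hypothetical tables above:

Code Snippet

-- each member table needs a CHECK constraint, e.g.
-- ALTER TABLE dbo.table_clientA ADD CONSTRAINT ck_clientA CHECK (clientId = 1)
CREATE VIEW dbo.AllClients
AS
SELECT id, clientId, int1, int2, int3 FROM dbo.table_clientA
UNION ALL
SELECT id, clientId, int1, int2, int3 FROM dbo.table_clientB
UNION ALL
SELECT id, clientId, int1, int2, int3 FROM dbo.table_clientC

A plain clustered index leading on clientId in a single table often gets much of the same benefit, since each client's rows end up physically clustered together.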

View 4 Replies View Related

Memory Sizing For A SQL 2000 Cluster - Windows 2003 Enterprise SP1

Oct 4, 2007



I am looking for some recommendations for memory sizing and options for a SQL 2000 Cluster. This is a two node cluster built on Windows 2003 ENT SP1 (x86). Both the nodes have the following hardware:

- 4 x Dual Core AMD Processors
- 16 GB Memory
- EMC Shared Disk

We are running six SQL 2000 instances and don't expect any of these instances to use more than 1.7 GB of memory. All these instances are going to support BizTalk 2004 databases. I already have /PAE enabled on the nodes. I am looking for answers to the following:

- Do I need to enable AWE on all the instances, even if each instance stays under 1.7 GB? Currently we don't have it enabled, and we have seen some issues regarding excessive paging even when there is physical memory available. The DBAs think that we don't need to enable AWE. I am a bit confused on this one. (See the sketch after this list.)

- We normally run 3 instances on each node and would like to size the cluster in such a way that it can take all six instances in case of a node failure.

Any input will be highly appreciated.
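
Whether or not AWE ends up enabled, capping each instance makes the six-instances-on-one-node failover case predictable. A minimal sketch of the per-instance cap, run against each instance (the 1700 MB figure simply mirrors the expectation above and is an assumption to adjust):

Code Snippet

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- cap this instance so all six instances plus the OS fit in 16 GB
EXEC sp_configure 'max server memory (MB)', 1700
RECONFIGURE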

View 4 Replies View Related

Sql 2005 Updating Table Definition With 'large' Amounts Of Data - Timeout

Feb 4, 2006

I'm trying to move my current use of an sql 2000 db to sql 2005.



I need to update a table definition (to change a column to an IDENTITY).



I'm getting a dialog box (in SQL server management studio) on save saying :



'xxxx' table

- Saving definition changes to tables with large amounts of data could take a considerable amount of time. While changes are being saved, table data will not be accessible.



I press 'Yes' to the dialog box.



After 35 seconds, I get another dialog box saying:



'xxxx' table

- Unable to modify table.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.



Well, the server is responding: I can query that table and others, I can add/delete rows and change other columns, and I can modify other (smaller) tables.



Any ideas where I can change this timeout?
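
For what it's worth, the designer's limit lives under Tools > Options > Designers > Table and Database Designers ("Transaction time-out after") in Management Studio. The alternative is to skip the designer and rebuild the table by hand, which is roughly what it does internally anyway. A minimal sketch, assuming a hypothetical table dbo.xxxx with a single payload column:

Code Snippet

-- an existing column cannot be altered into an IDENTITY;
-- the usual workaround is rebuild-and-rename
CREATE TABLE dbo.xxxx_new (
    id INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    payload VARCHAR(100) NULL
)

SET IDENTITY_INSERT dbo.xxxx_new ON
INSERT INTO dbo.xxxx_new (id, payload)
SELECT id, payload FROM dbo.xxxx
SET IDENTITY_INSERT dbo.xxxx_new OFF

DROP TABLE dbo.xxxx
EXEC sp_rename 'dbo.xxxx_new', 'xxxx'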



Daniel

View 10 Replies View Related

DtsDebugHost Processes Consumes Huge Amounts Of Memory, Then Fails

Jul 18, 2006

I gave up on the ScriptTasks and decided to use custom tasks instead. Problem again. My code opens a 400 MB file and reads it line by line using a StreamReader. Each line is approximately the same length. For each line there is some processing, and then the line is written into another file using a StreamWriter. I am watching the DtsDebugHost process with Task Manager open, and here is what happens:

Initially it's all good. Then, once it has read through the first 150 MB+ of the input file, the memory usage of DtsDebugHost shoots up dramatically, by about 1 gig (both virtual and physical memory). Then the task fails with an OutOfMemoryException. I thought the problem was with my code, but it still happens even if I only read a line from one file and write it to another, without any processing!

When I invoke the same code from an Execute Process task, it's all good - no problems at all whatsoever.

Any ideas?









View 1 Replies View Related

SQL Server Is Occupying Large Memory Space

Oct 3, 2001

In an intranet application using Windows NT, Apache, Tomcat and SQL Server, the memory used by SQL Server is drastically increasing, and finally the system crashes. Nearly 40 people are accessing the system. The hardware configuration is a P2 processor with 393 MB RAM and 2 GB of virtual memory. SQL Server, the web server and the servlet engine are all running on the same machine.
Within three hours, SQL Server occupies 200 MB of memory, system performance comes down, and finally the system stops the Tomcat servlet engine.
Anybody have any idea on this? We have nearly 1500 JSP pages, 200 bean files and 300 tables in SQL Server.

View 2 Replies View Related

TableDiff Out Of Memory Exception On Large Tables.

Sep 20, 2007

Hello,
I hope I am posting this in the right forum.

I am using tableDiff.exe to create a diff SQL script for a very large table (~4 million rows).


After a few minutes, I receive a "System.OutOfMemoryException".

I have 4GB of ram on the machine executing the table diff.
The server is 32-bit, so adding ram is not an option.

I am executing the following command line:





Code Snippet

TableDiff.exe" -sourceserver "SERVER" -sourcedatabase "SourceDB" -sourcetable "Table1" -destinationserver "SERVER" -destinationdatabase "DestDB" -destinationtable "Table1" -f "C:TableDiffsTable1"

I have seen reports of other users executing tableDiff against 2 million row tables.

Is there any way to buffer tableDiff so that I do not run out of memory on the server?

Could anything else be causing this error?

Thanks,
Dave

View 3 Replies View Related

Asp.net Worker Process Runs Out Of Memory When Using A Large Dataset

Feb 27, 2006

Hi,
I'm running an application on a server which grabs data from a database table on another server using SqlConnection, SqlDataAdapter and DataSet.
The application then updates every row in that DataSet's DataTable, and the updates are saved back using the DataAdapter. The code is pretty much the straightforward code you would find in MSDN documentation for using DataSets. The table contains a little over a million rows.
When I run the application, I get an error saying the server application is not available. Upon looking into the application event log, I find this message:
aspnet_wp.exe was recycled because memory consumption exceeded the 306 MB (60 percent of available RAM)
How do I get around this? I thought DataSets were supposed to handle large data tables comfortably without running into memory issues.
-Thanks

View 1 Replies View Related

SQL Server 2008 :: What Could Cause Large Gaps In Servers Default Trace

Feb 12, 2015

I have the default trace enabled on a SQL Server 2008 R2 instance and found today that there is a gap of nearly 4 minutes in the trace, during a time of day when there is most certainly not going to be a 4-minute window of nothing.

What, if anything, could cause the default trace to have a gap like this? The SQL Server instance (against my preferences) is hosted on VMware; however, it has its own host, so its resources are not being shared with any other server. The data and log files reside on different parts of the SAN. Our IT and network admins are looking into the issue on their end, but when I looked and found a nearly 4-minute gap in the default trace, it hit me that this could be something above/outside of SQL Server.

View 1 Replies View Related

PAE, AWE And Pagefile

Jul 10, 2006

Can you please consider the following scenario:

Windows Server 2003 Enterprise Edition (32-bit)

SQL 2000 Enterprise Edition

Server with 8GB memory



1. If we enable /3GB and /PAE, the server can see and use 8GB of memory; the OS has 1GB and applications can use the remaining 7GB; we can install 2 instances of SQL with max server memory set to 3GB each, and SQL should be able to allocate memory from the upper 4GB of RAM. (True/False)

2. We only need to enable AWE if any single instance needs to cache more than 4GB of data, so we could, for example, configure one instance with AWE and a max memory of 5GB, and set the second instance to use 2GB max. (True/False)

3. A 32-bit Windows server OS can support a maximum of 4GB of virtual address space. Does this mean setting the pagefile to more than 4GB is a waste of disk space, or does PAE extend this limitation?

4. With AWE enabled, anything above the 4GB virtual space is used for data caching only (it can't be used to manage user connections, locks, stored procedure compilations, etc.). How does this work when running 2 separate instances with 3GB each and AWE disabled? Would they each receive 4GB of pageable virtual address space?

Regards,

AE

View 7 Replies View Related

SQL Server 2008 :: How To Find Statements That Cause Large Memory Paging

Apr 22, 2015

I am monitoring our production server and noticed that we periodically have spikes in the Memory Pages/sec counter.

How do I find the particular queries/stored procedures that are causing this?
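
Pages/sec is an OS-level counter, so SQL Server can only point at suspects indirectly; the statements doing the most physical I/O are the usual candidates. A minimal sketch against the plan-cache stats DMV (SQL Server 2005 and later):

Code Snippet

-- top statements by physical reads since the plan was cached
SELECT TOP 20
    qs.total_physical_reads,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
          ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_physical_reads DESC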

View 5 Replies View Related

Visual Studio/BIDS Out Of Memory Errors Opening Large DTSX Files

Jul 26, 2006

I'm currently experiencing major problems with SSIS when opening and editing large .DTSX package files that contain Exec DTS 2000 Tasks which have the package data loaded internally. I have no issues if I point the task to a .DTS file, or to an actual DTS package on a SQL 2000 server - but if I load the package internally, then once the underlying .DTSX file gets over around 17MB or so in size (which doesn't take long when making a few edits to even fairly simple packages now and then), I start to experience major issues with VS/BIDS 2005 crashing randomly when I try to perform any action with the package (open, save, etc.): things like OutOfMemory exception errors, followed by the properties of the Exec DTS 2000 task being deleted, sometimes accompanied by messages about the application not being installed properly.

Again, it's ONLY when the underlying .DTSX file reaches a certain size limit, and only when I've got an Exec DTS 2000 task with the package loaded internally. I've replicated the issue using several different package files on several different machines (even on servers with lots of memory, FWIW).



Can anyone out there help me with this? SSIS - namely SSIS Exec DTS 2000 package tasks - is our lifeblood at my company, and this trend of random and serious crashing on large package files is very disturbing, to say the least.



thanks,



Wil

View 1 Replies View Related

SQL Server 2008 :: How To Find Which Queries / Processes Causing Large Memory Paging Rate

Mar 30, 2015

Our monitoring tool shows that our production system is periodically experiencing a large paging rate - up to 800 memory pages/sec. How do I find out which particular queries, stored procedures or processes initiate this?

View 3 Replies View Related

SQL Server Build (Pagefile Configuration)

Mar 14, 2008

Need some recommendations on pagefile configuration. I have been tasked with building some SQL servers. The hardware config is:

4 Quad-core processors
64 GB of RAM (128GB of RAM coming in next group of servers)
4x73GB hard drives.

We normally build the servers with two RAID 1 disk groups: we mirror the OS on one RAID group and keep the default pagefile there (for crash dumps), then create another large pagefile on the other RAID group. We have been told by our parent company that we are to build the servers with a RAID 5 config and place the pagefile on the same RAID group as the OS. I know this is wrong, and I need evidence to support my position. If anyone has any links to whitepapers or research, I would greatly appreciate it.

View 1 Replies View Related

Database Sizing

Oct 2, 2001

Hi,
Can anyone answer me this? I recently inherited a SQL Server database setup. The user database size is about 3 times as big as necessary. Does this matter? Does it have an impact on performance, and if so, is there anything I can do to reduce the size? From what I have managed to gather so far, you cannot use ALTER DATABASE to make a database smaller than the size it was originally created at. If anyone can help, it would be most appreciated!
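
One thing ALTER DATABASE won't do but DBCC will: shrink files below their original creation size and release the space back to the OS. A minimal sketch, assuming a hypothetical logical file name MyDatabase_Data (look yours up first):

Code Snippet

-- list the database's logical file names
EXEC sp_helpfile

-- then shrink the data file to a target size in MB
DBCC SHRINKFILE ('MyDatabase_Data', 1000)

Note that shrinking fragments indexes, so it is best treated as a one-off fix rather than a maintenance routine.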


Cheers,

Petra

View 4 Replies View Related

Sizing For SQL Server 7.0

Mar 31, 1999

We are considering migrating our Oracle database over to SQL Server 7.0. I'd like to do some sizing estimates, but I have not been able to find any methods for sizing SQL Server 7.0 databases. There don't seem to be any calculation guidelines or formulas in the manual or the online documentation. The only formulas I have found on the web are for version 6.5. Does anyone have (or know where I can find) formulas that are valid for sizing SQL Server 7.0 databases?

Note: an archived message on this site referred to an attachment containing sizing formulas, but the attachment was not accessible.
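
For what it's worth, the 6.5 formulas don't carry over because 7.0 moved to 8 KB pages, with 8,096 bytes usable per page. A rough per-table estimate, sketched as T-SQL arithmetic with assumed example numbers:

Code Snippet

-- rows per page = 8096 / (average row size + 2 bytes of row overhead)
DECLARE @Rows INT, @AvgRowBytes INT, @RowsPerPage INT
SET @Rows = 1000000        -- expected row count (example figure)
SET @AvgRowBytes = 200     -- average row size in bytes (example figure)
SET @RowsPerPage = 8096 / (@AvgRowBytes + 2)
SELECT EstimatedMB = (@Rows / @RowsPerPage) * 8 / 1024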

View 2 Replies View Related

Sizing Db Objects

Dec 28, 2004

Hello, I'm looking for a sizing Excel file (or any other format) that lets me estimate the size of my SQL Server database. I currently have one such Excel file for Oracle, where I only enter the expected number of rows, the average size of the columns, and a few other data points, and I get back the size (in megabytes) of a table or an index.

Could you provide me with something similar?

Thanks !

View 1 Replies View Related

SQL Server Sizing

Mar 8, 2007

HI everybody,

We are designing a system using .NET/ASP.NET and SQL Server (2000 Standard, a customer requirement). The system will be hosted on 2 dedicated servers (a web/application server and a DB server), and we need to estimate the hardware requirements.

Could anyone help me with sizing this? Is there any literature on how to do it? I found some sizing websites, but I'd like to double-check them. I already have an estimate of data storage size/growth based on information I've got from the customer.



Thanks in advance!

Sergio

View 1 Replies View Related






