Variable Memory Utilization And Performance Considerations
Apr 26, 2006
Can someone point me to some good articles or perhaps directly supply some words of wisdom with regard to wise utilization of variables within a T-SQL script, from the standpoint of conserving memory and reducing execution cost?
For example:
(1) Is it better to use varchars, nvarchars, etc. defined with minimal lengths to support the needs of the script, or is it just as efficient to declare them all with a length of, say, 4,000?
(2) I've seen behavior that leads me to believe that when passing a variable as a parameter in a nested procedure call, if the declared types of the parameter and the variable being passed in don't match (i.e. one is numeric(38,10) and the other is int), then implicit type conversions hurt performance. Is this true and how broadly does it apply?
(3) Does the number of variables declared in a script materially impact performance and/or resource utilization?
(4) Is it more efficient to have a series of variable value assignments in a single SELECT statement versus a series of SET statements? Should I always prefer one to the other? Only within a looping construct?
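For concreteness, a minimal sketch of (2) and (4) — the procedure name and parameter are hypothetical, not from any real script:

-- (2) If dbo.usp_Process declares its parameter as int, passing a
--     numeric(38,10) variable forces an implicit conversion at the call.
DECLARE @Amount numeric(38,10);
SET @Amount = 42.5;
EXEC dbo.usp_Process @Amount;

-- (4) Several assignments in one SELECT statement...
DECLARE @a int, @b int, @c int;
SELECT @a = 1, @b = 2, @c = 3;
-- ...versus one SET per variable (three separate statements).
SET @a = 1;
SET @b = 2;
SET @c = 3;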
Thanks,
Shadowraven
View 1 Replies
May 10, 2000
I installed 4GB of memory and I have never seen SQL memory utilization go beyond 2GB. I have SQL Server set up to use as much memory as it needs. Does anyone know if SQL Server can make use of more than 2GB?
View 1 Replies
View Related
Feb 28, 2000
Ours is a SQL 7.0 Enterprise Edition with NT 4.0 Enterprise Edition. SQL Server has been configured with the default, 'Dynamic Memory Allocation'. The system has 4GB of RAM. This is a dedicated SQL Server machine, but SQL Server seems to use only 1.8GB of RAM (Counter: Total Server Memory). The page faults seem to be a max of 600 and an average of 100. The processor utilization has suddenly increased to 90%. Is there anything wrong with the way SQL Server is using memory? Is it not true that SQL Server 7.0 Enterprise Edition can use up to 3GB of RAM in a 4GB system?
Are there any links that can help troubleshoot this problem?
Thank you.
-Praveena
View 2 Replies
View Related
Jul 20, 2005
What do people think is normal for memory utilization? I know that's too broad, so here are some basics.
MS SQL Server 2000, Windows 2000 Server, 2GB RAM
Db 1, size = 2.0 GB
Db 2, size = 300MB
Db 3, size = 50MB
Db 4, size = 30MB
Db 5, size = 30MB
Typically 4-6 users, moderate usage 8 hrs/day. Performance has not slowed. Reboot on Sunday. sqlservr.exe in the Task Manager reports the following:
Sun 61MB
Mon 200MB
Tues 800MB
Wed 1,124MB
Thu 1,424MB
Fri 1,303MB
I was getting srv 2020 errors when I had just 1 GB RAM: "The server was unable to allocate from the system paged pool because the pool was empty." Then I did several updates to address this and got more RAM. I haven't seen the errors since, but I haven't waited for them to happen: I'm rebooting every week now. The memory numbers make me suspect SQL Server.
Scratching my head. Not sure if my problem is gone and this is normal SQL Server 2000 behavior, or if my problem is still lurking and I've only muted it a bit.
Any thoughts greatly appreciated.
Tom
View 3 Replies
View Related
Apr 19, 2007
Dear all,
One of the servers has 2 GB of RAM, and Task Manager is showing 1.87 GB of memory in use.
I have to migrate a few databases, with high IO operations, onto the same server.
I know the server requires more RAM, but how can I prove that it needs more RAM?
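One way to make the case — a sketch, assuming SQL Server 2005 or later — is to capture the buffer counters; a Page Life Expectancy that stays low under load is a standard sign the buffer pool is starved for RAM:

SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Page life expectancy', 'Buffer cache hit ratio', 'Lazy writes/sec');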
Regards
Mohd Sufian
View 3 Replies
View Related
Sep 27, 2001
I have a few in-house developed applications (VB based) that access the SQL Server for adding, appending, and creating tables. The applications make the changes through queries dynamically generated at the application level.
My MS SQL Server runs on a PIII / 256 MB Ram / 18 GB HDD
The problem is that the memory utilization of SQL Server keeps growing constantly. Out of 512 MB (256 physical + 256 virtual) available, the memory utilization reaches a level of 490 MB and stays constant, though SQL Server shows a utilization of 150 MB.
I suspect that SQL is not releasing memory back to the system. Please help in resolving this; the problem may lie in the applications.
Jdindian
View 2 Replies
View Related
Feb 12, 2007
Environment: Win2003 SP1, 32 bit, SQL Server 2K5
My server has 16GB RAM but it is using only 3GB. And I see my server is using 3GB of virtual memory, too. Why is my physical memory not being utilized? How can I increase physical memory usage and decrease VM usage?
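For a 32-bit instance, a sketch of the usual change — assuming the service account holds the Lock Pages in Memory privilege and /PAE is set in boot.ini — is to enable AWE so the buffer pool can reach beyond the 32-bit user-mode address space:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;
-- the instance must be restarted before AWE takes effect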
Canada DBA
View 19 Replies
View Related
Apr 28, 2015
We have a system (32GB RAM, 2 TB hard disk, Windows 7, SQL Server 2008 R2 Enterprise 64-bit). It looks like whenever I run some query (even one whose result is 50 records) on the database, the memory utilization shown in Task Manager is very high (30 GB). How can I control this overuse? The memory setting is the default in server properties (min 0 and max 2147483647).
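One way to rein it in — a sketch, where 26624 MB is purely an example cap — is to set max server memory so the buffer pool leaves headroom for the OS and other processes:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 26624;  -- leave ~6 GB for the OS
RECONFIGURE;

Note that a large buffer pool by itself is by design: SQL Server caches data pages until something else needs the memory.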
View 7 Replies
View Related
Apr 30, 2015
I am searching for a query to find the total memory allocated to SQL Server and how much of it is being utilized, as well as the CPU utilization percentage.
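A starting point — a sketch, assuming SQL Server 2012 or later for the committed_kb columns:

-- memory in use by this instance
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

-- committed memory versus the target that max server memory steers toward
SELECT committed_kb / 1024 AS committed_mb,
       committed_target_kb / 1024 AS target_mb
FROM sys.dm_os_sys_info;

For the CPU percentage, PerfMon's Process: % Processor Time counter for sqlservr is simpler than any query.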
View 10 Replies
View Related
Mar 26, 2015
We have two servers configured in AlwaysOn mode.
Windows is Server 2012 64-bit edition, and SQL Server is 2012 64-bit edition.
RAM installed on both servers is around 65 GB, of which 49 GB is the max server memory allocated to SQL services on both servers.
Databases related to Reporting Services are also in the AlwaysOn group.
We have also configured Reporting Services, and both instances are running on their respective servers.
The issue is that on the primary server the Reporting Services process is using almost 7 GB, while on the secondary it is using 10 GB, even though there are only 5 reports and they are used within our offices.
What is the issue, and how can I check why SSRS is using so much memory? Any query or PerfMon counters?
Reports are used sporadically on the client side.
I have checked memory utilization through Task Manager.
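One SSRS-specific knob worth checking — a sketch of the relevant section of RSReportServer.config, with values purely illustrative — since Reporting Services 2008 and later manage their own memory independently of the database engine's max server memory:

<Service>
    ...
    <WorkingSetMinimum>1048576</WorkingSetMinimum>  <!-- KB, ~1 GB -->
    <WorkingSetMaximum>4194304</WorkingSetMaximum>  <!-- KB, ~4 GB -->
</Service>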
View 0 Replies
View Related
Aug 2, 2006
Hi
I did a load testing and found the following observations:
1. The Memory: Pages/sec counter was crossing the limit, going beyond 20.
2. The Target Server Memory was always greater than Total Server Memory
Seeing the above data, it seems to be memory pressure. But I found that Available Memory was always above 200 MB. Also, Buffer Cache Hit Ratio was close to 99.99. What could be the reason for the above behavior?
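The two counters can be compared directly — a sketch, assuming SQL Server 2005 or later:

SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');

Target staying above Total just means the engine wants more memory than it has so far been granted, which can indicate external pressure even while Buffer Cache Hit Ratio looks healthy.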
View 1 Replies
View Related
Oct 31, 1999
Just two questions actually. We have built a ColdFusion based forums package. Currently we have it on two beta sites, and we are using SQL 7 for the db. Firstly, the forums are serving 200-300 people at any given time, about 16,000 unique people a day. SQL 7 seems to stay at around 50% CPU usage on a dual P3 with 512MB RAM. Is that normal? Seems like a lot of CPU usage. The other thing is it takes 500MB of RAM and just about drains the server of all of its RAM, even though in the memory properties for the SQL server it's set to a 255MB maximum. Any insight is appreciated.
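It is worth confirming the cap actually took effect — a quick check, since sp_configure reports both the configured and the running value:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';
-- if config_value is 255 but run_value differs, RECONFIGURE was never run

Also note the setting governs the buffer pool only; thread stacks and other allocations can push the process total above it.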
View 2 Replies
View Related
Mar 17, 2008
Hi,
I hope somebody here can help on my problem.
I wrote an MFC application using SQL CE, and it runs perfectly smoothly on Windows CE and PPC 2003.
But when I deploy it to Windows Mobile, the database performance drops dramatically.
I know the drop in performance is due to the I/O speed of the flash memory (the previous mobile OSes use RAM instead).
Is there any solution or workaround with which I can solve this problem?
Recently I solved the performance issue with "inserts" to the DB by using a commit buffer (instead of committing to the .sdf instantly).
But what about the "select" performance? It's too slow; it takes about 3 seconds to select a record from the DB.
Does Microsoft provide any suggestions on SQL CE performance on Windows Mobile?
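On the SELECT side, one thing worth checking — a sketch with hypothetical table and column names — is that the lookup column is indexed, since a table scan on flash storage is what typically produces multi-second reads:

CREATE INDEX IX_Records_RecordId ON Records (RecordId);

-- with the index in place, a parameterized lookup can seek instead of scan
SELECT RecordId, Payload
FROM Records
WHERE RecordId = ?;  -- parameter bound by the data access layer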
Thanks
Keith
View 3 Replies
View Related
Jan 10, 2008
Hi Experts,
here we are having a very serious problem:
In SQL 2005 we have 8 CPUs, and in Task Manager the CPU usage is showing 100%; the performance of the server is very poor. Last night the server was rebooted, and CPU usage is still showing 100%. So how can I improve the performance? It is very urgent for me — can anyone please let me know what is going on and what the solution is?
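A first step that usually pays off on SQL 2005 is to ask the plan cache which statements are burning the CPU — a sketch:

SELECT TOP 10
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
        (CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
         END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;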
-Thanks N Regards,
Kanthi.
View 3 Replies
View Related
Sep 24, 2007
Hi -
I am facing 2 problems :
PROBLEM 1 :
We have a few packages that run pretty fast on a desktop server with 2 Gig RAM and a dual processor (approx 4-5 hours). But the same packages run very, very slow on another server containing 8 CPUs and 12 Gig RAM (they ran for 24 hours without completing).
PROBLEM 2 :
The CPU% ranges from 40-80% and the PF usage is stagnant at 2GB on the desktop server for the same package. But on the 8-CPU server, the CPU% ranges from 0-10% while the PF usage rises from 750 MB to 8 GB.
This has become critical to our application.
TIA,
Shabs
View 4 Replies
View Related
May 31, 2007
Hi All,
I'm new at SQL Server; I decided to use it in order to get all the advantages of using VB.NET and SQL. My server is a SQL Standard version. I'm using a relational DB most of the time for complex select queries; every time the server is used it performs 30 or 40 queries at a time, and I have recently realized that the server consumes a lot of memory after one or two days of being up.
Let's say that if I restart SQL Server, memory usage is about 650 MB, but after two days memory is 1.4 GB. I have used SQL Profiler and the Tuning Advisor, which recommended that I create some indexes, which indeed I created, but that did not solve the memory problem, although some queries run faster.
My questions are:
is this memory usage normal?
if not, what should I check out to reduce memory usage?
Thanks in advance
George
View 12 Replies
View Related
Oct 5, 2006
Hi,
I know the SSIS memory problem has probably been covered quite a bit, but being a newbie to SSIS, I'm not quite sure how to improve the performance of my SSIS package.
Basically, I have a package that loops through all the subdirectories within a specified directory; it then loops through each file in each subdirectory and, with the use of the Data Flow, processes each file (according to its filename) with a Script Component to insert data into a SQL DB.
Each subdirectory has up to 15 different csv files, but each is less than 5kB. I probably have about 100 subdirectories.
When I ran the package, it functioned properly, but then it stalled (no error; it just got stuck in one Data Flow) after a while, and when I checked, my CPU was running at 100%.
I'm not sure how I could fix it or improve the memory allocation. I was not expecting any memory problems, as the file sizes are small and the number of rows going into and out of the Script Component is no more than 20.
Any advice? Thanks.
View 1 Replies
View Related
Aug 13, 2007
A query that was taking 20 seconds and consuming 70% CPU takes only 1 second after setting the Maximum Memory property to 2048 MB — why?
Server:
OS Microsoft(R) Windows(R) Server 2003, Enterprise Edition
Version 5.2.3790 Service Pack 1 Build 3790
8 GB memory
Two Dual-core AMD Opteron 285 2.6GHz Processors
Server is not configured for AWE
Fiber channel connection to EMC Clarion - two LUNs - one for MDF, one for LDF
SQL 2005
SQL 2005 32 bit Standard Edition - SP1 (version 9.0.2047)
Three instances installed on server - only one instance in use
Binaries and system databases on local mirrored disk
Database file (MDF) on one EMC LUN - dedicated physical drives
Log file (LDF) on one EMC LUN - dedicated physical drives
Query in question:
SELECT TOP 10
    Address.Address1, Address.Address2, Address.City, Address.County, Address.State,
    Address.ZIPCode, Address.Country, Client.Name, Quote.Deleted, Client.PrimaryContact,
    Client.DBA, Client.Type, Quote.Status, Quote.LOB, Client.ClientID, Quote.QuoteID,
    Quote.PolicyNumber, Quote.EffectiveDate, Quote.ExpirationDate, Quote.Description,
    Quote.Description2, Quote.DateModified, Quote.DateAccessed, Quote.CurrentPremium,
    Quote.TransactionDate, Quote.CreationDate, Quote.Producer
FROM (Client
      INNER JOIN Address ON Client.ClientID = Address.ClientID)
      INNER JOIN Quote ON Client.ClientID = Quote.ClientID
WHERE Quote.Deleted = 0
  AND Address.AddressType = 'Mailing'
ORDER BY Client.Name
Address table - 161,075 rows
Client table - 161,634 rows
Quote table - 59,145 rows
With the default maximum memory setting (2,147,483,647 MB), the query runs in 20 seconds and consumes over 70% of the CPU.
After changing the maximum memory setting to 2048 MB, the query runs in less than 1 second.
Question is:
What is the best practice for setting the minimum and maximum memory settings for SQL 2005?
What can be monitored to identify the cause of these types of issues — using Profiler, PerfMon, or another tool?
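One thing that can be watched while the query runs — a sketch, valid on SQL 2005 — is the workspace memory grant, since an oversized memory ceiling can inflate the sort's grant and change its behavior:

SELECT session_id, requested_memory_kb, granted_memory_kb, query_cost
FROM sys.dm_exec_query_memory_grants;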
Thanks
View 2 Replies
View Related
Jul 5, 2007
Hello,
I'm having a problem with my database server on the network. I'm running Windows 2003 Server Standard Edition with SQL Server 2005 Standard Edition.
The problem is that the server gets stuck and the performance of the whole network is affected. When I use Task Manager to monitor performance, I can see that the sqlservr.exe process is using 1,397,928 K of memory, and in the performance monitor the graphs go crazy and the CPU usage grows to 85%.
Can you please let me know if there is something I can do to normalize the server's performance, in order to let the network users work with the applications fed by this server?
Thanks in advance for your help.
View 6 Replies
View Related
Apr 22, 2014
Getting the following warning in SSIS - SQL 2012:
[SSIS.Pipeline] Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.
Microsoft had a fix for SQL Server 2008.
How to get around this in SQL 2012?
View 4 Replies
View Related
Jul 20, 2005
Fellas!!
This is a very complicated one and it took me a few days to figure out exactly what's going on, but here's the final story:
I have a production environment running on .NET with a SQL Server (2000, SP3). The SQL Server is on a dedicated Proliant computer with 2GB RAM (the actual SQLServer.exe process has dynamic memory assignment and can reach up to 1.6GB RAM). Nothing else is running on that specific computer.
Once the SQLServer is started, it hits 300MB RAM (the minimum that was set in the configuration of the server — remember, it is dynamically acquired).
Then there is a .NET program that requests just about all the data the SQL Server contains (apart from a single table that contains roughly 1.6 million rows and another table that contains about 10000 rows which are all of type IMAGE).
Once all the data is retrieved, the RAM is at about 400MB. From there on, every update I make to the data on the server causes the RAM to go up by a bit (the updates are done in a Transaction which of course is committed at the end). It seems that BLOB updates are the major problem in all of this. For some reason, uploading a blob of size 9MB causes the RAM to go up by roughly 20MB, and after commit it goes down 10MB (total gain of roughly 10MB RAM). Eventually the SQLServer process hits its upper limit (1.6GB) and at this point it starts slowing down.
Some performance checks showed me the SQLServer has a lot of disk activity; it seems it is reading and writing pages of data from/to the HD all the time (which causes the queries to be much much much slower).
We have a development environment running the exact same code (it is the exact same in everything, except for the amount of data stored in the DB). This does not happen there at all.
I have a few questions:
1. Why is the RAM going up after BLOB updates?
2. Why is the RAM going up at all?
3. How can I tell the DB which tables should remain in the RAM at all times (never swapped back to the HD)? DBCC PINTABLE does not seem to do the job.
It does not seem to have anything to do with the .NET code.
Thank you very much,
M Yamo.
View 4 Replies
View Related
Nov 17, 2009
I am getting the following warning for my SSIS 2008 package: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console. I did check "Warning in SSIS 2008", but didn't find any solution. The package processes data and executes fine, but why do I see this warning? When I run this package on my machine, I see no such warning; it's only when I deploy it to our DEV SSIS server that I get it.
View 7 Replies
View Related
Sep 13, 2014
Ok I am faced with working with XML on a regular basis, which is fine.
DECLARE @ViewSN INT
IF NOT EXISTS (select null from tblviews where viewcode = 'loadAtTerm') --<workflowEventType>loadAtTerminal</workflowEventType>
insert into tblviews (ViewName,Description,OutBoundForm,StoredProcSN,TriggersReply,ViewCode,DispXactLayer,DispXactViewType,DispXfcTag,Comments)
select 'QC:WF-LoadAtTerminal','This View Corresponds to the XML for loadAtTerminal in Omnitracs Workflow','0',NULL,'0', 'loadAtTerm','MCOM','MCOM',NULL,NULL
[code]...
What would be really useful is to be able to present any xml file and automatically parse the NODE names into a memory variable table and then the fields of each node in another.
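A generic sketch of that idea — variable names hypothetical, and assuming SQL Server 2005 or later for the xml type:

DECLARE @doc xml;
SET @doc = N'<root><a>1</a><b>2</b></root>';

DECLARE @nodes TABLE (NodeName nvarchar(128), NodeValue nvarchar(max));

-- shred every element one level below the root into name/value pairs
INSERT INTO @nodes (NodeName, NodeValue)
SELECT n.value('local-name(.)', 'nvarchar(128)'),
       n.value('.', 'nvarchar(max)')
FROM @doc.nodes('/*/*') AS t(n);

SELECT NodeName, NodeValue FROM @nodes;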
View 7 Replies
View Related
Jan 4, 2004
I've read that table variables give better performance than temporary tables, as they are kept in memory and don't need to be recorded in transaction logs etc. However, I have a stored procedure which takes 0.183 seconds to execute, but when I change the one temporary table used in the procedure to a table variable, the execution time increases to 0.223 seconds.
Not much of an increase, I admit, but it just seems contrary to what I've read.
I want to get the best performance possible, so can someone explain to me what is going on?
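The usual explanation is statistics rather than storage: the optimizer keeps distribution statistics on a temp table but estimates a table variable at roughly one row, so plans built against it can be worse. A side-by-side sketch with hypothetical columns:

-- temp table: gets statistics and can be indexed after creation
CREATE TABLE #work (id int PRIMARY KEY, amount money);

-- table variable: no statistics; the optimizer assumes very few rows
DECLARE @work TABLE (id int PRIMARY KEY, amount money);

Both are in fact backed by tempdb and can spill to disk, so "kept in memory" is a common misreading.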
View 2 Replies
View Related
Apr 19, 2007
I have a table with about 200 million rows of data. I add a couple million rows of data each week to the table in a single load process. The table is used for reporting purposes only and there are never (not intentionally at least) any updates or deletes to the table. The data is always being added to the "end" of the table with the new AsOfDate being the main factor in the clustered index.
My question is this: Since I'm not "inserting" rows that would split pages, should I have my FILLFACTOR for the table set to 100, or am I missing something? I obviously want to save physical hard drive space, but I also don't want to slow down the import process.
BTW, I'm using SQL2000
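For what it's worth, with an ever-increasing clustered key the new rows allocate fresh pages at the end rather than splitting existing ones, so packing pages full is the usual choice. A sketch in SQL 2000 syntax, with hypothetical object names:

-- rebuild the clustered index with pages packed to 100%
DBCC DBREINDEX ('dbo.ReportTable', 'IX_ReportTable_AsOfDate', 100);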
View 1 Replies
View Related
Apr 19, 2007
I would like to know people's thoughts on any special network considerations to take for mirroring and the logic behind them. Is it best to segregate mirroring traffic from other network traffic? Use a VLAN? Dedicate one NIC for mirroring and the other for general network traffic or just aggregate the two and let both types of traffic share the bandwidth?
I haven't seen much in this area from Microsoft's best practices and wanted to know what those who have implemented it have done and why. There are pros and cons for each method: Letting everything share one massive pipe with load balancing vs. trying to segregate traffic in some way so that general network connections etc. do not impact the mirroring capability.
I look forward to hearing from you.
-M-
View 4 Replies
View Related
Sep 14, 2007
We have about 150 SQL servers and basically we're considering the pros and cons of installing SSIS on a central SSIS server - that is responsible for all DTS jobs - as opposed to installing SSIS on the local SQL instance.
On the plus side so far:
1./ Central administration, alerting, change management etc
2./ Possible performance gain on the local instance not having SSIS installed?
On the negative side:
1./ Central point of failure
2./ Possibility that it would need to be a clustered...
3./ Compatibility issues may mean having to make the central SSIS server 32-bit?
4./ Possible performance cost of remote SSIS?
5./ With multiple DTS packages running at different times, when would we take the server down for maintenance...?
Would appreciate your thoughts.
View 1 Replies
View Related
Jun 21, 2007
Hi all,
Can a publisher be mirrored? What are the implications, issues, gotchas? Transactional, Merge or Transactional w/ Updating Subscribers is what I'm considering.
Bottom line is I would like to use mirroring, but only one mirror will not suffice.
Thank you in advance.
Ray Nichols
View 1 Replies
View Related
Nov 15, 2006
Hello, I'm developing an application which monitors network packets. The monitoring data are saved into a table. The monitoring table maintains the data for a fixed quantum of time, for example one hour. So, every minute, before or after inserting new data, I delete the time-expired data. I worry that the endless delete operations could result in some problems (increasing index size, etc.).
Is this mechanism safe for the DBMS?
Isn't there a round-robin(?) style table?
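One common pattern — a sketch with hypothetical names, assuming SQL Server 2005 or later for DELETE TOP — is to trim in small batches, so each delete holds locks briefly and the transaction log can truncate between batches:

WHILE 1 = 1
BEGIN
    DELETE TOP (5000)
    FROM dbo.PacketLog
    WHERE CapturedAt < DATEADD(hour, -1, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;
END

An index leading on CapturedAt keeps each batch a range scan, which also addresses the index-growth worry.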
View 1 Replies
View Related
Apr 12, 2006
I would like to know how, if at all possible, to reconstruct the following trigger so it can handle multi-row inserts when a single INSERT command is used — because the trigger will only be called once. I'm not familiar with cursors and don't know anything about them, and I've read that they're not the best way to go.
CREATE TRIGGER tr_childtable_insert ON childtable INSTEAD OF INSERT
AS
BEGIN
DECLARE @customkey char(16);
DECLARE @nextchild int;
DECLARE @parent int;
DECLARE @date datetime;
SET @date = getdate();
SELECT @parent = parenttable FROM inserted;
SELECT @nextchild=count(*)+1 FROM childtable WHERE parenttable = @parent;
IF (@nextchild >= 9998) return;
SET @customkey = 'type' + convert(char(4),year(@date)) + convert(char(2),month(@date)) + convert(char(2),day(@date)) + convert(char(4),@nextchild + 1);
INSERT INTO childtable (customkey,parent) VALUES (@customkey,@parent);
END
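A set-based rewrite needs no cursor: number the rows of inserted per parent with ROW_NUMBER and build all the keys in one INSERT. A sketch — assuming SQL Server 2005 or later, and keeping the original's column names, which appear as both parenttable and parent:

CREATE TRIGGER tr_childtable_multirow ON childtable
INSTEAD OF INSERT
AS
BEGIN
    DECLARE @date datetime;
    SET @date = GETDATE();

    WITH numbered AS (
        -- continue each parent's sequence where it left off
        SELECT i.parenttable,
               x.existing
               + ROW_NUMBER() OVER (PARTITION BY i.parenttable
                                    ORDER BY (SELECT NULL)) AS childno
        FROM inserted AS i
        CROSS APPLY (SELECT COUNT(*) AS existing
                     FROM childtable AS c
                     WHERE c.parenttable = i.parenttable) AS x
    )
    INSERT INTO childtable (customkey, parent)
    SELECT 'type'
           + CONVERT(char(4), YEAR(@date))
           + CONVERT(char(2), MONTH(@date))
           + CONVERT(char(2), DAY(@date))
           + CONVERT(char(4), childno),
           parenttable
    FROM numbered
    WHERE childno < 9998;  -- same guard as the original
END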
View 7 Replies
View Related
Oct 31, 2006
Can someone point me to a resource for Table Design Considerations for Merge Replication? I have an ASP.NET/SQL 2K5 app that I need to run on disconnected machines, then allow for data sync through merge replication. I assume that the first step is getting my tables indexed in a replication-friendly manner?
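One concrete design point, as a sketch with a hypothetical table name: merge replication tracks rows by a uniqueidentifier column marked ROWGUIDCOL, and will add one itself if the table lacks it, so declaring it up front keeps the schema under your control:

ALTER TABLE dbo.Orders
    ADD rowguid uniqueidentifier ROWGUIDCOL NOT NULL
        CONSTRAINT DF_Orders_rowguid DEFAULT NEWSEQUENTIALID();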
Many Thanks to anyone who can point me in the right direction!
View 3 Replies
View Related
Dec 10, 2007
I may be overthinking this, but I want to make sure this is right. If you have a processor license of SQL Server Standard running both Reporting Services databases and the IIS interface, isn't it true that the underlying licenses of other servers containing your data are irrelevant in the context of serving the reports over the web? Example: Server 1 has SSRS as described above, with a processor license of Standard. Server 2 has a user license of SQL Enterprise and serves data to a couple of reports on Server 1. This does not violate a license, correct? Doesn't Server 1 just take one of the CALs from Server 2?
View 1 Replies
View Related
Feb 13, 2008
In a prior "legacy" life we couldn't imagine 24x7 implementations, because it was important to 1) reorganize databases periodically to remove fragmentation that adversely affected performance and 2) back up databases just in case.
In a 24x7 SQL Server 2005 implementation, high level only, how are these and other maintenance related things accomplished with confidence?
I don't think SQL cleans up its own page splits unsolicited. Are DBAs totally reliant on logs in full-recovery installations where the db must be up 24x7? What if the devices those logs sit on fail? What if the logs become too large? Is it likely that if you want 24x7 you're looking at Enterprise Edition only?
I'm totally aware of and confident in the sliding window partitioning thing but it seems to me there must be more out there in terms of periodic, more frequent maintenance activity.
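On the fragmentation point specifically, the 2005-era options look like this (hypothetical object names): REORGANIZE is always online in every edition, while online REBUILD is an Enterprise feature:

-- always online, any edition; lighter-weight defragmentation
ALTER INDEX ALL ON dbo.BigTable REORGANIZE;

-- full rebuild without blocking readers or writers; Enterprise Edition only
ALTER INDEX IX_BigTable_AsOfDate ON dbo.BigTable REBUILD WITH (ONLINE = ON);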
View 3 Replies
View Related