I have an issue where I have multiple rows of data and I need to reduce a dollar amount by a fixed maximum. I am going to throw some code in here to give a rudimentary idea of the data and what the final result should be.
I need to run an update so that the result of the following query:
select LineNum, Code, Amt, MaxAmt from @tbl

looks like this:

LineNum     Code Amt                   MaxAmt
----------- ---- --------------------- ---------------------
1           AA   10.00                 50.00
2           AA   20.00                 50.00
3           AA   20.00                 50.00
(3 row(s) affected)
I have tried cursors, but I either got unexpected results or MaxAmt always reverted to its original value even after I updated it. This seems like a simple problem, but I have been banging my head against the wall for two days now. I've written some pretty complicated updates with less effort than this, so I must have some mental block that is keeping me from figuring it out.
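For reference, this is roughly the set-based approach I am leaning toward instead of a cursor (a rough sketch only: it assumes the data lives in a table variable like @tbl, that the third row originally held 30.00 before capping, and that the rule is that the running total of Amt per Code must never exceed MaxAmt; I am also not 100% sure every piece of this is legal against a table variable on SQL 2000):

DECLARE @tbl TABLE (LineNum int, Code varchar(4), Amt money, MaxAmt money)

INSERT INTO @tbl VALUES (1, 'AA', 10.00, 50.00)
INSERT INTO @tbl VALUES (2, 'AA', 20.00, 50.00)
INSERT INTO @tbl VALUES (3, 'AA', 30.00, 50.00)  -- assumed original amount before capping

-- reduce each Amt so the cumulative total per Code never exceeds MaxAmt
UPDATE t
SET Amt = CASE
              WHEN p.PriorTotal >= t.MaxAmt THEN 0
              WHEN p.PriorTotal + t.Amt > t.MaxAmt THEN t.MaxAmt - p.PriorTotal
              ELSE t.Amt
          END
FROM @tbl t
JOIN (SELECT t1.LineNum, PriorTotal = ISNULL(SUM(t2.Amt), 0)
      FROM @tbl t1
      LEFT JOIN @tbl t2 ON t2.Code = t1.Code AND t2.LineNum < t1.LineNum
      GROUP BY t1.LineNum) p
  ON p.LineNum = t.LineNum

SELECT LineNum, Code, Amt, MaxAmt FROM @tbl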
I have SQL Server 2000 Standard Edition running on Windows 2000 Server. There are constant memory-full alerts, and I am going to look at the configuration.
I am hoping for some clarification on a concept. In my environment, does SQL Server take a maximum of 2 GB of memory, no matter whether AWE is enabled?
BOL says AWE is available only in the SQL Server 2000 Enterprise and Developer editions. Does that mean AWE cannot be configured on SQL Server 2000 Standard Edition?
Out of the box it is set to 2147483647 MB. I assume this is ~2GB. I have been instructed to set it to 1GB less than total server memory which is 16GB. When I change the value it just changes back.
How do I change it and how are these Megabyte numbers calculated?
OS: Windows 2003 R2 Enterprise SQL: SQL Server 2005 Standard
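For reference, this is the change I have been attempting through T-SQL (a sketch; it assumes a sysadmin login, and 15360 MB is simply my reading of "1 GB less than the server's 16 GB"):

-- 'max server memory (MB)' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 15360;
RECONFIGURE;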
Hello, I have been trying to find out the maximum memory we can have on a SQL 7 Standard Edition server. I have not been able to find it anywhere... Any ideas would be greatly appreciated. TIA, B.
A query that was taking 20 seconds and consuming 70% CPU now takes only 1 second after setting the Maximum Memory property to 2048 MB - why?
Server:
OS: Microsoft(R) Windows(R) Server 2003, Enterprise Edition, Version 5.2.3790, Service Pack 1, Build 3790
8 GB memory
Two dual-core AMD Opteron 285 2.6 GHz processors
Server is not configured for AWE
Fibre Channel connection to EMC CLARiiON - two LUNs, one for the MDF, one for the LDF

SQL:
SQL Server 2005 32-bit Standard Edition - SP1 (version 9.0.2047)
Three instances installed on the server - only one instance in use
Binaries and system databases on local mirrored disk
Database file (MDF) on one EMC LUN - dedicated physical drives
Log file (LDF) on one EMC LUN - dedicated physical drives
Query in question:
SELECT TOP 10
    Address.Address1, Address.Address2, Address.City, Address.County, Address.State,
    Address.ZIPCode, Address.Country, Client.Name, Quote.Deleted, Client.PrimaryContact,
    Client.DBA, Client.Type, Quote.Status, Quote.LOB, Client.ClientID, Quote.QuoteID,
    Quote.PolicyNumber, Quote.EffectiveDate, Quote.ExpirationDate, Quote.Description,
    Quote.Description2, Quote.DateModified, Quote.DateAccessed, Quote.CurrentPremium,
    Quote.TransactionDate, Quote.CreationDate, Quote.Producer
FROM ((Client
    INNER JOIN Address ON Client.ClientID = Address.ClientID)
    INNER JOIN Quote ON Client.ClientID = Quote.ClientID)
WHERE (Quote.Deleted = 0) AND ((Address.AddressType) = 'Mailing')
ORDER BY Client.Name
With the default maximum memory setting (2,147,483,647 MB), the query runs in 20 seconds and consumes over 70% of the CPU.
After changing the maximum memory setting to 2048 MB, the query runs in less than 1 second.
The question is: what is the best practice for setting the minimum and maximum memory settings for SQL 2005? And what can be monitored to identify the cause of these types of issues - using Profiler, PerfMon, or another tool?
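In case it helps frame the monitoring part of the question, this is the sort of check I had in mind to run while the slow query executes (a sketch; sys.dm_exec_query_memory_grants ships with SQL 2005, but which of its columns are the most useful here is just my guess):

-- memory grants requested vs. granted for currently executing queries
SELECT session_id, requested_memory_kb, granted_memory_kb, used_memory_kb, grant_time, query_cost
FROM sys.dm_exec_query_memory_grants
ORDER BY requested_memory_kb DESC;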
We have several 2005 servers with "Maximum server memory" set to 2,147,483,647 MB, which I believe is the default at installation time. I am told that this means "use all the memory there is, including paging." Well, this is nuts, but the servers seem to work fine with this setting no matter how much physical memory they have.
One of our 2005 servers recently started paging like crazy, so I reduced "Maximum server memory" to 6000 MB and the paging disappeared (the server has 8 GB of physical memory) and the server appears happy.
I cannot explain why only this one server has this paging issue and the others do not. Should I be setting "Maximum server memory" on all my servers? Are there other considerations that might cause the server to eat up all the memory? As far as I know, no other applications run on this box.
I'm creating a temporary table in a SQL 2005 stored procedure that contains the transaction amounts entered in periods <= the period the user enters. I can return that amount in my result set. But I also need to separate out, by account, the amounts entered just in the period = the period the user enters. There can be many entries or no entries in any period. I populate the temporary table this way:
SELECT t.gl7accountsid, a.accountnumber, a.description, a.category, t.POSTDATE, t.poststatus,
       t.TRANSACTIONTYPE, t.AMOUNT,
       CASE WHEN t.transactiontype = 2 THEN amount * (-1) ELSE amount END AS transamount,
       t.ENCUMBRANCESTATUS, t.gl7fiscalperiodsid
FROM UrsinusCollege.dbo.gl7accounts a
JOIN UrsinusCollege.dbo.gl7transactions t ON a.gl7accountsid = t.gl7accountsid
WHERE (t.gl7fiscalperiodsid >= 97 AND t.gl7fiscalperiodsid <= @FiscalPeriod_identifier)
  AND poststatus IN (2, 3)
  AND LEFT(a.accountnumber, 5) BETWEEN '2-110' AND '2-999'
  AND RIGHT(a.accountnumber, 4) > 7149
  AND NOT (RIGHT(a.accountnumber, 4)) IN ('7171', '7897')
ORDER BY a.accountnumber
Later I create a temporary table that contains budget information. I join these two temporary tables to produce my result set. But I don't know how to get the information for just one period. For example, if the user enters 99 as the FiscalPeriod_identifier, I need a separate field that contains only those amounts (if any) that were entered for each account in period 99.
Can anyone help? It may be that I am not seeing the forest for the trees, but I can't figure it out.
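In case it clarifies what I am after, here is a rough sketch of the kind of conditional aggregate I think I need for the "current period only" column (the column names come from the query above, the account-number filters are abbreviated, and the choice to reuse the transamount expression and group by account is my assumption):

SELECT a.accountnumber,
       SUM(CASE WHEN t.transactiontype = 2 THEN t.amount * (-1) ELSE t.amount END) AS TotalThroughPeriod,
       SUM(CASE WHEN t.gl7fiscalperiodsid = @FiscalPeriod_identifier
                THEN CASE WHEN t.transactiontype = 2 THEN t.amount * (-1) ELSE t.amount END
                ELSE 0 END) AS CurrentPeriodOnly
FROM UrsinusCollege.dbo.gl7accounts a
JOIN UrsinusCollege.dbo.gl7transactions t ON a.gl7accountsid = t.gl7accountsid
WHERE t.gl7fiscalperiodsid >= 97
  AND t.gl7fiscalperiodsid <= @FiscalPeriod_identifier
  AND t.poststatus IN (2, 3)
GROUP BY a.accountnumber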
I have some code I built two weeks ago which I've been running daily, but it has suddenly stopped working with the following error.
“The table "tbl_Intraday_Tmp" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit.” When I google this, it seems to be related to tables with vast numbers of columns.
My table tbl_Intraday_tmp is relatively small. It has 7 columns: 1 of varchar(5), 3 of decimal(9,3), and 2 of decimal(18,0). The bit I'm puzzled by is that it was working and then stopped.
I don't recall changing anything, but I wouldn't rule that out. I've inspected the source files and I don't believe they have changed either.
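For what it is worth, this is my understanding of the row-size arithmetic for the column types I listed (the column names below are placeholders, not the real DDL, and only the six typed columns are shown):

-- varchar(5): at most 5 bytes of data plus 2 bytes of length overhead
-- decimal(9,3): 5 bytes each
-- decimal(18,0): 9 bytes each
CREATE TABLE dbo.tbl_Intraday_Tmp_sizing (
    c1 varchar(5)    NULL,
    c2 decimal(9,3)  NULL,
    c3 decimal(9,3)  NULL,
    c4 decimal(9,3)  NULL,
    c5 decimal(18,0) NULL,
    c6 decimal(18,0) NULL
)
-- worst case is roughly 7 + (3 * 5) + (2 * 9) = 40 bytes plus per-row overhead, nowhere near 8060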
I have a table named Prescription that consists of attributes like PatientId, MedicineCode, MedicineName, the prices of the different drugs, the quantity of the different drugs (e.g. 1, 2, 3, 10), and the date.
I would like to get a summary of the total number and amount of the different drugs in a specific period, and the total amount of each type of drug.
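Here is a rough sketch of the summary I have in mind (the Price and Quantity column names, the date-range parameters, and the grouping are all assumptions based on my description above):

DECLARE @StartDate datetime, @EndDate datetime
SET @StartDate = '20160101'
SET @EndDate   = '20160201'

SELECT MedicineCode,
       MedicineName,
       COUNT(*)              AS PrescriptionLines,
       SUM(Quantity)         AS TotalQuantity,
       SUM(Price * Quantity) AS TotalAmount
FROM dbo.Prescription
WHERE [Date] >= @StartDate
  AND [Date] <  @EndDate
GROUP BY MedicineCode, MedicineName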
SQL Server 2000 8.00.760 (SP3)

I've been working on a test system and the following UDF worked fine. It runs in the "current" database, and references another database on the same server called 127-SuperQuote.

CREATE FUNCTION fnGetFormattedAddress(@WorkID int)
RETURNS varchar(130)
AS
BEGIN
    DECLARE
        @Address1 As varchar(50),
        @ReturnAddress As varchar(130)

    SELECT
        @Address1 = [127-SuperQuote].dbo.tblCompany.Address1
    FROM
        [Work] INNER JOIN
        [127-SuperQuote].dbo.tblCompany ON [Work].ClientID = [127-SuperQuote].dbo.tblCompany.CompanyID
    WHERE
        [Work].WorkID = @WorkID

    IF @Address1 IS NOT NULL
        SET @ReturnAddress = @ReturnAddress + @Address1 + CHAR(13) + CHAR(10)

    RETURN @ReturnAddress
END

So now the system has gone live and it turns out that the live "SuperQuote" database is on a different server. I've linked the server and changed the function as below, but I get an error both in QA and when checking syntax in the UDF builder:

The number name 'Zen.SuperQuote.dbo.tblCompany' contains more than the maximum number of prefixes. The maximum is 3.

CREATE FUNCTION fnGetFormattedAddress(@WorkID int)
RETURNS varchar(130)
AS
BEGIN
    DECLARE
        @Address1 As varchar(50),
        @ReturnAddress As varchar(130)

    SELECT
        @Address1 = Zen.SuperQuote.dbo.tblCompany.Address1
    FROM
        [Work] INNER JOIN
        Zen.SuperQuote.dbo.tblCompany ON [Work].ClientID = Zen.SuperQuote.dbo.tblCompany.CompanyID
    WHERE
        [Work].WorkID = @WorkID

    IF @Address1 IS NOT NULL
        SET @ReturnAddress = @ReturnAddress + @Address1 + CHAR(13) + CHAR(10)

    RETURN @ReturnAddress
END

How can I get round this? By the way, I've rather simplified the function to ease readability. Also, I haven't posted any DDL because I don't think that's the problem!

Thanks
Edward
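One thing I plan to try next is aliasing the linked-server table so that the column references stay within the three-prefix limit (a sketch of just the SELECT inside the function; I am still not certain whether remote access from inside a UDF is allowed at all, so treat this as untested):

    SELECT @Address1 = sq.Address1
    FROM [Work]
        INNER JOIN Zen.SuperQuote.dbo.tblCompany AS sq
            ON [Work].ClientID = sq.CompanyID
    WHERE [Work].WorkID = @WorkID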
"tempdb is skipped. You cannot run a query that requires tempdb"?
We're running a .Net web application with a SQL Server 2000 backend, and we get the error intermittently. Restarting the SQL Server service seems to fix it, as it causes tempdb to be rebuilt, but this isn't a long term solution. Any direction or hints would be greatly appreciated. Thanks! - Mike
In the full recovery model, if I run a transaction that inserts 10 MB of data into a table, then 10 MB of data is written to the data file. Does this mean that the log file will grow by exactly 10 MB as well?
I understand that all transactions are logged to the log file to enable rollback and point-in-time recovery, but what is actually physically stored in the log file for this transaction's record? Is it the text of the command from the transaction, or the actual physical data from that transaction?
I ask because, say, I have two drives: one with a 5 MB/s write speed for the log file and one with a 10 MB/s write speed for the data file. If I start trying to insert 10 MB of data per second into the table, am I going to be limited to 5 MB/s by the log file drive, or is SQL Server not going to try to log all 10 MB each second to the log file?
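To make the question concrete, this is the kind of rough before/after test I was planning to run (the table and row size are made up for illustration; DBCC SQLPERF(LOGSPACE) simply reports log size and the percentage in use):

-- log space used before
DBCC SQLPERF(LOGSPACE)

-- insert roughly 10 MB: 10,000 rows of about 1 KB each
CREATE TABLE dbo.LogGrowthTest (id int IDENTITY(1,1), filler char(1000) NOT NULL)

DECLARE @i int
SET @i = 0
WHILE @i < 10000
BEGIN
    INSERT INTO dbo.LogGrowthTest (filler) VALUES (REPLICATE('x', 1000))
    SET @i = @i + 1
END

-- log space used after
DBCC SQLPERF(LOGSPACE)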
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets() at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font) at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate) at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate) at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate() at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set, but it will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle SQL Server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so. By the end of the day it's up around 2 GB. How can I track down what's causing this, and should this even concern me? (See the memory clerk query sketch just after this list.)
4. The Lock Pages in Memory setting worries me. Everything I've read on it seems to carry a warning about serious degradation of the OS and other programs, in some cases to the point where people had to restore the server's settings before they could bring it back up. What are your thoughts on this?
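Regarding question 3, this is the sort of breakdown I was hoping to look at while the memory climbs (a sketch; sys.dm_os_memory_clerks ships with SQL 2005 and needs VIEW SERVER STATE permission, but which clerks actually matter in my case is exactly what I don't know):

-- top memory consumers inside the SQL Server process, grouped by clerk type
SELECT TOP 10
    type,
    SUM(single_pages_kb + multi_pages_kb + awe_allocated_kb) AS total_kb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY total_kb DESC;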
I have a Windows Server 2012 machine with SQL Server 2012 Enterprise. The RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how can I reduce its memory usage without killing any process, given that this is a production server and there are many background processes running? Also, is there any guide to learning why memory climbs so high and how to reduce it?
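For what it is worth, a check I am thinking of running first is to see how much of the 22 GB the SQL Server process itself is actually holding (a sketch; sys.dm_os_process_memory is available on SQL Server 2012):

SELECT physical_memory_in_use_kb / 1024 AS sql_memory_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory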
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage so that it leaves some memory for the OS. This is based on the understanding that if the max memory is not specified, SQL will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory usage keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I can see that one particular assembly count is increasing on every loop: VBAssembly increases by 6 every time it hits the script component, and along with it the memory climbs. What is this VBAssembly being used for, and is there an update to SQL Server Integration Services that I need?
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. The OS was using about 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB according to the Idera monitoring software - all memory accounted for between the OS and SQL Server. Then 4 more GB of memory were added, for a total of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. System info shows 8 GB of memory with PAE, so I assume the full 8 GB can now be addressed. Why are fewer resources being used now with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since this 1 GB limit seems artificial. If it used 1.6 GB before, why would it not use at least that much now?
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had in it, but when I try to remove the filegroup I receive an error.
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query option or the index create memory option, because I just haven't seen any need to. My typical thought is that if it isn't broke... Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
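For reference, this is a quick way I have been checking what was actually modified on each server (a sketch; it assumes SQL 2005 or later, where sys.configurations exposes both the configured value and the value in use):

SELECT name, value, value_in_use, is_advanced
FROM sys.configurations
WHERE name IN ('min memory per query (KB)', 'index create memory (KB)')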
How can I tell how often a checkpoint is being executed?
I have a lot of data to insert into a table (via a SQL insert from another database on the same server), and I do not want to fill the log... so I will do a SET ROWCOUNT 100000, insert records, wait for the checkpoint to run (which will clear the log), and repeat this process until all the records are inserted.
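This is roughly the batching loop I have in mind (the table and column names are placeholders, not my real schema; it also assumes the database uses the simple recovery model, since otherwise a checkpoint alone will not free log space):

SET ROWCOUNT 100000

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    INSERT INTO dbo.TargetTable (KeyCol, DataCol)
    SELECT s.KeyCol, s.DataCol
    FROM OtherDatabase.dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.KeyCol = s.KeyCol)

    SET @rows = @@ROWCOUNT

    CHECKPOINT   -- issue a checkpoint between batches instead of waiting for an automatic one
END

SET ROWCOUNT 0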
I have a report which lists sales figures by salesperson, and I need the report to highlight the maximum amount (i.e. the person who got the highest sales figures). How can I do this?
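A sketch of one way to flag the row from the data side, in case it helps show what I am after (the table and column names are made up, and MAX() OVER () needs SQL Server 2005 or later; in the report itself I would then highlight the rows where IsTopSeller = 1):

SELECT SalesPerson,
       SalesAmount,
       CASE WHEN SalesAmount = MAX(SalesAmount) OVER () THEN 1 ELSE 0 END AS IsTopSeller
FROM dbo.SalesByPerson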