How Can I Use Dual Core CPU For Performance Improvement
Jul 17, 2007
Hi,
I am running MSSQL 2005 Standard edition on a two processor Intel Xeon 3GHz (dual-core) with 8GB RAM.
I notice in Windows Task Manager's CPU performance view that while running a long SQL statement (it takes 1.5 hours), only 1 logical processor (out of 4) is utilised at >70%. The remaining 3 logical processors hover around 10%.
Using Performance Monitor, the average read queue, write queue, and pages/sec also hover around 25%, indicating no heavy physical disk/memory load.
How can I configure SQL Server to utilise more physical/logical processors to improve performance?
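One thing worth checking (a sketch, not a definitive fix): a single statement only spreads across cores when the optimizer produces a parallel plan, and the instance-wide 'max degree of parallelism' setting can cap that. The table name below is a placeholder, not from the post:

```sql
-- Check the instance-wide parallelism cap (0 = use all available CPUs).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism';

-- Allow a single expensive statement to use all 4 logical processors.
-- dbo.BigTable is a hypothetical stand-in for the long-running query.
SELECT COUNT(*)
FROM dbo.BigTable
OPTION (MAXDOP 4);
```

Note that some plans (cursor loops, scalar-UDF-heavy work) stay serial regardless of MAXDOP, so a single busy core can also mean the statement simply does not qualify for parallelism.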
We are in the process of moving to 64-bit HP servers with SQL 2005 Standard edition. We were just wondering which is the better option: to get a server with 2 dual-core processors, or to get a server with just 4 processors? How does SQL 2005 handle the hyperthreading of dual processors?
Hi, is there a reason why we have to pay more for licensing for a different kind of processor? Why are we not charged for the hyperthreading on some processors also? If Oracle is really concerned about the low-end business market (small and medium), then they should drop their attitude on dual-core processors. If they start charging as if it was a normal processor, and ask the normal price, then they would get more of this market coming in. As long as Oracle keeps on having the attitude of charging more, because Intel or some other CPU vendor decided to improve their processors because of overheating problems, I will have the attitude that I will keep on recommending alternatives to Oracle like MySQL / PostgreSQL / Sybase, etc. to the small/medium sector. Microsoft's pricing model on dual-core processors suddenly sounds a lot better. Oracle are shooting themselves in the foot! Or am I the only person feeling this way? Shaun O'Reilly
OK, I have an Intel dual core and I have a conflict only when playing one game, Black Hawk Down: it gives me a run-stop error, locks up, and has to be restarted. Microsoft gave me a fix, but when I apply the fix it causes a system dump error in the game. I can update my web site, do anything else, let my daughter play her games or do her school work, and nothing happens. I was told I needed to set up the dual core so that my programs don't conflict. I am a moron when it comes to computers; is there a fix for this, or am I just going to have to go back to a single-core processor for now? Thanks.
After reading an article today on SQL Server Central about choosing the best connectors for ssis (obviously written to drive sales, but hey it looks like it's working) I downloaded an evaluation copy of DataDirect's 64-bit DB2 driver and started some preliminary testing.
- Rather easy to install (downloaded, briefly read the documentation, and was up and running in about 10 minutes).
- With Microsoft's IBM OLE DB provider for DB2 (32-bit) we can extract 1 million records in approx. 9 minutes.
- With the 64-bit driver we can extract 1 million records in approx. 2.25 minutes! That's 4 times as fast as the 32-bit driver!
- Was unable to get the driver to work through a linked server. Tech support opened an issue to look into it.
- The sales rep didn't have exact pricing, but she thought they charged per core.
- DataDirect also has 64-bit Oracle and Sybase drivers available.
For those of you extracting large amounts of data from DB2, Oracle, or Sybase with SSIS and are looking to improve performance I'd recommend at least checking this product out.
Also, I'd be interested to hear if anyone else started testing this or any other 64-bit driver for DB2.
We have a 200MHz Pentium Pro based machine with 128MB RAM running SQL Server 6.5. Because of performance issues, we are contemplating an upgrade to a dual 200MHz Pentium Pro machine with 256MB RAM. However, the vendor we are dealing with has suggested an upgrade to a single Pentium II/333MHz first, and if this still causes problems, then to a dual P II/333MHz. Does anyone have any suggestions from similar upgrades that they may have undergone? We have 72MB allocated to SQL Server.
Is the performance of a web application (ASP.NET + SQL Server 2005 Workgroup edition + Win Server 2003 Web edition) running on a server with one core duo/4 CPU generally comparable to the performance of the same application running on the same server with 2/4 physical CPUs?
Hi, I've been creating a db application using MS Access and MSDE. Only two of us are using the application, and the server and the app both run great on my laptop (1.6 GHz Pentium M, 2GB RAM, W2K Pro). The only problem is when I take my laptop home, my coworker loses access to the server. We recently purchased a dedicated server to run the db on at the office. It's a 2.8 GHz dual Xeon, 2GB RAM, running XP Pro. We also bought SQL Server, but I installed the Personal Edition because we are not using a server OS. It's my understanding that XP can utilize both processors, and the Personal Edition can use both processors as well. (On a side note, why is Enterprise Manager showing that I have 4 processors?) In addition, I understand PE has a workload governor that cripples performance when more than 5 TSQL commands are being run simultaneously. I backed up the db on my laptop and restored it on our new server. But when I run the exact same queries with the exact same number of rows, my queries on the new server take 3x longer(!?). Can someone please offer a few suggestions for why this is happening? What can I do to improve performance on the server machine? Please let me know if I need to supply more information. Thanks, Alex
I am planning to build a server to be used as a SQL Server and web server. Right now I can only use a single box for both. I have read some threads where dual processors are having problems with some parallel queries, and the suggestion of having SQL Server use a single CPU. My budget is limited, so I am debating whether to get a 2.6G dual Xeon 533FSB, or a dual P4 800FSB (DDR2 RAM), or stick with a speedy single CPU. If I get a dual-CPU motherboard, is it a good idea to have 1 CPU used for SQL Server and the other for everything else? John Dalberg
In our business we receive debt from clients that we work in order to try and reclaim some of the debt on their behalf. When a debt file comes in from a client, it is loaded onto the system and every entry in that debt file is stamped with the same batch id. Using the batch id we track each debt file and monitor how well we are working it. So as an example, we could receive a debt file with 100 records in it; each of these records would be stamped with a batch id of, say, 100001, and each record in the next file loaded onto the system would be stamped with 100002. So we can track how each batch is doing by grouping by batch id.
I have written my query so that it basically groups all the accounts on the system by batch id. This is fine when I only want to return totals and sums in the select list. The way I have written the query, if I want to calculate the payments we have received for each batch, or the commission we will make on the payments (or any other info per batch), I have to use subqueries in the select list using the batch id I have grouped by. Is this the best/only way to achieve what I want to do?
(Select sum(da.Amount)
 from dbo.DebtAdjustment da WITH (NOLOCK), dbo.ImportBatchItem bi WITH (NOLOCK)
 where bi.ItemID=da.DebtID And bi.ImportBatchID=ib.ImportBatchID) Adjustments,
---------------------------------------------------------------
(Select count(dh.DebtID)
 from dbo.Debthistory dh WITH (NOLOCK), dbo.ImportBatchItem bi WITH (NOLOCK)
 where --No. OBC's
 bi.ItemID=dh.DebtID And dh.Note like 'OBC:%'
 And bi.ImportBatchID=ib.ImportBatchID) OBC_Num,
---------------------------------------------------------------
(Select count(dh.DebtID)
 from dbo.Debthistory dh WITH (NOLOCK), dbo.ImportBatchItem bi WITH (NOLOCK)
 where --No. ICC's
 bi.ItemID=dh.DebtID And dh.Note like 'ICC:%'
 And bi.ImportBatchID=ib.ImportBatchID) ICC_Num,
---------------------------------------------------------------
(Select count(dh.DebtID)
 from dbo.ImportBatchItem bi WITH (NOLOCK), dbo.DebtHistory dh WITH (NOLOCK)
 Where dh.DebtID=bi.ItemID
 AND bi.ImportBatchID=ib.ImportBatchID
 AND dh.UserName='Letter Server'
 AND dh.Note like '%Letter%') ItemsMailed,
---------------------------------------------------------------
Cast((Select sum(CASE WHEN dp.ReceivedByID = 1 THEN dp.Amount * (tF.Rate /100)
                      WHEN dp.ReceivedByID = 2 THEN dp.Amount * (tD.Rate /100)
                      ELSE dp.Amount * ((tF.Rate + tFe.Rate) / 100) END)
 From dbo.DebtPayment dp
 JOIN dbo.Debt d ON d.DebtID=dp.DebtID
 JOIN dbo.ImportBatchItem bi ON dp.DebtID=bi.ItemID
 LEFT JOIN dbo.mTrackerFeeChange tF
   ON tF.ClientID=d.ClientID AND tF.ContractID=d.ContractID
   AND dp.ReceivedByID=tF.RateType
   AND ( (dp.PaymentOn >= tF.StartDate AND dp.PaymentOn <= tF.EndDate)
      OR (dp.PaymentOn >= tF.StartDate AND tF.EndDate IS NULL) )
 LEFT JOIN dbo.mTrackerDirectChange tD
   ON tD.ClientID=d.ClientID AND tD.ContractID=d.ContractID
   AND dp.ReceivedByID=tD.RateType
   AND ( (dp.PaymentOn >= tD.StartDate AND dp.PaymentOn <= tD.EndDate)
      OR (dp.PaymentOn >= tD.StartDate AND tD.EndDate IS NULL) )
 LEFT JOIN dbo.mTrackerFieldChange tFe
   ON tFe.ClientID=d.ClientID AND tFe.ContractID=d.ContractID
   AND tFe.RateType=dp.ReceivedByID
   AND ( (dp.PaymentOn >= tFe.StartDate AND dp.PaymentOn <= tFe.EndDate)
      OR (dp.PaymentOn >= tFe.StartDate AND tFe.EndDate IS NULL) )
 where bi.ImportBatchID=ib.ImportBatchID) AS decimal(10,2)) ComRate,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 1 AND Debt.OutstandingValue > 0) Girobank,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 2 AND Debt.OutstandingValue > 0) StandingOrder,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 3 AND Debt.OutstandingValue > 0) CreditCard,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 4 AND Debt.OutstandingValue > 0) DirectDebit,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 5 AND Debt.OutstandingValue > 0) Cheque,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 6 AND Debt.OutstandingValue > 0) PostalOrder,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 7 AND Debt.OutstandingValue > 0) Cash,
---------------------------------------------------------------
(SELECT sum(Debt.OutstandingValue) TotalValue
 From Debt WITH (NOLOCK), DebtStatus WITH (NOLOCK), ImportBatchItem bi WITH (NOLOCK)
 WHERE Debt.Status=DebtStatus.DebtStatusID AND bi.ItemID=Debt.DebtID
 And bi.ImportBatchID=ib.ImportBatchID
 AND DebtStatus.Description LIKE ('%' + 'Arrangement' + '%')
 AND Debt.AgreementPaymentMethodID = 8 AND Debt.OutstandingValue > 0) BankersDraft
---------------------------------------------------------------
From dbo.Client c WITH (NOLOCK), dbo.Debt d WITH (NOLOCK), dbo.ImportBatchItem ib WITH (NOLOCK)
Where c.ClientID=d.ClientID AND d.DebtID=ib.ItemID
 AND (@ClientID IS NULL OR c.ClientID = @ClientID)
Group by c.Name, ib.ImportBatchID, CONVERT(VARCHAR(10),ib.UpdatedOn,103),
 DATENAME(Month, ib.UpdatedOn) + ' ' + DATENAME(Year, ib.UpdatedOn)
Order by ib.ImportBatchID
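One common alternative to the per-batch correlated subqueries, sketched under the assumption that the tables join as shown in the post, is to fold the eight payment-method sums into a single pass with conditional aggregation (CASE inside SUM) grouped by batch:

```sql
-- Sketch: one scan per batch instead of eight correlated subqueries.
-- Table and column names are taken from the post; treat as illustrative.
SELECT  bi.ImportBatchID,
        SUM(CASE WHEN d.AgreementPaymentMethodID = 1 THEN d.OutstandingValue ELSE 0 END) AS Girobank,
        SUM(CASE WHEN d.AgreementPaymentMethodID = 2 THEN d.OutstandingValue ELSE 0 END) AS StandingOrder,
        SUM(CASE WHEN d.AgreementPaymentMethodID = 3 THEN d.OutstandingValue ELSE 0 END) AS CreditCard
        -- ...and so on for the remaining payment methods
FROM    dbo.ImportBatchItem bi
        JOIN dbo.Debt d        ON d.DebtID = bi.ItemID
        JOIN dbo.DebtStatus ds ON ds.DebtStatusID = d.Status
WHERE   ds.Description LIKE '%Arrangement%'
  AND   d.OutstandingValue > 0
GROUP BY bi.ImportBatchID;
```

This derived result can then be LEFT JOINed back to the outer grouped query on ImportBatchID, which usually reads and performs better than one subquery per column.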
I did a trace on a production DB for many hours and got more than 7 million "RPC:Completed" and "SQL:BatchCompleted" trace records. Then I grouped them and obtained only 545 distinct events (just EXECs and SELECTs), and saved them into a new workload file.
To test the workload file, I ran DTA for just 30 minutes against a restored database on a test server, and got the following:

Date                                            28-12-2007
Time                                            18:29:31
Server                                          SQL2K5
Database(s) to tune                             [DBProd]
Workload file                                   C:\Temp\filtered.trc
Maximum tuning time                             31 Minutes
Time taken for tuning                           31 Minutes
Expected percentage improvement                 20.52
Maximum space for recommendation (MB)           12874
Space used currently (MB)                       7534
Space used by recommendation (MB)               8116
Number of events in workload                    545
Number of events tuned                          80
Number of statements tuned                      145
Percent SELECT statements in the tuned set      77
Percent INSERT statements in the tuned set      13
Percent UPDATE statements in the tuned set      8
Number of indexes recommended to be created     15
Number of statistics recommended to be created  50

Please note that only 80 of the 545 events were tuned, and a 20% improvement is expected if 15 indexes and 50 statistics are created.
Then I ran the same analysis with an unlimited amount of time... After the whole weekend, DTA was still running and I had to stop it. The result was:

Date                                            31-12-2007
Time                                            10:03:09
Server                                          SQL2K5
Database(s) to tune                             [DBProd]
Workload file                                   C:\Temp\filtered.trc
Maximum tuning time                             Unlimited
Time taken for tuning                           2 Days 13 Hours 44 Minutes
Expected percentage improvement                 0.00
Maximum space for recommendation (MB)           12874
Space used currently (MB)                       7534
Space used by recommendation (MB)               7534
Number of events in workload                    545
Number of events tuned                          545
Number of statements tuned                      1064
Percent SELECT statements in the tuned set      71
Percent INSERT statements in the tuned set      21
Percent DELETE statements in the tuned set      1
Percent UPDATE statements in the tuned set      5

This time DTA processed all the events, but no improvement is expected! No index/statistics creation recommendations either.
It does not seem that Tuning Advisor crashed... Usage reports are fine and make sense to me.
What's happening here? It looks like DTA applied the recommendations and iterated, but no new objects were found in the DB.
I guess that the recommendations from the first try with only 80 events were invalidated by the remaining events from the long run.
I am in the process of evaluating 2005 to upgrade a DTS and A/S solution for accounts receivable data imported from a legacy application on various AS/400s in our enterprise. Our current process uses staging and mart databases with views providing a layer of abstraction. Instead of using upgrade wizards, I would prefer to implement a "ground up" rearchitecture that better incorporates the 2005 toolset.
One of the challenges is building a fact table for historical a/r balances, since the legacy application will only provide transactions, meaning there is no data entry for a month where there might be a balance but no transaction (payment, invoice, credit, etc.) occurred. Our current implementation uses a date table as its base, performing a left join to the transaction table. I have to use subqueries to calculate previous balances, which are also used to calculate the ending balance.
Should I replace the current view with data flow logic that would perform this calculation? If so, what would be the best method? A merge join? A calculated column? If it helps to provide sample schema snippets, I would be happy to do so.
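For reference, one way the running-balance calculation is often sketched on SQL Server 2005 (which lacks ordered window SUMs) is a correlated subquery over a month dimension. The table and column names here are illustrative, not from the post:

```sql
-- Sketch: month-end A/R balance even for months with no transactions.
-- dbo.DimMonth(MonthEnd) and dbo.ArTransaction(CustomerID, TranDate, Amount)
-- are assumed names.
SELECT  m.MonthEnd,
        c.CustomerID,
        (SELECT ISNULL(SUM(t.Amount), 0)
         FROM dbo.ArTransaction t
         WHERE t.CustomerID = c.CustomerID
           AND t.TranDate  <= m.MonthEnd) AS EndingBalance
FROM dbo.DimMonth m
CROSS JOIN (SELECT DISTINCT CustomerID FROM dbo.ArTransaction) c;
```

In a data flow, the same effect is often easier to get by staging this as the source query than by rebuilding the logic with a Merge Join or Derived Column transform.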
I'm trying to create an SSIS monitoring tool (using sysdtslog90, Reporting Services, and system tables) including SQL Server Agent SSIS schedules/history, but I'm having some problems evaluating sysdtslog90. I'm using the built-in SSIS logging without any custom logging tasks (I only log OnError and OnWarning) to keep the logging table as small as possible (one sysdtslog90 table for all packages running on the server).
And here is the problem:
There are some executions without any events except OnError. The source of this event is the name of the failed task, so how do I get the name and id of the failed package?
"Normal" executions have "PackageStart" and "PackageEnd" events, where sourceid and source are equal to the id and name of the package.
I could add logging for OnPreValidate and catch the first line, because no package can fail before validation!?
But isn't there any better solution?
I would love to have two more columns in sysdtslog90: package id and name. It would be much easier to evaluate sysdtslog90 :/
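If OnPreValidate logging is added as suggested, a sketch of pulling the package name for failed executions from the first logged row per executionid might look like this (column names as in sysdtslog90; the approach assumes the package itself is the first thing that validates and hence logs):

```sql
-- Sketch: for every execution that logged an OnError, fetch the first
-- logged row of that execution, whose source/sourceid identify the package.
SELECT  e.executionid,
        f.source   AS PackageName,
        f.sourceid AS PackageID
FROM   (SELECT DISTINCT executionid
        FROM dbo.sysdtslog90
        WHERE event = 'OnError') e
JOIN    dbo.sysdtslog90 f
  ON    f.id = (SELECT MIN(l.id)
                FROM dbo.sysdtslog90 l
                WHERE l.executionid = e.executionid);
```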
Hi, I wonder if you could shed some light on this. I have the following table:

Id, ContentId, VersionDate, ContentXml

There are several ContentIds in the table.

SELECT *
FROM tblVersions
WHERE (VersionDate = (SELECT MAX(tblVersions2.VersionDate)
                      FROM tblContentVersion AS tblVersions2
                      WHERE tblVersions2.ContentiD = tblVersions.ContentiD))
ORDER BY ContentId

This query works to select the latest version (MAX) of every content, but I do not like it. Is there any other way to do this properly? I also want to do this knowing a set of ids (probably using IN):

SELECT *
FROM tblVersions
WHERE (VersionDate = (SELECT MAX(tblVersions2.VersionDate)
                      FROM tblContentVersion AS tblVersions2
                      WHERE tblVersions2.ContentiD = tblVersions.ContentiD
                        AND tblVersions.ContentiD IN (1, 2, 3, 6, 7, 8)))
ORDER BY ContentId

Any ideas for improvements on this query? ContentXml is of ntext type.
Thanks,
/ jorge
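If you are on SQL Server 2005 or later, one commonly suggested rewrite (a sketch, assuming VersionDate is unique per ContentId) uses ROW_NUMBER() to pick the latest version in a single pass:

```sql
-- Sketch: latest version per ContentId, with the optional id filter folded in.
SELECT Id, ContentId, VersionDate, ContentXml
FROM (SELECT Id, ContentId, VersionDate, ContentXml,
             ROW_NUMBER() OVER (PARTITION BY ContentId
                                ORDER BY VersionDate DESC) AS rn
      FROM tblVersions
      WHERE ContentId IN (1, 2, 3, 6, 7, 8)) v
WHERE rn = 1
ORDER BY ContentId;
```

Dropping the WHERE ... IN clause gives the latest version of every content.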
Problem: We have a set of SQL queries and one Core Duo processor. We want one subset of the queries to be executed on the first core of our CPU, and the rest of the queries to be executed on the second core. We are using MS SQL Server 2005 Workgroup edition.
Is there any way to do this with one instance of SQL Server, or at least with two instances (is there any way to force instance A to use CPU1, the first core of the Core Duo CPU, and instance B to use CPU2, the second core)?
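Within one instance there is no supported way to route individual queries to specific cores, but pinning two instances to different cores can be sketched with the 'affinity mask' option (a bitmask of logical CPUs):

```sql
-- Run against instance A: bind it to CPU 0 (binary 01).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'affinity mask', 1;
RECONFIGURE;

-- Run against instance B: bind it to CPU 1 (binary 10).
-- EXEC sp_configure 'affinity mask', 2;
-- RECONFIGURE;
```

Each subset of queries would then connect to the instance pinned to the core it should use.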
The number of data files within a single filegroup should equal the number of CPU cores, so for a quad-core CPU there should be 4 data files. Is there anything one needs to take care of besides creating the data files for the filegroup? I believe SQL Server itself takes care of how to update the data files.
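Adding the extra files is the only manual step. A sketch, where the database name, paths, and sizes are placeholders:

```sql
-- Sketch: bring the PRIMARY filegroup up to 4 files for a quad-core box.
ALTER DATABASE MyDb
ADD FILE
    (NAME = MyDb_Data2, FILENAME = 'D:\Data\MyDb_Data2.ndf', SIZE = 1024MB),
    (NAME = MyDb_Data3, FILENAME = 'D:\Data\MyDb_Data3.ndf', SIZE = 1024MB),
    (NAME = MyDb_Data4, FILENAME = 'D:\Data\MyDb_Data4.ndf', SIZE = 1024MB)
TO FILEGROUP [PRIMARY];
```

SQL Server stripes new allocations across the files proportionally on its own. Worth noting: the one-file-per-core guideline is most firmly established for tempdb; for user databases the benefit depends on the I/O pattern.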
We just purchased a quad-core Xeon server. It is my understanding that Express can only utilize a single CPU/core. Is there a way I can set up Windows or MS SQL Server to dedicate one particular core solely to MS SQL?
I'm wondering if it is possible to have two independent y-axes on a single chart?
I am wanting to plot data using a hybrid bar/line chart and have the values for the vertical bars show on the left y-axis and the values for the points on the line show on the right y-axis.
I've seen this sort of thing done on other platforms before, but don't know if it is possible with SQL Reporting. We are using SQL RS 2000 by the way.
I have created a C# Windows Forms application that can be run connected directly to SQL Server 2005 (publisher) for in-office users and to a SQL Express (subscriber) on a tablet PC for remote users. The server is set through configuration and the remote users sync using replication and it all works.
The issue I'm having is that I've found it necessary to install SQL Server management objects (SQLServer2005_XMO) on the clients who sit at desktops and never use replication or SMO. I believe this is because SMO has to be installed in the GAC and I can't just distribute the required dlls with my app.
Is there any way I can deploy this app to always connected users without installing SMO on their machines?
I configured an SSIS package to collect information about the servers in our environment, and as part of it the package collects the physical and logical CPUs. Since we are on per-core licensing for SQL Server, I need to get the exact core count. I could simply divide logical CPUs by physical CPUs to get the core count, assuming hyperthreading is turned on; but if hyperthreading is not on, I would end up with the wrong core count.
I query the registry to get this info. I would like your input on getting the exact core count on Windows servers with Intel and AMD processors.
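On the SQL Server side, a sketch that avoids the registry altogether is sys.dm_os_sys_info (available from SQL Server 2005 on):

```sql
-- cpu_count         = logical processors visible to SQL Server
-- hyperthread_ratio = logical processors per physical socket
SELECT cpu_count,
       hyperthread_ratio,
       cpu_count / hyperthread_ratio AS physical_sockets
FROM sys.dm_os_sys_info;
```

Caveat: hyperthread_ratio counts cores and hyperthreads alike, so it cannot separate the two by itself; on newer Windows versions, WMI's Win32_Processor.NumberOfCores is the more reliable source for true core counts.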
Hi, I am trying to populate a table with 2 FKs as its PK (SiteID and ProductDescID). First 1) I add in all the products whose Manufacturer and Category are supposed to appear on the site, and then 2) I add in all the extra products that are needed regardless of their manufacturer or category. The problem I am having is when a product has already been added to ProductCatTable due to its Manufacturer or Category but is also in the SatForceProduct table: the "can't insert duplicate PK" error is thrown. I don't know how to write the IF NOT EXISTS statement (or whatever else may be needed) so that I can check whether a line from the forced table needs to be added. I am not passing in any parameters, and I am expecting more than 1 line to be inserted in each of the statements. Please help.

-- 1) Populate
INSERT INTO dbo.ProductCatTable (SiteID, ProductDescID)
SELECT dbo.SatSite.SiteID, dbo.ProductDesc.ProductDescID
FROM dbo.SatManu
INNER JOIN dbo.ProductDesc ON dbo.SatManu.Manu = dbo.ProductDesc.Manu
INNER JOIN dbo.SatCats ON dbo.ProductDesc.Cat = dbo.SatCats.Cat
INNER JOIN dbo.SatSite ON dbo.SatManu.SatID = dbo.SatSite.SiteID
                      AND dbo.SatCats.SatID = dbo.SatSite.SiteID

-- 2) Add Force Ins
IF NOT EXISTS (SELECT SiteID, ProductDescID FROM ProductCatTable WHERE ????????)
BEGIN
    INSERT INTO dbo.ProductCatTable (SiteID, ProductDescID)
    SELECT SiteID, ProductDescID FROM dbo.SatForceProduct
END

Thanks in advance, J
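One sketch of the second insert that sidesteps IF NOT EXISTS entirely: filter the forced rows row by row with NOT EXISTS inside the INSERT...SELECT (this assumes SatForceProduct carries SiteID and ProductDescID, as the post's second statement implies):

```sql
-- 2) Add forced products, skipping pairs that step 1 already inserted.
INSERT INTO dbo.ProductCatTable (SiteID, ProductDescID)
SELECT f.SiteID, f.ProductDescID
FROM dbo.SatForceProduct f
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.ProductCatTable p
                  WHERE p.SiteID = f.SiteID
                    AND p.ProductDescID = f.ProductDescID);
```

IF NOT EXISTS tests one condition for the whole statement, whereas the NOT EXISTS predicate here is evaluated per row, which is what a multi-row insert needs.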
I am building a DTS package that moves data from our webstore (written in house) to a Warehouse Management System (WMS, turnkey), and I've encountered a problem. Both pieces of software have an Orders table and an Ordered_Items table, related by the Order_ID (makes sense so far). Here is the problem: the primary key on the webstore's Ordered_Items table is a single column (basically an identity column), while the primary key on the WMS's Ordered_Items table is a dual-column primary key on Order_ID and Order_LineID, so the data should be stored like:
OrderID  Order_LineID
1        1
2        1
2        2
2        3
3        1
3        2
4        1
Get the idea? So I have to create this new Order_LineID column. How can I accomplish this with a SQL statement?
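A sketch of generating Order_LineID from the webstore's identity column (here assumed to be called ItemID; adjust to the real name):

```sql
-- SQL Server 2005+: number the items within each order.
SELECT OrderID,
       ROW_NUMBER() OVER (PARTITION BY OrderID ORDER BY ItemID) AS Order_LineID
FROM dbo.Ordered_Items;

-- SQL Server 7/2000 (DTS era, no ROW_NUMBER): a correlated count over the
-- identity column produces the same numbering.
SELECT i.OrderID,
       (SELECT COUNT(*)
        FROM dbo.Ordered_Items i2
        WHERE i2.OrderID = i.OrderID
          AND i2.ItemID <= i.ItemID) AS Order_LineID
FROM dbo.Ordered_Items i;
```

Either result set can feed the transformation that loads the WMS's Ordered_Items table.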
We are trying to work with our developers to upgrade a critical application from SQL 7 to SQL 2000, and all looks good in testing for the most part. The concern our developers have is that in order for the application to work on the test SQL 2000 server, they had to delete a core data type (bigint). It doesn't appear to have any negative effects, and we know for sure that the application database does not need that data type at all. Can someone verify that there are no requirements within SQL 2000 for this data type? They are worried that something inside SQL may rely on it and we would find out the hard way in production.