Optimizing A Query
Jun 3, 2004
Hello,
I am hoping someone here can help me optimize the following query:
SELECT
INCOMING.DATE_TIME, INCOMING.URL, INCOMING.HITS,
USER_NAMES.USER_LOGIN_NAME,
CATEGORY.NAME
FROM
(wsHQMay2004.dbo.INCOMING INCOMING INNER JOIN wsHQMay2004.dbo.CATEGORY CATEGORY ON INCOMING.CATEGORY = CATEGORY.CATEGORY)
INNER JOIN wsHQMay2004.dbo.USER_NAMES USER_NAMES ON INCOMING.USER_ID = USER_NAMES.USER_ID
WHERE
INCOMING.DATE_TIME >= '2004-05-01 00:00:00.00' AND
INCOMING.DATE_TIME < '2004-06-01 00:00:00.00'
ORDER BY
INCOMING.URL ASC
I am just hoping to get some tips on perhaps a better way to write this query, as right now, due to the size of the INCOMING table, it just takes forever.
Any advice will be appreciated.
Thanks.
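A hedged starting point, not from the thread: with the filter being a plain date range, an index on INCOMING that leads with DATE_TIME usually turns a full-table scan into a range seek (the ORDER BY URL will still cost a sort). A sketch, assuming no such index exists yet:

CREATE INDEX IX_INCOMING_DATE_TIME
ON dbo.INCOMING (DATE_TIME, CATEGORY, USER_ID)
-- run in the wsHQMay2004 database; DATE_TIME leads so the >= / < range can seek,
-- CATEGORY and USER_ID ride along as trailing keys for the two joins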
View 5 Replies
May 28, 2004
We have an insurance program up and running in our regions, and we get random reports of slowness. In an effort to track down all facets of the slowness, I am looking at all my SQL code to make sure it is as efficient as possible. I know a little about SQL and writing SQL statements, enough to help me do my job well, but I do not write optimized code.
if request.form("selPolicyNum") <> "" then
sqlPolicyInfo = "SELECT PIEffectiveDate, PIExpirationdate from PIMaster where PIPolicyNum='" & request.form("selPolicyNum") & "'"
Set rsPolicyInfo = Server.CreateObject("ADODB.Recordset")
Set rsPolicyInfo.ActiveConnection = webLookupConn
'rsPolicyInfo.CursorType = adOpenDynamic
'rsPolicyInfo.LockType = adLockOptimistic
rsPolicyInfo.Source = sqlPolicyInfo
'rsPolicyInfo.CursorLocation = adUseClient
rsPolicyInfo.Open
'response.write sqlPolicyInfo
end if
if request.form("selPolicyNum") <> "" then
sqlRemarkInsert="INSERT INTO clRemarks (PolicyNum, AccountNum, UserID, RemarkBody, CreationDate, PolEffectiveDate, PolExpDate, RemarkCategory1, RemarkCategory2, RemarkCategory3, RemarkCategory4, RemarkCategory5, RemarkCategory6, RemarkCategory7, RemarkCategory8, RemarkCategory9, RemarkCategory10, RemarkCategory11, RemarkCategory12, RemarkCategory13) VALUES ('"
sqlRemarkInsert=sqlRemarkInsert & request.form("selPolicyNum") & "', '" & request.form("txtAcctNum") & "', '" & request("aysmenu")("userid") & "', '" & Replace(request.form("RemarkBody"),"'","''") & "', '" & CurrDate & "', '" & rsPolicyInfo("PIEffectiveDate") & "', '" & rsPolicyInfo("PIExpirationDate") & "', '" & request.form("chkIssuingInstructions") & "', '" & request.form("chkLossControl") & "', '" & request.form("chkLossHistoryClaims") & "', '" & request.form("chkReinsurance") & "', '" & request.form("chkMVRDriverIssues") & "', '" & request.form("chkBilling") & "', '" & request.form("chkGeneral") & "', '" & request.form("chkCancellationDNR") & "', '" & request.form("chkPolicyAmendments") & "', '" & request.form("chkExperienceRatingInfo") & "', '" & request.form("chkDiscretionaryPricing") & "', '" & request.form("chkPremiumAudit") & "', '" & request.form("chkQuotingInstructions") & "')"
else
sqlRemarkInsert="INSERT INTO clRemarks (AccountNum, UserID, RemarkBody, CreationDate, RemarkCategory1, RemarkCategory2, RemarkCategory3, RemarkCategory4, RemarkCategory5, RemarkCategory6, RemarkCategory7, RemarkCategory8, RemarkCategory9, RemarkCategory10, RemarkCategory11, RemarkCategory12, RemarkCategory13) VALUES ('"
sqlRemarkInsert=sqlRemarkInsert & request.form("txtAcctNum") & "', '" & request("aysmenu")("userid") & "', '" & Replace(request.form("RemarkBody"),"'","''") & "', '" & CurrDate & "', '" & request.form("chkIssuingInstructions") & "', '" & request.form("chkLossControl") & "', '" & request.form("chkLossHistoryClaims") & "', '" & request.form("chkReinsurance") & "', '" & request.form("chkMVRDriverIssues") & "', '" & request.form("chkBilling") & "', '" & request.form("chkGeneral") & "', '" & request.form("chkCancellationDNR") & "', '" & request.form("chkPolicyAmendments") & "', '" & request.form("chkExperienceRatingInfo") & "', '" & request.form("chkDiscretionaryPricing") & "', '" & request.form("chkPremiumAudit") & "', '" & request.form("chkQuotingInstructions") & "')"
end if
rsRemarkInsert=ComProConn.Execute (sqlRemarkInsert)
rsPolicyInfo.close
Set rsPolicyInfo=nothing
That is the code used to store a remark into the system. Is this code optimized already, or should some of the DB parameters be changed to make things faster? This is just an example of many of the SQL statements that I may or may not have to fix. Thank you for any and all help.
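A hedged aside, not from the thread: rather than concatenating form values into the SQL text, the INSERT can live in a parameterized stored procedure (called from ASP through an ADODB.Command), which lets SQL Server reuse one plan and avoids quoting problems. A sketch abbreviated to a few of the columns, with guessed data types:

CREATE PROCEDURE dbo.InsertRemark
    @PolicyNum    varchar(30) = NULL,   -- data types/lengths are assumptions
    @AccountNum   varchar(30),
    @UserID       varchar(30),
    @RemarkBody   varchar(8000),
    @CreationDate datetime
AS
INSERT INTO clRemarks (PolicyNum, AccountNum, UserID, RemarkBody, CreationDate)
VALUES (@PolicyNum, @AccountNum, @UserID, @RemarkBody, @CreationDate)
GO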
View 1 Replies
View Related
Jun 23, 2008
I have the following query that works fine, but I'm wondering if there is a way to optimize it further, since when I analyze it through SQL Profiler it is at the top of the list for CPU usage.
SELECT DISTINCT site, d, (SELECT COUNT(id) FROM anP aPV2 WHERE aPV2.confirmed=1 and aPV2.stage=2 and aPV2.inserted=0 and aPV2.site=aPV1.site and aPV2.d>=aPV1.d and aPV2.d<=aPV1.d) AS mycount FROM anP aPV1 WHERE confirmed=1 AND stage=2 AND inserted=0 ORDER BY site,d
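A side note, not from the replies: the two date conditions in the subquery (aPV2.d >= aPV1.d and aPV2.d <= aPV1.d) collapse to aPV2.d = aPV1.d, so the whole statement appears equivalent to a plain GROUP BY, which avoids re-running the correlated COUNT for every row. A sketch, assuming that reading of the logic is what was intended:

SELECT site, d, COUNT(id) AS mycount
FROM anP
WHERE confirmed = 1 AND stage = 2 AND inserted = 0
GROUP BY site, d
ORDER BY site, d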
View 8 Replies
View Related
Sep 18, 2007
Hi,
I have the below query written so that I do not insert entries that already exist in the table. I am trying to put in 70,000 entries in a single shot and it breaks down. Can anybody help me optimize the query below so that it doesn't break? Is there any other way I can write this query?
Please do help me with this. Thanks in advance. The table into which I am inserting the entries has a composite key composed of ACCT_NUM_MIN and ACCT_NUM_MAX. I am getting the data from a table which doesn't have a primary key (CORE).
INSERT INTO CRF (CORE_UID,ACCT_NUM_MIN, ACCT_NUM_MAX,BIN, BUS_ID,BUS_NM,ISO_CTRY_CD, REGN_CD, PROD_TYPE_CD, CARD_TYPE)
SELECT UID , LEFT(ACCT_NUM_MIN,16), LEFT(ACCT_NUM_MAX,16), BIN, BUS_ID, BUS_NM, ISO_CTRY_CD, REGN_CD, PROD_TYPE_CD, CARD_TYPE FROM CORE o
WHERE NOT EXISTS (SELECT * FROM CRF i
WHERE o.ACCT_NUM_MIN = i.ACCT_NUM_MIN
AND o.ACCT_NUM_MAX = i.ACCT_NUM_MAX)
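Not from the replies: when one big INSERT ... SELECT blows up (usually on log space or lock escalation), a hedged workaround on SQL Server 2000 is to feed it through in smaller batches so each transaction stays modest; the NOT EXISTS check from the original makes each pass skip whatever earlier passes already loaded:

SET ROWCOUNT 10000        -- load in chunks of 10,000 rows
WHILE 1 = 1
BEGIN
    INSERT INTO CRF (CORE_UID, ACCT_NUM_MIN, ACCT_NUM_MAX, BIN, BUS_ID, BUS_NM,
                     ISO_CTRY_CD, REGN_CD, PROD_TYPE_CD, CARD_TYPE)
    SELECT UID, LEFT(ACCT_NUM_MIN,16), LEFT(ACCT_NUM_MAX,16), BIN, BUS_ID, BUS_NM,
           ISO_CTRY_CD, REGN_CD, PROD_TYPE_CD, CARD_TYPE
    FROM CORE o
    WHERE NOT EXISTS (SELECT * FROM CRF i
                      WHERE o.ACCT_NUM_MIN = i.ACCT_NUM_MIN
                        AND o.ACCT_NUM_MAX = i.ACCT_NUM_MAX)
    -- note: duplicate (min, max) pairs inside CORE itself would still collide
    -- with the composite key, exactly as in the original single statement

    IF @@ROWCOUNT = 0 BREAK    -- nothing left that isn't already in CRF
END
SET ROWCOUNT 0                 -- reset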
View 20 Replies
View Related
Feb 9, 2007
To start with, I'll give a simplified overview of my data.

BaseRecord (4mil rows, 25k in each Region)
ID | Name | Region | etc

OtherData (7.5mil rows, 1 or 2 per ID)
ID | Type(1/2) | Data

ProblemTable (4mil rows)
ID | ConcatenatedHistory

The concatenated history field is an nvarchar with up to 20 different pipe-delimited date/code combinations, e.g. '01/01/2007X|11/28/2006Q|11/12/2004Q|'. Using left outer joins (all from base, the rest optional) I've got a view something like:

View (4mil rows)
ID | Name | Region | etc | Data | Data2 | ConcatenatedHistory

Querying it, it takes about 15-20 seconds to do this:

Select ID, Name, Region, etc, Data, Data2, ConcatenatedHistory
Quote:
View 5 Replies
View Related
Feb 2, 2007
if I have the following tables:
Table Customer_tbl
cust_id as int (PK), ...
Table Item_tbl
item_id as int (PK), ...
Table Selected_Items_tbl
selected_item_id as int (PK), cust_id as int (FK), item_id as int (FK), ...
-------
With the following query:
select cust_ID from selected_items_tbl WHERE item_id in (1, 2, n) GROUP BY cust_id, item_id HAVING cust_id in (select cust_id from selected_items_tbl where item_id = 1) AND cust_id in (select cust_id from selected_items_tbl where item_id = 2) AND cust_id in (select cust_id from selected_items_tbl where item_id = n)
-------
Each of these tables has other items included. Selected_Items_tbl holds zero to many of the items from the item_tbl for each customer. If I am searching for a customer who has item 1 AND item 2 AND item n, what would be the most efficient query for this? Currently, the above query is what I am working with. However, it seems that we should be able to do this type of search in a single query (without subqueries).
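A hedged note from outside the thread: the "has item 1 AND item 2 AND item n" requirement can usually be written without one subquery per item by counting how many of the wanted items each customer holds. A sketch against Selected_Items_tbl:

SELECT cust_id
FROM Selected_Items_tbl
WHERE item_id IN (1, 2, 3)            -- the n items being searched for
GROUP BY cust_id
HAVING COUNT(DISTINCT item_id) = 3    -- must equal the number of items in the IN list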
View 3 Replies
View Related
Sep 27, 2006
I'm trying to get a query to run which looks at completed orders that have not had another paid order in 180 days. The database I'm running it against is very large, so I can't get it to complete. Here's what I've got:
select Date =cast(cl1.cl_rundate as datetime(102)),count(cl1.cl_recno) as 'Completed Initials', cl1.cl_status as Status from dbo.vw_Completedorders cl1 where cl1.lob_lineofbusiness = 'aaa'
and cl1.cl_rundate > '20060801' and not exists (
select cl2.cl_company from dbo.vw_Paidorders cl2 where
cl2.lob_lineofbusiness = 'aa' and cl2.cl_company = cl1.cl_order and cl2.cl_rundate > '20060101' and datediff(day,cl2.cl_rundate,cl1.cl_rundate) < 180)
group by cl1.cl_status, cl1.cl_rundate
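An outside observation, hedged: the DATEDIFF has to be evaluated for every candidate row pair, while the same 180-day rule expressed as a plain range on cl2.cl_rundate gives the optimizer a chance to seek an index on the paid-orders date. A sketch of just the NOT EXISTS part, assuming cl_rundate holds whole dates (midnight times):

... and not exists (
    select cl2.cl_company
    from dbo.vw_Paidorders cl2
    where cl2.lob_lineofbusiness = 'aa'
      and cl2.cl_company = cl1.cl_order
      and cl2.cl_rundate > '20060101'
      and cl2.cl_rundate > dateadd(day, -180, cl1.cl_rundate)  -- same rows as datediff(day, cl2, cl1) < 180 for whole dates
)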
View 11 Replies
View Related
Nov 7, 2007
Hi
I have one table (tableDemo) with following structure:
TSID bigint (Primary Key)TskID bigint Sequence bigint Version bigint Frequency varchar(500) WOID bigint DateSchedule datetime TimeStandard real MeterEstimated real MeterLast real Description ntext CreatedBy bigint CreatedDate datetime ModifiedBy bigint ModifiedDate datetime Id uniqueidentifier
I have 8000 records in this table. When I fire a simple select command (i.e. select * from tableDemo) it takes more than 120 seconds.
Are there any techniques by which I can access all these records within 2-3 seconds?
Regards,
ND
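A hedged guess from outside the thread: with an ntext Description column, SELECT * drags every record's full text across the wire, which is usually what makes an 8,000-row "select everything" crawl. Listing only the columns the screen needs and fetching the ntext for one row at a time is the usual first step. A sketch with assumed column choices:

SELECT TSID, TskID, Sequence, Version, Frequency, WOID, DateSchedule
FROM tableDemo

DECLARE @TSID bigint
SET @TSID = 1    -- hypothetical key of the single row being opened for viewing/editing

SELECT Description
FROM tableDemo
WHERE TSID = @TSID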
View 2 Replies
View Related
Nov 4, 2000
Hi, what are the tools I can use to optimize a query or an index?
I know that if I am running a query on a table, I create an index on the fields I use in the WHERE clause. Is this the right thinking?
Someone asked me how I determine which columns should be indexed. I told him that the fields used in the WHERE clause should be indexed to speed up retrieving the data. Is this answer correct? If not, please advise the correct one.
Thanks
Ahmed
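A hedged pointer, not from the replies: the usual tools are the graphical execution plan in Query Analyzer, SET STATISTICS IO/TIME, and (on SQL Server 2000) the Index Tuning Wizard fed from a Profiler trace; together they show whether the WHERE-clause columns you indexed are actually used for seeks. A minimal sketch, with a made-up table and column:

SET STATISTICS IO ON      -- per-table logical/physical reads
SET STATISTICS TIME ON    -- parse/compile and execution CPU/elapsed time

-- hypothetical example query; substitute the real one
SELECT OrderID FROM dbo.Orders WHERE CustomerID = 42

SET STATISTICS IO OFF
SET STATISTICS TIME OFF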
View 1 Replies
View Related
Dec 17, 2005
We have SQL Server 2000, and liorder is an Oracle linked server. I'm trying to run the following query...
SELECT DISTINCT a.auf_nr AS OrderNo,
e.ku_name AS Customer,
d.bestell_dat AS OrdDate,
d.liefer_dat AS DelvDate,
CAST(SUM(b.anz) AS FLOAT) Qty,
CAST(SUM((CAST(c.breite AS FLOAT) / 1000 * CAST(c.hoehe AS FLOAT) / 1000) * b.anz) AS FLOAT) SQM,
CAST(SUM(a.liefer_offen) - (SUM(a.anz) - SUM(b.anz)) AS FLOAT) AvailDelv,
CAST(SUM(a.liefer_anz) AS FLOAT) Delvd,
CAST(SUM(c.sum_brutto*a.anz) AS FLOAT) Value
FROM liorder..LIORDER.AUF_STAT a,
liorder..LIORDER.AUF_LIP_STATUS b,
liorder..LIORDER.AUF_POS c,
liorder..LIORDER.AUF_KOPF d,
liorder..LIORDER.KUST_ADR e
WHERE a.auf_nr = b.auf_nr and
b.auf_nr = c.auf_nr and
c.auf_nr = d.auf_nr and
d.kunr = e.ku_nr and
a.auf_pos = b.auf_pos and
b.auf_pos = c.auf_pos and
b.lip_status = 7 and
c.ver_art !='V' and
a.history = 0 and
a.rg_stat != 2 and
e.ku_name IS not null and
e.ku_vk_ek = 0 and
d.bestell_dat BETWEEN '01/01/2005' and '12/17/2005'
GROUP BY a.auf_nr,
d.liefer_dat,
b.lip_status,
d.bestell_dat,
e.ku_name,
d.kopf_tour,
d.kopf_firma
HAVING CAST(SUM(a.liefer_offen)-(SUM(a.anz)-SUM(b.anz)) AS FLOAT) > 0
...and it takes around 2 minutes to show the results even if the date range covers a single day. I even tried to use an indexed column but I still get the same slow execution time. I even tried to create a UDF so that the WHERE clause would be resolved remotely on the Oracle DB, but it's still the same. Is there any way I can do this in a much more efficient and faster way?
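An outside, hedged note: with four-part names (liorder..LIORDER.xxx) SQL Server often pulls large chunks of the Oracle tables across the link and joins them locally; wrapping the whole statement in OPENQUERY sends it to Oracle as one unit so the joins, filters and grouping run remotely. Only the shape is sketched below with two of the tables; the real statement would carry all five tables, the aggregates, GROUP BY and HAVING inside the quoted text, rewritten in Oracle syntax:

SELECT *
FROM OPENQUERY(liorder, '
    SELECT a.auf_nr, d.bestell_dat
    FROM   LIORDER.AUF_STAT a, LIORDER.AUF_KOPF d
    WHERE  a.auf_nr = d.auf_nr
      AND  d.bestell_dat BETWEEN TO_DATE(''01/01/2005'', ''MM/DD/YYYY'')
                             AND TO_DATE(''12/17/2005'', ''MM/DD/YYYY'')
')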
View 1 Replies
View Related
Nov 4, 2007
Hi there,
I've got a SQL 2005 table(DataTable) with 5.5 million rows and a clustered index on an integer column. The column is a foreign key pointing to another table (LookupTable) that associates the integer with an nvarchar(50) column. Both integer and nvarchar(50) columns are in a clustered index.
I'm running equivalent select statements with slightly different where clauses, and am not seeing the performance similarities I'd expect.
select * from DataTable join LookupTable on DataTable.Integer = LookupTable.Integer
1. Setting the where clause to the nvarchar(50) value on the LookupTable takes 64 seconds.
2. Setting the where clause to Integer = (select integer from LookupTable where nvarchar(50) = 'mysearchstring') takes 61 seconds.
3. Setting the where clause to Integer = 526 takes 1 second.
Is this poor design in SQL, or is it a misconfiguration? I would have expected SQL to modify my query to essentially be #3 in all cases before touching the DataTable, which would cause all three versions to take 1 second.
Thanks,
Jason
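Not from the replies: one way to confirm that the nvarchar lookup is what derails the plan is to resolve it to the integer first and then filter DataTable with a plain equality, which is case 3 done by hand. The column names below are assumptions, since the post doesn't show them:

DECLARE @key int

SELECT @key = IntColumn                 -- assumed column name
FROM LookupTable
WHERE NameColumn = N'mysearchstring'    -- assumed column name

SELECT d.*
FROM DataTable d
WHERE d.IntColumn = @key                -- now a straight seek on DataTable's clustered key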
View 2 Replies
View Related
Sep 11, 2005
Hi everyone!
I am parsing a database table structured like the following for duplicates.
Code:
keyword  negative  exact  broad  phrase
Alpha0.120.36
Alpha0.120.37
Alpha0.120.37
Alpha0.220.37
Alpha0.120.37
Alpha0.120.37
Alpha0.120.37
Beta0.43212
Charlie0.240.31
Delta0.72212
Epsilon410.31
Foxtrot250.22
Grape420.215
Hotel 230.13
Indigo0.3721
Juliet21220.14
Kilroy0.3110.15
Lemon0.2201
Mario0.2502
Nugget0.130.72
Oprah2141
Polo01225
Polo01224
Polo01225
Q-Bert31442
Romeo12100.1
Salty2200.1
Tommy10.30.132.4
Uri10.30.132
Uri20.20.132
Uri20.30.122
Uri20.30.131
Vamprie0.120.1315
Wilco0.210.21
X-Ray00.220
Yeti020.20
Zebra1310
The duplicates that this thread relates to are the kind with duplicate "keyword" entries AND dissimilar field entries; i.e. :
Code:
keyword negative exact broad Phrase
Polo 0 122 4
Polo 0 122 5
I've come up with an SQL query that seems to return all of these duplicates (save one of each set, the 'real', unique entry). However, I think I made the query very inefficient. My SQL is very bad; this query will be running over tens of thousands of rows, so if it can be optimized at all I would greatly appreciate your help!
What I have so far is:
Code:
string query1 = "SELECT * FROM TableName" +
" WHERE EXISTS (SELECT NULL FROM TableName" + " b" +
" WHERE b.[keyword]= " + "TableName"+ ".[keyword]" +
" AND b.[negative]<> " + "TableName"+ ".[negative]" +
" ORb.[keyword]= " + "TableName"+ ".[keyword]" +
" ANDb.[exact]<> " + "TableName"+ ".[exact]" +
" ORb.[keyword] = " + "TableName"+ ".[keyword]" +
" ANDb.[broad]<> " + "TableName"+ ".[broad]" +
" ORb.[keyword]= " +"TableName"+ ".[keyword]" +
" ANDb.[phrase]<> "+"TableName"+ ".[phrase]" +
" GROUP BY b.[keyword], b.[broad], b.[exact]" +
" HAVING Count(b.[keyword]) BETWEEN 2 AND 50000)" ;
The algorithm seems to check every column of every row in order to determine a duplicate. Seems straightforward to me, but alas, slow...
Is there a better/faster way I can do this? Thanks for your help!
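A hedged sketch from outside the thread, written as plain T-SQL rather than a C# string: keywords that appear with more than one distinct combination of the four value columns can be found with two levels of grouping instead of the OR'd self-join. Note it returns every row of a conflicted keyword; keeping one 'real' row per keyword would still need a tie-break rule the post doesn't define:

SELECT t.*
FROM TableName t
JOIN (SELECT [keyword]
      FROM (SELECT DISTINCT [keyword], [negative], [exact], [broad], [phrase]
            FROM TableName) d
      GROUP BY [keyword]
      HAVING COUNT(*) > 1) dup          -- keywords with more than one distinct value combination
  ON dup.[keyword] = t.[keyword]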
View 1 Replies
View Related
Jan 6, 2004
Hello everyone!
I've got a problem with a real slow query, I would be very happy if somebody has any idea to improve the speed of it...
The idea is to get the top 2 products that a customer hasn't bought which are in his area of interest...
query (simplificated)
-------------------------------------------------
SELECT TOP 2 prodID, Title, Price FROM bestSold7Days WHERE
prodID NOT IN (SELECT prodID FROM orders INNER JOIN orderProducts ON orders.orderID = orderProducts.orderID WHERE (orders.custID=394))
AND
(prodType = COALESCE((SELECT TOP 1 products.prodID FROM orders INNER JOIN orderProducts ON orders.orderID = orderProducts.orderID INNER JOIN products ON orderProducts.prodID = products.prodID WHERE (orders.custID=394) GROUP BY products.prodType ORDER BY SUM(orderProducts.PCS) DESC), 2))
-------------------------------------------------
end query
(COALESCE is for replacing the value if the customer hasn't ordered anything, or hasn't ordered anything of this type)...
Thanks for any time spent!
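Not from the replies, just a hedged restructuring: both subqueries depend only on the customer, so the preferred product type can be worked out once into a variable and the main SELECT kept small. The inner SELECT is assumed to have meant prodType rather than prodID, since its result is compared to prodType:

DECLARE @custID int, @prodType int
SET @custID = 394

SELECT TOP 1 @prodType = products.prodType        -- assumption: prodType, not prodID, was intended
FROM orders
INNER JOIN orderProducts ON orders.orderID = orderProducts.orderID
INNER JOIN products ON orderProducts.prodID = products.prodID
WHERE orders.custID = @custID
GROUP BY products.prodType
ORDER BY SUM(orderProducts.PCS) DESC

SELECT TOP 2 prodID, Title, Price
FROM bestSold7Days b
WHERE prodType = COALESCE(@prodType, 2)           -- 2 = the fallback type, as in the original
  AND NOT EXISTS (SELECT * FROM orders o
                  INNER JOIN orderProducts op ON o.orderID = op.orderID
                  WHERE o.custID = @custID
                    AND op.prodID = b.prodID)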
View 14 Replies
View Related
Jul 20, 2005
I have a DELETE statement that deletes duplicate data from a table. It takes a long time to execute, so I thought I'd seek advice here. The structure of the table is a little funny. The following is NOT the table, but a representation of the data in the table:

+-----------+
|  a  |  b  |
+-----+-----+
| 123 | 234 |
| 345 | 456 |
| 123 | 123 |
+-----+-----+

As you can see, the data is tabular. This is how it is stored in the table:

+-----+-----------+------------+
| Row | FieldName | FieldValue |
+-----+-----------+------------+
|  1  |     a     |    123     |
|  1  |     b     |    234     |
|  2  |     a     |    345     |
|  2  |     b     |    456     |
|  3  |     a     |    123     |
|  3  |     b     |    234     |
+-----+-----------+------------+

What I need is to delete all records having the same "Row" when there exists the same set of records with a different (smaller, to be precise) "Row". Using the example above, what I need to get is:

+-----+-----------+------------+
| Row | FieldName | FieldValue |
+-----+-----------+------------+
|  1  |     a     |    123     |
|  1  |     b     |    234     |
|  2  |     a     |    345     |
|  2  |     b     |    456     |
+-----+-----------+------------+

A slow way of doing this seems to be:

DELETE FROM X
WHERE Row IN
  (SELECT DISTINCT Row FROM X x1
   WHERE EXISTS
     (SELECT * FROM X x2
      WHERE x2.Row < x1.Row
        AND NOT EXISTS
          (SELECT * FROM X x3
           WHERE x3.Row = x2.Row
             AND x3.FieldName = x2.FieldName
             AND x3.FieldValue <> x1.FieldValue)))

Can this be done faster, better, and cheaper?
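Not from the replies, just a hedged alternative shape, assuming each Row stores at most one value per FieldName: compare candidate row pairs by their pair counts instead of the triple-nested NOT EXISTS, deleting a Row when some smaller Row has the same number of fields and every (FieldName, FieldValue) pair matches:

DELETE FROM X
WHERE Row IN
  (SELECT hi.Row
   FROM (SELECT Row, COUNT(*) AS cnt FROM X GROUP BY Row) hi
   JOIN (SELECT Row, COUNT(*) AS cnt FROM X GROUP BY Row) lo
     ON lo.Row < hi.Row
    AND lo.cnt = hi.cnt                          -- same number of fields
   WHERE hi.cnt = (SELECT COUNT(*)               -- and every (FieldName, FieldValue) pair matches
                   FROM X a
                   JOIN X b ON a.FieldName = b.FieldName
                           AND a.FieldValue = b.FieldValue
                   WHERE a.Row = hi.Row
                     AND b.Row = lo.Row))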
View 3 Replies
View Related
Sep 4, 2001
I have indexed my SQL Server tables to gain some speed on calling up tables and queries (using VB and ADO). It is still very slow... Is there a move I have to make once my tables are indexed, or are there any tricks to improve the speed? I am getting kind of desperate right now :(
View 1 Replies
View Related
Jul 15, 2000
Hi all,
I have a question in regards to optimistic locking:
I have a database conversion that will be running on a SQL 7.0 system. The process needs to be completed ASAP and, to this end, I have tried to set up all aspects of the server to be geared towards speed rather than redundancy for the duration of the process (i.e. moving heavily used tables to separate filegroups on a RAID 0 set, dedicating a separate disk to the database log). I was now looking at trying to tweak locking behaviour to enhance performance (for the duration of the conversion no other user will be connecting to the database - the only initiator of data changes will be the conversion application, which feeds statements serially to the server). As far as I know, changing lock settings is something that would be initiated by the application itself, but is there any property I can set on the server to further enhance performance in this area?
Thanks
A
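A hedged aside, not from the thread: on SQL Server 7.0 lock behaviour is largely driven by the statements the application issues, but for a single-user conversion run two database options are commonly switched on to trade recoverability for speed, which matches the "speed rather than redundancy" goal here:

EXEC sp_dboption 'ConversionDB', 'select into/bulkcopy', 'true'   -- allow minimally logged bulk operations
EXEC sp_dboption 'ConversionDB', 'trunc. log on chkpt.', 'true'   -- keep the log small during the run
-- 'ConversionDB' is a placeholder database name; both options would be set back
-- to false (and a full backup taken) once the conversion completes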
View 1 Replies
View Related
Jun 5, 2002
We are evaluating a tool by Lechotech that can optimize sql statements. It is a pretty good tool, but we would like to compare it against some others. Has anyone seen any other such tools?
View 1 Replies
View Related
Jan 14, 2005
I'm no SQL wiz, I just know the basics to get me by... What I'm trying to do is: every time a record is inserted into an online orders table, that record needs to be inserted into another table in another database, but with added information.
This is the Trigger I came up with:
CREATE TRIGGER OtherDatabaseInsertTrigger
ON dbo.t_order
FOR INSERT AS
DECLARE @CLIENT VARCHAR(30)
DECLARE @OrderNumberID INT
SET @CLIENT = 'DevShed'
INSERT INTO test2.dbo.t_order (order_num,customer_num,
order_date,drop_date,package_id,us_customer,us_system,non_us_customer,
non_us_system,postage_selection,num_pages,num_sides,permit_number,
permit_city,permit_state,permit_zip,bre_company,bre_name_dept,bre_address1,
bre_address2,bre_city,bre_state,bre_zip,brc_company,brc_name_dept,brc_address1,
brc_address2,brc_city,brc_state,brc_zip,rae_company,rae_name_dept,rae_address1,
rae_address2,rae_city,rae_state,rae_zip,return_company,return_name_dept,
return_address1,return_address2,return_city,return_state,return_zip,salutation_other,
printcost,stock,production,personalization,postage_total,listcost,listcharged,
tax_total,order_total,discount,coupon,milestone_1,milestone_2,milestone_3,milestone_4,
milestone_5,milestone_6,has_logo,has_list,has_rad,tag,nonprofit_fee,order_problem,
experian_only,experian_orderid,experian_price,experian_returncode,experian_returnmessage,
experian_queryid,experian_amt_to_purchase,bp_bankname,bp_address1,bp_address2,bp_city,
bp_state,bp_zip,bp_country,bp_phone,bp_signername,bp_signertitle,bp_signeremail,logo_option,
logo_text,signature_option,list_option,layout_id,cd_only,cd_oi,cd_rc,cd_ot,cd_email,
shipping_id,shipping_charge,pfp_pnref,pfp_result,pfp_respmsg,pfp_authcode,pfp_avsaddr,
pfp_avszip,creditcard_num,ccexp_month,ccexp_year,cc_name_on_card,cust_postage)
SELECT order_num,customer_num,
order_date,drop_date,package_id,us_customer,us_system,non_us_customer,
non_us_system,postage_selection,num_pages,num_sides,permit_number,
permit_city,permit_state,permit_zip,bre_company,bre_name_dept,bre_address1,
bre_address2,bre_city,bre_state,bre_zip,brc_company,brc_name_dept,brc_address1,
brc_address2,brc_city,brc_state,brc_zip,rae_company,rae_name_dept,rae_address1,
rae_address2,rae_city,rae_state,rae_zip,return_company,return_name_dept,
return_address1,return_address2,return_city,return_state,return_zip,salutation_other,
printcost,stock,production,personalization,postage_total,listcost,listcharged,
tax_total,order_total,discount,coupon,milestone_1,milestone_2,milestone_3,milestone_4,
milestone_5,milestone_6,has_logo,has_list,has_rad,tag,nonprofit_fee,order_problem,
experian_only,experian_orderid,experian_price,experian_returncode,experian_returnmessage,
experian_queryid,experian_amt_to_purchase,bp_bankname,bp_address1,bp_address2,bp_city,
bp_state,bp_zip,bp_country,bp_phone,bp_signername,bp_signertitle,bp_signeremail,logo_option,
logo_text,signature_option,list_option,layout_id,cd_only,cd_oi,cd_rc,cd_ot,cd_email,
shipping_id,shipping_charge,pfp_pnref,pfp_result,pfp_respmsg,pfp_authcode,pfp_avsaddr,
pfp_avszip,creditcard_num,ccexp_month,ccexp_year,cc_name_on_card,cust_postage
FROM inserted;
SET @OrderNumberID = (SELECT @@IDENTITY)
UPDATE test2.dbo.t_order SET client = @CLIENT WHERE oid = @OrderNumberID;
I don't know if it's possible to do an INSERT INTO ... SELECT with additional fields in the 2nd table; I was trying, but failed. I had to resort to the bottom piece of SQL to get the ID and run a separate query to add the additional items to the new record in table 2.
Any SQL masters out there who can help me make this better, or who know of some other way to do this?
Thanks in advance!
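A hedged note from outside the thread: an INSERT ... SELECT can carry extra constant values alongside the columns coming from inserted, which also removes the @@IDENTITY/UPDATE step (that step only handles one row, while a trigger's inserted table can contain many). Abbreviated to a few columns:

INSERT INTO test2.dbo.t_order (client, order_num, customer_num, order_date)
SELECT 'DevShed', order_num, customer_num, order_date
FROM inserted;
-- the full column list from the trigger above would follow the same pattern,
-- with the constant (or @CLIENT) simply added as one more item in both lists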
View 6 Replies
View Related
Jul 7, 2004
I've tried a bunch of different ways in an effort to stay away from using a cursor, but I haven't been able to accomplish what I need to do without one. So, I coded this process using cursors and performance (as expected) is pretty mediocre. I was wondering if someone could take a quick look and suggest a different approach or maybe suggest ways to optimize the current code.
*Attached* is my code.
TIA
View 4 Replies
View Related
Oct 24, 2005
Here's a sample SELECT statement:
SELECT T1.F3
FROM T1 INNER JOIN T2 ON T1.F4 = T2.F4
WHERE
(T1.F1 > @iNum AND T2.F1 > @iNum)
OR
( @iNum2 * (T1.F1 - T2.F1)/(T1.F2 - T2.F2) ) + (T1.F1 - ((T1.F1 - T2.F1)/(T1.F2 - T2.F2) * T1.F2) ) > @INum
As you can see, the second part of the WHERE (after the OR) is much more complicated than the part before the OR. My query would run a lot faster if it tried the first part of the OR and didn't bother with the second part if the first part was satisfied. Is there any way to do this?
Thanks!
Tyler
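A hedged workaround from outside the thread: T-SQL doesn't guarantee short-circuiting across OR, but splitting the statement lets the cheap predicate run on its own and confines the expensive expression to the rows that failed it. UNION ALL is safe here because the two branches exclude each other (assuming F1 and F2 are never NULL, since NOT(...) treats unknowns differently than OR does):

SELECT T1.F3
FROM T1 INNER JOIN T2 ON T1.F4 = T2.F4
WHERE T1.F1 > @iNum AND T2.F1 > @iNum

UNION ALL

SELECT T1.F3
FROM T1 INNER JOIN T2 ON T1.F4 = T2.F4
WHERE NOT (T1.F1 > @iNum AND T2.F1 > @iNum)   -- only the rows the cheap test rejected
  AND ( @iNum2 * (T1.F1 - T2.F1)/(T1.F2 - T2.F2) )
      + ( T1.F1 - ((T1.F1 - T2.F1)/(T1.F2 - T2.F2) * T1.F2) ) > @iNum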
View 3 Replies
View Related
May 4, 2006
Kudos to y'all!!! I have this SQL script...
SELECT * FROM OPENQUERY (liorder, '
SELECT DISTINCT
a.AUF_NR AS OrdNo,
e.KU_NAME AS Customer,
a.AUF_POS AS Pos,
f.PC_PANE_NO AS Pane,
f.PC_SGGL_SEQ AS Component,
f.PC_SGGL_COD AS GlassCode,
d.GL_BEZ AS GlassDesc,
a.ANZ AS Qty,
((c.BREITE/1000*c.HOEHE/1000)*a.ANZ) AS SQM,
(a.ANZ*c.SUM_BRUTTO) AS Val,
(CASE
WHEN(SELECT SUM(h.KF_FERT_QTY)
FROM LIPROD.KAPA_AUS_FERT h
WHERE a.AUF_NR = h.KF_ORDER_NO AND
a.AUF_POS = h.KF_ORDER_POS AND
f.PC_PANE_NO = h.KF_SCHEIB_NR AND
f.PC_SGGL_SEQ = CASE
WHEN h.KF_SEQ_NR = 0 THEN 1
ELSE h.KF_SEQ_NR
END AND
h.KF_SCHR_NR IN (2, 402, 502, 602)) IS NULL THEN 0
ELSE(SELECT SUM(h.KF_FERT_QTY)
FROM LIPROD.KAPA_AUS_FERT h
WHERE a.AUF_NR = h.KF_ORDER_NO AND
a.AUF_POS = h.KF_ORDER_POS AND
f.PC_PANE_NO = h.KF_SCHEIB_NR AND
f.PC_SGGL_SEQ = CASE
WHEN h.KF_SEQ_NR = 0 THEN 1
ELSE h.KF_SEQ_NR
END AND
h.KF_SCHR_NR IN (2, 402, 502, 602))
END) AS Done
FROM LIORDER.AUF_STAT a,
LIORDER.AUF_KOPF b,
LIORDER.AUF_POS c,
LIORDER.GLAS_DATEN d,
LIORDER.KUST_ADR e,
LIPROD.AUF_POS_COMP f
WHERE EXISTS
(SELECT g.AUF_NR
FROM LIORDER.AUF_STAT g
WHERE g.AUF_NR = a.AUF_NR AND
g.RG_OFFEN != 0) AND
EXISTS
(SELECT i.KF_ORDER_NO
FROM LIPROD.KAPA_AUS_FERT i
WHERE a.AUF_NR = i.KF_ORDER_NO AND
i.KF_SCHR_NR IN (2, 402, 502, 602)) AND
a.AUF_NR = b.AUF_NR AND
b.AUF_NR = c.AUF_NR AND
c.AUF_NR = f.PC_ORDER_NO AND
a.AUF_POS = c.AUF_POS AND
c.AUF_POS = f.PC_ORDER_POS AND
b.KUNR = e.KU_NR AND
f.PC_SGGL_COD = d.IDNR AND
a.HISTORY = 0 AND
b.AUF_OFF = 0 AND
c.VER_ART != ''V'' AND
e.KU_VK_EK = 0 AND
e.KU_NAME IS NOT NULL
ORDER BY a.AUF_NR DESC,
a.AUF_POS ASC')
...It is retrieving data from an Oracle linked server, but the execution time is so friggin' long! I tried running it and after around 30 minutes it still hadn't shown any results, so I couldn't even tell the exact time it would take to return them. Do you have any tips regarding query optimization? Thanks in advance.
View 9 Replies
View Related
Jan 3, 2007
This code takes far too long to run. Any suggestions?
create table #minatest(
DatabaseName varchar(100),
pagecount varchar(100),
vfillfactor varchar(100),
TableName varchar(200),
IndexName varchar(100),
FragmentPercentage int,
newfragmentvalue int,
index_type_desc varchar(100),
index_level int
)
-- minatest table will contain indexes with fragmentation above 10% which need to be defragged
-- this will go through all databases
-- null indexes will not be affected
exec sp_msforeachdb'
use ?
INSERT INTO #minatest
SELECT
db_name(database_id),
phystat.page_count,
i.fill_factor,
OBJECT_NAME(i.object_id),
i.name,
phystat.avg_fragmentation_in_percent,
newfragmentvalue = 0,
index_type_desc,
index_level
FROM sys.dm_db_index_physical_stats(NULL, NULL, NULL, NULL, ''DETAILED'') phystat
JOIN sys.indexes i ON i.object_id = phystat.object_id AND i.index_id = phystat.index_id
WHERE phystat.avg_fragmentation_in_percent > 10 AND phystat.page_count < 10000
'
DECLARE @Counts int,
@i int,
@DatabaseName varchar(100),
@pagecount varchar(100),
@vfillfactor varchar(100),
@TableName varchar(200),
@IndexName varchar(100),
@sql nvarchar(4000),
@nfsql nvarchar(4000),
@FragmentPercentage int,
@params nvarchar(4000),
@index_type_desc varchar(100),
@index_level int,
@vnewfrag int
SELECT @params = N'@cnt int OUTPUT'
select @Counts = count(Databasename)
from #minatest -- sets the maximum number of rows to loop through
declare targets cursor -- declare cursor with values to search through
for
select * from #minatest
open targets -- open cursor
fetch next from targets into @DatabaseName,@pagecount,@vfillfactor,@TableName,@IndexName,@FragmentPercentage,@vnewfrag,@index_type_desc,@index_level-- take rows from table
select @i=0
while @@fetch_status=0 and @i<=@Counts-- set loop condition
begin
select @sql = 'USE '+@DatabaseName+'; '+
' ALTER INDEX '+@IndexName+' ON '+ @TableName+
' REBUILD with (ONLINE=ON,SORT_IN_TEMPDB=ON,STATISTICS_NORECOMPUTE=OFF);'
exec sp_executesql @sql
select @nfsql = 'select @cnt = avg_fragmentation_in_percent FROM sys.dm_db_index_physical_stats(NULL,NULL, NULL, NULL, ''DETAILED'') phystat JOIN '+@DatabaseName+'.sys.indexes i ON i.object_id = phystat.object_id AND i.index_id = phystat.index_id WHERE i.name='''+@IndexName+''' and index_type_desc='''+@index_type_desc+''' and index_level='''+CAST(@index_level as varchar(20))+''''
exec sp_executesql @nfsql ,@params, @cnt=@vnewfrag OUTPUT
update #minatest set newfragmentvalue = @vnewfrag where IndexName = @IndexName and TableName = @TableName
select @i=@i+1
fetch next from targets into @DatabaseName,@pagecount,@vfillfactor,@TableName,@IndexName,@FragmentPercentage,@vnewfrag,@index_type_desc,@index_level-- take next field of table
end
close targets
DEALLOCATE targets
ALTER TABLE #minatest DROP COLUMN index_type_desc,index_level
select * from #minatest -- displays which indexes where defraged and their new frag value
drop table #minatest
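Two hedged observations from outside the thread: sys.dm_db_index_physical_stats(NULL, ...) scans every database each time sp_msforeachdb calls it, and 'DETAILED' reads every page of every index; scoping it to the current database with DB_ID() and using 'LIMITED' (fragmentation and page counts are still reported) usually shortens the collection step considerably. The INSERT inside the loop would then look roughly like this:

INSERT INTO #minatest
SELECT db_name(database_id),
       phystat.page_count,
       i.fill_factor,
       OBJECT_NAME(i.object_id),
       i.name,
       phystat.avg_fragmentation_in_percent,
       newfragmentvalue = 0,
       index_type_desc,
       index_level
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') phystat   -- current DB only, leaf level only
JOIN sys.indexes i ON i.object_id = phystat.object_id AND i.index_id = phystat.index_id
WHERE phystat.avg_fragmentation_in_percent > 10 AND phystat.page_count < 10000
-- inside the sp_msforeachdb string the quotes around LIMITED stay doubled (''LIMITED''), as in the original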
View 5 Replies
View Related
Mar 1, 2006
I have two tables. One has approx 90,000 rows with a field... let's call it BigInt (and it is defined as a bigint data type).

I have a reference table, with approx 10,000,000 rows. In this reference table, I have starting_bigint and ending_bigint fields. I want to pull out all of the reference data from the reference table for all 90,000 rows in the transaction table where the BigInt from the transaction table is between the starting_bigint and ending_bigint in the reference table.

I have the join working now, but it is not as optimized as I would like. It appears no matter what I do, the query does a full table scan on the 10,000,000 rows in the reference table.

Sample code:

SELECT ref.*, tran.bigint
FROM transactiontable tran
INNER JOIN referencetable ref ON tran.bigint BETWEEN ref.starting_bigint AND ref.ending_bigint

Yes, all 3 of the fields are indexed. I even have a composite index on the reference table with the starting_bigint and ending_bigint fields selected as the composite. Any help would be appreciated.

Robert H. Kershberg
IT Director
Tax Credit Company
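A hedged idea, not from the thread, and only valid if the reference ranges never overlap: rather than joining on BETWEEN (two open-ended inequalities the optimizer tends to scan for), seek the one candidate row whose starting_bigint is the largest value not exceeding the transaction's BigInt, then confirm its ending_bigint:

SELECT ref.*, t.[bigint]
FROM transactiontable t
JOIN referencetable ref
  ON ref.starting_bigint = (SELECT MAX(r2.starting_bigint)       -- single index seek per transaction row
                            FROM referencetable r2
                            WHERE r2.starting_bigint <= t.[bigint])
 AND t.[bigint] <= ref.ending_bigint                              -- verify the value really falls inside that range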
View 5 Replies
View Related
Mar 29, 2006
Hello all,

I have a table with thousands of rows and it is in this format:

id   col1   col2   col3   col4
---  -----  -----  -----  -----
1    nm     78     xyz    pir
2    bn     45     abc    dir

I now want to get the data from this table in this format:

field   val
-----------
col1    nm
col1    bn
col2    78
col2    45
col3    xyz
col3    abc
col4    pir
col4    dir

In order to do this I am doing a union:

select * into #tempUpdate
(select 'col1' as field, col1 as val from table1
union
select 'col2' as field, col2 as val from table1
union
select 'col3' as field, col3 as val from table1)

The above example query is smaller - I have a much bigger table with about 80 columns (imagine the size of my union query :)) and this takes a lot of time to execute. Can someone please suggest a better way to do this? The results of this union query are selected into a temp table, which I then use to update another table. I am using SQL Server 2000. My main concern is performance. Any ideas please?

Thanks
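A hedged suggestion from outside the thread: on SQL Server 2000 the shape pretty much has to stay a union of one branch per column, but UNION ALL avoids the duplicate-removal sort that plain UNION performs over the whole result, which is usually the dominant cost with 80 branches:

SELECT field, val
INTO #tempUpdate
FROM (SELECT 'col1' AS field, col1 AS val FROM table1
      UNION ALL
      SELECT 'col2', col2 FROM table1
      UNION ALL
      SELECT 'col3', col3 FROM table1
      -- ... one branch per column, up to col80; CAST each val to varchar if the column types differ
     ) unpivoted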
View 6 Replies
View Related
Jul 20, 2005
I have an application that allows user input and translates it by stripping out the HTML tags and also doing some code translations. The user is able to later edit their input. However, it's unfeasible to reverse-translate it back, as the logic would be too complicated and there are instances where it won't be possible.

So, what I'm thinking of doing to speed up performance is to duplicate the user data: one copy for the native data and the other for the translated data. When the user edits their input, the native data is shown. When the application is showing the data in a page, the translated data is shown.

My question is: would it make a performance difference if I store the native data and the translated data in the same table, or would it be better to store the cached data in another table?
View 1 Replies
View Related
Apr 24, 2007
I have combined three reports into one big report. I would like to someway cache the big report, and then create little reports from the cached report. What would be the best way to go about doing this?
View 3 Replies
View Related
Jul 9, 2006
Hi,
We have 3 tables in SQL Server for similar information about 3 different countries. Sometimes I select from all countries, so I need to use all the tables, and sometimes just one country, so I select from one table.
I want to know whether it is good to combine these three tables into one and add a field for the country name.
Which way is better, my current way or this new one?
Please let me know the advantages and disadvantages of each of these approaches.
Thanks
View 4 Replies
View Related
Feb 2, 2006
I'm having problems optimizing a sql select statement that uses a LIKE statement coupled with an OR clause. For simplicity sake, I'll demonstrate this with a scaled down example:
table Company, fields CompanyID, CompanyName
table Address, fields AddressID, AddressName
table CompanyAddressAssoc, fields AssocID, CompanyID, AddressID
CompanyAddressAssoc is the many-to-many associative table for Company and Address. A search query is required that, given a search string ( i.e. 'TEST' ), return all Company -> Address records where either the CompanyName or AddressName starts with the parameter:
Select c.CompanyID, c.CompanyName, a.AddressName
FROM Company c
LEFT OUTER JOIN CompanyAddressAssoc caa ON caa.CompanyID = c.CompanyID
LEFT OUTER JOIN Address a ON a.AddressID = caa.AddressID
WHERE ((c.CompanyName LIKE 'TEST%') OR (a.AddressName LIKE 'TEST%'))
There are proper indexes on all tables. The execution plan creates a hash table on one LIKE query, then meshes in the other LIKE query. This takes a very long time to do, given a dataset of 500,000+ records in Company and Address.
Is there any way to optimize this query, or is it a problem with the base table implementation?
Any advice would be appreciated.
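A hedged rewrite from outside the thread: an OR that spans columns from two different tables usually defeats index use on both sides; running each LIKE as its own query and stitching the results with UNION lets each branch drive off its own index (UNION also removes the rows both branches return):

SELECT c.CompanyID, c.CompanyName, a.AddressName
FROM Company c
LEFT OUTER JOIN CompanyAddressAssoc caa ON caa.CompanyID = c.CompanyID
LEFT OUTER JOIN Address a ON a.AddressID = caa.AddressID
WHERE c.CompanyName LIKE 'TEST%'

UNION

SELECT c.CompanyID, c.CompanyName, a.AddressName
FROM Company c
JOIN CompanyAddressAssoc caa ON caa.CompanyID = c.CompanyID
JOIN Address a ON a.AddressID = caa.AddressID
WHERE a.AddressName LIKE 'TEST%'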
View 6 Replies
View Related
Feb 16, 2007
Hi, I'm trying to optimize an ASP.NET 2.0 app where, with a load of 100 users, the CPU usage on the web server is a reasonable <20%, but the db server (SQL Server 2005) often goes up to 100% and never below 20%.

When viewing the Performance Monitor counter SQL Server: General Statistics / User Connections, it shows that there are 50-65 connections. When the number of connections goes up, the db server's CPU usage also goes up, along with web page response times.

When viewing the SQL Server Activity Monitor, it always says, for all connections, that the Login Time and the Last Batch are the same. This has got me wondering whether no connections are pooled and all connections are always closed and reopened, which would explain why the db server is so jammed.

Could someone give me some pointers as to what to check to find optimization points?

Regards,
Mathias
View 2 Replies
View Related
Feb 4, 2008
Hi all,
I have this stored procedure which gets records from the SQL Server database which are then displayed on a web page. The second part of the stored procedure has the exact same SELECT statement as the first part, but it just returns the total duration. I have noticed that when the number of records is very large, the CPU time goes up to 100% for a second (while it's fetching data) and then goes back to normal.
Please guide me to what optimizations can be done to this procedure. CREATE Procedure GetCDR_LikeTollFree
(
@theDuration int = null, --to find duration more than theDuration
@totalDuration bigint = null, --to find the total duration for the result set
@callerName varchar(255) = null,
@callingPartyNumber varchar(255) = null,
@originalCalledPartyNumber varchar(255) = null,
@finalCalledPartyNumber varchar(255) = null,
@dateTimeConnect datetime = null,
@dateTimeDisconnect datetime = null,
@clientMatterCode varchar(255) = null
)
AS
BEGIN
SELECT callingPartyNumber, AlertingName, originalCalledPartyNumber, finalCalledPartyNumber,
dateTimeConnect,
dateTimeDisconnect,
CONVERT(char(8), DATEADD(second, duration, '0:00:00'), 108) AS duration,
clientMatterCode
FROM CDR1.dbo.CallDetailRecord t1
LEFT JOIN CDR2.dbo.NumPlan t2 ON t1.callingPartyNumber=t2.DNorPattern
WHERE
(t1.callingPartyNumber LIKE ISNULL(@callingPartyNumber, t1.callingPartyNumber) + '%') AND
(t1.originalCalledPartyNumber LIKE ISNULL(@originalCalledPartyNumber, t1.originalCalledPartyNumber) + '%') AND
(t1.finalCalledPartyNumber LIKE ISNULL(@finalCalledPartyNumber, t1.finalCalledPartyNumber) + '%') AND
(t1.clientMatterCode LIKE ISNULL(@clientMatterCode, t1.clientMatterCode) + '%') AND
(@callerName is NULL OR t2.AlertingName LIKE '%' + @callerName + '%') AND
(t1.duration >= @theDuration) AND
((t1.datetimeConnect) >= ISNULL(convert(bigint,
datediff(ss, '01-01-1970 00:00:00', @dateTimeConnect)), t1.datetimeConnect)) AND
((t1.dateTimeDisconnect) <= ISNULL(convert(bigint,
datediff(ss, '01-01-1970 00:00:00', @dateTimeDisconnect)), t1.dateTimeDisconnect))
SET @totalDuration = (SELECT SUM(duration)
FROM CDR1.dbo.CallDetailRecord t1
LEFT JOIN CDR2.dbo.NumPlan t2 ON t1.callingPartyNumber=t2.DNorPattern
WHERE
(callingPartyNumber LIKE ISNULL(@callingPartyNumber, callingPartyNumber) + '%') AND
(originalCalledPartyNumber LIKE ISNULL(@originalCalledPartyNumber, originalCalledPartyNumber) + '%') AND
(finalCalledPartyNumber LIKE ISNULL(@finalCalledPartyNumber, finalCalledPartyNumber) + '%') AND
(clientMatterCode LIKE ISNULL(@clientMatterCode, clientMatterCode) + '%') AND
(@callerName is NULL OR t2.AlertingName LIKE '%' + @callerName + '%') AND
(duration >= @theDuration) AND
((datetimeConnect) >= ISNULL(convert(bigint,
datediff(ss, '01-01-1970 00:00:00', @dateTimeConnect)), datetimeConnect)) AND
((dateTimeDisconnect) <= ISNULL(convert(bigint,
datediff(ss, '01-01-1970 00:00:00', @dateTimeDisconnect)), dateTimeDisconnect))
)
RETURN @totalDuration
END
GO
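An outside, hedged reshaping: both statements repeat the same joins and filters, so the matching detail rows could be captured once into a temp table and then both the result set and the total read from it, roughly halving the work at the cost of some tempdb space. Only the shape is sketched; the full WHERE clause from the procedure would go where the comment sits:

SELECT t1.callingPartyNumber, t2.AlertingName, t1.originalCalledPartyNumber, t1.finalCalledPartyNumber,
       t1.dateTimeConnect, t1.dateTimeDisconnect, t1.duration, t1.clientMatterCode
INTO #cdr
FROM CDR1.dbo.CallDetailRecord t1
LEFT JOIN CDR2.dbo.NumPlan t2 ON t1.callingPartyNumber = t2.DNorPattern
WHERE t1.duration >= @theDuration      -- plus the same LIKE / ISNULL predicates as in the procedure above

SELECT callingPartyNumber, AlertingName, originalCalledPartyNumber, finalCalledPartyNumber,
       dateTimeConnect, dateTimeDisconnect,
       CONVERT(char(8), DATEADD(second, duration, '0:00:00'), 108) AS duration,
       clientMatterCode
FROM #cdr

SELECT @totalDuration = SUM(duration) FROM #cdr

DROP TABLE #cdr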
View 13 Replies
View Related
May 17, 2006
I have the following stored proc and was wondering if I can optimize it by using a SELECT CASE instead of all those IFs? I tried but don't know how to write it.
Thanks set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER PROCEDURE [dbo].[Get_Cl_SearchMultiColumn]
(
@strSearchTermColumnName nvarchar (50),
@strSearchTerm nvarchar (200)
)
as
if (@strSearchTermColumnName = 'Monitor')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Monitor1,Monitor2
FROM Cl_Systems
WHERE contains(Monitor1,@strSearchTerm) or contains(Monitor2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'MonitorSerial')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Monitor1Serial,Monitor2Serial
FROM Cl_Systems
WHERE contains(Monitor1Serial,@strSearchTerm) or contains(Monitor2Serial,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Microscope')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Microscope1,Microscope2
FROM Cl_Systems
WHERE contains(Microscope1,@strSearchTerm) or contains(Microscope2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'SerialMicroscope')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,SerialMicroscope1,SerialMicroscope2
FROM Cl_Systems
WHERE contains(SerialMicroscope1,@strSearchTerm) or contains(SerialMicroscope2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Controller')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Controller1,Controller2
FROM Cl_Systems
WHERE contains(Controller1,@strSearchTerm) or contains(Controller2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'ControllerFirmware')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Cont1Firmware,Cont2Firmware
FROM Cl_Systems
WHERE contains(Cont1Firmware,@strSearchTerm) or contains(Cont2Firmware,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'SerialController')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,SerialController1,SerialController2
FROM Cl_Systems
WHERE contains(SerialController1,@strSearchTerm) or contains(SerialController2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Joystick')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Joystick1,Joystick2
FROM Cl_Systems
WHERE contains(Joystick1,@strSearchTerm) or contains(Joystick2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'JoystickFirmware')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Joy1Firmware,Joy2Firmware
FROM Cl_Systems
WHERE contains(Joy1Firmware,@strSearchTerm) or contains(Joy2Firmware,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'SerialJoystick')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,SerialJoystick1,SerialJoystick2
FROM Cl_Systems
WHERE contains(SerialJoystick1,@strSearchTerm) or contains(SerialJoystick2,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Camera')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Camera1,Camera2,Camera3,Camera4
FROM Cl_Systems
WHERE contains(Camera1,@strSearchTerm) or contains(Camera2,@strSearchTerm) or contains(Camera3,@strSearchTerm) or contains(Camera4,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'CameraSerial')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Camera1Serial,Camera2Serial,Camera3Serial,Camera4Serial
FROM Cl_Systems
WHERE contains(Camera1Serial,@strSearchTerm) or contains(Camera2Serial,@strSearchTerm) or contains(Camera3Serial,@strSearchTerm) or contains(Camera4Serial,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'ZMotor')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,ZMotor1,ZMotor2,ZMotor3
FROM Cl_Systems
WHERE contains(ZMotor1,@strSearchTerm) or contains(ZMotor2,@strSearchTerm) or contains(ZMotor3,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Stage')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Stage1,Stage2,Stage3
FROM Cl_Systems
WHERE contains(Stage1,@strSearchTerm) or contains(Stage2,@strSearchTerm) or contains(Stage3,@strSearchTerm)
return 0
end
if (@strSearchTermColumnName = 'Lens')
begin
SELECT CustomerID,SystemId,CompanyName,City,State,Country,Lens1,Lens2,Lens3
FROM Cl_Systems
WHERE contains(Lens1,@strSearchTerm) or contains(Lens2,@strSearchTerm) or contains(Lens3,@strSearchTerm)
return 0
end
View 4 Replies
View Related
Aug 9, 1999
I want to know how I can speed up inserting rows. Will stored procedures help at all? Any ideas are wanted. Thanks.
View 1 Replies
View Related