Please Help Speed Up My Query!

Jul 20, 2005

Hi there,

I had to rush some SQL and am now going back to it because of slow DB
performance. I have a database for sales leads and have created 3 views based
on the data I need to produce.

However, one of the views, which has subqueries to the other views, is VERY
slow and needs to be sped up, but I'm unsure how. Can anyone help? The SQL
is below:

SELECT CGAMain.CGAMainId, CGAMain.Salutation, CGAMain.[First Name],
       CGAMain.LastName, CGAMain.Salutation2, CGAMain.[First Name2], CGAMain.LastName2,
       CGAMain.Telephone, CGAMain.[Post Code],
       other.dbo.users.UserName, other.dbo.agents.[agent_name],
       Leadstatus.CurrentStatus, Leadstatus.ActiveStatus, Leadstatus.[no answer],
       CGAMain.rescode,
       Agent_displayname AS 'Called By'
FROM Leadstatus
INNER JOIN CGAMain ON Leadstatus.CGAMainId = CGAMain.CGAMainId
INNER JOIN other.dbo.users ON Leadstatus.LTAid = other.dbo.users.userref
INNER JOIN other.dbo.agents ON Leadstatus.agentid = other.dbo.agents.id
WHERE (StatusAssigned = 1)
  AND (Leadstatus.ActiveStatus = 'Active')

please help???
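For what it's worth, a hedged starting point rather than a verified fix: the WHERE and JOIN columns in the query above suggest which supporting index is worth testing first. The index name below is made up and the column list is taken straight from the query; compare the execution plan (and the definitions of the other views) before and after.

-- Sketch only: a composite index so the Leadstatus filter and joins can be
-- satisfied without scanning the whole table. Drop it if the plan ignores it.
CREATE INDEX IX_Leadstatus_Active
    ON Leadstatus (ActiveStatus, StatusAssigned, CGAMainId, LTAid, agentid)

Also note the cross-database joins to other.dbo.users and other.dbo.agents: if the other views referenced by this one do heavy work of their own, the slowness may sit there rather than in this statement.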


View 2 Replies



What Improves SQL Server Performance? HD Speed, Processor Speed Or RAM?

Oct 18, 2007



Hi,

I have several databases on a server (SQL Server 2000 only, no web server installed) and lately, as the company keeps growing, my users complain that the server gets slow (these DBs are well designed and receive optimizations, integrity checks, etc.). Because of this, I'm thinking about getting a new server to replace my old ProLiant ML 330, which was bought 4 years ago, but I'm concerned about which server architecture or characteristic can best help me improve response performance. Is it HD speed? Processor speed? Or more RAM? I want to make a good decision, so I'd really appreciate your help...

Thanks, Luis Luevano

View 1 Replies View Related

Speed Up Query

Jan 17, 2007

Hi,
Can anyone tell me a way to speed up these queries?
//This is selecting a number of records (sent by user) from a table and randomizing those
tempSQL.Text = "select top " + amount.Text + " number from [" + src.Text + "] Where pull='N' order by newID()";
 
SqlConnection conn2 = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2.Open();
SqlCommand cmd3 = new SqlCommand(tempSQL.Text, conn2);
cmd3.CommandTimeout = 1000;
SqlDataReader dr = cmd3.ExecuteReader();
//Then I open a data reader that uses the records
SqlConnection conn2a = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2a.Open();
while (dr.Read())
{
//the records are then placed 1 by one into a temp table
string fillresultID = "Insert into [" + src.Text + "_Additional_Temp] (number) Values('" + dr["number"] + "')";
SqlCommand cmd4 = new SqlCommand(fillresultID, conn2a);
cmd4.CommandTimeout = 0;
cmd4.ExecuteNonQuery();
//then the original table that held the numbers is marked as used(again one by one)
string update = "Update [" + src.Text + "] set pull='Y' where number='" + dr["number"] + "'";
SqlCommand cmd5 = new SqlCommand(update, conn2a);
cmd5.CommandTimeout = 0;
cmd5.ExecuteNonQuery();
}
dr.Close();
conn2.Close();
conn2a.Close();
Thanks,
Doug
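Not the poster's code, just a hedged sketch of an alternative: the per-row INSERT and UPDATE round trips can usually be collapsed into one set-based statement on the server (SQL Server 2005 or later). [Source] and [Source_Additional_Temp] stand in for the table names built from src.Text, and @amount for the value from amount.Text; passing them as parameters also avoids the SQL-injection risk of building the statements by string concatenation.

DECLARE @amount int
SET @amount = 1000   -- placeholder for the user-entered amount

-- Pick N random unused rows, mark them used, and copy them to the temp table
-- in a single statement instead of one INSERT plus one UPDATE per row.
;WITH picked AS
(
    SELECT TOP (@amount) number, pull
    FROM   [Source]
    WHERE  pull = 'N'
    ORDER BY NEWID()
)
UPDATE picked
SET    pull = 'Y'
OUTPUT inserted.number INTO [Source_Additional_Temp] (number)

One caveat: OUTPUT ... INTO requires the target table to have no enabled triggers and not take part in a foreign-key relationship, so verify that against the real [..._Additional_Temp] table.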

View 5 Replies View Related

How Can I Speed Up This Query

Apr 22, 2008

Hi, how can I speed up this query? It seems to be taking a very long time to bring back the results:
--This stored procedure retrieves access rights for users
CREATE PROCEDURE wc_User_Access_Right_List
AS
SELECT dbo.tblRep.Rep_ID,
       RTRIM(dbo.tblRep.Rep_Forename) + ' ' + RTRIM(dbo.tblRep.Rep_Surname) AS User_Full_Name,
       dbo.tblAccessRight.Access_Right,
       dbo.tblAccessRight.Access_Right_ID
FROM dbo.tblRep
LEFT OUTER JOIN dbo.tblAccessRight
    ON dbo.tblRep.Access_Right_ID = dbo.tblAccessRight.Access_Right_ID
ORDER BY User_Full_Name

--Make sure this has saved, if not return 10 as this is unexpected error
IF @@rowcount = 0 RETURN 10

DECLARE @RETURN_VALUE tinyint
IF @@error <> 0 RETURN @@error
GO

View 21 Replies View Related

Query Speed

May 20, 2008

Hello,
I am working with some third-party software and trying to rebuild my own queries for my own reports, using the business logic behind their queries so that I can be sure the numbers still match up and are usable. The reports in the third-party software seem to be built on the fly, as I found when running SQL Profiler, because there are no stored procedures. I have a query which returns a basic result set, but it takes 52 seconds to get it back, and I feel that is way too long for such a simple result set of returning a count by location.

The query creates a few temporary tables and combines and updates some data. I want to update this query to make it faster but am not sure of the best approach to take. The temporary tables create a lot of columns that I am not even going to need in the final result set or the update statements, so I thought removing some would help. I am going to paste in the original query, and if anyone has any ideas to make this more up to date with SQL 2005 standards and just faster, please feel free to let me know. Thanks! :)

CREATE TABLE #FI28EF3WE3DN
(practice_id char(4) null, location_id uniqueidentifier null, rendering_id uniqueidentifier null, service_item_id char(12) null, service_item_lib_id uniqueidentifier null, service_item_desc varchar(80) null, cpt4_code_id char(12) null, charge_amt numeric(19,2) null, quantity dec(19,2) null, begin_date_of_service char(8) null, closing_date char(8) null, create_timestamp datetime null, rvu1 float null, rvu1_total float null, rvu2 float null, rvu2_total float null, rvu3 float null, rvu3_total float null, rvu4 float null, rvu4_total float null, end_date_of_service char(8) null, midlevel_id uniqueidentifier null, unit_price numeric(19,2) null, override_ind char(1) null, modifier_1 char(2) null, modifier_2 char(2) null, modifier_3 char(2) null, modifier_4 char(2) null, link_id uniqueidentifier null, source_id uniqueidentifier null, enc_nbr numeric null, source_type char(1) null, name varchar(150) null, person_id uniqueidentifier null, date_of_birth char(8) null, patient_age char(8) null, address_line_1 varchar(55) null, address_line_2 varchar(55) null, city varchar(35) null, state varchar(3) null, zip char(9) null, country_id uniqueidentifier null, county_id uniqueidentifier null, county_desc varchar(100) null, sex char(1) null, sex_desc varchar(30) null, batch_info varchar(40) null, override_closing_date char(8) null, ud_demo1_id uniqueidentifier null, ud_demo2_id uniqueidentifier null, ud_demo3_id uniqueidentifier null, ud_demo4_id uniqueidentifier null, ud_demo5_id uniqueidentifier null, ud_demo6_id uniqueidentifier null, ud_demo7_id uniqueidentifier null, ud_demo8_id uniqueidentifier null, ud_demo9_id uniqueidentifier null, ud_demo10_id uniqueidentifier null, icd9cm_code_id varchar(10) null, icd9cm_code_id_2 varchar(10) null, icd9cm_code_id_3 varchar(10) null, icd9cm_code_id_4 varchar(10) null, icd9cm_code_desc varchar(255) null, icd9cm_code_desc_2 varchar(255) null, icd9cm_code_desc_3 varchar(255) null, icd9cm_code_desc_4 varchar(255) null, quadrant char(2) null, tooth char(2) null, surface_description varchar(200) null, form char(4) null, fin_class_id uniqueidentifier, person_payer_id uniqueidentifier, department_id uniqueidentifier, outsource_ind char(1) null, outsource_date char(8) null, outsource_agency_id uniqueidentifier, outsource_agency_desc varchar(40) null, fqhc_enc_ind char(1) null, fin_class varchar(100) null, payer_name varchar(40) null, person_nbr char(12) null, pat_prov_name_1 varchar(75), pat_prov_name_2 varchar(75), pat_prov_name_3 varchar(75), pat_prov_name_4 varchar(75), pat_prov_name_5 varchar(75), pat_prov_name_6 varchar(75), pat_prov_name_7 varchar(75), pat_prov_name_8 varchar(75), pat_prov_name_9 varchar(75), pat_prov_name_10 varchar(75), pat_prov_name_11 varchar(75), pat_prov_name_12 varchar(75))

INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, fin_class_id, person_payer_id, outsource_ind, outsource_date, outsource_agency_id, fqhc_enc_ind)

SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, p.financial_class, pe.cob1_person_payer_id, outsource_ind, outsource_date, outsource_agency_id, pe.fqhc_enc_ind

FROM charges c, patient_encounter pe, payer_mstr p

WHERE source_type = 'V' and c.practice_id = pe.practice_id and c.source_id = pe.enc_id and pe.cob1_payer_id = p.payer_id and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'

INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, outsource_ind, outsource_date, outsource_agency_id, fqhc_enc_ind) SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, outsource_ind, outsource_date, outsource_agency_id, pe.fqhc_enc_ind FROM charges c, patient_encounter pe

WHERE source_type = 'V' and c.practice_id = pe.practice_id and c.source_id = pe.enc_id and pe.cob1_person_payer_id is null and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'

INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, outsource_ind)

SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, c.outsource_ind FROM charges c

WHERE source_type = 'I' and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'

UPDATE #FI28EF3WE3DN
SET name = isnull(per.last_name,'') + ', ' + isnull(per.first_name,'') + ' ' + isnull(per.middle_name,''), date_of_birth = per.date_of_birth, patient_age = per.date_of_birth, city = per.city, zip = per.zip, country_id = per.country_id, sex = per.sex, address_line_1 = per.address_line_1, address_line_2 = per.address_line_2, state = per.state, county_id = per.county_id, person_nbr = per.person_nbr
FROM person per
WHERE #FI28EF3WE3DN.person_id = per.person_id

UPDATE #FI28EF3WE3DN
SET name = emp.name, city = emp.city, zip = emp.zip, address_line_1 = emp.address_line_1, address_line_2 = emp.address_line_2, state = emp.state, county_id = emp.county_id
FROM invoices inv, accounts ac, employer_mstr emp
WHERE #FI28EF3WE3DN.source_type = 'I' and #FI28EF3WE3DN.practice_id = inv.practice_id and #FI28EF3WE3DN.source_id = inv.invoice_id and inv.practice_id = ac.practice_id and inv.acct_id = ac.acct_id and ac.guar_type = 'E' and ac.guar_id = emp.employer_id

CREATE TABLE #GP28EF3WE3DQ
(practice_id char(4) null, location_id uniqueidentifier null, practice_name varchar(40) null, location_desc varchar(40) null, rendering_id uniqueidentifier null, attend_first_name varchar(25) null, attend_middle_name varchar(25) null, attend_last_name varchar(25) null, service_item_id char(12) null, service_item_desc varchar(80) null, cpt4_code_id char(12) null, cpt4_desc varchar(50) null, charge_amt numeric(19,2) null, quantity dec(19,2) null, begin_date_of_service char(8) null, closing_date char(8) null, create_timestamp datetime null, rvu1 float null, rvu1_total float null, rvu2 float null, rvu2_total float null, rvu3 float null, rvu3_total float null, rvu4 float null, rvu4_total float null, end_date_of_service char(8) null, midlevel_id uniqueidentifier null, unit_price numeric(19,2) null, override_ind char(1) null, modifier_1 char(2) null, modifier_2 char(2) null, modifier_3 char(2) null, modifier_4 char(2) null, link_id uniqueidentifier null, mid_last_name varchar(25) NULL, mid_first_name varchar(25) NULL, mid_middle_name varchar(25) NULL, link_ind char(1) null, name varchar(150) null, date_of_birth char(8) null, patient_age char(8) null, address_line_1 varchar(55) null, address_line_2 varchar(55) null, city varchar(35) null, state varchar(3) null, zip char(9) null, country_id uniqueidentifier null, county_id uniqueidentifier null, county_desc varchar(100) null, sex char(1) null, sex_desc varchar(30) null, batch_info varchar(40) null, override_closing_date char(8) null, location_subgrouping1_id uniqueidentifier null, location_subgrouping1_desc varchar(100) null, location_subgrouping2_id uniqueidentifier null, location_subgrouping2_desc varchar(100) null, provider_subgrp1_id uniqueidentifier null, provider_subgrp1_desc varchar(100) null, provider_subgrp2_id uniqueidentifier null, provider_subgrp2_desc varchar(100) null, ud_demo1_id uniqueidentifier null, ud_demo1 char(100) null, ud_demo2_id uniqueidentifier null, ud_demo2 char(100) null, ud_demo3_id uniqueidentifier null, ud_demo3 char(100) null, ud_demo4_id uniqueidentifier null, ud_demo4 char(100) null, ud_demo5_id uniqueidentifier null, ud_demo5 char(100) null, ud_demo6_id uniqueidentifier null, ud_demo6 char(100) null, ud_demo7_id uniqueidentifier null, ud_demo7 char(100) null, ud_demo8_id uniqueidentifier null, ud_demo8 char(100) null, ud_demo9_id uniqueidentifier null, ud_demo9 char(100) null, ud_demo10_id uniqueidentifier null, ud_demo10 char(100) null, icd9cm_code_id varchar(10) null, icd9cm_code_id_2 varchar(10) null, icd9cm_code_id_3 varchar(10) null, icd9cm_code_id_4 varchar(10) null, icd9cm_code_desc varchar(100) null, icd9cm_code_desc_2 varchar(100) null, icd9cm_code_desc_3 varchar(100) null, icd9cm_code_desc_4 varchar(100) null, quadrant char(2) null, tooth char(2) null, form char(4) null, quad_desc varchar(200) null, tooth_desc varchar(200) null, surface_description varchar(200) null, form_desc varchar(200) null, department_id uniqueidentifier, fin_class varchar(100) null, payer_name varchar(40) null, department varchar(100) null, outsource_ind char(1) null, outsource_date char(8) null, outsource_agency_desc varchar(40) null, fqhc_enc_ind char(1) null, person_nbr char(12) null, pat_prov_name_1 varchar(75), pat_prov_name_2 varchar(75), pat_prov_name_3 varchar(75), pat_prov_name_4 varchar(75), pat_prov_name_5 varchar(75), pat_prov_name_6 varchar(75), pat_prov_name_7 varchar(75), pat_prov_name_8 varchar(75), pat_prov_name_9 varchar(75), pat_prov_name_10 varchar(75), pat_prov_name_11 varchar(75), pat_prov_name_12 varchar(75))

INSERT into #GP28EF3WE3DQ
(practice_id, location_id, rendering_id, service_item_id, service_item_desc, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1, rvu1_total, rvu2, rvu2_total, rvu3, rvu3_total, rvu4, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, name, date_of_birth, patient_age, address_line_1, address_line_2, city, state, zip, country_id, county_id, sex, override_closing_date, batch_info, ud_demo1_id, ud_demo2_id, ud_demo3_id, ud_demo4_id, ud_demo5_id, ud_demo6_id, ud_demo7_id, ud_demo8_id, ud_demo9_id, ud_demo10_id, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, department_id, outsource_ind, outsource_date, outsource_agency_desc, fqhc_enc_ind, fin_class, payer_name, person_nbr, pat_prov_name_1, pat_prov_name_2, pat_prov_name_3, pat_prov_name_4, pat_prov_name_5, pat_prov_name_6, pat_prov_name_7, pat_prov_name_8, pat_prov_name_9, pat_prov_name_10, pat_prov_name_11, pat_prov_name_12)

SELECT practice_id, location_id, rendering_id, null , null , null , sum(isnull(charge_amt,0)), sum(isnull(quantity,0)), begin_date_of_service, null , null , null , sum(isnull(rvu1_total,0)), null , sum(isnull(rvu2_total,0)), null , sum(isnull(rvu3_total,0)), null , sum(isnull(rvu4_total,0)), null , null , null , null , null , null , null , null , null , name, date_of_birth, null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null

FROM #FI28EF3WE3DN group by practice_id, location_id, rendering_id, begin_date_of_service, name, date_of_birth

UPDATE #GP28EF3WE3DQ

SET attend_first_name = pm.first_name, attend_middle_name = pm.middle_name, attend_last_name = pm.last_name, provider_subgrp1_id = pm.provider_subgrouping1_id, provider_subgrp2_id = pm.provider_subgrouping2_id FROM provider_mstr pm

WHERE #GP28EF3WE3DQ.rendering_id = pm.provider_id

UPDATE #GP28EF3WE3DQ
SET practice_name = p.practice_name FROM practice p
WHERE #GP28EF3WE3DQ.practice_id = p.practice_id

UPDATE #GP28EF3WE3DQ
SET location_desc = l.location_name, location_subgrouping1_id = l.location_subgrouping1_id, location_subgrouping2_id = l.location_subgrouping2_id
FROM location_mstr l
WHERE #GP28EF3WE3DQ.location_id = l.location_id

UPDATE #GP28EF3WE3DQ SET link_ind = 'Y'
WHERE link_id is not null

UPDATE #GP28EF3WE3DQ

SET #GP28EF3WE3DQ.department = ml.mstr_list_item_desc
FROM mstr_lists ml
WHERE #GP28EF3WE3DQ.department_id = ml.mstr_list_item_id and mstr_list_type = 'department'

UPDATE #GP28EF3WE3DQ
SET county_desc = ml.mstr_list_item_desc
FROM mstr_lists ml
WHERE #GP28EF3WE3DQ.county_id = ml.mstr_list_item_id and ml.mstr_list_type = 'county'

select location_desc,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null, count(*) from #GP28EF3WE3DQ GROUP BY location_desc Order By location_desc
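If the final SELECT's per-location count is all this report actually needs, one hedged simplification to test is computing it straight from the base tables and skipping the wide temp tables. The sketch below copies the date/practice filters from the first INSERT above, approximates the #GP28EF3WE3DQ grouping by using person_id in place of name/date of birth, and ignores the separate 'V'/'I' payer logic, so treat it as a starting point for comparison rather than a drop-in replacement.

-- Sketch: count the grouped charge rows per location directly.
SELECT l.location_name, COUNT(*) AS row_count
FROM (
    SELECT c.practice_id, c.location_id, c.rendering_id,
           c.begin_date_of_service, c.person_id
    FROM   charges c
    WHERE  c.begin_date_of_service >= '20080401'
      AND  c.begin_date_of_service <  '20080501'
      AND  c.link_id IS NULL
      AND  c.practice_id = '0001'
    GROUP BY c.practice_id, c.location_id, c.rendering_id,
             c.begin_date_of_service, c.person_id
) AS g
LEFT JOIN location_mstr l ON l.location_id = g.location_id
GROUP BY l.location_name
ORDER BY l.location_name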

View 5 Replies View Related

Need To Speed Up Query.

Jul 20, 2005

OK guys, I'm fed up with the query below taking too much time. I CAN'T change the query since it is generated by a 3rd-party product. I can change indexes and add new indexes though.

The schema of the tables is given below. The most expensive operation is a bookmark lookup on VGNCCB_ROLE_JT. I created the speed_up_login index as a covering index to cover the query, but that has not seemed to help.

Any ideas and suggestions are most welcome...

select
    ROLE_ID, NAME, DESCRIPTION, CREATE_DATE, MODIFIED_DATE
FROM
    vign.VGNCCB_ROLE
WHERE
    ROLE_ID in (select ROLE_ID
                FROM vign.VGNCCB_ROLE_JT
                WHERE USER_NAME = 'XXXX'
                   or GROUP_ID in (select GROUP_ID
                                   FROM vign.VGNCCB_GROUP_USER_JT
                                   WHERE USER_NAME = 'XXXX'))

**************************************************
VGNCCB_ROLE_JT

Column_name    Type
ID             int
ROLE_ID        int
USER_NAME      nvarchar
GROUP_ID       int

Indexes:
PK__VGNCCB_ROLE_JT__218BE82B   clustered, unique, primary key located on PRIMARY   (ID)
speed_up_login                 nonclustered located on PRIMARY                     (USER_NAME, GROUP_ID, ROLE_ID)
VGNCCB_ROLE_JT_INDEX1          nonclustered located on PRIMARY                     (USER_NAME)
VGNCCB_ROLE_JT_INDEX2          nonclustered located on PRIMARY                     (GROUP_ID)

**************************************************
VGNCCB_GROUP_USER_JT

Column_name    Type
ID             int
GROUP_ID       int
USER_NAME      nvarchar

Indexes:
PK__VGNCCB_GROUP_USE__1DBB5747   clustered, unique, primary key located on PRIMARY   (ID)
VGNCCB_GROUP_USER_JT_INDEX1      nonclustered located on PRIMARY                     (GROUP_ID)
VGNCCB_GROUP_USER_JT_INDEX2      nonclustered located on PRIMARY                     (USER_NAME)
**************************************************
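Since the query itself is off-limits, a hedged indexing idea: bookmark lookups usually mean the index being used does not contain ROLE_ID, and the existing speed_up_login index leads with USER_NAME, so it cannot be seeked on GROUP_ID alone for the OR branch. Composite indexes that cover each branch are worth testing (names below are made up; compare the plan before and after and drop what isn't used):

CREATE NONCLUSTERED INDEX IX_ROLE_JT_USER_ROLE
    ON vign.VGNCCB_ROLE_JT (USER_NAME, ROLE_ID)

CREATE NONCLUSTERED INDEX IX_ROLE_JT_GROUP_ROLE
    ON vign.VGNCCB_ROLE_JT (GROUP_ID, ROLE_ID)

CREATE NONCLUSTERED INDEX IX_GROUP_USER_JT_USER_GROUP
    ON vign.VGNCCB_GROUP_USER_JT (USER_NAME, GROUP_ID)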

View 3 Replies View Related

Query Speed

May 25, 2006

I am facing a strange query speed problem. I have an ASP form that queries SQL Express based on date and branchID. I get different speeds for different dates. Even using CTP to check, I see the following problem, so it cannot be the ASP side:

1) On my May 20th version of the database, for a particular date, e.g. May 18: if I query all branches, i.e. no restriction on branchID, the query is fast (3 sec), but if I query one particular branch it can take 3 mins.

2) For the same query on the selected branch and the May 18 date as above, run against the May 18th backup database, the query runs in 3 sec. Speed difference!

3) For the query on the same selected branch, searching on dates May 10 and 11, both queries run about 3 min in both the 18th and 20th databases. Strange!

I cannot understand these strange query speed differences. Has anyone encountered anything like this?

Vincent
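One hedged guess rather than a diagnosis: the same query flipping between seconds and minutes depending on the date or branch value often comes down to stale statistics or a cached plan that was compiled for a very different parameter value. A cheap first test is to refresh statistics, clear the plan cache, and retry the slow date/branch combination:

-- Run in the affected database; safe, but cached plans will be recompiled.
EXEC sp_updatestats
DBCC FREEPROCCACHE

If the slow combination then runs fast, the issue is plan/statistics related rather than the data itself.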

View 1 Replies View Related

How To Speed Up The Query

Apr 21, 2008

I am doing a query on SQL Server that is taking so long because of the huge no. of records being fetched. I'm already using stored procedures. I'm trying to avoid using these:

select * from table limit 1000

or

select top 1000 from table

The above methods require a trip to the SQL server every time the next batch is fetched. Is there a way to make just one trip to the server but still show some records while the rest are being fetched? In other words, if I'm getting a million records, can I show a thousand to the user while waiting for the rest? When I do a query in SQL Server Management Studio, it shows me some records immediately while the rest are still being fetched. Is there a way in C# (code-behind) to get the first batch of records from SQL Server before it finishes getting all the records? That way, the first batch of records could be shown in a DataGrid already, and the DataGrid updated later when all the records are in.
Thanks in advance.

View 3 Replies View Related

Sql Query Execution Speed

Sep 25, 2000

Hi,

Is there anything to be gained in a single table query by using :

tablename.columnname1, tablename.columnname2 etc

vs just columnname1,columnname2 ?

Thanks,
Judith

View 1 Replies View Related

Improving The Speed Of A LIKE Query

May 16, 2007

Hi all,

My colleague and I are having some difficulties regarding the speed of a LIKE query on our intranet system. Please see the quote below as posted on another SQL forum (To which no one has responded yet ).

Quote: Howdy,

I've taken control of an SQL Database with an ASP front end. The CPU usage of the SQL Server process periodically jumps up to between 75-99%. I've managed to identify the ASP page that is causing it and it's a search page.

It's basically doing something like this:

SELECT JobNumber, CustomerName, CustomerAddress, CustomerPostCode
FROM Customers
WHERE CustomerName LIKE '%query%'
OR CustomerAddress LIKE '%query%'
OR CustomerPostCode LIKE '%query%'

Where query is the value the user has searched on.

The CustomerName and CustomerPostCode fields are varchar fields, but the CustomerAddress field is a text field.

I know if I replace the CustomerAddress text field with a series of varchar fields then I could make them non-clustered indexes, but would this help speed things up with a LIKE query?

There's a lot of work required on the ASP side if I'm going to split the field out, so I only want to do it if I'm gonna get a significant benefit from doing it.

Failing this, is there a better way of performing a search like this?

Any help is much appreciated,

Thanks!
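For what it's worth: a leading-wildcard LIKE ('%query%') can never seek an ordinary index, whether the columns are varchar or text, so splitting CustomerAddress into indexed varchar columns would probably not change much for this search pattern. Full-Text Search is the usual alternative for "word anywhere in the field" lookups. A hedged sketch using SQL Server 2005 syntax; the catalog name is made up, and PK_Customers stands for whatever unique index already exists on the table:

-- One-time setup: full-text catalog and index over the three searched columns.
CREATE FULLTEXT CATALOG CustomerSearch
GO
CREATE FULLTEXT INDEX ON dbo.Customers
    (CustomerName, CustomerAddress, CustomerPostCode)
    KEY INDEX PK_Customers
    ON CustomerSearch
GO

-- The search page then uses CONTAINS instead of three LIKE '%...%' tests.
DECLARE @query varchar(100)
SET @query = 'smith'   -- placeholder for the user's search term

SELECT JobNumber, CustomerName, CustomerAddress, CustomerPostCode
FROM   dbo.Customers
WHERE  CONTAINS((CustomerName, CustomerAddress, CustomerPostCode), @query)

The matching behaviour differs from LIKE (it is word-based, so partial postcodes may need their own handling), so this is a trade-off rather than a straight swap.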

View 9 Replies View Related

How To Speed Up The Query Execution In Sp.

Aug 21, 2004

Hi,

I am using a stored procedure, and in this SP I have a simple SELECT statement. I have found that when I execute this SP in Query Analyzer it takes about 8-10 minutes to show the output. The table has thousands of records. I can rebuild the indexes on the table, but apart from that, what else can I do to speed up the query?

I know there is something like using indexes explicitly in a SQL query. Is that true? If yes, please show me how to use it, with an example.

Also, is there any other way to run the query much faster?

Please help me, it's very urgent.

Thanks And Regards,
Shailesh
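Since the post asks specifically for an example of using an index explicitly: a table hint does that, though it is normally a last resort because it overrides the optimizer even when the forced index is a poor choice. A hedged sketch with made-up table, column, and index names:

-- WITH (INDEX(...)) forces the named index for this table reference.
SELECT OrderID, CustomerID, OrderDate
FROM   dbo.Orders WITH (INDEX(IX_Orders_OrderDate))
WHERE  OrderDate >= '20040101'
  AND  OrderDate <  '20040201'

Before forcing anything, it is usually more productive to look at the execution plan in Query Analyzer, make sure statistics are current (sp_updatestats), and check that the WHERE clause can actually use an existing index (no functions wrapped around the column, matching data types, and so on).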

View 6 Replies View Related

Query Execution Speed

Jul 23, 2005

Hi there - I'm hoping someone can help me! I'm having a problem with a live database that I'm running on MSDE - it seems to have slowed down quite considerably from the test environment (even when all the data is the same). This is most noticeable on one particular query that takes 1 sec on the test machine and almost 1 min on the live machine.

The total number of user connections on the live machine is normally 4 or so (found out through Performance Monitor), so I can't see that it's MSDE's performance throttler... Has anybody got any ideas on things I can check?

Many thanks,
James

View 8 Replies View Related

Low Query Excuting Speed

Dec 14, 2006

Hi.
Sometimes I get a delay when running a particular query. I have checked it in my app and also in SQL from Visual Studio. I'm using the SQL Express edition. If I use the SQL Server Developer edition, does the speed go up?

View 1 Replies View Related

Is Order By Affect The Query Speed

Aug 7, 2007

hello,

I have a query that inserts into a new table, and then I select from this table.

If I add ORDER BY to the INSERT INTO script, does it affect the speed of the SELECT?

I have a big table that takes about 70 sec.

View 14 Replies View Related

Compressed Folder && Query Speed

Nov 23, 2005

One way to save storage space is to put the SQL data files into a compressed folder. Has anyone got any idea how this will affect query speed?

View 2 Replies View Related

Does 'Group By' Affect The Query Speed?

Nov 16, 2006

I have a table such as below:

Name1, Name2, Name3, Name4, C1, C2, ..., C100

and in this table I have an index on Name1-Name4.

I don't know why the SQL below is very slow:

select
Name1, sum(C1), ...., Sum(C100)
from
(
select
Name1, Name2, sum(C1) as C1, ...., Sum(C100) as C100
from
(
select
Name1, Name2, Name3, sum(C1) as C1, ...., Sum(C100) as C100
from
(
select
Name1, Name2, Name3, Name4, C1, ...., C100
from
My_Table
group by Name1, Name2, Name3, Name4
) as T
group by Name1, Name2, Name3
) as T
group by Name1, Name2
) as T
group by Name1



Does 'Group By' affect the speed of query?
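If the end result is just a per-Name1 total, the nesting itself may be the cost: each level re-groups and re-sums the same 100 columns. The innermost level as written (grouping by Name1..Name4 while selecting C1..C100 unaggregated) would not normally be accepted by SQL Server anyway, so assuming the intent is simply a total per Name1, a single pass is worth comparing against the nested version:

-- Hedged sketch: one GROUP BY instead of four nested ones.
SELECT Name1,
       SUM(C1) AS C1,
       SUM(C2) AS C2
       -- ... repeat through SUM(C100) AS C100
FROM   My_Table
GROUP BY Name1

Whether GROUP BY is "slow" mostly depends on how many rows have to be read and sorted or hashed; an index on Name1 (or Name1..Name4) mainly helps by letting the engine avoid a separate sort.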

View 1 Replies View Related

Query Speed From ISQL/w For 15000 Records

Oct 27, 2000

A simple 'select' from ISQL/W is taking more than 2 mins to return 20000 records. Is that normal?
Is the time shown at the bottom of the window the correct measure of response time, or is there another way to measure it?

Thanks

View 1 Replies View Related

Large Database, Slow Query Speed. Help!

May 29, 2008

Hi guys,

I am asking this question on behalf of a friend. I have little knowledge of SQL 2005 but my friend is quite knowledgeable, although this is the first time he is dealing with large database for a client. So here's the story.

His client has a database containing 1.5 million books. Now he is setting up a website which will enable users to search for books. Searching by ISBN is no problem as it only takes 1 second. The problem is that searching by Title takes more than 20 seconds, which is unacceptable. My friend has only done smaller databases, and he just recently thought of implementing indexing and is now looking for other ideas.

Each row contains book details such as Title, Author1, Author2, Author3, Publisher, Publication Date, ISBN, etc.

Can anyone who is more experienced with large databases share some design ideas? His client is aiming for 8 seconds or less.

Thanks in advance!
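Without the schema, only hedged generalities: if the title search is a leading-wildcard LIKE ('%word%'), no ordinary index can be seeked and SQL Server 2005 Full-Text Search is the usual route; if users mostly type the start of the title, a plain nonclustered index plus a prefix LIKE can come in well under the 8-second target. Table and column names below are assumptions:

-- One-time: index the searched column.
CREATE NONCLUSTERED INDEX IX_Books_Title ON dbo.Books (Title)

-- Prefix search can seek the index; LIKE '%word%' cannot.
DECLARE @search nvarchar(100)
SET @search = N'Harry'   -- placeholder search text

SELECT TOP (50) Title, Author1, ISBN
FROM   dbo.Books
WHERE  Title LIKE @search + N'%'
ORDER BY Title

Returning a limited page of results (TOP plus paging) rather than every match also keeps response time flat as the catalogue grows.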

View 14 Replies View Related

Query Optimization: CPU Speed Or Logical Reads Better?

Dec 12, 2005

How do I determine which method I should use if I want to optimize the performance of a database? I took the Northwind database to run my example. My query: I want to retrieve the Employees' First and Last Names that sold between $100,000 and $200,000.

First let me create a function that takes the EmployeeID as the input parameter and returns the Employee's First and Last name:

CREATE FUNCTION dbo.GetEmployeeName(@EmployeeID INT)
RETURNS VARCHAR(100)
AS
BEGIN
    DECLARE @NAME VARCHAR(100)
    SELECT @NAME = FirstName + ' ' + LastName
    FROM Employees
    WHERE EmployeeID = @EmployeeID
    RETURN ISNULL(@NAME, '')
END

My first method to run this:

SELECT EmployeeID, dbo.GetEmployeeName(EmployeeID) AS Employee,
       SUM(UnitPrice * Quantity) AS Amount
FROM Orders
JOIN [Order Details] ON Orders.OrderID = [Order Details].OrderID
GROUP BY EmployeeID, dbo.GetEmployeeName(EmployeeID)
HAVING SUM(UnitPrice * Quantity) BETWEEN 100000 AND 200000

It runs in 4 seconds. And here are the Statistics IO and Time results:

SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 17 ms, elapsed time = 17 ms.
(3 row(s) affected)
Table 'Order Details'. Scan count 1, logical reads 10, physical reads 0, read-ahead reads 0.
Table 'Orders'. Scan count 1, logical reads 21, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3934 ms.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3935 ms.
SQL Server Execution Times: CPU time = 3844 ms, elapsed time = 3935 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.

Now my 2nd method:

IF (SELECT OBJECT_ID('tempdb..#temp_Orders')) IS NOT NULL
    DROP TABLE #temp_Orders
GO
SELECT EmployeeID, SUM(UnitPrice * Quantity) AS Amount
INTO #temp_Orders
FROM Orders
JOIN [Order Details] ON Orders.OrderID = [Order Details].OrderID
GROUP BY EmployeeID
HAVING SUM(UnitPrice * Quantity) BETWEEN 100000 AND 200000
GO
SELECT EmployeeID, dbo.GetEmployeeName(EmployeeID), Amount
FROM #temp_Orders
GO

It runs in 0 seconds. And here are the Statistics IO and Time results:

SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 0 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.
Table '#temp_Orders0000000000F1'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0.
Table 'Order Details'. Scan count 830, logical reads 1672, physical reads 0, read-ahead reads 0.
Table 'Orders'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 19 ms.
(3 row(s) affected)
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 19 ms.
SQL Server Execution Times: CPU time = 15 ms, elapsed time = 20 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 1 ms.
(3 row(s) affected)
Table '#temp_Orders0000000000F1'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server Execution Times: CPU time = 0 ms, elapsed time = 3 ms.
SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 0 ms.

By the way, why does "SQL Server Execution Times" appear 3 times and not just once?

Summary:
The first code is clean, a single SELECT statement, but takes 4 long seconds to execute. The logical reads are very few compared to the second method.
The second code is less clean and uses a temp table, but takes 0 seconds to execute. The logical reads are way too high compared to the first method.

What am I supposed to conclude from this example? Which method should I use over the other, and why? Are both methods good depending on which I prefer? If I can wait four seconds, is it better to reduce the logical reads in order to cause less blocking on the live tables in a heavily accessed database? Which method should I choose on my own database?

Calling a function like dbo.GetEmployeeName gets processed for each returned row, correct? That means if I had a scenario where 1000 records were to be returned, would it be better to dump the 1000 records into a temp table variable and then call the function to process each record one at a time? Or would the direct approach without a temp table cause slower processing and more blocking/deadlocks, because I am calling the function per row while accessing the tables directly?

Thank you

View 1 Replies View Related

Sql Query Speed 2000 Versus 2005

Aug 15, 2007

We have an interesting problem. We are attempting to migrate from SQL 2000 to SQL 2005. The schema we have is exactly the same, and the new 2005 box is more powerful than our 2000 box.

here is our schema:

tbl_Items
ItemID int pk
ReferenceID int
sessionid varchar(255)
StatusID int

tbl_ItemsStatus
statusid int pk
isinternalstatus bit

there is an index on (ReferenceID, SessionID, StatusID) and (SessionID, StatusID)

this is the query:

DECLARE @referenceid INTEGER
SET @referenceid = 1019

SELECT MAX(i2.itemid)
FROM tbl_Items i2 (NOLOCK)
JOIN tbl_ItemsStatus s (NOLOCK)
ON i2.StatusID = s.StatusID
WHERE
s.IsInternalStatus = 0
AND i2.referenceid = @referenceid
AND i2.sessionid IN (
SELECT i3.sessionid
FROM tbl_Items i3 (NOLOCK)
WHERE
i3.referenceid = @referenceid
AND i3.status <> 7
AND i3.status <> 8
AND i3.status <> 10
AND i3.itemid IN (
SELECT max(i4.itemid)
FROM tbl_Items i4 (NOLOCK)
WHERE i4.referenceid = @referenceid
GROUP BY i4.sessionid
)
AND i3.itemid NOT IN (
SELECT MAX(i7.itemid )
FROM tbl_Items i7 (NOLOCK)
WHERE
i7.referenceid = @referenceid
AND i7.SessionID IN (
SELECT i5.SessionID
FROM tbl_Items i5 (NOLOCK)
WHERE
i5.status <> 11
AND i5.referenceid = @referenceid
AND i5.itemid IN (
SELECT MAX(i6.itemid)
FROM tbl_Items i6 (NOLOCK)
WHERE
i6.referenceid = @referenceid
AND i6.status IN (7,11,8)
GROUP BY i6.sessionid
)
)
GROUP BY i7.SessionID
)
)

GROUP BY i2.sessionid

We know this query is pretty bad and can be optimized. However, if we run this query as-is on 2005 it takes about 2 hours to run; if we run the exact same query on 2000 it takes 9 seconds.

So this query on 2005 takes 2 hours; however, if we omit the s.IsInternalStatus = 0 or the i2.referenceid = @referenceid line, it takes about 9 seconds.

Why would this be? It makes no sense that omitting one of those WHERE clauses would improve the performance of the query by 2 hours. We know it's a bad query... but this doesn't make sense.

Has anyone else run into this problem?

View 1 Replies View Related

GROUP BY Speed Up Query On Data That Is Already At Lowest Granularity?

Jul 9, 2015

I have just been running a query which I was planning on improving by removing a redundant GROUP BY (there are about 20 columns, and one of the columns returned is atomic, so will mean that the "group by" will never manage to group any of the data) but when I modified the query to remove the grouping, this actually seems to slow the query, and I can't see why this would be the case.

Both queries return the same number of rows (69000), as I expected, and looking at the query plan, then they look nearly identical, other than at the start, there is a "stream aggregate" and "sort" being performed. The estimated data size is 64MB for the non-grouped query (runs in 6 min 41 secs), vs 53MB for the aggregated query (runs in 5 min 31 secs), and the estimated row size is smaller when aggregated.

Can anyone rationalise this? In my mind, the data being pulled is identical, plus there is extra computation for the unnecessary aggregation, so the aggregated query should be unquestionably slower, but the database engine has other ideas; it seems to be able to work more quickly when it needs to do unnecessary work :) Perhaps something to do with an inefficient query plan for the non-aggregated query? I would have thought looking at the actual execution plan might have made this apparent, but both plans look very similar.

Edit: More information, the "group by" query had two aggregations on it, a count of one of the columns, and an average of another one. I changed this so that it was just "1" instead of the count, and for the average, I changed it to be the expression within the average aggregate, since the aggregation effectively does not do anything.

View 2 Replies View Related

Write A Query To Group Records Based On Speed (specific Value Of Zero)?

Jun 28, 2012

I need to write a query to group records based on speed (specific value of zero). Consider the following scenario:

Table - Vehicle_Event

Vehicle_Id____Date_Time______________Speed
C1____________2012-06-28_10:10:00____5
C1____________2012-06-28_10:11:00____0
C1____________2012-06-28_10:12:00____0
C1____________2012-06-28_10:13:00____4
C1____________2012-06-28_10:14:00____3

[code].....

OUTPUT_Required:

Vehicle_Id____Date_Time___________________________ __________Speed
C1____________2012-06-28_10:10:00___________________________5
C1____________2012-06-28_10:11:00_to_2012-06-28_10:12:00____0
C1____________2012-06-28_10:13:00___________________________4
C1____________2012-06-28_10:14:00___________________________3
C1____________2012-06-28_10:15:00_to_2012-06-28_10:18:00____0

[Code] .....

I need the start and end time of consecutive records of the same vehicle with 0 speed ordered by date_time. If there is more than one consecutive record with zero speed it needs to be grouped together.
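A hedged sketch of one way to do this with the row-number difference ("gaps and islands") technique; it assumes SQL Server 2005 or later and the table/columns shown above, and it returns the start and end as two columns (the "x to y" string formatting can be layered on top):

-- Collapse consecutive zero-speed rows per vehicle; non-zero rows pass through.
;WITH numbered AS
(
    SELECT  Vehicle_Id, Date_Time, Speed,
            ROW_NUMBER() OVER (PARTITION BY Vehicle_Id
                               ORDER BY Date_Time) AS rn_all,
            ROW_NUMBER() OVER (PARTITION BY Vehicle_Id,
                                            CASE WHEN Speed = 0 THEN 0 ELSE 1 END
                               ORDER BY Date_Time) AS rn_state
    FROM    Vehicle_Event
)
SELECT  Vehicle_Id,
        MIN(Date_Time) AS Start_Time,
        MAX(Date_Time) AS End_Time,
        0              AS Speed
FROM    numbered
WHERE   Speed = 0
GROUP BY Vehicle_Id, rn_all - rn_state

UNION ALL

SELECT  Vehicle_Id, Date_Time, Date_Time, Speed
FROM    Vehicle_Event
WHERE   Speed <> 0

ORDER BY Vehicle_Id, Start_Time

Within a run of consecutive zero-speed rows, rn_all - rn_state stays constant and changes whenever the run is broken by a non-zero row, which is what makes the grouping work.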

View 6 Replies View Related

SQL Server 2008 :: How To Improve Speed Of Initial Query Vice Subsequent Queries

Apr 23, 2015

I have a pretty large DB and a fairly complex query. If I drop buffers and clear cache the query runs in 20 seconds returning 25K rows. Subsequent runs are 2 seconds. Is this the result of the results being cached, execution being cached, other? Are there good ways to close the gap between the initial and later runs? Does the cache stay present until the service restarts or does SQL recycle the memory and if so, based on what criteria?
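A hedged reading of the gap rather than a definitive answer: the first run pays for physical reads (pulling pages from disk into the buffer pool) and, if the plan cache was cleared too, for compiling the plan; later runs reuse both. Both caches stay resident until the service restarts or memory pressure makes SQL Server evict pages and plans, roughly on a least-recently-used basis. Closing the gap generally means fewer physical reads (more RAM, covering indexes, a smaller working set) or deliberately warming the cache after a restart. For testing, the cold state can be reproduced on a dev box like this:

-- Dev/test only: simulate a cold start before timing the query.
CHECKPOINT                 -- write dirty pages so clean buffers can be dropped
DBCC DROPCLEANBUFFERS      -- empty the buffer pool (forces physical reads)
DBCC FREEPROCCACHE         -- empty the plan cache (forces recompilation)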

View 5 Replies View Related

Transact SQL :: Query To Avoid IF And CASE And Calculations Carried Out Once Only To Speed Up With Common Comparing Columns

Oct 22, 2015

I've got a query taking too much time because of the lack of cross-column MAX/MIN functions. Consider a similar example, where a view is required to reflect the distribution of water among different towns, each having four distribution reservoir tanks of different sizes. In this case the basic table has columns like:

PurchaseDate
TownName
QuantityPurchased
Tank1_Size
Tank2_Size
Tank3_Size
Tank4_Size

Now suppose I need a query to distribute QuantityPurchased into four additional columns, computed from the sizes declared in the last four fields, in the same order of preference. For example, I have to use an IIF to check whether the quantity purchased is less than Tank1_Size: if yes, then the quantity purchased, otherwise Tank1_Size itself, for Tank1_Filled.

Then again an IIF, but this time to check whether the quantity purchased less Tank1_Filled (which again needs to be calculated as above) is less than Tank2_Size: if yes, then that remainder, otherwise Tank2_Size itself, for Tank2_Filled. And so on for the remaining tanks.
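One hedged way to calculate each fill level exactly once and reuse it (so the nested IIF/CASE logic doesn't repeat the same arithmetic) is a chain of CROSS APPLYs, where each step can reference the previous step's result. WaterPurchases is an assumed table name; the columns follow the list above:

-- Each APPLY computes one tank's fill from the running remainder.
SELECT  p.PurchaseDate, p.TownName, p.QuantityPurchased,
        t1.Tank1_Filled, t2.Tank2_Filled, t3.Tank3_Filled, t4.Tank4_Filled
FROM    WaterPurchases AS p
CROSS APPLY (SELECT CASE WHEN p.QuantityPurchased < p.Tank1_Size
                         THEN p.QuantityPurchased ELSE p.Tank1_Size END AS Tank1_Filled) AS t1
CROSS APPLY (SELECT CASE WHEN p.QuantityPurchased - t1.Tank1_Filled < p.Tank2_Size
                         THEN p.QuantityPurchased - t1.Tank1_Filled ELSE p.Tank2_Size END AS Tank2_Filled) AS t2
CROSS APPLY (SELECT CASE WHEN p.QuantityPurchased - t1.Tank1_Filled - t2.Tank2_Filled < p.Tank3_Size
                         THEN p.QuantityPurchased - t1.Tank1_Filled - t2.Tank2_Filled ELSE p.Tank3_Size END AS Tank3_Filled) AS t3
CROSS APPLY (SELECT CASE WHEN p.QuantityPurchased - t1.Tank1_Filled - t2.Tank2_Filled - t3.Tank3_Filled < p.Tank4_Size
                         THEN p.QuantityPurchased - t1.Tank1_Filled - t2.Tank2_Filled - t3.Tank3_Filled
                         ELSE p.Tank4_Size END AS Tank4_Filled) AS t4

Wrapping this in a view keeps the calculation in one place, and each CASE is evaluated once per tank instead of once per place it is referenced.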

View 2 Replies View Related

SQL Speed

Sep 19, 2006

Andy writes "I have data with more than 4 million rows. How do I speed up queries on it?"

View 3 Replies View Related

Speed

Mar 9, 2007

hello,
I need some opinions on how to sum up or group by more than 2k records faster. E.g., how do I optimize this?

SELECT DISTINCT r.ClientID,c.ClientName, r.ItemID, r.StockID,r.StockName, r.ExpectedQty,r.QCQty,r.AVAQty,r.PNDQty as pnd, r.VMIQCQty,r.VMIAVAQty,r.VMIPNDQty as vmipnd,

(Select isnull( SUM(d.HoldQty) ,0) FROM tblItemdetail d WHERE d.itemid=r.itemid AND d.ConsignorID=@ClientID AND d.Ownerstatus='VMI') AS VMIPNDQty,

(Select isnull( SUM(d.HoldQty) ,0) FROM tblItemdetail d WHERE d.itemid=r.itemid AND d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='VMI') AS PNDQty,

(Select isnull(SUM(d.OriginQty - d.PickQty -d.HoldQty -d.qcqty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustAVAQty,

(Select isnull(SUM(d.HoldQty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustPNDQty,

(Select isnull(SUM(d.qcqty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustQCQty

FROM tblItemCrossRef r
INNER JOIN tblClient c ON c.ClientID=r.ClientID
INNER JOIN tblItemClients i on i.Supplier=r.ClientID
WHERE r.ClientID=@ClientID AND r.StockID LIKE @StockID+'%'

~~~Focus on problem, not solution~~~
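Not a finished rewrite, just a hedged restructuring idea: each of those correlated subqueries rescans tblItemDetail once per output row, so pre-aggregating that table in a single pass (conditional SUMs) and joining the result back usually removes most of the repeated work. The sketch below covers only the two pending quantities that filter by itemid; the StockAtCust* columns aggregate without an itemid filter in the original, so they would need their own small derived table in the same style. Parameter types are guesses, and Ownership = @ClientID is only equivalent to Ownership = i.Supplier because the outer query already forces i.Supplier = r.ClientID = @ClientID.

DECLARE @ClientID varchar(20), @StockID varchar(50)   -- same parameters as above; types assumed
SET @ClientID = 'C001'                                 -- placeholder values
SET @StockID  = ''

SELECT  r.ClientID, c.ClientName, r.ItemID, r.StockID, r.StockName,
        r.ExpectedQty, r.QCQty, r.AVAQty,
        ISNULL(d.VMIPNDQty, 0) AS VMIPNDQty,
        ISNULL(d.PNDQty, 0)    AS PNDQty
FROM    tblItemCrossRef r
INNER JOIN tblClient c      ON c.ClientID = r.ClientID
INNER JOIN tblItemClients i ON i.Supplier = r.ClientID
LEFT JOIN
(
    -- one scan of tblItemDetail instead of a subquery per output row
    SELECT  itemid,
            SUM(CASE WHEN Ownerstatus = 'VMI' THEN HoldQty ELSE 0 END) AS VMIPNDQty,
            SUM(CASE WHEN Ownerstatus = 'VMI' AND Ownership = @ClientID THEN HoldQty ELSE 0 END) AS PNDQty
    FROM    tblItemDetail
    WHERE   ConsignorID = @ClientID
    GROUP BY itemid
) AS d ON d.itemid = r.ItemID
WHERE   r.ClientID = @ClientID
  AND   r.StockID LIKE @StockID + '%'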

View 5 Replies View Related

Speed Up UDF

Jun 22, 2006

Hello all - Given the following UDF, in SQL 2000 can it be sped up, compiled, or anything of the like? A query returning 300,000+ rows times out when run through the UDF; inline CASE statements return the rows in 5 seconds.

Thanks!
Jeff

CREATE FUNCTION dbo.TimeFormat
(
    @input datetime,
    @groupformat varchar(20) --DAY, WEEK, MONTH
)
RETURNS datetime
AS
BEGIN
    DECLARE @dtvar AS datetime

    IF @groupformat = 'DAY'
        SET @dtvar = CAST(CONVERT(char(10), @input, 101) AS datetime)
    ELSE IF @groupformat = 'WEEK'
        SET @dtvar = CAST(DATEADD([DAY], 1 - DATEPART(dw, CONVERT(char(10), @input, 101)), CONVERT(char(10), @input, 101)) AS datetime)
    ELSE IF @groupformat = 'MONTH'
        SET @dtvar = CAST(CONVERT(CHAR(6), @input, 112) + '01' AS datetime)

    RETURN @dtvar
END

View 9 Replies View Related

I Need BETWEEN On Speed

Jul 20, 2005

This is x-posted in:
alt.php.sql
comp.databases.ms-sqlserver
microsoft.public.sqlserver.programming

I have events that occur during the day. I want to be able to search those by a form with checkboxes (multiple select). Let's say for instance an event is happening from 3-10pm. When someone searches for 4-6 (checkbox option) it needs to show up.

I don't need code so much as I just need theory. My theory, which I coded out and which worked apart from one oversight, is as follows. I did a BETWEEN call that pulled any event that began BETWEEN 4 AND 6 or ended BETWEEN 4 AND 6. As you can see, the event spans that time but does not start or stop between 4 and 6, and thus was not pulled. Oops.

So if someone can tell me of another function, or perhaps just a better way to use BETWEEN, that would be great. I don't think code is necessary at this juncture, so save the 'Please post code' post :) Thanks.
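Since only the theory was asked for: the standard overlap test is "the event starts before the searched window ends AND ends after the window starts", which also catches events that span the whole window; two BETWEENs on the start and end times miss exactly that case. A minimal hedged sketch with assumed table/column names:

-- Assumed schema: Events(EventID, StartTime, EndTime); @From/@To come from the checkboxes.
DECLARE @From datetime, @To datetime
SET @From = '2005-07-20 16:00'   -- 4pm
SET @To   = '2005-07-20 18:00'   -- 6pm

SELECT EventID, StartTime, EndTime
FROM   Events
WHERE  StartTime <= @To
  AND  EndTime   >= @From        -- overlap test, replaces the two BETWEEN checks

A 3pm-10pm event passes because it starts before 6pm and ends after 4pm.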

View 12 Replies View Related

Sp_executesql And Speed

Jul 12, 2006

Hello,
 
I am using sp_executesql to pass parameters to a SQL string, and I am seeing a deadlock between sp_prepexec, which does an UPDATE, and another UPDATE done by another process. When it comes to speed and deadlocks, would you recommend not using sp_executesql?

View 1 Replies View Related

Variables And Speed

May 28, 2002

I have a SELECT statement that has many ANDs where one side uses the same thing, for example:

where
t1.column1 = r3.other and
t1.column1 = e5.new and
t1.column1 = k9.old etc...

Is there any speed gained by putting the value for column1 into a variable and using that variable each time instead of t1.column1?

Thanks,
Eddie

View 1 Replies View Related

Optimizing Speed

Sep 4, 2001

I have indexed my SQL Server tables to gain some speed when calling up tables and queries (using VB and ADO). It is still very slow... Is there something I have to do once my tables are indexed, or are there any tricks to improve the speed? I am getting kind of desperate right now :(

View 1 Replies View Related

Tables And Speed

Dec 10, 1999

I am in the planning stages of a website and want to design my database to allow for optimum performance in case the site becomes popular. In my database I will have around 5 main categories, with each category having around 25 or 30 subcategories and each subcategory having around 100 to 300 items. I plan to use MS SQL Server 7. Would it be best to have one large table, or 5 tables for the main categories, or around 100 tables, one for each subcategory? Database usage will be simple (no complex queries), but most pages will hit the database, and I need to allow for a potential of 4 or 5 million page views per month. The SQL Server will be a shared one.

Randall

View 3 Replies View Related

Speed Up Blobs

Jan 4, 1999

Howdy,

Anyone got any tricks for speeding up storing and retrieving image data? I have a typical 2MB image and I am trying to improve speed... it now takes 8 seconds to store the blob.

thanks

Dick Butler

View 1 Replies View Related






