How To Speed Up Database
Nov 8, 2006 - I have a database that contains over 500,000 rows. The speed of data execution is extremely slow. Is there any way to speed up the operation?
Thanks.
Hi,
I have several databases on a server (SQL Server 2000 only, no web server installed) and lately, as the company keeps growing, my users complain that the server gets slow (these databases are well designed and receive optimizations, integrity checks, etc.). Because of this, I'm thinking about getting a new server to replace my old ProLiant ML 330, which was bought 4 years ago, but I'm not sure which server architecture or characteristic can best improve response performance: is it HD speed? Processor speed? Or more RAM? I want to make a good decision, so I'd really appreciate your help...
Thanks, Luis Luevano
Our DBAs want to speed up the SQL database backup of a 4 GB database to a network drive on another server. We are using Windows NT 4.0 (SP3) and SQL Server 6.5 (SP4). We know that the "backup buffer size" SQL configuration parameter is one way of doing it. Below are the results of increasing the value:
1. If the value is more than 10, the runnable value is always 10, even after stopping and starting the SQL Server services.
2. The SQL backup took much longer than before when the value was increased from 5 to 10.
How do we do this properly? What could be the problem? And are there any other considerations to meet?
Note that the total RAM of the servers is 512 MB, of which 128 MB is allocated to SQL Server.
Anyone's experience and cooperation will be highly appreciated.
Regards,
Einas (DBA)
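For reference, a minimal sketch of how the backup buffer size option is usually inspected and changed with sp_configure on SQL Server 6.5; the value 8 below is only an illustration, not a recommendation:
-- show the configured value versus the value currently in effect (run value)
EXEC sp_configure 'backup buffer size'
-- try a new value, then re-check that the run value actually took effect
EXEC sp_configure 'backup buffer size', 8
RECONFIGURE
EXEC sp_configure 'backup buffer size'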
Well, good morning/afternoon to everyone. It's been a while since I've posted here and it seems that the site is a lot faster now. Good to see. :) Anyway, I'm working a current problem here at work with our database being quite slow. I've done some research already and will continue to do so, but I wanted to get some of your opinions. Right now, I've run the 'DBCC SHOWCONTIG' command and it reports the following for the first few system tables:

DBCC SHOWCONTIG scanning 'sysobjects' table...
Table: 'sysobjects' (1); index ID: 1, database ID: 6
TABLE level scan performed.
- Pages Scanned................................: 34
- Extents Scanned..............................: 12
- Extent Switches..............................: 33
- Avg. Pages per Extent........................: 2.8
- Scan Density [Best Count:Actual Count].......: 14.71% [5:34]
- Logical Scan Fragmentation ..................: 41.18%
- Extent Scan Fragmentation ...................: 83.33%
- Avg. Bytes Free per Page.....................: 2303.6
- Avg. Page Density (full).....................: 71.54%

DBCC SHOWCONTIG scanning 'sysindexes' table...
Table: 'sysindexes' (2); index ID: 1, database ID: 6
TABLE level scan performed.
- Pages Scanned................................: 72
- Extents Scanned..............................: 16
- Extent Switches..............................: 59
- Avg. Pages per Extent........................: 4.5
- Scan Density [Best Count:Actual Count].......: 15.00% [9:60]
- Logical Scan Fragmentation ..................: 50.00%
- Extent Scan Fragmentation ...................: 81.25%
- Avg. Bytes Free per Page.....................: 4184.9
- Avg. Page Density (full).....................: 48.30%

DBCC SHOWCONTIG scanning 'syscolumns' table...
Table: 'syscolumns' (3); index ID: 1, database ID: 6
TABLE level scan performed.
- Pages Scanned................................: 323
- Extents Scanned..............................: 50
- Extent Switches..............................: 299
- Avg. Pages per Extent........................: 6.5
- Scan Density [Best Count:Actual Count].......: 13.67% [41:300]
- Logical Scan Fragmentation ..................: 48.61%
- Extent Scan Fragmentation ...................: 96.00%
- Avg. Bytes Free per Page.....................: 4527.0
- Avg. Page Density (full).....................: 44.07%

DBCC SHOWCONTIG scanning 'systypes' table...
Table: 'systypes' (4); index ID: 1, database ID: 6
TABLE level scan performed.
- Pages Scanned................................: 1
- Extents Scanned..............................: 1
- Extent Switches..............................: 0
- Avg. Pages per Extent........................: 1.0
- Scan Density [Best Count:Actual Count].......: 100.00% [1:1]
- Logical Scan Fragmentation ..................: 100.00%
- Extent Scan Fragmentation ...................: 0.00%
- Avg. Bytes Free per Page.....................: 6712.0
- Avg. Page Density (full).....................: 17.07%

According to the DBCC SHOWCONTIG documentation, there should be no fragmentation at all. Some questions:
1. Would system performance be severely reduced by the above fragmentation (logical and extent)?
2. Can the 'DBCC INDEXDEFRAG(dbname, tablename, indexname)' command be issued against those system tables without consequences?
3. Is there some other command that can defrag the entire database without having to specify which tables?
Also, I have used the Index Tuning Wizard after a Profiler trace, but that failed with some unknown error.
That's it for now; please let me know if you have some info I could use to help speed up my database.
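On question 3 above, there is no single "defrag everything" command in SQL 2000, but a loop over the user tables gets close; system tables such as sysobjects generally cannot be rebuilt this way. A rough sketch (the undocumented sp_MSforeachtable procedure can do something similar):
DECLARE @tbl sysname
DECLARE tbl_cur CURSOR FOR
    SELECT name FROM sysobjects WHERE xtype = 'U'   -- user tables only
OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @tbl
WHILE @@FETCH_STATUS = 0
BEGIN
    -- rebuild every index on the table, keeping the existing fill factor (0)
    EXEC ('DBCC DBREINDEX ([' + @tbl + '], '''', 0)')
    FETCH NEXT FROM tbl_cur INTO @tbl
END
CLOSE tbl_cur
DEALLOCATE tbl_cur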
Hi guys,
I am asking this question on behalf of a friend. I have little knowledge of SQL 2005, but my friend is quite knowledgeable, although this is the first time he is dealing with a large database for a client. So here's the story.
His client has a database containing 1.5 million books. Now he is setting up a website which will enable users to search for books. Searching by ISBN is no problem, as it only takes 1 second. The problem is that searching by Title takes more than 20 seconds, which is unacceptable. My friend has only worked on smaller databases; he just recently thought of implementing indexing and is now looking for other ideas.
Each row contains book details such as Title, Author1, Author2, Author3, Publisher, Publication Date, ISBN, etc.
Can anyone who is more experienced with large databases share some design ideas? His client is aiming for 8 seconds or less.
Thanks in advance!
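Without seeing the actual schema, two hedged first steps for the title search: a plain nonclustered index on Title (which only helps searches anchored at the start of the title), and SQL Server 2005 full-text indexing for matching a word anywhere in the title. The table, index, and key names below are illustrative:
-- helps Title LIKE 'something%' searches, but not '%something%'
CREATE NONCLUSTERED INDEX IX_Books_Title ON Books (Title)

-- full-text search finds a word anywhere in the title without scanning 1.5 million rows
CREATE FULLTEXT CATALOG BookCatalog
CREATE FULLTEXT INDEX ON Books (Title) KEY INDEX PK_Books ON BookCatalog

SELECT Title, Author1, ISBN
FROM Books
WHERE CONTAINS(Title, '"database"')   -- example search term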
Dear All,
We have a procedure that takes 12 minutes to run on the first server, but that same procedure now takes 3 hours to run on the second server using the same data. Does anyone have any suggestions why this is happening and how to make the procedure faster on the second server?
Thanks in advance.
Jeff Magouirk
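A common first thing to check in a case like this (not something from the post, just a frequent cause of plan differences between servers) is stale statistics on the second server; something along these lines can confirm or rule it out. The procedure name below is a placeholder:
-- refresh statistics so the optimizer sees accurate row counts
EXEC sp_updatestats

-- then compare plans and I/O on both servers for the same call
SET STATISTICS IO ON
SET STATISTICS TIME ON
EXEC dbo.MyProcedure   -- placeholder for the real procedure and its parameters
SET STATISTICS IO OFF
SET STATISTICS TIME OFF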
Andy writes "I have data with more than 4 million rows. How can I speed up queries against it?"
Hello,
I need some opinions on how to sum up or group more than 2k records faster, e.g. how do I optimize this?
SELECT DISTINCT r.ClientID,c.ClientName, r.ItemID, r.StockID,r.StockName, r.ExpectedQty,r.QCQty,r.AVAQty,r.PNDQty as pnd, r.VMIQCQty,r.VMIAVAQty,r.VMIPNDQty as vmipnd,
(Select isnull( SUM(d.HoldQty) ,0) FROM tblItemdetail d WHERE d.itemid=r.itemid AND d.ConsignorID=@ClientID AND d.Ownerstatus='VMI') AS VMIPNDQty,
(Select isnull( SUM(d.HoldQty) ,0) FROM tblItemdetail d WHERE d.itemid=r.itemid AND d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='VMI') AS PNDQty,
(Select isnull(SUM(d.OriginQty - d.PickQty -d.HoldQty -d.qcqty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustAVAQty,
(Select isnull(SUM(d.HoldQty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustPNDQty,
(Select isnull(SUM(d.qcqty),0) FROM tblItemDetail d WHERE d.ConsignorID=@ClientID AND d.Ownership= i.Supplier AND d.Ownerstatus='OWN') AS StockAtCustQCQty
FROM tblItemCrossRef r
INNER JOIN tblClient c ON c.ClientID=r.ClientID
INNER JOIN tblItemClients i on i.Supplier=r.ClientID
WHERE r.ClientID=@ClientID AND r.StockID LIKE @StockID+'%'
~~~Focus on problem, not solution~~~
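One common rewrite for this pattern, sketched below for the PNDQty column only, is to aggregate tblItemDetail once in a derived table and join it, instead of running a correlated subquery per output column per row; the remaining columns can be folded in the same way, each with its own grouping keys. This is a sketch, since the right grouping depends on the real keys and indexes:
SELECT r.ClientID, c.ClientName, r.ItemID, r.StockID, r.StockName,
       ISNULL(d.PNDQty, 0) AS PNDQty
       -- ... remaining columns folded in the same way, each with its own grouping keys
FROM tblItemCrossRef r
INNER JOIN tblClient c      ON c.ClientID = r.ClientID
INNER JOIN tblItemClients i ON i.Supplier = r.ClientID
LEFT JOIN (
    -- aggregate the detail table once, not once per output column per row
    SELECT itemid, Ownership, SUM(HoldQty) AS PNDQty
    FROM tblItemDetail
    WHERE ConsignorID = @ClientID AND Ownerstatus = 'VMI'
    GROUP BY itemid, Ownership
) d ON d.itemid = r.ItemID AND d.Ownership = i.Supplier
WHERE r.ClientID = @ClientID AND r.StockID LIKE @StockID + '%'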
Hello all - Given the following UDF, in SQL 2000 can it be sped up, compiled, or anything of the like? A query returning 300,000+ rows times out when run through the UDF; inline CASE statements return the rows in 5 seconds.
Thanks!
Jeff

CREATE FUNCTION dbo.TimeFormat
(
    @input datetime,
    @groupformat varchar(20) -- DAY, WEEK, MONTH
)
RETURNS datetime
AS
BEGIN
    declare @dtvar as datetime
    if @groupformat = 'DAY'
        set @dtvar = CAST(CONVERT(char(10), @input, 101) AS datetime)
    else if @groupformat = 'WEEK'
        set @dtvar = CAST(DATEADD([DAY], 1 - DATEPART(dw, CONVERT(char(10), @input, 101)), CONVERT(char(10), @input, 101)) AS datetime)
    else if @groupformat = 'MONTH'
        set @dtvar = CAST(CONVERT(CHAR(6), @input, 112) + '01' AS datetime)
    return @dtvar
END
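For comparison, a sketch of what the inlined expression presumably looks like; a scalar UDF like the one above is invoked once per row, which is usually why inlining the same CASE logic is so much faster. The table and column names are placeholders:
SELECT CASE @groupformat
         WHEN 'DAY'   THEN CAST(CONVERT(char(10), SomeDateColumn, 101) AS datetime)
         WHEN 'WEEK'  THEN CAST(DATEADD([DAY], 1 - DATEPART(dw, CONVERT(char(10), SomeDateColumn, 101)), CONVERT(char(10), SomeDateColumn, 101)) AS datetime)
         WHEN 'MONTH' THEN CAST(CONVERT(CHAR(6), SomeDateColumn, 112) + '01' AS datetime)
       END AS GroupedDate
FROM SomeTable   -- SomeTable / SomeDateColumn / @groupformat stand in for the real report query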
View 9 Replies View RelatedThis is x-posted in:alt.php.sqlcomp.databases.ms-sqlservermicrosoft.public.sqlserver.programmingI have events that occur during the day. I want to be able to search thoseby a form with checkboxes (multiple select).Let's say for instance an event is happening from 3-10pm. When someonesearches for 4-6 (checkbox option) it needs to show up.I don't need code so much as I just need theory. My theory that I coded outand worked, just a missight in theory is as follows. I did a BETWEEN callthat pulled any event that began BETWEEN 4 AND 6 or ended BETWEEN 4 AND 6.As you can see. The event spans that time, but does not start or stopbetween 4 and 6, thus was not pulled. Ooops.So if someone call tell me of another function or perhaps just a better wayto use BETWEEN that would be great. I don't think that code is necessary atthis juncture, so save the 'Please post code' post :) Thanks.
Hello,
I am using sp_executesql to pass parameters to a SQL string, and I am seeing a deadlock between sp_prepexec, which does an UPDATE, and another UPDATE done by another process. When it comes to speed and deadlocks, would you recommend not using sp_executesql?
Hi,
Can anyone tell me a way to speed up these queries?
//This is selecting a number of records (sent by user) from a table and randomizing those
tempSQL.Text = "select top " + amount.Text + " number from [" + src.Text + "] Where pull='N' order by newID()";
SqlConnection conn2 = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2.Open();
SqlCommand cmd3 = new SqlCommand(tempSQL.Text, conn2);
cmd3.CommandTimeout = 1000;
SqlDataReader dr = cmd3.ExecuteReader();
//Then I open a data reader that uses the records
SqlConnection conn2a = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2a.Open();
while (dr.Read())
{
//the records are then placed 1 by one into a temp table
string fillresultID = "Insert into [" + src.Text + "_Additional_Temp] (number) Values('" + dr["number"] + "')";
SqlCommand cmd4 = new SqlCommand(fillresultID, conn2a);
cmd4.CommandTimeout = 0;
cmd4.ExecuteNonQuery();
//then the original table that held the numbers is marked as used(again one by one)
string update = "Update [" + src.Text + "] set pull='Y' where number='" + dr["number"] + "'";
SqlCommand cmd5 = new SqlCommand(update, conn2a);
cmd5.CommandTimeout = 0;
cmd5.ExecuteNonQuery();
}
dr.Close();
conn2.Close();
conn2a.Close();
Thanks,
Doug
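For comparison, a hedged sketch of the set-based equivalent in T-SQL: pick the random rows once, then do one insert and one update instead of a round trip per row. The bracketed table names stand in for the concatenated names in the code above, the number column is assumed to be an int, and the variable TOP form needs SQL Server 2005 (on SQL 2000, SET ROWCOUNT @amount achieves the same thing):
DECLARE @picked TABLE (number int)

-- remember which rows were randomly chosen
INSERT INTO @picked (number)
SELECT TOP (@amount) number
FROM [src]
WHERE pull = 'N'
ORDER BY NEWID()

-- one set-based insert instead of one INSERT per row
INSERT INTO [src_Additional_Temp] (number)
SELECT number FROM @picked

-- one set-based update instead of one UPDATE per row
UPDATE s
SET pull = 'Y'
FROM [src] s
INNER JOIN @picked p ON p.number = s.number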
Hi, how can I speed up this query? It seems to be taking a very long time to bring back the results:

--This stored procedure retrieves access rights for users
CREATE PROCEDURE wc_User_Access_Right_List
AS
SELECT dbo.tblRep.Rep_ID,
       RTRIM(dbo.tblRep.Rep_Forename) + ' ' + RTRIM(dbo.tblRep.Rep_Surname) AS User_Full_Name,
       dbo.tblAccessRight.Access_Right,
       dbo.tblAccessRight.Access_Right_ID
FROM dbo.tblRep
LEFT OUTER JOIN dbo.tblAccessRight ON dbo.tblRep.Access_Right_ID = dbo.tblAccessRight.Access_Right_ID
ORDER BY User_Full_Name
--Make sure this has saved, if not return 10 as this is an unexpected error
IF @@rowcount = 0 return 10
DECLARE @RETURN_VALUE tinyint
IF @@error <> 0 RETURN @@error
GO
I have a select statement that has many ANDs where one side uses the same thing, for example:
where
t1.column1=r3.other and
t1.column1=e5.new and
t1.column1=k9.old etc...
Is there any speed gained by putting the value for column1 into a variable and using that variable each time instead of t1.column1?
Thanks,
Eddie
I have indexed my SQL Server tables to gain some speed on calling up tables and queries (using VB and ADO). It is still very slow... Is there a move I have to make once my tables are indexed, or are there any tricks to improve the speed? I am getting kind of desperate right now :(
I am in the planning stages of a website and want to design my database to allow for optimum performance in case the site becomes popular. In my database I will have around 5 main categories, with each category having around 25 or 30 subcategories, and each subcategory having around 100 to 300 items. I plan to use MS SQL Server 7. Would it be best to have one large table, or 5 tables for the main categories, or around 100 tables, one for each subcategory? Database usage will be simple (no complex queries), but most pages will hit the database and I need to allow for a potential of 4 or 5 million page views per month. The SQL Server will be a shared one.
Randall
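For what it is worth, the usual design for this kind of catalog is a single items table keyed by subcategory, with an index on the lookup column, rather than one table per subcategory. A sketch with made-up names and types:
CREATE TABLE Categories    (CategoryID int PRIMARY KEY, Name varchar(100))
CREATE TABLE Subcategories (SubcategoryID int PRIMARY KEY, CategoryID int REFERENCES Categories(CategoryID), Name varchar(100))
CREATE TABLE Items         (ItemID int PRIMARY KEY, SubcategoryID int REFERENCES Subcategories(SubcategoryID), Name varchar(200))

-- the page query only ever filters on SubcategoryID, so index it
CREATE INDEX IX_Items_Subcategory ON Items (SubcategoryID)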
Howdy,
Has anyone got any tricks for speeding up storing and retrieving image data? I have a typical 2 MB image and I am trying to improve speed... it now takes 8 seconds to store the blob.
thanks
Dick Butler
I have migrated a database from MS Access to MS SQL, but I am encountering a problem with the website loading speed. It was running much faster with MS Access than it is running with MS SQL.
Any inputs or thoughts are appreciated.
Treat it as urgent.
We are currently running Backup Exec v7.0 and are backing up two SQL 6.5 servers on our network. The backups have begun taking excessive amounts of time considering the limited amounts of data on the servers. Does anyone have an idea of parameters we could check on the servers, or other settings that should be checked in Backup Exec? Also, this problem seems to have surfaced after we loaded Network Associates VirusScan NT onto our NT Servers and Workstations. Has anyone had problems with this product on their SQL servers?
Hi
I've recently been given the task of maintaining a website that runs off MSSQL using VB and ASP.NET. One of the main tasks is to improve the MSSQL access time. I am fairly new to MSSQL; can anyone give me some insight into the procedure for improving access time to MSSQL? Obviously the SQL query plays a big part, but what about database config, ASP/VB.NET commands, or design patterns (for pulling and joining large amounts of data)?
Any suggestions or pointer will be greatly appreciated.
I have a single UPDATE statement that has been running for 13+ hours and I have no idea of when it will complete.
Recipients has 80 million records
UpdatedStagingRecipients has 34 million records.
Why would this possibly take so long? Is there anything that I can do at all?
UPDATE Recipients
SET Recipients.First = UpdatedStagingRecipients.First
, Recipients.Last = UpdatedStagingRecipients.Last
, Recipients.StreetAddress = UpdatedStagingRecipients.StreetAddress
, Recipients.City = UpdatedStagingRecipients.City
, Recipients.State = UpdatedStagingRecipients.State
, Recipients.Postal = UpdatedStagingRecipients.Postal
, Recipients.Country = UpdatedStagingRecipients.Country
, Recipients.DOB = UpdatedStagingRecipients.DOB
, Recipients.Obscene = UpdatedStagingRecipients.Obscene
, Recipients.Gender = UpdatedStagingRecipients.Gender
, Recipients.IPv4 = UpdatedStagingRecipients.IPv4
, Recipients.NameSourceID = UpdatedStagingRecipients.NameSourceID
, Recipients.NameLine = UpdatedStagingRecipients.NameLine
, Recipients.AddressSourceID = UpdatedStagingRecipients.AddressSourceID
, Recipients.AddressLine = UpdatedStagingRecipients.AddressLine
, Recipients.RecordCreationSourceID = UpdatedStagingRecipients.RecordCreationSourceID
FROM Recipients INNER JOIN UpdatedStagingRecipients ON (Recipients.UserName = UpdatedStagingRecipients.UserName AND Recipients.DomainID = UpdatedStagingRecipients.DomainID)
TRUNCATE TABLE UpdatedStagingRecipients
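Beyond checking for blocking, one hedged way to keep an update of this size from running as a single enormous transaction is to apply it in batches driven off the staging table (which the script truncates afterwards anyway). The sketch below assumes (UserName, DomainID) uniquely identifies staging rows and that both tables are indexed on those columns; the data types in the work table are guesses:
CREATE TABLE #batch (UserName varchar(255), DomainID int)   -- types are guesses

WHILE EXISTS (SELECT 1 FROM UpdatedStagingRecipients)
BEGIN
    -- take one batch of keys
    INSERT INTO #batch (UserName, DomainID)
    SELECT TOP 100000 UserName, DomainID FROM UpdatedStagingRecipients

    -- apply just that batch (same column list as the original UPDATE)
    UPDATE r
    SET r.First = u.First, r.Last = u.Last /* , ... remaining columns ... */
    FROM Recipients r
    INNER JOIN UpdatedStagingRecipients u ON r.UserName = u.UserName AND r.DomainID = u.DomainID
    INNER JOIN #batch b ON b.UserName = u.UserName AND b.DomainID = u.DomainID

    -- remove the processed rows so the loop makes progress
    DELETE u
    FROM UpdatedStagingRecipients u
    INNER JOIN #batch b ON b.UserName = u.UserName AND b.DomainID = u.DomainID

    TRUNCATE TABLE #batch
END

DROP TABLE #batch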
It is working, but it takes about 3-4 seconds per execution.
CREATE PROCEDURE isp_ap_calc_apt_totals
@p_comp char(2),
@p_vend char(6),
@p_asofdate char(8)
as
if (@p_asofdate <= '00000000')
begin
set @p_asofdate = '99999999'
end
delete from XAPAPTTOT
where xapt_comp = @p_comp and xapt_vend = @p_vend and xapt_asof_date = @p_asofdate
insert into XAPAPTTOT
select apph_comp, apph_vend, apph_type, apph_id, @p_asofdate,
sum(apph_paymnts),
sum(apph_discts),
sum(apph_adjts),
count(apph_paymnts),
sum(apph_paymnts)+ sum(apph_discts) + sum(apph_adjts) +
b.apt_gross,
0,
max(str(yy,4) + replace(str(mm,2),' ','0') + replace(str(dd,2),' ','0'))
from APPHISTF a join APTRANF b on b.apt_comp = a.apph_comp and b.apt_vend = a.apph_vend and b.apt_type = a.apph_type and b.apt_id = a.apph_id
where ((a.apph_comp = @p_comp) and (a.apph_vend = @p_vend) and (a.apph_unpost_dt = 0)
and (str(a.yy,4) + replace(str(a.mm,2),' ','0') + replace(str(a.dd,2),' ','0') <= @p_asofdate))
or ((a.apph_unpost_dt > 0 and a.apph_unpost_dt <= @p_asofdate and b.apt_unposted_fg = 1 and b.apt_comp = @p_comp and b.apt_vend = @p_vend and b.apt_type = a.apph_type and b.apt_id = a.apph_id))
or (((str(a.yy,4) + replace(str(a.mm,2),' ','0') + replace(str(a.dd,2),' ','0') <= @p_asofdate) and a.apph_unpost_dt > @p_asofdate and b.apt_comp = @p_comp and b.apt_vend = @p_vend and b.apt_type = a.apph_type and b.apt_id = a.apph_id))
group by apph_comp, apph_vend, apph_type, apph_id
update XAPAPTTOT
set xapt_last_payck =
(select max(apph_payck) from APPHISTF
where apph_comp = xapt_comp and apph_vend = xapt_vend and apph_type = xapt_type
and apph_id = xapt_id
and str(yy,4) + replace(str(mm,2),' ','0') + replace(str(dd,2),' ','0') = xapt_last_paydt )
where xapt_comp = @p_comp and xapt_vend = @p_vend and xapt_asof_date = @p_asofdate
GO
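One thing that stands out in this procedure is that the str(yy,4) + replace(str(mm,2), ...) expression is rebuilt for every row and cannot use an index. A hedged sketch of one way around that, assuming APPHISTF can be altered, that yy/mm/dd are ordinary numeric columns, and that the usual computed-column SET options are in place so the column can be indexed:
-- the same date key the procedure keeps rebuilding, expressed once as a computed column
ALTER TABLE APPHISTF ADD apph_date_key AS
    (str(yy, 4) + replace(str(mm, 2), ' ', '0') + replace(str(dd, 2), ' ', '0'))

-- lets the WHERE clause compare apph_date_key <= @p_asofdate against an index
CREATE INDEX IX_APPHISTF_date_key ON APPHISTF (apph_comp, apph_vend, apph_date_key)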
I'm not sure if this is the right forum, but I have a general question about running/storing databases. I have been running a process with 60+ million records in one table and another 16 million in another table and it is taking forever to get everything imported in and run the appropriate queries. I've been doing this all on a desktop and I am anxious to learn of a more efficient, faster method of processing this amount of data.
What solution should I pursue if I am doing this work a few times a year so that it doesn't take three full days of processing to reach an answer with the data?
Thanks.
Has anyone started using encryption yet? Is it noticeably slower than not using it? For the record, I'm not referring to column encryption, but to "network" encryption (not sure what else to call it), where traffic is encrypted between SQL Server and the client.
Hi Everyone -
We were discussing select speeds the other day, and the question comes down to: what is faster, a standard query which specifies the search criteria, i.e.
select * from xyz where a='A' or a='B'
or is IN a better way to go:
select * from xyz where a in ('A', 'B')
We ran the tests in Performance Analyzer, but the results were the same... Maybe I'm smoking the stuff, but I thought the usage of IN causes a full table scan for each of the items in the IN clause.
Please advise.
Take care,
tony
This may be more appropriate on another forum, but wanted to see if anyone could give me any ideas.
We have SQL 7 on NT4. We are restricted from upgrading for a couple of particular reasons, but things nevertheless run fairly stable.
We have roughly 100 - 140 simultaneous connections.
Some users have very slow access times while others have fast connection times. This is not restricted to DB access; it also affects file transfers.
Just like you can test your DSL/Cable speeds, are there any similar tests that can be ran on client machines to find their access speeds to the SQL database server?
I really appreciate your suggestions!
I need to speed this process up.
Insert into dbo.temp_table
(record_number, etc....)
select record_number, etc....
from
dbo.incoming_temp
SET IDENTITY_INSERT temp_table ON
It's about 10 million records and usually runs 2-3 hours. Any ideas?
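A few hedged notes on loads of this size, shown as a sketch rather than a guaranteed fix: spell out IDENTITY_INSERT and switch it on before the insert, take a table lock if nothing else needs the table during the load, and consider dropping nonclustered indexes on the target beforehand and recreating them afterwards:
SET IDENTITY_INSERT dbo.temp_table ON

INSERT INTO dbo.temp_table WITH (TABLOCK)   -- a table lock avoids taking millions of row locks
    (record_number /* , etc. */)
SELECT record_number /* , etc. */
FROM dbo.incoming_temp

SET IDENTITY_INSERT dbo.temp_table OFF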
Hello,
I am working with some third-party software and trying to rebuild my own queries for my own reports, but using the business logic that is behind their queries so that I can be sure the numbers still match up and are usable. The reports in the third-party software seem to be built on the fly, as I am finding when I run SQL Profiler, because there are no stored procedures. I have a query which returns a basic result set, but it takes 52 seconds to get it back, and I feel that is way too long for such a simple result set of returning a count by location. The query creates a few temporary tables and combines and updates some data. I want to update this query to make it faster but am not sure of the best approach to take. The temporary tables create a lot of columns that I am not even going to need in the final result set or the update statements, so I thought removing some would help. I am going to paste in the original query; if anyone has any ideas to make this more up to date with SQL 2005 standards and just faster, please feel free to let me know. Thanks! :)
CREATE TABLE #FI28EF3WE3DN
(practice_id char(4) null, location_id uniqueidentifier null, rendering_id uniqueidentifier null, service_item_id char(12) null, service_item_lib_id uniqueidentifier null, service_item_desc varchar(80) null, cpt4_code_id char(12) null, charge_amt numeric(19,2) null, quantity dec(19,2) null, begin_date_of_service char(8) null, closing_date char(8) null, create_timestamp datetime null, rvu1 float null, rvu1_total float null, rvu2 float null, rvu2_total float null, rvu3 float null, rvu3_total float null, rvu4 float null, rvu4_total float null, end_date_of_service char(8) null, midlevel_id uniqueidentifier null, unit_price numeric(19,2) null, override_ind char(1) null, modifier_1 char(2) null, modifier_2 char(2) null, modifier_3 char(2) null, modifier_4 char(2) null, link_id uniqueidentifier null, source_id uniqueidentifier null, enc_nbr numeric null, source_type char(1) null, name varchar(150) null, person_id uniqueidentifier null, date_of_birth char(8) null, patient_age char(8) null, address_line_1 varchar(55) null, address_line_2 varchar(55) null, city varchar(35) null, state varchar(3) null, zip char(9) null, country_id uniqueidentifier null, county_id uniqueidentifier null, county_desc varchar(100) null, sex char(1) null, sex_desc varchar(30) null, batch_info varchar(40) null, override_closing_date char(8) null, ud_demo1_id uniqueidentifier null, ud_demo2_id uniqueidentifier null, ud_demo3_id uniqueidentifier null, ud_demo4_id uniqueidentifier null, ud_demo5_id uniqueidentifier null, ud_demo6_id uniqueidentifier null, ud_demo7_id uniqueidentifier null, ud_demo8_id uniqueidentifier null, ud_demo9_id uniqueidentifier null, ud_demo10_id uniqueidentifier null, icd9cm_code_id varchar(10) null, icd9cm_code_id_2 varchar(10) null, icd9cm_code_id_3 varchar(10) null, icd9cm_code_id_4 varchar(10) null, icd9cm_code_desc varchar(255) null, icd9cm_code_desc_2 varchar(255) null, icd9cm_code_desc_3 varchar(255) null, icd9cm_code_desc_4 varchar(255) null, quadrant char(2) null, tooth char(2) null, surface_description varchar(200) null, form char(4) null, fin_class_id uniqueidentifier, person_payer_id uniqueidentifier, department_id uniqueidentifier, outsource_ind char(1) null, outsource_date char(8) null, outsource_agency_id uniqueidentifier, outsource_agency_desc varchar(40) null, fqhc_enc_ind char(1) null, fin_class varchar(100) null, payer_name varchar(40) null, person_nbr char(12) null, pat_prov_name_1 varchar(75), pat_prov_name_2 varchar(75), pat_prov_name_3 varchar(75), pat_prov_name_4 varchar(75), pat_prov_name_5 varchar(75), pat_prov_name_6 varchar(75), pat_prov_name_7 varchar(75), pat_prov_name_8 varchar(75), pat_prov_name_9 varchar(75), pat_prov_name_10 varchar(75), pat_prov_name_11 varchar(75), pat_prov_name_12 varchar(75))
INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, fin_class_id, person_payer_id, outsource_ind, outsource_date, outsource_agency_id, fqhc_enc_ind)
SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, p.financial_class, pe.cob1_person_payer_id, outsource_ind, outsource_date, outsource_agency_id, pe.fqhc_enc_ind
FROM charges c, patient_encounter pe, payer_mstr p
WHERE source_type = 'V' and c.practice_id = pe.practice_id and c.source_id = pe.enc_id and pe.cob1_payer_id = p.payer_id and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'
INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, outsource_ind, outsource_date, outsource_agency_id, fqhc_enc_ind) SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, outsource_ind, outsource_date, outsource_agency_id, pe.fqhc_enc_ind FROM charges c, patient_encounter pe
WHERE source_type = 'V' and c.practice_id = pe.practice_id and c.source_id = pe.enc_id and pe.cob1_person_payer_id is null and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'
INSERT into #FI28EF3WE3DN
(practice_id, source_id, source_type, location_id, rendering_id, service_item_id, service_item_lib_id, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1_total, rvu2_total, rvu3_total, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, person_id, override_closing_date, batch_info, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, outsource_ind)
SELECT c.practice_id, c.source_id, c.source_type, c.location_id, c.rendering_id, c.service_item_id, c.service_item_lib_id, c.cpt4_code_id, amt, quantity, begin_date_of_service, closing_date, c.create_timestamp, 0.0000, 0.0000, 0.0000, 0.0000, end_date_of_service, c.modifier_1, c.modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, c.person_id, override_closing_date, batch_info, c.icd9cm_code_id, c.icd9cm_code_id_2, c.icd9cm_code_id_3, c.icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, c.form, c.outsource_ind FROM charges c
WHERE source_type = 'I' and begin_date_of_service >= '20080401' and begin_date_of_service < '20080501' and (c.link_id is null) AND c.practice_id = '0001'
UPDATE #FI28EF3WE3DN
SET name = isnull(per.last_name,'') + ', ' + isnull(per.first_name,'') + ' ' + isnull(per.middle_name,''), date_of_birth = per.date_of_birth, patient_age = per.date_of_birth, city = per.city, zip = per.zip, country_id = per.country_id, sex = per.sex, address_line_1 = per.address_line_1, address_line_2 = per.address_line_2, state = per.state, county_id = per.county_id, person_nbr = per.person_nbr
FROM person per
WHERE #FI28EF3WE3DN.person_id = per.person_id
UPDATE #FI28EF3WE3DN
SET name = emp.name, city = emp.city, zip = emp.zip, address_line_1 = emp.address_line_1, address_line_2 = emp.address_line_2, state = emp.state, county_id = emp.county_id
FROM invoices inv, accounts ac, employer_mstr emp
WHERE #FI28EF3WE3DN.source_type = 'I' and #FI28EF3WE3DN.practice_id = inv.practice_id and #FI28EF3WE3DN.source_id = inv.invoice_id and inv.practice_id = ac.practice_id and inv.acct_id = ac.acct_id and ac.guar_type = 'E' and ac.guar_id = emp.employer_id
CREATE TABLE #GP28EF3WE3DQ
(practice_id char(4) null, location_id uniqueidentifier null, practice_name varchar(40) null, location_desc varchar(40) null, rendering_id uniqueidentifier null, attend_first_name varchar(25) null, attend_middle_name varchar(25) null, attend_last_name varchar(25) null, service_item_id char(12) null, service_item_desc varchar(80) null, cpt4_code_id char(12) null, cpt4_desc varchar(50) null, charge_amt numeric(19,2) null, quantity dec(19,2) null, begin_date_of_service char(8) null, closing_date char(8) null, create_timestamp datetime null, rvu1 float null, rvu1_total float null, rvu2 float null, rvu2_total float null, rvu3 float null, rvu3_total float null, rvu4 float null, rvu4_total float null, end_date_of_service char(8) null, midlevel_id uniqueidentifier null, unit_price numeric(19,2) null, override_ind char(1) null, modifier_1 char(2) null, modifier_2 char(2) null, modifier_3 char(2) null, modifier_4 char(2) null, link_id uniqueidentifier null, mid_last_name varchar(25) NULL, mid_first_name varchar(25) NULL, mid_middle_name varchar(25) NULL, link_ind char(1) null, name varchar(150) null, date_of_birth char(8) null, patient_age char(8) null, address_line_1 varchar(55) null, address_line_2 varchar(55) null, city varchar(35) null, state varchar(3) null, zip char(9) null, country_id uniqueidentifier null, county_id uniqueidentifier null, county_desc varchar(100) null, sex char(1) null, sex_desc varchar(30) null, batch_info varchar(40) null, override_closing_date char(8) null, location_subgrouping1_id uniqueidentifier null, location_subgrouping1_desc varchar(100) null, location_subgrouping2_id uniqueidentifier null, location_subgrouping2_desc varchar(100) null, provider_subgrp1_id uniqueidentifier null, provider_subgrp1_desc varchar(100) null, provider_subgrp2_id uniqueidentifier null, provider_subgrp2_desc varchar(100) null, ud_demo1_id uniqueidentifier null, ud_demo1 char(100) null, ud_demo2_id uniqueidentifier null, ud_demo2 char(100) null, ud_demo3_id uniqueidentifier null, ud_demo3 char(100) null, ud_demo4_id uniqueidentifier null, ud_demo4 char(100) null, ud_demo5_id uniqueidentifier null, ud_demo5 char(100) null, ud_demo6_id uniqueidentifier null, ud_demo6 char(100) null, ud_demo7_id uniqueidentifier null, ud_demo7 char(100) null, ud_demo8_id uniqueidentifier null, ud_demo8 char(100) null, ud_demo9_id uniqueidentifier null, ud_demo9 char(100) null, ud_demo10_id uniqueidentifier null, ud_demo10 char(100) null, icd9cm_code_id varchar(10) null, icd9cm_code_id_2 varchar(10) null, icd9cm_code_id_3 varchar(10) null, icd9cm_code_id_4 varchar(10) null, icd9cm_code_desc varchar(100) null, icd9cm_code_desc_2 varchar(100) null, icd9cm_code_desc_3 varchar(100) null, icd9cm_code_desc_4 varchar(100) null, quadrant char(2) null, tooth char(2) null, form char(4) null, quad_desc varchar(200) null, tooth_desc varchar(200) null, surface_description varchar(200) null, form_desc varchar(200) null, department_id uniqueidentifier, fin_class varchar(100) null, payer_name varchar(40) null, department varchar(100) null, outsource_ind char(1) null, outsource_date char(8) null, outsource_agency_desc varchar(40) null, fqhc_enc_ind char(1) null, person_nbr char(12) null, pat_prov_name_1 varchar(75), pat_prov_name_2 varchar(75), pat_prov_name_3 varchar(75), pat_prov_name_4 varchar(75), pat_prov_name_5 varchar(75), pat_prov_name_6 varchar(75), pat_prov_name_7 varchar(75), pat_prov_name_8 varchar(75), pat_prov_name_9 varchar(75), pat_prov_name_10 varchar(75), pat_prov_name_11 varchar(75), pat_prov_name_12 varchar(75))
INSERT into #GP28EF3WE3DQ
(practice_id, location_id, rendering_id, service_item_id, service_item_desc, cpt4_code_id, charge_amt, quantity, begin_date_of_service, closing_date, create_timestamp, rvu1, rvu1_total, rvu2, rvu2_total, rvu3, rvu3_total, rvu4, rvu4_total, end_date_of_service, modifier_1, modifier_2, modifier_3, modifier_4, midlevel_id, unit_price, override_ind, link_id, name, date_of_birth, patient_age, address_line_1, address_line_2, city, state, zip, country_id, county_id, sex, override_closing_date, batch_info, ud_demo1_id, ud_demo2_id, ud_demo3_id, ud_demo4_id, ud_demo5_id, ud_demo6_id, ud_demo7_id, ud_demo8_id, ud_demo9_id, ud_demo10_id, icd9cm_code_id, icd9cm_code_id_2, icd9cm_code_id_3, icd9cm_code_id_4, icd9cm_code_desc, icd9cm_code_desc_2, icd9cm_code_desc_3, icd9cm_code_desc_4, quadrant, tooth, surface_description, form, department_id, outsource_ind, outsource_date, outsource_agency_desc, fqhc_enc_ind, fin_class, payer_name, person_nbr, pat_prov_name_1, pat_prov_name_2, pat_prov_name_3, pat_prov_name_4, pat_prov_name_5, pat_prov_name_6, pat_prov_name_7, pat_prov_name_8, pat_prov_name_9, pat_prov_name_10, pat_prov_name_11, pat_prov_name_12)
SELECT practice_id, location_id, rendering_id, null , null , null , sum(isnull(charge_amt,0)), sum(isnull(quantity,0)), begin_date_of_service, null , null , null , sum(isnull(rvu1_total,0)), null , sum(isnull(rvu2_total,0)), null , sum(isnull(rvu3_total,0)), null , sum(isnull(rvu4_total,0)), null , null , null , null , null , null , null , null , null , name, date_of_birth, null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null , null
FROM #FI28EF3WE3DN group by practice_id, location_id, rendering_id, begin_date_of_service, name, date_of_birth
UPDATE #GP28EF3WE3DQ
SET attend_first_name = pm.first_name, attend_middle_name = pm.middle_name, attend_last_name = pm.last_name, provider_subgrp1_id = pm.provider_subgrouping1_id, provider_subgrp2_id = pm.provider_subgrouping2_id FROM provider_mstr pm
WHERE #GP28EF3WE3DQ.rendering_id = pm.provider_id
UPDATE #GP28EF3WE3DQ
SET practice_name = p.practice_name FROM practice p
WHERE #GP28EF3WE3DQ.practice_id = p.practice_id
UPDATE #GP28EF3WE3DQ
SET location_desc = l.location_name, location_subgrouping1_id = l.location_subgrouping1_id, location_subgrouping2_id = l.location_subgrouping2_id
FROM location_mstr l
WHERE #GP28EF3WE3DQ.location_id = l.location_id
UPDATE #GP28EF3WE3DQ SET link_ind = 'Y'
WHERE link_id is not null
UPDATE #GP28EF3WE3DQ
SET #GP28EF3WE3DQ.department = ml.mstr_list_item_desc
FROM mstr_lists ml
WHERE #GP28EF3WE3DQ.department_id = ml.mstr_list_item_id and mstr_list_type = 'department'
UPDATE #GP28EF3WE3DQ
SET county_desc = ml.mstr_list_item_desc
FROM mstr_lists ml
WHERE #GP28EF3WE3DQ.county_id = ml.mstr_list_item_id and ml.mstr_list_type = 'county'
select location_desc,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null,null, count(*) from #GP28EF3WE3DQ GROUP BY location_desc Order By location_desc
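On the "SQL 2005 standards" point, the old-style comma joins in the first INSERT can at least be written as explicit joins; this does not by itself make the query faster, but it makes the join and filter logic easier to verify and tune. A sketch of just that FROM/WHERE portion (the table prefixes on the previously unqualified columns are assumptions):
FROM charges c
INNER JOIN patient_encounter pe ON c.practice_id = pe.practice_id AND c.source_id = pe.enc_id
INNER JOIN payer_mstr p ON pe.cob1_payer_id = p.payer_id
WHERE c.source_type = 'V'
  AND c.begin_date_of_service >= '20080401'
  AND c.begin_date_of_service < '20080501'
  AND c.link_id IS NULL
  AND c.practice_id = '0001'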
Hello all,
I am quite astonished by the following (recplus has 4.2 million records).
if I do this :
select sum (quantityInKg) from recplus where element_ID in (160,
161,
162,
164,
165,
166,
1756,
21707,
22052,
22063)
it takes about 2 seconds.
If I do this
select sum (quantityInKg) from recplus where element_ID in (select element_2_id from element_element where element_id = '159')
it takes 1 second. (select element_2_id from element_element where element_id = '159' gives the same list as in the previous query)
Now, why is there a speed difference, or rather, why is it this way around? I would have expected it to be the other way around!
Thanks for your feedbacks.
Regards,
Fabianus
my favorit hoster is ASPnix : www.aspnix.com !
Hello all
I am running a query where the same table is joined 4 times. Because of this, the number of records goes into the millions, and it takes more than 5 minutes for the query to run. I am trying to optimize the query.
Could anybody please help me in this regard?
Thank You
Hello,
I have a question regarding SQL LiteSpeed, which is primarily used for backup setups.
Is there any document I can get, or help threads I can read, to learn how to use SQL LiteSpeed?
Thanks.
Vasanth David P.
Hi,
The ssis package takes 5 seconds to complete when run from the designer.
This ssis package is now triggered from within a scheduled job which takes 30 seconds to complete.
I do not see why there is a difference of 25 seconds.
Any thoughts please?
Thanks
Ok, this script is something I wrote for bringing up a report in Reporting Services and it is really slow... Is there any problem with it, or is there better syntax to speed it up and still provide the same report results?

SELECT t1.MemberId, t1.PeriodID, t8.start_date, t6.amount_type_id, t6.amount_type,
SUM(CASE WHEN t2.amountTypeId = 7 THEN t2.amount WHEN t2.amountTypeId = 23 THEN - t2.amount END) AS Purchase,
SUM(CASE WHEN t2.amountTypeId = 8 THEN t2.amount WHEN t2.amountTypeId = 24 THEN - t2.amount END) AS Matrix,
SUM(CASE WHEN t2.amountTypeId = 20 THEN t2.amount WHEN t2.amountTypeId = 21 THEN - t2.amount END) AS QualiFly,
SUM(CASE WHEN t2.amountTypeId = 9 THEN t2.amount WHEN t2.amountTypeId = 25 THEN - t2.amount END) AS Dist,
SUM(CASE WHEN t2.amountTypeId = 10 THEN t2.amount WHEN t2.amountTypeId = 26 THEN - t2.amount END) AS SM,
SUM(CASE WHEN t2.amountTypeId = 11 THEN t2.amount WHEN t2.amountTypeId = 27 THEN - t2.amount END) AS BreakAway,
SUM(CASE WHEN t2.amountTypeId = 13 THEN t2.amount WHEN t2.amountTypeId = 14 THEN - t2.amount END) AS Transfer,
SUM(CASE WHEN t2.amountTypeId = 28 THEN t2.amount WHEN t2.amountTypeId = 15 THEN - t2.amount END) AS Spent
FROM tblTravelDetail t1
INNER JOIN tblTravelDetailAmount t2 ON t1.TravelDetailId = t2.TravelDetailId
INNER JOIN tblTravelDetail t3 ON t2.TravelDetailId = t3.TravelDetailId
INNER JOIN tblTravelDetailMember t4 ON t3.TravelDetailId = t4.TravelDetailId
INNER JOIN tblTravelEvent t5 ON t1.TravelEventId = t5.TravelEventId
INNER JOIN amount_type t6 ON t2.amountTypeId = t6.amount_type_id
INNER JOIN period t8 ON t1.PeriodID = t8.period_id
WHERE (t1.MemberId = @MemberId) AND (t2.amount <> 0)
GROUP BY t1.MemberId, t1.PeriodID, t8.start_date, t6.amount_type_id, t6.amount_type

Any help is appreciated.
Thanks,
Trint