Report Running Very Slow Compared To Query Analyzer - High TimeDataRetrieval

Jun 26, 2007

Hi,



I have a report in SQL Reporting Services 2005 that calls a stored procedure, and the report takes a very long time to run and sometimes returns zero records. But when I run the stored procedure in Query Analyzer it takes about 4 seconds!



I have checked the execution log on the RS server using the SQL below:






Code Snippet

use ReportServer

Select * from ExecutionLog with (nolock) order by TimeStart DESC



It shows a very large TimeDataRetrieval value (601309 ms, about 10 minutes) and no records returned, most likely because of a query timeout:



TimeDataRetrieval  TimeProcessing  TimeRendering  Source  Status     ByteCount  RowCount
601309             2227            3              1       rsSuccess  4916       0



The weird thing is that when I run it in Query Analyzer, I get about 400 records in 4 seconds!



I don't understand what RS is doing that takes up so much time retrieving the data.



The report is very simple - it basically returns the records straight out into a table.



The only thing I somewhat suspected was a parameter data type conflict between RS and SQL Server, specifically with dates. I have a start date and an end date parameter in the report - I tried specifying them as both date and string to see if it made any difference, but it didn't.
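For what it's worth, the pattern most often suggested when a procedure is fast in Query Analyzer but slow when called from RS is to copy the report parameters into local variables inside the procedure, so the cached plan is not built around one particular pair of date values (parameter sniffing). This is only a hedged sketch - the procedure, table and column names below are made up, not taken from the actual report:

Code Snippet

CREATE PROCEDURE dbo.usp_MyReport        -- hypothetical procedure name
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    -- Copy the parameters into local variables so the plan is compiled
    -- for "unknown" values rather than the specific dates RS passed in.
    DECLARE @Start datetime, @End datetime
    SET @Start = @StartDate
    SET @End   = @EndDate

    SELECT *
    FROM dbo.MyReportTable               -- hypothetical table
    WHERE SomeDate BETWEEN @Start AND @End
END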



Any help would be greatly appreciated.

View 19 Replies



Slow Running Query With A High Lob Logical Reads, Lob Data

Jun 1, 2007

We have had some slowdown complaints lately, and this query seems to be the culprit almost every single time. The estimated execution plan is a clustered index seek, as there is a clustered index on the UidCustomerID column, and SET STATISTICS PROFILE ON shows that every time it executes it does an index seek.

A Profiler session showed a huge number of reads for these queries, anywhere from 1,500 to 50,000 depending on the value being looked up. With SET STATISTICS IO ON, the culprit is LOB logical reads; everything else is 0 or very low. In this case LOB logical reads is over 1,700.

Three of the columns in the select list are text columns. When I take them out of the query, the LOB logical reads drop to 0, and they go up incrementally as I add each column back in.

Is there any way to improve the performance without changing the data types to varchar(max)?


select SID,Last_name,Name_2,First_name,Middle_initial,Descriptives,Telephone_number,mainline,Residence,ADL,
DID_number,Svce_street,Svce_town,Svce_state,Svce_appt,Mailing_street,Mailing_town,Mailing_state,Mailing_appt,
Mailing_zip,Listing,Addl_listing,Published,Listed,Gold_number,PIN,status,SSnumber,tax_jurisdiction,
Bill_date,Past_balance,Service_start_date,Service_end_date,LOA,FCC_type,Line_type,I_W,Jacks,Voice_messaging,
vms_ring_cycles,CCS,phonesmarts,ringmate,voice_dialing,Bill_detail,Contact_Number,Contact_extension,
Best_Time,suspend,suspend_start,suspend_end,credits_allowed,credits_granted,home_region,Calling_Plan,Local_Plan,
Local_Plan_Rate,Flat_Rate,Sales_agent,Community,Building_Mgmt,How_Heard,Incentive_1,Incentive_1a,Incentive_1b,
Incentive_1c,Incentive_2,Incentive_2a,Incentive_2b,Incentive_2c,Incentive_3,Incentive_3a,Incentive_3b,
Incentive_3c,block_operator,block_collect,block_group,block_adult,block_call_return,block_repeat_dialing,
block_call_trace,block_caller_id,block_anonymous,block_all_high_toll,block_regional_and_ld,block_DA_Call_Completion,
block_DA,block_3rd_party,bank,prepayment,dial_around_number,custid,waive_interest,Financial_Treatment,
Other_Feature_1_code,Other_Feature_1_rate,Other_Feature_2_code,Other_Feature_2_rate,Other_Feature_3_code,
Other_Feature_3_rate,Other_Feature_4_code,Other_Feature_4_rate,Partial_Account,mail_date,snp_1_date,snp_2_date,
terminate_date,snp1notified,snp1peak,snp1offpeak,snp2notified,snp2peak,snp2offpeak,avg_days_paid,Pulled_Ld,SNP1,
SNP2,Treatment,Collections,Installment,Nynex_BTN,LD_rate,local_discount,to_month,rounds_up,full_package_made,
local_made,PIC,LPIC,tax_exempt_local,tax_exempt_federal,CommissionedAgent,LDRateID,UidCustomerId,
accVchLineClassUSOC,block_Inter_Reg_LD,block_international,block_DA_3rd_Collect,block_DH2,block_ISP_2,block_ISP_3,
block_ISP4_3_GBAS,block_ISP3_3_GBAS,block_collect_only,block_LD_Reg_DA,block_usage_based,block_ISP5_3_GBAS,
block_ISP5_2_GBAS,block_group_adult,csr_PIC,csr_LPIC,csr_SA,csr_exception,cutover_status,cutover_datetime,
OutsideAgent,prfVchAttributes,uidResellerID,Category,uidDealID
from profiles where UidCustomerID in (352199267)
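One option sometimes suggested for this situation, which keeps the text data types but stores short values on the data page itself instead of on separate LOB pages, is the 'text in row' table option. A hedged sketch only - whether it helps depends on how large the text values actually are, and the 7000-byte limit here is just an example:

-- Values up to 7000 bytes are kept in the row, so reading them no longer
-- costs extra LOB logical reads; longer values still go to LOB pages.
EXEC sp_tableoption 'profiles', 'text in row', '7000'

Selecting only the text columns a given screen really needs is the other obvious lever, since the LOB reads track the columns in the select list.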

View 3 Replies View Related

SQL Server Running Slow On A High End System

Mar 7, 2004

I have about a 447 MB SQL server 2000 database on a desktop PC acting as a QA server. The hardware specs of the QA box are as follows:

CPU: P4 2.4 GHz
Memory: 1GB
Drives: 80 GB IDE

I recently purchased a Dell PowerEdge 2650 server to act as the staging box. The staging box has

CPU: P4 2.4 GHz
Memory: 2GB
Drives: 40GB SCSI, mirrored

I made a backup of the database on the QA box, and restored it on the staging box. Yet when I run something as simple as a select query (select * from <table>), the less powerful QA box is faster.

I figured maybe the statistics were different on the staging box. I ran DBCC SHOWCONTIG to compare fragmentation, and also ran Red Gate's SQL Compare and Data Compare to make sure everything was identical.

I figured maybe the query optimizer needs to be tweaked. I recreated the indexes and updated statistics on the staging box. The queries actually got slower as a result.

I thought maybe the SCSI drives were slower. I tried breaking the mirror on the staging box - no luck. I put the mirror back in place and ran a test where I copied a large folder from one directory to another on the staging box, then repeated the same test with the same data on the QA box. The staging box was more than twice as fast as the QA box.

It doesn't appear to be a problem with the query, adjusting memory in SQL Server has no effect, and both boxes are running SQL Server 2000 SP3. Why is the bigger machine running queries hundreds of milliseconds slower than the smaller machine? Any help will be appreciated!
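For comparing the two boxes on equal terms, a minimal sketch of the kind of measurement that separates compile time, I/O and network effects from raw execution time (the table name is just a placeholder):

-- Run the identical query on both servers and compare parse/compile time,
-- CPU/elapsed time, and logical vs physical reads.
SET STATISTICS TIME ON
SET STATISTICS IO ON

SELECT * FROM dbo.SomeTable    -- placeholder table

SET STATISTICS TIME OFF
SET STATISTICS IO OFF

If the logical reads match but the elapsed time differs, the difference is usually outside the query itself (network, client, or disk subsystem).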

View 14 Replies View Related

SQL 7 High Page Faults/sec Compared To 6.5 ?

Aug 11, 1999

I am running an application on one NT Server, running against SQL Server 6.5 sp 3, and SQL 7 with sp1 applied.

The application is a 'data migration' type application - i.e. a heavy insert and update workload - against many tables (50+) with many different SQL statements.

The SQL 7 server is configured with 'floating' memory.

On SQL 7 - I am experiencing very high page faults/second for the sqlservr process - sometimes peaking at over 1,000. I was under the impression any number greater than 10 indicates a problem with system performance.

The same application, same data, same NT configuration etc against SQL 6.5 does not page fault. SQL Server 6.5 completes the work faster than 7.
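If the dynamic ('floating') memory setting turns out to be the source of the paging, one experiment is to pin the SQL Server 7.0 memory range so the buffer pool stops growing and shrinking against the migration application. A hedged sketch only - the 512 MB figure is an arbitrary example, not a recommendation:

-- Fix min and max server memory to the same value instead of floating.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
EXEC sp_configure 'min server memory (MB)', 512
EXEC sp_configure 'max server memory (MB)', 512
RECONFIGURE WITH OVERRIDE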

Could anyone help me understand what's going on ?

Thanks in advance.

View 2 Replies View Related

Very Slow Query Analyzer

Aug 12, 2006

Please help me out:
It is extremely slow when I run a query in my SQL Server 2000 Query Analyzer on my laptop, but when I turn off the wireless card on the laptop, the query runs instantly.
Could you please tell me how I can make the query run fast while my computer is connected to the internet?

View 1 Replies View Related

Query Analyzer Starts Slow

Apr 16, 2006

Why is Query Analyzer starting really slowly on my computer?
It wasn't like this before. What can I do to make it start as fast as it used to?

View 5 Replies View Related

Extreme Slow Sql Query Analyzer

Jan 12, 2007

SQL Query Analyzer on my Windows XP machine is very slow. When I run a query on my colleague's computer it takes about 2 minutes to finish, which suggests the SQL Server itself has no problems. On my computer, the same query took about 3 hours to finish.
Can someone give me some suggestions on how to fix this problem?

THX
Jeff

View 10 Replies View Related

Query Analyzer Very Slow For Even Trivial Queries

Oct 12, 2005

I've been using MS-SQL Server for many years but never come across this problem before.

When I try and run a very simple query from Query Analyzer it takes a LONG time. Even when there are no tables involved!

Even:-

select 1
go

takes 28 seconds to return '1' when running against the local server, i.e. both QA and the server are running on the same machine.

Can anyone help explain how to get my performance back! Thanks.

View 1 Replies View Related

Query Analyzer, Slow Query Responses

Sep 25, 2006

Hi there

I am running Query Analyzer against two different servers.

The first needs only 1-2 seconds to return the query result,

while the other takes 7-8 seconds for the same query.

Please advise what could cause this slow performance.



thx



View 1 Replies View Related

Running A DLL In The Query Analyzer

Nov 23, 2007

Hi, I have a DLL that I want to run from Query Analyzer. I tried the following:

USE master;
EXEC sp_addextendedproc BLAH, '\\Other-Server\PathTo\BlahBlah.dll'

and I get the error:

ODBC: Msg 0, Level 16, State 1
Cannot load the DLL \\Other-Server\PathTo\BlahBlah.dll, or one of the DLLs it references. Reason: 126 (The specified module could not be found.).

The DLL was written in Cobol. The "Other-Server" in the path above is the server that Cobol (and the DLL) is located on. I looked at the dependencies for the DLL and it includes (not surprisingly) Cobol DLLs. It should find those on the other server. What can I do? Thanks!
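For illustration, the pattern that usually gets past error 126 is to make the DLL (and the Cobol runtime DLLs it depends on) available on the SQL Server machine itself, because extended stored procedure DLLs are loaded inside the SQL Server process and their dependencies are resolved there. A hedged sketch - the local path below is made up:

USE master
-- Drop the registration that points at the remote path, then register a
-- copy placed on the SQL Server box; its dependent Cobol DLLs must also be
-- on that machine (same folder or somewhere on the server's PATH).
EXEC sp_dropextendedproc 'BLAH'
EXEC sp_addextendedproc 'BLAH', 'C:\ExtendedProcs\BlahBlah.dll'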

View 5 Replies View Related

Data Access Very Slow In .net As Compared To VB

Jan 9, 2006

 

Hi,

I have migrated my app from VB to VB.Net. A 3-tier app with remoting and COM+.

I am experiencing wait times about 3 times longer than in the VB app.

I am using the DataAdapter.Fill method to fill the DataTable.

I have tried:

Using a DataReader (this made things worse)

Using BeginLoadData and EndLoadData

Creating a DataSet and calling Fill with the DataSet, so that the round trip to the middle tier to fetch the SQL is saved.

But I now feel that whatever is done, the problem is with the Fill method itself.

Is there any alternative?

Please suggest something. This is one of the most important issues; if it cannot be solved, it may lead to scrapping the idea of upgrading to .NET.

Shri

View 10 Replies View Related

VERY Slow Generate Scripts On SQL 2005 Compared To SQL 2000

Jun 10, 2008

I was using SQL 2000; the database contains 500+ tables and 3000+ stored procedures.
I moved to SQL 2005 and found a problem generating scripts (right-click the database -> Tasks -> Generate Scripts).
I need to generate the table relations, and it is very, very slow compared to SQL 2000, which was done in about 30 seconds to a few minutes.
I have already tried many things, including setting options to false, which I thought could speed it up a lot, but it is still very slow.

Average generate-scripts time with SQL 2005 (SP2): 70-90 minutes.
Average generate-scripts time with SQL 2000 (SP4): 2-3 minutes.

Can anyone tell me why? Thanks in advance.

View 9 Replies View Related

Re-display Result Set Without Re-running Query In Query Analyzer?

Apr 9, 2006

I hope I am not asking about something that has been done before, but I have searched and cannot find an answer. What I am trying to do is to run a query, then perform some logic on the rowcount, and then possibly display the result of the query. I know it can be done with ADO, but I need to do it in Query Analyzer. The query looks like this:

select Var
from DB
where SomeCriteria

if @@Rowcount = 0
    select 'n/a'
else if @@Rowcount = 1
    select -- this is the part where I need to redisplay the result from the above query
else if @@Rowcount > 1
    -- do something else

The reason that I want to do it without re-running the query is that I want to minimize impact on the DB, and the reason that I can't use another program is that I do not have a development environment where I need to run the queries. I would select the data into a temp table, but again, I am concerned about impacting the DB. Any suggestions would be greatly appreciated. I am really hoping there is something as simple as @@resultset, or something to that effect.
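There is no @@resultset, but for illustration, a common pattern is to capture the rows once into a table variable (so the source table is touched only one time) and branch on the count afterwards. A hedged sketch with the placeholder names from above and an assumed column type:

-- Run the query exactly once, into a table variable.
DECLARE @Results TABLE (Var varchar(100))   -- assumed column type

INSERT INTO @Results (Var)
SELECT Var
FROM DB
WHERE SomeCriteria

DECLARE @Rows int
SET @Rows = @@ROWCOUNT

IF @Rows = 0
    SELECT 'n/a'
ELSE IF @Rows = 1
    SELECT Var FROM @Results    -- redisplay the captured result
ELSE
    PRINT 'more than one row'   -- do something else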

View 6 Replies View Related

Problem With Sp Running As Job Versus In Query Analyzer

Jul 8, 2002

Hello,

Specifically, I'm having an issue that I can't resolve using the database space utilization procedures recently submitted by Paul Matthews. The servers are appropriately linked and the procedures have been created on the required servers (mix of 7.0 and 2000 systems).

Executing the sp_dbspaceall procedure from Query Analyzer is successful, but it fails when called from a SQL Agent job; in that case it only returns the information for the local server.

The error message tells nothing of the problem and the logs show nothing at all about the incident. Can someone help me out?

View 3 Replies View Related

Error In Running DTS Package From Query Analyzer

Apr 27, 2001

I am trying to run a DTS package from Query Analyzer and am getting the following error.

The command from Query Analyzer is:

go
master..xp_cmdshell 'dtsrun /Sirhahadb01 /Usa /P<sa password> /Ntestpack'

The output error message is:

DTSRun: Loading...

DTSRun: Executing...

DTSRun OnStart: DTSStep_DTSDataPumpTask_1

DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147217887 (80040E21)

Error string: Errors occurred

Error source: Microsoft OLE DB Provider for SQL Server

Help file:

Help context: 0



Error Detail Records:



Error: -2147217887 (80040E21); Provider Error: 0 (0)

Error string: Errors occurred

Error source: Microsoft OLE DB Provider for SQL Server

Help file:

Help context: 0



DTSRun OnFinish: DTSStep_DTSDataPumpTask_1

DTSRun: Package execution complete.



Sujit

View 2 Replies View Related

SQL 2000 Running Scripts In Query Analyzer

Mar 25, 2008

Does anyone know how to run a script to get the log size and log space?

When I run the following command it does not work and I don't know why.

DBCC SQLPERF (logspace)

USE [master]
GO
SELECT *
FROM sysfiles
WHERE name LIKE '%LOG%'
GO

EXEC sp_spaceused

If I run this command on a different server I get the output that I need. Any help would be greatly appreciated.
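For reference, a minimal sketch (assuming SQL Server 2000) that captures the DBCC SQLPERF(LOGSPACE) output into a table so it can be filtered or sorted like an ordinary result set:

CREATE TABLE #LogSpace
(
    DatabaseName    sysname,
    LogSizeMB       float,
    LogSpaceUsedPct float,
    Status          int
)

-- DBCC SQLPERF(LOGSPACE) returns one row per database with log size and
-- percent used; INSERT ... EXEC lets us keep that output.
INSERT INTO #LogSpace
EXEC ('DBCC SQLPERF(LOGSPACE)')

SELECT DatabaseName, LogSizeMB, LogSpaceUsedPct
FROM #LogSpace
ORDER BY LogSizeMB DESC

DROP TABLE #LogSpace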

msr976

View 4 Replies View Related

Problem With Writting A Report (SQL Slow Running)

Jul 20, 2005

Hi

I am trying to write a report that calculates the average number of sales over 7, 14, 31 and 365 days for each hourly period of the day. The problem is it takes over 4 minutes to run. For example:

Average Xactions per Hour
                 7 Days    14 Days   31 Days   365 Days
00:00 - 01:00   1,141.6     579.2     261.6      28.8
01:00 - 02:00   1,298.0     649.6     293.4      30.0

The report used to be purely ASP running SQL statements. I then changed it to ASP running a stored procedure 24 times - this reduced running time by about 1 minute. I then changed it so that the stored procedure loops internally 24 times and returns the data. I have run the Index Tuning Wizard on the SQL and implemented the indexes it suggested - this actually increased execution time by 20 seconds.

Below is the stored procedure I am currently using that loops internally 24 times and returns the data. Can anyone suggest a better way / any improvements I could make?

Many thanks
Steve

---------------------------------------------------------------------

CREATE procedure ams_RPT_Gen_Stats
    @strResult varchar(8) = 'Failure' output,
    @strErrorDesc varchar(512) = 'SP Not Executed' output,
    @strTest varchar(1),
    @strCurrency varchar(3),
    @strVFEID varchar(16)
as
declare @strStep varchar(32)
set @strStep = 'Start of Stored Proc'

/* start insert sp code here */
create table ##Averages
(
    TheHour varchar(2),
    Day7Avge float,
    Day14Avge float,
    Day31Avge float,
    Day365Avge float
)

declare @numHour varchar(2)
declare @strSQL varchar(2000)
declare @Wholesalers varchar(64)

declare MyHours cursor FORWARD_ONLY READ_ONLY for
    select convert(char(2), timestamp, 14) as TheHour
    from xactions
    group by convert(char(2), timestamp, 14)
    order by convert(char(2), timestamp, 14)

if @strTest = 'Y'
    select @Wholesalers = VALUE FROM BUSINESSRULES WHERE NAME = 'TEST_Wholesalers'

open MyHours
fetch next from MyHours into @numHour
while @@fetch_status = 0
begin
    set @strSQL = 'insert into ##Averages (TheHour, Day7Avge) ( select ''' + @numHour + ''', ' +
        'count(*) / 7.00 FROM XACTIONS INNER JOIN RETAILER ON XACTIONS.RETAILERID = RETAILER.RETAILERID ' +
        'WHERE (DATEDIFF(DAY, xactions.timestamp, GETDATE()) < 8) and xactions.xactiontotal <> 0 and ' +
        ' convert(char(2), timestamp, 14) = ''' + @numHour + ''' '
    if @strTest = 'Y'
        set @strSQL = @strSQL + ' and retailer.BillOrgID not in (' + @Wholesalers + ') '
    if @strCurrency <> '*'
        set @strSQL = @strSQL + ' and xactions.XACTIONCURRENCY = ''' + @strCurrency + ''' '
    if @strVFEID <> '*'
        set @strSQL = @strSQL + ' and xactions.VFEID = ''' + @strVFEID + ''''
    set @strSQL = @strSQL + ')'
    exec ( @strSQL )

    set @strSQL = 'update ##Averages set Day14Avge = ( select ' +
        'count(*) / 14.00 FROM XACTIONS INNER JOIN RETAILER ON XACTIONS.RETAILERID = RETAILER.RETAILERID ' +
        'WHERE (DATEDIFF(DAY, xactions.timestamp, GETDATE()) < 15) and xactions.xactiontotal <> 0 and ' +
        ' convert(char(2), timestamp, 14) = ''' + @numHour + ''' '
    if @strTest = 'Y'
        set @strSQL = @strSQL + ' and retailer.BillOrgID not in (' + @Wholesalers + ') '
    if @strCurrency <> '*'
        set @strSQL = @strSQL + ' and xactions.XACTIONCURRENCY = ''' + @strCurrency + ''' '
    if @strVFEID <> '*'
        set @strSQL = @strSQL + ' and xactions.VFEID = ''' + @strVFEID + ''' '
    set @strSQL = @strSQL + ') where TheHour = ''' + @numHour + ''' '
    exec ( @strSQL )

    set @strSQL = 'update ##Averages set Day31Avge = ( select ' +
        'count(*) / 31.00 FROM XACTIONS INNER JOIN RETAILER ON XACTIONS.RETAILERID = RETAILER.RETAILERID ' +
        'WHERE (DATEDIFF(DAY, xactions.timestamp, GETDATE()) < 32) and xactions.xactiontotal <> 0 and ' +
        ' convert(char(2), timestamp, 14) = ''' + @numHour + ''' '
    if @strTest = 'Y'
        set @strSQL = @strSQL + ' and retailer.BillOrgID not in (' + @Wholesalers + ') '
    if @strCurrency <> '*'
        set @strSQL = @strSQL + ' and xactions.XACTIONCURRENCY = ''' + @strCurrency + ''' '
    if @strVFEID <> '*'
        set @strSQL = @strSQL + ' and xactions.VFEID = ''' + @strVFEID + ''' '
    set @strSQL = @strSQL + ' ) where TheHour = ''' + @numHour + ''' '
    exec ( @strSQL )

    set @strSQL = 'update ##Averages set Day365Avge = ( select ' +
        'count(*) / 365.00 FROM XACTIONS INNER JOIN RETAILER ON XACTIONS.RETAILERID = RETAILER.RETAILERID ' +
        'WHERE (DATEDIFF(DAY, xactions.timestamp, GETDATE()) < 366) and xactions.xactiontotal <> 0 and ' +
        ' convert(char(2), timestamp, 14) = ''' + @numHour + ''' '
    if @strTest = 'Y'
        set @strSQL = @strSQL + ' and retailer.BillOrgID not in (' + @Wholesalers + ') '
    if @strCurrency <> '*'
        set @strSQL = @strSQL + ' and xactions.XACTIONCURRENCY = ''' + @strCurrency + ''' '
    if @strVFEID <> '*'
        set @strSQL = @strSQL + ' and xactions.VFEID = ''' + @strVFEID + ''' '
    set @strSQL = @strSQL + ' ) where TheHour = ''' + @numHour + ''' '
    exec ( @strSQL )

    fetch next from MyHours into @numHour
end -- while fetch

close MyHours
deallocate MyHours

select * from ##Averages order by TheHour
drop table ##Averages
/* end insert sp code here */

if (@@error <> 0)
begin
    set @strResult = 'Failure'
    set @strErrorDesc = 'Fail @ Step :' + @strStep + ' Error : ' + CONVERT(VARCHAR, @@Error)
    return -1969
end
else
begin
    set @strResult = 'Success'
    set @strErrorDesc = ''
end
return 0
GO
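For what the question asks, a hedged set-based sketch that computes all four columns for every hour in a single pass over XACTIONS instead of 96 dynamic-SQL statements; the currency, VFE and wholesaler filters from the procedure are omitted here for brevity and would need to be added back to the WHERE clause:

select convert(char(2), x.timestamp, 14) as TheHour,
       sum(case when datediff(day, x.timestamp, getdate()) < 8   then 1 else 0 end) / 7.00   as Day7Avge,
       sum(case when datediff(day, x.timestamp, getdate()) < 15  then 1 else 0 end) / 14.00  as Day14Avge,
       sum(case when datediff(day, x.timestamp, getdate()) < 32  then 1 else 0 end) / 31.00  as Day31Avge,
       sum(case when datediff(day, x.timestamp, getdate()) < 366 then 1 else 0 end) / 365.00 as Day365Avge
from XACTIONS x
     inner join RETAILER r on x.RETAILERID = r.RETAILERID
where x.xactiontotal <> 0
  and datediff(day, x.timestamp, getdate()) < 366   -- only the last year ever contributes
group by convert(char(2), x.timestamp, 14)
order by TheHour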

View 1 Replies View Related


Data Transfer From Lotus Notes Very Slow Compared To SQL 2000 DTS

Sep 7, 2006

To extract data from an ODBC source, try the following:

Add an ADO.Net Connection Manager.
Edit the Connection Manager editor and select the ODBC Data Provider
Configure the Connection Manager to use your DSN or connection string
Add a Data Flow Task to your package.
Add a Data Reader Source adapter to your data flow
Edit the Data Reader source adapter to use the ADO.Net connection manager that you added.
Edit the Data Reader source to query for the data you wish to extract.

hth

Donald

Using the steps outlined above as described by Donald Farmer in another post on this forum, I have created an SSIS package which retrieves data from Lotus Notes 6.55. The DSN referenced by the ADO.Net Connection Manager connects to Lotus Notes via the NotesSQL ODBC driver 3.02g.

When I execute the dataflow, data is transferred from Lotus Notes, but the data transfer rate is extremely slow compared to SQL 2000 DTS. In SQL 2000 DTS, we can retrieve just under half a million records from Lotus Notes in about 13 minutes. Utilizing the same DSN on the same machine, SQL 2005 SSIS completes the transfer in about 57 minutes.

Is there anything that can be done to improve the performance in SSIS to retrieve data from Lotus Notes via ADO.Net ODBC?

Thanks!

View 3 Replies View Related

Query Running Slow

Aug 24, 2007

hi

If a query is running slow, what steps can we follow to optimize it?

This question was asked of me in an interview.

Thanks

View 3 Replies View Related

Update Query Running Very Slow

Apr 17, 2008

Hi all,

I'm having what you might call an optimisation issue - but I'm also not sure if my approach to this problem is right. I've spent the whole day trying various methods but none seem to be performing as admirably as I'd hoped.

Basically, here's the scenario:

1. Log files are being inserted into a SQL table via Log Parser 2.2.
2. I have a lookup table of ip address ranges to countries and other geographic data.
3. Once the log row has been inserted from Log Parser, I want to update the same log table with data from the lookup table.

The main problem seems to be the updating (the initial insert from Log Parser is lightning quick).

I initially wrote an AFTER INSERT trigger on the log table that converted the user's IP address into IP number format using a simple algorithm, then pumped the IP number into a quick select query which retrieved the geolocation data. Then, in the same trigger, there was an update statement which basically said:

update dbo.Logs
set [Country] = @Country
where [IpNumber] = @IpNumber and [Country] is null

Therein lies the problem. The update statement slows everything down to almost a standstill and also seems to obtain a lock on the table.

Critical factors:

1. The log table must remain readable.
2. The query must execute in seconds -- not take over half an hour :)

I've experimented with various combinations of indexing, so far to no avail.
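For illustration, a set-based alternative to the per-row trigger update - a hedged sketch that assumes a lookup table called dbo.IpToCountry with StartIpNumber, EndIpNumber and Country columns (the real table and column names are likely different), run periodically or at the end of each Log Parser load:

-- Backfill the country for every log row that has not been resolved yet,
-- in one statement, instead of one update per inserted row.
UPDATE L
SET    L.Country = G.Country
FROM   dbo.Logs AS L
       INNER JOIN dbo.IpToCountry AS G
               ON L.IpNumber BETWEEN G.StartIpNumber AND G.EndIpNumber
WHERE  L.Country IS NULL

Because it only touches rows where Country is still NULL, it can run after the bulk insert finishes rather than inside a trigger, which keeps the insert path lightning quick and shortens the time any locks are held.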

Any suggestions would be very much appreciated.

Regards

View 10 Replies View Related

Slow Running SELECT TOP Query

Aug 27, 2007

I have a SELECT TOP query that returns the top x records from a table which has the Indexing Service enabled on it, such as this:


SELECT TOP(15) * FROM [TableName] ORDER BY [ColumnName]


Also, there are not that many records (at most 100 rows) kept in the table at the moment; however, it will grow later.
The issue stems from the ORDER BY [ColumnName] part of the syntax, which changes the TOP selection order and makes the query run very slowly, as I have also analyzed in SQL Server Query Analyzer.
Anyhow, I need the ORDER BY clause in order to show the data based on different columns, either ascending or descending.
There might be workarounds to achieve the same goal that I am not aware of.
Any thoughts are appreciated.
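For reference, the usual way to make a TOP ... ORDER BY cheap is an index on the sort column, so the first rows can be read in order without sorting the whole table. A hedged sketch using the placeholder names from the query above - one such index would be needed per commonly used sort column:

-- Lets the optimizer read the first 15 rows straight off the index in
-- either direction (ascending or descending) instead of sorting.
CREATE NONCLUSTERED INDEX IX_TableName_ColumnName
    ON [TableName] ([ColumnName])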

View 3 Replies View Related

Very Slow Running Update Query Query

Nov 19, 2004

I have an update query that has now been running for 22 hours. It runs against two tables: one a lookup table that has just been created within the batch, the other a denormalised table used for data analysis.

The query that's causing the problem is:


--//////////////////////////////////// this is the one thats running


Print 'Update Provider 04-05 EmAdmsCount12mths : ' + CAST(GETDATE() AS varchar)
GO
Update Provider_APC_2004_05
set EmAdmsCount12mths =
(Select COUNT(*)-1
from Combined_Admissions
where ((Combined_Admissions.NHSNumber = Provider_APC_2004_05.NHSNumber) or
(Combined_Admissions.PASNUMBER = Provider_APC_2004_05.PDDISTNO)) and
(Combined_Admissions.AdmDate BETWEEN DateAdd(yyyy,-1,Provider_APC_2004_05.AdmDate) AND Provider_APC_2004_05.AdmDate) AND
Combined_Admissions.AdmMethod like 'Emergency%')-- and
-- CA.NHSorPrivate = 'NHS'))
FROM Provider_APC_2004_05, Combined_Admissions
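For illustration, one thing worth noting is that the trailing FROM clause joins every Provider_APC_2004_05 row to every Combined_Admissions row, even though the correlated subquery already references Combined_Admissions on its own. A hedged sketch of the same update without that clause, plus an index intended to let each correlated count seek rather than scan (the column choice is an assumption):

Update Provider_APC_2004_05
set EmAdmsCount12mths =
    (Select COUNT(*) - 1
     from Combined_Admissions
     where (Combined_Admissions.NHSNumber = Provider_APC_2004_05.NHSNumber or
            Combined_Admissions.PASNUMBER = Provider_APC_2004_05.PDDISTNO)
       and Combined_Admissions.AdmDate BETWEEN DateAdd(yyyy, -1, Provider_APC_2004_05.AdmDate)
                                           AND Provider_APC_2004_05.AdmDate
       and Combined_Admissions.AdmMethod like 'Emergency%')

-- Supporting index for the correlated lookups (a second index keyed on
-- PASNUMBER may be needed because of the OR condition).
CREATE INDEX IX_CombAdm_NHSNumber ON Combined_Admissions (NHSNumber, AdmDate, AdmMethod)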


Any help in improving the speed would be most welcome, as there are 3 more of these updates to run right after this one and the analysis tables are almost double the size of this one.

Dave

View 6 Replies View Related

Slow Report From Simple Query

Apr 2, 2007

Hi all, I've been building a set of traffic based reports on our website and I've run into a strange problem.

The reports are pretty basic, and up till now I've been really impressed with RS overall.

Recently I've added a StartDate and EndDate and since then the performance has gone from ~10 secs to ~10 minutes.

I've taken a really simple query from my reports. Running this query in Management Studio on the same data returns in less than a second. When it's run from a test report with nothing else in it, it takes ~1 minute. Even stranger, when I run the same query with the same parameter values inside of RS in the data view, it takes less than 1 second. ARG!


SELECT COUNT(DISTINCT SessionID) as Occurences
FROM WebAppSummary
JOIN WebAppLocalizations
ON WebAppSummary.ClientIp = WebAppLocalizations.ClientIp
where FirstTime BETWEEN @StartDate AND @EndDate


The last line that was just added is this part:
where FirstTime BETWEEN @StartDate AND @EndDate

So what's going on here? Is this a really poorly performing query that Management Studio is optimizing but RS isn't? Is RS messing up the databind and getting a bunch of datasets instead of just one?
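For illustration, this pattern (fast with literal values or in the data view, slow only when the report runs with real parameters) is commonly attributed to parameter sniffing: the parameterized call from RS reuses a cached plan compiled for a very different date range. A hedged sketch of one common workaround applied to the query above:

SELECT COUNT(DISTINCT SessionID) as Occurences
FROM WebAppSummary
JOIN WebAppLocalizations
  ON WebAppSummary.ClientIp = WebAppLocalizations.ClientIp
WHERE FirstTime BETWEEN @StartDate AND @EndDate
OPTION (RECOMPILE)   -- compile a fresh plan for the actual date values each run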

View 1 Replies View Related

Can A Calc'd Query Column Be Compared Against A Multi Value Variable Without A Nested Query?

Nov 15, 2007

Do I need to nest a query in RS if I want a calculated column to be compared against a multi-value parameter? It looks like coding WHERE calcd name IN (@variable) violates SQL syntax. My select looked like:

SELECT ... ,CASE enddate WHEN null then 1 else 0 END calcd name
FROM...
WHERE ... and calcd name in (@variable)
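For illustration, the usual answer is yes - wrap the select in a derived table (or nested query) so the calculated column gets a real alias that the outer WHERE can reference. A hedged sketch with placeholder table, column and filter names:

SELECT *
FROM (
    SELECT SomeColumn,                                   -- placeholder columns
           CASE WHEN enddate IS NULL THEN 1 ELSE 0 END AS calcdname
    FROM SomeTable                                       -- placeholder table
    WHERE SomeFilter = 1                                 -- placeholder filter
) AS d
WHERE d.calcdname IN (@variable)

Note also that CASE enddate WHEN null THEN ... never matches, because NULL is not equal to anything; the IS NULL form above is what tests for a missing end date.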

View 1 Replies View Related

Extreme Slow SQL Server 2005 And High CPU Usage - Casting Of Values

Feb 20, 2006

Hello SQL and .NET gurus :-)

I have a problem with my website www.eventguide.it. It is completely developed under .NET 2 and SQL Server 2005 Express. My problem is the following:

The server is an Intel 3 GHz HT processor with 1 GB RAM. No other page on the running system is a CPU-consuming site. We optimized the SQL statements, the code, the caching and many other parts of the website (pooling on SQL access), but SQL Server uses about 50% to 100% of the CPU and about 400 MB RAM all the time. The whole site seems to be very, very slow. In fact there are many SQL operations on every page request, but we cache a lot of them in different ways (page output caching, application caching), so I don't understand why we have so many performance problems. Any suggestions for optimised code in general? I read nearly all of the MS .NET performance papers - but real-world experience is the missing part :-)

Is it better to cast the values of a SQL reader like this:

Dim String1 As String = CType(DataReader.Item(0), String)
Dim Integer1 As Integer = CType(DataReader.Item(1), Integer)

or like this:

Dim String1 As String = DataReader.Item(0)
Dim Integer1 As Integer = DataReader.Item(1)

Thanks a lot for your help!
FOX

View 6 Replies View Related

Running DTS From Querry Analyzer

Sep 21, 2007

Hi, how do I run a DTS package from Query Analyzer using T-SQL commands? Is it possible, and what is the command to run a DTS package from QA? Also, what are the possible ways of running a DTS package in general? I know two of them: directly from Enterprise Manager, and with the help of scheduled jobs.
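For illustration, the approach shown elsewhere in this collection is to shell out to the dtsrun command-line utility from T-SQL; a hedged sketch, where the server and package names are placeholders and xp_cmdshell must be allowed on the server:

-- Runs a DTS package saved on MyServer using a trusted connection (/E).
EXEC master..xp_cmdshell 'dtsrun /S MyServer /E /N MyPackageName'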

View 1 Replies View Related

DTS Running Too Slow

Aug 8, 2000

I have a DTS job set up to transfer 550,000 records from a DBF file into SQL Server. If I just let it run, there is a 9-10 minute delay, then it starts. If I try to schedule it as a job, it fails completely. I looked up ways to get it to execute quicker, mainly going to the advanced tab of the transform arrow and making the inserts 1000 at a time, locking the table, and turning constraints off. Any advice on how to speed it up, or on why the job is failing?

View 7 Replies View Related

Slow Running SP

Apr 27, 2002

Hi All

Last week, we upgraded from 6.5 to SQL 2000. Everything went fine and we recompiled all the procedures. But when I try to run one procedure, it runs for a long time - more than 10 hours (in SQL 6.5 it ran for 50 minutes). Do I need to set any procedure cache?

Also, the server CPU usage is constantly high - more than 80%. It was fine until last week.
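For reference, a hedged sketch of the post-upgrade maintenance that is usually recommended after a 6.5 to 2000 upgrade, since statistics and usage counters carried over from 6.5 are a common cause of bad plans (run in each upgraded database; the procedure name is a placeholder):

-- Refresh optimizer statistics and space-usage counters, then force the
-- slow procedure to recompile against the new statistics.
EXEC sp_updatestats
DBCC UPDATEUSAGE (0)
EXEC sp_recompile 'MySlowProcedure'   -- placeholder procedure name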

Any suggestions?

Thanks in advance

Jaya

View 2 Replies View Related

SP Running Slow

Aug 30, 2006

I have written an SP, but it gradually runs slower and slower. I have to run this SP 500 times.

Do you know what is causing that?

View 13 Replies View Related

Slow Running

Jul 1, 2007

When I turn on my PC, it takes about 10 minutes from power-on until I can actually use it.

View 4 Replies View Related

Slow Running Server

May 2, 2001

Our server is running slow. There are no locks, and the server has been rebooted, but the problem is still there. This has been going on for some time now. I intend to restart the server. Does anybody have a quick solution? Please help. Thanks for your assistance!

View 4 Replies View Related

Slow Running Server

May 17, 2001

What should I look out for in 6.5 if I want to troubleshoot slow running?
Thanks.

View 2 Replies View Related






