Tuning The Sort Step Of Execution Plan
Jul 20, 2005
Hi,
I have a query that takes about 14 minutes to run. Here it is:
select BDProduct.ProductCode,BDProduct.ProductName,SALTerritory.TerritoryID
,SALTerritory.TerritoryName,SALAccount.AccountID,SALAccount.AccountName
,sum(SalesNetFact.Qty2) as Quantity
,sum(SalesNetFact.bonus) as Bonus
from SalesNetFact
inner join BDProduct
on BDProduct.ProductID=SalesNetFact.ProductID
inner join SALAccount
on SALAccount.AccountID=SalesNetFact.AccountID
and SALAccount.BranchID=SalesNetFact.branchid
inner join SALTerritory
on dbo.SALAccount.TerritoryID = dbo.SALTerritory.TerritoryID
and dbo.SALAccount.BranchID = dbo.SALTerritory.BranchID
group by BDProduct.ProductCode,BDProduct.ProductName
,SALTerritory.TerritoryID,SALTerritory.TerritoryName,SALAccount.AccountID
,SALAccount.AccountName
The SalesNetFact table has BranchID, TransactionLineID as its primary key.
The BDProduct table has ProductID as its primary key.
The SALAccount table has AccountID, BranchID as its primary key.
The SALTerritory table has TerritoryID, BranchID as its primary key.
I have no other indexes on any of these tables.
The execution plan shows that the sort step takes 96% of the cost; it is the most expensive step, and it is done after all the join steps and before the group-by step.
For the sort step, the estimated row count is 1552242 and the arguments are: ORDER BY [BDProduct].[ProductCode] asc, [SALTerritory].[TerritoryID] asc, [SALTerritory].[TerritoryName] asc, [SalesNetFact].[AccountID] asc, [SALAccount].[AccountID] asc.
Any ideas about how to improve this sort step?
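One rewrite that is sometimes worth testing on a query of this shape, offered only as a sketch built from the names in the post: aggregate the large fact table on its own keys first, so the expensive sort/aggregate runs over three narrow columns instead of the wide joined rows, then join the dimensions and re-group on the display columns. Because SUM can be re-aggregated, the result is the same.

SELECT  p.ProductCode, p.ProductName,
        t.TerritoryID, t.TerritoryName,
        a.AccountID, a.AccountName,
        SUM(f.Quantity) AS Quantity,
        SUM(f.Bonus)    AS Bonus
FROM (
        -- pre-aggregate the fact table on its own keys (narrow rows)
        SELECT ProductID, AccountID, BranchID,
               SUM(Qty2)  AS Quantity,
               SUM(bonus) AS Bonus
        FROM   SalesNetFact
        GROUP BY ProductID, AccountID, BranchID
     ) AS f
INNER JOIN BDProduct    AS p ON p.ProductID   = f.ProductID
INNER JOIN SALAccount   AS a ON a.AccountID   = f.AccountID   AND a.BranchID = f.BranchID
INNER JOIN SALTerritory AS t ON t.TerritoryID = a.TerritoryID AND t.BranchID = a.BranchID
GROUP BY p.ProductCode, p.ProductName, t.TerritoryID, t.TerritoryName, a.AccountID, a.AccountName

Whether this actually replaces the big sort with a cheaper hash aggregate depends on the data, so compare the two plans before committing to it.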
View 1 Replies
Nov 11, 2014
I'm new to using SQL Server. I've been asked to optimize a series of scripts that query over 4 million records. I've managed to add indexes and remove a cursor, which increased performance. Now when I run the execution plan, the only query with a significant cost is a DELETE statement from the main table. It shows a SORT which costs 71%. The table has 2 columns and a unique index. Here is the current index:
ALTER TABLE [dbo].[Qry] ADD CONSTRAINT [Qry_PK] PRIMARY KEY NONCLUSTERED
(
[QryNum] ASC,
[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = ON, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
Question: Will the SORT affect the overall performance? If so, is there anything I should change within the index that would speed up my query?
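One thing sometimes tested in this situation, offered as a sketch rather than a guaranteed fix: on a 2-column table whose only index is a nonclustered primary key, rebuilding that key as the clustered index means the rows are already stored in (QryNum, ID) order and there is no separate nonclustered index to maintain, which is often where a large sort in a DELETE plan comes from. It assumes no foreign keys reference Qry_PK and that a maintenance window is available for the rebuild.

ALTER TABLE [dbo].[Qry] DROP CONSTRAINT [Qry_PK];

ALTER TABLE [dbo].[Qry] ADD CONSTRAINT [Qry_PK] PRIMARY KEY CLUSTERED
(
    [QryNum] ASC,
    [ID]     ASC
) WITH (SORT_IN_TEMPDB = ON) ON [PRIMARY];

Comparing the actual DELETE plan before and after would show whether the sort (and its 71%) disappears.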
View 5 Replies
View Related
Jan 31, 2007
I have a package that has multiple data flow tasks. At the end of a task, key data is written into a raw file (file name stored in a variable) that is used as a data source for the next task. Each task requires a success from the preceding task.
Here's the rub:
If I execute the entire package, the results (record counts in certain tasks) differ significantly from when I execute each step in the package in turn (e.g. 5 records vs. 350).
I get the feeling that the raw file is read into memory before it is flushed by the previous task, or that the next task begins its preparation too early.
Any help is greatly appreciated.
I am running on Server 2003 64 (although the same thing happens when deployed on a Server 2003 32 machine)
Thanks
B.
View 2 Replies
View Related
Jul 7, 2006
The benefit of the actual execution plan is that you can see the actual number of rows passing through each step, compared to the estimated number of rows. But what about the "cost percentages"? I believe I've read somewhere that these percentages are still just estimates and are not based on the real execution. Does anyone know this, and preferably have a link to something that documents it? Thanks
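For what it's worth, one quick way to line the two sets of numbers up per operator (a sketch; the SELECT is only a placeholder) is the tabular plan output, which returns the real Rows and Executes counts next to the EstimateRows and subtree-cost columns that the graphical percentages are derived from:

SET STATISTICS PROFILE ON;   -- executes the query and returns a row per plan operator
SELECT TOP (50) o.name
FROM   sys.objects AS o
ORDER BY o.name;
SET STATISTICS PROFILE OFF;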
View 1 Replies
View Related
Mar 2, 2008
Hi pals,
I have a stored procedure which creates a job with 3 job steps that run 3 SSIS packages in sequence.
But I don't know why it is skipping the 2nd step and executing the 3rd step.
I can clearly observe it in the logs.
And if I comment out the 3rd step and re-run the job, it executes the 1st and 2nd steps in sequence.
I can see the log for 2nd package also.
Again I uncommented the code for calling the 3rd step, which loads data into Oracle tables.
This time again it skips the 2nd step.
I don't know the reason why this is happening.
It is really frustrating me a lot.
Is there any precedence/priority given while loading data into the Oracle database?
Job steps
First step loads data from staging database to ODS
Second step loads data from ODS to DataMart
Third step loads data from DataMart to Oracle Tables
Can anyone please help me out.
I have tried all options.
Here is my stored procedure.
Is anything wrong in the stored procedure below?
CREATE PROCEDURE [dbo].[spExecDTSPackage]
as
declare @jid uniqueidentifier
declare @cmd1 varchar(4000)
declare @cmd2 varchar(4000)
declare @cmd3 varchar(4000)
SET @cmd1 = '"C:Program FilesMicrosoft SQL Server90DTSBinnDTExec.exe" /F "D:RepositoryPackage1.dtsx" '
SET @cmd2 = '"C:Program FilesMicrosoft SQL Server90DTSBinnDTExec.exe" /F "D:RepositoryPackage2.dtsx" '
SET @cmd3 = '"C:Program FilesMicrosoft SQL Server90DTSBinnDTExec.exe" /F "D:RepositoryPackage3.dtsx" '
declare @jname varchar(128)
set @jname = cast(newid() as char(36))
exec msdb.dbo.sp_add_job
@job_name = @jname,
@enabled = 1,
@delete_level = 1,
@job_id = @jid OUTPUT
exec msdb.dbo.sp_add_jobserver
@job_id = @jid,
@server_name = '(local)'
exec msdb.dbo.sp_add_jobstep
@job_id = @jid,
@step_name = N'step1',
@step_id = 1,
@on_success_action=4,
@on_success_step_id=2,
@subsystem = N'CMDEXEC',
@proxy_name = N'Proxyname',
@command = @cmd1,
@on_fail_action = 2 -- quit with failure
exec msdb.dbo.sp_add_jobstep
@job_id = @jid,
@step_name = N'step2',
@step_id = 2,
@on_success_action=4,
@on_success_step_id=3,
@subsystem = N'CMDEXEC',
@proxy_name = N'Proxyname',
@command = @cmd2
exec msdb.dbo.sp_add_jobstep
@job_id = @jid,
@step_name = N'CoreD_To_Oracle',
@step_id = 3,
@on_success_action=1,
@on_success_step_id=0,
@subsystem = N'CMDEXEC',
@proxy_name = N'Proxyname',
@command = @cmd3
-- Start job
exec msdb.dbo.sp_start_job
@job_id = @jid,
@step_name = N'step1'
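One hedged debugging idea before the simpler test below (an assumption, not a definite diagnosis): check what Agent actually stored for each step's flow-control settings, since the behaviour described would match step 1 ending the job instead of handing control to step 2. This query can be added inside the procedure right after the sp_add_jobstep calls (it uses @jid), or the WHERE clause can be swapped for a lookup by job name.

SELECT  s.step_id,
        s.step_name,
        s.on_success_action,    -- 1 = quit reporting success, 3 = go to next step, 4 = go to on_success_step_id
        s.on_success_step_id,
        s.on_fail_action,
        s.on_fail_step_id
FROM    msdb.dbo.sysjobsteps AS s
WHERE   s.job_id = @jid
ORDER BY s.step_id;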
I have also tried a simple example: a job with 3 steps which basically inserts 3 records into a table.
Here is the code, and this one executes fine.
USE [msdb]
GO
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'2D4474C9-2B4F-45C2-8640-EB8DE30A5276',
@enabled=1,
@notify_level_eventlog=2,
@notify_level_email=0,
@notify_level_netsend=0,
@notify_level_page=0,
@delete_level=1,
@description=N'No description available.',
@category_name=N'SRM',
@owner_login_name=N'sa',
@job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Step_1',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=4,
@on_success_step_id=2,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'use ods
go
insert into test select 1,''ram''
go',
@database_name=N'master',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Step_2',
@step_id=2,
@cmdexec_success_code=0,
@on_success_action=4,
@on_success_step_id=3,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'use ods
go
insert into test select 2, ''ganesh''
go',
@database_name=N'master',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0)
GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId,
@step_name=N'Step_3',
@step_id=3,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'use ods
go
insert into test select 3,''Vinod''
go',
@database_name=N'master',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0)
GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId,
@start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0)
GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0)
GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
I don't know why the stored procedure is not working as expected.
Any thoughts?
Welch, can you please help in this regard?
View 12 Replies
View Related
Mar 9, 2006
Hi,
I'm working with SQL 2000 and am just learning about Maintenance Plans (MP). They seem convenient, but after some time, I'm wondering if they're the best approach long-term. Here are my experiences.
Using the MP Wizard, I created a plan with tasks from all the dialogs:
- Optimize database
- Check integrity
- Backup database
- Backup transaction log
- Write a report
I was puzzled to find that 4 jobs were created, each with just 1 step and staggered starting times. I expected to find 1 job with 4 steps. So, brimming with confidence, I did just that: I combined all 4 into 1 job, deleted the 3 other MP-created jobs, and checked for any job-specific details in the code. However, now when I open the MP, I get this pop-up:
"One or more of the jobs created for this plan has had additional steps added to it. It is not recommended that jobs created by the maintenance plan be modified in any way."
Okay, fair warning. Yet it appears the job and all steps run successfully, both on demand, and on a schedule. So now I'm wondering if jobs always need a MP. If I don't mind working with xp_sqlmaint syntax, it appears the only thing I'm giving up is the MP history. But I expect job history and '-WriteHistory' will minimize that loss.
I searched BOL, this Forum, and Google, and found a couple articles. One author preferred the ease of the Wizard, another preferred the control and added features of T-SQL, but both created a MP in their examples. So I'm hoping some experienced DBAs can advise.
If I create a job with multiple steps, and no MP, are there important things I give up or problems I create?
Is this approach a bad idea in SQL 2005?
At this stage, I don't need replication or other advanced features. Just simple database maintenance.
Thank you,
- Martin
View 1 Replies
View Related
Jul 17, 2007
hi,
I am currently in the process of moving a bunch of jobs into SSIS from another ETL tool. I would like to benchmark the two products against each other by comparing how long each step of an ETL process takes.
I see no way to do this in SSIS; there is the Progress tab but it doesn't list start time and end time. Plus I have loops and things, and I want to know how long each iteration takes.
Is there a way to track all this?
Thanks
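One approach that may help, sketched under the assumption that SSIS logging is enabled with the "SSIS log provider for SQL Server" and the OnPreExecute/OnPostExecute events selected: the default log table (sysdtslog90 in 2005) records a timestamped row per event, so durations per task, and per loop iteration, can be computed afterwards.

-- Pair each task's OnPreExecute with its OnPostExecute to get a duration.
-- Simplification: assumes each executable runs once per executionid; tasks
-- inside loops log one pair per iteration and need finer matching.
SELECT  pre.source AS task_name,
        pre.starttime,
        post.endtime,
        DATEDIFF(ss, pre.starttime, post.endtime) AS duration_seconds
FROM    dbo.sysdtslog90 AS pre
JOIN    dbo.sysdtslog90 AS post
        ON  post.executionid = pre.executionid
        AND post.sourceid    = pre.sourceid
        AND post.event       = 'OnPostExecute'
WHERE   pre.event = 'OnPreExecute'
ORDER BY pre.starttime;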
View 5 Replies
View Related
Aug 21, 2001
I'm new to SQL server but familiar enough with databases to know this doesn't seem right.
Here's the situation:
I have a table with real estate property information. There are about 650,000 rows in it. I have a nonclustered non-unique index on the city where the property is located. There are about 40 unique values in this index.
I do a simple query like:
SELECT city,address from propinfo where city= 'CARLSBAD'. The query will return about 4,000 rows. The problem is that the execution plan it chooses is a full table scan. I.e., even though there is an index on city, it chooses to look through 650,000 rows rather than do an index seek. Something sounds inefficient here. BTW, this happens in both SQL 7 and SQL 2000. Can anyone explain why this happens? I've got to think that SQL Server can do better here.
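For what it's worth, the usual explanation is that the index on city alone does not contain address, so every matching row would need a bookmark lookup back into the table, and beyond a few thousand rows the optimizer estimates a full scan to be cheaper. One thing commonly tested (a sketch; the index name is made up, and the syntax works on SQL 7/2000) is a composite index that covers the query:

-- Covers the query, so no bookmark lookups are needed
CREATE NONCLUSTERED INDEX IX_propinfo_city_address
    ON propinfo (city, address);

-- Compare logical reads before and after
SET STATISTICS IO ON;
SELECT city, address FROM propinfo WHERE city = 'CARLSBAD';
SET STATISTICS IO OFF;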
View 5 Replies
View Related
Feb 2, 2006
Hello, I have been looking at the execution plan for a procedure call, and the select, compute scalar, stream aggregate, constant scan, nested loops and assert operators are all at 0% cost, the PK costs are 2% apart from a rogue 7% and a few at 20%, and the table scans are all at 23%. The query cost relative to the batch is 100%. What does this all mean?
I have put non-clustered indexes on all the table attributes that are involved in the select statements, but this has made no difference. I am guessing this is because my tables are not heavily populated, and I might have seen a difference if I had thousands of entries in the tables the select statements acted on. Is this assumption correct?
Does anyone else bother using the execution plan to tweak their DB, or is it a negligible tool?
Jill
View 5 Replies
View Related
Aug 29, 2007
In SQL Server 2005 Management Studio, where do I find the option to run the SQL query in the query window and also show the execution plan?
At present I see the option under the Query menu, "Display Estimated Execution Plan", which only shows the plan but does not execute the query.
Thanks
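For reference: in Management Studio the companion option is Query > Include Actual Execution Plan (Ctrl+M), which executes the query and adds an Execution Plan tab to the results. The T-SQL equivalent looks like this (a sketch; the SELECT is only a placeholder):

SET STATISTICS XML ON;    -- runs the statement and returns its actual plan as an extra result set
SELECT TOP (10) name FROM sys.databases ORDER BY name;
SET STATISTICS XML OFF;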
View 2 Replies
View Related
Feb 29, 2008
Does anyone know of a good way to copy the execution plan when using "Include Actual Execution Plan"?
I often need to copy this and mail it.
I know I can use PrintScreen button, but I need a more efficient way to do this.
If I could just right-click the execution plan, select "Copy" and get the complete plan, it would be great.
Mladen?
E 12°55'05.25"
N 56°04'39.16"
View 14 Replies
View Related
Jul 20, 2005
Which of the following does NOT cause the execution plan of a query to be recompiled?
- a new column is added to a table accessed by the query, OR
- an index used by the query has been dropped from the database, OR
- the query performs a join to return data from multiple tables, OR
- a significant amount of data in a table has been modified
View 1 Replies
View Related
Jul 20, 2005
Hi,
I have a table-valued user-defined function (UDF), my_fnc. The execution of the statement "select * from my_fnc" takes much longer than running the code inside my_fnc (with the necessary changes). What can be the reason? How can I see the execution plan used for the UDF?
Thanks a lot
Martin
View 1 Replies
View Related
Jul 9, 2007
Hi,
I want to access the real execution plan via my web application after I have executed an SQL statement. I know how to get the estimated execution plan:
1 cmd.CommandText = "SET SHOWPLAN_XML ON";
2 cmd.ExecuteNonQuery();
3
4 cmd.CommandText = myStatement;
5 SqlDataReader dataReader = cmd.ExecuteReader();
6
7 String plan = String.Empty;
8
9 while (dataReader.Read()) {
10 plan += dataReader.GetSqlString(0).ToString();
11 }
12
13 cmd.CommandText = "SET SHOWPLAN_XML OFF";
14 cmd.ExecuteNonQuery();
I want to compare the estimated costs with the real costs of the same statement. If I change code lines 1 and 13 to "SET STATISTICS XML [ON|OFF]", the string "plan" will contain the result of the submitted SELECT statement, but I just need to get the plan and not the result itself.
Thanks in advance,
Dominik
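In case it is useful: with SET STATISTICS XML ON the plan is not mixed into the query's own rows, it arrives as a separate, additional result set after each statement's results. A quick way to see the shape of the output from a query window (a sketch; the SELECT is a placeholder):

SET STATISTICS XML ON;
SELECT TOP (5) name FROM sys.objects;
-- result set 1: the query's rows
-- result set 2: the ShowPlan XML for that statement
SET STATISTICS XML OFF;

From a data reader that means moving to the next result set (NextResult on SqlDataReader) after consuming or skipping the first one, and reading the plan XML from the second.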
View 6 Replies
View Related
Jul 28, 1999
What does 'tablename. index... cost: 100%' mean when I use display estimated execution plan?
View 1 Replies
View Related
Sep 24, 2002
Hello ,
I wanted to know whether we have an execution plan feature in SQL 6.5 as we have in SQL 7.0 and SQL 2000.
I.e., when we execute a query and enable 'show execution plan', it creates a map and shows the vital statistics.
If that is available in SQL 6.5 then I am missing that tool.
How can I have it installed on my SQL 6.5 server?
Thanks.
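As far as I recall there is no graphical plan tool in 6.5 and nothing extra to install; what 6.5 does offer is the old text-based SHOWPLAN output from ISQL/w, which lists the chosen indexes and join order rather than drawing a map. A sketch (the query against the pubs sample database is only a placeholder):

SET SHOWPLAN ON
GO
SELECT au_lname, au_fname FROM pubs..authors WHERE au_lname = 'Green'
GO
SET SHOWPLAN OFF
GO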
View 3 Replies
View Related
Jul 9, 2003
Hi,
I want to know how to analyze the query execution plan for complex queries and what information from it is useful for improving performance. I have gone through the details on some sites like www.sql-server-performance.com (http://www.sql-server-performance.com/query_execution_plan_analysis.asp), but that was fairly generic. I want more detailed information on this.
Can anyone point me to resources for this, or do you have any white papers or documents you can share with me?
Thanks in advance,
sekhar
View 1 Replies
View Related
Jun 12, 2006
Hi,
when the operator is = I get an index SEEK, and when the operator is <> I get an index SCAN.
Is this normal?
Example
SELECT *
FROM dbo.Batch
WHERE (Status = 'Batch Completed')
(1 row(s) affected)
StmtText
---------------------------------------------------------------------------------------------------------------------------------
|--Bookmark Lookup(BOOKMARK:([Bmk1000]), OBJECT:([PriceAvisPr].[dbo].[Batch]))
|--Index Seek(OBJECT:([PriceAvisPr].[dbo].[Batch].[IX_Batch]), SEEK:([Batch].[Status]='Batch Completed') ORDERED FORWARD)
StmtText
---------------------------------------------------------------------------------
SELECT *
FROM dbo.Batch
WHERE (Status <> 'Batch Completed')
(1 row(s) affected)
StmtText
------------------------------------------------------------------------------------------------------------------------
|--Clustered Index Scan(OBJECT:([PriceAvisPr].[dbo].[Batch].[PK_Batch]), WHERE:([Batch].[Status]<>'Batch Completed'))
View 2 Replies
View Related
Mar 24, 2008
Hi all,
I am experiencing performance problems with one of my stored procedures. When the stored procedure is first compiled and executed, it behaves as expected (it usually takes 1 or 2 seconds to complete). But its performance degrades, so that within a day it usually takes 120 seconds to complete. Once the stored procedure is recompiled, its performance is again as expected.
It is a complex stored procedure with two integer parameters and only one select, but composed of multiple views and sub-queries. We have been trying to break the query into small pieces using temporary tables, but without success. SQL Profiler shows an unusual number of reads when it goes wrong (more than a million reads).
I think the problem is in the execution plan. I know that recompiling the stored procedure fixes the problem, but I do not know exactly when and why it starts to happen.
The stored procedure is running under the following configuration:
- Microsoft SQL Server Standard Edition (64-bit).
- Version: 9.00.1399.06
- RAM: 16 GB
- 8 CPUs
Anyone has any ideas or possible solutions?
Thanks in advance,
Carlos.
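This pattern, fast right after a compile and slow later, often points to a cached plan built for unrepresentative parameter values (parameter sniffing), though that is an assumption rather than a diagnosis. One statement-level workaround sometimes tested on 2005 is OPTION (RECOMPILE) on the expensive SELECT, sketched here with made-up names:

-- Hypothetical shape only; the procedure, parameters and SELECT are placeholders.
CREATE PROCEDURE dbo.MyComplexProc
    @Param1 int,
    @Param2 int
AS
BEGIN
    SELECT COUNT(*)                 -- stands in for the complex multi-view SELECT
    FROM   sys.objects
    WHERE  object_id > @Param1
      AND  schema_id > @Param2
    OPTION (RECOMPILE);             -- compile this statement for the current values on every call
END

Comparing the plan and the Profiler read counts captured on a fast run against a slow run of the same call would confirm or rule this out.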
View 2 Replies
View Related
Aug 5, 2005
The cost of query with usage of functions is as same as that of withoutfunctionsIn the below code, the query cost of insert is 0.02% and two selectstatements costs same 0.04%Declare @t table(mydate datetime)Declare @i intset @i=1while @i<=5000Begininsert into @t values(getdate())set @i=@i+1EndSelect mydate from @tSelect convert(varchar,mydate,112) from @tBut I thought usage of convert function will take more query costWhat do you think of this?Madhivanan
View 5 Replies
View Related
Mar 9, 2006
We've got as slightly unusual scenario happening whereby a statement ispassed to SQL which consists of two parts.BEGIN TRANSACTIONDELETE * FROM WhateverBULK INSERT INTO Whatever...(etc)COMMIT TRANSACTIONThe first is a deletion of the data and the second is the bulk insertof replacement data into that table. The error that we see is aviolation of the primary key (composite).The violation only happens if we run both processes together. If we runone, then the other, it works fine. If we set a line by line insert, itworks fine.My suspicion is that the execution plan that is being run is mostlikely working the two parts in parallel and that the records stillexist at the point that the insert is happening. Truncate is not anoption. The bulk insert was added for performance reasons. There is anoption of trying the bulk insert, and if that fails, do the line byline insert, but it's far from ideal.I think we can probably wrap this into two individual transactionswithin the one statement as follows :BEGIN TRANSACTIONDELETE * FROM WhateverCOMMIT TRANSACTIONBEGIN TRANSACTIONBULK INSERT INTO Whatever...(etc)COMMIT TRANSACTIONWill this give sufficient hint to SQL about the order it processes itso that it completes as we intend and not as it sees being the mostefficient method ?Or, is there a better approach to this ?I've seen that some hints can be passed to SQL for optimizing, but myunderstanding was that it was always better to trust the optimiser andre-work the query as needed.With the server having two processors, is it feasible that one is doingone part and the other processor the other part in parallel ? Willtelling it to use a single processor be worthwhile looking at ? MAXDOP1 ?Finally, I'd imagine that the insert is quicker to process than thedeletion. Is this correct ?ThanksRyan
View 14 Replies
View Related
May 2, 2006
Using SQL Server 2000 SP4.
There is a relatively complex stored procedure that usually completes in less than 20 seconds. Occasionally it times out after 180 seconds. The SP is called via ADO 2.8, using the adCmdStoredProc command type. If I use Profiler to capture the EXEC that ADO sends to run the procedure, and run that from QA, the procedure completes in less than 20 seconds as it should.
The procedure is created WITH RECOMPILE. One additional twist is that sp_setapprole is called from the client before running the procedure in question. This may be irrelevant, because even if I include the same sp_setapprole call when running the procedure from QA, it still executes quickly, and even if I comment out the call to sp_setapprole in the client code, the proc still times out when run from the client.
The only thing that fixes it, at least for a day or two, is DBCC FREEPROCCACHE. So it appears that a bad plan is somehow stuck in memory, is only used when the procedure is called from the client app, and is not flushed even though the procedure was created WITH RECOMPILE.
Other than scheduling the DBCC call to run every night, is there anything else I could try to get this resolved? Thanks.
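One avenue that may be worth ruling out (an assumption, not a confirmed cause): Query Analyzer and ADO connections usually run with different SET options, ARITHABORT in particular, and plans are cached and matched per SET-option combination, so the client connections can end up on a different cache entry than the one QA compiles. On SQL 2000 the cache entries and their option bitmasks can be inspected like this (the procedure name is a placeholder):

-- Multiple rows with different setopts values mean different connections
-- are compiling and reusing separate plans for the same procedure.
SELECT  cacheobjtype,
        objtype,
        setopts,      -- bitmask of the SET options in effect at compile time
        usecounts,
        sql
FROM    master.dbo.syscacheobjects
WHERE   dbid  = DB_ID()
  AND   objid = OBJECT_ID('dbo.MyTimingOutProc');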
View 5 Replies
View Related
May 31, 2006
I was hoping someone could shed some light on a weird situation I'm experiencing. I have the following query:
select count(*) LeadCount
from auto_leads al
where received > dbo.GetDay(GetDate())
dbo.GetDay simply returns a smalldatetime value of today's date.
Now I recently got thrown into a data mess, and for some reason this query takes 8 seconds to run. The first thing I did was update the stats on the Received column of this auto_leads table. I re-ran the query and I'm still getting 8 seconds. I look at the execution plan but I can't figure out why this is happening.
I then change the above query so the filter received > dbo.GetDay(GetDate()) is now just received > '5/31/2006' and the query comes back immediately. This doesn't make sense to me because the GetDay function is really simple and comes back immediately. I then try the following query to confirm it isn't a problem with the GetDay function:
declare @Today DateTime
set @Today = dbo.getday(GetDate())
select count(*) leads
from auto_leads al
join type_lead_status tls on (tls.type_lead_status_id = al.type_lead_status_id)
where received > @Today
Sure enough, the query came back immediately. The next thing to go through my mind was that the query execution plan had been cached by SQL Server using the execution plan from before I updated the stats on the received column. So I executed sp_recompile 'auto_leads' and tried the original query again. Still taking 8-10 seconds to come back.
So my question, is why when I remove the GetDay function call in my query filter is the query slow, as opposed to me just passing a variable into the query? Thanks!
- James
View 6 Replies
View Related
Apr 27, 2007
I'm new to sql server 2005 and was reviewing the execution plan on one of my queries.
I have a query that selects about 62,000 rows from a table of about 20 million
I see there is a index seek indicated but further down the execution plan I see that a large percent is being assigned to a RID LOOKUP on the same table.
Should I be concerned with this and if so, what would you recommend I do to correct it?
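If the lookup turns out to dominate, the usual fix is to make the nonclustered index cover the query, so each of the roughly 62,000 seek hits no longer needs a trip back to the heap. A sketch with hypothetical names (the real column list has to match what the query selects and filters on):

-- SearchCol is the seek column; ColA and ColB stand in for the other columns
-- the query returns. INCLUDE (SQL Server 2005+) stores them at the leaf level
-- without widening the index key.
CREATE NONCLUSTERED INDEX IX_BigTable_SearchCol_Covering
    ON dbo.BigTable (SearchCol)
    INCLUDE (ColA, ColB);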
View 12 Replies
View Related
Sep 5, 2006
Hi,
We migrated our database from SQL Server 2000 to Yukon last week. Now, when we run our application, it has slowed down. We analyzed some stored procedures and they seem to have degraded. The execution plans have changed. It looks like a lot of work if we have to tune each query with respect to its new execution plan; our application has around 4000 stored procs. Is anyone aware of a generic pattern or solution with which these execution plan problems can be resolved? Also, does the new execution plan ensure that once we tune the stored procs they will perform better than on SQL Server 2000?
Need help on this, otherwise it seems we might have to move back to 2000.
Thanks in Advance
Ritesh
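Not a fix for every regression, but the step most commonly suggested before query-by-query tuning after a 2000-to-2005 migration is refreshing the statistics and usage counters the old database brought with it, since the 2005 optimizer makes different choices and is sensitive to stale information. A sketch (the database name is a placeholder, run once per migrated database):

USE YourMigratedDatabase;
GO
DBCC UPDATEUSAGE (0);      -- fix row and page counts carried over from SQL 2000
GO
EXEC sp_updatestats;       -- refresh statistics on all user tables
GO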
View 2 Replies
View Related
Jul 5, 2007
Hi all,
I'm running a test regarding the image data type. The test program is written with the SQL native API and just updates the image data type column, but I watched the SQL Compilations/sec and Batch Requests/sec counters under SQLServer:SQL Statistics using Perfmon, and both values are almost the same. It seems that whenever the stored procedure is called, SQL Server compiles it and builds the execution plan again. But when I ran a test without the image data type, SQL Compilations/sec was 0. The SQL version is Microsoft SQL Server 2005 - 9.00.3054.00 (Intel X86) (Build 2600: Service Pack 2).
Is SQL server working the way expected or am I missing something?
View 1 Replies
View Related
Jun 19, 2003
Is there any way to force SQL Server to use the same execution plan?
One of the SPs for a web page takes about 2 minutes to execute. Once it's executed through Query Analyzer, it takes considerably less time.
Is there any explanation for this?
View 5 Replies
View Related
Apr 8, 2014
I want to find out more information about the execution plan. I saw Parallelism (Gather Streams) in the execution plan. In what situations do we see this icon? And if we need to avoid it, how can we avoid it?
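For reference, Parallelism (Gather Streams) simply merges the rows coming from several parallel worker threads back into a single stream, so it appears whenever the optimizer chooses a parallel plan for an expensive query. It is not wrong in itself, but if it has to be suppressed for one query, a per-statement hint is usually preferred over lowering the server-wide "max degree of parallelism" setting (sketch; the query is a placeholder):

SELECT COUNT(*)
FROM   sys.objects AS o
JOIN   sys.columns AS c ON c.object_id = o.object_id
OPTION (MAXDOP 1);    -- serial plan for this statement only; no parallelism operators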
View 5 Replies
View Related
Jul 4, 2014
We have got a query for fine-tuning and it uses a lot of CTEs. How can I check the execution plan for it?
CREATE VIEW Mercy
AS
with ADR
as
(
SELECT urpx.RoleID ,
urx.UserID
FROM [DBA].dbo.URPX WITH ( NOLOCK )
INNER JOIN [DBA].dbo.URX WITH ( NOLOCK ) ON urpx.RoleID = urx.RoleID
WHERE PermissionID = '1'
),
SDR
as
(
-- Collect the roles that a configured with Sales Team Create permission
-- This will include Sales Director , Suite Admin,
SELECT urpx.RoleID
FROM [DBA].dbo.URPX WITH ( NOLOCK )
INNER JOIN [DBA].dbo.URX WITH ( NOLOCK ) ON urpx.RoleID = urx.RoleID
LEFT OUTER JOIN ADR ON ADR.UserID = urx.UserID
WHERE ADR.RoleID IS NULL
AND PermissionID='2'
)
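For what it's worth, a sketch that assumes the view definition above is completed with its final SELECT: CTEs do not get separate plans, they are expanded into whichever statement references them, so capturing the plan on the outer query shows the ADR and SDR branches like any other subquery:

SET STATISTICS XML ON;       -- or Ctrl+M in SSMS for the graphical actual plan
SELECT * FROM dbo.Mercy;     -- the CTE logic appears inside this statement's plan
SET STATISTICS XML OFF;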
View 6 Replies
View Related
Sep 29, 2014
I have an execution plan that is huge; the PDF it generates if I print it is over 1000 pages. Is there a way to change the graphical plan into a table, so I can sort by the percentages and find the items that are taking the longest?
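One option that may help (a sketch; the joined catalog views are only a placeholder query): the old tabular plan output returns one row per operator with its estimated cost and actual row counts, and that result set can be copied into a table or spreadsheet and sorted:

SET STATISTICS PROFILE ON;   -- executes the query and returns a row per plan operator
SELECT o.name, c.name
FROM   sys.objects AS o
JOIN   sys.columns AS c ON c.object_id = o.object_id;
SET STATISTICS PROFILE OFF;  -- sort the returned TotalSubtreeCost / Rows columns as needed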
View 4 Replies
View Related
May 15, 2015
I have the same query, but when executed on different servers it uses a different plan. When it runs on the QA box it is fast, and when it runs on PRD it is slow.
Is it possible to force SQL Server to use QA plan by giving a hint?
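It is possible, though usually treated as a last resort, since different data volumes, statistics or settings on PRD are more often the real cause. The general approach (sketched with placeholder names) is to capture the XML plan on the QA server and attach it to the query on PRD:

-- On the QA server: pull the cached XML plan for the statement in question
SELECT  qp.query_plan
FROM    sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE   st.text LIKE '%distinctive text from the query%';   -- placeholder filter

The XML returned can then be supplied on PRD inside an OPTION (USE PLAN N'...') hint, or bound without touching the application code through sp_create_plan_guide.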
View 2 Replies
View Related
Jun 5, 2015
If a T-SQL query runs perfectly in development, then when I execute it in production I want it to use the execution plan from development. How can I do that using the cache? I know we can use the USE PLAN hint, but I want to do it with the cache.
View 1 Replies
View Related
Apr 4, 2008
What's the use of display estimated execution plan....
View 4 Replies
View Related