After installing SP2 for SQL Server 2005, I get strange errors when using either "Estimated Execution Plan" or "Actual Execution Plan".
Using "Estimated Execution Plan" generates this error
An error occurred while executing batch. Error message is: Error processing execution plan results. The error message is:
There is an error in XML document (1, 3998).
Unexpected end of file while parsing has occurred. Line 1, position 3998.
Using "Actual Execution Plan" generates this error
An error occurred while executing batch. Error message is: Error processing execution plan results. The error message is:
There is an error in XML document (1, 4001).
Unexpected end of file has occurred. The following elements are not closed: RelOp, ComputeScalar, RelOp, Update, RelOp, QueryPlan, StmtSimple, Statements, Batch, BatchSequence, ShowPlanXML. Line 1, position 4001.
All I am doing is testing this solution:

declare @t table (dt datetime)

insert @t
select '02-Jan-2007' union all
select '01-Feb-2007' union all
select '10-Feb-2007' union all
select '28-Feb-2007' union all
select '18-Mar-2007'

DECLARE @DaysRange INT,
        @NumOfCalls INT

SELECT @DaysRange = 35,
       @NumOfCalls = 4

SELECT t1.dt AS theDate
FROM @t AS t1
CROSS JOIN @t AS t2
WHERE t2.dt >= DATEADD(day, DATEDIFF(day, 0, t1.dt), 0)
      AND t2.dt < DATEADD(day, DATEDIFF(day, 0, t1.dt), @DaysRange)
   OR t2.dt >= DATEADD(day, DATEDIFF(day, @DaysRange, t1.dt), 1)
      AND t2.dt < DATEADD(day, DATEDIFF(day, 0, t1.dt), 1)
GROUP BY t1.dt
HAVING COUNT(*) >= @NumOfCalls

Has anyone else experienced this?
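One hedged way to narrow this down is to request the plan as raw XML in T-SQL, bypassing the SSMS plan viewer; if the XML comes back complete, the problem is likely client-side. A minimal sketch; substitute the repro script above for the sample statement:

SET SHOWPLAN_XML ON
GO
-- sample statement only; replace with the batch above
SELECT name FROM sys.objects
GO
SET SHOWPLAN_XML OFF
GO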
Can anyone help me with this strange problem please? I have a stored procedure with a parameter defined as a uniqueidentifier. The procedure does a select with a number of joins, and filters within the WHERE clause using this parameter.

(@orderHeader_id uniqueidentifier)

SELECT *
FROM originalOrderHeader ooh
INNER JOIN originalOrderLine ool
    ON ooh.id = ool.id
FULL OUTER JOIN orderLine ol
    ON ool.id = ol.id
    AND ool.productCode = ol.productCode
WHERE (ooh.id = @orderHeader_id)

There is a clustered index on the id column of originalOrderHeader, and on id and productCode of both originalOrderLine and orderLine. These indexes are regularly rebuilt. The execution plan shows a seek against these indexes, but the estimated row count values are huge, and should be single figures.

If I change the SP to accept the parameter as a varchar, and then explicitly cast back to a uniqueidentifier in the WHERE clause, the SP runs much, much quicker, the execution plan is doing far less work, and crucially the estimated row counts on the clustered index seeks are correct (single figures). Does anyone have any ideas what might be causing this?
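For reference, a minimal sketch of the varchar workaround described above; the procedure name is a hypothetical stand-in, and only the parameter handling is the point:

CREATE PROCEDURE dbo.GetOriginalOrder   -- hypothetical name
    @orderHeader_id varchar(36)         -- uniqueidentifier passed as text
AS
BEGIN
    SELECT *
    FROM originalOrderHeader ooh
    INNER JOIN originalOrderLine ool
        ON ooh.id = ool.id
    FULL OUTER JOIN orderLine ol
        ON ool.id = ol.id
        AND ool.productCode = ol.productCode
    -- explicit cast back to uniqueidentifier in the WHERE clause
    WHERE ooh.id = CAST(@orderHeader_id AS uniqueidentifier)
END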
The benefit of the actual execution plan is that you can see the actual number of rows passing through each step, compared to the estimated number of rows. But what about the "cost percentages"? I believe I've read somewhere that these percentages are still just estimates and are not based on the real execution. Does anyone know, and preferably have a link to something that documents it?

Thanks
I am trying to export data from my local server to the hosting server. However I get errors when executing it:
Validating (Error)
Messages
Error 0xc0202049: Data Flow Task: Failure inserting into the read-only column "ID". (SQL Server Import and Export Wizard)
Error 0xc0202045: Data Flow Task: Column metadata validation failed. (SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task: "component "Destination 7 - Batches" (497)" failed validation and returned validation status "VS_ISBROKEN". (SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task: One or more component failed validation. (SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task: There were errors during task validation. (SQL Server Import and Export Wizard)
I have a query like the one below. If I add "where Served = 1", the query takes forever; if I remove it, it takes only 6 seconds.
I am not sure why this is happening.
select distinct
    a.Episode_Key,
    case when ag.Category IN ('ASMI', 'COOC', 'SPCL') then 'SMI'
         when ag.Category = 'SEDC' then 'SED'
         when ag.Category = 'ACCA' then 'SA'
         when ag.Category like 'CGA%' then 'Gam'
    end as [Category],
    ag.Agreement_Type_Name as [Agreement],
    p.ServiceProvider,
    s2.Served
from dbo.Assessment a
INNER JOIN (
    select distinct Episode_Key, p.ServiceProvider, max(CSDS_Object_Key) as [Sequence]
    from dbo.Assessment a
    INNER JOIN dbo.CD_Provider_Xref p ON a.Provider_CD = p.Provider_CD
    where Creation_DT >= '07/01/2007' and Reason_CD = 1
    group by Episode_Key, p.ServiceProvider
) as s1 ON a.CSDS_Object_Key = s1.Sequence
INNER JOIN dbo.CD_Provider_XREF p ON a.Provider_CD = p.Provider_CD
INNER JOIN dbo.CD_Agreement_Type ag ON ag.Agreement_Type_CD = a.Agreement_Type_CD
LEFT OUTER JOIN (
    select distinct Episode_Key, p.ServiceProvider, 1 as [Served]
    from dbo.Encounters e
    INNER JOIN dbo.CD_Provider_Xref p ON e.Provider_CD = p.Provider_CD
    where Encounter_Begin_DT between '01/01/2008' and '01/31/2008'
      and Procedure_CD is not null
      and Encounter_Units > 0
) as s2 ON a.Episode_Key = s2.Episode_Key and p.ServiceProvider = s2.ServiceProvider
--where Served = 1
group by
    a.Episode_Key,
    ag.Agreement_Type_Name,
    p.ServiceProvider,
    Served,
    case when ag.Category IN ('ASMI', 'COOC', 'SPCL') then 'SMI'
         when ag.Category = 'SEDC' then 'SED'
         when ag.Category = 'ACCA' then 'SA'
         when ag.Category like 'CGA%' then 'Gam'
    end
I'm having an odd problem with some SQL CE statements I'm trying to use.
SELECT [PK Product ID] FROM [tblProduct Header] WHERE ([Description] = 'World''s Greatest 3D Mug - 18th Birthday')
Gives me the error message:
Major Error 0x8007007A, Minor Error 0
> SELECT [PK Product ID] FROM [tblProduct Header] WHERE ([Description] = 'World''s Greatest 3D Mug - 18th Birthday')
The data area passed to a system call is too small.
But this query does not:
SELECT [PK Product ID] FROM [tblProduct Header] WHERE ([Description] = 'Worlds Greatest 3D Mug - 18th Birthday')
and nor does this:
SELECT [PK Product ID] FROM [tblProduct Header] WHERE ([Description] = 'World''s Greatest 3D Mug 18th Birthday')
It's as though SQL CE doesn't like a combination of single quotes and hyphens in the criteria. If I remove the quote, it works; if I put the quote back in and remove the hyphen, it works. The data I'm trying to look up is:
World's Greatest 3D Mug - 18th Birthday
So I really need to be able to look for text that contains both quotes and hyphens.
ID, Operator Emailed, Operator Net sent, Operator Paged, Retries Attempted

09/24/2007 15:00:00, Hourly Extract From OReSA, Error, 0, INHSCTSTTOMVM82, Hourly Extract From OReSA, (Job outcome), , The job failed. The Job was invoked by Schedule 7 (Hourly). The last step to run was step 1 (OReSA Extract)., 00:00:59, 0, 0, , , , 0

09/24/2007 15:00:01, Hourly Extract From OReSA, Error, 1, INHSCTSTTOMVM82, Hourly Extract From OReSA, OReSA Extract, , Executed as user: INENV\ts_hia. ...hod call failed. End Error Error: 2007-09-24 15:00:58.93 Code:
I'm new to SQL server but familiar enough with databases to know this doesn't seem right. Here's the situation: I have a table with real estate property information. There are about 650,000 rows in it. I have a nonclustered non-unique index on the city where the property is located. There are about 40 unique values in this index.
I do a simple query like: SELECT city, address FROM propinfo WHERE city = 'CARLSBAD'. The query will return about 4,000 rows. The problem is that the execution plan it chooses is a full table scan, i.e. even though there is an index on city, it chooses to look through 650,000 rows rather than do an index seek. Something sounds inefficient here. BTW, this happens in both SQL 7 and SQL 2000. Can anyone explain why this happens? I have to think that SQL Server can be more efficient than this.
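For what it's worth, two things that are often tried in this situation: a covering index, so the seek no longer needs a bookmark lookup per row (the usual reason the optimizer prefers a scan at 4,000 of 650,000 rows), or an index hint to compare plans. A hedged sketch; the index names are hypothetical:

-- Covering index: the query can then be answered from the index alone
CREATE INDEX IX_propinfo_city_address ON propinfo (city, address)

-- Index hint, to compare the optimizer's choice against a forced seek
SELECT city, address
FROM propinfo WITH (INDEX(IX_propinfo_city))
WHERE city = 'CARLSBAD'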
Hello, I have been looking at the execution plan for a procedure call. The select, compute scalar, stream aggregate, constant scan, nested loops, and assert operators are all at 0% cost; the PK costs are 2%, apart from a rogue 7% and a few at 20%; table scans are all at 23%. The query cost relative to the batch is 100%. What does this all mean?

I have put non-clustered indexes on all the table attributes that are involved in the select statements, but this has made no difference. I am guessing this is because my tables are not heavily populated, and I might have seen a difference if I had thousands of entries in the tables the select statements acted on. Is this assumption correct? Does anyone else bother using the execution plan to tweak their DB, or is it a negligible tool?
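One hedged way to see whether those indexes change the real work done, rather than reading the cost percentages (which are estimates), is to compare logical reads before and after adding an index; the table and filter here are hypothetical:

SET STATISTICS IO ON
GO
-- hypothetical query; per-table logical reads appear in the Messages tab
SELECT col1 FROM dbo.MyTable WHERE col2 = 42
GO
SET STATISTICS IO OFF
GO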
In SQL Server 2005 Management Studio, where do I find the option to run the SQL query in the query analyser and also show the execution plan? At present I see the option under the Query menu, "Display Estimated Execution Plan", which only shows the plan but does not execute the query.
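For reference, the Query menu in Management Studio also has "Include Actual Execution Plan", which executes the query and shows the plan. The T-SQL counterpart, which executes the statement and returns plan rows alongside the results, is SET STATISTICS PROFILE; a minimal sketch, with any statement standing in for the sample:

SET STATISTICS PROFILE ON
GO
SELECT name FROM sys.databases   -- sample statement
GO
SET STATISTICS PROFILE OFF
GO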
Does anyone know of a good way to copy the execution plan when using "Include Actual Execution Plan"? I often need to copy this and mail it.
I know I can use the Print Screen button, but I need a more efficient way to do this. If I could just right-click the execution plan and select "Copy" to get the complete plan, that would be great.
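A hedged alternative on SQL 2005: pull the plan XML straight from the cache with the DMVs below; the XML can be saved with a .sqlplan extension and mailed, and it reopens as a graphical plan. Note this is the cached (estimated) shape of the plan, without the actual row counts:

SELECT TOP 10
    st.text,
    qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.last_execution_time DESC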
Which of the following does NOT cause the execution plan of a query to be recompiled?

- a new column is added to a table accessed by the query, OR
- an index used by the query has been dropped from the database, OR
- the query performs a join to return data from multiple tables, OR
- a significant amount of data in a table has been modified
Hi, I have a table-valued user-defined function (UDF), my_fnc. The execution of the statement "select * from my_fnc" takes a much longer time than running the code inside my_fnc (with the necessary changes). What can be the reason? How can I see the execution plan used for the UDF?

Thanks a lot
Martin
I am getting linked server connectivity errors randomly. We have a set of DTS packages to pull data from OLTP (SQL 2000) to a reporting server (SQL 2005) through a linked server, and both instances are on the same server. The process runs OK on the Acceptance server but fails on the production server. It throws the following error randomly, and it gets fixed for a while after restarting the SQL 2000 service.
An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available. ". An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "OLE DB provider "SQLNCLI" for linked server "GIS_STG" returned message "Communication link failure".". An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "[DBNETLIB][ConnectionRead (recv()).]General network error. Check your network documentation.".
We seem to be having a rather strange issue. We just finished migrating from 2000 to 2005. I've been working on converting packages from DTS to SSIS. Everything has been going well in testing, but a problem has shown up going into production. The first package I deployed ran significantly slower on the production server, going from about 7 seconds to 90 seconds. The production server is much more powerful, so this was rather confusing. The major difference is that the server is running 64 bit.
After searching around, I haven't found much that is helpful. To test things out, I created a blank package that does absolutely nothing. All it contains is a single connection manager (OLE DB). Executing this package takes 45 seconds. If I remove the connection manager, it executes in 0.1 seconds. Setting DelayValidation to TRUE has no effect. If the connection is to a nonexistent server, or if an incorrect login is used, the execution time is still 45 seconds. Adding a second connection manager increases execution time to 75 seconds. The package runs on my local computer in 0.1 seconds.
Any ideas what could be going on here? I can't believe that no one would have seen an issue like this. Is there some sort of strange configuration issue going on here?
Hi, I want to access the real execution plan via my web application after I have executed an SQL statement. I know how to get the estimated execution plan:

1   cmd.CommandText = "SET SHOWPLAN_XML ON";
2   cmd.ExecuteNonQuery();
3
4   cmd.CommandText = myStatement;
5   SqlDataReader dataReader = cmd.ExecuteReader();
6
7   String plan = String.Empty;
8
9   while (dataReader.Read()) {
10      plan += dataReader.GetSqlString(0).ToString();
11  }
12
13  cmd.CommandText = "SET SHOWPLAN_XML OFF";
14  cmd.ExecuteNonQuery();

I want to compare the estimated costs with the real costs of the same statement. If I change code lines 1 and 13 to "SET STATISTICS XML [ON|OFF]", the string "plan" will contain the result of the submitted SELECT statement, but I just need to get the plan, not the result itself.

Thanks in advance,
Dominik
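With SET STATISTICS XML ON, the plan (with actual row counts) comes back as an additional result set after each statement's rows, so in ADO.NET the reader presumably needs to advance past the row result set (e.g. with dataReader.NextResult()) before reading the plan. The client-side step is an assumption; the server-side behaviour can be seen directly:

SET STATISTICS XML ON
GO
-- the statement's rows come back first, then a second result set
-- containing the XML plan with actual row counts
SELECT name FROM sys.objects
GO
SET STATISTICS XML OFF
GO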
I wanted to know whether we have an execution plan feature in SQL 6.5 as we have in SQL 7.0 and SQL 2000, i.e. when we execute a query with 'Show Execution Plan' enabled, it creates a map and shows the vital statistics. If that is available in SQL 6.5, then I am missing that tool.
How can I have it installed on my SQL 6.5 server?
I want to know how to analyze the query execution plan for complex queries, and what information from it is useful for improving performance. I have gone through the details on some sites like sql-server-performance.com (http://www.sql-server-performance.com/query_execution_plan_analysis.asp), where it was more generic. I want more info regarding this.
Can any one tell about the resources for this or do you have any white papers or documents, which you can share with me.
I am experiencing performance problems with one of my stored procedures. When the stored procedure is first compiled and executed, it behaves as expected (it usually takes 1 or 2 seconds to complete). But its performance degrades, so within a day it usually takes 120 seconds to complete!!! Once the stored procedure is recompiled, its performance is again as expected.
It is a complex stored procedure with two integer parameters and only one select, but composed of multiple views and sub-queries. We have been trying to break the query into small pieces using temporary tables, but without success. The SQL Profiler shows an unusual number of reads when it goes wrong (more than a million reads).
I think the problem is in the execution plan. I know that recompiling the stored procedure fixes the problem, but I do not know exactly when and why it starts to happen.
The stored procedure is running under the following configuration:
- Microsoft SQL Server Standard Edition (64-bit)
- Version: 9.00.1399.06
- RAM 16 MB
- 8 CPUs
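This pattern (fast after a recompile, then degrading) is often attributed to parameter sniffing: the plan cached for one parameter value gets reused for values it suits badly. A hedged sketch of two common mitigations; the procedure and table are hypothetical stand-ins:

-- Mitigation 1: copy parameters into local variables, so the optimizer
-- estimates from average density instead of the sniffed values
CREATE PROCEDURE dbo.GetOrders
    @CustomerID int
AS
BEGIN
    DECLARE @LocalID int
    SET @LocalID = @CustomerID

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @LocalID
END

-- Mitigation 2: create the procedure WITH RECOMPILE so a fresh plan is
-- built on every call, trading compile time for a plan that fits the values
-- CREATE PROCEDURE dbo.GetOrders @CustomerID int WITH RECOMPILE AS ...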
The cost of a query that uses functions is the same as that of one without functions. In the code below, the query cost of the insert is 0.02%, and the two select statements cost the same, 0.04% each:

Declare @t table(mydate datetime)
Declare @i int
set @i=1
while @i<=5000
Begin
    insert into @t values(getdate())
    set @i=@i+1
End

Select mydate from @t
Select convert(varchar,mydate,112) from @t

But I thought usage of the convert function would add more query cost. What do you think of this?

Madhivanan
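Plan cost percentages are estimates and may not reflect the per-row CPU of a scalar function, so one hedged way to compare the two selects is to measure real time instead:

Declare @t table(mydate datetime)
Declare @i int
set @i=1
while @i<=5000
Begin
    insert into @t values(getdate())
    set @i=@i+1
End

-- CPU and elapsed time per statement are reported in the Messages tab
SET STATISTICS TIME ON
Select mydate from @t
Select convert(varchar,mydate,112) from @t
SET STATISTICS TIME OFF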
We've got a slightly unusual scenario happening whereby a statement is passed to SQL which consists of two parts:

BEGIN TRANSACTION
DELETE FROM Whatever
BULK INSERT Whatever FROM ... (etc)
COMMIT TRANSACTION

The first is a deletion of the data and the second is the bulk insert of replacement data into that table. The error that we see is a violation of the primary key (composite).

The violation only happens if we run both processes together. If we run one, then the other, it works fine. If we do a line by line insert, it works fine.

My suspicion is that the execution plan that is being run is most likely working the two parts in parallel, and that the records still exist at the point that the insert is happening. Truncate is not an option. The bulk insert was added for performance reasons. There is an option of trying the bulk insert and, if that fails, doing the line by line insert, but it's far from ideal.

I think we can probably wrap this into two individual transactions within the one statement as follows:

BEGIN TRANSACTION
DELETE FROM Whatever
COMMIT TRANSACTION

BEGIN TRANSACTION
BULK INSERT Whatever FROM ... (etc)
COMMIT TRANSACTION

Will this give sufficient hint to SQL about the order it processes them, so that it completes as we intend and not as it sees being the most efficient method? Or is there a better approach to this?

I've seen that some hints can be passed to SQL for optimizing, but my understanding was that it was always better to trust the optimiser and re-work the query as needed.

With the server having two processors, is it feasible that one is doing one part and the other processor the other part in parallel? Will telling it to use a single processor be worthwhile looking at? MAXDOP 1?

Finally, I'd imagine that the insert is quicker to process than the deletion. Is this correct?

Thanks
Ryan
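On the MAXDOP question: parallelism in SQL Server happens within a single statement, and consecutive statements in a batch execute sequentially, so the DELETE should complete before the BULK INSERT starts in either version. For reference, the per-statement form of the cap the post mentions (whether it helps here is doubtful for that reason):

-- Per-statement parallelism cap; applies inside this one statement only
DELETE FROM Whatever OPTION (MAXDOP 1)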
Using SQL Server 2000 SP4. There is a relatively complex stored procedure that usually completes in less than 20 seconds. Occasionally it times out after 180 seconds. The SP is called via ADO 2.8, using the adCmdStoredProc command type. If I use Profiler to capture the EXEC that ADO sends to run the procedure, and run that from QA, the procedure completes in less than 20 seconds as it should. The procedure is created WITH RECOMPILE.

One additional twist is that sp_setapprole is called from the client before running the procedure in question. This may be irrelevant, because even if I include the same sp_setapprole call when running the procedure from QA, it still executes quickly, and even if I comment out the call to sp_setapprole in the client code, the proc still times out when run from the client.

The only thing that fixes it, at least for a day or two, is DBCC FREEPROCCACHE. So it appears that a bad plan is somehow stuck in memory, is only used when the procedure is called from the client app, and is not flushed even though the procedure was created WITH RECOMPILE.

Other than scheduling the DBCC call to run every night, is there anything else I could try to get this resolved? Thanks.
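If a scheduled flush does prove necessary, a narrower alternative to clearing the whole procedure cache is to mark just the one procedure; the name here is a hypothetical stand-in:

-- Flags the procedure so its plan is rebuilt on the next execution
EXEC sp_recompile 'dbo.TheComplexProc'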
I was hoping someone could shed some light on wierd situation i'm experiencing. I have the following query:
select count(*) LeadCount from auto_leads al where received > dbo.GetDay(GetDate())
dbo.GetDay simply returns a smalldatetime value of today's date.
Now I recently got thrown into a data mess, and for some reason this query takes 8 seconds to run. The first thing I did was update the stats on the Received column of the auto_leads table. I re-ran the query and I'm still getting 8 seconds. I looked at the execution plan, but I can't figure out why this is happening.
I then change the above query so the filter received > dbo.GetDay(GetDate()) is now just received > '5/31/2006' and the query comes back immediately. This doesn't make sense to me because the GetDay function is really simple and comes back immediately. I then try the following query to confirm it isn't a problem with the GetDay function:
declare @Today DateTime
set @Today = dbo.getday(GetDate())
select count(*) leads from auto_leads al join type_lead_status tls on (tls.type_lead_status_id = al.type_lead_status_id) where received > @Today
Sure enough, the query came back immediately. The next thing to go through my mind was that the query execution plan had been cached by SQL Server from before I updated the stats on the received column. So I executed sp_recompile 'auto_leads' and tried the original query again. It is still taking 8-10 seconds to come back.
So my question is: why is the query slow when the GetDay function call is in my query filter, as opposed to just passing a variable into the query? Thanks!
I'm new to sql server 2005 and was reviewing the execution plan on one of my queries.
I have a query that selects about 62,000 rows from a table of about 20 million.
I see there is an index seek indicated, but further down the execution plan I see that a large percentage of the cost is being assigned to a RID LOOKUP on the same table.
Should I be concerned with this and if so, what would you recommend I do to correct it?
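A RID lookup means the base table is a heap and the seek has to fetch the remaining columns row by row, which gets expensive at 62,000 rows. The usual remedies are a covering index or a clustered index on the table. A hedged sketch with hypothetical names, using the INCLUDE syntax new in 2005:

-- Covering index: the lookup disappears when the index carries
-- every column the query references
CREATE NONCLUSTERED INDEX IX_BigTable_FilterCol
ON dbo.BigTable (FilterCol)
INCLUDE (OtherCol1, OtherCol2)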
We migrated our database from SQL Server 2000 to Yukon last week. Now, when we run our application, it has slowed down. We analyzed some stored procedures and they seem to have degraded; the execution plans have changed. It looks like a lot of work if we have to tune each query with respect to its new execution plan, as our application has around 4000 stored procs. Is anyone aware of some generic pattern or solution by which these execution plan problems can be resolved? Also, does the new execution plan ensure that once we tune the stored procs, they will perform better than on SQL Server 2000?
We need help on this; otherwise, it seems we might have to move back to 2000.
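One commonly suggested first step after a 2000-to-2005 migration, before tuning procedures one by one, is to refresh statistics and usage data so the new optimizer is not working from stale information; a hedged sketch, not a guaranteed fix:

-- Refresh distribution statistics across the current database
EXEC sp_updatestats

-- Correct row and page counts carried over from SQL 2000
DBCC UPDATEUSAGE (0)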
I'm running a test involving the image data type. The test program is written with the SQL native API and just updates an image column, but when I watch the SQL Compilations/sec and Batch Requests/sec counters in SQLServer:SQL Statistics using Perfmon, both values are almost the same. It seems that whenever the stored procedure is called, SQL Server compiles it and builds the execution plan again. But when I ran the test without the image data type, SQL Compilations/sec was 0. The SQL version is Microsoft SQL Server 2005 - 9.00.3054.00 (Intel X86) (Build 2600: Service Pack 2).
Is SQL Server working as expected, or am I missing something?
I have a SQL 2000 server that had a database called ABC, which was moved to another server on 5-15-2007. I kept the ABC database in Read_Only mode for a few days (just in case) on the old server and finally dropped it on 5-20-2007, and I think I forgot to drop the associated logins. I then started seeing "Login failed for user 'xyz'" in the error logs. When I first noticed the login failure in the SQL Server log, I deleted the xyz login, but it did not stop the errors.
I have been trying since then with no luck in identifying the cause or resolving the issue. I ran a SQL Profiler trace and caught the hostnames, the NT usernames in a few cases, and the application names, and contacted the application owners and users (who appear in the trace) to stop any Windows service or scheduled job that might still be pointing at the old server, but the bad luck is they are not aware of anything running against it. The worst part is that the users whose hostnames show up in the trace have never used the ABC database and have no idea about it.
Here is what I found in the profiler trace:

TextData: Login failed for user 'xyz'
LoginName: xyz
NTUserName: AB00007
HostName: WAB000007
ApplicationName: Microsoft (r) Windows Script Host

Today I created the xyz login on the server and assigned it the model database with reader permission, to see if it would log some different error, but nothing new (the same login failure). Finally, I had no solution other than restoring the ABC database back to my old server and setting it into Read_Only mode to stop the errors, and now I see the login 'xyz' firing queries against the database.
Any hints or pointers as to why this is happening, and any possible solutions?
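Now that the login connects successfully against the restored database, the sessions should be visible server-side, which may pin down the source more reliably than a trace; a minimal sketch for SQL 2000:

-- Show where connections under the suspect login are coming from
SELECT spid, loginame, hostname, program_name, login_time
FROM master.dbo.sysprocesses
WHERE loginame = 'xyz'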
Why does

SELECT * FROM a LEFT OUTER JOIN b ON a.id = b.id

instead of

SELECT * FROM a LEFT JOIN b ON a.id = b.id

generate a different execution plan?

My query is more complex, but when I change "LEFT OUTER JOIN" to "LEFT JOIN" I get a different execution plan, which absolutely baffles me! Especially considering that everything I know, and everything I was able to research, essentially says the "OUTER" is implied in "LEFT JOIN".
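"OUTER" is indeed an optional keyword: "LEFT JOIN" and "LEFT OUTER JOIN" are the same operator, so identical queries should compile to identical plans, and a difference usually points at something else changing between runs (cache state, SET options, statistics). A hedged way to compare the two forms side by side, using the tables from the example above:

SET SHOWPLAN_TEXT ON
GO
SELECT * FROM a LEFT OUTER JOIN b ON a.id = b.id
GO
SELECT * FROM a LEFT JOIN b ON a.id = b.id
GO
SET SHOWPLAN_TEXT OFF
GO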