Does someone know whether it is better to drop and reload or sp_recompile a stored procedure to get a new, recompiled execution plan? I have another DBA telling me it is better to drop and reload the stored procedure rather than use sp_recompile. I would think that sp_recompile would be the preferred method.
I know the documentation always says that sp_recompile will force a stored procedure to recompile the next time it is executed. However, I am not seeing the recompiles in a SQL Trace when capturing SP:Recompile events. I have tried this on many different database servers, using sp_recompile and also the WITH RECOMPILE option when creating the proc.
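For what it's worth, a minimal sketch of the options being compared (object names are hypothetical); also note that from SQL Server 2005 onward recompiles happen at the statement level, so they tend to show up in a trace as SQL:StmtRecompile rather than SP:Recompile:

-- Mark one procedure so its cached plan is thrown away and rebuilt on the next execution
EXEC sp_recompile 'dbo.usp_MyProc';

-- Marking a table flags every procedure and trigger that references it
EXEC sp_recompile 'dbo.MyTable';

-- Alternatively, build recompilation into the procedure itself:
-- CREATE PROCEDURE dbo.usp_MyProc WITH RECOMPILE AS ...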
I am working on a master page and I want the page to update itself when a new value comes in. For example, my master page by default shows Login and Register (both are set as LinkButtons):
Login | Register
Then on the login page, when the person successfully logs in to his/her account, the master page should update so that the Login and Register LinkButton text changes to "Welcome John" and "Logout".
I have a problem getting a dropdown list to change a text field every time I select a new value in it.
I have a parameter called "Type" that is selected from a non-queried drop down list. This contains values like "Month", "Day", "Quarter", "Week", etc. I have a query that takes those parameters and then returns a "TimeInterval" for each one. I want this value to appear in a text box (which is a MaxDate parameter for an entirely separate query) whenever the user selects a value from Type. However there is a catch: the user needs to be able to also change the value of the textbox so that he/she can customize the end date for the report rather than using the values that I have pre-canned for them. This is a bit confusing, so I'll try to give an example. The user wants to run a report and selects "Week" from the drop down list; this value is passed to one of my datasets and that dataset returns the value 07/15/2007. That value will then populate the MaxDate parameter field. The user then changes his/her mind and selects "Month" from the Type drop down list. What should happen is that the MaxDate field changes to 07/01/2007 (this is returned correctly in my query, but I can't get it to show in the MaxDate field after the first time), but instead it still shows 07/15/2007. Here is my SQL code for the dataset that uses the parameter Type to return a date:
IF(@Type= 'Day') BEGIN SELECT dbo.F_START_OF_DAY(getdate()) AS TimeInterval END
IF(@Type='Year') BEGIN SELECT dbo.F_START_OF_YEAR(getdate()) AS TimeInterval END
IF(@Type='Month') BEGIN SELECT dbo.F_START_OF_MONTH(getdate()) AS TimeInterval END
IF(@Type='Quarter') BEGIN SELECT dbo.F_START_OF_QUARTER(getdate()) AS TimeInterval END
ETC....................................
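For what it's worth, the branching could probably be collapsed into one statement with a CASE expression - a sketch assuming the dbo.F_START_OF_* functions shown above, plus a dbo.F_START_OF_WEEK for the Week case, which I'm assuming exists:

SELECT CASE @Type
           WHEN 'Day'     THEN dbo.F_START_OF_DAY(GETDATE())
           WHEN 'Week'    THEN dbo.F_START_OF_WEEK(GETDATE())
           WHEN 'Month'   THEN dbo.F_START_OF_MONTH(GETDATE())
           WHEN 'Quarter' THEN dbo.F_START_OF_QUARTER(GETDATE())
           WHEN 'Year'    THEN dbo.F_START_OF_YEAR(GETDATE())
       END AS TimeInterval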
If you need any additional information or have any questions, I'd be more than happy to answer them. Thanks in advance, /jcarver
Here's my situation: I want to take a copy of my database at work and take it home to my home PC. What is the best way to do this? I am relatively new to this stuff, so detailed directions would be nice.
Sidebar - if I have the .mdf and .ldf files, can I restore from them?
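If it helps, here is a rough sketch of the two usual routes, backup/restore and attach (every path, database name, and logical file name below is hypothetical; RESTORE FILELISTONLY will show the real logical names):

-- On the work server: take a full backup to a file you can copy home
BACKUP DATABASE MyDb TO DISK = 'C:\Temp\MyDb.bak' WITH INIT

-- On the home PC: restore it, relocating the files if the paths differ
RESTORE DATABASE MyDb
FROM DISK = 'C:\Temp\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'C:\Data\MyDb.mdf',
     MOVE 'MyDb_Log'  TO 'C:\Data\MyDb_log.ldf'

-- If all you have are the .mdf/.ldf files from a cleanly detached database,
-- you can attach them instead of restoring
EXEC sp_attach_db @dbname = 'MyDb',
     @filename1 = 'C:\Data\MyDb.mdf',
     @filename2 = 'C:\Data\MyDb_log.ldf'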
I would like to empty out my 2k8 database (remove all the data in the tables only) and reload it with the data from production (2k5). I know I would have to disable any constraints and triggers and remove all indexes before I can run a TRUNCATE/DELETE on all tables, correct? Then import the data via the wizard or a script, and lastly re-enable all the constraints and triggers and rebuild the indexes. Is this the only way to go about this? I don't want to do a backup and restore because the 2k5 database doesn't have about 20% of the tables that are in 2k8, and I'm not going to remove those 20% of tables from 2k8.
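Roughly that order, sketched with the undocumented sp_MSforeachtable helper (convenient but unsupported, so treat it as an assumption to test on a copy first):

-- Disable all FK constraints and triggers
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
EXEC sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'

-- DELETE works with constraints disabled; TRUNCATE still refuses tables referenced by a FK
EXEC sp_MSforeachtable 'DELETE FROM ?'

-- ...import the production data here (wizard, SSIS, or script)...

-- Re-enable triggers and constraints, re-validating existing rows
EXEC sp_MSforeachtable 'ALTER TABLE ? ENABLE TRIGGER ALL'
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'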
After deleting all the test data from all tables in a SQL 2000 database, is there a way to reset all the auto-incrementing fields back to zero in one shot? In Access, you can run the Compact and Repair option. Also, in Sybase SQL, there was an "unload/reload" option to reduce the database size. Is there a similar function in SQL2000? Thanks for all the help
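For the identity columns, DBCC CHECKIDENT reseeds one table at a time; there is no single built-in one-shot command, but it can be scripted. A sketch (table and database names are hypothetical), plus the closest SQL 2000 analogue to Sybase unload/reload for reclaiming space:

-- Reset one table's identity so the next inserted row gets 1 (assuming seed/increment of 1)
DBCC CHECKIDENT ('dbo.MyTable', RESEED, 0)

-- Rough "one shot" using the undocumented sp_MSforeachtable helper
EXEC sp_MSforeachtable 'IF OBJECTPROPERTY(OBJECT_ID(''?''), ''TableHasIdentity'') = 1
    DBCC CHECKIDENT (''?'', RESEED, 0)'

-- Reclaim the space left by the deleted test data
DBCC SHRINKDATABASE (MyDb)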
We have lots of stored procedures containing temporary tables. In SQL 6.5 everything was great, but in 7.0 they are doing a lot of recompiles while executing. Tried trace flag 8720, didn't work.
Basically I am talking about this problem: http://support.microsoft.com/support/kb/articles/Q224/5/87.ASP
Let me know if you have any ideas or remedies. How did any of you tackle this behaviour?
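One of the mitigations that KB article walks through is creating all temp tables (and any DDL on them) together at the top of the procedure, and relaxing the temp-table recompile threshold with KEEP PLAN; a sketch with hypothetical procedure and table names:

CREATE PROCEDURE dbo.usp_Demo
AS
BEGIN
    -- All temp-table DDL up front, before any DML, so interleaved DDL
    -- does not force extra recompiles mid-procedure
    CREATE TABLE #work (Id int NOT NULL, Val varchar(30) NULL)

    INSERT INTO #work (Id, Val)
    SELECT EmpId, EmpName FROM dbo.Employees

    -- KEEP PLAN relaxes the row-modification threshold that normally
    -- triggers a recompile for statements referencing temp tables
    SELECT Id, Val
    FROM #work
    OPTION (KEEP PLAN)
END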
Hi, what is the equivalent of OPTION (RECOMPILE) in SQL Server 2000?
Create table #Employee ( EmpId int IDENTITY, EmpName varchar(30) )
insert into #Employee(EmpName)
select EmpName from AllEmployees
OPTION (RECOMPILE)
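As far as I know, statement-level OPTION (RECOMPILE) only arrived in SQL Server 2005. On 2000 the closest equivalents work at the procedure level; a sketch (the procedure name is hypothetical, AllEmployees is from the snippet above):

-- Compile a fresh plan on every execution
CREATE PROCEDURE dbo.usp_LoadEmployees
WITH RECOMPILE
AS
    CREATE TABLE #Employee (EmpId int IDENTITY, EmpName varchar(30))
    INSERT INTO #Employee (EmpName)
    SELECT EmpName FROM AllEmployees
GO

-- Or recompile just one call without changing the procedure
EXEC dbo.usp_LoadEmployees WITH RECOMPILE

-- Or mark it so only the next execution builds a new plan
EXEC sp_recompile 'dbo.usp_LoadEmployees'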
Is there a way (command or stored procedure) to RECOMPILE or REFRESH a USER DEFINED FUNCTION? I can recompile SPs with sp_recompile and refresh views with sp_refreshView, but I could not find any way to refresh User-defined functions (some of them are like views, with parameters).
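As far as I know SQL Server 2000 has no UDF counterpart to sp_refreshview, so re-running the ALTER FUNCTION script is the usual route. From SQL Server 2005 onward, sp_refreshsqlmodule refreshes non-schema-bound functions (as well as procedures and triggers); a sketch with a hypothetical function name:

-- SQL Server 2005+: re-bind a non-schema-bound user-defined function to current metadata
EXEC sys.sp_refreshsqlmodule N'dbo.fn_MyFunction'

-- SQL Server 2000: re-run the function's definition instead
-- ALTER FUNCTION dbo.fn_MyFunction (...) RETURNS ... AS ...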
Hello: The installation details: W2K SP4, SQL Server 2000 Ent with 1GB RAM. It is a bi-P3. When I run the Profiler to trace stored procedure performance, I get a bunch of SP:CacheMiss events for a couple of stored procedures I invoke quite often in a web app, but I do not see SP:Recompile. Here are my questions: i) If the plan is not in the cache, why am I not seeing SP:Recompile? Where else can it be tucked? ii) What are the other counters I need to monitor to see if I need more memory? Thanks in advance for any leads on this. Regards:
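In case it helps with question (i): on SQL Server 2000 you can look directly at what is (and is not) sitting in the procedure cache via syscacheobjects; a rough sketch, filtering on a hypothetical procedure name:

-- Shows cached plan entries whose text mentions the procedure
SELECT cacheobjtype, objtype, usecounts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%usp_MyProc%'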
If my data structure never changes, just the data itself, is there a need to use the WITH RECOMPILE option on stored procedures? Isn't there a performance hit having it in the stored procedure?
What's the performance hit for using 'WITH RECOMPILE' in a stored procedure? I'm not a serious DBA, nor do I pretend to be one, but I'm writing a sp_ to be used with both insert and updates. I'm using a variable that defines the operation (IF @operation = 'Update'...) which will be passed at run-time from ColdFusion. Do I need to use the 'WITH RECOMPILE' clause to keep the sp_ kosher with respect to the operation being performed? And what's the damage in resources?
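WITH RECOMPILE makes every single call pay the full compile cost, which is usually a bigger hit than any plan mismatch. A common alternative for an insert-or-update flag is to keep the outer procedure thin and branch to separate sub-procedures, so each keeps its own stable plan; a sketch (all names hypothetical):

CREATE PROCEDURE dbo.usp_SaveWidget
    @operation varchar(10),
    @WidgetId  int,
    @Name      varchar(50)
AS
BEGIN
    -- Each sub-procedure compiles and caches its own plan, so the wrapper
    -- does not need WITH RECOMPILE just because it branches on @operation
    IF @operation = 'Update'
        EXEC dbo.usp_UpdateWidget @WidgetId, @Name;
    ELSE
        EXEC dbo.usp_InsertWidget @WidgetId, @Name;
END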
Currently I am working on performance tuning of some stored procedures and found that most of them include WITH RECOMPILE at the top.
I tried removing it and it improved the speed a lot. However, for those stored procedures which use dynamic SQL, is it a must to include recompile in the stored procedure?
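In my experience WITH RECOMPILE isn't required just because a procedure builds dynamic SQL; the dynamic string is compiled when it runs regardless of the hint. If the string is parameterised with sp_executesql, its plan can even be cached and reused; a sketch with hypothetical table and column names:

DECLARE @sql nvarchar(4000)

SET @sql = N'SELECT EmpId, EmpName FROM dbo.Employees WHERE DeptId = @DeptId'

-- The dynamic statement gets its own cached, parameterised plan,
-- separate from the calling procedure's plan
EXEC sp_executesql @sql, N'@DeptId int', @DeptId = 10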
Hi! I need to refresh an entire database. I can recompile SPs with sp_recompile (or DBCC FLUSHPROCINDB), and refresh views with sp_refreshView, but I cannot find any way to refresh my user-defined functions (some of them are like views, with parameters). Any help appreciated :) ! Ben
Hi, I have a question about SQL Server 2K. I used SQL Profiler to trace and found a stored procedure was auto-recompiled, like this row in the trace: SP:Recompile  15168076  2004-02-27 16:01:11.610. How can I stop the auto recompile? Thanks, Harold
We are developing a production/management solution for the photo finishing sector. We need a performance of one order priced per second. If we run the procedure once we don't have a dramatic performance loss due to recompilation of the stored procs. If we have about 3 consecutive sessions we find the performance loss to be at a rate of about 200 - 500 %. We can't afford this. On the MSDN site we found some reasons why SQL Server needs to recompile, but since the structure of our db can't be changed in such a manner that this would resolve the problem, we need an alternative. All help is greatly appreciated.
We have a problem with one of our MS SQL 2000 databases and some stored procedures.
I'm not sure exactly what the problem is, but these are the symptoms...
The stored procedure runs without problems for a period of time. Abruptly, without warning it begins to time out when called from our web application.
Calling it through the query analyzer it runs within a second.
Forcing the stored procedure to recompile allows the web application to start calling it again without it timing out.
We have a DTS package that runs overnight and imports a number of records (not sure of the exact numbers, but definitely enough to make a difference to indexes), so this could be part of the problem, although when I force a recompile I do not do any update stats or anything else.
I wrote a test script to call the stored procedure when it was timing out to ensure it wasn't a web application problem and the procedure continued to time out until the forced recompile. So I don't think the problem is there.
The stored procedure returns multiple result sets, and when it starts timing out it is while it is returning the second result set.
The code for the second result set is...
Select avg(round(p.PricingValue, 5)) as Average,
       stdev(round(p.PricingValue, 5)) as StdDev,
       min(p.CaptureDate) as FromDate,
       max(p.CaptureDate) as ToDate
From Pricing p
Inner Join Security s On p.SecurityID = s.SecurityID
Left Outer Join Issuer i On s.IssuerID = i.IssuerID
WHERE p.PricingTypeID = @PricingType
  And p.TenorTypeID = @TenorType
  And p.CaptureDate Between @DateFrom And @DateTo
  AND p.SecurityID IN (
      SELECT SecurityId
      FROM UserResult ur
      WHERE ur.UserResultSelected = 1
        AND ur.UserID = @userID
  )
Does anyone have any idea what might be going on here?
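For reference, the forced recompile described above amounts to something like this (procedure name is hypothetical); it throws the cached plan away so the next call compiles against the current parameter values and statistics:

EXEC sp_recompile 'dbo.usp_GetPricingStats'

-- Longer-term alternative: have the procedure compile a fresh plan on every call
-- ALTER PROCEDURE dbo.usp_GetPricingStats ... WITH RECOMPILE AS ...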
Hi, I increased the size of a column in one of my base tables which is referenced in a view.
I noticed SQL Server didn't recognize this change and it's still showing the old field size in the view.
I can simply drop and create the view again, but I wanted to know if there is any way (command/sp) to recompile the view that will be easy to deploy to production as a patch.
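sp_refreshview should do exactly this for a non-schema-bound view, which makes it easy to ship as a simple patch script (the view name below is hypothetical):

-- Re-binds the view against the current base-table metadata without dropping it
EXEC sp_refreshview 'dbo.vw_MyView'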
I use the RECOMPILE option in a SQL query to dynamically pass a variable to the optimizer.
I verify the execution plan with SET STATISTICS PROFILE ON
and the optimizer chose a nested loops join, OK. But if I use Display Estimated Execution Plan (Ctrl+L) I get a merge join. It's very confusing, any suggestions...?
Use AdventureWorks
go
declare @StartOrderDate datetime
set @StartOrderDate = '20040731'
SELECT * FROM Sales.SalesOrderHeader h, Sales.SalesOrderDetail d
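The statement above is cut off, but presumably the full version looks roughly like this (the join and filter are my guess at the intent). With OPTION (RECOMPILE) the optimizer compiles the statement at execution time and can see the variable's actual value, while the Estimated Execution Plan is built without it, which would explain the different join choices:

USE AdventureWorks
go
declare @StartOrderDate datetime
set @StartOrderDate = '20040731'

SELECT *
FROM Sales.SalesOrderHeader h
     INNER JOIN Sales.SalesOrderDetail d ON h.SalesOrderID = d.SalesOrderID
WHERE h.OrderDate >= @StartOrderDate
OPTION (RECOMPILE)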
I need a way to programmatically (via JDBC) find out which triggers for a table may not compile properly, so that I can disable the bad triggers.
I can do this fine in Oracle but cannot figure out if there's a way to do this in SqlServer. (In Oracle I'd just "alter trigger... compile" and select from user_errors.)
I know how to find the triggers that exist on a table, and I know how to enable/disable individual triggers. I know about sp_recompile, but all that does is flag the trigger for recompile at the next execution.
I need to verify whether the trigger is valid without having to actually invoke it. For example, if there's a bad Update trigger, I don't want to actually execute an update on the table.
One example of what I'm dealing with is this... We have Table A and Table B. There is an update trigger on Table B that references column A.col1. Then we alter Table A to drop col1. Later we have to update Table B. At this point the update will fail because of the bad trigger. I want to find and disable the trigger before executing the update on Table B. If there are other triggers on Table B that are valid, I want to leave them alone.
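I'm not aware of a direct SQL Server equivalent of Oracle's user_errors. One approach on SQL Server 2005 and later is to run sp_refreshsqlmodule on each trigger inside TRY/CATCH: it re-binds the module without firing it, so a reference to a dropped column should come back as an error (that re-bind behaviour is an assumption worth verifying, and the table name below is hypothetical):

DECLARE @trigger nvarchar(517), @err nvarchar(2048)

DECLARE trg CURSOR FOR
    SELECT QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id)) + '.' + QUOTENAME(t.name)
    FROM sys.triggers t
    WHERE t.parent_id = OBJECT_ID('dbo.TableB')   -- hypothetical table

OPEN trg
FETCH NEXT FROM trg INTO @trigger
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_refreshsqlmodule @trigger   -- re-binds the trigger without executing it
        PRINT @trigger + ' binds OK'
    END TRY
    BEGIN CATCH
        SET @err = ERROR_MESSAGE()
        PRINT @trigger + ' is broken: ' + @err
        -- candidate for: DISABLE TRIGGER <name> ON dbo.TableB
    END CATCH
    FETCH NEXT FROM trg INTO @trigger
END
CLOSE trg
DEALLOCATE trg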
If I have a view such as SELECT T.* FROM T, when I add a column to table T the view is not updated to reflect that change. Furthermore, if there are other columns after the * in the view (for example SELECT T.*, GETDATE() as "My Date" FROM T), the last columns will contain incorrect data.
Is there a work around for this? An "auto-recompile when tables are modified" kind of option?
Thanks Nick
PS: This is the script I used for testing:
create table tt ( test1 int primary key, test2 int)
go
insert into tt (test1, test2) values (1,2)
go
create view vw_tt as select *, getdate() as "My Date" from tt
go
select * from vw_tt
go
create view vw_tt2 as select * from tt
go
alter table tt add test3 int
go
select * from vw_tt
select * from vw_tt2
select * from tt
drop table tt
drop view vw_tt
drop view vw_tt2
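As far as I know there is no automatic "recompile when tables are modified" option (short of creating the view WITH SCHEMABINDING, which instead blocks the ALTER TABLE); the usual workaround is to call sp_refreshview after the change. Applied to the test script above:

-- Run after 'alter table tt add test3 int' to re-bind both views to the new definition of tt
EXEC sp_refreshview 'vw_tt'
EXEC sp_refreshview 'vw_tt2'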
I can't seem to place the "option (recompile)" in any valid position so that the following procedure executes without a syntax error.
USE [PO]
GO
/****** Object: StoredProcedure [dbo].[npSSUserLoad] Script Date: 4/18/2015 3:57:38 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ...
-- Generated code - DO NOT MODIFY
-- From Object Schema: 'C:XXXXXX.NetPOPOModel\_ObjectSchema
-- To regenerate this procedure use the 'Open With' option on file _ObjectSchema and select POCodeGen.exe
Declare @SqlCmd nvarchar(max)
Declare @ParamDefinitions nvarchar(1024)
Set @ParamDefinitions = N'@UserId int,NTUser varchar(30), @XmlResult XML OUTPUT'
Set @SqlCmd = N'Set @XmlResult = (
    Select [UserId] [a],
           [UserName] [b],
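Since the statement that needs the hint lives inside the @SqlCmd string, OPTION (RECOMPILE) has to be appended at the end of that inner statement, not after the outer EXEC sp_executesql call. A stripped-down sketch of the placement (the table dbo.SSUser and the column list are hypothetical, not the generated code above); note that a SET @XmlResult = (...) assignment does not take an OPTION clause, so the sketch uses SELECT assignment:

DECLARE @SqlCmd nvarchar(max), @ParamDefinitions nvarchar(1024), @XmlResult xml

SET @ParamDefinitions = N'@UserId int, @XmlResult xml OUTPUT'

-- The hint sits inside the dynamic string, at the end of the statement it applies to
SET @SqlCmd = N'SELECT @XmlResult =
    (SELECT [UserId] [a], [UserName] [b]
     FROM dbo.SSUser               -- hypothetical table
     WHERE [UserId] = @UserId
     FOR XML RAW, TYPE)
OPTION (RECOMPILE)'

EXEC sp_executesql @SqlCmd, @ParamDefinitions, @UserId = 1, @XmlResult = @XmlResult OUTPUT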
We have on demand snapshot replication set up between 2 servers. When the subscriber applies the snapshot, our stored procedures start executing very slowly. Updating statistics and rebuilding indexes does not resolve the problem, however; executing sp_recompile on the affected stored procedures does fix the problem. Is this a known issue with replication? Is there a better workaround than manually recompiling stored procedures after every snapshot?
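I'm not sure whether this is documented as a replication issue, but a lighter workaround than touching each procedure is to run sp_recompile against the replicated tables in a post-snapshot script: marking a table flags every procedure and trigger that references it for recompilation on its next execution. A sketch (table name hypothetical):

-- Flags all procedures and triggers referencing the table to recompile on next use
EXEC sp_recompile N'dbo.ReplicatedTable'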