I added OPTION (KEEPFIXED PLAN) and it is not working; I am still getting recompiles in the Profiler trace. The table with the trigger on it has a huge number of inserts, updates and deletes all day long.
Does anyone have any ideas? I know triggers are expensive, but we want to keep this trigger (I don't have the option of rewriting it another way).
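For reference, this is roughly how the hint is applied inside the trigger (the trigger, table and column names below are made up for illustration):

CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    -- audit-style statement with the hint; the real trigger body is more involved
    INSERT INTO dbo.OrderAudit (OrderID, ChangedOn)
    SELECT i.OrderID, GETDATE()
    FROM inserted i
    OPTION (KEEPFIXED PLAN)
END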
I use the RECOMPILE option in a SQL query so the value of a variable is passed to the optimizer at run time.
I verify the execution plan with SET STATISTICS PROFILE ON,
and the optimizer chooses a nested loops join, which is fine. But if I use Display Estimated Execution Plan (Ctrl+L) I get a merge join. It's very confusing. Any suggestions?
USE AdventureWorks
GO
DECLARE @StartOrderDate datetime
SET @StartOrderDate = '20040731'
SELECT *
FROM Sales.SalesOrderHeader h
JOIN Sales.SalesOrderDetail d ON d.SalesOrderID = h.SalesOrderID
WHERE h.OrderDate >= @StartOrderDate
OPTION (RECOMPILE)
We're using SQL 2000 on Windows 2000 Server, but this is a problem we've had on one particular database since SQL 7 on NT4. The database in question is set to autogrow by 10% (currently sitting at 31 GB total size). However, last week users complained of a slowdown in performance. When we checked we found that only 14 MB was free in the database (we thought it would've grown automatically before then), and when we added an additional 1 GB manually, performance picked up.

Does SQL Server wait until all the space is used up (i.e. 0% free) before autogrowing? Even at that, we've never actually had the database grow automatically - we've always had to add space manually. Settings on this database, and on one that does grow automatically, appear to be the same (we have also checked via sp_helpdb). So where does the problem lie? Any help you can give would be greatly appreciated.
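For reference, the autogrow setting on the data file was applied along these lines (the database and logical file names below are placeholders, not the real ones):

-- show current size, space used and growth settings for the files
EXEC sp_helpfile

-- data file set to grow by 10%
ALTER DATABASE ProblemDB
MODIFY FILE (NAME = ProblemDB_Data, FILEGROWTH = 10%)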
I have a weird one. I have 3 maintenance plans that were working for months and now suddenly are sort of not working. The jobs start and then take forever. I stop them from QA (EM doesn't appear to stop them). When I look at the Application log in the Event Viewer there are no errors, because the jobs appeared to freeze. When I look at the maintenance plan history it says the jobs completed in the usual short amount of time. When I look at the application log it says it did the backups of the data and transaction logs, etc. When I look on the server in the D:\SQL Server\MSSQL\LOG directory for that maintenance plan, it says it completed the backups. So, essentially, EM is telling me the jobs haven't completed and everything else tells me the jobs did complete. But I won't feel secure until EM tells me the jobs have actually finished.
I did do a manual backup of one of the databases and then ran that maintenance plan for that database. I canceled the maintenance plan job after a few minutes (how long the job would usually take) and then looked at the size of the .BAK files. They were both exactly the same size.
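In case it helps, a sketch of checking the job outcomes directly in msdb rather than relying on EM's display (this is the kind of cross-check I have in mind):

SELECT j.name,
       h.run_date,
       h.run_time,
       h.run_duration,
       h.run_status,   -- 1 = succeeded, 0 = failed
       h.message
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE h.step_id = 0     -- job outcome rows rather than individual steps
ORDER BY h.run_date DESC, h.run_time DESC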
I am trying to push the install for Report Builder 3.0 and am having an issue with the REPORTSERVERURL option for installing via the command line. I have a batch file that works fine; however, when I launch the app it does not have a report server configured. I have verified I can connect to my report server if I enter it manually.
Has anyone had experience of using parent/child packages while enlisting them in transactions? I tested this on a small sample and thought that I had got it to work, but in my real-world package it does not.
The parent package essentially calls three child packages. In each child package there are multiple DFTs that import and transform data into SQL Server. All data must be imported, or none at all. Therefore I created a FELC container into which three Execute Package tasks (one per child) were placed. The FELC has TransactionOption set to 'Required' and the Execute Package tasks to 'Supported'. Unfortunately, upon failure of one of the DFTs in the child, the data was not rolled back.
So initially, in terms of the container hierarchy, the TransactionOption property was set as follows:
Parent package: Supported
  FELC for calling child packages: Required
    Execute Package task: Supported
      Child package: Supported
        Tasks: Supported
Looking at this more closely, we thought that for it to work we would need:
Parent package: Supported
  FELC for calling child packages: Required
    Execute Package task: Required
      Child package: Required
        Tasks: Supported
However, the latter now gives us failures with error messages on the tasks in the child packages: [Execute SQL Task] Error: Failed to acquire connection "Conn ECARS1CEDImport". Connection may not be configured correctly or you may not have the right permissions on this connection.
Even more strangely, the first couple of tasks in the child package complete successfully even though they use the same connection listed in the error. These tasks also have event handlers.
I am trying to change a variable value at run time in an SSIS 2012 package using the DTEXECUI utility, but I cannot see any change to the variable value in the config file, and data is not getting populated in my table according to the new variable value.
What is the right syntax or method for dynamically changing a variable value, either through DTEXECUI or a DTEXEC command-prompt command?
I am using SQL Server 7.0. After restoring all the databases (except distribution), everything seems to be working fine except the backup maintenance plan. I put the following details in the maintenance plan:

General tab:
Plan name: DB Maintenance Plan1; All Databases selected.

Optimizations tab:
Reorganize data and index pages; change free space per page percentage to 10%.
Remove unused space from database files; shrink database when it grows beyond 50 MB; amount of free space to remain after shrink: 10% of the data space.
Schedule: occurs every 1 week(s) on Thursday, at 1:30:00 PM.

Integrity tab:
Check database integrity; include indexes; attempt to repair any minor problems.
Schedule: occurs every 1 week(s) on Thursday, at 1:00:00 PM.

Complete Backup tab:
Back up the database as part of the maintenance plan; verify the integrity of the backup upon completion.
Disk - use this directory: D:\SQL Backups. Backup file extension: BAK.
Schedule: occurs every 1 day(s), at 2:10:00 PM.

Transaction Log Backup tab:
Schedule: occurs every 1 day(s), at 2:00:00 PM.

Reporting tab:
Write report to a text file in directory: D:\SQL Backups\LOG.

Problem: my databases (production as well as system) are not getting backed up at all according to the above plan.

Error message in the log directory:
"Microsoft (R) SQLMaint Utility (Unicode), Version [Microsoft SQL-DMO (ODBC SQLState: 42000)] Error 4062: [Microsoft][ODBC SQL Server Driver][SQL Server]Cannot open user default database '<ID>'. Using master database instead."

In the SQL Server registration properties, the default database for the login through which I log into EM is my production database. Is this problem because of the distribution db which I am not able to restore? Please help.
Deepak Sinha
I installed SQL Server 2005 Enterprise, then SP1 and then SP2, and the Maintenance Plan worked. But when I installed SQL Server 2005 Enterprise and then SP2 directly (skipping SP1), the Check Database Integrity task in the Maintenance Plan did not work. The error message is as follows:
Executed as user: Domain\SqlServiceAccount. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:56:21 AM Could not load package "Maintenance Plans\Test Plan" because of error 0xC0014062. Description: The LoadFromSQLServer method has encountered OLE DB error code 0x80004005 (Login failed for user ''. The user is not associated with a trusted SQL Server connection.). The SQL statement that was issued has failed. Source: Started: 10:56:21 AM Finished: 10:56:21 AM Elapsed: 0.047 seconds. The package could not be loaded. The step failed.
Other tasks like Rebuild Index seem to be fine. SP2 is supposed to be cumulative. Does anyone have any ideas on why this is happening?
Hi everyone. In my SQL Server Management Studio Express, on start-up it shows the server type option, but greyed out, so that value is fixed to Database Engine. (I'm trying to work on a SQL Server Compact Edition database through SSMS Express; that's why I'm trying to get this to change.) Besides, after I connect I go to Object Explorer, expand the server node, and go to Replication. When I expand Replication, I get the "Local Subscription" option, but nothing for Publication. (I want to work on merge replication; that's why I desperately need Publication to work.) Am I missing something here? I did not install SQL Server separately; I only have what comes bundled with the Visual Studio 2005 setup.
So I started a new job recently and have noticed a few strange configurations. Typically I would never mess with the min memory per query option or the index create memory option because I just haven't seen any need to. My typical thought is that if it isn't broke, don't fix it... Yet they have been modified on every single server in my environment.
From Books Online:
• This option is an advanced option and should be changed only by an experienced database administrator or certified SQL Server technician.
• The index create memory option is self-configuring and usually works without requiring adjustment. However, if you experience difficulties creating indexes, consider increasing the value of this option from its run value.
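For context, a sketch of how the current values can be read (both are advanced options, so show advanced options has to be on to see them):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'min memory per query'   -- value in KB, default 1024
EXEC sp_configure 'index create memory'    -- value in KB, 0 = self-configuring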
We have a debate in our team about embedded SQL vs. Stored Procs.
The argument is why use SP's if you can embed the SQL in the code and SQL2K will cache it on the fly?
I can't find any definitive information on pros and cons between the two methods.
If there are no major performance issues, or gotchas, I guess it comes down to developer preference.
SP Pros:
- Great SQL support in VS.NET (dev, debug, integration)
- Separation of database-specific code from the middle tier
- Fewer lines of code in the middle tier
- VS.NET support for .xsd dataset definitions
- Logic closer to the data for more demanding processes
Embedded SQL Pros:
- Fewer artifacts for version control
- Better encapsulation of logic
I am working on tuning the procedure cache hit ratio for my server. We have added 4 GB of memory to the server, which has helped. In addition, I have run DBCC FREEPROCCACHE, which helped for a couple of days to get the hit ratio up to about 84% (from 68%).

When I use Performance Monitor on the server and look at SQL Server Cache Manager: Buffer Hit Ratio, I see that the Prepared SQL Plan is around 97%, but the Procedure Plan hit ratio is down around 55%. I've done some research on different tuning techniques, but can't seem to find 1. a clear definition of the difference between the prepared SQL plan and the procedure plan, and 2. other than adding memory and running DBCC FREEPROCCACHE, how I can get the procedure plan cache hit ratio raised. I do know that there are some procedures that need to be modified to be called fully qualified (e.g. exec dbo.sp_### instead of exec sp_###), but I don't think that those will increase the procedure plan hit ratio by 30% or more.

Any insight you can give would be greatly appreciated.

Thanks,
Michael
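For what it's worth, a sketch of how the plan cache could be broken down by object type, assuming master.dbo.syscacheobjects is the right place to look on this version:

SELECT cacheobjtype,
       objtype,
       COUNT(*)       AS plan_count,
       SUM(usecounts) AS total_use_count,
       SUM(pagesused) AS total_pages_used
FROM master.dbo.syscacheobjects
GROUP BY cacheobjtype, objtype
ORDER BY SUM(pagesused) DESC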
The benefit of the actual execution plan is that you can see the actual number of rows passing through each step, compared to the estimated number of rows. But what about the "cost percentages"? I believe I've read somewhere that these percentages are still just an estimate and are not based on the real execution. Does anyone know, and preferably have a link to something that documents it? Thanks
I had a view in which I did something like this: isnull(fld, val) as 'alias'
When I assign a value to this in the client (VB 6.0) it works OK in SQL 2000 but fails in 2005. When I change the query to fld as 'alias' it works OK in SQL 2005. Why? The database is still at SQL 2000 (8.0) compatibility level.
Also, some queries which are pretty badly written run on SQL 2000 but don't run at all in SQL 2005.
Any clues or answers? Is it some configuration issue?
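To be concrete, the view is along these lines (the view, table and column names here are placeholders, not the real ones):

CREATE VIEW dbo.vwContactList
AS
SELECT ISNULL(MiddleName, '') AS 'MiddleName',  -- the isnull(fld, val) as 'alias' pattern in question
       FirstName,
       LastName
FROM dbo.Contacts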
I am writing a pgm that attaches to a SQL Server database. I have an Add stored procedure and an Update stored procedure. The two are almost identical, except for a couple parameters. However, the Add function works and the Update does not. Can anyone see why? I can't seem to find what the problem is...
This was my test:
Dim cmd As New SqlCommand("pContact_Update", cn) 'Dim cmd As New SqlCommand("pContact_Add", cn)
Catch ex As Exception
    Label1.Text = ex.Message
End Try
When I use the Add procedure, a record is added correctly and I receive the "done" message. When I use the Update procedure, the record is not updated, but I still receive the "done" message.
I have looked at the stored procedures and the syntax is correct according to SQL Server.
I have a table named "StringResources" which contains resources for different cultures. Right now, whenever an admin adds any new resource, it is immediately available to the end user. Now a new requirement has come up: we want the admin to add the resources first, and only when he is ready with all the resources for a particular culture should they become available to the end user.

Important: the StringResources table has a SqlCacheDependency set, so any change for a particular culture will invalidate the cache. This is how the select statement looks:

SELECT dbo.StringResources.resourceType, dbo.StringResources.cultureCode, dbo.StringResources.resourceKey, dbo.StringResources.resourceValue
FROM dbo.StringResources
WHERE dbo.StringResources.cultureCode = @cultureCode

Which would be the best option below?

1. Add a new Boolean column "Published" and show only published resources to the end user.
   Advantage: no need for an extra table.
   Disadvantage: this will invalidate the cache every time a resource is added, even if it's not published.

2. Add a new temporary table with the same structure. When the admin adds a new resource, add it to this temp table, and when he publishes, move the resources to the StringResources table.
   Advantage: the admin has a separate working space, and the cache is invalidated only when resources are published.
   Disadvantage: needs an extra table.
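A sketch of what option 1 would look like (illustrative only):

-- option 1: flag column, new resources default to unpublished
ALTER TABLE dbo.StringResources
ADD Published bit NOT NULL DEFAULT 0

-- end-user query only sees published rows
SELECT resourceType, cultureCode, resourceKey, resourceValue
FROM dbo.StringResources
WHERE cultureCode = @cultureCode
  AND Published = 1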
How can I get an "All" option into the cascading prompt? I want to view data for all states in the USA. I find I can't proceed to load the report unless I have filled in a value for all prompts. Any help?
I have a small doubt. If we enable the AWE option we can take advantage of the available physical memory, and we can also get more memory by using the max server memory property, so why does the AWE option come into the picture?
I read BOL but I am not able to understand what exactly happens.
Could anyone tell me why we need this AWE option if we have the max server memory property?
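For clarity, these are the two settings I'm comparing (the values below are just examples):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- lets a 32-bit instance use physical memory above the 4 GB virtual address limit
EXEC sp_configure 'awe enabled', 1
-- caps the buffer pool; value is in MB
EXEC sp_configure 'max server memory', 6144
RECONFIGURE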
Hi, what is the equivalent of OPTION (RECOMPILE) in SQL Server 2000?

CREATE TABLE #Employee (EmpId int IDENTITY, EmpName varchar(30))

INSERT INTO #Employee (EmpName)
SELECT EmpName FROM AllEmployees
OPTION (RECOMPILE)
I have a vacation request app I'm designing, and it has a VacationData table with TotalVac, UsedVac, VacLeft, VacationCarriedOver, and VacCompleted. I need to take VacLeft, divide it by 2, and place that data in two spots: once in VacationCarriedOver, and once added to TotalVac, which is pulled from another table with hire date and other info. I only need to run this on Jan 1 of every year. Any suggestions?
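Something like this is the year-end rollover I have in mind (the EmployeeInfo table, its BaseVacation column, and the EmployeeId join key are stand-ins for my real schema):

-- carry half of the remaining vacation into the new year and
-- add it to the base allotment pulled from the employee table
UPDATE v
SET VacationCarriedOver = v.VacLeft / 2.0,
    TotalVac = e.BaseVacation + (v.VacLeft / 2.0)
FROM VacationData v
JOIN EmployeeInfo e ON e.EmployeeId = v.EmployeeId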
I am trying to alter a table that has an identity field to make the column have the NOT FOR REPLICATION option. I just can't seem to get the syntax down for the ALTER TABLE command. PLEASE HELP. I know it can't be that hard!!!
The 70-229 certification exam measures your ability to design and implement database solutions by using Microsoft SQL Server 2000 Enterprise Edition. Candidates for this exam work in a medium to enterprise computing environment that uses SQL Server 2000 Enterprise Edition. Candidates have at least one year of experience implementing relational databases. I don't have database experience, and no experience with SQL Server administration. Can you suggest good, low-priced resources? What are the best options available on the market?
First off, I apologize for not knowing what I'm talking about and being long winded. I'm trying to determine if SQL Server Express is an option for a client of mine. Their needs are beginning to go beyond what I'm comfortable with in Access, so I'm looking into the option of upgrading to a SQL Server Express back-end with a VB front-end.
Access doesn't require any setup beyond "File, new", so I know nothing about the background work required to get a database running on a platform like SQL Server.
My first concern is what kind of network admin rights I need to install and use SSE.
It's unlikely that her IT group will just hand us the keys to any of their servers, so all we really have available is what we can put on her network drives. Am I right in assuming that using a SQL Server database would involve more network privileges than just dropping a file on the network and pointing my front end app at it?
With that said, I think I remember seeing something on one of the MSDN pages about setting up a database to run off of a CD-Rom. Could I somehow use this capability to get what I need?
I would like to add a couple more fields that the users can query by, to give more options. How would I add First Name, Date, and IR# in the same stored procedure, so they have the ability to run it by one field, two fields, or all of the above? How would I incorporate those in the stored procedure?
CREATE PROCEDURE [SearchByDateExcls]
    (@StartDate datetime, @EndDate datetime)
AS
SELECT [IR Number], Date, Inspector, Violation, [Violation Type], Loss, [Loss Type]
FROM dbo.Revised_MainTable
GROUP BY [IR Number], Date, Inspector, Violation, [Violation Type], Loss, [Loss Type]
HAVING (Date BETWEEN @StartDate AND @EndDate)
Is this correct?
CREATE PROCEDURE [SearchByDateExcls]
    (@StartDateServed datetime, @EndDateServed datetime, @Enter_LastName nvarchar(25), @Enter_Duration nvarchar(10))
AS
SELECT IR#, [Date Served], [Reason For Exclusion], Duration, [First Name], [Last Name]
FROM [dbo].[Extended Exclusions]
GROUP BY IR#, [Date Served], [Reason For Exclusion], Duration, [First Name], [Last Name]
HAVING (Date BETWEEN @StartDateServed AND @EndDateServed, Enter_LastName, @Enter_Duration)
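To show what I'm ultimately aiming for, here is a rough sketch where any of the extra fields can be left out (I'm not sure this optional-filter pattern is the right approach, which is partly what I'm asking):

CREATE PROCEDURE [SearchByDateExcls]
    (@StartDateServed datetime = NULL,
     @EndDateServed datetime = NULL,
     @Enter_LastName nvarchar(25) = NULL,
     @Enter_Duration nvarchar(10) = NULL)
AS
SELECT IR#, [Date Served], [Reason For Exclusion], Duration, [First Name], [Last Name]
FROM [dbo].[Extended Exclusions]
WHERE (@StartDateServed IS NULL OR [Date Served] >= @StartDateServed)
  AND (@EndDateServed IS NULL OR [Date Served] <= @EndDateServed)
  AND (@Enter_LastName IS NULL OR [Last Name] = @Enter_LastName)
  AND (@Enter_Duration IS NULL OR Duration = @Enter_Duration)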
I was reading about NOLOCK: it can prevent deadlocks but can return data which is not committed yet.
1) Should we use NOLOCK with SELECT statements?
2) If the transaction isolation level is set appropriately (e.g. Serializable) in the component (e.g. a COM+ component) but NOLOCK is specified in the SELECT, would it return uncommitted data? I mean, if the transaction is controlled at a higher level, what are the pros and cons of using NOLOCK?
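For reference, the kind of statement I mean is simply this (the table is just an example):

SELECT OrderID, Status
FROM dbo.Orders WITH (NOLOCK)   -- no shared locks taken, so uncommitted rows can be read
WHERE Status = 'Pending'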
I am selecting a few records from a particular table; however, I need to run the same query using UNION, and it is taking a long time to execute. If I run the query as two parts it works fine, but if I combine both queries with UNION then I have a problem. Can anyone help?
eg:
Select '' as name, '' as userid, '' as firstname from table1 where user like 'A%' (works fine)
Select name, userid, firstname from table1 where user like 'A%' (also works fine)
but if I use
Select '' as name, '' as userid, '' as firstname from table1 where user like 'A%'
UNION
Select name, userid, firstname from table1 where user like 'A%'