I am maintaining a large table with millions of rows that has two non-clustered indexes, and the data changes frequently, so I need to keep the indexes fresh. UPDATE STATISTICS runs much more quickly than a reindex. What is the appropriate situation for each, and why?
Thanks in advance.
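For anyone comparing the two, here is a minimal sketch of the difference (dbo.MyLargeTable is a placeholder name): UPDATE STATISTICS only refreshes the distribution information the optimizer uses, while a reindex physically rebuilds the index pages and refreshes statistics as a side effect, which is why it takes so much longer.

-- Refresh the optimizer's statistics only; no index pages are moved, so it is relatively quick
UPDATE STATISTICS dbo.MyLargeTable WITH FULLSCAN

-- Rebuild all indexes on the table; removes fragmentation and rebuilds statistics,
-- but reads and rewrites every index page ('' = all indexes, 0 = keep the original fill factor)
DBCC DBREINDEX ('dbo.MyLargeTable', '', 0)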
Does running DBCC DBREINDEX update the space-allocated columns in sysindexes? I understand that running DBCC UPDATEUSAGE updates the space-allocated columns in the sysindexes table, but I cannot find any documentation that indicates whether dynamically rebuilding the indexes, as opposed to dropping and recreating them, updates the space-allocated columns in the sysindexes table.
Any information would be helpful. Thanks. Gail Wade Database Administration Raymond James Financial gwade@it.rjf.com
I am contemplating creating a job to execute every 5 minutes which will update index statistics if they are more than, say, 8% out. I would like to know what thoughts people have on this, i.e. the pros and cons.
I look forward to what you have to say.
I have auto stats on. Our stats are often more than 10% out. At what level do you reckon the query plan might be affected by out-of-date stats?
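For what it's worth, a rough way to find statistics that have drifted past a percentage threshold is to compare rowmodctr (modifications since the statistics were last refreshed) against rowcnt in sysindexes. This is only a sketch: the counter is approximate, and the 8% figure below is just the threshold mentioned above.

-- Indexes whose statistics look more than ~8% out (SQL 2000-era sysindexes)
SELECT OBJECT_NAME(i.id) AS table_name,
       i.name            AS index_name,
       i.rowcnt,
       i.rowmodctr,
       100.0 * i.rowmodctr / i.rowcnt AS pct_modified
FROM sysindexes i
WHERE OBJECTPROPERTY(i.id, 'IsUserTable') = 1
  AND i.indid BETWEEN 1 AND 250   -- real indexes only, not heaps or text/image pages
  AND i.rowcnt > 0
  AND 100.0 * i.rowmodctr / i.rowcnt > 8.0
ORDER BY pct_modified DESC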
It seems to me there are many ways to update statistics for a table, e.g. sp_updatestats, sp_recompile, and DBCC UPDATEUSAGE.
Can somebody tell me the difference between those commands and what's the best way for updating your statistics? Does reindexing update the statistics?
To update statistics for the entire DB I have taken the script from the link given below, but I need to know: 1) what is the sample percent on UPDATE STATISTICS, and 2) will it be applicable for 2005?
script taken from : http://weblogs.sqlteam.com/tarad/archive/2006/08/14/11194.aspx
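On those two questions, as far as I understand it: the sample percent simply controls how much of the table is read to build the statistics, and the same UPDATE STATISTICS options carry over to 2005. A quick sketch (dbo.MyBigTable is a placeholder name):

UPDATE STATISTICS dbo.MyBigTable WITH SAMPLE 20 PERCENT  -- read roughly 20% of the rows; faster, less precise
UPDATE STATISTICS dbo.MyBigTable WITH FULLSCAN           -- read every row; most accurate, slowest
UPDATE STATISTICS dbo.MyBigTable WITH RESAMPLE           -- reuse whatever sampling rate was used last time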
Hi all. I update statistics for three tables every day at 2:00 AM. The job calls one stored procedure, and in that stored procedure only three statements are written to update statistics, like:
Exec('update statistics TBL1 with fullscan')
Exec('update statistics TBL2 with fullscan')
Exec('update statistics TBL3 with fullscan')
This job had been working fine for many months, but for the last two days it has been failing with an error message like: "Could not continue scan with NOLOCK due to data movement." Could you help me with what the solution for this is?
I would like to know: when we upgrade a SQL Server 2000 database to SQL Server 2005, is it required to update the statistics even if we rebuild all the indexes or create new indexes?
I am planning to change our current UPDATE STATISTICS strategy, which is auto stats ON. Our database is terabytes in size, and some tables have millions of rows, with over 200 indexes on one table. Some of these indexes are not really used. Most of the tables are very small.
Dropping and creating new indexes is done quite often in our environment, so a static script may not help.
How can I identify most frequently used indexes in a table?
With the Microsoft-recommended auto stats ON, what other best practices can I include to improve the efficiency?
Any help would be appreciated. It would be really great if any of you could share some scripts that generate dynamic scripts.
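On the question above about identifying the most frequently used indexes: if this environment is on SQL Server 2005, the index usage DMV records seeks, scans, lookups and updates per index since the instance last started. A sketch (the table name is a placeholder):

SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
  AND s.object_id = OBJECT_ID('dbo.MyTable')   -- placeholder table name
ORDER BY s.user_seeks + s.user_scans + s.user_lookups DESC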
On a SQL 7 SP2 server, I have a database with about 77,000 records; with automatic update statistics on, inserting 1,000 records took 43 minutes. With automatic update off, it took 23 minutes to insert the same 1,000 records. On the same machine, I inserted 1,000 records into two other databases with the same database structure and automatic update statistics on. The second database has about 174,000 records, and it took 35 minutes to insert 1,000 records. The third database has about 93,000 records, and it took 19 minutes to insert 1,000 records.
I have a query that retrieves a single record from searching on two tables. The statement goes like this... select sum(amount) from Table1 A union Table2 B on a.id = b.id where date < ### and date > #### and account = ###
As people are running a particular report, this statement is executed time and time again to pull up the numbers necessary for the report. When the report gets slow, I can speed it up by updating the statistics. My concern is that I'm having to update the statistics every hour; otherwise, the query becomes slow. I have noticed that users are inserting data into one of the tables listed above while other users are running the report. I'm sure that's making it more fragmented and ultimately slowing down the query. Do you have any suggestions on how I can make the union of these two tables faster? Or is there anything I could do to speed up the query besides creating clustered indexes? Any help would be appreciated... thank you.
We are upgrading from SQL 7 to 2000. During the upgrade process, do we have to reindex all tables, or will update statistics take care of that?
Or do we have to do both? What is the difference between reindexing and update statistics?
I have recently defragged my SQL server using INDEXDEFRAG. Can somebody please tell me how to update the statistics on all the tables? Thanks in advance.
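For what it's worth, the simplest route I know of to refresh statistics across the whole database after a defrag is the built-in procedure, or UPDATE STATISTICS per table if you want a full read of the data; a minimal sketch (the table name is a placeholder):

-- Refresh out-of-date statistics on every user table in the current database
EXEC sp_updatestats

-- Or per table, forcing a complete read of the data
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN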
Below is the script that I executed to defrag all the tables in my database, in case anyone needs it.
/* Perform a 'USE <database name>' to select the database in which to run the script. */
-- Declare variables
SET NOCOUNT ON
DECLARE @tablename VARCHAR (128)
DECLARE @execstr VARCHAR (255)
DECLARE @objectid INT
DECLARE @indexid INT
DECLARE @frag DECIMAL
DECLARE @maxfrag DECIMAL

-- Decide on the maximum fragmentation to allow
SELECT @maxfrag = 20.0

-- Declare a cursor over all base tables
DECLARE tables CURSOR FOR
   SELECT TABLE_NAME
   FROM INFORMATION_SCHEMA.TABLES
   WHERE TABLE_TYPE = 'BASE TABLE'

-- Open the cursor
OPEN tables

-- Loop through all the tables in the database
FETCH NEXT FROM tables INTO @tablename

WHILE @@FETCH_STATUS = 0
BEGIN
   -- Do the showcontig of all indexes of the table
   INSERT INTO #fraglist
   EXEC ('DBCC SHOWCONTIG (''' + @tablename + ''') WITH FAST, TABLERESULTS, ALL_INDEXES, NO_INFOMSGS')
   FETCH NEXT FROM tables INTO @tablename
END

-- Close and deallocate the cursor
CLOSE tables
DEALLOCATE tables

-- Declare a cursor for the list of indexes to be defragged
DECLARE indexes CURSOR FOR
   SELECT ObjectName, ObjectId, IndexId, LogicalFrag
   FROM #fraglist
   WHERE LogicalFrag >= @maxfrag
     AND INDEXPROPERTY (ObjectId, IndexName, 'IndexDepth') > 0

-- Open the cursor
OPEN indexes

-- Loop through the indexes
FETCH NEXT FROM indexes INTO @tablename, @objectid, @indexid, @frag
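The script as posted seems to be missing two pieces: the #fraglist work table is never created, and the loop that actually runs DBCC INDEXDEFRAG never appears. Below is a sketch of those missing parts, following the standard DBCC SHOWCONTIG/INDEXDEFRAG sample this script is based on; the CREATE TABLE belongs before the first cursor loop, and the defrag loop follows the final FETCH above (the variables are the ones declared at the top).

-- Work table for the DBCC SHOWCONTIG ... WITH TABLERESULTS output (create it before the first cursor)
CREATE TABLE #fraglist (
   ObjectName CHAR (255),
   ObjectId INT,
   IndexName CHAR (255),
   IndexId INT,
   Lvl INT,
   CountPages INT,
   CountRows INT,
   MinRecSize INT,
   MaxRecSize INT,
   AvgRecSize INT,
   ForRecCount INT,
   Extents INT,
   ExtentSwitches INT,
   AvgFreeBytes INT,
   AvgPageDensity INT,
   ScanDensity DECIMAL,
   BestCount INT,
   ActualCount INT,
   LogicalFrag DECIMAL,
   ExtentFrag DECIMAL)

-- Loop through the fragmented indexes and defrag each one
WHILE @@FETCH_STATUS = 0
BEGIN
   PRINT 'Executing DBCC INDEXDEFRAG (0, ' + RTRIM(@tablename) + ', '
         + CONVERT(varchar(15), @indexid) + ') - fragmentation currently '
         + CONVERT(varchar(15), @frag) + '%'
   SELECT @execstr = 'DBCC INDEXDEFRAG (0, ' + RTRIM(@tablename) + ', '
         + CONVERT(varchar(15), @indexid) + ')'
   EXEC (@execstr)
   FETCH NEXT FROM indexes INTO @tablename, @objectid, @indexid, @frag
END

-- Close and deallocate the cursor, and drop the work table
CLOSE indexes
DEALLOCATE indexes
DROP TABLE #fraglist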
I am looking to run UPDATE STATISTICS for the first time (don't ask why it wasn't done prior, please :() on a set of large tables in our 346 GB database, which has been populated with transactional data for the past 4 years. The tables contain 1.2, 35, 64, and 92 million rows. I have used the STATS_DATE function to determine that none of these tables has ever had UPDATE STATISTICS run against it.
My question is: how should I go about this process, and what options should I be selecting when issuing the command? I assume that I must first run with the FULLSCAN parameter in order to initially generate statistics for the table, and that following this initial population I could run without any parameters nightly against the tables in the database to keep statistics up to date. Any guidance you all could provide to a newbie would be greatly appreciated.
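If it helps, here is a sketch of what that plan could look like in T-SQL, plus a STATS_DATE check to confirm when statistics were last updated (the table name is a placeholder):

-- One-time initial pass: read every row so the first statistics are as accurate as possible
UPDATE STATISTICS dbo.BigTable1 WITH FULLSCAN

-- Nightly follow-up: default sampling
UPDATE STATISTICS dbo.BigTable1
-- UPDATE STATISTICS dbo.BigTable1 WITH RESAMPLE   -- or: keep the sampling rate used last time

-- Confirm when statistics were last updated for each index on the table
SELECT name, STATS_DATE(id, indid) AS stats_last_updated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.BigTable1')
  AND indid BETWEEN 1 AND 250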
I'm using SQL 7. There is a setting in DB properties called "Auto update statistics"; what kind of statistics does this refer to, and how can these stats be accessed?
Hi. I have automatic statistics update turned on for all my databases. Is this an overhead I can do without? Could I update them overnight when the database is hardly in use? Thanks, Chris Weston
I am using the Maintenance Plan Wizard, but it only allows me to select either the "reorganize data and indexes" option or the "update statistics" option (in the Optimizations tab). I can't select both of them. What is the reason for this?
We are using SQL Server 2005. Auto update statistics and auto create statistics are set to ON for the database. This database has a very heavy workload. When I checked the individual statistics, the last-updated date on the statistics is still an old value (from a few months ago).
Is it necessary to manually update the statistics for this database, or can we rely upon "auto update statistics" itself?
And at what frequency should a manual UPDATE STATISTICS usually be run on a production system that has heavy transactions?
Dear SQL Server experts:
First off, I am no SQL Server expert. :) A few months ago I put a database into a production environment. Recently, it was brought to my attention that a particular query that executed quite quickly in our dev environment was painfully slow in production. I analyzed the plan on the production server (it looked good), and then tried quite a few tips that I'd gleaned from reading newsgroups. Nothing worked. Then on a whim I performed an UPDATE STATISTICS on a few of the tables that were being queried. The query immediately went from executing in 61 seconds to under 1 second. I checked to make sure that statistics were being "auto updated", and they were. Why did I need to run UPDATE STATISTICS? Will I need to again?
A little more background info: the database started empty and has grown quite rapidly in the last few months. One particular table grows at a rate of about 300,000 records per month. I get fast query times due to a few well-placed indexes.
A quick question: if I add an index, do statistics get automatically updated for this new index immediately?
Thanks in advance for any help,
Felix
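Two details that may be relevant here, offered as assumptions rather than a definitive answer: auto update statistics only fires after a sizable fraction of a table's rows have been modified (roughly 500 rows plus 20% of the table), so on a fast-growing table the statistics can drift for a long time between automatic refreshes; and a newly created index has its statistics built as part of the CREATE INDEX operation itself. A small sketch with made-up names:

-- At ~300,000 new rows per month on a table that already holds, say, 3,000,000 rows,
-- the ~20% auto-update threshold (about 600,500 modifications) is only crossed roughly
-- every two months, which would match statistics going stale in between.

-- A new index's statistics are built during CREATE INDEX; STATS_DATE shows when:
CREATE INDEX IX_BigTable_SomeColumn ON dbo.BigTable (SomeColumn)
SELECT name, STATS_DATE(id, indid) AS stats_last_updated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.BigTable')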
Is it necessary to schedule UPDATE STATISTICS on an index in SQL Server 2005 on a daily basis? Is it necessary to schedule an index rebuild in SQL Server 2005 on a daily basis?
I ran a stored procedure on my server that updates statistics against all user-defined tables in my database (MSSQL 7.0).
Since then I am getting errors in my ASP application where I reference adovbs.inc.
Here is an example of errors I get.
Microsoft VBScript runtime error '800a0411'
Name redefined: 'adOpenForwardOnly'
/Essai/adovbs.inc, line 14
Below is the stored procedure I have run against the database.
Can anybody help? Thank you, guys.
CREATE PROCEDURE update_all_stats
AS
/* This PROCEDURE will run UPDATE STATISTICS against ALL user-defined tables within this database. */
DECLARE @tablename varchar(30)
DECLARE @tablename_header varchar(75)

DECLARE tnames_cursor CURSOR FOR
   SELECT name FROM sysobjects WHERE type = 'U'

OPEN tnames_cursor
FETCH NEXT FROM tnames_cursor INTO @tablename
WHILE (@@fetch_status <> -1)
BEGIN
   IF (@@fetch_status <> -2)
   BEGIN
      SELECT @tablename_header = 'Updating ' + RTRIM(UPPER(@tablename))
      PRINT @tablename_header
      EXEC ('UPDATE STATISTICS ' + @tablename)
   END
   FETCH NEXT FROM tnames_cursor INTO @tablename
END
PRINT ' '
PRINT ' '
SELECT @tablename_header = '************* NO MORE TABLES *************'
PRINT @tablename_header
PRINT ' '
PRINT 'Statistics have been updated FOR ALL tables.'
DEALLOCATE tnames_cursor
I am working on an existing infrastructure, and I do not have the liberty to change much right now. I am in a situation where the app issues the UPDATE STATISTICS command quite often, so frequently that sometimes one blocks another. Is there any way I can do something like this:
IF (an update statistics is already going on) don't do anything, ELSE run update statistics
This is a temporary solution until I fix the bad inline SQL code (in the app) and use SPs.
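As a stopgap, one possible shape for that check, assuming SQL Server 2005 or later since it relies on the DMVs (the table name is a placeholder): look for any other request whose SQL text contains UPDATE STATISTICS before issuing another one.

IF EXISTS (SELECT 1
           FROM sys.dm_exec_requests AS r
           CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
           WHERE r.session_id <> @@SPID
             AND t.text LIKE '%UPDATE STATISTICS%')
    PRINT 'An UPDATE STATISTICS is already running - skipping this one'
ELSE
    UPDATE STATISTICS dbo.MyTable   -- placeholder table name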
I have tried many variations (after reviewing other posts) and cannot resolve the following issue.

RUNNING SQL MAINTENANCE
----------------------------
SET ARITHABORT ON
SET CONCAT_NULL_YIELDS_NULL ON
SET QUOTED_IDENTIFIER ON
SET ANSI_NULLS ON
SET ANSI_PADDING ON
SET ANSI_WARNINGS ON
SET NUMERIC_ROUNDABORT OFF
exec master..xp_sqlmaint '-D SBC -UpdOptiStats 10 -RebldIdx 10'
-- tried UpdOptiStats and RebldIdx separately with the same results

RECEIVE THE FOLLOWING MESSAGE
------------------------------
[Microsoft SQL-DMO (ODBC SQLState: 42000)] Error 1934: [Microsoft][ODBC SQL Server Driver][SQL Server] UPDATE STATISTICS failed because the following SET options have incorrect settings: 'QUOTED_IDENTIFIER, ARITHABORT'

SERVER SETUP
-------------------------------
Windows 2000, Service Pack 4
SQL Server 2000 Standard Edition, Service Pack 3

Any help is greatly appreciated.