Effect Of Update Statistics On Database And On An ASP Application
Jun 13, 2001
Hi
I ran a stored procedure on my server that updates statistics against all user-defined tables in my database (MSSQL 7.0).
Since then I am getting errors in my ASP application where I am referring to adovbs.inc.
Here is an example of errors I get.
Microsoft VBScript runtime error '800a0411'
Name redefined: 'adOpenForwardOnly'
/Essai/adovbs.inc, line 14
Below is the stored procedure I have run against the database.
Can anybody help? Thank you guys.
CREATE PROCEDURE update_all_stats
AS
/*
This PROCEDURE will run UPDATE STATISTICS against
ALL user-defined tables within this database.
*/
DECLARE @tablename varchar(128)
DECLARE @tablename_header varchar(175)
DECLARE tnames_cursor CURSOR FOR SELECT name FROM sysobjects
WHERE type = 'U'
OPEN tnames_cursor
FETCH NEXT FROM tnames_cursor INTO @tablename
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
        -- Announce the table, then rebuild its statistics
        SELECT @tablename_header = 'Updating ' + RTRIM(UPPER(@tablename))
        PRINT @tablename_header
        EXEC ('UPDATE STATISTICS ' + @tablename)
    END
    FETCH NEXT FROM tnames_cursor INTO @tablename
END
PRINT ' '
PRINT ' '
SELECT @tablename_header = '************* NO MORE TABLES *************'
PRINT @tablename_header
PRINT ' '
PRINT 'Statistics have been updated FOR ALL tables.'
CLOSE tnames_cursor
DEALLOCATE tnames_cursor
I'm looking into the automatic recompilation of stored procedures, and I have been reading up on the "Do not recompute statistics" option on indexes. Am I correct in concluding that disabling the "Do not recompute statistics" option for an index will ensure that no automatic recompilations will occur as a result of updates to data in that index? Am I also correct in understanding that "Update Statistics" will still update statistics for the index even if the "Do not recompute statistics" option is disabled?
Regards
Bjørn
How can I query for the following management information (in SQL Server 2008)?
1. Last user login?
2. How long (i.e., duration) has the user been online for the day?
3. How many times has the user logged in or out per day?
4. Which users logged into the system on a certain day?
5. Which users were still logged in after 11pm?
6. Any other statistics that could be useful to management?
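SQL Server 2008 does not keep a login history out of the box, so most of these numbers have to be collected yourself. A sketch of the two usual pieces; the database, table, and trigger names (AuditDB, LoginAudit, trg_LoginAudit) are illustrative assumptions:

-- Who is connected right now, and when each session logged in
SELECT login_name, login_time, host_name
FROM sys.dm_exec_sessions
WHERE is_user_process = 1

-- A logon trigger feeding a hypothetical audit table. Caution: a failing
-- logon trigger can lock everyone out, so test on a non-production server,
-- and make sure connecting logins can INSERT into the audit table.
CREATE TABLE AuditDB.dbo.LoginAudit (
    LoginName sysname       NOT NULL,
    EventTime datetime      NOT NULL DEFAULT GETDATE(),
    HostName  nvarchar(128) NULL
)
GO
CREATE TRIGGER trg_LoginAudit
ON ALL SERVER
FOR LOGON
AS
BEGIN
    -- ORIGINAL_LOGIN() returns the login that actually connected
    INSERT INTO AuditDB.dbo.LoginAudit (LoginName, EventTime, HostName)
    VALUES (ORIGINAL_LOGIN(), GETDATE(), HOST_NAME())
END

Questions 1-5 then become GROUP BY queries over the audit table. Note there is no LOGOFF trigger, so logout times and durations have to come from event notifications or the built-in login auditing.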
I'm a newbie to replication and recently set up the following.
Publisher and Distributor on the same SQL 2005 server, with 7 subscribers (SQL 2000 servers) using push subscriptions. I'm replicating 5 SQL tables which don't have too many changes, and these are scheduled to run every 3 hours. In a few days a large one-off SQL update will add an additional 10,000 rows to one of the replicated tables. I was wondering what impact this would have on the above setup, i.e. are there any sort of limitations here? I'm assuming not, but thought I would check. I'm thinking it will just cause additional overhead on the server, but the update is being applied when no users will be using the database.
I created a little test application with SQL Server. After everything was tested successfully, I built a setup file with InstallShield. I installed the setup file on a different machine. Everything runs without problems, but when I make entries to the database and then reopen the application, no entry is saved. I am new to SQL Server.
We are receiving the following error when trying to Update the cube (the last line of code):
Cannot update the 'Database' object 'DB Cube_Temp', it needs to be part of a connected Server object.
I am contemplating creating a job to execute every 5 minutes which will update index statistics if they are more than, say, 8% out. I would like to know what thoughts people have on this, i.e. pros and cons.
I look forward to what you have to say.
I have auto stats on. Our stats are often more than 10% out. At what level do you reckon the query plan might be affected by out-of-date stats?
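For what it's worth, one way to find the "more than 8% out" candidates is the rowmodctr column, which counts row modifications since the last statistics update. A rough sketch, assuming SQL 2000-style metadata (sysindexes remains available as a compatibility view in later versions):

SELECT OBJECT_NAME(id) AS TableName,
       name            AS IndexName,
       rowmodctr,
       rowcnt,
       100.0 * rowmodctr / rowcnt AS PercentModified
FROM sysindexes
WHERE indid BETWEEN 1 AND 250   -- skip heaps (indid 0) and text/image rows (indid 255)
  AND rowcnt > 0
ORDER BY PercentModified DESC

One con to weigh: updating statistics invalidates cached plans for the affected tables, so a 5-minute cycle can trade stale statistics for constant recompilation overhead.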
It seems to me there are many ways to update statistics for a table, e.g. "sp_updatestats", "sp_recompile", "dbcc updateusage".
Can somebody tell me the difference between those commands and what's the best way to update your statistics? Does reindexing update the statistics?
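Roughly, only the first of those actually updates statistics. A quick sketch of each (the table name dbo.Orders is illustrative):

-- sp_updatestats: one pass over every user table in the current database,
-- updating whichever statistics the server considers out of date
EXEC sp_updatestats

-- UPDATE STATISTICS: targets a single table (or a single index on it),
-- with control over the sample size
UPDATE STATISTICS dbo.Orders WITH FULLSCAN

-- sp_recompile: does not touch statistics; it only flags the object so
-- dependent plans and procedures recompile on next use
EXEC sp_recompile 'dbo.Orders'

-- DBCC UPDATEUSAGE: corrects page and row counts in the system catalog,
-- not distribution statistics
DBCC UPDATEUSAGE (0, 'dbo.Orders')

As for reindexing: yes, rebuilding an index (e.g. DBCC DBREINDEX) recreates that index's statistics as a side effect, effectively a FULLSCAN update.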
To update statistics for the entire DB, I have taken the script from the link below. But I need to know:
1. What is the sample percent on UPDATE STATISTICS?
2. Will it be applicable for 2005?
script taken from : http://weblogs.sqlteam.com/tarad/archive/2006/08/14/11194.aspx
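On question 1: the sample percent controls how many rows UPDATE STATISTICS reads to build its histograms, trading accuracy against run time. For example (the table name is illustrative):

UPDATE STATISTICS dbo.BigTable WITH SAMPLE 50 PERCENT   -- read roughly half the rows
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN            -- read every row

On question 2: yes, the same syntax works unchanged on SQL Server 2005.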
Hi All, I update statistics for three tables every day at 2:00 AM. The job calls one stored procedure, and in that stored procedure only three statements are written for the update statistics, like:
Exec('update statistics TBL1 with fullscan')
Exec('update statistics TBL2 with fullscan')
Exec('update statistics TBL3 with fullscan')
This job had been working fine for many months, but for the last two days it has been failing with error messages like:
Could not continue scan with NOLOCK due to data movement
Could you help me with what the solution for this is?
I would like to know: when we upgrade a SQL Server 2000 database to SQL Server 2005, is it required to update the statistics even if we rebuild all the indexes or create new indexes?
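For what it's worth, a sketch of the distinction, assuming SQL Server 2005 syntax and an illustrative table name: rebuilding an index recreates that index's statistics with the equivalent of a FULLSCAN, but column statistics (auto-created or made with CREATE STATISTICS) are not touched by the rebuild and can be refreshed separately.

ALTER INDEX ALL ON dbo.SomeTable REBUILD                -- index statistics refreshed as a side effect
UPDATE STATISTICS dbo.SomeTable WITH FULLSCAN, COLUMNS  -- refresh column statistics only

After a 2000-to-2005 upgrade specifically, running sp_updatestats against each database is commonly recommended, since the 2005 optimizer makes heavier use of statistics.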
I am planning to change our current UPDATE STATISTICS strategy, which is auto stats ON. Our database is terabytes in size, with some tables having millions of rows and over 200 indexes on a single table. Some of these indexes are not really used. Most of the tables are very small.
Dropping and creating new indexes happens quite often in our environment, so a static script may not help.
How can I identify the most frequently used indexes in a table?
With the Microsoft-recommended auto stats ON, what other best practices can I include to improve efficiency?
Any help would be appreciated. It would be really great if any of you could share some scripts to generate dynamic scripts.
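On identifying the most frequently used indexes: a sketch, assuming SQL Server 2005 or later, using sys.dm_db_index_usage_stats (the counters reset at each service restart):

SELECT OBJECT_NAME(s.object_id) AS TableName,
       i.name                   AS IndexName,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
ORDER BY s.user_seeks + s.user_scans + s.user_lookups DESC

Indexes showing high user_updates but near-zero seeks, scans, and lookups are candidates for dropping, which also shrinks the statistics maintenance load.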
Hello all, thanks in advance for any advice here. My question is: what's the effect of the Enterprise Manager > Tasks > Shrink Database function? It seems to reduce the used space on the device. Testing it on some dev machines, I seem to have gotten back as much as 20 GB of space. I know it should grow back to that, but the time it took was minimal, and it didn't seem to affect my developers. They tend to add a lot and delete a lot during testing. What negative effects of running this should I look out for? Will it affect the DB long term? Is it preferable to schedule it once a month or so? Is this done on production DBs? Thanks for any guidance on the usage of this.
On a SQL 7 SP2 server, I have a database with about 77,000 records; with automatic update statistics on, inserting 1,000 records took 43 minutes. With automatic update off, it took 23 minutes to insert the same 1,000 records. On the same machine, I inserted 1,000 records into two other databases with the same database structure and automatic update statistics on. In the second database, which has about 174,000 records, it took 35 minutes to insert 1,000 records. In the third database, which has about 93,000 records, it took 19 minutes to insert 1,000 records.
I have a query that retrieves a single record by searching across two tables. The statement goes roughly like this:
SELECT SUM(amount)
FROM Table1 A
JOIN Table2 B ON A.id = B.id
WHERE date < ### AND date > #### AND account = ###
As people run a particular report, this statement is executed time and time again to pull up the numbers necessary for the report. When the report gets slow, I can speed it up by updating the statistics. My concern is that I'm having to update the statistics every hour; otherwise, the query becomes slow. I have noticed that users are inserting data into one of the tables listed above while other users are running the report. I'm sure that's making it more fragmented and ultimately slowing down the query. Do you have any suggestions on how I can make the union of these two tables faster? Or is there anything I could do to speed up the query besides creating clustered indexes? Any help would be appreciated... thank you.
I am maintaining a large table with millions of rows that has two nonclustered indexes and frequently changing data, and I need to keep the indexes fresh. UPDATE STATISTICS runs much quicker than a reindex. What is the appropriate situation for each, and why? Thanks in advance.
We are upgrading from SQL 7 to 2000. During the upgrade process, do we have to reindex all tables, or will update statistics take care of that?
Or do we have to do both? What is the difference between reindexing and update statistics?
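In short: reindexing rebuilds the physical index, removing fragmentation, and refreshes that index's statistics as a side effect, while UPDATE STATISTICS refreshes only the distribution statistics and leaves the index structure alone. In SQL 2000 terms, with an illustrative table name:

DBCC DBREINDEX ('dbo.SomeTable')   -- rebuild all indexes on the table; their statistics are refreshed too
UPDATE STATISTICS dbo.SomeTable    -- refresh statistics only; the index structure is untouched

So after rebuilding all indexes, a separate statistics pass mainly benefits column statistics rather than index statistics.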
I have recently defragged my SQL Server using INDEXDEFRAG. Can somebody please tell me how to update the statistics on all the tables? Thanks in advance.
Below is the script that I executed to defrag all the tables in my database, if anyone needs it.
/* Perform a 'USE <database name>' to select the database in which to run the script. */
-- Declare variables
SET NOCOUNT ON
DECLARE @tablename VARCHAR (128)
DECLARE @execstr VARCHAR (255)
DECLARE @objectid INT
DECLARE @indexid INT
DECLARE @frag DECIMAL
DECLARE @maxfrag DECIMAL

-- Decide on the maximum fragmentation to allow
SELECT @maxfrag = 20.0

-- Declare cursor over all user tables
DECLARE tables CURSOR FOR
SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'

-- Create a work table to hold the DBCC SHOWCONTIG output
CREATE TABLE #fraglist (
    ObjectName CHAR (255), ObjectId INT, IndexName CHAR (255), IndexId INT,
    Lvl INT, CountPages INT, CountRows INT, MinRecSize INT, MaxRecSize INT,
    AvgRecSize INT, ForRecCount INT, Extents INT, ExtentSwitches INT,
    AvgFreeBytes INT, AvgPageDensity INT, ScanDensity DECIMAL, BestCount INT,
    ActualCount INT, LogicalFrag DECIMAL, ExtentFrag DECIMAL)

-- Open the cursor and loop through all the tables in the database
OPEN tables
FETCH NEXT FROM tables INTO @tablename
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Do the showcontig of all indexes of the table
    INSERT INTO #fraglist
    EXEC ('DBCC SHOWCONTIG (''' + @tablename + ''') WITH FAST, TABLERESULTS, ALL_INDEXES, NO_INFOMSGS')
    FETCH NEXT FROM tables INTO @tablename
END

-- Close and deallocate the cursor
CLOSE tables
DEALLOCATE tables

-- Declare cursor for the list of indexes to be defragged
DECLARE indexes CURSOR FOR
SELECT ObjectName, ObjectId, IndexId, LogicalFrag FROM #fraglist
WHERE LogicalFrag >= @maxfrag AND INDEXPROPERTY (ObjectId, IndexName, 'IndexDepth') > 0

-- Open the cursor and loop through the indexes, defragging each one
OPEN indexes
FETCH NEXT FROM indexes INTO @tablename, @objectid, @indexid, @frag
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Executing DBCC INDEXDEFRAG (0, ' + RTRIM(@tablename) + ', '
        + RTRIM(CONVERT(VARCHAR(15), @indexid)) + ') - fragmentation currently '
        + RTRIM(CONVERT(VARCHAR(15), @frag)) + '%'
    SELECT @execstr = 'DBCC INDEXDEFRAG (0, ' + RTRIM(CONVERT(VARCHAR(15), @objectid))
        + ', ' + RTRIM(CONVERT(VARCHAR(15), @indexid)) + ')'
    EXEC (@execstr)
    FETCH NEXT FROM indexes INTO @tablename, @objectid, @indexid, @frag
END

-- Close and deallocate the cursor, then drop the work table
CLOSE indexes
DEALLOCATE indexes
DROP TABLE #fraglist
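One note on the question above: unlike DBCC DBREINDEX, DBCC INDEXDEFRAG does not refresh statistics, so a follow-up pass is needed after the defrag. A minimal sketch:

-- Update any out-of-date statistics on every user table in the current database
EXEC sp_updatestats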
I am looking to run UPDATE STATISTICS for the first time (don't ask why it wasn't done prior, please :() on a set of large tables in our 346 GB database, which has been populated with transactional data for the past 4 years. The tables contain 1.2, 35, 64, and 92 million rows. I have used the STATS_DATE function to determine that none of these tables have ever had update statistics run for them.
My question is how should I go about this process, and what options should I be selecting when issuing the command? I assume that I must first run with the FULLSCAN parameter in order to initially generate statistics for the table, and that following this initial population I could run without any parameters nightly against the tables in the database to keep statistics up to date. Any guidance you all could provide to a newb would be greatly appreciated.
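That plan is reasonable. A sketch of both steps, assuming SQL 2000-style metadata and an illustrative table name (STATS_DATE is the function referred to above):

-- Confirm when statistics were last built for each index on the table
SELECT name, STATS_DATE(id, indid) AS LastUpdated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.BigTable')

-- Initial, expensive full pass
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN

-- Subsequent nightly runs with default sampling
UPDATE STATISTICS dbo.BigTable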
I'm using SQL 7; there is a setting in DB properties called "Auto update statistics". What kind of statistics does this refer to, and how can these stats be accessed?
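"Auto update statistics" refers to the distribution statistics (histograms of column and index key values) that the optimizer uses to estimate row counts; when enough rows change, the server refreshes them automatically. Their contents can be inspected with DBCC SHOW_STATISTICS; the table and index names here are illustrative:

DBCC SHOW_STATISTICS ('dbo.SomeTable', 'IX_SomeIndex')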
Hi. I have automatic statistic update turned on for all my databases. Is this an overhead I can do without? Could I update them overnight when the database is hardly in use?
Thanks
--
Chris Weston
I am using the Maintenance Plan wizard, but it only allows me to select either the "reorganize data and indexes" option or the "update statistics" option (in the Optimizations tab). I can't select both of them. What is the reason for this?
We are using SQL Server 2005. The auto update statistics and auto create statistics options for the database are set to ON. This database has a very heavy workload. When I checked the individual statistics, the last-updated date is still an old value (a few months ago).
Is it necessary to manually update the statistics for the same database? Or can we rely upon "auto update statistics" itself?
Usually, at what frequency should manual UPDATE STATISTICS be run on a production system which has heavy transactions?
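Before scheduling manual updates, it is worth checking what is actually stale. A sketch for SQL Server 2005, listing every statistics object and its last update time:

SELECT OBJECT_NAME(s.object_id)            AS TableName,
       s.name                              AS StatName,
       STATS_DATE(s.object_id, s.stats_id) AS LastUpdated
FROM sys.stats AS s
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY LastUpdated

Note that auto update statistics only fires after roughly 20% of a table's rows have been modified, so on very large tables the statistics can legitimately sit untouched for months. A weekly job (nightly for the most volatile tables) is a common compromise on heavy-transaction systems.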