Transaction Log Growth Question
Nov 23, 2004
Hi DBAs,
I have transaction log backups running every 30 minutes. When I rebuild indexes on my database with the log backups running on that schedule, I get log backups whose size is greater than the actual log file (.LDF). I find this really strange: how can the backup be greater than the log file itself?
The tran log file is set to unrestricted growth (by 10%), and Auto Shrink is disabled.
Have any of you experienced this problem? Kindly share your comments.
Thanks,
V
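For what it's worth, a log backup contains every log record written since the previous log backup, not a copy of the .LDF file, so the two sizes don't have to match. A quick way to see how full the file actually is when each backup fires is to capture log usage just before the backup job runs; a minimal sketch:

-- one row per database: current log size (MB) and the percentage actually in use
DBCC SQLPERF(LOGSPACE)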
View 4 Replies
Feb 5, 2015
I received alerts from a couple of the production servers last night stating that the log file is running out of space, so I took some log backups and shrank those files. However, I would like to find out what made the log grow like that; after all, I have t-log backups scheduled every 30 minutes. Is there a way I can find out the reason behind the log growth?
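If the default trace is still enabled (it is by default on SQL 2005 and later), the auto-growth events are already recorded there. Something along these lines should show when the log grew and which application and login were active at the time; a sketch, with the event names as documented:

DECLARE @path nvarchar(260)
SELECT @path = path FROM sys.traces WHERE is_default = 1

SELECT te.name AS event_name, t.DatabaseName, t.[FileName], t.StartTime,
       t.IntegerData * 8 / 1024 AS growth_mb,   -- IntegerData is the growth in 8 KB pages
       t.ApplicationName, t.LoginName
FROM fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS te ON te.trace_event_id = t.EventClass
WHERE te.name IN ('Log File Auto Grow', 'Log File Auto Shrink')
ORDER BY t.StartTime DESC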
View 9 Replies
View Related
Oct 5, 2006
During my maintenance plans every night, my transaction log for a 17GB database grows to about 9.5GB. I am running transaction log backups every 2 hours, and my transaction log stays around 300MB during the day. However, every night when the optimization job runs, this explosive growth occurs. I have the optimizations set to Reorganize data and index pages, changing free space to 10%, and remove unused space from database files to 10%, shrinking when they grow beyond 50MB. This database handles cash register operations which are not active during the maintenance plan.
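One hedged suggestion: since the rebuild runs in a window with no register activity, switching to the BULK_LOGGED recovery model around the optimization job keeps index rebuilds minimally logged; the trade-off is losing point-in-time recovery for that window. It may also help to drop the "remove unused space" step, since shrinking nightly just forces the files to grow again during the next rebuild. A sketch (the database name is illustrative):

ALTER DATABASE RegisterDb SET RECOVERY BULK_LOGGED
-- ... the maintenance plan / DBCC DBREINDEX work runs here ...
ALTER DATABASE RegisterDb SET RECOVERY FULL
BACKUP LOG RegisterDb TO DISK = 'E:\Backups\RegisterDb_log.trn'  -- resume the normal log backup chain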
View 4 Replies
View Related
May 31, 2008
Hi All
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received.Thx
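BEGIN DISTRIBUTED TRAN enlists MSDTC on both machines, and MSDTC needs to make RPC connections in both directions, which frequently fails over a dial-up link; that's worth ruling out first. If the OPENDATASOURCE calls are the issue, a registered linked server is an alternative worth trying; a sketch, where the server name and logins are placeholders:

-- register the remote server once, so each statement doesn't embed credentials
EXEC sp_addlinkedserver  @server = N'REMOTESTN', @srvproduct = N'',
                         @provider = N'SQLOLEDB', @datasrc = N'10.10.10.171'
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'REMOTESTN', @useself = N'false',
                          @locallogin = NULL, @rmtuser = N'*****', @rmtpassword = N'****'
-- the updates then become, e.g.:
-- update REMOTESTN.TRANSFERSTN.TSADMIN.TRANSACTIONMAIN set REPFLAG = 0 where REPFLAG = 1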
View 1 Replies
View Related
Feb 22, 2007
I have designed an SSIS package for an ETL process. In the package I have to read data from tables and then insert it into another table of the same structure.
For reading the data I wrote dynamic T-SQL based on some conditions; it uses 25 different functions to populate 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it gives me a timeout error.
I have increased and decreased the timeout to chase the error, but it is still there, so I tried setting the CommandTimeout property to 0.
With CommandTimeout at 0 I get: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
Please help; it's very urgent.
View 3 Replies
View Related
Feb 6, 2007
I am getting this error: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Does anybody have an idea?
View 1 Replies
View Related
Dec 22, 2006
I have a sequence container, and in it a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these sit inside a For Each Loop container. When I run the package it works fine on the first loop, but it gives me an error on the second execution.
Message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
View 8 Replies
View Related
Jan 8, 2008
Hi,
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done over a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
Thanks in advance.
View 2 Replies
View Related
Aug 14, 2001
Does this seem right? We have our transaction logs set to "Truncate Log on Checkpoint" and they still grow over 1GB. Is it possible that one transaction (up to a checkpoint) generates this much logged information? Would transaction log backups every 5-10 minutes serve me better, or is this just a poorly written application?
Thanks!
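"Truncate log on checkpoint" only clears the inactive portion of the log, so a single long-running transaction pins every log record written after it began; 1GB from one transaction is entirely possible. Checking for the oldest open transaction is a quick first step; a sketch, where the database name is illustrative:

-- reports the oldest active transaction holding up log truncation, if any
DBCC OPENTRAN ('YourDb')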
View 4 Replies
View Related
Mar 6, 2001
I am having a problem with the growth of tempdb on my SQL 7.0 box. I have over 300+ stored procedures running (many with GROUP BY and ORDER BY in them), and this is causing my tempdb to grow to over 30 gigs. If I restart the services, it shrinks back down to the manageable 6 gigs I expect. Is there a way to have the services restart automatically on a nightly basis, or a way to have tempdb deallocate resources once they are used without restarting services?
I appreciate any help you can provide,
Nathan
View 3 Replies
View Related
Oct 9, 2002
Hi,
I am moving a table's data (14 million rows) from one server to another. The transaction log growth during the process is three-fold compared to the actual data file size.
Could you please tell why?
Thanks
John Jayaseelan
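Each inserted row is fully logged, and so is every index entry it produces, plus per-record overhead, so roughly three times the data size is not unusual for a single-transaction load. Breaking the copy into batches lets the log space be reused between chunks; a sketch assuming an integer key, where the server, database, and table names are illustrative:

DECLARE @min int, @max int, @batch int
SELECT @min = MIN(id), @max = MAX(id) FROM SourceServer.SourceDb.dbo.BigTable
SET @batch = 100000
WHILE @min <= @max
BEGIN
    INSERT INTO dbo.BigTable
    SELECT * FROM SourceServer.SourceDb.dbo.BigTable
    WHERE id BETWEEN @min AND @min + @batch - 1
    BACKUP LOG TargetDb WITH TRUNCATE_ONLY  -- or take a real log backup each pass to keep the chain
    SET @min = @min + @batch
END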
View 4 Replies
View Related
Feb 14, 2005
My question is: how do I analyze DB growth per day? Is there a tool or method I can use? I do take a look at the taskpad view and the files, but per day it doesn't move in MB, which is weird since this is a warehouse and there are nightly loads inserting maybe 30,000 records a night on average.
Any help would be greatly appreciated.
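One low-tech approach is to snapshot file sizes once a day with a scheduled job and diff the rows; a sketch for SQL 2000, where the table name is illustrative:

-- run daily via a SQL Agent job in each database of interest
if object_id('dbo.DbSizeHistory') is null
    create table dbo.DbSizeHistory (
        capture_date datetime not null default getdate(),
        file_name    sysname  not null,
        size_mb      int      not null)

insert dbo.DbSizeHistory (file_name, size_mb)
select name, size * 8 / 1024   -- sysfiles.size is in 8 KB pages
from sysfiles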
View 3 Replies
View Related
Sep 17, 2007
Hi everyone,
I'm a beginner in SQL Server databases, my problem is this:
I'm building a database whose frontend is an Access project. The database has several stored procedures, views, and user functions (the usual), but only a little data (just experimental). Last night I noticed that the file grew from 22 MB to 89 MB; the objects are the same and so is the data. The only difference was that I forgot to put the ADO "MoveNext" call in an event procedure that updates various records, so the loop was infinite.
Is it possible that SQL statements generated by ADO make the file grow so rapidly?
If so, how can I shrink it? I've tried, and the result was only 4%.
Can you help me!?
Thanks
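A runaway ADO loop issuing updates can easily log its way from 22 MB to 89 MB, and most of that growth is likely the transaction log rather than data. Shrinking the log file specifically, rather than the whole database, usually works better; a sketch, where the logical file name is hypothetical:

EXEC sp_helpfile                        -- find the logical name of the log file
BACKUP LOG YourDb WITH TRUNCATE_ONLY    -- discard log records if point-in-time recovery isn't needed
DBCC SHRINKFILE (YourDb_Log, 25)        -- target size in MB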
View 1 Replies
View Related
Jul 23, 2005
We use SQL 2000 and our database is configured to grow automatically by 10%. Currently 96% of our database is used. At what point will the database expand - what is the trigger point?
View 2 Replies
View Related
Jul 20, 2005
I have done this experiment on one of the tables. There is a table called Build having an nvText field, with a large number of records. I want to drop that column and recover the space. These are the results I got from sp_spaceused 'Build':

                                           name   rows    reserved   data       index_size  unused
1. Before deleting the nvText field        Build  663211  341440 KB  339464 KB  1944 KB     32 KB
2. After deleting the nvText field         Build  663211  341440 KB  339464 KB  1944 KB     32 KB
3. After Shrink Database (Ent. Manager)    Build  663211  608280 KB  604744 KB  3456 KB     80 KB
4. After DBCC DBREINDEX (build, '', 70)    Build  663211  124096 KB  123392 KB  712 KB      -8 KB

Can anyone please explain why, after executing step 3 (the shrink), the data as well as index_size columns show an increased figure, when logically they should be reduced?
Regards,
Harcharan
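A possible explanation, hedged: dropping a column is a metadata-only change in SQL Server 2000, which is why steps 1 and 2 are identical, and the shrink moves pages around without compacting rows, so the figures can temporarily look worse; the DBREINDEX in step 4 is what physically rebuilt the rows. The lighter-weight way to reclaim dropped-column space is:

-- reclaims space left behind by dropped variable-length columns, without a full reindex
DBCC CLEANTABLE ('YourDb', 'Build')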
View 1 Replies
View Related
Apr 20, 2007
I would like to know the following:
I want to see daily or weekly database growth (%), saved to a table.
I have an SP which gives me the figures for a single run; it is below.
Please advise on any other way to find this out and save it to a location.
create procedure sp_growth as
-- collects the current size and growth setting of every database file on the server
set ansi_warnings off
declare @l_db_name varchar(50)
       ,@l_sql_string varchar(1000)
set nocount on
if object_id('DB_Growth') is not null
    drop table DB_Growth
create table DB_Growth (Database_Name varchar(30), Logical_File_Name varchar(15), File_Size_MB int, Growth_Factor varchar(100))
declare db_name_cursor insensitive cursor
for
    select name from master..sysdatabases
open db_name_cursor
fetch next from db_name_cursor into @l_db_name
while (@@fetch_status = 0)
begin
    -- sysfiles.size is in 8 KB pages; status bit 0x100000 means growth is a percentage
    select @l_sql_string = 'select ' + '''' + @l_db_name + '''' + ', name, ceiling((size * 8192.0)/(1024.0 * 1024.0)), case when status & 0x100000 = 0 then convert(varchar,ceiling((growth * 8192.0)/(1024.0*1024.0))) + '' MB''' + char(10)+char(13)
        + 'else convert (varchar, growth) + '' Percent''' + char(10)+char(13)
        + 'end' + char(10)+char(13)
        + 'from [' + @l_db_name + '].dbo.sysfiles'
    insert into DB_Growth (Database_Name, Logical_File_Name, File_Size_MB, Growth_Factor)
    exec (@l_sql_string)
    fetch next from db_name_cursor into @l_db_name
end
close db_name_cursor
deallocate db_name_cursor
select * from DB_Growth with (nolock)
if object_id('DB_Growth') is not null
    drop table DB_Growth
set nocount off
set ansi_warnings on
return
GO
Thanks
Faiz Farazi
Daudkandi,Comilla, Bangladesh
http://www.databasetimes.net/
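To keep the history the question asks for, one hedged tweak is to append each run to a permanent table before DB_Growth is dropped, then compute the growth % between capture dates; a sketch, where the history table name is illustrative:

if object_id('dbo.DB_Growth_History') is null
    create table dbo.DB_Growth_History (
        capture_date      datetime not null default getdate(),
        Database_Name     varchar(30),
        Logical_File_Name varchar(15),
        File_Size_MB      int,
        Growth_Factor     varchar(100))

insert dbo.DB_Growth_History (Database_Name, Logical_File_Name, File_Size_MB, Growth_Factor)
select Database_Name, Logical_File_Name, File_Size_MB, Growth_Factor
from DB_Growth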
View 1 Replies
View Related
Aug 21, 2001
I've got a question about the automatic database growth feature of V7. Here's an example:
I have a 1gb db that can grow to max size of 2gb.
I set the auto grow option to 75%
The first time the db grows it will grab 75% of the free space (1gb)
What happens if the database needs to grow again?
Will the db grow using the remaining free space (25%) or
has the database reached its max size because it can't grow any further?
Thanks
View 1 Replies
View Related
Jun 19, 2000
Hello,
I started to write a stored procedure to insert into a temp table from sp_spaceused but couldn't get the format right. I did a search in the swynk archive and saw that some people have solved this but I couldn't see any example code. Does anyone have scripts they've written to gather db size info using sp_spaceused or some other mechanism?
Thanks in advance,
Dan
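Part of the formatting trouble is that the database-level form of sp_spaceused returns two result sets, which INSERT ... EXEC cannot consume. Querying sysfiles directly sidesteps that; a sketch using the undocumented (but long-standing) sp_MSforeachdb:

create table #dbsize (db_name sysname, size_mb int)

exec master..sp_MSforeachdb
    'insert #dbsize select ''?'', sum(size) * 8 / 1024 from [?].dbo.sysfiles'

select * from #dbsize order by size_mb desc
drop table #dbsize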
View 2 Replies
View Related
Sep 23, 2002
I am somewhat confused -- I have a database in production that I restored to a QA environment; upon restore, the size had grown by 200MB.
Both production and QA are running SQL2000 -- the only difference is that QA has the latest security hotfixes installed -- version 8.0.0.665 from KB article at the following link:
Q316333 (http://support.microsoft.com/default.aspx?scid=kb;en-us;Q316333&id=Q316333)
View 3 Replies
View Related
Oct 6, 2003
Restore/Log Growth Question:
We have an SQL2K SP3 decision-support server which will be receiving a nightly data dump from the offsite production system. We've decided on a backup/restore process to move the data. We're currently testing the process and have come across a problem. The test backup file is only 500MB (the real DB will be approx. 20GB), and when it is restored the DB is fine, but the log file has ballooned to 8 GB.
When the DB is restored, the properties of the production DB are forced over our settings. These include no truncation of log on checkpoint and a full recovery model (the destination DB will have simple recovery). We've been through Books Online and searched online forums, but have not found information on this particular problem. We can, of course, shrink the log file after restoring, but are concerned about the amount of dead space needed when the real DB is delivered.
Any help would be greatly appreciated.
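Since RESTORE always brings the source database's options and file sizes along with it, the usual pattern is a post-restore step that forces the local settings back and shrinks the log to a working size; a sketch, where the names are illustrative:

ALTER DATABASE DwDSS SET RECOVERY SIMPLE     -- the restore re-applies the production (full) model
DBCC SHRINKFILE (DwDSS_Log, 500)             -- target MB sized for normal daily activity

Note that the restore itself still allocates the log at the source's size, so shrinking the log on the production side before the backup is taken is the only way to reduce the dead space needed on delivery.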
View 8 Replies
View Related
Nov 8, 2006
Hi All,
I am researching how to calculate the percentage of database growth over time. I have looked on the net, but I didn't like anything I saw. Do you have any ideas?
View 8 Replies
View Related
Jul 1, 2004
We have a SQL 2000 database here. As of 3 days ago, it was about 75 megs (73 data / 2 log). As of today, it is 73 megs data / 15 GIG log. The log file went from 2 MEG to 15 GIG in a matter of days, yet the data file size has not changed. Also, there are NO users of this database - it's new and has not been placed into production yet. I currently don't have any idea as to what happened - I'm looking into Lumigent's Log Explorer software now. Does anyone have an explanation as to why the log file would increase 6000X in size, yet the data is not any larger?
Thanks,
JK
View 6 Replies
View Related
Jan 14, 2005
Hi everyone.
I am trying to find a way to calculate my DB growth every day. I did find a script on some site, but it seems to give me the same information as the taskpad, which is not very specific. Basically I would like to know the size of a table in MB (or whatever conversion is possible), so that I will be able to do some forecasting.
Any help here would be greatly appreciated.
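For per-table trending, the table-level form of sp_spaceused returns a single result set, so it can be captured into a table on a schedule; a sketch for SQL 2000, where the table name is illustrative:

create table #spaceused (
    name       sysname,     rows   varchar(11),
    reserved   varchar(18), data   varchar(18),
    index_size varchar(18), unused varchar(18))

insert #spaceused exec sp_spaceused 'dbo.YourBigTable'

select * from #spaceused   -- sizes come back as strings like '341440 KB'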
View 1 Replies
View Related
Sep 14, 2007
All:
I've got a serious problem happening now with the transaction log on one of my databases. The log keeps growing at an alarming rate - avg. 1GB/hr on a ~550MB database; there's nowhere near that much insert/update activity in the database. This is causing space problems with the (hourly) log backups.
The database serves an internal web site; both the web server and the SQL server have been bounced, but this keeps happening.
Profiler shows many, many TransactionLog events - orders of magnitude more than other events. User activity is no greater than it ever is. DBCC OPENTRAN consistently returns no open transactions. I've created a job to checkpoint and truncate the log right before the log backups, but that's obviously only a stopgap for the backup space problem and not a solution.
Do you have any suggestions as to where to go from here? I'm at a loss, and this can't stay as it is. Thanks for ANY input...
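If this is SQL Server 2005 or later, one more thing worth checking is what the engine itself says is pinning the log; a sketch:

SELECT name, log_reuse_wait_desc   -- e.g. LOG_BACKUP, ACTIVE_TRANSACTION, REPLICATION
FROM sys.databases
WHERE name = 'YourDb'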
View 3 Replies
View Related
Sep 19, 2005
Hi,
While I am importing some data into my SQL Server (2000) database, the free space is increasing by GBs. The database's file growth is 1 MB (the same for both data files and transaction files). I restored the same database on another PC and ran the same process, and it works fine there, with no enormous growth in free space.
Any idea why this is?
Any idea why this is
Regards,
Eldho
View 8 Replies
View Related
Jan 15, 2008
Hello Everyone,
I appear to be doing something wrong. When I set my table for Unrestricted File Growth and save it, then go back and look at the settings, it is set back to restricted file growth.
It seems to me that if you set a setting and apply and/or save it, it should keep that setting.
Any advice would be great.
Thanks
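If the dialog keeps reverting, setting the option in T-SQL and then re-checking it is a reasonable cross-check; a sketch, where the database and logical file names are hypothetical:

ALTER DATABASE YourDb MODIFY FILE (NAME = YourDb_Data, MAXSIZE = UNLIMITED)
EXEC sp_helpfile   -- run in YourDb; the maxsize column should now show unrestricted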
View 1 Replies
View Related
Mar 3, 2008
I have a database whose log file size is 10 MB, and it goes into single-user mode automatically. I tried to increase the log file size from 10 MB to 50 MB, but I really only want it to be 20 MB. Now I am unable to reduce it, since I get a message that the size of the file cannot be decreased. Is there another way to decrease the size of the log file?
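ALTER DATABASE ... MODIFY FILE can only grow a file, which is why the attempt errors out; shrinking is a separate command. A sketch, where the logical file name is hypothetical (find the real one with sp_helpfile):

USE YourDb
DBCC SHRINKFILE (YourDb_Log, 20)   -- target size in MB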
View 2 Replies
View Related
Sep 22, 2005
Hello,
I need to monitor growth in the data file and log file every 15 minutes. Since the mdf and initial file sizes are set to a high value, measuring those values at a 15-minute interval will not show the change in size. My intention is to measure the log file growth, which helps to calculate the disk space and bandwidth required to set up log shipping. We need to set up this infrastructure based on this calculation.
Thanks
M A Srinivas
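Since the files themselves are pre-grown and won't move, capturing used log space at each interval is what gives the growth rate for log-shipping estimates. The output of DBCC SQLPERF(LOGSPACE) can be stored straight into a table; a sketch, to be scheduled every 15 minutes from a SQL Agent job (table name is illustrative):

create table dbo.LogSpaceHistory (
    capture_time       datetime default getdate(),
    db_name            sysname,
    log_size_mb        decimal(18, 2),
    log_space_used_pct decimal(18, 2),
    status             int)

insert dbo.LogSpaceHistory (db_name, log_size_mb, log_space_used_pct, status)
    exec ('DBCC SQLPERF(LOGSPACE)')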
View 6 Replies
View Related
Jul 28, 2007
SQL Server 2000 SP4. I built a large DTS package that grabs a number of tables from an Oracle DB, does some scrubbing and date verification, and loads into a SQL Server DB. Most of the tables are full refresh and a few are incremental.
Main DW: DwSQL
Staging area: DwLoadAreaSQL
The DW is about 60 gigs. The staging area is about 80 gigs. This is all good.
However, the log file for the staging area is 50 gigs and I'm trying to find ways to not require such a large log file. I tried adding a few "BACKUP LOG DwLoadAreaSQL WITH TRUNCATE_ONLY" statements in the DTS package, but figured out that because it's one DTS package it's all one transaction. I've thought about breaking it up into multiple DTS packages and truncating the log between running them, but was hoping to avoid this. To be clear, I know how to shrink DBs and log files... that's not the issue.
Any ideas? Thanks.
View 2 Replies
View Related
Feb 5, 2008
I want to be able to display the percent of growth for databases on the server. My query has the file sizes, names, and dates; it's run monthly. How do I create a field with the growth % from one month (or one date entry) to the next, so it appears like this:

2006     % growth     2007     % growth     2008
23000                 3400                  20000

Keep in mind, this is how it is displayed in Reporting Services with a matrix (crosstab).
Here's my query
Code Snippet
SELECT ds.AutoId, ds.Server_Description, ds.Database_Size_Datetime,
       ds.Database_Description, ds.Database_Size_Mb, ds.Database_Location,
       ds.Database_File_Name, DbSTAT.MonthlyDBTotal
FROM RC_STAT.dbo.Tbl_Database_Statistics AS ds
INNER JOIN (SELECT Database_Size_Datetime, Server_Description, Database_Description,
                   SUM(Database_Size_Mb) AS MonthlyDBTotal
            FROM RC_STAT.dbo.Tbl_Database_Statistics
            GROUP BY Database_Size_Datetime, Server_Description, Database_Description) AS DbSTAT
        ON  DbSTAT.Server_Description    = ds.Server_Description
        AND DbSTAT.Database_Description  = ds.Database_Description
        AND DbSTAT.Database_Size_Datetime = ds.Database_Size_Datetime
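One way to get the growth column, hedged: aggregate per database and capture date, then join each row to the immediately preceding capture for the same database. A sketch against the same table (untested; the aliases are illustrative):

SELECT cur.Server_Description,
       cur.Database_Description,
       cur.Database_Size_Datetime,
       cur.MonthlyDBTotal,
       (cur.MonthlyDBTotal - prev.MonthlyDBTotal) * 100.0
           / NULLIF(prev.MonthlyDBTotal, 0) AS Pct_Growth
FROM (SELECT Server_Description, Database_Description, Database_Size_Datetime,
             SUM(Database_Size_Mb) AS MonthlyDBTotal
      FROM RC_STAT.dbo.Tbl_Database_Statistics
      GROUP BY Server_Description, Database_Description, Database_Size_Datetime) AS cur
LEFT JOIN
     (SELECT Server_Description, Database_Description, Database_Size_Datetime,
             SUM(Database_Size_Mb) AS MonthlyDBTotal
      FROM RC_STAT.dbo.Tbl_Database_Statistics
      GROUP BY Server_Description, Database_Description, Database_Size_Datetime) AS prev
  ON  prev.Server_Description   = cur.Server_Description
  AND prev.Database_Description = cur.Database_Description
  AND prev.Database_Size_Datetime =
      (SELECT MAX(p2.Database_Size_Datetime)
       FROM RC_STAT.dbo.Tbl_Database_Statistics AS p2
       WHERE p2.Server_Description    = cur.Server_Description
         AND p2.Database_Description  = cur.Database_Description
         AND p2.Database_Size_Datetime < cur.Database_Size_Datetime)
ORDER BY cur.Database_Description, cur.Database_Size_Datetime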
View 4 Replies
View Related
Sep 12, 2007
I'm trying to load a fact table using SSIS; the table has a primary key. This is a one-time load of about 150M rows, and after this I'll be doing incremental loads only. It takes more than 8 hrs and finally the ETL fails because there is not enough disk space.
After those 8 hrs, when I right-click the table's properties and look at the row count, I can see all 150M rows, but it seems to take forever to commit the transaction. It keeps adding to the log, and when just a few hundred MBs of disk space are left, the row count rolls back, which takes another few hours to get back to 0, and then it fails...
I've changed the database recovery model from full to simple, and I've set the OLE DB destination's Rows per batch and Maximum insert commit size both to 100000. Should I just leave it at 0? Or can adding space to the log solve the problem? (I have almost 95 GB free before the load.) I'm just having a hard time understanding why it takes forever to commit the transaction.
Can anyone shed some light?
Thanks-
Rok
View 8 Replies
View Related