Transaction Log Back Ups - Simple Question
Aug 15, 2007
When using a maintenance plan to back up transaction logs, should I include the system database logs such as master, model, etc.?
Thanks!
Does anyone know what the commands would be? I am trying to create a job that puts a database in simple recovery mode, runs a reorg and re-index, and then sets it back to full recovery when it is complete. This way I can avoid the large transaction logs that would otherwise be created.
Any help would be great!
Thanks
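A minimal sketch of what such a job step could look like, assuming a hypothetical database named MyDb and SQL Server 2000-style index maintenance; note that switching out of FULL recovery breaks the log backup chain, so a full (or differential) backup is needed afterwards before log backups are useful again:

-- Hypothetical database, table, and path names; adjust to your environment.
ALTER DATABASE MyDb SET RECOVERY SIMPLE

-- Run the index maintenance here, for example rebuilding all indexes on a table:
DBCC DBREINDEX ('MyDb.dbo.MyTable')

ALTER DATABASE MyDb SET RECOVERY FULL

-- The switch breaks the log backup chain; take a full backup so that
-- subsequent transaction log backups can be restored again.
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_after_reindex.bak'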
Hi all,
I'm getting the following error message in the NT Event Viewer:
Error: 9002, Severity: 17, State: 2
The log file for database 'db_sys' is full. Back up the transaction log for the database to free up some log space.
But I don't know how to back up the transaction log for the database.
Do I need to truncate the log (in Enterprise Manager) to free up some log space?
I only know how to back up the whole database using All Tasks -> Backup Database in Enterprise Manager. Please help me. Thanks in advance.
TH
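A sketch of backing up the log with T-SQL instead of Enterprise Manager; the disk path, file name, and logical log file name below are examples, not taken from the original post:

-- Back up the transaction log of db_sys; this frees space inside the log for
-- reuse (the physical file only shrinks if you also run DBCC SHRINKFILE).
BACKUP LOG db_sys TO DISK = 'D:\Backups\db_sys_log.trn'

-- Optional: shrink the physical log file afterwards.
-- Find the logical log file name first: SELECT name FROM db_sys..sysfiles
USE db_sys
DBCC SHRINKFILE (db_sys_log)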
I have a stored procedure that calls another stored procedure, with the first stored procedure opening a transaction:

BEGIN
SET XACT_ABORT ON
BEGIN TRANSACTION
-- does various updates/inserts
-- calls 2nd stored procedure to process updates/inserts common to many other stored procedures
-- does more various updates/inserts
COMMIT
END

The problem I'm having is that if the 2nd stored procedure encounters an error, it does not roll back the entire transaction, and I finish up with missing records in the database. I am using this in the 2nd stored procedure:

if (@TypeId1 = @TypeId2 and @Line1 <> '' and @Line2 <> '')
begin
RAISERROR('error message', 16, 1)
RETURN
end

What could the problem be? From what I've read, it seems as though you can't have an open transaction in one stored procedure that calls another stored procedure and have it maintain the same transaction? Is this correct? I tried the following too, and I still couldn't get it to work. Any ideas?

************ sp 1 ***********
Declare @AddressError char(3)
SET XACT_ABORT ON
BEGIN TRANSACTION
exec Sp2 @AddressError OUTPUT, @variable1, @variable2, etc.

************** sp 2 *****************
@AddressError char(3) OUTPUT,
if (@TypeId1 = @TypeId2 and @Line1 <> '' and @Line2 <> '')
begin
RAISERROR('error message', 16, 1)
RETURN
end
SET XACT_ABORT ON
BEGIN TRANSACTION
-- process updates/inserts
Set @AddressError = 'no'
Commit

******** back to sp 1 ************
If @AddressError <> 'no'
BEGIN
rollback transaction
END
-- continue doing updates/inserts
commit
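One detail that may be relevant here: SET XACT_ABORT ON rolls back the transaction for most run-time errors, but RAISERROR does not honor XACT_ABORT, so the outer procedure still has to detect the failure and roll back itself. A minimal sketch of that pattern, with hypothetical procedure names and placeholder statements:

-- Hypothetical inner procedure: signals failure via its return code.
CREATE PROCEDURE Sp2_Sketch
    @TypeId1 int, @TypeId2 int
AS
BEGIN
    IF (@TypeId1 = @TypeId2)          -- placeholder validation
    BEGIN
        RAISERROR('error message', 16, 1)
        RETURN 1                      -- non-zero return code = failure
    END

    -- ... updates/inserts common to many procedures ...
    RETURN 0
END
GO

-- Hypothetical outer procedure: owns the transaction and checks the return code.
CREATE PROCEDURE Sp1_Sketch
AS
BEGIN
    DECLARE @rc int
    BEGIN TRANSACTION

    -- ... various updates/inserts ...

    EXEC @rc = Sp2_Sketch @TypeId1 = 1, @TypeId2 = 2
    IF @rc <> 0 OR @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION
        RETURN 1
    END

    -- ... more updates/inserts ...

    COMMIT TRANSACTION
    RETURN 0
END
GO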
Hi,
I have a case where I read from a SQL Server database and write to a flat file.
I have one Data Flow Task, inside which an OLE DB source component feeds rows to a script component that writes to a flat file. I have set the transaction attributes for the container to "Required" and "Read committed". But I find that rows are written to the flat file even when I throw an exception from my script component. The question is: how do I prevent rows from being written to the flat file if an error/exception happens? I want the whole process to be in a single transaction.
Thanks
Hello, I do not know how to implement transaction rollback in an ASP.NET application. I am using the SqlHelper class to communicate with my SQL database. Thanks, junior
I've got an insert statement that fails, and below that I have code like the following:
IF @@ERROR <> 0
BEGIN
-- Roll back the transaction
ROLLBACK
-- Raise an error and return
RAISERROR ('Error INSERT INTO Address.', 16, 1)
print 'test was here'
RETURN
END
However, there is no rollback and the inserts below it are going through.
What do I have wrong?
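A common cause of this is that no explicit transaction was opened (so there is nothing to roll back), or that @@ERROR was read only after another statement had already reset it. A minimal sketch of the usual pattern, with a hypothetical column list on the Address table:

BEGIN TRANSACTION

-- Hypothetical insert; replace with the real statement.
INSERT INTO Address (Line1, City) VALUES ('123 Main St', 'Springfield')

-- @@ERROR must be checked immediately after the statement it refers to,
-- because every subsequent statement resets it.
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    RAISERROR ('Error INSERT INTO Address.', 16, 1)
    RETURN
END

-- ... further inserts ...

COMMIT TRANSACTION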
The following transaction runs, then reverts after 10 minutes.
begin Tran [UpdateSundays]
update ScanAssetInformation with (rowlock)
set BusinessDate = dateadd(day,-2,BusinessDate)
where datepart(weekday,BusinessDate) = 1
commit tran [UpdateSundays]
begin Tran [UpdateSaturdays]
update ScanAssetInformation with (rowlock)
set BusinessDate = dateadd(day,-1,BusinessDate)
where datepart(weekday,BusinessDate) = 7
commit tran [UpdateSaturdays]
Hi,
I want to create a disaster recovery site, to which I can fail over (not automatically),
and also have the option to return the database to a point in time.
For example,
if my principal server fails at 17:00,
I want to have the option to make the mirror server available to users from 17:00 (or at least close to that time),
and also to be able to return to the data as of 16:00 (on the mirror site).
Is it possible, and what is the best way to do it?
Thanks.
Hi, I would like to know the correct response to a crash in both scenarios.
First scenario:
I made a full backup at 6 am, then scheduled SQL Server to make a transaction log backup every 2 hours (8, 10, 12, 2 pm, 4, 6, 8). If I have a crash at 12:30, how would I restore the data? Can I restore the full backup done at 6 am and then restore only the last transaction log backup (the 12 noon one)? I am not sure if I need to restore the whole chain of transaction log backups from 6 am up to the time of the crash.
Second scenario:
I made a full backup at 6 am, then scheduled SQL Server to make a differential (incremental) backup every 2 hours (8, 10, 12, 2 pm, 4, 6, 8). If I have a crash at 3:00 pm, how would I restore the data? Do I restore the full backup from 6 am and then restore each incremental backup backwards (2, 12, 10, 8)?
As you can see, I am not sure how to deal with this issue; I do appreciate your feedback.
Best regards
Ahmed
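For reference, a minimal sketch of the restore sequence for the first scenario, under the assumption that the log backup chain is unbroken; file names and paths are examples only. Each log backup contains only the changes since the previous one, so they are all restored in order, not just the last:

-- If the log file is still accessible, first back up the tail of the log
-- (add WITH NO_TRUNCATE if the data files are damaged).
BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_tail.trn'

RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb_full_0600.bak' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_0800.trn' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_1000.trn' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log_1200.trn' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_tail.trn' WITH RECOVERY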
I have a Development database and I want to roll it back to Monday morning. I backed up the database and used the command:
RESTORE DATABASE ITTEST
FROM ITTEST20040203
WITH NORECOVERY
GO
RESTORE LOG MyNwind
FROM ITTEST20040203
WITH RECOVERY, STOPAT = 'FEBRUARY 2, 2004 09:00 AM'
GO
The transaction logs have never been truncated.
But it does not seem to have worked. Is this the best way to roll back the database, or have I missed something?
thanks.
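A minimal sketch of a point-in-time restore using disk backup files (the file names are hypothetical); the RESTORE LOG must name the same database as the RESTORE DATABASE, and the STOPAT time has to be covered by the log backup being restored:

RESTORE DATABASE ITTEST
FROM DISK = 'D:\Backups\ITTEST_full.bak'
WITH NORECOVERY
GO
RESTORE LOG ITTEST
FROM DISK = 'D:\Backups\ITTEST_log.trn'
WITH RECOVERY, STOPAT = 'February 2, 2004 09:00 AM'
GO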
Hi,
Is there a way to write to a log table inside a transaction that is rolled back, without the log entry itself being rolled back?
thanks in advance
Raimund
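One commonly used workaround, sketched here with hypothetical table names: table variables are not affected by a transaction rollback, so log rows can be collected in a table variable inside the transaction and written to the permanent log table after the rollback:

-- Hypothetical log table and columns; adjust to your schema.
DECLARE @log TABLE (LogTime datetime, Message nvarchar(400))

BEGIN TRANSACTION

-- ... work that may fail ...
INSERT INTO @log (LogTime, Message) VALUES (GETDATE(), 'step 1 done')

IF @@ERROR <> 0
    ROLLBACK TRANSACTION    -- undoes the real work, but @log keeps its rows
ELSE
    COMMIT TRANSACTION

-- Persist the captured entries outside the (possibly rolled back) transaction.
INSERT INTO LogTable (LogTime, Message)
SELECT LogTime, Message FROM @log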
I have a page that runs a transaction correctly after a button click.
I want to allow someone to click a button that rolls back the transaction, after the transaction runs on the first button click.
I can also successfully roll back within the first button click.
I'm getting a NullReference error when trying to access SqlTransaction.Rollback() outside the button click.
If SqlTransaction.Commit() completes without error, can SqlTransaction.Rollback() be called afterwards?
I tried making 'trans' a more global variable and it still gave me the error.
Button Click 1:
Dim trans As SqlTransaction
trans = connection.BeginTransaction()
Try
    'run SQL statement
    trans.Commit()
Catch e As Exception
    trans.Rollback()
    Throw e
End Try
Button Click 2:
trans.Rollback()
I am confused about SAVE TRANSACTION in the scenario below:
begin transaction
save transaction t1
delete from #t1
save transaction t2
begin try
delete from #t2
[Code] ....
If there is an error after the delete from #t2, transaction t1 is rolled back. But I am not able to understand why I am getting an error on the statement 'rollback transaction t2'. The error is 'Cannot roll back t2. No transaction or savepoint of that name was found.', even though savepoint t2 is declared in the code.
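A sketch of how savepoints normally behave, with hypothetical temp tables. The key point is that a savepoint only exists while its enclosing transaction is still open: if an error has already doomed or rolled back the outer transaction (for example a severe error, or any error with XACT_ABORT ON), ROLLBACK TRANSACTION t2 fails with exactly the message above because the savepoint no longer exists:

CREATE TABLE #a (id int)
CREATE TABLE #b (id int)

BEGIN TRANSACTION
SAVE TRANSACTION t1

DELETE FROM #a

SAVE TRANSACTION t2
BEGIN TRY
    DELETE FROM #b
    -- ... statement that fails ...
END TRY
BEGIN CATCH
    IF XACT_STATE() = 1
        ROLLBACK TRANSACTION t2   -- transaction still committable: savepoint is valid
    ELSE IF XACT_STATE() = -1
        ROLLBACK TRANSACTION      -- transaction doomed: only a full rollback works
END CATCH

IF @@TRANCOUNT > 0
    COMMIT TRANSACTION

DROP TABLE #a
DROP TABLE #b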
Hi,
I followed Remus' post about not doing 'fire and forget'.
I have two queues, ProcessingSendQueue and ProcessingReceiveQueue.
Once I receive from ProcessingReceiveQueue, the activation SP gets called on ProcessingSendQueue and ends the conversation.
However, if I then get an exception, the action of the activation SP (i.e. the ending of the conversation) does not get rolled back... is this possible? I would have thought that the action of the activation SP would get rolled back too.
My ProcessingSendQueue activation SP is as follows:
ALTER PROCEDURE [dbo].[ProcessingSendQueue_AP]
AS
BEGIN
DECLARE @dh UNIQUEIDENTIFIER;
DECLARE @message_type SYSNAME;
DECLARE @message_body NVARCHAR(4000);
RECEIVE @dh = [conversation_handle], @message_type = [message_type_name], @message_body = CAST([message_body] AS NVARCHAR(4000))
FROM [ProcessingSendQueue];
IF @dh IS NOT NULL
BEGIN
IF @message_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/Error'
BEGIN
RAISERROR (N'Received error %s from service [ProcessingReceiveQueue]', 10, 1, @message_body) WITH LOG;
END
END CONVERSATION @dh;
END
END
Goofed up and ran an update query. It messed up all the data in a single table. I'm trying not to restore the table from a previous backup since the backup is more than 20 GB. It's going to take forever to restore it. Any advice would be much appreciated!
Thanks!
Charles.
Hi,
I need an example of a simple transaction (I looked in MSDN and I didn't understand it).
Let's say I have 2 update queries. I want to run these queries in a transaction, and have it roll back if one of the updates fails.
I'm a developer using PHP to execute all the queries, so I need a simple answer back from the transaction: one row with one column that can be 'success' or 'failed'. That's it.
thanks.
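A minimal sketch of such a batch, assuming SQL Server 2005 or later for TRY/CATCH and entirely hypothetical table and column names; the two updates run in one transaction and the batch returns a single row whose status column is either 'success' or 'failed':

BEGIN TRY
    BEGIN TRANSACTION
    UPDATE Accounts SET Balance = Balance - 100 WHERE AccountId = 1
    UPDATE Accounts SET Balance = Balance + 100 WHERE AccountId = 2
    COMMIT TRANSACTION
    SELECT 'success' AS status
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION
    SELECT 'failed' AS status
END CATCH

PHP can then simply read the single column of the single returned row.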
We had a situation last night in our production environment that forced us to revert to an earlier version of the database (before a major code rollout that failed). After restoring the day's full backup (with NORECOVERY) and then restoring a DIFF backup (with RECOVERY, and with Preserve Replication Settings checked), the transaction replication failed.
Message #1
The replication agent has been successfully started. See the Replication Monitor for more information.
Message #2
2011-03-04 15:07:17.566 Copyright (c) 2008 Microsoft Corporation
2011-03-04 15:07:17.566 Microsoft SQL Server Replication Agent: logread
2011-03-04 15:07:17.566
2011-03-04 15:07:17.566 The timestamps prepended to the output lines are expressed in terms of UTC time.
2011-03-04 15:07:17.566 User-specified agent parameter values:
[code]....
I've tried reinitializing the publication/subscription, and while that took brand new snapshots and copied them over to the replicated data server, it did not fix the problem. I read in a different post that I could try running "sp_replrestart", but that ran for about half an hour and didn't appear to do anything but fill up our log files... did I not wait long enough?
The only thing I know to do at this point is to drop the publication on the production server and rebuild it completely (and with all the tables we're replicating, that would take quite a bit of time).
We have a high volume database with 1000's of users and 1000's of procs. Our application enforces a 20 second timeout on all connections.
We can't adjust the 20 seconds - this is a business rule.
It sometimes happens that a proc does not complete within 20 seconds and times out halfway through. This causes data inconsistency where 50% of the changes were saved to the DB and 50% were not, since a stored proc is not inherently transactional and therefore does not roll back on its own.
We can't put the code in a TRANSACTION in order to roll back when a time out occurs, because this causes exclusive locks on the tables.
So I guess my question is:
Is it possible to undo/rollback all the code in a proc when a timeout occurs - without using a TRANSACTION? And if a TRANSACTION is the only way - how do I avoid the exclusive lock and blocks?
I have a transaction number in my mapping table. I have a matching transaction number in my PDHist table. Sometimes I have matching transaction numbers in my PD table, but not always. This is causing no records to be returned. I have a One to Many relationship between my mapping table and both PD and PDHist.
Also, I need to check for nulls in my foreign exchange table. I can't post the SQL because this is a classified project. However, it should be something like this, I think:
IIf(IsNull([Redem]![FX Rate]),([PDHist]![Remaining Balance]+[PD]![Closing Balance(TC)]).
The addition isn’t working. I think with a small push I can get this straightened out.
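A sketch of the likely shape of the fix, using entirely hypothetical table and column names since the real SQL can't be posted: join PD (and the foreign exchange table) with an outer join so missing transaction numbers don't eliminate rows, and wrap nullable values in ISNULL so the addition doesn't evaluate to NULL:

-- All names here are hypothetical placeholders.
SELECT  m.TransactionNumber,
        ISNULL(h.RemainingBalance, 0) + ISNULL(p.ClosingBalanceTC, 0) AS CombinedBalance,
        ISNULL(fx.FXRate, 1.0) AS FXRate
FROM    Mapping m
JOIN    PDHist h   ON h.TransactionNumber = m.TransactionNumber
LEFT JOIN PD    p  ON p.TransactionNumber = m.TransactionNumber   -- PD rows may be missing
LEFT JOIN Redem fx ON fx.TransactionNumber = m.TransactionNumber  -- FX rate may be NULL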
Hi,
We have a SQL Server 2000 database which we set to SIMPLE recovery mode at 6 pm (due to nightly large data loads). We revert to FULL recovery mode at 6 am.
My understanding was that in SIMPLE recovery mode the transaction log would not grow, because it would automatically be truncated after a checkpoint. However, this is not the case. I thought perhaps it could be due to a long-running uncommitted transaction, but when I ran 'dbcc opentran', the oldest running transaction doesn't last more than a couple of minutes. I also manually ran a 'checkpoint' command in the hope of forcing the transaction log truncation, and repeated it a couple of times to no avail. When I run 'dbcc sqlperf(logspace)', I can still see the transaction log growing.
It is not until I run 'backup log db with truncate_only' that the transaction log gets truncated.
I do not understand why the transaction log does not get automatically truncated in SIMPLE recovery mode.
Andrew
We have the following scenario,
We have our production server hosting a database on which a few DTS packages execute every night. Most of them run bulk insert stored procedures.
So we have to set the recovery model of the database to simple for that period of time; otherwise it blows up our logs.
Is there any way we can set up log shipping between our production and standby servers, but pause it for some time, set the recovery model of the primary database to simple, execute the DTS bulk insert jobs, bring it back to the full recovery model, and finally resume log shipping?
Is it possible, and if yes, how can we achieve this?
If not what could be another DR solution in this scenario.
Thanks Much
Tejinder
My transaction log is 25 GB and my database file is 39 GB. I just switched to the 'Simple' recovery model from the 'Full' recovery model. When, if ever, can I expect the size of the transaction log to shrink? Is there anything else that I should do to aid the reduction?
Thanks,
Peter
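Switching to Simple allows the log to be truncated, but truncation only frees space inside the file; the physical file stays at 25 GB until it is explicitly shrunk. A minimal sketch, with a hypothetical database name and logical file name:

-- Find the logical name of the log file first.
USE MyDb
SELECT name, size FROM sysfiles

-- Then shrink the log file (logical name assumed here to be MyDb_log)
-- down to a target size, e.g. 500 MB.
DBCC SHRINKFILE (MyDb_log, 500)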
Hi there - can anyone advise on the following issue? We recently performed some server-side tracing on a particular SQL instance over a 24-hour period. We are now attempting to load the trace files into a database for analysis. Herein lies the problem.
When we load the Profiler trace files (one at a time) into the database, the transaction log grows at an excessive rate, even though the database is in SIMPLE mode.
We are loading the traces using the command:
INSERT INTO sqlTableToLoad
SELECT * FROM ::fn_trace_gettable('MytraceFileName', DEFAULT)
Can anyone advise how we could get around this issue, as we're running out of space due to the transaction log?
Thanks
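If the single large INSERT ... SELECT is what drives the log growth, one possible workaround (a sketch, not tested against these trace files) is to let SELECT INTO create the table instead, since SELECT INTO is minimally logged under the SIMPLE recovery model:

-- Creates a new table from the trace; the table name is an example only.
SELECT *
INTO   sqlTableToLoad_staging
FROM   ::fn_trace_gettable('MytraceFileName', DEFAULT)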
Hi all
I am working in a production environment. Please help me free up some space.
The log file for database 'Home_alone' is full. Back up the transaction log for the database to free up some log space.
Hello,
I'm trying to create a simple backup in the SQL maintenance plan that will make a single backup copy of every database each night at 10 pm. I'd like the previous night's file to be overwritten, so there will be only a single backup file for each database (the tape backup runs every night, so each day's backup will be saved on tape).
Every night the maintenance plan makes a backup of all the databases to a new file with a datetime stamp, meaning the previous night's file still exists. Even when I check "Remove files older than 22 hours", the previous night's file still exists. Is there any way to create a backup file without the datetime stamp so it overwrites the previous night's file?
Thanks!
Rick
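Maintenance plans always append a timestamp to the backup file name, so one workaround (a sketch, with a hypothetical path) is to schedule a plain BACKUP DATABASE statement as a SQL Server Agent job instead, using a fixed file name and WITH INIT so each run overwrites the previous night's backup:

-- Fixed file name; INIT overwrites the existing backup sets in the file.
BACKUP DATABASE MyDb
TO DISK = 'D:\Backups\MyDb.bak'
WITH INIT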
I'm new to database mirroring and I have a question about the principal database server. I have a database mirroring setup configured for high-safety mode with automatic failover, using a witness.
When a failover occurs because of a loss of communication between the principal and the mirror, the mirror server takes on the role of principal. When communication to the original principal server is restored, does the database that was previously the principal automatically go back to being the principal at some point?
I need to run two reports, each A5 size, back to back and print them on a single A4 page: the sales bill will be printed in the first half and the gate pass in the second half. Both reports will be on the same page, and their size and shape should be maintained. How do I do it?
Hello,
I am hoping you can help me with the following problem. I need to process the following steps every couple of hours in order to keep our SQL 2000 database as small as possible (the transaction log is 5x bigger than the database):
1. Back up the entire database
2. Truncate the log
3. Shrink the log
4. Back up once again
As you may have determined, I am relatively new to managing a SQL Server database, and while I have found multiple articles online about the topics I need to accomplish, I cannot find any actual examples that explain where I input the code used to accomplish the above-mentioned steps. I understand the theory behind the steps; I just do not know how to accomplish them! If you know of a well-documented tutorial, please point me in the right direction.
Regards.
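A minimal sketch of those four steps as T-SQL that could be pasted into Query Analyzer or scheduled as SQL Server Agent job steps; the database name, logical log file name, and paths are hypothetical, and TRUNCATE_ONLY is the SQL Server 2000-era syntax relevant to this question:

-- 1. Back up the entire database (file name is an example).
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak' WITH INIT

-- 2. Truncate the log (SQL Server 2000 syntax; this breaks the log backup chain).
BACKUP LOG MyDb WITH TRUNCATE_ONLY

-- 3. Shrink the physical log file (logical name assumed to be MyDb_log).
USE MyDb
DBCC SHRINKFILE (MyDb_log, 500)   -- target size in MB

-- 4. Back up once again so a backup exists that was taken after the truncation.
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full_after.bak' WITH INIT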
Hi All
I'm getting an error when executing the code below, going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works, but of course if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the SET XACT_ABORT, and also tried with SET IMPLICIT_TRANSACTIONS OFF.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
I have designed an SSIS package for an ETL process. In my package I have to read data from some tables and then insert it into another table with the same structure.
For reading the data I have written dynamic T-SQL based on some conditions, and it uses 25 different functions to populate 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.
I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting the CommandTimeout property to 0.
If I use 0 for the command timeout, then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
Please help me; it's very urgent.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Does anybody have an idea?!