Eventhandlers Don't Work If A Package Is Started From A Checkpoint - BIG NASTY BUG (I Think)
Mar 2, 2006
All,
I'd like someone from MS to confirm something for me if possible.
I've just come across a situation where a package that is restarted from a checkpoint fails to execute any of its eventhandlers if those eventhandlers are scoped to a parent container.
Is this a bug or by design? Either way I'm VERY worried about this because this is not in the least bit intuitive. This is the most dangerous bug that I have seen in RTM.
-Jamie
If you want a demo of this happening then download this package: http://blogs.conchango.com/Admin/ImageGallery/blogs.conchango.com/jamie.thomson/20060103Package1.zip
It doesn't need any configuration because there are no connection managers. You can just run it standalone.
Run it the first time and it will fail, but you WILL see msgboxes popped up by the eventhandlers. Go into "Script Task 2" and make the return value "Success" rather than "Failure".
Run it a second time. The package will complete successfully but NO msgboxes will pop up from the eventhandlers. They are not executing.
Reply here or email direct with any queries!
[Microsoft follow-up] This needs either fixing or explaining.
View 11 Replies
Mar 20, 2006
[Yes, I know I keep posting this, and I'm sorry for that. But I want someone to confirm that this is a bug and not by design. I feel it is very very important]
All,
I'd like someone from MS to confirm something for me if possible.
I've just come across a situation where a package that is restarted from a checkpoint fails to execute any of its eventhandlers if those eventhandlers are scoped to a parent container.
Is this a bug or by design? Either way I'm VERY worried about this because this is not in the least bit intuitive. This is the most dangerous bug that I have seen in RTM.
-Jamie
If you want a demo of this happening then download this package: http://blogs.conchango.com/Admin/ImageGallery/blogs.conchango.com/jamie.thomson/20060103Package1.zip
It doesn't need any configuration because there are no connection managers. You can just run it standalone.
Run it the first time and it will fail, but you WILL see msgboxes popped up by the eventhandlers. Go into "Script Task 2" and make the return value "Success" rather than "Failure".
Run it a second time. The package will complete successfully but NO msgboxes will pop up from the eventhandlers. They are not executing.
Reply here or email direct with any queries!
View 1 Replies
Nov 30, 2006
hi everyone,
Is there any way to find out the number of runs of a package using its checkpoint file?
Or is there any other way to figure out the number of times a package has run before it finally succeeds?
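As far as I know the checkpoint file itself doesn't carry a run counter, but if the package logs to SQL Server you could count PackageStart events in the sysdtslog90 table - a sketch, with a hypothetical package name:
[code]
-- Each execution that reaches PackageStart writes one row,
-- so this approximates the number of attempts before success.
SELECT COUNT(*) AS runs
FROM dbo.sysdtslog90
WHERE source = 'MyPackage'
  AND event = 'PackageStart';
[/code]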
Thanks in advance,
View 3 Replies
Aug 26, 2015
I can set the property of the checkpoint file to a local drive path, but not to a UNC path mapping back to my host server (loopback).
Example: "I:\FILE\FILE1$\InputArchive\Ontwikkel" is possible as the checkpoint file property.
"\\S11487\O$\InputArchive\Ontwikkel" is not possible, though this is the same folder on the local host.
For a data source both the UNC path and the drive mapping are allowed. Why the difference?
View 5 Replies
Apr 25, 2007
Hello all,
We have noticed in our environment slowness when starting SSIS packages from SQL Server jobs. I did a quite detailed study on when the slowness actually occurs and what are the consequences. Here are the results.
The SSIS package execution is slow if all of the following are true:
The package is started from a job. If started directly as a SSIS package, the execution is fast.
The job is running on a 64 bit Windows Server (SQL Server 2005 SP2). The SSIS package and the job are either on the same server or on different servers (the second server is SQL Server 2005 SP1). If the job is run on a 32-bit workstation (Windows XP SP2) the execution is fast (the SSIS package still being on the server).
The package contains tasks.
- If there are no tasks, just an empty sequence container, the execution is fast.
- If a package that has no tasks has logging into the database configured, the execution is fast.
- Slowness has been verified with A) a package having a single Execute SQL statement and B) a package having a Send Mail task.
It doesn't seem to matter which user account is used when running the job.
The slowness happens in several places; at least the following have been verified (there are also others):
There is exactly a 30-second lag between starting the job (as seen from the job history) and when the package's PreValidate appears in the log (as seen in the sysdtslog90 table).
The validation of the package takes 15 seconds (the time difference in the log between PreValidate and PackageStart).
The problem is really affecting our production environment. Currently the only solution we have come up with is to put all the jobs on a workstation and use the workstation as a production server for the jobs.
I haven't heard anyone else having the same problem.
View 26 Replies
Mar 31, 2006
Hi All,
I am trying to import data from flat files into a SQL Server 2005 DB table using SSIS. There are 4 different text files in the input folder. I am using a for loop to iterate the reading and importing 4 times. In order to do this I have set the "ConnectionString" property of a connection manager to a package-level variable using an expression. There is a script which supplies the source file name and assigns it to this variable.
The package works fine. The data gets imported successfully into the destination table. However, when I close the package, reopen it and then run, the Data Flow task fails with the error "Can not open the source file". If I enter a valid file name in the Flat File Source task and run the package it works again. As soon as I close the file/package, reopen and then run, the data flow task fails with the same error.
How do I make this work? I am planning to schedule a job which will execute the package programmatically. In that case no user intervention is possible.
I would appreciate your help on this. I can provide further details if required.
-SGK
View 13 Replies
Nov 4, 2006
Hi,
I am new to SSIS. I followed the directions of the "Creating a Simple ETL Package" tutorial in Books Online. I have tried more than five times, doing exactly as the tutorial suggests, but it does not work.
The URL is:
http://msdn2.microsoft.com/en-us/library/ms169917.aspx
I get these errors and finally it fails:
1)[Lookup [30]] Error: Row yielded no match during lookup.
2) [Lookup [30]] Error: The "component "Lookup" (30)" failed because error code 0xC020901E occurred, and the error row disposition on "output "Lookup Output" (32)" specifies failure on error. An error occurred on the specified object of the specified component.
3) [DTS.Pipeline] Error: The ProcessInput method on component "Lookup" (30) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
4) [DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029.
Can someone help me with this tutorial error, or am I doing something wrong?
Thank you,
View 6 Replies
Sep 8, 2006
Hi all,
Does anyone know how to do event handlers in SSIS?
At the moment I've attached a Send Mail task to each failure from the data flow tasks, but it's very hard to manage.
I was wondering how I can store the error messages and maybe post them at the end of the package execution?
Any help?? Please!
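One common pattern - sketched below with hypothetical table and column names - is a package-level OnError event handler containing an Execute SQL Task that appends System::ErrorDescription to a table; a single Send Mail task at the end of the package can then post everything collected in one go:
[code]
CREATE TABLE dbo.PackageErrors (
    PackageName      nvarchar(260)  NOT NULL,
    SourceName       nvarchar(260)  NOT NULL,
    ErrorDescription nvarchar(4000) NULL,
    LoggedAt         datetime       NOT NULL DEFAULT GETDATE()
);

-- Statement for the Execute SQL Task inside the OnError handler;
-- map the ? parameters to System::PackageName, System::SourceName
-- and System::ErrorDescription.
INSERT INTO dbo.PackageErrors (PackageName, SourceName, ErrorDescription)
VALUES (?, ?, ?);
[/code]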
View 3 Replies
Feb 22, 2006
Hi All,
here is the trigger:
CREATE TRIGGER Members_Set_ApplicationSyncDate ON lyrisadmin.members_ AFTER UPDATE
AS
BEGIN
    IF NOT UPDATE(ApplicationSyncDate)
    BEGIN
        UPDATE lyrisadmin.members_
        SET ApplicationSyncDate = NULL
        WHERE MemberID_ = (SELECT MemberID_ FROM inserted)
          AND List_ = (SELECT List_ FROM inserted)
    END
END
Problem is: if someone runs a query like "UPDATE TBL SET List_ = 'goo' WHERE List_ = 'boo'",
in this case MemberID_ is not present and an error is thrown in my UPDATE statement. How can I verify whether MemberID_ is or is not available in the trigger?
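For what it's worth, a set-based rewrite that joins to the inserted pseudo-table would sidestep the single-row subqueries entirely - a sketch, untested against the real schema:
[code]
CREATE TRIGGER Members_Set_ApplicationSyncDate ON lyrisadmin.members_ AFTER UPDATE
AS
BEGIN
    IF NOT UPDATE(ApplicationSyncDate)
    BEGIN
        -- Joining to inserted handles zero, one or many updated rows,
        -- so no single-row assumption is made about MemberID_.
        UPDATE m
        SET m.ApplicationSyncDate = NULL
        FROM lyrisadmin.members_ AS m
        INNER JOIN inserted AS i
            ON m.MemberID_ = i.MemberID_ AND m.List_ = i.List_
    END
END
[/code]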
Many Many Thanks!
View 2 Replies
Nov 16, 2006
I ran into a pretty bizzare behavior in SSIS:
I am trying to set up a package with a built-in auditing. It has a OnPreExecute, OnPostExecute and OnError event handlers. I am trying to record when the package starts, completes, and the completion status. Each one of these event handlers has a script task that does the logging. I put in debug message boxes into each event handler script to understand what goes on. So here's the sequence of events:
1. When starting the package, the OnPreExecute event fires. Right away it fires a second time. I'm guessing what happens here is the script task within the event handler fires its own OnPreExecute event - that's how the first message really pops up. The second message is generated by the actual package-level OnPreExecute event.
2. I have a condition within the OnPreExecute event handler which might set the task status to failure. You would expect the OnError handler to fire, right?.. Wrong! The package dies without firing either OnError or OnPostExecute event....
3. If I remove the condition in step 2 and force an error in the package body, I get an OnError event, and then 2 OnPostExecute events (I guess for the same reason as in step 1).
What I'm trying to understand is why in the world my OnPreExecute and OnPostExecute events get fired by their own event handlers, yet when i fire other events within these event handlers the appropriate (other) event handler does not run.
Any ideas will be greatly appreciated.
View 9 Replies
Mar 27, 2007
Yesterday I had a nasty mirroring problem.
The principal and the mirror server are both running SQL Server 2005 64-bit Enterprise Edition. Witness is running Workgroup Edition. We are running in high availability mode (SAFETY ON).
3 nights before something strange happened and 1 database failed over (out of the six being mirrored) for some unknown reason. The other 5 didn't. We failed that one back and all seemed ok over the weekend.
We came in Monday and found that the main live OLTP database was showing as the Principal on BOTH servers (in SMO/Management Studio and also in sys.database_mirroring). This is the "split brain" scenario that the presence of the witness is supposed to prevent. Both databases were accessible with a USE statement - clearly not right.
I pondered what to do; eventually I decided to remove mirroring from this database. That was OK until suddenly, a few minutes later, we realised the (original) principal was in recovery! Users of course were kicked out/unable to connect. I tried to force recovery with RESTORE DATABASE dbname WITH RECOVERY but it complained users were connected!
I had to KILL the users then recovery proceeded and the database became available again. I forced the mirror offline to prevent accidental usage.
This is obviously a nasty situation where mirroring - which is supposed to prevent downtime - actually caused it instead.
I intend to log a call with CSS but I wanted to warn other users if they encounter something similar - it has shaken my confidence in mirroring quite severely.
View 4 Replies
Mar 7, 2006
It is possible for a task to be executed multiple times in parallel, e.g. if a package is executed more than once concurrently using the Execute Package Task.
I understand that in the future the ForEach container will have the ability to execute all its iterations in parallel as well.
This is a problem. In the eventhandlers we know the SourceID of the container raising the event so we can tie together events raised by the same container but this simply isn't possible if the events are running in parallel.
An example of where we would want to do this is drop a record into a table when a container fires OnPreExecute and then update that record with the container duration time upon OnPostExecute. This does NOT work when tasks run in parallel.
The solution to this is very simple. As well as capturing System::SourceID, capture a new value, System::ExecutionID, as well.
In the meantime, can anyone think of another way around this?
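One partial workaround might be to key the audit table on System::SourceID plus System::ExecutionInstanceGUID, which at least separates parallel executions started via the Execute Package Task (it won't help for parallel iterations inside a single execution). A sketch, with hypothetical names:
[code]
CREATE TABLE dbo.TaskAudit (
    SourceID              uniqueidentifier NOT NULL,
    ExecutionInstanceGUID uniqueidentifier NOT NULL,
    StartTime             datetime NOT NULL,
    EndTime               datetime NULL,
    CONSTRAINT PK_TaskAudit PRIMARY KEY (SourceID, ExecutionInstanceGUID)
);

-- OnPreExecute handler (Execute SQL Task, parameters mapped from
-- System::SourceID and System::ExecutionInstanceGUID):
INSERT INTO dbo.TaskAudit (SourceID, ExecutionInstanceGUID, StartTime)
VALUES (?, ?, GETDATE());

-- OnPostExecute handler:
UPDATE dbo.TaskAudit
SET EndTime = GETDATE()
WHERE SourceID = ? AND ExecutionInstanceGUID = ?;
[/code]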
cheers
-Jamie
P.S.
I have raised this bug here: http://lab.msdn.microsoft.com/ProductFeedback/viewFeedback.aspx?feedbackId=FDBK46904 and have asked for the provision of System::ExecutionID. If you think it's a good idea then please go ahead and vote for it.
[Microsoft follow-up]
This is still a valid request. Does it need opening on Connect?
-Jamie
View 3 Replies
Nov 21, 2006
Does anyone know how to create an event handler for a data flow task's specific events (OnPipelinePostEndOfRowset, OnPipelineRowsSent, etc.)? These events are available for logging via the standard logging infrastructure, but there seems to be no event handler for them.
The reason I'm interested is that parsing the information logged by these events using the built-in log providers is not easy (e.g., the number of rows sent gets buried somewhere in the message column; I'm using the SQL provider). I'd like to capture this information and record it cleanly in a custom SSIS metadata database I'm building. Any ideas are welcome. Thanks.
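As a stopgap, the pipeline events could be read back out of the SQL log provider's table after the run, rather than handled live - a sketch:
[code]
-- Raw OnPipelineRowsSent entries; the rows-sent figure is embedded in
-- the message text, so the parsing itself is left open here.
SELECT starttime, source, message
FROM dbo.sysdtslog90
WHERE event = 'OnPipelineRowsSent'
ORDER BY starttime;
[/code]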
-alex
View 8 Replies
Jun 1, 2006
Dear All,
I try to run a Package which is placed in the Package Store via a job - but this does not work.
Some things about the package:
- The package raises some warnings, because I do not use all columns from the data source
- The package runs in BI Studio - the log is written
- The package runs if started from the Package Store - the log is written
- Package execution stops immediately if started via the job - without writing to the log
As a hint - needless to say :-) - the log provider is configured in all three cases...
Does anybody know if I can use the configured log provider from the package directly in the jobs _without_ redoing the settings?
Other jobs - without the warnings - are running and placing their step results in the log.
Thanks in advance!
Cheers
Markus
View 3 Replies
Jul 20, 2005
I am currently working on a PHP based website that needs to be able to draw from Oracle, MS SQL Server, MySQL and, given time and demand, other RDBMSs. I took a lot of time and care creating a flexible and solid wrapper and am deep into coding. The only problem is I noticed VARCHAR fields being drawn from SQL Server 2000 are being truncated to 255 characters.
I searched around php.net and found the following:
"Note to Win32 Users: Due to a limitation in the underlying API used by PHP (MS DbLib C API), the length of VARCHAR fields is limited to 255. If you need to store more data, use a TEXT field instead." (http://www.php.net/manual/en/functi...ield-length.php)
The only problem with this advice is TEXT fields seem to be limited to 16 characters in length, and I am having similar results in terms of truncation with other character based fields that can store more than 255 characters.
I am using PHP 4.3.3 running on IIS using the php_mssql.dll extension and the functions referenced here: http://www.php.net/manual/en/ref.mssql.php.
What are my options here? Has anybody worked around this or am I missing something obvious?
James
View 4 Replies
Jan 3, 2007
I'm an IT developer with 4+ years of experience, working mainly in VB, ASP and SQL Server.
Now I've got an offer to work only with SQL Server DTS packages. Will it really be good for my career? My profile in the new offer will be like... I'll have to convert the VB application into a SQL DTS service.
Please advise! Thanks in advance.
View 2 Replies
Jun 28, 2001
I have a package that imports data from one DB to another DB.
The package works fine but the scheduled job gives me this error:
DTSRun: Loading... DTSRun: Executing... DTSRun OnStart: Transfer SQL Server Objects DTSRun OnError: Transfer SQL Server Objects, Error = -2147024893 (80070003) Error string: Error source: Help file: Help context: 0 Error Detail Records: DTSRun OnFinish: Transfer SQL Server Objects DTSRun: Package execution complete. Process Exit Code 1. The step failed.
Any idea?
View 1 Replies
Nov 14, 2006
Good morning....
I work with many packages in 3 environments...development, homologation and production.
I have in each environment a table WCOT_Config with the configurations for that environment. There's a connection called Config which has an expression
ServerName = System::MachineName to pick up the right server for each environment.
My package catches the configuration on the machine which the Config expression points to. This was functioning perfectly, but now the new packages that I place on the production server are importing to the development server.
Suddenly the configuration stopped working, but only for the new packages.
Do I have to check anything else for the package to use the configuration?
Sorry for my poor English! heeheh
Thanks to everybody
View 1 Replies
Mar 3, 2008
I have an SSIS package that has this logic:
Map drive(batch file)
For each file (csv) loop
Pump file data into sql server
Move file to "archive" directory(file system task)
Delete File (file system task)
End Loop
Unmap Drive (batch file)
The map/unmap code is in a batch file:
c:\windows\system32\net use \\10.10.10.10\ShareName MyPassword /USER:MyUserName /YES
Unmap:
c:\windows\system32\net use \\10.10.10.10\ShareName /DELETE /YES
Here are the results when running this package:
1. Running in BIDS on separate workstation. Everything OK.
2. Running on Server by right clicking on package in Integration Services (SSMS) and choosing "run". Everything OK
3. Running as a job with SQL Agent: the package succeeds but no action was taken on the files; the files in "ShareName" are still there, and therefore no data was pumped into SQL Server.
Now, the difference is the SQL Agent jobs are running using a domain account proxy. I'm not sure how that would affect things though--I have the tasks in the package set to fail the package if they fail, so they are not failing; the drives are being mapped o.k.
The computer with the share is non-domain, but that shouldn't matter--I am specifying the local username and password in the batch file as you can see, and as you can see it works from the workstation in BIDS on a separate machine, and works on the server too as long as I don't run it as a job. The batch file sits on both the server and the local workstation with the same local path.
Any idea why the files aren't actioned when run as a job?
Thanks,
Kayda
View 4 Replies
Jul 11, 2001
I have several DTS packages saved 'locally' to the SQL server. I want to duplicate a package, so that I can make some changes then replace the original. I certainly don't want to rebuild the entire package from scratch. So, I open up the original package, go to the 'Package' menu and choose 'Save As', then give it a new name and press OK. No errors, all appears well; the title bar even shows the new name of the package. But, when I close the package and go to the 'local' package list, it (the new package name) doesn't appear in the list. Refresh, exit SEM, reboot - it doesn't show up. I even looked in the MSDB table where packages are supposed to be stored (at least the name / package id / etc.), and it doesn't show up there either. Tried from several client machines.
OS: Windows 2000 Server (advanced) SP2
SQL: SQL 2000 Server (no SP's)
Any help would be great.
Bryan Parke
View 2 Replies
Jun 26, 2007
hi,
I have many jobs on sql 05 and all work but one. This one writes to an Access DB on the same server as SQL. The package works fine. But when executed in the context of the SQL Agent job, it fails.
Jobs that write to a text file work fine. The Access DB has no password required. By the way, that job in sql 2000 worked fine.
Any ideas?
View 4 Replies
Jun 16, 2006
Hi everyone,
I'm stuck with this and I have no idea how to solve it. I'm trying to migrate a DTS 2000 package from BIDS and I get this message:
This wizard will close because it encountered the following error:
Index was out of range. Must be non-negative and less than the size of the collection.
Parameter name: index (mscorlib)
I go to Migrate DTS 2000 Package and select my current SQL2K production server (it has almost 600 DTS packages, although I think that is not a problem at all).
The wizard recognizes my server without problems, and then I pick a folder to save them to, but on the next step the aforementioned message appears.
Has anyone ever faced this issue?
Thanks in advance,
View 1 Replies
Mar 31, 2006
I am having the same problems as those in another post. The SSIS package works fine when executed in BIDS and through the Execute Package Utility, but it doesn't work when executed as a step in a job.
The other problem is that the logging also doesn't work when I try executing it as a job, so I have no clue what to do without knowing what the error is. When I run the job it simply says the step has failed.
I have tried most of the solutions posted on other websites, most of them to do with using proxies with credentials, but haven't hit a solution. I would love to get any input on what to do.
Thanks
View 6 Replies
Dec 14, 2007
I am trying to execute an SSIS package from an MS Access 2003 database that imports a table from the Access database into a target table in SQL 2005. I saved the package in SQL 2005 and tested it out. If I run it from the Management Studio console with Run->Execute ... everything works just fine. However, if I try to run it using the following command "Exec master.dbo.xp_cmdshell 'DTExec /SER DATAFORCE /DTS SQL2005TestPackage /CHECKPOINTING OFF /REPORTING V'" the execution will always fail when the Access database is open (shared mode). The connection manager looks like this: "Data Source=E:\Test.mdb;Provider=Microsoft.Jet.OLEDB.4.0;Persist Security Info=False;Jet OLEDB:Global Bulk Transactions=1". The error is listed below:
Code: 0xC0202009
Source: NewPackage Connection manager "SourceConnectionOLEDB"
Description: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not use ''; file already in use.".
What am I doing wrong?
View 4 Replies
Apr 7, 2006
Has someone managed to successfully pass a variable from a parent package to a child package? I've tried a zillion permutations and I can't get it to work. The strange thing was that I was able to successfully do this with pre-RTM builds. Basically, what I am trying to do is:
The parent package has a variable, e.g. ExecutionID which I set using a script to System::ExecutionInstanceGUID. I verified that the variable is set correctly by dumping it to a SQL Server table.
I created a child package variable with the same name.
In the child package, I've created a parent package configuration that points to the ExecutionID variable.
I am trying to read the variable in a Derived Column Task in which I have a column linked to @ExecutionID.
This doesn't work. Step-by-step instructions from someone who managed to conquer this will be greatly appreciated.
Oh, I didn't have any luck hitting a breakpoint in a script task inside a child package with both in-process and out-of-process execution either.
View 1 Replies
Dec 14, 2006
This problem is a bit weird but I'm just wondering if anybody else experienced this.
I have a package that has file system tasks (copying dtsx files actually). Basically the package copies other packages to a pre-defined destination. Thing is, it only works if one of the packages it is configured to copy has some sort of sensitive data (e.g., a connection string with a password); otherwise it reports a success message on execution but doesn't actually do anything. I've checked the ForcedExecutionResult property and it is set to None, for that matter.
Just wondering if anybody else experienced this problem and of course if there's a way to solve it.
Thanks.
View 2 Replies
Mar 5, 2001
Can I force a checkpoint? How? Will it have any implications?
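For reference, a manual checkpoint is a one-liner in T-SQL; the main implication is a burst of I/O while dirty pages are flushed for the current database:
[code]
USE MyDatabase;  -- hypothetical database name
CHECKPOINT;      -- flush dirty pages for the current database
[/code]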
TIA
View 3 Replies
Aug 26, 2003
Please help.
I would like to checkpoint my transaction log every night before the full backup.
Would this affect the transaction log sequence in the event of a restore?
I run SQL Server 2K SP 3 on WIN 2K SP 3.
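A sketch of such a nightly sequence (names and paths are hypothetical); note that CHECKPOINT by itself neither truncates the log nor breaks the backup chain:
[code]
USE MyDatabase;
CHECKPOINT;  -- flush dirty pages before the full backup
BACKUP DATABASE MyDatabase
TO DISK = 'D:\Backups\MyDatabase_full.bak';
[/code]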
Thank you.
Regards
View 3 Replies
Oct 4, 2005
Hi,
Has anybody encountered this situation before? A DB on SQL Server 2000 SP4 with the trunc. log on chkpt. option turned on. Checkpoint trace flags were turned on, but we're noticing no checkpoints are being done on one specific DB, resulting in a growing transaction log. No open transactions.
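A few checks that might narrow it down (a sketch; substitute the real database name):
[code]
-- Confirm the option really is set on that database (SQL Server 2000)
EXEC sp_dboption 'MyDatabase', 'trunc. log on chkpt.';

-- Is anything holding the log open?
DBCC OPENTRAN ('MyDatabase');

-- How full is each log, really?
DBCC SQLPERF (LOGSPACE);
[/code]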
Any ideas?
Thanks.
View 1 Replies
Apr 12, 2007
I see a line in sys.sysprocesses. The process's status is suspended and the command is CHECKPOINT. I have the information here exactly as it appears on my monitor. It seems to be consuming high CPU. What should I do?
[code]
spid: 10
kpid: 7416
block: 0
waittype: 0x0081
waittime: 232546
lastwaittype: CHECKPOINT_QUEUE
waitresource:
dbid: 1
uid: 1
cpu: 427046
physical_io: 36695
memusage: 0
login_time: 2007-04-04 10:01:32.787
lastbatch: 2007-04-04 10:01:32.787
ecid: 0
open_tran: 0
status: suspended
sid:0x0100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
hostname:
program_name:
hostprocess:
cmd: CHECKPOINT
nt_domain:
nt_username:
loginame: sa
[/code]
Canada DBA
View 4 Replies
Jan 24, 2006
I have a package that uses checkpoint restart. It is responsible for truncating many sets of tables and then loading them. There are several Execute SQL tasks to truncate the tables and several corresponding data flows to accomplish the loads.
If a load fails I want the corresponding truncate task to be part of the restart otherwise duplicate data may be loaded. Normally, SSIS will start at the failed task. I read something about containers that led me to think that if I put the truncate & matching load pair in a sequence container that the container would be the restart point, but either I read it wrong or it's not working that way.
Anybody know how to accomplish what I want to do?
View 10 Replies
Nov 27, 2006
With 2005 SP1, I have built an SSIS package that successfully saves a checkpoint file and sometimes successfully restarts. (I've also built some others that are 100% reliable.)
On an unsuccessful restart it appears as though the failed steps and subsequent steps do not execute. The package appears to "complete" though, and the checkpoint file is removed as though everything is fine.
On a successful restart the failed step reexcutes and everything works fine.
The issue appears to be that when a failed step finishes at the same time as a successful step, there is contention in the process that writes the checkpoint file out, and the checkpoint file ends up corrupt. The failing step runs in parallel with a successful step and the execution times are very similar, so task A may complete before or after task B.
Contents of a good checkpoint file follow:
<DTS:Checkpoint xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:PackageID="{3BFFF2F9-74BA-4CE9-8435-81CC198E8144}"><DTS:Variables DTS:ContID="{3BFFF2F9-74BA-4CE9-8435-81CC198E8144}"/><DTS:Container DTS:ContID="{3655F83D-5EA5-4F16-9B8F-520582A1229A}" DTS:Result="0" DTS:PrecedenceMap=""/><DTS:Container DTS:ContID="{DB2D7A57-D405-4B11-AF4A-41B331EE3F15}" DTS:Result="0" DTS:PrecedenceMap=""/><DTS:Container DTS:ContID="{DFC6A95F-CCFA-4FD9-B604-FCBD722B47D8}" DTS:Result="0" DTS:PrecedenceMap="YYY"/></DTS:Checkpoint>
Contents of a bad checkpoint file follow:
<DTS:Checkpoint xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:PackageID="{3BFFF2F9-74BA-4CE9-8435-81CC198E8144}"><DTS:Variables DTS:ContID="{3BFFF2F9-74BA-4CE9-8435-81CC198E8144}"/><DTS:Container DTS:ContID="{3655F83D-5EA5-4F16-9B8F-520582A1229A}" DTS:Result="0" DTS:PrecedenceMap=""/><DTS:Container DTS:ContID="{9FAD4043-8D5F-4044-915A-87ECABDE6A7C}" DTS:Result="1" DTS:PrecedenceMap=""/><DTS:Container DTS:ContID="{DB2D7A57-D405-4B11-AF4A-41B331EE3F15}" DTS:Result="0" DTS:PrecedenceMap=""/><DTS:Container DTS:ContID="{DFC6A95F-CCFA-4FD9-B604-FCBD722B47D8}" DTS:Result="0" DTS:PrecedenceMap="YYY"/></DTS:Checkpoint>
Has anyone seen this behaviour before?
View 1 Replies
Oct 26, 2007
Hi,
I've set up a number of jobs (not a maintenance plan) via a script in SQL 2005. These jobs do the following:
1) Full backup every sunday night
2) Differential backup every weeknight
3) Log backup every hour
The database is obviously in the full recovery model.
The backups all seem to be running, with one issue - the log file is still growing and is not being truncated. I was under the impression that a log backup should result in the log being truncated after each full backup. However, this does not seem to be the case.
Is there anything obvious I've missed that needs to be set up, or is there a way I can check that the full backup is actually setting the appropriate checkpoint and that the log backups are 'seeing' these checkpoints?
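One check worth running is sketched below (the database name is hypothetical): in 2005, sys.databases reports what is holding log truncation up. Note also that truncation only marks space inside the file as reusable; it never shrinks the physical file.
[code]
-- What is preventing log truncation? 'LOG_BACKUP' means the log is
-- waiting for the next log backup; 'NOTHING' means truncation is working.
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDatabase';
[/code]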
Thanks
View 2 Replies