SQL Server 2005 Maintenance Task Delete Files Does Not Work
Mar 24, 2008
I used the toolbox to select the Maintenance Cleanup Task to create the job to do this. In reading similar notes regarding this problem, some people mentioned that there was a choice to include subfolders. I do not have this choice. When I execute SELECT @@VERSION I get: Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2). This is running on a cluster. Any idea what is going on here? Thanks.
I get the following message when I execute a maintenance plan to delete files older than 1 day.
Error # -1073548784
Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'',N'',N'2007-09-30T07:56:09' " failed with the following error: "Error executing extended stored procedure: Invalid Parameter". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have some simple maintenance plans, but they are failing: the delete history task fails because it is looking for files in a non-existent directory.
It is looking for files in C:\Program Files\Microsoft SQL Server\MSSQL10_50.INSTANCE\MSSQL\Log whereas it should be looking in C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Log
How can I get this corrected so that the Maintenance Plans run correctly? I have tried deleting and recreating the plan, but to no avail.
I have several maintenance plans set up and working in SQL 2005 (SP2, 9.0.3042) on several servers; however, the .txt maint plan output files will not delete on any of the servers. The backups run fine, and the old database and transaction log backup files delete fine (.bak and .trn), but the .txt files won't go away unless I delete them by hand. I tried copying/pasting the scripted command (xp_delete_file) and it runs with no errors but does not delete the old .txt files. (This server was upgraded from SQL 2000 -
I have the following issue with maintenance plan backups that work for BAK, DIF, and TRN to a remote server share. When I try to remove the old files with a cleanup task, I get an error and the files don't get deleted.
The version is as follows: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Standard Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
The error result is as follows:
Failed-1073548784) Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'\\ABCD-A1\BACKUPS\ABCD_BACKUP\ABC_DAILY\ABCD',N'trn',N'2008-01-13T12:52:49'" failed with the following error: "xp_delete_file() returned error 2, 'The system cannot find the file specified.'". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
The maintenance plan seems to be adding extra "\" characters, though when I enter the code directly in a query I get the same error.
xp_delete_file() returned error 2, 'The system cannot find the file specified.'
The servers belong to the same domain and are using the same service account, which has all the necessary rights to the share and the file directory location. The backups work, but I get the error on the cleanup task.
Trying to figure out how to get the Cleanup task to delete old files. The same happens for all file extensions, and I have tried other locations with simpler file paths; same error.
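For what it's worth, a quick sanity check (a sketch, using the share path from the error above) is to confirm that the Database Engine itself can see the UNC location, since error 2 usually means the path as SQL Server resolves it does not exist:

EXEC master.dbo.xp_fileexist N'\\ABCD-A1\BACKUPS\ABCD_BACKUP\ABC_DAILY\ABCD';
-- returns the columns: File Exists, File is a Directory, Parent Directory Exists

-- If xp_cmdshell is enabled, listing the folder from inside SQL Server shows
-- whether the service account can actually reach it:
EXEC master.dbo.xp_cmdshell 'dir "\\ABCD-A1\BACKUPS\ABCD_BACKUP\ABC_DAILY\ABCD"';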
Hi all, we have a SQL Server 2005 64-bit, and recently I upgraded from build 3042 to 3054. I am trying to create a maintenance plan for transaction log (TL) backups, including cleanup for two days (we have a full backup every night).
The problem I have is that I want the TL files to be dumped in a different location (due to disk space), so I put the UNC path in "Create a backup file for every database -> Folder: \\FileServer\TLDBLogs".
NB: if using the local drives, it works.
Checklist - Security:
- The account that I used to create the plan is an sa account
- The location that I dump the TL files to: I have full access to the folder
SQL statement: exec xp_cmdshell 'dir \\FileServer\TLDBLogs' (it lists all files)
Is this a bug in the 64-bit version? Because I can do this on SQL Server 2005 32-bit and it works perfectly.
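As a narrower test (a sketch, with a hypothetical database name and the share name as I read it from the post), backing a log up straight to the UNC path separates a permissions problem from a maintenance plan problem: if this fails with an operating system error, the issue is the service account's access to the share rather than the plan designer.

BACKUP LOG [TestDb]
TO DISK = N'\\FileServer\TLDBLogs\TestDb_log_test.trn'
WITH INIT, NAME = N'TestDb transaction log test backup';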
I'm using the below script within the Execute T-SQL Statement task of the maintenance plan, which was created to delete zip files older than 2 days. The task runs before the Back Up Database task kicks off, but somehow the task is not deleting the old .zip files.
Code being used within the task: the output of the @delete_file variable within the delete command is Del X:\SQLBackup\*2015_05_06*.zip before it is passed to the final exec statement; the server is in the EST time zone.
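A minimal sketch of the pattern described above (the folder and retention come from the post; the variable names are hypothetical). Note that a wildcard built around a single date only removes files stamped with that exact date, not everything older:

DECLARE @stamp       varchar(10),
        @delete_file varchar(200);

SET @stamp = CONVERT(varchar(10), DATEADD(dd, -2, GETDATE()), 111);  -- yyyy/mm/dd, 2 days back
SET @stamp = REPLACE(@stamp, '/', '_');                              -- yyyy_mm_dd
SET @delete_file = 'Del X:\SQLBackup\*' + @stamp + '*.zip';

PRINT @delete_file;                          -- e.g. Del X:\SQLBackup\*2015_05_06*.zip
EXEC master.dbo.xp_cmdshell @delete_file;    -- requires xp_cmdshell to be enabled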
I deleted a database referenced in a maint. task and then I tried to delete the maint. task. I am getting an error "Exception has been thrown by the target of an invocation", and it also says Named Pipes Provider, error: 40 - Could not open a connection to SQL Server (error 53). I think it has something to do with deleting maint. tasks outside of SSIS. Does anyone know of a solution?
I'm using SQL Server 2005 (product version = 9.00.1406.00, product level = RTM, edition = Developer Edition). I have a db with a number of tables; I created a foreign key w/ ON DELETE CASCADE in one of them -- all using Microsoft SQL Server Management Studio. When I delete the record in the table with the foreign key, the record in the other table does not get deleted. I tried doing this with a simple SQL script in Microsoft SQL Server Management Studio and in a simple .NET / ADO (C#-based) program. Only the record in the table that I'm specifically deleting is deleted.
Here's the table that is referenced by the foreign key (I told Server Management Studio to write out the script):
USE [CHNOPSDb]
GO
/****** Object: Table [dbo].[tblDeviceContainer] Script Date: 08/13/2007 16:47:03 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tblDeviceContainer](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [DeviceContainerTypeID] [int] NOT NULL,
    CONSTRAINT [PK_tblDeviceContainer] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
Here's the table that has the foreign key to the above table (again, I told Management Studio to write out the script):
USE [CHNOPSDb]
GO
/****** Object: Table [dbo].[tblNode] Script Date: 08/13/2007 16:46:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[tblNode](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [NodeName] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [NodeTypeID] [int] NOT NULL,
    [UnitID] [int] NOT NULL,
    [pDeviceContainerID] [int] NOT NULL,
    [NodeIndex] [int] NULL,
    CONSTRAINT [PK_Node] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
USE [CHNOPSDb]
GO
ALTER TABLE [dbo].[tblNode] WITH CHECK ADD CONSTRAINT [FK_tblNode_tblDeviceContainer]
    FOREIGN KEY([pDeviceContainerID])
    REFERENCES [dbo].[tblDeviceContainer] ([ID])
    ON DELETE CASCADE
I then perform a delete using the following:
Use CHNOPSDb;
delete from tblNode where ID = 1;
It deletes the tblNode record but doesn't delete the tblDeviceContainer record that is referenced by tblNode.
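If I read the scripts correctly, tblDeviceContainer is the referenced (parent) table and tblNode the referencing (child) table, and ON DELETE CASCADE only propagates from parent to child, never the other way. A small illustration with throwaway table names:

CREATE TABLE dbo.Parent (ID int PRIMARY KEY);
CREATE TABLE dbo.Child
(
    ID       int PRIMARY KEY,
    ParentID int NOT NULL
        REFERENCES dbo.Parent (ID) ON DELETE CASCADE
);

INSERT dbo.Parent VALUES (1);
INSERT dbo.Child  VALUES (10, 1);

DELETE dbo.Child  WHERE ID = 10;  -- the Parent row is untouched
DELETE dbo.Parent WHERE ID = 1;   -- this direction would cascade to any remaining Child rows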
I have two FTP Tasks configured in my SSIS package. One is for "Receive files" and the other is set for "Delete remote files." Both use variables for the source/destination paths. My remote path variable contains a wild card in the name field such as /usr/this/is/my/path/*.ext and it is working to FTP all the .ext files to my working directory. I then rename the files and want to remove the original files from the FTP server. I use the same variable as the remote path variable in the delete as I do in the receive.
Using the same FTP connection manager for both tasks, I am always getting a failure on the delete. The FTP connection manager is set up to use the root user. Using a terminal I am able to open an FTP connection to the server and remove the files manually. There doesn't seem to be any detailed documentation on the FTP Task configured for "Delete remote files", so I'm hoping someone might have some insight into the problem.
I receive the same message for each of the files that was downloaded: Error: 0xC001602A at MyPackage, Connection manager "FTP Connection Manager": An error occurred in the requested FTP operation. Detailed error description: 550 /usr/this/is/my/path/datafile1.ext: No such file or directory. The attempt to delete file "/usr/this/is/my/path/datafile1.ext" failed. This may occur when the file does not exist, the file name was spelled incorrectly, or you do not have permissions to delete the file.
Since I'm using the root user and it works manually, I don't understand the permissions explanation; the file does exist and is spelled correctly.
We are downloading files from an FTP site using the "FTP Task" and we need to remove those files after downloading.
We need to delete files with specific names, like "Names_*.TXT" (all the TXT files whose names start with "Names_"), so we are using "Delete remote files" and specifying the remote path as "/Names_*.TXT".
This doesn't work; it is not able to delete the files. So we tried using "*.*" and also gave a specific filename like "Names_A1.TXT", but it still does not delete the remote files.
Can someone tell me what could possibly be the problem with SQL 7 not running backups that were created and scheduled using the maintenance plan wizard? There is no job history, or any record of the backups running. However, I did notice that the SQL Server Agent was set to manual. Could that perhaps have something to do with the jobs not running? If possible, could someone give me possible solutions or answers as to why these jobs didn't run.
It works remotely if I run it via the command prompt. But when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
I have inherited a server on which is a maintenance plan with two subplans on different schedules. In each subplan there are Execute T-SQL tasks with scripts for index rebuilds. Each task is set up with a Completion arrow to the next task and a Failure arrow to a Notify Operator task. I was asked to add a task for index rebuilds to a specific subplan for a specific database, which is what the other tasks also do. I discovered that my task was failing but the others were fine. No notification was sent about my task failing even though the job is marked in msdb as a failed job. I have sent a test email using the "Send Test Email..." option when right-clicking Database Mail in SSMS and I receive an email, so I know Database Mail works.
I set up a test job to model the index job that I can't get notifications from. I have two T-SQL tasks that just select the top row from a small table. The first task has a syntax error that I did so it would fail. I have a failure arrow to a Notify Operator Task and a Completion arrow to another T-SQL task with no syntax error which has a Success arrow to a Notify Operator task. As expected, when I execute this job I receive one failure email and one success email.
The only other troubleshooting step I know to try is to add a Notify Operator task before my failing task. That Notify Operator task will hopefully fire to tell me that the previous step was successful. I am not having problems with the other steps so I was just thinking I would try to get the subplan to send me a success email about one of the steps that has been working fine.
I am looking for the table where the Maintenance Cleanup Task configuration is stored - for example, "Delete files older than the following", which is 2 days. From which table in msdb can I retrieve this setting?
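A hedged sketch: as far as I can tell, in SQL Server 2005 those task settings are serialized inside the plan's SSIS package rather than stored in a dedicated column, so the closest msdb gets you is pulling the package definition and inspecting the Maintenance Cleanup Task properties in the XML. The join below is my assumption about how the catalog is laid out, so verify it on your instance:

SELECT  p.name,
        CAST(CAST(pkg.packagedata AS varbinary(max)) AS xml) AS package_xml
FROM    msdb.dbo.sysmaintplan_plans AS p
JOIN    msdb.dbo.sysdtspackages90   AS pkg
        ON pkg.id = p.id;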
I am using a Database Maintenance Plan to run backups, but I am quite short on disk space, so I configured it to remove previous backups to release disk space. I still have a problem when writing the backup file onto the hard drive. I suspect that SQL Server first tries to complete the backup and put it on the disk, and only after that removes the old file. I could not find any tips in BOL or elsewhere about the order this happens in - backup then removal, or removal then backup. If anyone knows, I would be very happy for the rest of my life. Thanks.
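I can't say from the plan designer alone which order is used, but if disk space is that tight, a scheduled T-SQL step lets you force the order yourself - delete first, then back up. A sketch, assuming a hypothetical database MyDb, folder D:\SQLBackups, and a three-day retention (xp_delete_file is undocumented, so treat its parameters as inferred):

DECLARE @cutoff datetime, @file nvarchar(260);
SET @cutoff = DATEADD(dd, -3, GETDATE());
SET @file   = N'D:\SQLBackups\MyDb_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';

-- 1. remove backups older than three days first, freeing disk space
EXECUTE master.dbo.xp_delete_file 0, N'D:\SQLBackups\', N'bak', @cutoff, 0;

-- 2. then take the new backup
BACKUP DATABASE [MyDb] TO DISK = @file WITH INIT;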
No matter what table, view, or stored proc I pick, it always says that it doesn't exist at the source. I know it exists because I am picking it from the list of tables, etc. that the GUI provides.
I've created a linked server (and set up the corresponding schema.ini file) in order to perform bulk-inserts from some CSV text files into SQL tables (from my standpoint the text files are just for reading purposes). The linked server works fine (I can select the data in the files without a problem).
Now the question: is it possible to automatically detect when one or more of those files change, in order to start the import process automatically - something like having a trigger on the CSV files? Or is there no easy way to do that, so I would have to, say, create a job that periodically checks whether the files have changed programmatically (for example, recording each file's timestamp every time it is imported and comparing the recorded value with the current one, or whatever)?
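A sketch of that "record and compare" idea, assuming xp_cmdshell is enabled and a hypothetical drop folder C:\CsvDrop; a job step runs it periodically, and any dir listing line that was not seen before indicates a new or modified file to import:

IF OBJECT_ID('dbo.CsvFileState') IS NULL
    CREATE TABLE dbo.CsvFileState
    (
        dir_line nvarchar(400) NOT NULL PRIMARY KEY   -- date, time, size and name as printed by dir
    );

CREATE TABLE #current (dir_line nvarchar(400) NULL);
INSERT #current
    EXEC master.dbo.xp_cmdshell 'dir /-C "C:\CsvDrop\*.csv"';

-- listing lines not recorded yet = new or changed files
SELECT c.dir_line AS changed_or_new
FROM   #current AS c
LEFT JOIN dbo.CsvFileState AS s ON s.dir_line = c.dir_line
WHERE  c.dir_line LIKE '%.csv' AND s.dir_line IS NULL;

-- after importing, refresh the snapshot
DELETE dbo.CsvFileState;
INSERT dbo.CsvFileState (dir_line)
SELECT DISTINCT dir_line FROM #current WHERE dir_line LIKE '%.csv';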
Had to replace leased server hardware running Server 2003, SQL 2005. Had machine Server B, set up SQL 2005, set up a maintenance plan to do a full backup on all user DBs every night to a Server C network share. Worked great for over a week. Came time to do the swap; brought down the old server, renamed Server B to Server A, and ran:
EXEC sp_dropserver '<old_name>'
GO
EXEC sp_addserver '<new_name>', 'local'
GO
All other functions of the database seem to be working fine. I was using a network administrator account - the same one that worked before the name change, so the SID is the same. Unfortunately, I decided to try to delete and start over in case it was just a glitch. Now the history seems to be gone, but I can't actually delete the job or the maintenance plan. I get an error, "Does not allow remote connections.", even when I'm physically sitting at the machine.
Now, maintenance plans won't run - can't delete them, can't modify them. Other dts packages that execute a query or bring in data from other datasources work just fine.
I'm using Win SBS 2003 R2 with SQL Server 2005 Workgroup Edition, and the database is up and running now.
I was wondering whether I can perform a database backup from the workstation computers or not. That would mean I log in as a domain user and, using SQL Server Management Studio (located on the server), back up the particular database that I wish to back up... Can this be done? If so, perhaps someone could tell me how (a step-by-step procedure), or is there a document about this?
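For what it's worth, the T-SQL side of this is small; a sketch assuming a hypothetical database CompanyDb and domain login DOMAIN\BackupUser (note the backup path is resolved on the server, not on the workstation):

-- one-time setup, run by an administrator
USE [master];
CREATE LOGIN [DOMAIN\BackupUser] FROM WINDOWS;
GO
USE [CompanyDb];
CREATE USER [DOMAIN\BackupUser] FOR LOGIN [DOMAIN\BackupUser];
EXEC sp_addrolemember N'db_backupoperator', N'DOMAIN\BackupUser';
GO

-- the domain user can then run this from Management Studio on a workstation
BACKUP DATABASE [CompanyDb]
TO DISK = N'D:\SQLBackups\CompanyDb.bak'
WITH INIT, NAME = N'CompanyDb full backup';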
In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this: 1. If no file(s) found, stop executing and send email saying no file(s) found; 2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.
I know I have to use a script task to do this. I did some research and found examples for each of the above 2 steps, but not both combined, so that's why I need some help here to get the logic incorporated correctly.
Thanks for the help in advance and i apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
    Dim cDataFileName As String
    Dim cFileType As String
    Dim cFileFlgVar As String

    WriteVariable("SCFileFlg", False)
    WriteVariable("OOFileFlg", False)
    WriteVariable("INFileFlg", False)
    WriteVariable("IAFileFlg", False)
    WriteVariable("RCFileFlg", False)

    cDataFileName = ReadVariable("DataFileName").ToString
    cFileType = Left(Right(cDataFileName, 4), 2)
    cFileFlgVar = cFileType.ToUpper + "FileFlg"
    WriteVariable(cFileFlgVar, True)

    Dts.TaskResult = Dts.Results.Success
End Sub

Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForWrite(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            vars(varName).Value = varValue
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
End Sub

Private Function ReadVariable(ByVal varName As String) As Object
    Dim result As Object
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForRead(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            result = vars(varName).Value
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
    Return result
End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
We have our maintenance routines set up to delete files older than one week, but it is not working...
...Consequently, we forget to go out and remove the files manually; thus we lose backups due to the drive being full, have to manually remove files, and then back up the databases and start all over again.
Has anybody had this problem, and can you tell me where to look to try to figure out what the problem is?
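One place to start looking is the Agent job history for the cleanup step, which usually carries the underlying error text from xp_delete_file. A sketch (the LIKE filter is a placeholder for the plan's job name):

SELECT  j.name        AS job_name,
        h.step_id,
        h.step_name,
        h.run_status,  -- 0 = failed, 1 = succeeded
        h.run_date,
        h.run_time,
        h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name LIKE N'%Maintenance%'
ORDER BY h.instance_id DESC;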
I want to delete all backup files from a folder older than a specific date. But if I use the below query, I need to pass how many days of older backup files I need to delete, whereas in my case I don't know how many days/months/years of old backup files there are in the backup folder.
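A sketch that sidesteps the day arithmetic entirely by passing the cutoff date itself; xp_delete_file is undocumented, so the parameter meanings are inferred, and the folder and date are placeholders:

DECLARE @cutoff datetime;
SET @cutoff = '2015-01-01T00:00:00';   -- delete .bak files last modified before this date

EXECUTE master.dbo.xp_delete_file
    0,                   -- backup files
    N'X:\SQLBackup\',    -- folder (placeholder)
    N'bak',              -- extension, no leading dot
    @cutoff,
    1;                   -- include first-level subfolders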
I'm trying to create a maintenance plan to back up a database, and when I go to select the database in the dropdown, I don't have any of the user databases anymore. They were there several weeks ago when I first created the 'plan'. Does anyone know how to get the user DBs to show up in the dropdown?
I have a strange problem with SQL Server 2005 on XP/local and Win 2003/server. When I connect to the local database and create a backup plan named "backup plan", I can see the name under "Maintenance Plans" and under "Jobs" under SQL Server Agent. But if I connect to the server database and do the same thing, I cannot see the name under "Maintenance Plans". I can see the name under "Jobs" under SQL Server Agent, but I cannot delete this job, and get the message:
Drop failed for job "Backup Plan".(Microsoft.SqlServer.Smo).
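If the plan itself turns out to be orphaned in msdb, a common community cleanup approach is sketched below; these are undocumented msdb internals, so verify the table names against your instance and back up msdb before running anything like this:

DECLARE @job_name sysname;
SET @job_name = N'Backup Plan';

-- remove log rows tied to the plan's subplans
DELETE  log_
FROM    msdb.dbo.sysmaintplan_log      AS log_
JOIN    msdb.dbo.sysmaintplan_subplans AS sp ON sp.subplan_id = log_.subplan_id
JOIN    msdb.dbo.sysjobs               AS j  ON j.job_id = sp.job_id
WHERE   j.name = @job_name;

-- remove the subplan rows
DELETE  sp
FROM    msdb.dbo.sysmaintplan_subplans AS sp
JOIN    msdb.dbo.sysjobs               AS j ON j.job_id = sp.job_id
WHERE   j.name = @job_name;

-- finally remove the Agent job
EXEC msdb.dbo.sp_delete_job @job_name = @job_name, @delete_unused_schedule = 1;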
We have a SQL Server 2005 instance that I am running a few databases on, nothing too big though.
CRM-3.0
GFI-Network Monitor Database
Future Use
Solomon, and Sharepoint
The goal is to have all of our databases running on one server for backup and maintenance. Currently we are using VMware to host all of our servers, and we use nightly scripts to back up all of the data. So backups are good to go.
What I am most concerned about is preventative maintenance on the databases. What would people recommend that I set up for nightly, weekly, or monthly cleaning, integrity checks, etc.? Are there third-party tools that would do a better job? I am fairly naive about SQL Server, so any help would be greatly appreciated.
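For a rough idea of what such a routine boils down to, here is a sketch of the underlying commands (the database name is a placeholder, and sp_MSforeachtable is an undocumented helper); a maintenance plan or a third-party script generates more selective versions of the same things:

-- weekly: integrity check
DBCC CHECKDB (N'CompanyDb') WITH NO_INFOMSGS;

-- weekly: rebuild indexes (simplified: rebuilds everything regardless of fragmentation)
USE [CompanyDb];
EXEC sp_MSforeachtable N'ALTER INDEX ALL ON ? REBUILD';

-- nightly: keep statistics current
EXEC sp_updatestats;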
I am new to SSIS. I need to read the files in a remote server folder and FTP them to another remote server folder.
I know I have to set up an FTP task, and the FTP task must use a variable to provide the path information. But I am not sure how I can do this. Can anyone suggest the script task needed to populate the variable and the steps involved?
I created a database maintenance plan on SQL Server 2005 (Standard SP2, cluster environment). The plan was created and scheduled successfully, but when it executes, it fails. Here is the information in the log:
The last step to run was step 1 (TranLog backup).,00:00:01,0,0,,,,0 09/19/2007 12:06:38,Tranlog backups.TranLog backup,Error,1,LAIWWORKSITE1LA,Tranlog backups.TranLog backup,TranLog backup,,Executed as user: STROOCKSQLSRV. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 12:06:38 PM Warning: 2007-09-19 12:06:39.27 Code: 0x80012017 Source: Tranlog backups Description: The package path referenced an object that cannot be found: "\Package\TranLog backup.Disable". This occurs when an attempt is made to resolve a package path to an object that cannot be found. End Warning DTExec: Could not set \Package\TranLog backup.Disable value to false. Started: 12:06:38 PM Finished: 12:06:39 PM Elapsed: 0.688 seconds. The package execution failed. The step failed.,00:00:01,0,0,,,,0
We have lately experienced a strange problem with our SQL Server 2005 x64 (SP2) that is NOT consistent, but when it happens, it happens at the same time.
Almost every night at 03:30 one of our databases (not all) seems to be down or locked. When I have a look at the order table in this database I can see that we have stopped receiving orders after 03:30. Two hours later (05:30) I can see the following error each minute in the error log until we reboot the server:
All schedulers on Node 0 appear deadlocked due to a large number of worker threads waiting on LCK_M_IS. Process Utilization 0%%.
As we have a maintenance job running at 03:30 it feels like this is the problem. The job performs the following tasks: "Check Database Integrity -> Rebuild Index -> Reorganize Index"
When I look at the history of the job it looks like it did not complete and only the "Check Database Integrity" task was run. No error message here either.
Also, when I look in the error log I can see that the maintenance job is started but never ended. Worth noticing is that I get the following info in the log after the start message:
Configuration option 'user options' changed from 0 to 0. Run the RECONFIGURE statement to install.
Also, when I run this job manually during the daytime it works great!
Anyone have any ideas on this? Is it possible to track this even further? I'm tired of restarting the server at 03:30 in the morning =)
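One way to track it further the next time it happens, before rebooting: capture what every session is waiting on and who is blocking whom. If the server refuses ordinary connections, the dedicated admin connection (admin:ServerName) usually still gets in. A sketch:

SELECT  r.session_id,
        r.status,
        r.command,
        r.wait_type,
        r.wait_time,
        r.blocking_session_id,
        DB_NAME(r.database_id) AS database_name
FROM    sys.dm_exec_requests AS r
WHERE   r.session_id > 50          -- skip most system sessions
ORDER BY r.wait_time DESC;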
Hello, I use SQL Server Express 2005 SP2 (Microsoft SQL Server 2005 - 9.00.3042.00). I want to make a maintenance plan, but I don't have a task called "maintenance plan". I am user sa with "sysadmin". How can I make a maintenance plan? Thanks for your help. Thomas
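As far as I know, maintenance plans (and SQL Server Agent) are not included in the Express edition, which is why the node is missing. The usual workaround is a plain T-SQL script scheduled with Windows Task Scheduler through sqlcmd; a sketch, with hypothetical database name, path, and instance name:

-- save as C:\Scripts\backup.sql and schedule:
--   sqlcmd -S .\SQLEXPRESS -E -i C:\Scripts\backup.sql
DECLARE @file nvarchar(260);
SET @file = N'C:\SQLBackups\MyDb_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';

BACKUP DATABASE [MyDb] TO DISK = @file WITH INIT;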