I have been researching how to clean up four backup history tables located in the MSDB database. I currently have approximately 1 million rows in each of these tables, which is making my MSDB database 4 GB in size. I cannot find any instructions on how to gracefully clean these tables up or what to do to keep the number of rows in these tables down. The tables are backupfile, backupmediafamily, backupmediaset, and backupset. Thanks.
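A minimal sketch of the documented approach, assuming SQL Server 2000 or later: msdb ships with sp_delete_backuphistory, which prunes rows older than a given date from all of the backup history tables at once. The 90-day cutoff below is only an example value.

Code Snippet

-- Prune backup history older than 90 days from msdb
-- (removes matching rows from backupset, backupfile,
--  backupmediaset and backupmediafamily)
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(dd, -90, GETDATE());

EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @cutoff;

On a history this large the first run can be slow and log-heavy, so working backwards in smaller date increments and then scheduling the call regularly may be gentler.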
I am building a new deployment routine for one of our products using NAnt (the .NET version of Ant). It takes all of the .sql files for non-table objects out of Vault and concatenates them into larger .sql files with IF EXISTS ... DROP ... GO ... CREATE statements, which will then be fired off by osql batch files that log the errors.
Problem...
When I open the resulting files in Notepad I see junk characters (Microsoft OEM stuff) in some of my object definitions that you do not see in Query Analyzer or if you open the file in Vault. I need to get this stuff out of there, and I do not want to write a bunch of filesystem code to do it because I need this tested and ready by Friday COB. I know I can solve this by opening the problematic files in UltraEdit and saving them one by one, but I do not have that kind of time.
I'm trying to integrate SQLEXPRESS in a custom application ("the app"). On installation, a new instance ("myNewInstance") is created, and a new DB ("myDB") is created on that instance.
Upon de-installation of "the app", I remove the new instance. To that end, I'm calling the SQLEXPRESS setup like this:
Code Snippet
setup.exe /qb REMOVE=SQL_Engine INSTANCENAME=myNewInstance

The Problem: While the service and the instance are removed, the instance's data directory is not, and the DB data files are still there. Now if I reinstall the app (re-creating myNewInstance), it uses the same directory structure, and when I try to re-create the DB, I get the following error message:
Code Snippet
Msg 5170, Level 16, State 1, Line 1
Cannot create file '[...]\MSSQL.2\MSSQL\DATA\myDB.mdf' because it already exists. Change the file path or the file name, and retry the operation.

So here's the question: is there a) any way to tell setup.exe to completely remove the data files when uninstalling the instance, or b) any way to tell T-SQL to overwrite the old data files if they exist?
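For (b), as far as I know CREATE DATABASE has no overwrite option, so one hedged workaround is to delete the orphaned files first and then create the database. A rough sketch, assuming xp_cmdshell has been enabled (it is off by default in SQL Server 2005) and using made-up paths in place of the real DATA directory:

Code Snippet

-- The paths below are illustrative; substitute the instance's real DATA directory.
EXEC master.dbo.xp_cmdshell
    'del "C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\DATA\myDB.mdf"';
EXEC master.dbo.xp_cmdshell
    'del "C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\DATA\myDB_log.ldf"';

-- With the stale files gone, the plain CREATE DATABASE succeeds again.
CREATE DATABASE myDB;

Alternatively, pointing the FILENAME clause of CREATE DATABASE ... ON (NAME = ..., FILENAME = '...') at different physical file names sidesteps the clash without touching the old files.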
When I log into Integration Services on my SQL server, I see Stored Packages -- File System and MSDB. When I deploy/import my SSIS packages, which should they go under? Is there a difference, and if so, what is the difference?
This is a terrible thing to happen at 4:30 PM on a Friday! I was working yesterday on some deployment packages using the file system (by the way, my msdb cannot expand), and just an hour ago I wanted to try some parent-child deployment using the file system. But the only two folders I see in Integration Services are Running Packages and Stored Packages, and there is nothing under those... I have checked the default location for File System (C:\Program Files\Microsoft SQL Server\90\DTS\Packages) and there are three packages there, but these do not show in Integration Services. And another question: if I execute a package by double-clicking its .dtsx file, should that make it show up under File System?
Can we remove execute access for the public role to all the system stored procedures? Has anyone done this, and are there any issues with doing this for lockdown? Let me know.
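Not an endorsement of the lockdown itself, just a hedged sketch of the mechanics: execute permission is revoked per procedure, so in practice you generate one REVOKE per system stored procedure. xp_cmdshell below is only an example target, and many system procedures are needed by the tools and by replication, so each revoke should be tested.

Code Snippet

-- Example: revoke public's execute right on a single extended procedure
USE master;
REVOKE EXECUTE ON dbo.xp_cmdshell FROM public;

-- Generate REVOKE statements for the system procedures in master
-- (review the output carefully before running any of it)
SELECT 'REVOKE EXECUTE ON dbo.' + name + ' FROM public;'
FROM master.dbo.sysobjects
WHERE type IN ('P', 'X')
  AND (name LIKE 'sp[_]%' OR name LIKE 'xp[_]%');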
- Dev server lost all its RAID drives (which included user data/log files and ALL backups)
- RAID drives were removed from the server so we could try to recover the data (not sure why they had to remove them)
- Recreating the user DBs isn't an issue, but I REALLY need to retrieve the jobs from the MSDB (a developer was working on a DTS package for the last 2 months but didn't check it in to VSS)
- Since the system DBs (model, master, msdb) were installed on the C: drive, I was able to recover their mdf/ldf files.
So we built a new server and recreated the user DBs. How do I go about getting the jobs data from the msdb database?
I've tried the following -
Started SQL Server with -T3608 and was able to detach the MSDB database, but when I go to reattach using the mdf/ldf files from the other server, I get the following error message:
Error 5172: The header for the file 'C:\mssql\data\msdbdata.mdf' is not a valid database file header. The PageAudit property is incorrect.
I've been doing some research and found a lot of great articles on how to RESTORE the MSDB (from a backup, which I don't have) or rebuild it (which wipes out the data I'm looking for), but not how to replace it with only the mdf/ldf files...
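One hedged alternative, assuming the copied files are actually intact and the new server is the same SQL Server version/build: rather than swapping them in as msdb, attach them as an ordinary user database under a different name and pull the job and DTS data out with queries. A rough sketch (the msdb_old name and the paths are made up for illustration); given the 5172 header error above, the files may simply be damaged or from a different build, in which case this won't work either.

Code Snippet

-- Attach the recovered msdb files under a throwaway name
EXEC sp_attach_db
    @dbname    = 'msdb_old',
    @filename1 = 'C:\recovered\msdbdata.mdf',
    @filename2 = 'C:\recovered\msdblog.ldf';

-- Jobs and their steps live in these tables
SELECT * FROM msdb_old.dbo.sysjobs;
SELECT * FROM msdb_old.dbo.sysjobsteps;

-- DTS packages are stored here
SELECT * FROM msdb_old.dbo.sysdtspackages;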
Any ideas would be greatly appreciated.
Thanks
PS - All backups are now being copied to another server (lesson learned)
I'm getting "Executed as user: SPIESQLService. sqlmaint.exe failed. [SQLSTATE 42000] (Error 22029). The step failed." on the TRN backup portion of the maintenance plan for the msdb and model databases. On review of files created it's clear that the msdb trn log backup is failing, but there's no other error to indicate the underlying problem.
Our sysadmin accidentally uninstalled SQL Server, started to panic, and reinstalled. Thankfully the data/transaction files for our important databases were still present and I simply reattached them, but our DTS packages are gone.
However, we've done weekly backups of the msdb database. How do I get the DTS packages out of these backups?
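A hedged sketch of one approach, with the backup file name and paths below standing in as placeholders: restore the msdb backup side by side under a different name with RESTORE ... WITH MOVE, then read the DTS packages out of its sysdtspackages table.

Code Snippet

-- Restore last week's msdb backup side by side as msdb_restored
RESTORE DATABASE msdb_restored
FROM DISK = 'D:\backups\msdb_weekly.bak'
WITH MOVE 'MSDBData' TO 'D:\data\msdb_restored.mdf',
     MOVE 'MSDBLog'  TO 'D:\data\msdb_restored.ldf';

-- DTS package definitions (one row per saved version)
SELECT name, id, versionid, createdate
FROM msdb_restored.dbo.sysdtspackages;

From there the package rows can be copied back into the live msdb.dbo.sysdtspackages, or the packages re-saved from Enterprise Manager once they open from the restored copy.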
On one of my production boxes, we are not able to take a backup of MSDB. When I look at the error, it is failing to locate an allocation unit ID. The complete error is below:
Msg 2533, Sev 16, State 1, Line 36: Table error: Page (1:111720) allocated to object ID 110623437, index ID 1, partition ID 72057594043432960, alloc unit ID 72057594044874752 (type In-row data) was not seen. The page is invalid or may have an incorrect alloc unit ID in its header. [SQLSTATE 42000]
Because of this failure, we are unable to take a backup of the MSDB database, and our integrity check and reindex jobs also fail with the same error. Also, I can see events of I/O issues with the underlying hard drive, with the following name:
\Device\Harddisk0\DR0 has a bad block.
1. I don't know what could happen if I restart my server. The question is: does it recognize MSDB during server startup?
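Not an answer to the restart question, but before going further it is probably worth getting the full picture of the corruption, since the page error plus the bad-block event points at damaged storage. A standard, read-only check (it can take a while):

Code Snippet

-- Full consistency check of msdb; ALL_ERRORMSGS lists every error found
DBCC CHECKDB ('msdb') WITH ALL_ERRORMSGS, NO_INFOMSGS;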
I have inherited a SQL Server 2000 instance where the client never implemented a backup strategy for the master and msdb databases. MSDB is now showing errors, and my only option for a restore is from a tape backup of the server. Any thoughts?
Using SQL Server 2005 Server Management Studio, I attempted to back up a database, and received this error:
Backup failed: System.Data.SqlClient.SqlError: Backup and file manipulation operations (such as ALTER DATABASE ADD FILE) on a database must be serialized. Reissue the statement after the current backup or file manipulation is completed (Microsoft.SqlServer.Smo)
Program location:
at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv) at Microsoft.SqlServer.Management.SqlManagerUI.BackupPropOptions.OnRunNow(Object sender)
Backup Options were set to:
Back up to the existing media set
Overwrite all existing backup sets
I am fairly new to SQL 2005. Can someone help me get past this issue? What other information do I need to provide?
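The error usually means another backup or file operation against the same database was already running when this one was issued. A hedged way to confirm that from a query window before retrying:

Code Snippet

-- Any backup/restore or file operations currently in flight (SQL Server 2005)
SELECT session_id, command, percent_complete, start_time
FROM sys.dm_exec_requests
WHERE command LIKE 'BACKUP%'
   OR command LIKE 'RESTORE%'
   OR command LIKE 'ALTER DATABASE%';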
While checking the SQL Server error logs, I notice that the pubs and msdb databases are automatically being backed up, even though no job is set up to do so... In addition, they're being backed up to a directory that I cannot find on our network... Does anybody have an idea of what's going on?
The path it's backing up to is: (FILE=1, TYPE=PIPE: {'\\.\pipe\dbasql70dbagent0s0'}).
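A named pipe virtual device like that usually points at a third-party backup agent rather than a SQL Agent job. One hedged way to see what is issuing the backups is to look at the history msdb keeps for them:

Code Snippet

-- Who/what has been backing up pubs and msdb, and to which devices
SELECT bs.database_name,
       bs.backup_start_date,
       bs.user_name,          -- login that ran the BACKUP command
       bs.name,               -- backup set name, often set by the backup tool
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
  ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name IN ('pubs', 'msdb')
ORDER BY bs.backup_start_date DESC;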
We have a self-written procedure to restore transaction logs on a standby server (SQL Server 2000 SP4). We do this by joining two msdb tables, backupmediafamily and backupset, to find out which backups have been performed. Sometimes backups are NOT registered in the msdb..backupmediafamily table. Below are two examples.
OK - the query below shows the time the backup was created in the physical_device_name; joined on media_set_id, it shows the correct data 99% of the time.
select media_set_id, physical_device_name from backupmediafamily where media_set_id = 258716
[Normal] From: XXX "(DEFAULT)" Time: XXX
SQL statement: BACKUP DATABASE [msdb] TO VIRTUAL_DEVICE = "Data Protector_(DEFAULT)_msdb_06_00_14" WITH NAME = 'Data Protector: 2007/08/01 0064', DIFFERENTIAL, BLOCKSIZE = 4096, MAXTRANSFERSIZE = 65536;

[Warning] From: XXX "(DEFAULT)" Time: XXX
Error has occurred while executing a SQL statement.
Error message: '<Microsoft SQL-DMO (ODBC SQLState: 42000):bdb> [Microsoft][ODBC SQL Server Driver][SQL Server]Cannot perform a differential backup for database "msdb", because a current database backup does not exist. Perform a full database backup by reissuing BACKUP DATABASE, omitting the WITH DIFFERENTIAL option. [Microsoft][ODBC SQL Server Driver][SQL Server]BACKUP DATABASE is terminating abnormally.'
I've tried doing a full backup followed straight after by a diff, but it doesn't help.
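In case it helps with the troubleshooting, a hedged sketch of a query that flags backupset rows with no matching backupmediafamily entry (the situation described above), so the missing registrations can at least be spotted automatically:

Code Snippet

-- backupset entries whose media set has no row in backupmediafamily
SELECT bs.backup_set_id,
       bs.database_name,
       bs.backup_start_date,
       bs.media_set_id
FROM msdb.dbo.backupset AS bs
LEFT JOIN msdb.dbo.backupmediafamily AS bmf
  ON bs.media_set_id = bmf.media_set_id
WHERE bmf.media_set_id IS NULL
ORDER BY bs.backup_start_date DESC;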
It works remotely if I run it via the command prompt, but when I add it to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
I have deployed to production a number of nested packages (parent packages that call child packages) to the SQL msdb via the Save As option rather than building a deployment utility. These packages reference configuration files in a static location off of the c: drive on the production server. In the development environment, when connection changes are made and I run the Reload with Upgrade option the connection manager takes on the new server and user id settings. However, out on the production side I get the following error from the SQL job log:
Cannot load the XML configuration file. The XML configuration file may be malformed or not valid.
As a result, the SQL job uses the default connection information, which references the development database rather than the production database. I researched the error but found no good solutions. Is there a way to ensure the configuration files are formed correctly and that the packages are correctly referencing them? We are trying to run the ETL updates via a SQL job.
What is the best way to restore a database from a folder of backups (including full, diff and log backups) without using the backup history in msdb?
I have a restore process that restores all backups on a regular schedule in order to fully verify their integrity. To do this, I use the backup history in msdb on each server that I'm monitoring. I had a thought the other day that I would be in trouble if I lost msdb. Then my backup history would only be as good as the last backup of msdb.
What I'd like to do is read a folder of backup files and generate a restore script up to a specified time. Would I use RESTORE HEADERONLY to do this? If so, would I use PowerShell to traverse each file in the folder?
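That combination sounds workable. As a hedged sketch of the T-SQL half: RESTORE HEADERONLY tells you, per file, the database name, backup type (1 = full, 5 = differential, 2 = log) and the start/finish times, while RESTORE FILELISTONLY gives the logical file names needed for WITH MOVE. Something a PowerShell loop could run against each file it finds (the path below is a placeholder supplied by the outer loop):

Code Snippet

-- Inspect one backup file
RESTORE HEADERONLY   FROM DISK = 'D:\backups\somefile.bak';
RESTORE FILELISTONLY FROM DISK = 'D:\backups\somefile.bak';

From the header rows you can order files by BackupFinishDate, pick the latest full before the target time, then the latest differential after it, then the chain of log backups up to the STOPAT time.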
I've written a custom script to delete backup files from a location, but I'm unable to modify it to count the number of files deleted. How should I modify the script?
/* Script to delete older than N days backup from a specific directory */
USE [db_admin]
GO

IF OBJECT_ID('usp_DeleteBackup', 'P') IS NOT NULL
    DROP PROC usp_DeleteBackup
GO
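The body of the procedure isn't shown, so this is only a guess at one way to get a count: if the deletion goes through xp_cmdshell, list the candidate files into a temp table first, delete, then re-list and compare. A rough sketch with a hard-coded folder and 7-day age purely for illustration:

Code Snippet

-- Count .bak files before and after deleting the old ones
DECLARE @before int, @after int;

CREATE TABLE #files (fname nvarchar(512) NULL);

INSERT INTO #files (fname)
EXEC master.dbo.xp_cmdshell 'dir /b "D:\backups\*.bak"';

SELECT @before = COUNT(*) FROM #files WHERE fname LIKE '%.bak';

-- forfiles deletes anything older than 7 days (adjust /d and the path)
EXEC master.dbo.xp_cmdshell
    'forfiles /p "D:\backups" /m *.bak /d -7 /c "cmd /c del @path"';

DELETE FROM #files;
INSERT INTO #files (fname)
EXEC master.dbo.xp_cmdshell 'dir /b "D:\backups\*.bak"';

SELECT @after = COUNT(*) FROM #files WHERE fname LIKE '%.bak';

PRINT 'Deleted ' + CAST(@before - @after AS varchar(10)) + ' backup file(s)';
DROP TABLE #files;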
While backing up our database, I am getting the following message:
Could not insert a backup or restore history/detail record in msdb.dbo.sysbackuphistory or sysrestorehistory. This may indicate a problem with the MSDB database. DUMP/LOAD was still successful. (Message 3009)
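Message 3009 means the backup itself succeeded but the history row could not be written into msdb. Two quick, hedged checks worth running: whether msdb is out of space or its files cannot grow, and whether it is structurally damaged.

Code Snippet

-- How full is msdb, and can its files still grow?
USE msdb;
EXEC sp_spaceused;
SELECT name, size, maxsize, growth, filename
FROM dbo.sysfiles;

-- Rule out corruption in msdb
DBCC CHECKDB ('msdb') WITH NO_INFOMSGS;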
Currently we use a SQL maintenance plan to do a full backup of all our databases daily (about 40 databases on our production server). As you can imagine, this eats up disk space quickly so currently we manually zip the backup files and/or move them to an archive drive. I considered writing an application to walk through the backup folder structure and zip any .bak file it finds, but I know there are some third party tools out there that will backup/restore a MS SQL database. I was wondering if any of these also zip the backups once they are created. Any recommendations or suggestions are welcome.
I scheduled an automatic backup process, but it only shows a backup of one .sql file in the backup folder. The other .sql files that were created are not backed up. Why is that?
I want to do a SQL DB backup, but how can I back up the DB to split backup files? The reason I want to split the backup file is that a single file is too big and I want to write it to DVD.
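A hedged sketch of the built-in way to do this: BACKUP DATABASE can stripe a backup across several files by listing multiple DISK clauses, and the data is divided roughly evenly among them. The database name and paths below are placeholders, and every stripe is required again at restore time.

Code Snippet

-- Stripe one backup across three files so each piece fits on a DVD
BACKUP DATABASE MyDatabase
TO DISK = 'D:\backups\MyDatabase_1.bak',
   DISK = 'D:\backups\MyDatabase_2.bak',
   DISK = 'D:\backups\MyDatabase_3.bak'
WITH INIT, STATS = 10;

-- Restoring needs every stripe present
RESTORE DATABASE MyDatabase
FROM DISK = 'D:\backups\MyDatabase_1.bak',
     DISK = 'D:\backups\MyDatabase_2.bak',
     DISK = 'D:\backups\MyDatabase_3.bak'
WITH REPLACE;

You cannot set an exact size per stripe, so pick enough files that each piece comfortably fits on a single DVD.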
Why are two log files generated when using a Text Log Provider with a Connection String property set to the following expression?
"c:\" + @[System::ExecutionInstanceGUID] + ".log"
I've set up package logging with a view to getting a unique, and linkable log. However, two files are always generated, no matter what the DelayValidation settings may be on the connection manager, file, and package tasks.
I have an export file that updates each night and dumps into prices.txt
I have to send them to an FTP site that processes them automatically if they're formatted correctly. They contain the same header information (saved in header.txt) and footer information (saved in footer.txt). I then combine them all: header.txt + prices.txt + footer.txt = glpcwholesale.prn.
The script below is what I have so far but it isn't working and errors out on line 23.
' Create the File System Objects
Dim objFSO1, objFSO2, objFSO3, objFSO4
Set objFSO1 = CreateObject("Scripting.FileSystemObject")
Set objFSO2 = CreateObject("Scripting.FileSystemObject")
Set objFSO3 = CreateObject("Scripting.FileSystemObject")
Set objFSO4 = CreateObject("Scripting.FileSystemObject")

' Create string references to file locations
Dim strDirectory, strFileHeader, strFilePrices, strFileTrailer
Dim strFileGLPC