A few weeks back I had a problem setting up Maintenance Plans on my SQL Server; the only fix in the end was to reinstall SQL Server along with SP1. Since then, backing up or restoring our developers' database takes around 15-20 minutes, whereas before it would take 1-2 minutes.
The server itself doesn't seem to be the problem, as I have restored the database to a second server and get exactly the same behaviour there. Other databases of around the same size on the same server back up as expected.
The database is approximately 150 MB, with the log file at around 800 MB.
The server in question is a dual-core Opteron 2.2 GHz with 4 GB of RAM, running Windows 2003 x64, fully patched and up to date. The SQL Server version is 2005 Standard Edition x64, also with SP1.
There is plenty of disk space and no noticeable CPU activity even while the backups are running.
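For what it is worth, I have been comparing before-and-after backup durations with a query along these lines against msdb (the database name is a placeholder):

-- Compare recent backup durations and sizes for the slow database
select b.database_name,
       b.type,                                                        -- D = full, I = differential, L = log
       b.backup_start_date,
       b.backup_finish_date,
       datediff(second, b.backup_start_date, b.backup_finish_date) as duration_sec,
       b.backup_size
from msdb.dbo.backupset b
where b.database_name = 'DevDatabase'                                 -- placeholder name
order by b.backup_start_date desc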
Is there any way to back up a remote SQL Server that is on a hosted account to a local drive? We have a web hosting account with a SQL Server on which we do not have full admin rights, just dbo access to the data and structure. We would like to be able to do backups and restores to our local development server if possible, perhaps via DTS or some similar means. We cannot use the normal backup/restore as we do not have rights to those functions, nor can we access the server's local drives directly.
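To illustrate the sort of workaround we are considering (this assumes the host allows inbound connections and that we can create a linked server on our local dev box; the server, login, database and table names below are made up), we could pull the data across table by table rather than using BACKUP/RESTORE:

-- On the local development server: point a linked server at the hosted SQL Server
exec sp_addlinkedserver @server = N'HOSTEDSQL', @srvproduct = N'',
     @provider = N'SQLOLEDB', @datasrc = N'sql.example-host.com'
exec sp_addlinkedsrvlogin @rmtsrvname = N'HOSTEDSQL', @useself = 'false',
     @rmtuser = N'our_dbo_login', @rmtpassword = N'********'
go
-- Copy one table's data into a local development database
select *
into LocalDev.dbo.Customers
from HOSTEDSQL.HostedDb.dbo.Customers
go

DTS could do much the same thing with a transform data task; either way the schema would have to be scripted separately.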
We have log-shipping set up between a source and 3 destination SS 2000 databases. Two of the destination servers actually perform their log restores across the network from the other secondary server. This allows us to only copy the files once from a remote location. All three servers stay caught up within 15 minutes of each other.
Recently, I added a fourth server to this that runs SS 2005 SP2 (X64). I wrote a stored procedure that restores log backups from the same single location as the maintenance plan jobs. The problem I'm experiencing is that this fourth server is not keeping up with the other three; it seems to take longer to restore the same log backups. The destination servers are all on the same domain. This fourth server was previously part of the same maintenance plan configuration as the others, prior to rebuilding it for SS 2005 SP2 (X64), and during that time it stayed caught up with the other servers. There is another database on the new server that I am log-shipping to in the same manner, and it stays caught up, though for the most part its log backups are smaller. There is also a file on the fourth server with a .ckp extension for the database in question that doesn't seem to exist for the other databases on this server or on the other servers.
Any information on this behavior would be appreciated.
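For reference, the statement my stored procedure issues for each log backup it finds is essentially the following (the database name, share and file name are placeholders; the real procedure loops through the files in date order):

-- Restore one transaction log backup, leaving the database ready to accept the next one
restore log WarehouseDB
from disk = N'\\fileserver\logship\WarehouseDB_tlog_200712010815.trn'
with norecovery, stats = 10
go

We restore WITH NORECOVERY; STANDBY with an undo file would be the alternative if read access were needed between restores.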
On one of our SQL Server 2014 boxes, each database has a copy-only full backup made every night, in addition to the maintenance plan schedule of a weekly full backup, daily differential backups and log backups.
When performing a PIT restore in SSMS, the restore file list shows the most recent copy-only backup as the full backup to use, not the most recent full backup from the plan. I noticed that using SSMS 2008 to start a PIT restore on the 2014 box does not have this problem and lists the correct restore file sequence (it ignores the copy-only backups).
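In case it helps with reproducing this, the copy-only backups can be told apart from the plan's full backups in msdb with a query along these lines (the database name is a placeholder):

-- List recent full backups and flag the copy-only ones
select b.database_name,
       b.backup_finish_date,
       b.is_copy_only,                                   -- 1 = copy-only, 0 = regular full backup
       mf.physical_device_name
from msdb.dbo.backupset b
join msdb.dbo.backupmediafamily mf on mf.media_set_id = b.media_set_id
where b.database_name = 'MyDatabase'                     -- placeholder
  and b.type = 'D'                                       -- full backups only
order by b.backup_finish_date desc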
I have a database 30 GB in size. I need to copy this entire database to a laptop, and I want to use backup and restore to get it onto the laptop. I need to know how I can accomplish this without copying the backup file to the destination server. Is there a way to restore from a file over the network from a different server? I tried everything with the security setup but was not able to do this. Any recommendations would be valuable.
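To show what I am attempting, the statement I would like to get working is roughly this (share, file and logical file names are placeholders); my understanding is that the SQL Server service account on the laptop, rather than my own login, is what needs read access to the share:

-- Restore directly from a UNC path on the source server, relocating the files locally
restore database BigDb
from disk = N'\\sourceserver\sqlbackup\BigDb.bak'
with move 'BigDb_Data' to N'C:\SQLData\BigDb.mdf',
     move 'BigDb_Log'  to N'C:\SQLData\BigDb_log.ldf',
     stats = 10
go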
I took some database backups from a client site to do some development. They were 20 GB, 16 GB and 200 MB.
I ran the restore through EM but waited around 20 minutes for the 20 GB and 16 GB backups and nothing happened. I went to Profiler -- there was no activity logged for backup/restore events.
I then went to QA and tried to restore the backup with the STATS option, but still nothing happened. The 200 MB restore worked, though. I have restored databases over 20 GB before and haven't really encountered this problem. Can anyone assist me, or has anyone come across this problem?
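If it helps anyone diagnose this, before kicking off the big restores again I was planning to sanity-check the files with something like the following (the path is a placeholder):

-- Quick checks that the large backup file is readable and shows the expected contents
restore verifyonly from disk = N'e:\restores\ClientDb_20G.bak'
go
restore filelistonly from disk = N'e:\restores\ClientDb_20G.bak'
go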
Recently, we had some tables inadvertently dropped from a 20+ GB data warehouse in the middle of one of our load cycles. In order to avoid restarting the load, we had to restore a copy of the database from a full database backup and then manually move the dropped tables back in using DTS.
This ended up being a painful process, and I am looking to avoid it by restructuring the database into filegroups so I can mimic the table restore function from 6.5.
I am looking for pros, cons, gotchas, and best practices regarding splitting a MSSQL 7 database into multiple filegroups. We would be including 1 or more tables and all of their indexes in each filegroup. Specifically, I have the following questions:
1. All the literature I read mentions the need for doing transaction log backups and restores if I utilize filegroup backups. Is this really necessary? We have turned off transaction logging for performance reasons, since we completely control the addition of new data into the db.
2. Are there implications for disk read performance involved with filegroups? Our data resides on an EMC disk array. I feel that we'll most likely have between 20 and 30 filegroups.
3. If we decide to move a table (and its indexes) between filegroups, is the process as simple as dropping the clustered index and recreating it on the new filegroup (then dropping and recreating the non-clustered indexes)? A rough sketch of what I mean follows below.
Any information y'all can post would be greatly appreciated.
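Regarding question 3, the sequence I have in mind is roughly the following (the filegroup, file, table and index names are placeholders, and I am assuming the existing clustered index is a plain index rather than one backing a PRIMARY KEY constraint):

-- Add a new filegroup and a file to hold it
alter database WarehouseDB add filegroup FG_Sales
go
alter database WarehouseDB
add file (name = 'WarehouseDB_Sales1',
          filename = 'e:\mssql7\data\WarehouseDB_Sales1.ndf',
          size = 500MB)
to filegroup FG_Sales
go
-- Rebuild the existing clustered index on the new filegroup; the table's data moves with it
create clustered index IX_SalesFact_SaleDate
on dbo.SalesFact (SaleDate)
with drop_existing
on FG_Sales
go

If the clustered index actually enforces a PRIMARY KEY, my understanding is the constraint would have to be dropped and recreated on the new filegroup instead, with the non-clustered indexes rebuilt afterwards.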
In migrating from SQL 2000 to SQL 2005, I realise that there are a few options available.
A SQL 2000 backup can be restored into 2005 and is workable; however, there are some issues with internals that have changed, and I have identified some DDL which would need to be modified to work correctly.
If the backup is restored and left in a recoverable state, does 2005 allow 2000 TLog backups to be restored as well? I am assuming that, since you can restore a full DB backup, this is allowed.
I am asking the question as I have a very short and fixed deadline for moving from 2000 to 2005. We are moving to a new datacentre and the old DC is being turned off. All new hardware is being placed in the new DC, so I need to move from the old to the new with very little downtime for the DB or app.
My approach is as follows:
1. Full DB backup (2000) copied across to the 2005 server and loaded, being left in a recoverable state.
2. TLog backups copied across on a scheduled basis and restored to a point in time.
3. Test the new 2005 DB with the app/web front end.
4. Bring both in sync again and then perform one final TLog backup, copy and restore, bringing the app online again with very little downtime.
Has anybody tried this approach? Does anybody see any issues with it? All comments on this are welcome.
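To make steps 1, 2 and 4 concrete, the statements I am planning to run on the 2005 box are along these lines (paths and logical file names are placeholders; WITH NORECOVERY is what I mean by leaving the database in a recoverable state):

-- Step 1: load the full 2000 backup, leaving the database able to accept further log restores
restore database AppDb
from disk = N'd:\migration\AppDb_full.bak'
with move 'AppDb_Data' to N'd:\sqldata\AppDb.mdf',
     move 'AppDb_Log'  to N'd:\sqllog\AppDb_log.ldf',
     norecovery, stats = 10
go
-- Step 2: repeated for each TLog backup, in order
restore log AppDb from disk = N'd:\migration\AppDb_tlog_1.trn' with norecovery
go
-- Step 4: the final TLog backup brings the database online
restore log AppDb from disk = N'd:\migration\AppDb_tlog_final.trn' with recovery
go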
Data encryption and database restores:
1. When using data encryption on a 2-node active/passive cluster, does it make any difference which server SQL Server is running on when encrypting or decrypting data?
2. When restoring a database with encrypted data to a different server, is the data decryptable or is it lost?
The space allocated to the Log in question is 180 GB. During this time period I was running TLog backups every 5 minutes, yet the log continued to chew through to 80 GB used, even after the process was complete and a final TLog backup had been taken. It continued to stay very large until the Full backup was complete -- or something else that I'm unaware of completed. Like every other DBA I typically take a TLog backup to shrink the log, but what appeared to be the case here was the Full completed and it released the used log space. All said, will Transaction Log backups not free up the log during Full backups?
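For the next time this happens, I am planning to watch it with the following (the log_reuse_wait_desc column assumes this is SQL 2005 or later; the database name is a placeholder):

-- How full each transaction log currently is, as a percentage
dbcc sqlperf(logspace)
go
-- What is currently preventing log truncation for this database (SQL 2005 and later)
select name, log_reuse_wait_desc
from sys.databases
where name = 'MyBigDb'          -- placeholder
go

If the full backup really is what is pinning the log, I would expect to see ACTIVE_BACKUP_OR_RESTORE reported there while it runs.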
I am trying to create SQL code that restores a backup of a master database to a new database on the same server. It “seems” to run correctly, as no errors are produced. However, the most recent updates to the master database are not present in the new databases. All databases are using the Full recovery model. What is really strange is that if I do (what I think is) the same thing in Enterprise Manager, the restore works fine! For both methods I used the same backup file!
Any and all help is sincerely appreciated.
The master databases from which the backups are made start with “MODTRNMaster”.
The databases which are created from the restores start with “M1_” and “M2_”. (We call them training room databases.)
My script for backing up the master databases:
-- Backup the master training database
backup database MODTRNMaster to disk = 'f:\bkup\MODTRNMaster.bak'
backup database MODTRNMaster_IMG to disk = 'f:\bkup\MODTRNMaster_IMG.bak'
backup database MODTRNMaster_MNC to disk = 'f:\bkup\MODTRNMaster_MNC.bak'
backup database MODTRNMaster_VM to disk = 'f:\bkup\MODTRNMaster_VM.bak'
go
This is the restore script for restoring the first training room database. I’m hoping that there is just something simple that I’m overlooking in these restore statements!
-- Restore the backup of the master training database into the
-- training room #1 database.
use master
go
drop database M1_MSLH
go
restore database M1_MSLH from disk = 'f:\bkup\MODTRNMaster.bak'
with move 'DEV5_Data' to 'f:\mssql\data\M1_MLSH.mdf',
move 'MM' to 'f:\mssql\data\M1_MLSH_1.mdf',
move 'AMB' to 'f:\mssql\data\M1_MLSH_2.mdf',
move 'DM' to 'f:\mssql\data\M1_MLSH_3.mdf',
move 'IMM' to 'f:\mssql\data\M1_MLSH_4.mdf',
move 'ED' to 'f:\mssql\data\M1_MLSH_5.mdf',
move 'DEV5_Log' to 'f:\mssql\log\M1_MLSH_log.ldf',
recovery
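One thing I am going to check, in case it is relevant, is whether the .bak file holds more than one backup set; the backup statements above do not specify WITH INIT, so I assume each run appends a new set. Something like:

-- List every backup set inside the file; Position and BackupFinishDate show which one is newest
restore headeronly from disk = 'f:\bkup\MODTRNMaster.bak'
go
-- If there are several sets, a specific one can be restored by adding FILE = <position>
-- to the RESTORE DATABASE statement, e.g. with file = 3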
I am working towards automating the process of testing our backups. For the time being I do it all manually - I copy the backup files (full + transaction logs) to our test server and then run the restore script. Once the database is restored, I run DBCC CHECKDB. I manually upload the results of CHECKDB to our SharePoint portal as proof that the backup file is intact with no errors.
Here are some ideas I have but have not yet tested:
Create a maintenance plan with 3 separate jobs:
--> PowerShell script to copy the files from the Prod server to the Test server - add this script to Job1
--> PowerShell script to restore the database files - add this script to Job2
--> Run the DBCC check in PowerShell (yet to find out if that is possible in PS) - add this script to Job3 (a rough T-SQL sketch for this step is further below)
I would like to use separate jobs so as to get a report on the duration and status of each job.
I would also like to get the results of the DBCC CHECKDB as proof that no errors were found, for upload to our SharePoint portal. I don't know if that is possible via the job.
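For Job3, the T-SQL I would wrap in the PowerShell (or just run directly as a T-SQL job step) would be something like this, with the job step's output file kept as the evidence to upload (the database name is a placeholder):

-- Verify the restored copy; report all errors and suppress informational messages
dbcc checkdb (N'RestoredDb') with no_infomsgs, all_errormsgs
go

A plain T-SQL job step might turn out simpler than PowerShell for this part, since the Agent can write the step's output to a file automatically.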
I am attempting to move some SQL 2000 databases to SQL 2005. My main production database does not seem to want to move. When I use the SQL 2005 GUI the .bak backup file is marked 'Incomplete'. When I attempt to restore the backup file I get a 'RESTORE detected an error on page (0:0) in database' message. I saw a thread in the SQL Express forum suggesting trying to restore from the T-SQL level to get the GUI out of the picture and I get the same 'error on page (0:0)' message. However when I take the same file and use SQL 2000 Enterprise Manager it restores with no problems.
Hi, I'm trying to work out how to backup an ASP.NET 2.0 site which uses an SQL 2005 Express database located in the App_Data directory. It seems that the database file cannot be copied while in use so I guess that either leaves taking the site offline or maybe the database can be backed up to another location using a scheduled task? Any suggestions? Thanks, Dale
What backup options do you have if you do not have a tape drive connected to your SQL 7.0 server? Can you run a SQL Maintenance Plan and back up the databases to a remote server? If not, can ARCserve 2000 back up the SQL backup files to another server with a tape drive?
I've set up several backups in the past using the backup screen. I've also set up a Maintenance Plan to back up my database using MyDbName* where the * is the familiar suffix automatically generated by the system when the backup runs every night.
The trouble is that, even though I've deleted the old maintenance plans and scheduled backups, they're not going away. Every morning I see new nightly backups from the old plans, etc., in the backup folder. Furthermore, the maintenance plan I do want to use to back up my database isn't creating a backup file.
How can I make the old backups stop running, and make sure the new backup runs? I've gone through everything and can't seem to find any residue of these old scheduled backups--yet they still run! Can anyone help?
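In case it points someone at the answer, here is how I have been hunting for leftover jobs in msdb; I am assuming the old plans and scheduled backups would have left SQL Server Agent jobs behind (the BACKUP filter is just to narrow the list):

-- List every Agent job step whose command runs a backup, with the owning job's status
select j.name as job_name,
       j.enabled,
       st.step_name,
       st.command
from msdb.dbo.sysjobs j
join msdb.dbo.sysjobsteps st on st.job_id = j.job_id
where st.command like '%BACKUP%'
order by j.name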
We are having problems with our tape drives, so we are attempting to back up our databases to disk. I have a whole server on the network I could use for this. When I use the backup that comes with SQL 7.0 it won't allow me to choose a network drive; it looks like it must be a local drive. Is there a way around this so I could back up to this other server? I have 32 GB available on that server, which would be plenty; we don't have enough disk space on the local drives to back up there. We have also been having problems with our current tape drive company, Exabyte. Does anyone have any recommendations? We are thinking of switching to another company.
I currently back up several databases, each to its own unique device. Each backup is executed as a full backup (the databases are small) and they are set to expire after 7 days. I checked my devices after 7 days and they continue to grow in size - is this right? I expected the device size to hold constant once I had a rolling seven days of backups (the databases don't change in size from day to day). Any ideas?
I'm using the database maintenance plans to back up my databases. We are backing them up to tape. Is there a way within the database maintenance plan to tell it to format the tape instead of appending to the tape?
I am trying to make sure I have my backups scheduled correctly and in the correct spot. We have a tape drive on the server which is backing up the databases each night. The logs are currently backed up once an hour to disk. My question is: would it be better to back up the logs to the tape drive or to disk? The server where I'm backing up the logs has two drives, C and D, where D contains the data and C contains the application/program files. I'm assuming I wouldn't want the log backups on D, since if that drive crashed and we lost the data we wouldn't be able to get the logs for that day, so it would be better to put them on C. What are most people doing to back up their logs? At the end of the day, after a full backup, I clear the device that contains the log backups. Is this good practice?
How do you back up to a remote drive? The book says that in order to back up to a remote drive, you need to log on with a domain NT account with administrative privileges. I am unable to log on with a domain account and back up to a mapped drive.
For example, S1 and S2 are two servers in the domain D1. The NT domain administrative account is A1. I need to back up a database D1 on S1 to a location on S2. For this I do the following on S1:
sp_addumpdevice 'disk', 'dump_dev', '\\s2\sharename\filename.bak'
On executing, I get:
(1 row(s) affected)
'Disk' device added.
Then when I run the command backup database D1 to dump_dev I get the following error:
Server: Msg 3201, Level 16, State 1, Line 1
Cannot open backup device 'dump_dev'. Device error or device off-line. See the SQL Server error log for more details.
Server: Msg 3013, Level 16, State 1, Line 1
Backup or restore operation terminating abnormally.
The event log shows the following error message:
BackupDiskFile::CreateMedia: Backup device '\\s2\sharename\filename.bak' failed to create. Operating system error = 1326 (Logon failure: unknown user name or bad password.).
Please let me know how to back up to a mapped drive using TSQL and Enterprise Manager.
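For completeness, the variation I would also like to understand is skipping the dump device and pointing the backup straight at the UNC path. My assumption (please correct me) is that it is the SQL Server service account, not my own login, that needs rights on the \\s2 share, and that mapped drive letters are generally not visible to the service's session:

-- Back up directly to the UNC path instead of going through a mapped drive or dump device
backup database D1 to disk = '\\s2\sharename\filename.bak'
go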
I have a dump device configured to append to the backup file each time a backup is run. This file has grown extremely large. I'm looking for a way to go into the file and delete all but the last 2 or 3 backups. Is there a way to do this?
Hello, is there any backup software (ARCserve, UltraBac) that has a SQL agent for 7.0? I have not found one yet. Charlie Principato, DialAmerica Marketing, Inc.
I have a SQL2K/SP3 database with an MDF of 4 GB and an LDF of 0.5 GB. If I do a backup using Enterprise Manager, whether manually or via a DB Maintenance Plan, I get a .BAK of approx. 2 GB. If I do a backup using:
USE master
EXEC sp_addumpdevice 'disk', 'RAMScopyBAK', 'F:\MSSQL\BACKUP\RAMScopy\RAMScopy.BAK'
GO
BACKUP DATABASE RAMScopy TO RAMScopyBAK
GO
I get a .BAK of 24 GB!!!
In the end I want a T-SQL step to be part of a SQL Server Agent job that will back up the database, hence the use of the dump device command. I could create a maintenance plan, disable the jobs it creates, and then start them via sp_start_job, but this is just an example of a systemic problem; i.e. I'd have to do individual maintenance plans for more than one database.
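One thing I want to rule out is that the dump device has simply been accumulating appended backup sets, since the BACKUP statement above never specifies INIT. The variation I am going to test in the job step is roughly:

-- Overwrite the device's contents on each run instead of appending another backup set
BACKUP DATABASE RAMScopy TO RAMScopyBAK WITH INIT
GO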
For a lot of the smaller/medium sized systems that I administer, I tend to use SQL Server to back up to a backup device and then let the normal system backups copy the dump files to tape.
However, I am taking on a new system with approx 900 GB of data. We don't want to buy another 900 GB of disk to secure the databases! I have found the SQL agents in ARCserve 2000 to be unreliable, to say the least. (We don't want to stop the SQL services either.)
Would anyone recommend alternative software for backing up large databases?
Is there a way in SQL 2K running on Win2K Advanced Server to map a SCSI tape device from one SQL Server to another so that database backups can run to the remote tape device?
I took a database backup on 01-Jan-2005, and from 2nd Jan onwards I'm taking differential backups. If I want to restore the database to a new system, is it sufficient to restore the backup I took on 1st Jan and then the most recent differential backup? Will all the data up to the last backup date be restored?
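If I understand the documentation correctly, the restore on the new system would be just the following two steps (the database name and paths are placeholders); I would appreciate confirmation:

-- 1. Restore the full backup from 01-Jan, leaving the database waiting for more
restore database SalesDb from disk = 'd:\backups\SalesDb_full_01jan2005.bak' with norecovery
go
-- 2. Restore only the most recent differential, then bring the database online
restore database SalesDb from disk = 'd:\backups\SalesDb_diff_latest.bak' with recovery
go

My understanding is this recovers everything up to the point the latest differential was taken, since each differential contains all changes since the full backup.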
I'm pretty new to SQL Server. I have a question regarding backups.
I have a database which I'm backing up every day using BACKUP DATABASE db TO db_bak WITH INIT. This job runs every day successfully, but I'm left with an ever-growing transaction log file. I presumed a full database backup would apply the transaction log and truncate the log file. Isn't that so, or am I missing some concepts? This database can tolerate a data loss of 1 day, which is covered by my full database backup job, so I don't need a transaction log backup for recovery purposes. But to reduce the log file size, should I do a log backup anyway?
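For reference, the two options I think I am choosing between are below; db is the database from my job above, while the db_log_bak device and the 'db_log' logical file name are placeholders I have made up. My understanding now is that a full database backup does not truncate the log; only a log backup (or switching to the simple recovery model) lets the log space be reused.

-- Option 1: keep full recovery and also back up the log so the space inside it can be reused
backup log db to db_log_bak with init
go
-- Option 2: if point-in-time recovery really is not needed, use simple recovery so the log
-- truncates automatically on checkpoint, then shrink the oversized file once
alter database db set recovery simple
go
dbcc shrinkfile ('db_log')
go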