Backing Up: Dumps vs. SQL Backup Agents Using BackupExec
May 31, 2000
I'm going back and forth on an issue and was looking for some outside observations. I have over a dozen SQL NT servers, both 6.5 and 7.0. On some servers I'm dumping the databases and backing up the dumps with Veritas BackupExec, on 4 (two SQL 6.5, two v7) I'm backing up with the Veritas SQL Backup Agent.
Obviously, if you don't have room to dump a database, you must use the backup agent. That is the case on one of my servers and becoming so on a second. But my personal preference is to do the dump/backup.
(as a side note, one server is not backing up correctly with the backup agent but ultimately that box will require it due to DB growth, so it is something that I have to resolve)
I like the dump system for a few reasons. I find it easier to load from a dump, particularly a single table. Likewise, I find it easier to copy a database by loading from a dump rather than going from a backup, but that's mainly because of BackupExec being a little bit strange on redirecting restores.
Here's my clincher. I find restores via the backup agent to be ridiculously slow. Let's say I have a 5gig DB that has 1.5gig in it. The dump size will probably be somewhere below 2gig. The restore via backup agent carefully writes the entire 5gig even though 3.5g of that is empty. This takes a lot of time. Add to that the "post-restore DBCC". I had such a restore take something on the order of 13 HOURS which, needless to say, conflicted with my nightly DBCC's and backups.
OK. End of rant. Any suggestions or thoughts on the subject?
Before I started with this employer the server support staff had planned a backup strategy that included using database agents for Oracle and SQL Server. Nothing is really set up yet for SQL Server so I can still change this direction. Has anyone seen a definite benefit to using database backup agents? If so, what benefits have you seen? There doesn't seem to be much value added by paying for and using an additional product when the database's own utilities are so easy to use, and all the backup files can be backed up to tape with the basic backup software. I've not worked with it, though, so perhaps I am missing something. These are small databases so space is not an issue. Any opinions/comments are appreciated.
Any thoughts or experiences (good or bad) on using the SQL agents from Arcserve or BackupExec for backups as opposed to standard dumps to disk?
I have always used dumps and taken the position that these tools work well and reliably, and do not depend on another device (a tape drive) for backups. They should also be quicker, and not cause any problems with table/log concurrency or delays in updates while the backup takes place.
Also, does anyone know how these products actually maintain a consistent backup while the database is in use? Has anyone ever used one of these backups for a recovery?
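For reference, the kind of plain disk dump being advocated here is a one-liner. A minimal sketch, with a hypothetical database name and dump path (the SQL 7.0 BACKUP form is equivalent):

    -- SQL 6.5 style dump straight to a local disk file
    DUMP DATABASE mydb TO DISK = 'e:\dumps\mydb.dmp' WITH INIT
    -- SQL 7.0 equivalent
    BACKUP DATABASE mydb TO DISK = 'e:\dumps\mydb.bak' WITH INIT

Both run while the database stays online and readable, which is where the concurrency point above comes from; the resulting file can then be swept to tape by the basic backup software.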
Can anyone tell me where I can find information about why SQL Server might be executing a stack and/or symptom dump? That is, what are some of the conditions that might cause this? Every time I try to run DBCC DBREINDEX on a certain table, stack and symptom dumps occur and SQL Server remains incommunicado (can't connect, even locally) until I restart it. I'm going to try to fix the table by renaming it, creating a new table (and dependent objects), transferring the data, and toasting the original table. No problem, I've done that before with smashing success. But if this solves my problem, I'll still be no wiser about what caused it in the first place. Any suggestions, questions, comments, or even jeers would be appreciated. Thanks, Chris
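The rebuild-and-swap being described looks roughly like this; a sketch with a hypothetical one-column table standing in for the real schema:

    -- copy the data into a freshly created table, then swap the names
    CREATE TABLE dbo.suspect_copy (id int NOT NULL)   -- recreate the real schema here
    INSERT dbo.suspect_copy SELECT id FROM dbo.suspect
    EXEC sp_rename 'dbo.suspect', 'suspect_old'
    EXEC sp_rename 'dbo.suspect_copy', 'suspect'
    -- recreate indexes, triggers, and permissions, then drop dbo.suspect_old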
I've recently set up Backup Exec on our SQL Server to back up our databases to DLT tape every night. However, when backing up the .mdf and log files these are skipped because it says they are in use, yet if you look in Current Activity they are not in use. Please could someone help/advise.
I'm reviewing our backup strategy. Can anyone advise me what advantage there is to using Backup Exec instead of just backing up the database and transaction logs straight to a DLT tape? Thanking you in advance.
We've been using BackupExec for a while to back up our SQL 2000 databases, but we've suddenly encountered a problem while trying to back up some of them. It seems that it won't back up databases that have a recovery model of "simple". Our Network Admin can't find anything in BackupExec that references Recovery Model, and we haven't changed the recovery model on the databases since they were created .....
Here's the error from the event log ... any ideas ?
An error occurred while executing the following query: "BACKUP DATABASE [TI] FILEGROUP='PRIMARY' TO virtual_device = 'BackupExecSqlAgent_PRIMARY_00'". On server: "SQLDB-01". SQL error number: "1070". SQL error message: "The statement BACKUP DATABASE...FILE=<name> is not allowed while the recovery model is SIMPLE. Use BACKUP DATABASE or change the recovery model using ALTER DATABASE. ".
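Two plausible ways out, sketched against the database named in the error: under the SIMPLE model a full-database backup is still allowed, or the recovery model can be switched so the filegroup-level backup the agent issues becomes legal. The dump path is an assumption:

    -- option 1: full-database backup, which SIMPLE recovery permits
    BACKUP DATABASE [TI] TO DISK = 'e:\dumps\TI.bak' WITH INIT
    -- option 2: switch to FULL recovery so FILEGROUP backups are allowed
    -- (start taking log backups afterwards, or the log will grow unchecked)
    ALTER DATABASE [TI] SET RECOVERY FULL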
I currently have Backup Exec 11 with the SQL Agent backing up my databases on a daily full and 4-hour incremental basis. I want to configure log shipping as an additional level of fault tolerance for my data. I have a few test databases set up with log shipping and they are working correctly. But my question is related to Backup Exec.
Now that I have logs going to a second SQL server how will Backup Exec play with the primary server logs? Do I need to configure Backup Exec in a certain way, say do not truncate logs, etc? Or can I just keep the backups going as I have them and log shipping will just work at the same time.
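The thing to watch is that two tools taking ordinary log backups of the same database will interleave, and the log-shipping restore sequence breaks. A hedged sketch, assuming SQL Server 2005 (COPY_ONLY does not exist in SQL 2000): any extra log backup taken outside the log-shipping jobs can be made COPY_ONLY so the chain is untouched:

    -- an out-of-band log backup that neither truncates the log
    -- nor becomes part of the log-shipping restore sequence
    BACKUP LOG MyDB TO DISK = 'e:\dumps\MyDB_extra.trn' WITH COPY_ONLY

If the Backup Exec agent can only take normal log backups, the usual advice is to exclude log-shipped databases from its log jobs and let it take full backups of them only.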
We currently use CA ArcServe (ArcServe 6.5 Enterprise and Single Server Editions) to backup our Windows NT files and MS-SQL Server databases. We have experienced significant reliability issues with ArcServe. Many times we have found ourselves rebuilding a corrupt ArcServe Job (ArcServe’s backup schedule) database. One of our NT server occasionally NT bug checks when ArcServe is performing backups. Occasionally ArcServe Jobs incorrectly reschedule themselves. Sometimes the Jobs do not complete but stay executing, not performing any work, and to cancel them may require a lot of effort. The ArcServe job DB repair utility generally does not work. The user interface is lacking. For example, the job scheduling options are very limited. CA tech support for this product has been poor. Because we have issues with ArcServe stability we are now evaluating Veritas (formally Seagate) Backup Exec for NT. What are other people’s experiences with these 2 products?
I need to set up a dump of my main database's transaction log to dump every hour but not overwrite until the 8th hour (keeping 7 log dumps). Can anyone tell me how to set up the scheduled tasks for the transaction log dump? Is there anything special I need to do for the main database dump?
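A minimal sketch of the rotating dump, with hypothetical names and paths: keying the file name to the hour modulo 7 gives seven files, each overwritten again on the eighth run. Schedule it hourly as a SQL Executive task (6.x) or SQL Server Agent job (7.0); the main database dump itself needs nothing special beyond its own file and WITH INIT.

    -- hourly scheduled task: dump the log to one of seven rotating files
    DECLARE @sql varchar(255)
    SELECT @sql = 'DUMP TRANSACTION maindb TO DISK = ''e:\dumps\maindb_log_'
                + CONVERT(varchar(1), DATEPART(hh, GETDATE()) % 7)
                + '.trn'' WITH INIT'
    EXEC (@sql)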
Hi: Actually we have problems doing database dumps, so we're copying the .DAT files directly to a backup machine. I mean the .DAT files that SQL Server uses as data devices, not dump files; we cannot execute the dump process.
What problems can we expect doing this? And what could be the reasons we cannot dump our DBs?
I have noticed a question on the Admin exam which involves declaring and using a time variable for scheduling a backup or dump of a database. Is this possible, and has anyone else seen this on the exam or used it in reality?
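It is possible, at least on 7.0, where the backup statement accepts a variable for the device. A sketch with a hypothetical path:

    -- build a dated file name in a variable and back up to it
    DECLARE @file varchar(128)
    SELECT @file = 'e:\dumps\pubs_' + CONVERT(varchar(8), GETDATE(), 112) + '.bak'
    BACKUP DATABASE pubs TO DISK = @file WITH INIT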
Our SQL Server is giving us a headache: after a certain period of time, the SQL service either shuts down by itself or hangs. I've opened the logs and found hex dumps. Can you help me out with these?
I've read that the SQL dump can only be done directly on the local server. Is there a way to write the dump directly to another server via the network (not copying it afterwards, but writing it during the backup process)? Any help much appreciated.
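It can be written straight to a share, with one catch. A sketch assuming a hypothetical UNC path:

    -- dump directly across the network; the MSSQLServer service account
    -- (not your login) must have write permission on the share, and a
    -- service running as LocalSystem has no network credentials, so the
    -- service needs to run under a domain account for this to work
    DUMP DATABASE mydb TO DISK = '\\otherserver\dumps\mydb.dmp' WITH INIT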
We are experiencing high CPU utilization across all 4 CPUs at the top of the hour when our transaction log dump job runs. Has anyone observed this behavior before? Is there anything we can do to mitigate it? Thank you.
I am testing a procedure to automate the transaction log dump. I am following the steps located in Chapter 22 of the Microsoft SQL Server Administrator's Companion ("Automatic Transaction Log Dumps Using Performance Monitor"). The alert in Performance Monitor appears to be starting when the log is 75% full, but the alert is not firing off the file that contains the dump transaction SQL command. For the 'Run Program on Alert' box this is what I have: isql -Ssvrname -Usa -P -i d:\apps\mssql\inn\dump.sql. The dump.sql file contains: 'dump transaction pubs with no_log'
I have also tried the following 4 steps: 1) Created a SQL Alert Message, 2) Created an NT Performance Monitor threshold alert to run sqlalrtr to issue a certain error when the pubs log is 75% full, 3) Created a TSQL Task, and 4) Created a SQL Server Alert to run the Task created in step 3. This appears to do the same thing: the Alert is fired off, but the Task is never executed. Note: I am able to execute the task from within the Scheduled Tasks window.
I am using Standard Security with SQL Server 6.5 (SP5a) running on NT4. Thanks for your help in advance.
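One thing worth checking is the 'Run Program on Alert' command line itself: Performance Monitor runs it with no working directory or quoting help, so isql generally needs its full path, a space between -i and the file name, and an explicit empty password so -i is not swallowed as the password. A sketch reusing the poster's names (the isql path is an assumption):

    c:\mssql\binn\isql -Ssvrname -Usa -P"" -i d:\apps\mssql\inn\dump.sql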
I am new to Microsoft SQL Server as I am from an Oracle background. I am preparing for MCP certification Exam 70-229, Designing and Implementing Databases with Microsoft SQL Server 2000 Enterprise Edition. The exam shows as SQL Server 2000; is there any exam for SQL Server 2005?
Is it the right exam to take to start as a fresher on the Microsoft platform? I am into development, not administration.
If anybody has any exam preparation questions please forward them to krishna.kanigelpula@gmail.com.
How can I check to make sure that my dumps are not corrupted? I have been using a utility from Microsoft called DSCAN5, but have found that this has some limitations.
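Beyond DSCAN5, the server itself can sanity-check a dump file. A sketch with hypothetical paths; note that neither check proves the logical consistency of the data inside, only that the dump media is complete and readable:

    -- SQL 6.5: read the dump header to confirm the file is a readable dump
    LOAD HEADERONLY FROM DISK = 'e:\dumps\mydb.dmp'
    -- SQL 7.0: verify the backup set is complete and readable
    RESTORE VERIFYONLY FROM DISK = 'e:\dumps\mydb.bak'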
Does anybody know how to automate the loading of incremental transaction dumps? The manual way is to use the "load tran DB with FILE = x" statement.
Since this has to be done in the right sequence, and I need to automate this task to keep a second server up to date, I'd like to know if there is a stored procedure or any other tool which could do the task.
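A minimal sketch of the loop, assuming all the incremental dumps live as numbered backup sets on a single device and that seven of them need applying; the names and count are placeholders:

    -- apply numbered log dumps in order from one device using WITH FILE = n
    DECLARE @i int, @sql varchar(255)
    SELECT @i = 1
    WHILE @i <= 7   -- number of dumps to apply: an assumption
    BEGIN
        SELECT @sql = 'LOAD TRANSACTION secondary_db FROM DISK = ''e:\dumps\maindb_log.dmp'' WITH FILE = '
                    + CONVERT(varchar(3), @i)
        EXEC (@sql)
        SELECT @i = @i + 1
    END

Since the sequence matters, driving the FILE numbers from the dump history in msdb (sysbackuphistory on 6.5, backupset on 7.0) is safer than hard-coding the count.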
I have a package which accesses a DB2 database and pulls data from a single table. I can't tie it to a specific event, but the package has been causing a dump to occur on a rather regular basis. The really odd part is that sometimes when I add a data viewer on the output link of the OLE DB Source it works... then it starts to dump again a couple of executions later. There are no date/time values involved in the result set, just character strings. The default code page is set to 1252 and 'use default code page' is set to False... any ideas appreciated - this is really starting to drive me nuts!
Our users believe that we lost some valid data, but no one knows who did it. I thought I could find it in the transaction dumps I take every hour. Can I read transaction dump (*.TRN) files in SQL Server 7.0, or can I get this information through other means?
I am using MS SQL 6.5 SP5 on NT 4.0, running Tivoli Service Desk. Two of the four agents on the applications server were not running so I recycled the server, and now NONE of the agents are coming up. Generally the agents show up on the application server control panel but they have not started. The status of the server is 'initializing' and the Watchdog has started, but only the 'dispatcher' started. I have hard booted the apps server and recycled both the apps and DB servers. I know that there is a way to suspend the agents, but that has not been done, and the 'resume' button is not available to use. I tried restarting the dispatcher but it didn't do anything. I have also gone into 'config' on the control panel and it shows that '4' agents are set to start up.
We run a Web version also so I have my help desk analysts using it but they cannot refer back to tickets that they have just opened, etc.
ANY SUGGESTIONS - I have been waiting on Tivoli to call me back for almost an hour and a half.
I have around 25 SQL Servers (SQL Server 2000), all on Windows Server 2003. I would like to know if anyone has a script that will poll all the servers and check to make sure the Agent is running.
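Not a polished script, but a sketch of one way to do it from a central server, assuming xp_cmdshell is available and Windows authentication works against every box; xp_servicecontrol is undocumented but widely used for exactly this, and the server names below are placeholders:

    -- list of servers to poll (hypothetical names)
    CREATE TABLE #servers (name sysname)
    INSERT #servers VALUES ('SQLPROD01')
    INSERT #servers VALUES ('SQLPROD02')

    DECLARE @s sysname, @cmd varchar(500)
    DECLARE c CURSOR FOR SELECT name FROM #servers
    OPEN c
    FETCH NEXT FROM c INTO @s
    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT @s
        -- shell out to osql and ask each server for its Agent service state
        SELECT @cmd = 'osql -S' + @s
                    + ' -E -Q "EXEC master..xp_servicecontrol ''QUERYSTATE'', ''SQLSERVERAGENT''"'
        EXEC master..xp_cmdshell @cmd
        FETCH NEXT FROM c INTO @s
    END
    CLOSE c
    DEALLOCATE c
    DROP TABLE #servers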
Is it possible to obtain the status, running or not running, of the distribution agents that appear under Replication Monitor\Agents\Distribution Agents in Enterprise Manager using WMI? If yes, can you please tell me the WMI namespace and objects that can be used?
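Whether WMI exposes this is unclear, but as an alternative technique the distribution database itself records each agent's runs, which Replication Monitor is reading from. A hedged T-SQL sketch (runstatus 3 means in progress, 4 idle, 6 failure):

    -- latest history row per distribution agent
    SELECT a.name, h.runstatus, h.time
    FROM distribution.dbo.MSdistribution_agents a
    JOIN distribution.dbo.MSdistribution_history h
      ON h.agent_id = a.id
    WHERE h.timestamp = (SELECT MAX(timestamp)
                         FROM distribution.dbo.MSdistribution_history
                         WHERE agent_id = a.id)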
I have a case of a SQL Server 2005 production DB's publication which has no Log Reader, Snapshot, or Distribution agents. I need to modify the primary key in one of the tables, and the existing publication does not allow this. When I try to start the snapshot job in the GUI I of course get the error: "Could not find the regular snapshot job for the specified publication. Supply either @job_id or @job_name to identify the job, error 20678."
Is there any way to erase it with system stored procs or SQL? Help is greatly appreciated.
I have a case where a SQL Server 2005 publication has no agents - the agents were accidentally erased during a backup/restore. I was able to erase the subscription.
The error is "The specified @job_name <job name> does not exist. (Microsoft SQL Server, Error: 14262)"
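For stripping out the broken metadata, the usual escalation is through the replication system procedures, ending with the blunt instrument that clears everything; the publication and database names below are placeholders:

    -- drop what remains of the subscription and publication
    EXEC sp_dropsubscription @publication = 'MyPub', @article = 'all', @subscriber = 'all'
    EXEC sp_droppublication @publication = 'MyPub'
    -- last resort if the jobs are already gone and the above fail:
    -- removes every trace of replication from the published database
    EXEC sp_removedbreplication 'MyPublishedDB'

After that the primary key change should go through, and the publication can be rebuilt from scratch.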