Running Back Up Logs To A Different Server To Maintain A Copy
Dec 28, 1998
I am going to use the backup and restore functions to copy data from one server to another. We would like to keep the servers in sync (not instantaneously, but updated say once a day), and I would like to do this by backing up and restoring the transaction logs. I have tried restoring individual transaction log backups, but it seems to require restoring the full database as well. The database is roughly 6 GB and the transaction logs are about 25-50 MB, so I really do not want to have to restore the full database every time.
I know I could set up replication, but that has been more of a pain to administer on a daily basis. I would like a schedule-and-forget type of thing. This is going to be done on 6.5.
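A minimal log-shipping sketch in 6.5-era DUMP/LOAD syntax, from memory and hedged accordingly; the database name SalesDb and the disk paths are placeholders. The point is that the full database only needs to be loaded once; after that, only the much smaller transaction log dumps are applied:

    -- One time only, on the standby server:
    LOAD DATABASE SalesDb FROM DISK = 'D:\Standby\SalesDb.dmp'

    -- Then, once a day, on the production server:
    DUMP TRANSACTION SalesDb TO DISK = 'D:\Standby\SalesDb_log.dmp' WITH INIT

    -- And on the standby server:
    LOAD TRANSACTION SalesDb FROM DISK = 'D:\Standby\SalesDb_log.dmp'

On later versions (7.0 and up) the same pattern is RESTORE DATABASE ... WITH NORECOVERY once, followed by RESTORE LOG ... WITH NORECOVERY (or STANDBY) for each log backup.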
I am using SQL Server 2012. I want to maintain all types of logs for a particular database or server. I want to track every query executed in a particular database, as well as all other activity. How can I do this?
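One option on SQL Server 2012 is an Extended Events session; a minimal sketch, where the database name YourDb and the event-file path are placeholders:

    CREATE EVENT SESSION TrackQueries ON SERVER
    ADD EVENT sqlserver.sql_batch_completed (
        ACTION (sqlserver.client_hostname, sqlserver.username, sqlserver.database_name)
        WHERE sqlserver.database_name = N'YourDb')          -- placeholder database
    ADD TARGET package0.event_file (SET filename = N'C:\XELogs\TrackQueries.xel');
    GO
    ALTER EVENT SESSION TrackQueries ON SERVER STATE = START;

The captured batches can be read back with sys.fn_xe_file_target_read_file; if a formal, tamper-resistant trail is needed, SQL Server Audit is the heavier alternative.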
I am in a situation where I need to get a copy of a test database that is on a production server running MSSQL 2000 Standard onto my local machine running MSSQL 2000 Personal. I tried to use the Copy Database Wizard, where it appears I get connected to the source server OK, but when I try to indicate the destination server, which is my local machine, I get errors popping up about cannot connect to (local), etc. I am NOT a DBA, just a programmer trying to get a local test environment up to be more productive.
Lsumnler
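A backup-and-restore sketch that sidesteps the wizard entirely, assuming a database named TestDb; the logical file names in the MOVE clauses are placeholders that RESTORE FILELISTONLY will report:

    -- On the production server:
    BACKUP DATABASE TestDb TO DISK = N'C:\Temp\TestDb.bak' WITH INIT

    -- Copy TestDb.bak to the local machine, then check the logical file names:
    RESTORE FILELISTONLY FROM DISK = N'C:\Temp\TestDb.bak'

    -- Restore, relocating the files to local paths:
    RESTORE DATABASE TestDb FROM DISK = N'C:\Temp\TestDb.bak'
    WITH MOVE 'TestDb_Data' TO N'C:\MSSQL\Data\TestDb.mdf',
         MOVE 'TestDb_Log'  TO N'C:\MSSQL\Data\TestDb_log.ldf'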
We are running SQL 7 on a Windows NT server. If you copy a 25 MB file from this machine to a W2K server, the file copy takes over 5 minutes on a 100 Mbps switched network. Copying the same file to another NT server takes only seconds, and copying the same file to the W2K server from the 2nd NT server (which is not running SQL) also takes only seconds. Has anyone any ideas as to why file copying between this machine and a W2K one takes so long? The problem is replicated on 5 further W2K machines.
Hi! I know how to save a DTS package as a structured storage file and how to recreate a DTS package using that file. What I can't find is a command, either in T-SQL or from a DOS prompt, with which I can save the DTS package as a structured storage file, for example in a scheduled job for backup purposes. Hopefully someone out there has an answer!
Thanks,
David Greenberg
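If memory serves, dtsrun can do this from the command line: the /!X switch blocks execution and, combined with /F, writes the package retrieved from the server out to a structured storage file. A hedged sketch (server name, package name, and path are placeholders; /E means a trusted connection):

    dtsrun /S MyServer /E /N "MyPackage" /F "D:\DtsBackup\MyPackage.dts" /!X

Wrapped in a CmdExec job step, that gives the scheduled backup the post asks about.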
For some filegroups, we periodically turn READ_WRITE back on to trickle some data into an archive. This lasts a few seconds, and then we turn READ_ONLY back on. What is the impact of doing this while a query is running against the archive?
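For reference, the toggle in question is a sketch like this (database and filegroup names are placeholders):

    ALTER DATABASE ArchiveDb MODIFY FILEGROUP ArchiveFG READ_WRITE;
    -- ... trickle the inserts here ...
    ALTER DATABASE ArchiveDb MODIFY FILEGROUP ArchiveFG READ_ONLY;

As I understand it, changing a filegroup's read-only state needs exclusive access (on some versions, exclusive access to the whole database), so the ALTER will block behind an in-flight archive query, and block new ones, rather than silently changing state underneath a running query.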
Does anybody know of a way to roll back SQL Server 2005 databases to SQL Server 2000? Is there a way of doing it without resorting to the Copy Database Wizard? I'd love to find a way of attaching a SS 2005 database to a SS 2000 instance without any issues.
I recently upgraded to SS 2005, I am very unhappy with it, and I want to roll back to SS 2000, which was a lot more stable. I am having several major issues that are affecting my whole company's day-to-day operations, and the managers are not happy.

One issue is the night-time batch running very sluggishly for no apparent reason. This is the biggest problem because it only occurs about once a week and causes a disturbance with the daily activities when the night-time processing isn't completed on time. The rest of the time, the batch processing runs great, even a little better than on SS 2000. I don't believe my application needs to be retuned, because if that were the case, why isn't it running sluggishly every night? It is also never the same day that the sluggish behavior occurs; if it happened on the same night each week, I would have something to investigate within our application, but it doesn't.

Another issue involves a night-time job that restores a copy of the production database to the data warehouse server, to be used for updating the data warehouse. Again, most of the time it runs great (~2 1/2 hours), but once or twice a week it goes stupid and takes 6 1/2 hours for no apparent reason. Again, it doesn't happen on the same day, which could otherwise give me something to investigate. On SS 2000, this same job ran flawlessly; never did I run into a situation where the database restore took that long.

Yet another issue involves a SQL Server Agent job that was put into a suspended state. What is a suspended state, and how can I get the job out of it? I can find no information about suspended states in BOL, and a Google search turned up nothing. If this suspended state was put in for security reasons, great, but then tell me how I can remove it.

I am also not happy that I can't get accurate information about the queries that are actively running at a particular moment. In SS 2000, when I noticed high CPU usage on the server, I would run sp_who2 'active' and it would show me all the active threads and how much CPU each was consuming. I would then find the running threads with the highest CPU numbers and investigate the query to see if we could improve it. Now, in SS 2005, I get into the same situation, run sp_who2, and there is no smoking gun: all of the active threads show very little CPU usage, which I am very suspicious of. What the heck happened to sp_who2? I have looked at some of the other ways of viewing running processes (e.g. sys.sysprocesses) and they don't appear to give the information I need.
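For what it's worth, on SS 2005 the dynamic management views usually give a clearer picture than sp_who2; a minimal sketch (the session_id > 50 filter to skip system sessions is a common convention, not a requirement):

    SELECT r.session_id, r.status, r.cpu_time, r.total_elapsed_time, t.text
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.session_id > 50
    ORDER BY r.cpu_time DESC;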
I am very unhappy and I just want to roll back to SS 2000 and wait a couple of years before I upgrade to SS 2005.
I need to set up on my computer, with MSSQL Server 2005 SP2, an exact copy of a running MS SQL Server 2005 SP1 instance (with system objects, accounts, etc.).
In the same LAN.
Well, not an exact one, but one as close as possible...
And what exactly may I mimic without interfering with the existing SS 2005 in production?
What is the best way to do it without invoking excessive (manual) work and problems?
I see dozens of ways and I am still in doubt... no ideal or straightforward solution.
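A common low-interference sketch is backup/restore for the user databases plus a scripted transfer of logins; SomeDb and the paths below are placeholders:

    -- On the source instance; COPY_ONLY avoids disturbing the production backup chain:
    BACKUP DATABASE SomeDb TO DISK = N'\\FileShare\Clone\SomeDb.bak' WITH COPY_ONLY, INIT;

    -- On the target instance, relocating files as needed:
    RESTORE DATABASE SomeDb FROM DISK = N'\\FileShare\Clone\SomeDb.bak'
    WITH MOVE 'SomeDb'     TO N'D:\Data\SomeDb.mdf',
         MOVE 'SomeDb_log' TO N'D:\Data\SomeDb_log.ldf';

Logins live outside the user databases; Microsoft's sp_help_revlogin script (KB 918992 for SQL Server 2005) scripts them out with their original SIDs and password hashes so they can be recreated on the target without orphaning database users.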
I am working in ASP.NET 2.0 and using SQL Server 2000 as the backend. In my application I need to insert/update an Oracle database table lying on a different server. Please let me know how I can maintain two different connections to different databases lying on different servers.
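Besides holding two ADO.NET connections in the application, one option is to let SQL Server reach Oracle through a linked server, so the application keeps its single connection; a hedged sketch, assuming the Oracle OLE DB provider is installed, and with ORA_LINK, OracleTNSName, and the table/credential names as placeholders:

    EXEC sp_addlinkedserver
        @server     = N'ORA_LINK',
        @srvproduct = N'Oracle',
        @provider   = N'OraOLEDB.Oracle',
        @datasrc    = N'OracleTNSName';    -- TNS alias, placeholder

    EXEC sp_addlinkedsrvlogin @rmtsrvname = N'ORA_LINK', @useself = 'false',
        @rmtuser = N'scott', @rmtpassword = N'tiger';   -- placeholder credentials

    -- Four-part name for Oracle omits the catalog: linkedserver..OWNER.TABLE
    INSERT INTO ORA_LINK..SCOTT.ORDERS (ORDER_ID, AMOUNT) VALUES (1, 99.50);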
I am trying to export the result of a SELECT into a .csv file using SQL Server 2000 DTS. The data for varchar fields has leading zeroes in the database, which are very much required in the csv file.
But the .csv file trims the leading zeroes. How do we force it to maintain the same data as in the source?
I used a Text File Destination Connection as the destination, with the options below:
File Extension: .csv
File Format: Delimited
File Type: ANSI
Text Qualifier: Double Quotes ("")
Row Delimiter: {CR}{LF}
Column Delimiter: comma
Source data: 0123
Target data (requirement): 0123
The data in the .csv: 123 (this is the issue)
When I open this file in a Text Editor, I do see the data in double quotes..."0123".
Is there any way to maintain an audit trail of access to my SQL Server 2000 database by any user? I need to log the timestamp of any insert/update/delete to any record in a table within the database, along with the user who made it.
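SQL Server 2000 has no built-in change audit, so the usual approach is a trigger per table; a minimal sketch, with AuditLog and the Orders table as hypothetical names:

    CREATE TABLE AuditLog (
        AuditId    INT IDENTITY PRIMARY KEY,
        TableName  SYSNAME  NOT NULL,
        Action     CHAR(1)  NOT NULL,                     -- I / U / D
        ChangedBy  SYSNAME  NOT NULL DEFAULT SUSER_SNAME(),
        ChangedAt  DATETIME NOT NULL DEFAULT GETDATE()
    )
    GO
    CREATE TRIGGER trg_Orders_Audit ON Orders FOR INSERT, UPDATE, DELETE
    AS
    BEGIN
        DECLARE @action CHAR(1)
        IF EXISTS (SELECT * FROM inserted)
        BEGIN
            IF EXISTS (SELECT * FROM deleted) SET @action = 'U'
            ELSE SET @action = 'I'
        END
        ELSE SET @action = 'D'
        INSERT INTO AuditLog (TableName, Action) VALUES ('Orders', @action)
    END

Extending the INSERT to capture key values from the inserted/deleted pseudo-tables gives row-level detail where you need it.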
In my case I have to log the errors raised by any task in a package to either the Windows event log, a text file, or SQL Server. I also need to send an email notification to a group of people telling them about the error.

Can I use SSIS package logging to log the errors to the required destinations? I mean right-clicking on the package, selecting Logging, then adding the required log providers and enabling the events to log to them. I think I have to select the log providers up front; I will not have the liberty of logging the error to a destination whose name is passed as a variable to the package. This is okay with me, though.

Now, what would a custom log provider help me do in this case? Also, can I somehow configure my package to call the Send Mail task every time an error is raised?

One more option could be developing a package that only does the error handling. It would take in parameters for the error codes and descriptions, the destination to write to, and a flag indicating whether or not to send mail for that particular type of error.
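As a side note, logging can also be switched on at invocation time rather than baked into the package; if I recall the dtexec switches correctly, something like this sketch (package path and connection manager name are placeholders) attaches the text file log provider at run time:

    dtexec /F "C:\Packages\MyPackage.dtsx" /L "DTS.LogProviderTextFile;MyLogFileConn" /ConsoleLog

For the notification requirement, a Send Mail task placed in the package's OnError event handler fires once per error raised, which covers the "email on every error" case without custom code.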
Hello,

I'm trying to create a simple backup in the SQL Maintenance Plan that will make a single backup copy of all databases every night at 10 pm. I'd like the previous night's file to be overwritten, so there will be only a single backup file for each database (the tape backup runs every night, so each day's backup will be saved on tape).

Every night the maintenance plan backs up all the databases to a new file with a datetime stamp, meaning the previous night's file still exists. Even when I check "Remove files older than 22 hours", the previous night's file still exists. Is there any way to create a backup file without the datetime stamp so it overwrites the previous night's file?

Thanks!
Rick
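One workaround, if the maintenance plan can't be persuaded, is a scheduled SQL Agent job running a plain BACKUP DATABASE with INIT, which overwrites the backup sets in a fixed file name each night; a sketch with placeholder names:

    BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH INIT

One such statement per database (or a cursor over sysdatabases) in a nightly job reproduces the plan's behavior with a constant file name for the tape backup to pick up.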
I'm new to database mirroring and I have a question about the principal database server. I have a database mirroring setup configured for high-safety mode with automatic failover, using a witness.
When a failover occurs because of a loss of communication between the principal and the mirror, the mirror server takes on the role of principal. When communication to the original principal server is restored, does the database that was previously the principal automatically go back to being the principal at some point?
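As far as I know it does not fail back automatically: the old principal rejoins as the mirror and stays in that role. Once the partners are synchronized you can fail back manually; a sketch, run on the current principal (MirroredDb is a placeholder):

    ALTER DATABASE MirroredDb SET PARTNER FAILOVER;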
I need to run two A5-size reports back to back and print them on a single A4 page: the sale bill will be printed in the first half, and the gate pass in the second half. Both reports will be on the same page, and their size and shape should be maintained. How do I do this?
Hi experts, my msdb database is about 2 GB, which to me is really big. Is there a way to maintain that, and how? Also, the disk-level fragmentation is bad on one of my drives (some data files are on there, and msdb is there too). Is there any third-party tool I can use to do the defragmentation on a set schedule? Please help. Thanks!
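msdb is commonly bloated by old backup/restore history and Agent job history; a sketch of the built-in cleanup procedures (the cutoff date is a placeholder, and on newer versions sp_purge_jobhistory also accepts an @oldest_date parameter):

    -- Purge backup/restore history older than the given date
    EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = '2007-01-01'
    -- Purge SQL Agent job history
    EXEC msdb.dbo.sp_purge_jobhistory

After the purge, the freed pages can be reclaimed with a one-off DBCC SHRINKDATABASE (msdb) if the file size itself matters.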
Hello,

I am hoping you can help me with the following problem. I need to run the following steps every couple of hours in order to keep our SQL 2000 database as small as possible (the transaction log is 5x bigger than the db):

1. back up the entire database
2. truncate the log
3. shrink the log
4. back up once again

As you may have determined, I am relatively new to managing a SQL Server database, and while I have found multiple articles online about these topics, I cannot find any actual examples that explain where I input the code used to accomplish the above-mentioned steps. I understand the theory behind the steps; I just do not know how to carry them out! If you know of a well-documented tutorial, please point me in the right direction.

Regards.
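For what it's worth, the four steps map directly to T-SQL you can paste into Query Analyzer or schedule as a SQL Agent job; a sketch in SQL 2000 syntax, where MyDb and the logical log file name MyDb_Log are placeholders (check yours with sp_helpfile):

    -- 1. full backup
    BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH INIT
    -- 2. truncate the log (SQL 2000 syntax; breaks the log backup chain)
    BACKUP LOG MyDb WITH TRUNCATE_ONLY
    -- 3. shrink the log file to roughly 100 MB
    DBCC SHRINKFILE (MyDb_Log, 100)
    -- 4. back up once again
    BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb_post.bak' WITH INIT

Note that if the log's size is a recurring problem and point-in-time recovery isn't needed, switching the database to the SIMPLE recovery model avoids the truncate/shrink cycle altogether.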
Hi all, pardon me for asking such a question; I am still a beginner at ASP.NET. I have a project that requires a single operation to update two databases. How do I maintain a transaction across these two databases? Please advise, thank you!
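If both updates can be issued from the SQL Server side, with the second database reached through a linked server, a distributed transaction keeps them atomic; a sketch assuming MSDTC is running, where LINKED2 and the table names are placeholders (from ADO.NET, System.Transactions or a shared SqlTransaction plays the equivalent role):

    BEGIN DISTRIBUTED TRANSACTION
        UPDATE dbo.Accounts
        SET Balance = Balance - 100 WHERE AccountId = 1
        UPDATE LINKED2.OtherDb.dbo.Accounts
        SET Balance = Balance + 100 WHERE AccountId = 9
    COMMIT TRANSACTION

If either statement fails, rolling back undoes both, which is exactly the cross-database guarantee the question asks for.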
My application uses SQL for performing operations, something like connection.execute(query), so there is only a connection object, no recordset object or anything like that.
I want to run multiple instances of my application, so I want to maintain the integrity of the data, and I am looking for a SQL-side locking mechanism so that concurrent data access doesn't corrupt the data.
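Since everything goes through connection.execute, one server-side option is an application lock around the critical section; a sketch using sp_getapplock, where the resource name 'InventoryUpdate' is a placeholder, which serializes whichever statements you wrap in it across all application instances:

    BEGIN TRAN
    EXEC sp_getapplock @Resource    = 'InventoryUpdate',
                       @LockMode    = 'Exclusive',
                       @LockTimeout = 5000   -- ms; fail instead of waiting forever
    -- ... the statements that must not interleave ...
    COMMIT TRAN   -- the lock, owned by the transaction, is released here

For ordinary row-level integrity, properly scoped transactions (and hints such as UPDLOCK on read-then-write patterns) are usually enough; the application lock is for sections that must be genuinely serial.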
Sebastian Garibaldi writes: "Hi, I'm Sebastian from Argentina, and I have a problem with a SQL database. I receive errors from the database about broken indexes and consistency errors. I set the fill factor using the information from Books Online; I put 70 on tables that have a lot of INSERT/UPDATE/DELETE activity, but it only works for two or three days.
I know that I must do some maintenance on the database, but which tools should I use?
Here I paste an error from DBCC CHECKTABLE:
Server: Msg 8964, Level 16, State 1, Line 1
Table error: Object ID 981108969. The text, ntext, or image node at page (1:949979), slot 52, text ID 57535781339136 is not referenced.
Server: Msg 8964, Level 16, State 1, Line 1
Table error: Object ID 981108969. The text, ntext, or image node at page (1:949979), slot 53, text ID 57535782191104 is not referenced.
DBCC results for 'FCRMVI'.
There are 108460 rows in 17430 pages for object 'FCRMVI'.
CHECKTABLE found 0 allocation errors and 2 consistency errors in table 'FCRMVI' (object ID 981108969).
repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKTABLE (ArleiProd.dbo.FCRMVI).
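Since the output names repair_allow_data_loss as the minimum repair level, the standard sequence is a sketch like the one below; it is risky, as that option can discard data, so take a full backup first:

    ALTER DATABASE ArleiProd SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    DBCC CHECKTABLE ('FCRMVI', repair_allow_data_loss)
    ALTER DATABASE ArleiProd SET MULTI_USER

Note that fill factor has nothing to do with corruption; recurring consistency errors usually point at the I/O path (disk, controller, drivers), so the hardware is worth checking as well.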
Can anyone give me an idea what percentage of organizations use application code to maintain the parent-child relations on their tables, rather than having FK constraints in the db model? All the companies that I have worked with used code to control the relationships across the tables (not PK/FKs!). Thanks. Neil.
I have to synchronize 2 databases hourly but am having difficulty maintaining foreign key relations. These tables use auto-increment columns as primary keys, with child records in other tables related via foreign keys. I can't change the way the local software uses primary or foreign keys, as that is hardcoded in the local app (Microsoft Retail Management System); the web-remote app, however, is easily customized. I am using CDB Synchronizer to sync the two databases because the remote one is MySQL.
Example table layout: the Item table has an auto-increment primary key 'id'; the TransactionEntry table has its own auto-increment primary key 'id' and a foreign key 'item_id'.
Example of how the remote and local databases' foreign key relations become incorrect after a sync using CDB Synchronizer: at 8:00 am, on first installation of the database, the 'Item' tables' auto-increment 'id' columns match, with the last record's id value being '6'.
Locally, the following products are added:
11001 short sleeve t --- gets added with primary key 'id' of '7' in the 'Item' table
11002 long sleeve t --- gets added with primary key 'id' of '8' in the 'Item' table
Remotely, the following products are added:
21001 hipster jeans --- gets added with primary key 'id' of '7' in the 'Item' table
31001 overalls --- gets added with primary key 'id' of '8' in the 'Item' table
Remotely, someone orders 21001, so the TransactionEntry table records a sale with an "item_id" of '7'; but after the sync with our local server, the product with "item_id" of '7' is "short sleeve t".
At 9:00 the sync takes place, and the item_id foreign key isn't accurate because of the independent auto-increment values.
Whenever a product is ordered, the TransactionEntry table records the product's id value from its own local copy; after the sync, the 'item_id' field no longer matches the 'Item' table's id field, and the data about the transaction's product is lost.
I have read of solutions involving staging/temporary tables to cascade-update foreign keys before synching into the main database, but hopefully there is a more elegant solution for this. If that is the only way, will it be reliable? A foreign key mismatch seems like it could cause havoc.
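A common alternative to rewriting keys at sync time is to make the two sites generate non-overlapping ids in the first place; a hypothetical sketch in SQL Server syntax, giving the local site odd ids (MySQL has the equivalent server settings auto_increment_increment and auto_increment_offset for the remote side):

    -- Local SQL Server: ids 1, 3, 5, ...
    CREATE TABLE Item (
        id  INT IDENTITY(1, 2) PRIMARY KEY,   -- seed 1, increment 2
        sku VARCHAR(20) NOT NULL
    );
    -- The remote site would seed at 2 with the same increment: ids 2, 4, 6, ...

Whether this is feasible depends on being able to reseed the hardcoded RMS tables, so treat it as a direction to evaluate rather than a drop-in fix.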
I would like to maintain version control of all the SQL objects (stored procedures, views, tables) and keep source code versions under control. Is there any way to use TFS to maintain versions of SQL objects? Also, what folder structure works well in TFS?
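One lightweight approach, independent of any TFS tooling, is to script each object's definition out to files that get checked in; a sketch for pulling the definitions (sys.sql_modules covers procedures, views, functions, and triggers; tables would be scripted separately, e.g. with SMO):

    SELECT o.type_desc, s.name AS schema_name, o.name AS object_name, m.definition
    FROM sys.sql_modules AS m
    JOIN sys.objects AS o ON o.object_id = m.object_id
    JOIN sys.schemas AS s ON s.schema_id = o.schema_id
    ORDER BY o.type_desc, s.name, o.name;

Visual Studio database projects also version schema files directly in TFS, and their layout (one folder per object type: Tables, Views, Stored Procedures) is a sensible answer to the folder structure question.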
In Enterprise Manager, under Management and then SQL Server Logs, when I click on Current or Archive #1, #2, etc., nothing happens. This has been going on for the past 3 weeks; does anyone know what is causing such problems?
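As a stopgap while EM misbehaves, the logs can be read through T-SQL; a sketch using the undocumented but long-standing xp_readerrorlog:

    EXEC master.dbo.xp_readerrorlog        -- current log
    EXEC master.dbo.xp_readerrorlog 1      -- Archive #1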