Transaction Replication & Data Archiving on SQL Server 2000
Jul 23, 2005
Hi techies
I have set up transactional replication from my primary server to a
secondary server on the Orders table.
Thousands of records get inserted into Orders every hour, and they are
replicated to the secondary server. That part works fine.
Reporting apps use the secondary server's Orders table to generate
reports.
The problem:
Let's say I want to remove older records from the Orders table on the
primary server without reflecting this change on the secondary server.
Is there a way to PREVENT this operation/transaction from being propagated
to the secondary server?
Note: I am moving the records to another table (Orders_Archive) and
deleting the rows from the Orders table. I still need all the rows to be
present in the table on the secondary server.
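If the goal is simply that DELETEs on Orders never reach the subscriber, one option in SQL Server 2000 transactional replication is to change the article so that delete commands are not propagated at all. A minimal sketch, assuming the publication is named OrdersPub (a hypothetical name); note that this suppresses every delete on the article, not just the archiving deletes:

-- Stop replicating DELETE statements for the Orders article
-- (publication name below is an assumption).
EXEC sp_changearticle
    @publication = N'OrdersPub',
    @article     = N'Orders',
    @property    = N'del_cmd',
    @value       = N'NONE',
    @force_invalidate_snapshot = 1;   -- changing the article can invalidate the snapshot

After this change, the INSERT INTO Orders_Archive / DELETE batch on the publisher should no longer generate delete commands for the subscriber.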
We have two SQL Server 2005 production DBs at remote sites. Due to network bandwidth issues, we need to replicate these DBs (publishers and distributors) to a central corporate SQL 2000 DB (the subscriber), for backup, possible reporting, and in rare cases as a failover server.
We would start out with a backup from the SQL 2000 DB restored onto the remote SQL 2005 DBs. When we have a DB issue on a remote 2005 DB, we want to restore it from the central corporate 2000 DB backup. Since two DBs replicate to the central DB, we DO NOT want the combined backup data on the restored remote 2005 DB. We can restore the DB and delete the unwanted data before we turn replication back on from this restored server, so this part is not a problem.
The real problem is how to avoid the snapshot (during initialization) when we create transactional replication on this restored server, so that data on the central subscriber SQL 2000 DB is not overwritten.
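Since the subscriber already contains the data (it came from the same backup), one way to skip the snapshot is to create the subscription with the "replication support only" synchronization type, which tells SQL Server that the subscriber is already in sync. A rough sketch, run at the 2005 publisher; the publication, subscriber, and database names are placeholders:

-- Create the push subscription without generating or applying a snapshot.
EXEC sp_addsubscription
    @publication       = N'RemoteSitePub',              -- hypothetical publication name
    @subscriber        = N'CORP-SQL2000',                -- hypothetical central server name
    @destination_db    = N'CentralDB',                   -- hypothetical subscriber database
    @subscription_type = N'Push',
    @sync_type         = N'replication support only';    -- no snapshot is applied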
I want to replicate SQL Server 2000 data from a publisher to a subscriber so that new data is added and existing data is modified at the subscriber. Is this possible? What is the solution?
Hi, I'm in need of detailed information on how data replication works on SQL Server 2000. If someone knows of a book or articles that describe how data replication works with transactions, stored procedures and/or merging databases, I would like to be pointed in the right direction. /Zero_Addiction
I am looking at bidirectional transactional replication using updatable subscribers (queued or immediate). Is it possible to replicate image data from the updatable subscribers to the publisher? I understood that image data can't be replicated to the publisher from an updatable subscriber. I am not using WRITETEXT or UPDATETEXT; I am using just INSERT and UPDATE for the image data type transactions.
My problem is that I don't know how I can archive the data, that is, how to document when, who, etc. changed the data (in a separate table). I have tried to solve it with different triggers.
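For the who/when part, an AFTER trigger that copies the pre-change keys into a separate audit table is the usual pattern. A minimal sketch; the Orders table, its OrderID column, and the audit table are assumptions:

-- Hypothetical audit table: who changed which row, when, and how.
CREATE TABLE dbo.Orders_Audit (
    AuditID    INT IDENTITY(1,1) PRIMARY KEY,
    OrderID    INT      NOT NULL,
    ChangedBy  SYSNAME  NOT NULL DEFAULT (SUSER_SNAME()),
    ChangedAt  DATETIME NOT NULL DEFAULT (GETDATE()),
    ChangeType CHAR(1)  NOT NULL                -- 'U' = update, 'D' = delete
);
GO
CREATE TRIGGER trg_Orders_Audit
ON dbo.Orders
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- "deleted" holds the pre-change image of every affected row.
    INSERT INTO dbo.Orders_Audit (OrderID, ChangeType)
    SELECT d.OrderID,
           CASE WHEN EXISTS (SELECT 1 FROM inserted) THEN 'U' ELSE 'D' END
    FROM deleted AS d;
END;
GO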
I'm working on archiving data from some tables. I've duplicated the data structure, except that I did not include the IDENTITY specifier on the INT columns, so that the archive table will keep the value that was generated in the original table. This was all going well until I tried to copy over data where a column is specified as the timestamp data type. I've looked this up and found a couple of things. First, the documentation for SQL 2000 says,
Timestamp is a data type that exposes automatically generated binary numbers, which are guaranteed to be unique within a database. Timestamp is used typically as a mechanism for version-stamping table rows. The storage size is 8 bytes.
And then the documentation for the soon-to-be-released SQL Server 2016, on the rowversion data type, says,
The timestamp syntax is deprecated. This feature will be removed in a future version of Microsoft SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use this feature.
and
Is a data type that exposes automatically generated, unique binary numbers within a database. rowversion is generally used as a mechanism for version-stamping table rows. The storage size is 8 bytes. The rowversion data type is just an incrementing number and does not preserve a date or a time.
OK, I've read the descriptions, but I don't get it. Why have a timestamp/rowversion data type?
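The short version: it has nothing to do with dates; it is a cheap change detector. Because the value is bumped automatically on every insert or update, a client can remember the value it read and later make its UPDATE conditional on it, detecting whether anyone else changed the row in between (optimistic concurrency). A small sketch with hypothetical names (in SQL 2000 the column type is spelled timestamp rather than rowversion):

CREATE TABLE dbo.Widget (
    WidgetID INT PRIMARY KEY,
    Name     VARCHAR(50) NOT NULL,
    RowVer   ROWVERSION              -- changes automatically on every insert/update
);

DECLARE @VersionIRead BINARY(8);
SELECT @VersionIRead = RowVer FROM dbo.Widget WHERE WidgetID = 1;

-- ...time passes, the user edits the record on screen...

UPDATE dbo.Widget
SET    Name = 'New name'
WHERE  WidgetID = 1
  AND  RowVer = @VersionIRead;       -- 0 rows affected means someone else changed the row

IF @@ROWCOUNT = 0
    RAISERROR('The row was modified by another user since it was read.', 16, 1);

As for the archiving problem: you cannot insert explicit values into a timestamp column, so the usual workaround is to declare the corresponding column in the archive table as binary(8), which preserves the original value.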
How do I put data into a text or Excel file before I attempt a deletion from a large table? I know how to select the necessary data, but I'm not sure about the T-SQL required to put it into a file.
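One common route is bcp with the queryout option, called from xp_cmdshell (or from a plain command prompt), which writes the result of an arbitrary SELECT to a flat file that Excel can open. A sketch with hypothetical database, table, path, and server names:

-- Export the rows you are about to delete to a comma-delimited text file.
EXEC master.dbo.xp_cmdshell
    'bcp "SELECT * FROM MyDB.dbo.Orders WHERE OrderDate < ''20050101''" queryout "C:\archive\orders_pre2005.txt" -c -t, -T -S MYSERVER';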
I wrote a script to archive and delete records from a table back in 2005 and 2009.
I can't seem to get the syntax right. Any sample script to simply archive and delete records?
This is what I have so far.
DECLARE @ArchiveDate DATETIME
SET @ArchiveDate = (SELECT TOP 1 DATEPART(yyyy, Call_Date) FROM tblCall ORDER BY Call_Date)
--SELECT @ArchiveDate AS ArchiveDate
DECLARE @Active bit
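For what it's worth, a bare-bones archive-then-delete pattern (error handling omitted) looks roughly like the following; tblCall_Archive, the one-year cutoff, and the assumption that the archive table has the same columns as tblCall are all mine:

DECLARE @ArchiveDate DATETIME;
SET @ArchiveDate = DATEADD(yyyy, -1, GETDATE());   -- archive everything older than one year

BEGIN TRANSACTION;

    -- Copy the old rows into the archive table first...
    INSERT INTO dbo.tblCall_Archive
    SELECT * FROM dbo.tblCall
    WHERE Call_Date < @ArchiveDate;

    -- ...then remove them from the live table.
    DELETE FROM dbo.tblCall
    WHERE Call_Date < @ArchiveDate;

COMMIT TRANSACTION;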
Regarding SQL Server data, I am looking to implement the best data-archive and purge policy. Normally we do SQL backups and keep the history for some period, for example 8 weeks, so we can go back and restore to any point in time up to 8 months in the past, and we also do tape backups.
The question is: where can I get a good article or documentation on how to best design such a policy, so that I am covered both for point-in-time recovery of the database (via SQL backups) and for point-in-time recovery in the far past, say 3 years ago, using tape backups, without repeating the same efforts?
Is there a way, for example, to script a DTS package so that it can be deleted and recreated at a later date if necessary? I have quite a lot of these, but few are used regularly, and the msdb database is now up to 80 MB. However, I don't want to delete them and have no way to recreate them. If I took a backup of msdb and then deleted the packages, would restoring msdb at a later date restore the packages?
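DTS packages that are saved to SQL Server are stored in the sysdtspackages table in msdb, so a backup of msdb does capture them, and restoring that backup later brings the packages (including their saved versions) back. A trivial sketch, with a hypothetical path:

-- Back up msdb (which holds the saved DTS packages) before deleting any of them.
BACKUP DATABASE msdb
TO DISK = N'D:\Backups\msdb_with_dts_packages.bak'
WITH INIT;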
I have been placed in the position of administering our Microsoft SQL Server 6.5. Being new to SQL and having some knowledge of databases (I used to use FoxPro 2.6 for... DOS!), I am faced with an ever-increasing table of incoming call information from our Ascend MAX RAS equipment. This table grows by 900,000 records a month. The previous administrator (no longer available) was using a Visual FoxPro 5 application to archive and remove the data older than 60 days. Unfortunately he left and took Visual FoxPro and all of his project files with him.
My question is this: is there an easy way to archive and then remove the data older than 60 days from the table? I would like to archive it to a tape drive. We need to maintain this archive so we can search back through customer calls for IP addresses on certain dates and times. We are an ISP and occasionally need to give this information to law enforcement agencies, so we cannot just delete it.
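A rough sketch of the archive/purge, written to stay within SQL Server 6.5 syntax; the table, column, and batch size are assumptions. The idea is to copy everything older than 60 days into an archive table, then delete in small batches (via SET ROWCOUNT) so the transaction log stays manageable; the archive table can then be exported with bcp and the file written to tape.

DECLARE @Cutoff DATETIME
SELECT @Cutoff = DATEADD(day, -60, GETDATE())

-- 1. Copy the old rows into the archive table (assumed to have the same columns).
INSERT INTO CallLog_Archive
SELECT * FROM CallLog
WHERE CallDate < @Cutoff

-- 2. Delete the same rows in batches of 10,000 to keep each transaction small.
SET ROWCOUNT 10000
DECLARE @Rows INT
SELECT @Rows = 1
WHILE @Rows > 0
BEGIN
    DELETE FROM CallLog WHERE CallDate < @Cutoff
    SELECT @Rows = @@ROWCOUNT
END
SET ROWCOUNT 0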
We are implementing 2005 transactional replication from a source database to a target staging (subscriber) database, but we want to keep all transaction changes from the source within the staging subscriber tables. If a source row gets updated, we want to keep both the old record and the new updated record on the staging subscriber. Transactional replication synchronizes but does not keep history on the subscriber. Do we modify the replication stored procedures? Does anyone have example code or ideas?
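One approach that avoids editing the generated sp_MSins_/sp_MSupd_/sp_MSdel_ procedures is an ordinary trigger on the subscriber's staging table: triggers there fire when the distribution agent applies changes, as long as they are not marked NOT FOR REPLICATION. A sketch with hypothetical table and column names, capturing the pre-change image of each updated or deleted row:

CREATE TRIGGER trg_StageOrders_History
ON dbo.StageOrders                        -- subscriber-side staging table (hypothetical)
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- "deleted" holds the old values of every row being replaced or removed.
    INSERT INTO dbo.StageOrders_History (OrderID, OrderDate, Amount, ArchivedAt)
    SELECT d.OrderID, d.OrderDate, d.Amount, GETDATE()
    FROM deleted AS d;
END;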
I'm trying to create transactional replication from SQL Server 2000 to 2005. Basic replication between the servers works just fine. However, what I want to accomplish is to be able to skip some of the transactions. For example, from time to time we want to purge some of the historical data from the main database (the publisher). We don't want the same purge to occur on the destination database, which will be used for reporting purposes and needs to include all the historical information. I wanted to simply stop the replication log reader, purge the records, back up the transaction log with truncation, and then restart the reader. The only problem is that the truncation on the replicated database keeps the transactions from the purge until they are replicated, so the transaction log backup doesn't help. Any ideas would be greatly appreciated!
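One technique sometimes suggested for this: wrap the purge in a stored procedure and publish that procedure as a serializable proc-exec article, so that only the procedure call is replicated instead of the individual deletes; the subscriber's copy of the procedure is then altered to do nothing, and the purge happens only on the publisher. A sketch with hypothetical names, run when adding the article on the 2000 publisher:

-- Publish the purge procedure for execution rather than replicating its row changes.
EXEC sp_addarticle
    @publication   = N'MainPub',                  -- hypothetical publication name
    @article       = N'usp_PurgeHistory',
    @source_object = N'usp_PurgeHistory',
    @type          = N'serializable proc exec';   -- the call is replicated only when run in a serializable transaction

-- At the subscriber, ALTER PROCEDURE usp_PurgeHistory to an empty body so the
-- replicated call deletes nothing there.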
I have a setup of transactional replication between one publisher and one subscriber on the same server. Now I need to add a new subscriber to the existing publisher. The publisher database name is DB_A and subscriber 1 is DB_B, so the new subscriber will be DB_C. Is this kind of setup possible on one server?
If yes, then at the time of reinitialization is it going to apply the snapshot to DB_B as well as DB_C? Also, let's say that due to a disk error DB_B gets corrupted; will data still be replicated between DB_A and DB_C? (Assuming the publisher, subscriber 1 and subscriber 2 sit on individual disks.)
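Multiple subscription databases on the same instance are fine, and each subscription is tracked independently, so a problem in DB_B does not by itself stop delivery to DB_C. In general, adding DB_C does not reinitialize DB_B; the snapshot is applied only to subscriptions being (re)initialized. A rough sketch of adding the second push subscription (the publication name is an assumption, and on 2005 a distribution agent job also has to be added for it, e.g. with sp_addpushsubscription_agent):

DECLARE @srv SYSNAME;
SET @srv = @@SERVERNAME;                  -- publisher and subscriber are the same instance

EXEC sp_addsubscription
    @publication       = N'DB_A_Pub',     -- hypothetical publication name in DB_A
    @subscriber        = @srv,
    @destination_db    = N'DB_C',
    @subscription_type = N'Push',
    @sync_type         = N'automatic';    -- the new subscriber is initialized from a snapshot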
Hello, does anyone know of a way to schedule the archiving of Analysis Services databases? Seems pretty lame if you can't... The only answer I've gotten is "maybe in Yukon". Thanks.
We have Microsoft SQL Server 2000 SP3 running the database for Microsoft Navision 3.7.
From time to time we encounter problems with the transaction log, especially when running heavy query procedures from Navision. It's actually set up as follows:
We get the following errors (once every 2-3 months so far):
The log file for database 'ME_Prod' is full. Back up the transaction log for the database to free up some log space..
In between the numerous above-mentioned messages I also have the following: Configuration option 'show advanced options' changed from 1 to 1. Run the RECONFIGURE statement to install..
Could not write a CHECKPOINT record in database ID 9 because the log is out of space.
Automatic checkpointing is disabled in database 'ME_Prod' because the log is out of space. It will continue when the database owner successfully checkpoints the database. Free up some space or extend the database and then run the CHECKPOINT statement.
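The immediate fix those messages point to is to back up (and then keep regularly backing up) the ME_Prod transaction log so its space can be reused, assuming the database uses the full recovery model; the backup path below is hypothetical:

-- How full is each database's log right now?
DBCC SQLPERF(LOGSPACE);

-- Back up the log so the inactive portion can be reused.
BACKUP LOG ME_Prod TO DISK = N'E:\Backups\ME_Prod_log.trn';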
Hi all, I am getting an error when trying to open a recordset in SQL Server 2000. The error states that the transaction log is full. Is there any way I can clear out or empty the transaction log, or get rid of it altogether, as it is not really needed? Any help would be appreciated. Thanks and regards, Ryan
Hey folks! I have a typical requirement from my client. On submitting a bulk Update button, a huge database operation starts: a large bulk update that can sometimes take 2-3 minutes. The client wants a Cancel button in this case, so the user has a way to cancel the database transaction. Please let me know if there is a way out. Thanks in advance. Regards, Uday.D
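There are two usual angles on this: the application can call Cancel on the running command (in ADO.NET, SqlCommand.Cancel) and roll the transaction back, or the update itself can be broken into small committed batches with a cancel flag checked between batches, so pressing Cancel simply stops further batches. A sketch of the second approach with hypothetical table and column names (SET ROWCOUNT keeps it SQL 2000-friendly):

-- Control table whose flag the application's Cancel button sets to 1.
CREATE TABLE dbo.BulkUpdateControl (CancelRequested BIT NOT NULL);
INSERT INTO dbo.BulkUpdateControl VALUES (0);

SET ROWCOUNT 5000                        -- batch size
WHILE 1 = 1
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.BulkUpdateControl WHERE CancelRequested = 1)
        BREAK                            -- user pressed Cancel; completed batches stay committed

    UPDATE dbo.Orders
    SET    Processed = 1
    WHERE  Processed = 0

    IF @@ROWCOUNT = 0 BREAK              -- nothing left to update
END
SET ROWCOUNT 0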
Hi all, I am getting an error when trying to open a recordset in SQL Server 2000. The error states that the transaction log is full. Is there any way I can clear out or empty the transaction log, or get rid of it altogether, as it is not really needed? Any help would be appreciated. Thanks and regards, Ryan. The error message is: "tempdb transaction log is full. B/U transaction log to free up space..."
(I may be in the wrong forum.) How do I obtain the properties of a table using SQL Query Analyzer (SQL Server 2000)? Specifically, I would like to run a query to find the most recent date of any transaction on a table. I have a script that I use for SQL Server 2005, but it doesn't work in 2000. I don't know 2000, but I'm guessing the syntax is different?
Here's the 2005 SQL Server script (stolen from 2005 Books Online BTW):
I know I can use SET IMPLICIT_TRANSACTIONS ON in a stored procedure to force SQL Server to put the connection into implicit transaction mode.
Is there a statement or configuration option to force all SQL Server connections into implicit transaction mode?
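The server-wide "user options" setting can do this: it is a bit mask applied to every new connection's SET options, and bit 2 is IMPLICIT_TRANSACTIONS. A sketch; if other bits are already in use, add 2 to the current value rather than overwriting it, and note that many client drivers issue their own SET statements that can override this:

EXEC sp_configure 'user options', 2;   -- 2 = IMPLICIT_TRANSACTIONS bit; affects new connections only
RECONFIGURE;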
I am able to archive the Analysis Services databases using msmdarch on one server but not on the other. (It works in the Test environment but not in the Production environment.) Here are the details:
Both servers are SQL Server 2000 SP1. I compared sp_configure and both servers have the same settings. The Production server is clustered and the Test server is not.
I can archive in TEST using the following:
- From Analysis Manager
- Using the msmdarch command from a command prompt
- From a scheduled job, using the "Operating System Command" type in a job step and entering the msmdarch command in the Command box
- Using the msmdarch command with xp_cmdshell, from Query Analyzer and from a stored procedure

I can archive in PRODUCTION only using:
- Analysis Manager
- The msmdarch command from a command prompt

It does not work from a job or from Query Analyzer.
I have checked the permissions of the SQL Server Agent account on both servers and they are identical, with sa rights and membership in the OLAP Administrators group.
Does anyone have any thoughts on what could be causing this?
I have to set up archiving of some pretty large tables (125M+ rows) and I'm trying to figure out the best way to do it.
I've been using OUTPUT into, something like this:
DELETE TOP (100000) a
OUTPUT deleted.RecordID INTO dbo.RecordsArchive (RecordID)
FROM dbo.Records AS a
It's difficult to tell whether it's sub-optimal code or just sheer volume, but is there any performance penalty to using OUTPUT INTO in this way? Or is it advantageous to insert the records into the archive table first and then delete them, as two separate operations?
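In general the single-statement DELETE ... OUTPUT reads the rows once, whereas a separate INSERT-then-DELETE touches them twice and has to be wrapped in a transaction to stay consistent, so the OUTPUT form is usually the cheaper of the two; the cost you are seeing is more likely just volume. For 125M+ row tables the usual refinement is to loop in modest batches so each delete stays short. A sketch, where the date column and cutoff are assumptions:

WHILE 1 = 1
BEGIN
    DELETE TOP (100000) a
    OUTPUT deleted.RecordID INTO dbo.RecordsArchive (RecordID)
    FROM dbo.Records AS a
    WHERE a.RecordDate < '20120101';     -- hypothetical archive cutoff

    IF @@ROWCOUNT = 0 BREAK;             -- nothing older than the cutoff remains
END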
Has anybody encountered a physical size limit for a SQL Server 2000 transaction log running on Win2K?
The transaction log reached ~6 GB before rolling back the delete, stating that the transaction log was full. There was 42 GB free on the server and the log was set to unlimited growth.
Can anyone help me with this scenario? I have a Sybase database and a SQL Server 2000 database. I want to insert data into a Sybase database table through SQL Server 2000 using distributed queries. When I execute the following procedure, the transaction fails:
CREATE PROCEDURE myCurrentDataBaseProcedure
AS
BEGIN
    BEGIN TRAN

    INSERT INTO mytable VALUES (1)
    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION
        RETURN
    END

    INSERT INTO sybasedatabaseserver.databasename.dbo.tablename VALUES (1)
    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION
        RETURN
    END

    COMMIT TRANSACTION
END
The procedure is created in the SQL Server database; when I try to execute it, it shows an error. The first part of the procedure is executed.
But the error is here: insert into sybasedatabaseserver.databasename.dbo.tablename values(1). The data is successfully inserted into the local database, but I am unable to insert data into the remote database. Can anyone suggest what I should do in this scenario? Are there any drivers to be loaded to commit these transactions?
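For the local and remote inserts to commit or roll back together, the transaction has to be a distributed one: MSDTC must be running on the SQL Server machine, and the linked-server provider used for Sybase has to support distributed transactions (which is where such setups most often fail). A sketch of the same logic with the transaction promoted; object names are taken from the post above:

BEGIN DISTRIBUTED TRANSACTION

INSERT INTO mytable VALUES (1)
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN
END

-- The linked-server insert is now enlisted in the same distributed transaction.
INSERT INTO sybasedatabaseserver.databasename.dbo.tablename VALUES (1)
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN
END

COMMIT TRANSACTION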