SQL Server 2008 :: Initialize Transactional Publication From Backup Loss Of Data
Mar 18, 2015
I have an automated process that synchronizes a transactional publication using the initialize-from-backup approach. It drops the subscriptions and puts them back again once the restore on the subscriber is completed.
Dropping the subscriptions causes a lot of blocking and deadlocking. I've decided to remove those steps, but doing so causes loss of data on the subscriber.
Is it necessary to drop and re-create the subscriptions during such a process? If not, how can I avoid the loss of data?
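For reference, a minimal, hedged sketch of re-adding a subscription so that it picks up from the restored backup rather than waiting for a snapshot (the publication, server, database, and backup path below are placeholders, not the poster's actual script):
-- Hedged sketch: re-create the subscription with sync_type 'initialize with backup'
-- so the Distribution Agent resumes from the restore point instead of needing a snapshot.
EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'SubscriberServer',
    @destination_db = N'SubscriberDB',
    @subscription_type = N'Push',
    @sync_type = N'initialize with backup',
    @backupdevicetype = N'disk',
    @backupdevicename = N'D:\Backups\PublisherDB.bak';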
We have set up transactional replication across several databases using SQL Server 2000, spread across multiple sites in a fully connected network. There is one main table from which data is replicated from the publisher to the destination. Horizontal filtering is used on this table to route the records to the correct database (site). We have observed that documents/records are getting lost between some sites: say 10 documents are sent from the publishing database, but only 5 are received at the destination database, even though the sent history for all 10 documents is available at the publishing database.
Can anyone advise on how to analyse and resolve this problem? Could an unreliable network be the issue? If the network is unreliable and the connection is lost during replication, how does replication ensure that no data is lost?
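One hedged way to quantify what is missing (the publication name is a placeholder) is to run row-count validation at the publisher and read the results from the Distribution Agent history / Replication Monitor:
-- Hedged sketch, run in the publication database: row-count validation for every article
EXEC sp_publication_validation
    @publication = N'MyPublication',
    @rowcount_only = 1;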
I have been trying to set up transactional replication between an OLTP SQL Server 2005 Standard Edition server and a BI SQL Server 2005 Enterprise Edition server, using a pull subscription. I was able to establish replication from the development server to the BI server; however, when I tried setting up replication between the production environment and the BI server, using the same configuration settings, I ran into a problem. The production server had been renamed after it was initially set up. I used sp_dropserver and sp_addserver to address this and was able to run the replication wizard. However, once the wizard finishes, the snapshot agent doesn't initialize the publication. When I look at the Replication Monitor it has a status of "never run". If I try to run it manually, it appears to run for a few seconds, then stops and still shows "never run". I don't get any error messages and can't find anything in the log. I am wondering if this is related to the server rename? I am hoping to find a solution that doesn't involve uninstalling and reinstalling SQL Server, since this is a production environment.
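A quick, hedged sanity check after a rename is to confirm that @@SERVERNAME matches the local server entry, and to repeat the drop/add plus a service restart if it does not (the server names below are placeholders):
-- Hedged sketch: verify the instance name metadata after the rename
SELECT @@SERVERNAME AS reported_name;
SELECT srvname FROM master.dbo.sysservers WHERE srvid = 0;  -- local server entry
-- If the two disagree, repeat the rename fix and restart the SQL Server service:
-- EXEC sp_dropserver 'OldServerName';
-- EXEC sp_addserver 'NewServerName', 'local';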
Subscription to "Transactional Publication with Updateable Subscriptions" works only one way. Changes take effect on subscriber, but the subcriber is unable to update data on publisher.
I have Sanpshot Agent process running under SQL Server Agent service account with login 'sa.' All agents are running at the Distributor (Publishing Server.)
The subscriber is unable to connect to the Distributor using the SQL Server login.
Following is the error message I get:
Creating Subscription(s)...
- Creating subscription for 'SQL3' (Warning)
Messages
Unable to set the Publisher login for the updatable subscription. You may have to set this up directly on the Subscriber machine using sp_link_publication. (New Subscription Wizard)
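The wizard warning points at sp_link_publication, which has to be run in the subscription database at the Subscriber. A minimal, hedged sketch under placeholder names (security_mode 0 means SQL Server Authentication with the supplied login):
-- Hedged sketch, run at the Subscriber; server, database, publication, and credentials are placeholders
EXEC sp_link_publication
    @publisher = N'PublisherServer',
    @publisher_db = N'PublisherDB',
    @publication = N'MyPublication',
    @security_mode = 0,                 -- SQL Server Authentication with the login below
    @login = N'repl_link_login',
    @password = N'StrongPasswordHere';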
I have a pretty big (350 GB) OLTP database that I want to replicate in its entirety. I'm concerned about the impact of taking a snapshot of it (it is processing at some level pretty much 24x7). I know SQL 2005 has the option to initialize from backup, but unfortunately we won't be on 2005 in time.
I'm thinking of doing something like this:
- Set up the distributor, publication, and subscription
- Turn off the distribution agent
- Set the publisher to "sync with backup"
- Back up the publisher, full then log
- Truncate tables MSrepl_transactions and MSrepl_commands in the distribution db (I don't have any other replication going on)
- Turn off "sync with backup"
- Restore the full and tran log backups to the new subscriber db
- Create subscriber stored procs in the subscriber
- Start up the distribution agent
I'm looking for opinions on whether it's worth going this route to avoid taking the snapshot. Data integrity is the number one priority -- if I have to do a snapshot to ensure that, I will do it.
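For what it's worth, a hedged sketch of the "sync with backup" toggle used in the plan above (the database name is a placeholder):
-- Hedged sketch: the toggle used in steps 3 and 6 of the plan
EXEC sp_replicationdboption
    @dbname = N'PublisherDB',
    @optname = N'sync with backup',
    @value = N'true';    -- set back to N'false' after the full and log backups complete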
Data got deleted on Friday evening. I need to have the database restored to Friday afternoon, but some data was also entered on Monday, and that needs to be there too.
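A hedged sketch of the point-in-time part (database name, paths, and timestamp are placeholders). Note that a STOPAT restore discards everything after the chosen time, so the Monday data would still have to be re-entered or merged from a separate copy:
-- Hedged sketch: restore to Friday afternoon from the full backup plus the log backups
RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\MyDb_full.bak' WITH NORECOVERY, REPLACE;
RESTORE LOG MyDb FROM DISK = N'D:\Backups\MyDb_log.trn'
    WITH STOPAT = '2015-03-13 15:00:00', RECOVERY;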
Hi everybody. I'm working with the transactional publication with updatable subscriptions provided by SQL Server 2005. Replication works pretty well from the publisher to the subscriber, but I'm having some problems when the data must go from the subscriber to the publisher.
When I do an update in a subscriber's table, the database engine shows the following error:
21064 - 16 - The subscription is unavailable for immediate updating because it is marked for reinitialization. Try again after the reinitialization completes.
and rolls back the transaction.
Does anybody know how to solve this problem?
The publisher is a Windows XP machine with SQL Server 2005 Developer Edition with SP2; the subscriber is a Windows 2003 Server machine with SQL Server 2005 Developer Edition without SP2. I'm also using immediate updating subscriptions. Both operating systems have MSDTC running.
Thank you in advance.
Sebastian.-
PS: Sorry about my english, it's been a long time without using it.
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
------------------------------
Automatic identity range support is useful only for publications that allow updating subscribers. Changed database context to 'sodimprumde4'. (Microsoft SQL Server, Error: 21231)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=21231&LinkId=20476
Thanks for any help if anyone knows how to resolve this problem.
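For what it's worth, error 21231 is raised when automatic identity range management is requested for an article in a publication that does not allow updating subscribers. A hedged sketch of adding the article with manual identity range management instead (all names are placeholders):
-- Hedged sketch: use manual identity range management for a read-only subscriber
EXEC sp_addarticle
    @publication = N'MyPublication',
    @article = N'MyTable',
    @source_object = N'MyTable',
    @identityrangemanagementoption = N'manual';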
We are new to replication and are testing it in our development environment. We have a peer-to-peer transactional publication on our three servers. The single table in the original publication replicated fine to the two subscribing servers. We next added a new table (article) to the publication. Adding it to the original publication worked fine, but the table did not replicate to the other servers. (We had previously changed the schema of the original table, and the schema changes replicated properly.) We attempted to recreate the snapshot using the "View Snapshot Agent Status" option. Clicking the Start button resulted in this message: "[0%] A snapshot was not generated because no subscriptions needed initialization."
This seems odd because a new table was added to the publication, and Microsoft help states that the snapshot must be rebuilt. I have read other topics that refer to an @immediate_sync property that must be set to zero. I'm not sure if this is our problem or even how to set this value. Meanwhile, the other servers, as viewed through the Replication Monitor, are complaining that their snapshots do not match the publication snapshot.
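If @immediate_sync turns out to be the cause, a hedged sketch of switching it off so the next snapshot only covers what needs initialization (the publication name is a placeholder; allow_anonymous has to be turned off first because it depends on immediate_sync):
-- Hedged sketch, run in the publication database, then restart the Snapshot Agent
EXEC sp_changepublication @publication = N'MyPublication',
     @property = N'allow_anonymous', @value = N'false';
EXEC sp_changepublication @publication = N'MyPublication',
     @property = N'immediate_sync', @value = N'false';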
I have configured a transactional publication where some tables have a simple row filter, and I have created a subscription to this publication. When I generate the snapshot and initialize replication, all rows are passed to the subscriber correctly. But when I change a column in a row included by the filter at the publisher, it is never updated on the subscriber. Other tables, and that same table when I remove the filter, are updated correctly.
What could be the problem?
I am using SQL Server 2005 Standard for the publisher and SQL Server 2005 Express for the subscriber, both with SP2.
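One hedged thing worth checking: when a row filter is added or changed on an existing article, the filter's synchronization view has to be regenerated as well, and a new snapshot applied. A sketch with placeholder names and an example filter clause:
-- Hedged sketch: re-apply the filter and regenerate its view, then run the Snapshot Agent again
EXEC sp_articlefilter
    @publication = N'MyPublication',
    @article = N'MyTable',
    @filter_name = N'FLT_MyTable',
    @filter_clause = N'[Region] = ''West''',
    @force_invalidate_snapshot = 1;
EXEC sp_articleview
    @publication = N'MyPublication',
    @article = N'MyTable',
    @view_name = N'VW_MyTable',
    @filter_clause = N'[Region] = ''West''',
    @force_invalidate_snapshot = 1;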
I have transactional replication configured where the publisher and subscriber are on two different servers. Yesterday a database upgrade was carried out, and the DBA dropped replication by issuing sp_removedbreplication on the published database. The subscription is still set up.
I have two questions:
1 - What is the safest way to temporarily switch off replication without losing the publication or subscription? As far as I was aware (my replication knowledge isn't great), simply disabling the relevant agents would do the job.
2 - I now have the task of creating the publication again (fortunately we have a saved script). If I recreate this publication, will I be able to point the existing subscription at it?
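On question 1, a hedged sketch of pausing replication by disabling the agent jobs rather than removing anything (the job name below is a placeholder; the real names can be read from msdb.dbo.sysjobs or the Job Activity Monitor):
-- Hedged sketch, run on the server hosting the agents (usually the Distributor):
-- disable the Log Reader and/or Distribution Agent jobs, re-enable them later with @enabled = 1
USE msdb;
EXEC sp_update_job @job_name = N'PUBSRV-PublisherDB-MyPublication-SUBSRV-1', @enabled = 0;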
I have an existing publication in SQL 2012 with 2 articles, and then I add 2 more articles. After that, when I generate a snapshot, will it be generated for the 2 new articles only or for all 4 articles?
I remember adding 1 new article to an existing publication with 150 articles, and when I generated a snapshot it was generated only for that 1 article. But I don't remember clearly.
Does it behave differently for small and large numbers of articles?
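A hedged way to check which behaviour to expect (run in the publication database): if immediate_sync is 1, the Snapshot Agent produces a snapshot of every article each time; if it is 0, only articles and subscriptions that need initialization are included. The article count itself should not matter.
-- Hedged sketch: inspect the publication's snapshot behaviour flags
SELECT name, immediate_sync, allow_anonymous
FROM dbo.syspublications;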
We have restored a database on a new server without keeping the replication settings. Now, while creating the publication, no tables are shown in the New Publication Wizard window. In fact we have hundreds of tables in the database, and they were included in replication on the source server from which the backup was taken.
What could be the reason the tables are not appearing?
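A hedged first check (run in the restored database, assuming SQL 2005 or later) is whether the tables are still flagged as published from the old server, since stale flags can hide them from the wizard:
-- Hedged sketch: look for stale replication flags left over from the source server
SELECT name, is_published, is_merge_published
FROM sys.tables
WHERE is_published = 1 OR is_merge_published = 1;
-- Possible cleanup if stale flags are found (database name is a placeholder):
-- EXEC sp_removedbreplication N'RestoredDB';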
How do we know our schema changes will propagate to the subscriber without breaking replication? Is there a T-SQL command to find out whether the option is true or false?
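There is a publication-level option for this; a hedged sketch of checking it (the publication name is a placeholder):
-- Hedged sketch: the replicate_ddl column in the result (1 = schema changes are propagated)
EXEC sp_helppublication @publication = N'MyPublication';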
Is there a script to find which non-clustered indexes are replicated? I know I can do this easily through the GUI, but having a script would make my life much easier.
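A hedged sketch (run in the publication database, assuming SQL 2005 or later): nonclustered indexes are copied to the subscriber only when the article's schema_option includes bit 0x40, so filtering on that bit and joining to sys.indexes lists the indexes involved:
-- Hedged sketch: nonclustered indexes on articles whose schema options include 'copy nonclustered indexes' (0x40)
SELECT a.name AS article, i.name AS index_name
FROM dbo.sysarticles AS a
JOIN sys.indexes AS i ON i.object_id = a.objid
WHERE i.type = 2                                       -- nonclustered
  AND CONVERT(bigint, a.schema_option) & 0x40 <> 0;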
I am getting the error "The transaction log for database 'ReplicationDB' is full due to 'LOG_BACKUP'". The log_reuse_wait_desc column in sys.databases is showing LOG_BACKUP.
The database is a subscriber database; we have configured transactional replication, but replication is getting errors and failing. Is there a relation between these replication failures and the log growth in the subscriber database?
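A hedged sketch of confirming the wait reason and freeing the log (the backup path is a placeholder); the replication failures would be chased separately in Replication Monitor:
-- Hedged sketch: confirm why the log cannot be reused, then take a transaction log backup
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'ReplicationDB';

BACKUP LOG ReplicationDB TO DISK = N'D:\Backups\ReplicationDB_log.trn';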
We have a large database with a small number of large tables in it (and a larger number of smaller tables), and it is a publisher in a transactional replication scenario. When I create a snapshot to initialize a new subscription, I notice that with the larger tables it sometimes generates multiple files in the snapshot folder, usually in multiples of 16, numbered sequentially.
With other tables, I'll get just one LARGE snapshot file, named:
MyOtherTable_4.bcp
In the latter case, the file can be very large (most recent is 38GB).
In both cases, the subscription will eventually be initialized, but the smaller files will generate separate log entries every few minutes in the Replication Monitor, showing 'Bulk copied data into 'MyTable' (34231221 rows)', whereas the larger table will generate only one log entry, showing 'Bulk copying data into table 'MyOtherTable'', and it may take a couple of hours before anything else shows up... except for an entry saying, 'The process is running and is waiting for a response from the server.'
My question is: what would be the difference between the two tables that would result in one generating MULTIPLE snapshot files, the other only a single, much larger one? The only difference I can see in the table definition is that the one generating multiple files has a clustered index, whereas the others do not.
We want to rebuild our main SQL Server machine. How can we back up everything about the SQL Server instance (all databases/objects as well as security and user settings) and then recover it without any data loss? A related question is how to recover the server machine in case of system failure or a complete machine crash. Thanks!
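A hedged sketch of the backup side (paths and the user database name are placeholders): the user databases plus the system databases that hold logins, jobs, and server-level settings. Note that restoring master onto the rebuilt machine requires starting the instance in single-user mode.
-- Hedged sketch: system databases carry the server-level state
BACKUP DATABASE master   TO DISK = N'D:\Backups\master.bak';    -- logins, server configuration
BACKUP DATABASE msdb     TO DISK = N'D:\Backups\msdb.bak';      -- SQL Agent jobs, operators, backup history
BACKUP DATABASE model    TO DISK = N'D:\Backups\model.bak';     -- template for new databases
BACKUP DATABASE MyUserDb TO DISK = N'D:\Backups\MyUserDb.bak';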
There is a SQL Server 2008 R2 SP3 Clustered Instance that has Transactional Replication. It is by no means a large replication setup in terms of data/article count. SQL Server was recently patched to SP3 and is current on Windows 2008 R2 Patches.
When I added a new article to replication (via 2014 SSMS GUI) it seems to add everything correctly (replication tables/procs show the new article as part of the publication). The Publication is set to allow the snapshot to generate for just new articles (setting immediate_sync & allow_anonymous to false).
When the snapshot agent is run, it runs without error and claims to have generated a snapshot of 1 article. However the snapshot folder only contains a folder for the instance (that does have the modified time of the snapshot agent execution) and none of the regular bcp/schema files.
The tables never make it to the subscribers and replication continues on without error for the existing articles. No agents produce any errors and running the snapshot agent w/ verbose output provides no errors or insight into any possible issues.
I have tried:
- Dropping and re-adding the article in question
- Setting up a new snapshot folder
- Validating all the settings and configurations
I'm hesitant to reinitialize a subscriber since I am not confident a snapshot can be generated. I'm also wondering whether this is related to the SP3 upgrade; new articles are added to the publication every few months, and this is the first time it has been done since the upgrade to SP3.
I'm trying to initialize a subscriber from a backup for a pull subscription. The publisher was started before the full backup was made. I followed the instructions in the Books Online topic "Initializing a Transactional Subscription Without a Snapshot" http://msdn2.microsoft.com/en-us/library/ms151705.aspx and "How to: Initialize a Transactional Subscriber from a Backup (Replication Transact-SQL Programming)" http://msdn2.microsoft.com/it-it/library/ms147834.aspx. I initialized the subscriber by restoring the database before running the create-subscription scripts. All agents run successfully except for the following:
The network setup is that the publisher and the subscriber are in two different domains, with the distribution database at the publisher. The subscriber agent is able to connect and run; however, the error on the subscriber side is:
"Agent message code 14080. The remote server "subscriber" does not exist, or has not been designated as a valid Publisher, or you may not have permission to see available Publishers."
The warning on the publisher side is in the Replication Monitor, where the status says 'Uninitialized Subscription'.
Any ideas on why it says it isn't initialized? Is there a StoredProc that I have to execute to initialize from backup?
Hi everyone, I am new to SQL Server 2005. I had set up SQL Server 2005 P2P replication. Somehow one of the two replication directions stopped working, so I tried to delete the publication, but I could not do it. When I try to delete the publication, I get 'The publication " " does not exist' [SQL Server error: 20026]. I tried to use sp_droppublication, and it gave me the error 'the database is not enabled for publication'. Nevertheless, I can see the publication in SQL Server Management Studio and Replication Monitor with an OK status. I could not find the distribution database either.
Does anyone have ideas on how to delete this publication? I am sorry, I am not a programmer, so please give me a more detailed explanation if you can. Thanks.
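One hedged option when the normal drop fails like this is to clear the leftover publishing metadata from the database (the database name is a placeholder). Be aware that this removes all replication objects and settings from that database, so it should only be run once nothing in it is meant to stay published:
-- Hedged sketch: force removal of leftover replication objects and settings in the database
EXEC sp_removedbreplication N'MyP2PDatabase';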
SQL Server 2008 R2, 6 GB memory. I attempted a backup of a 500 GB database, but it was taking way too long. I checked the resources on the box and saw the CPU at 100%. I checked the SQL Server activity log and saw a hung query (the user was not even logged on) that had multiple threads, so I killed it, and now the CPU utilization is back to normal.
Trouble is, now all of the threads in the activity monitor for the backup show 'suspended' and the backup appears not to be doing anything.
I've written a custom script to delete backup files from a location, but I'm unable to modify it to count the number of files that are deleted. How can I modify the script?
/* Script to delete older than N days backup from a specific directory */
USE [db_admin]
GO
IF OBJECT_ID('usp_DeleteBackup', 'P') IS NOT NULL
    DROP PROC usp_DeleteBackup
GO
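Without seeing the rest of the procedure, here is one hedged way to report the count, assuming the files live under a folder like the placeholder below and that xp_cmdshell is enabled: capture the file list before and after the existing delete logic and print the difference.
-- Hedged sketch: count backup files before and after the existing delete step
DECLARE @files TABLE (fname nvarchar(260));

INSERT INTO @files (fname)
EXEC xp_cmdshell 'dir /b "D:\Backups\*.bak"';
DELETE FROM @files WHERE fname IS NULL;            -- xp_cmdshell appends a trailing NULL row
DECLARE @before int;
SELECT @before = COUNT(*) FROM @files;

-- ... existing delete logic of usp_DeleteBackup runs here ...

DELETE FROM @files;
INSERT INTO @files (fname)
EXEC xp_cmdshell 'dir /b "D:\Backups\*.bak"';
DELETE FROM @files WHERE fname IS NULL;
DECLARE @after int;
SELECT @after = COUNT(*) FROM @files;

PRINT 'Deleted ' + CAST(@before - @after AS varchar(10)) + ' backup file(s).';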
I am planning to take one full backup and then transaction log backups every month, as I will be making changes to the database only once a month.
I am aware that in case of disaster I need to restore the database with all of the transaction log backups. My plan is to keep transaction log backups for 5 years, and after 5 years I would take another full backup.
Should I take any other precautions, or are there concerns with this approach?
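One hedged caution: with that plan, a disaster recovery means restoring the full backup plus up to 60 monthly log backups in strict sequence, and the loss or corruption of any single one breaks the chain at that point. A sketch of what the restore would look like (names and paths are placeholders):
-- Hedged sketch: every log backup in the 5-year chain must be restored, in order
RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\MyDb_full.bak' WITH NORECOVERY;
RESTORE LOG MyDb FROM DISK = N'D:\Backups\MyDb_log_2015_01.trn' WITH NORECOVERY;
-- ... one RESTORE LOG per month, in sequence ...
RESTORE LOG MyDb FROM DISK = N'D:\Backups\MyDb_log_2019_12.trn' WITH RECOVERY;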
I have this very big text file (2.5 GB - which of course I am not able to open). This resides on a server which has SQL 2005.
I have created a linked server for this text file on my SQL database, and when I try to query the table, I get an error:
Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "txtsrv".
Why am I getting this error? When I create a dummy linked server on my local machine and test the steps, it all works OK, but when I go to the server where the file and its SQL instance reside, I get this error. Can someone please provide me with some help on this?
Many thanks, ~S
P.S. Here is how I created the linked server (I also have the Schema.ini lying under the same folder):
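For reference, a text-file linked server of this kind is typically defined along the following lines; this is a generic, hedged sketch (the folder path is a placeholder), not necessarily the exact script used here. The Jet text driver is pointed at the folder that holds the file and Schema.ini:
-- Hedged sketch: Jet text driver against the folder containing the file and Schema.ini
EXEC sp_addlinkedserver
    @server = N'txtsrv',
    @srvproduct = N'Jet 4.0',
    @provider = N'Microsoft.Jet.OLEDB.4.0',
    @datasrc = N'C:\DataFiles',
    @provstr = N'Text';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'txtsrv',
    @useself = N'FALSE',
    @locallogin = NULL,
    @rmtuser = N'Admin',
    @rmtpassword = NULL;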
I have two servers, DEV and PROD. Now my DEV server works just great, I can connect to the linked server, query, etc... all is well.
So I'm setting up my PROD server and when I go to add the linked server I get:
Cannot initialize the data source object of OLE DB provider "SQLNCLI".... and Unable to complete login process due to delay in opening server connection.
Now I am running SQL Server 2005 and connecting to an SQL 2000 server.
The odd part is that this works just fine on DEV.
When I go to create the linked server I set:
Linked Server: "LinkedServerName"
Server Type: "SQL Server"
and that's it.
I go to Security and enter my DOMAIN\USER.ACCOUNT and then enter the login credentials for the linked server.
When I click "OK" I get the above mentioned error code.
3. Create the linked server with scripts:
exec sp_addlinkedserver N'MyOracle1', 'Oracle', 'ORAOLEDB.Oracle', N'//10.154.14.235', N'FetchSize=2000', ''
exec sp_addlinkedsrvlogin @rmtsrvname='MyOracle', @useself=N'FALSE', @rmtuser=N'root', @rmtpassword='Sqlexp!23'
4. The creation succeeded, but when testing the connection I got the error below. (My Windows firewall is turned off.)
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
Cannot initialize the data source object of OLE DB provider "ORAOLEDB.Oracle" for linked server "MyOracle".OLE DB provider "ORAOLEDB.Oracle" for linked server "MyOracle" returned message "ORA-12504: TNS:listener was not given the SERVICE_NAME in CONNECT_DATA". (Microsoft SQL Server, Error: 7303)
This is a problem that has never been solved; sometimes I can find another way to avoid it, but I haven't found a solution yet. I hope I can get some more ideas here.
I am using SQL 2005. When I run
select * into #import1 from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\\ws8web\jeff2.xls', 'select * from [jeff2$]')
I get
Cannot initialize the data source object of OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)".
When I try to compile a stored procedure with that statement in it, I get the same error, e.g.:
create procedure test
as begin
select * into #import1 from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\\ws8web\jeff2.xls', 'select * from [jeff2$]')
end
So it seems the error may not be related to the actual file, since at the compile stage it should not be checking the actual file?
On my live database, after I restart the SQL service the statement works; after a while, one or several days, I get the same error again. I certainly can't restart my live database that often, so now I have another backup database server: I run the statement on the backup server and then read the data from there.
I have the same problem at two places, both use SQL 2005.
So far there are three questions:
1. Why does it work after a restart, but only for a while? Is it something about memory? The backup database seldom needs a restart and works fine after many days.
2. Why does it give an error at the compile stage?
3. Why do two databases in different environments have the same problem?
The most common answer I have gathered so far is that it's a permission issue; true, I get a similar error if the import file is located in a place SQL Server has no rights to access. But in this case, that should not be the problem.
When I tried to restore a backup of the publication database, the restore took too long [4 hours]. The actual database restore takes only 20 minutes, but the stored procedures used for replication cleanup take the remaining time. Is the issue related to my environment or to the replication cleanup stored procedures?
I have an odd problem that is driving me nuts. I have a very simple SSIS package that imports a 5-column flat file into a SQL Server 2005 table.
When I created this package with the wizard, it executed perfectly fine and processed all rows into the destination table.
But when I hit F5 to execute it manually, it fails before inserting a single row.
The error it generates is (Spalte 5 is a datetime in the format DD.MM.YYYY):
Error: 0xC02020A1 at Datenflusstask, Source - Daten_NC_1_txt [1]: Data conversion failed. The data conversion for column "Spalte 5" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Datenflusstask, Source - Daten_NC_1_txt [1]: The "output column "Spalte 5" (25)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "Spalte 5" (25)" specifies failure on error. An error occurred on the specified object of the specified component.
Error: 0xC0202092 at Datenflusstask, Source - Daten_NC_1_txt [1]: An error occurred while processing file "C:\Work\Daten_NC_1.txt" on data row 177.
Edit: Modified the Title so it properly reflects the Problem & the Solution
Hi, I have a big problem with a database in MS SQL Server 2000. For the second time, the rows in some of the tables have become mixed up with each other for no apparent reason. The application that uses the database is entirely transactional, and there are no queries without a WHERE clause. The database is on a computer with NAS storage. This is the first time I have had this problem in 5 years of using MS SQL Server. Can someone help me? TIA