Merge Replication Between Same Schema Databases But Different Data
Jan 25, 2007
Hi all.
I'm trying to set up merge replication between two servers whose databases have the same schema. The two databases share most of their data, but each also contains data that was inserted later, independently, on each server. (The two servers were connected in a merge replication scheme that failed at some point; replication was paused, but users continued to insert data independently on the two servers.)
I need to get them up and running.
I cleaned up replication on both servers and recreated the publication at the publisher/distributor, and all is fine.
When I create a push subscription to the subscriber I get the error: invalid column name 'rowguidcol'.
So far I have managed to get merge replication running between two identical databases (same schema and data).
Just some thoughts: after some reading I found that it might be related to identities and identity ranges, or to indexes. I set the identity seed and increment to 2,2 at the publisher and to 1,2 at the subscriber (on the same tables at publisher and subscriber). Is that OK? Is that the way to do it?
Digging a bit more with SQL Profiler, I can pin the error down to the point where sp_MSaddmergetriggers executes.
Thanks a lot for any help.
Version.
Microsoft SQL Server 2000 - 8.00.2039 (Intel X86) May 3 2005 23:18:38 Copyright (c) 1988-2003 Microsoft Corporation Standard Edition on Windows NT 5.0 (Build 2195: Service Pack 4)
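A note on the invalid column name 'rowguidcol' error above: one hedged line of investigation (table names below are only placeholders) is to check whether every published table has a uniqueidentifier column flagged as ROWGUIDCOL, which the merge triggers rely on.

-- List user tables that have no ROWGUIDCOL column yet (SQL 2000 system tables).
SELECT o.name AS TableName
FROM sysobjects o
WHERE o.type = 'U'
  AND NOT EXISTS (SELECT 1
                  FROM syscolumns c
                  WHERE c.id = o.id
                    AND COLUMNPROPERTY(c.id, c.name, 'IsRowGuidCol') = 1)

-- One possible fix (an assumption - test on a copy first): add the column
-- manually before creating the publication.
ALTER TABLE dbo.MyTable
    ADD rowguid uniqueidentifier ROWGUIDCOL NOT NULL DEFAULT NEWID()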
I wonder if I can ask for some advice? I would like to use SQL 2005 and merge replication, but I have a few concerns about the schema changes I am allowed to make on the publisher.
Ideally I want SQL replication to push all my schema changes from the publisher to the subscribers, but I don't know whether it can, or even whether it is recommended.
I see BOL suggesting that schema replication was mainly intended for ALTER-type statements, but can I use it for CREATE and DROP DDL too?
For example, if I add a new table at the publisher I presume I also have to add a new article for that table. Then how do I create a new snapshot? Is this even the right way of doing it? Similarly, if I want to DROP a table at the publisher I'm guessing I also have to delete any existing articles for it, then recreate the snapshot.
Are there better / safer ways of doing schema changes? Perhaps I'm best off disabling publishing, asking each remote site / db to update their schema manually, and then re-enabling the publisher?
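For what it's worth, a hedged sketch of the add-a-table case (SQL 2005; publication and table names are placeholders): add the new table as a merge article and regenerate the snapshot, which existing subscribers then pick up on their next synchronization.

-- Add the new table to the existing merge publication.
EXEC sp_addmergearticle
    @publication   = N'MyMergePub',
    @article       = N'NewTable',
    @source_object = N'NewTable',
    @source_owner  = N'dbo',
    @force_invalidate_snapshot = 1

-- Regenerate the snapshot for the publication.
EXEC sp_startpublication_snapshot @publication = N'MyMergePub'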
I am using a merge publication set to synchronize schema changes. Why am I getting the following message when I try to sync after having added columns in the publication database using ALTER TABLE statements?
The schema definition of the destination table ... in the subscription database does not match the schema definition of the source table in the publication database. Reinitialize the subscription without a snapshot after ensuring that the schema definition of the destination table is the same as the source table.
Also, why doesn't reinitializing the subscription with @upload_first = false fix the problem?
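For reference, a hedged sketch of reinitializing with pending subscriber changes uploaded first (all names are placeholders); run at the publisher and then start the Merge Agent.

EXEC sp_reinitmergesubscription
    @publication   = N'MyMergePub',
    @subscriber    = N'SUBSCRIBERSERVER',
    @subscriber_db = N'SubscriberDb',
    @upload_first  = 'true'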
I'm trying to set up a merge replication. Publication is created successfully and snapshot also, but when I create pull subscription on subscriber server and merge agent starts, after some time I get an error message of this type:
The schema script '\ANILREPLDATAuncANIL_BEJK_BEJK20070625142735dl_HF_vMSCene_3836.sch' could not be propagated to the subscriber. It seems there is a problem with certain views and SPs, because the tables are created successfully, and so are some of the views and SPs.
I tried to exclude the problematic articles, but every time another one pops up. So far I have excluded 7 articles from the publication (1 stored procedure and 6 views), but I still get errors.
I gave up, because I can't exclude half of the views and SPs just to make it work.
Is there something that can be done to solve this problem?
Question re merge replication (pull) and processing order. We have a group of changes associated with an app upgrade; the scripts run fine on the publisher. Part of the change includes creation of a new table, followed by altering a view to use the new table. Following the change at the publisher, when the sync is kicked off from the subscriber it fails - the ALTER of the view throws an 'invalid object' error with regard to the new table. It seems as if the view alteration is attempted before the dependent table has been created.
I have tried to amend the processing order of the view using sp_changearticle, which executes (quickly) with a 0 return code. But it is to no avail; the error still occurs. Is it possible to change the processing order for a view article, so that it is applied to schema changes?
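One thing worth checking, as an assumption on my part: sp_changearticle applies to transactional/snapshot articles, while for a merge article the processing order is set with sp_changemergearticle, roughly like this (assuming SQL 2005 or later; publication and article names are placeholders, and the change may require a new snapshot).

EXEC sp_changemergearticle
    @publication = N'MyMergePub',
    @article     = N'MyView',
    @property    = N'processing_order',
    @value       = N'100',
    @force_invalidate_snapshot = 1,
    @force_reinit_subscription = 0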
I have a problem with SQL Server merge replication in SQL Server 2000. If my db owner is "dbo" and replication is set up under the "sa" account, it works without any problem. But when I use another db owner it does not work properly. For example, I have a Customer table ([dbo].[Customer]). When I set up merge replication under the sa account it works properly. When I instead set up merge replication using another db owner ([INV].[Customer]), it doesn't work.
We have a big problem here with merge replication, specifically whenever a schema change occurs. We are replicating schema changes and triggers/stored procedures. In this example we changed about 150 stored procedures and about 30 triggers. This is then replicated to the subscriber database (which is also a merge publisher for further remote systems that were offline at the time) over a 10Mb link - hardly low bandwidth. However, the replication process takes about an hour and a half - considering the SQL on the primary server took less than a minute to run, this is a big surprise.
We've run a trace to see if we can identify what is going on. There seems to be a great number of calls to sp_MSunmarkschemaobject - we are still waiting for a trace to complete to fully analyse this, but it looks like it calls this repeatedly for every stored procedure in the database. We are currently re-testing to one of the remote servers with the merge agent set to the slow profile (not much hope that this will alter the poor performance).
This task looks to be excessive - and certainly does not seem to function in a sensible manner. Has anyone else had similar issues or have any suggestions? This is very infuriating as it means that the servers are effectively offline for a minimum of an hour and a half (in fact the remoter servers take over 4 hours!).
We have a couple of databases on separate servers that have exactly the same schema, and we would like to set up merge replication between them.
Is that possible? The few times we've experimented on test databases, the subscriber database has been trashed and rebuilt using data from the publisher, rather than preserving data from both databases.
I'm really new to replication in general, so any kind of advice would be helpful.
Thanks for your reply. I have a simple question; if you can guide me it will be a great help. I am developing a location-based services application in which I am going to store my data in SQL Server 2005 and fetch it onto the user's PDA device when the user logs in, so that the user can work offline on the data when there is no connection, and when he gets a connection he can synchronize the data with SQL Server 2005. Now I am confused whether I should use merge replication, the ADO.NET synchronization framework, or RDA. What is the difference between merge replication and synchronization, and which is better?
I have a question for you. One of our clients has asked us how the data is secured over the air during replication.
I have read that merge replication uses the TLS (Transport Layer Security) protocol to secure the data over the air. But I was wondering whether it is all done automatically, or whether we need to install certificates as for SSL.
We have a Windows Mobile 5.0 application using SQL Mobile that runs on over 50 devices, and they all synchronise with a single publisher SQL Server 2005 over a GPRS connection. We haven't got any SSL certificates installed on the server.
Now how can I make use of TLS in my application to secure my data over the air?
I am designing a mobile application. I have a database in SQL Server 2005 and I want to synchronize it with a SQL Server CE database on a PDA. I am using merge replication for the synchronization. Please give a step-by-step procedure for data synchronization using merge replication between SQL Server 2005 and SQL Server CE 2.0. Also let me know whether SQL Server Enterprise Manager is needed or whether it will work without it; if it is needed, please give me the link to download it, as it is not available in SQL Server 2005. Reply soon. Thanks in advance.
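A rough, hedged outline of the server-side part only (all names are placeholders; SQL Server Management Studio replaces Enterprise Manager for SQL 2005, and the IIS / Server Agent piece for the device is configured separately):

-- Enable the database for merge publishing.
EXEC sp_replicationdboption @dbname = N'MyDb', @optname = N'merge publish', @value = N'true'

-- Create a merge publication; character-mode snapshot for CE subscribers.
EXEC sp_addmergepublication
    @publication = N'MobilePub',
    @sync_mode   = N'character',
    @allow_pull  = N'true'

-- Create the Snapshot Agent for the publication.
EXEC sp_addpublication_snapshot @publication = N'MobilePub'

-- Publish the table(s) the device needs.
EXEC sp_addmergearticle
    @publication   = N'MobilePub',
    @article       = N'Customer',
    @source_object = N'Customer',
    @source_owner  = N'dbo'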
We had merge replication set up between 2 tables, Table A and Table B, using SQL 2000. This was working 100%. The users asked us to disable updates/deletes to both these tables if data existed in 2 other tables, Table AA and Table BB. We implemented it as follows:
1) Created insert/update/delete triggers for Tables A & B. For Table A the trigger basically checks whether there is a record in Table AA; if it exists, it raises an error and does not commit.
2) Removed all foreign constraints from Table AA and BB
3) Added Table AA and BB to the current replication.
Then all hell broke loose; we got conflicts all over the place saying that Table AA cannot be updated because records do not exist in Table X. To our surprise we found triggers generated by Erwin in 1998 - which checked for "foreign constraints" - and removed them immediately.
We continued to get conflicts, but could see from the error messages that they were generated by the triggers in point 1. We added the NOT FOR REPLICATION clause and everything has been running smoothly, or so we thought...
After 2 months we got a call that data is missing. It's random data, and the only explanation I have is that replication caused it. My biggest reason for saying this is that, tracking the application audit trail, I've found that all the missing data was added during the period when we had all the conflicts.
I need a solid explanation for this and can anyone confirm that this is possible?
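For reference, a hedged sketch of the kind of guard trigger described in point 1, with NOT FOR REPLICATION so the Merge Agent's own changes don't fire it (table and column names are illustrative, not the actual schema):

CREATE TRIGGER trg_TableA_Guard
ON dbo.TableA
FOR UPDATE, DELETE
NOT FOR REPLICATION
AS
BEGIN
    -- Block the change if a matching row exists in Table AA.
    IF EXISTS (SELECT 1
               FROM deleted d
               JOIN dbo.TableAA aa ON aa.KeyCol = d.KeyCol)
    BEGIN
        RAISERROR('Row is referenced by Table AA; update/delete not allowed.', 16, 1)
        ROLLBACK TRANSACTION
    END
END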
We have a database that's using merge replication between two servers, and we need to insert a lot (about 1GB) of data into it.
The servers, however, are separated by a 192k WAN connection, so it's impractical to rely on the merge replication to send the data across to the subscriber.
Is there a way to insert the data at both ends? I can get the data out there on a DVD or a laptop easily enough. Can I load the data into both copies of the database and tell the merge agent that it's not to be replicated?
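One technique sometimes used for this, sketched here with placeholder names and heavily hedged (test on copies first): load exactly the same rows, with identical rowguid values, at both servers while the merge tracking triggers on the table are disabled, so neither side queues the rows for replication. Rows loaded this way only converge on later changes if the rowguids really do match on both sides.

-- At both servers, around the bulk load of the shipped data:
ALTER TABLE dbo.BigTable DISABLE TRIGGER ALL
-- ... BULK INSERT / bcp the data here, including the rowguid column ...
ALTER TABLE dbo.BigTable ENABLE TRIGGER ALL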
When I reinitialized a subscription from SQL2005 db (publisher) to a SQL 2000 DB (Subscriber) and had the Upload changes before reinitialization turned on, the data from the SQL 2000 db didn't get uploaded. This resulted in a loss of 2 days worth of data.
Does anyone have any idea whether that data is kept anywhere? (Unfortunately a series of errors caused our backups not to be running on the 2000 db either.)
I'm having an error with merge replication: my subscribers are downloading old data. All this started happening after I installed SP4.
Let me explain this in more detail:
If I make an update to a row in a table at the publisher, the agent history shows that 1 change was downloaded to the subscriber. I put an audit on the subscriber and yes, one update was downloaded, but the row is still the same; it looks like the subscriber downloaded old data instead of the current change that I made.
If I delete the row, the row gets deleted at the subscriber; if I insert the row, the row gets inserted OK.
The problem only happens with rows inserted at the subscriber that later get updated at the publisher. My subscriber is MSDE 2000 SP3 and my server is SQL Server 2000 Enterprise SP4.
We have configured one-way merge replication in our topology; that is, data flows from subscriber to publisher only. We have a publisher and a subscriber. There are 3 publications in this category and each publication has a subscription. We use SQL Server 2005 SP1 on both servers. The retention period is 14 days (the default). After this period, I get the following error for the subscription in Replication Monitor. The error message is:
Error messages:
The Merge Agent failed after detecting that retention-based metadata cleanup has deleted metadata at the Publisher for changes not yet sent to the Subscriber. You must reinitialize the subscription (without upload). (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147199402)
I read the post http://forums.microsoft.com/MSDN/ShowPost.aspxPostID=372790&SiteID=1
which said that this error might be solved in SP2. We have not yet applied SP2, but even after applying SP2, will this error be solved for one-way merge replication, given that in this topology data does not always go from the publisher to the subscriber?
Kindly get back to us regarding this as soon as possible. Thanks in advance.
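In case it helps anyone hitting the same message: one hedged mitigation (publication name is a placeholder) is to lengthen the publication's retention period, or to make sure every subscription merges at least once inside the retention window.

EXEC sp_changemergepublication
    @publication = N'MyMergePub',
    @property    = N'retention',
    @value       = N'30'      -- days; the default is 14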
I want to transfer data between a SQL Server 2005 instance and my Pocket PC, so I've installed SQL Server CE on it.
I know that I can use merge replication, but I don't have IIS. So I've got to transmit the data through email with a file attached. I know how to send email from SQL Server 2005. When the Pocket PC receives that email, I don't know how to insert the data into SQL CE from Outlook Compact. Do I have to use SSIS, and in that case how can I run the package?
I have merge replication (push) set up for SQL Server 2008. The right-click options for Start and Stop Synchronizing on the subscriptions are disabled. How can I stop the merge replication and start it again?
This error happened 1 week after I created the merge replication. The merge data cannot sync to the subscriber, and the only solution I have is to drop and re-create the merge replication, but the error re-occurs a week later. Merge replication worked fine before; this started in the middle of May 2015 and it keeps happening now.
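On the disabled Start/Stop Synchronizing menu item: for a push subscription the Merge Agent runs as a SQL Server Agent job on the publisher/distributor, so a hedged workaround is to stop and start that job directly (the job name below is a placeholder; the real name is generated by replication).

USE msdb
-- Find the Merge Agent jobs.
SELECT j.name
FROM dbo.sysjobs j
JOIN dbo.syscategories c ON c.category_id = j.category_id
WHERE c.name = N'REPL-Merge'

-- Stop the agent, and start it again later.
EXEC dbo.sp_stop_job  @job_name = N'<merge agent job name>'
EXEC dbo.sp_start_job @job_name = N'<merge agent job name>'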
The 1st is for displaying data before inserting the value into the table, the 2nd for displaying data after inserting the value into the table; see the screenshot I have taken.
Actually the problem is that we are not getting any errors, even though I have try/catch blocks. I have coloured in RED the statement where execution gets stuck; even if we wait 40-45 minutes the execution pointer does not move on to the next step.
Please look into this, and if you have any idea, or know anyone who can help us solve it, please tell us.
-chaukse rahul
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Reflection;
using System.Data.SqlServerCe;
using System.Data.Common;

namespace ProjectSQLMobile2
{
    public partial class Form1 : Form
    {
        private DataSet dsMemberList;

        private void buttonInsertData_Click(object sender, EventArgs e)
        {
            // Add directly into the CE database (sample); parameterised so the
            // textbox value cannot break the SQL string.
            SqlCeConnection cn = new SqlCeConnection(@"Data Source=""TestLast.sdf""");
            string SQL = "INSERT INTO MembershipData(MemberName) VALUES(@MemberName)";
            cn.Open();
            SqlCeCommand cmd = new SqlCeCommand(SQL, cn);
            cmd.CommandType = CommandType.Text;
            cmd.Parameters.Add(new SqlCeParameter("@MemberName", textBox1.Text));

            try
            {
                /*
                 * This statement is taking too much time to execute.
                 * What should we do?
                 */
                cmd.ExecuteNonQuery();
            }
            catch (SqlCeException ex)
            {
                DisplaySQLCEErrors(ex);
            }
            finally
            {
                cn.Close();
            }
            Merge();
        }
    }
}
Please help,
your help will be appreciated.
NOTE: I am getting output in the 1st datagrid, but in the 2nd datagrid there is no output.
This is just too simple. I must be doing something wrong. I have tried everything, redoing the SQLMobile merge publication, setting permissions, and need help ASAP.
The basic test with the http://localhost/sqlmobile/sqlcesa30.dll returns the message:
SQL Server Mobile Server Agent 3.0
I have tried the merge replication with both my local SQL Server 2005 installation and a remote development box. Both instances return the same error.
I am able to create and replicate a database using the sample application.
I am new to SQL Server and have just finished setting up merge replication on one of the production servers. Today I got a request to change the datatype of one of the published articles. Please help with the correct steps to make this happen. Is any production downtime required, and if so, how much?
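A hedged sketch only, worth testing on a copy first: on SQL 2005 and later, if the publication has the "replicate schema changes" option enabled, a supported datatype change can be made once at the publisher with plain DDL and is carried to subscribers on their next merge (table and column names below are placeholders). There should be no downtime beyond the ALTER's own locking, but some columns (filter columns, rowguid, identity columns) have restrictions, and on SQL 2000 the usual route is adding a new column, migrating the data, and dropping the old one.

ALTER TABLE dbo.Customer
    ALTER COLUMN Phone nvarchar(30) NULL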
If I want to access or modify my local subscription data (not the configuration), how can I do that (from SQL Server 2005 or from ASP.NET)? Also, can I update the local subscription's data directly, or do I need another layer which will update it from a table, for example?
I'm working on a replication topology that is completely merge. We have a single consolidated instance (SQL 2005 SP1 Standard) that holds all data and is a continuous push merge publication, filtered by region, to regional instances (SQL 2005 SP1 Standard). Then we have individual user instances (SQL Express SP1) that pull from the republished regional instances, filtered by user. Both publications have Replicate Schema Changes set to true.
I'm testing out changes to tables and SPs on a test system. I've been using this process:
1-Run Snapshot on the Consolidated instance
2-Verify all published articles have a status of 2 in sysmergearticles
3-Run Regional Snapshot
4-Verify all published articles have a status of 2 in sysmergearticles
5-Run alter table scripts
6-Once all three levels have the table changes, run the alter sp scripts
I've gotten to step 5, and the changes get replicated to the regional instance just fine; however, only the existing column changes get replicated to the SQL Express instance, not the new columns. Looking at the articles in the regional publication, it shows the new columns, but they are not selected. I know I can manually select them (or probably write a script that adds them to the publication, although sp_repladdcolumn has been deprecated), but isn't there a way to make this a completely automated process, since it's just a republished database? Also, is the process I'm using the correct one?
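One hedged way to script the missing piece on the regional republisher, instead of ticking the new columns by hand in the UI (all names are placeholders):

EXEC sp_mergearticlecolumn
    @publication = N'RegionalPub',
    @article     = N'MyTable',
    @column      = N'NewColumn',
    @operation   = N'add',
    @force_invalidate_snapshot = 1,
    @force_reinit_subscription = 0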
I've got the following problem: Our product is delivered with SQL2000 and SQL2005.
Now there are some schema changes which I'd like to deploy with T-SQL on the publishers. With SQL 2000 I do it with sp_addmergecolumn etc., and on SQL 2005 (if the replication compatibility level is 90) with replication of DDL. So far so good.
But how can my T-SQL script determine whether replication of DDL is on? I know sp_helpmergepublication exists, but how do I get the replicate_ddl column from its result set?
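A hedged way for the script to detect it, on the assumption that the SQL 2005 side exposes the flag in sysmergepublications while the SQL 2000 side does not (hence the dynamic SQL; the publication name is a placeholder):

IF EXISTS (SELECT 1 FROM syscolumns
           WHERE id = OBJECT_ID(N'dbo.sysmergepublications')
             AND name = N'replicate_ddl')
BEGIN
    -- SQL 2005 path: read the DDL replication flag per publication.
    EXEC (N'SELECT name, replicate_ddl FROM dbo.sysmergepublications WHERE name = N''MyMergePub''')
END
ELSE
BEGIN
    -- SQL 2000 path: no DDL replication; fall back to sp_addmergecolumn etc.
    PRINT 'replicate_ddl not available on this version'
END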
I would like to use SSIS tool to move the data from one database schema to another database schema.
For example:
Source table has
1. UserName (varchar 20) (no null)
2. Email (varchar 50) (can be null)
Destination table has
1. UserID (uniqueidentifier - GUID)
2. UserName (varchar 50) (no null)
3. EmailAddress (nvarchar 50) (can be null)
4. DateTime
Questions:
1. What controls do I use in my Data Flow to move data between databases with different data types, and to populate UserID with a new GUID and DateTime with the current date (GETDATE())?
OLE DB Source, OLE DB Destination, Data Conversion and .....
How do I insert the GUID and the date at the same time?
2. I have many tables to move data for. Any suggestions? How should I architect my project? If I create a separate data flow for each table, it will look complicated.
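On question 1, a hedged alternative to generating those values inside the data flow (the destination table name below is a placeholder; the columns follow the example above): put defaults on the destination columns and simply leave UserID and DateTime unmapped in the OLE DB Destination, so the database fills them in. A Data Conversion (or Derived Column) transform then only has to handle the varchar-to-nvarchar and length differences for UserName and Email.

ALTER TABLE dbo.DestUser
    ADD CONSTRAINT DF_DestUser_UserID DEFAULT NEWID() FOR UserID
ALTER TABLE dbo.DestUser
    ADD CONSTRAINT DF_DestUser_DateTime DEFAULT GETDATE() FOR [DateTime]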
I am using SQL Server 2000. I am working on a project where there will be multiple databases on a single instance of SQL Server. Each database will have exactly the same schema but will be accessed by different groups of users. What is the easiest way to synchronize changes between the databases, so that if I add a column to one database it will be reflected in the other databases? If I add, remove, or alter a stored procedure, I want the change to be made in the other databases too. I want the data in each database to remain isolated; in other words, I do not want replication of the data, only of the schema. I would like to have a single "master" database that I use to make any schema changes, and have all the other databases be schema mirrors of this database, each with their own data. I have looked into SQL Server replication, but it didn't seem to work the way I wanted and I wasn't able to publish column changes etc.
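A hedged sketch of the "master script" approach (the database names and the DDL are placeholders): keep each schema change in one script and apply it to every copy in a loop, leaving the data alone.

DECLARE @db sysname, @sql nvarchar(4000)
DECLARE dbs CURSOR FOR
    SELECT name FROM master.dbo.sysdatabases
    WHERE name IN (N'GroupA_DB', N'GroupB_DB', N'GroupC_DB')
OPEN dbs
FETCH NEXT FROM dbs INTO @db
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Run the same DDL in each database's context.
    SET @sql = N'USE ' + QUOTENAME(@db) + N'; ALTER TABLE dbo.Customer ADD LastLogin datetime NULL;'
    EXEC (@sql)
    FETCH NEXT FROM dbs INTO @db
END
CLOSE dbs
DEALLOCATE dbs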
I'm looking for a tool that can extract only the schema from a database in a form that can be used to generate that schema in another empty database. This is to facilitate our disaster recovery processes, where we need the objects only, not the data, and need to replicate this to our disaster recovery site over the WAN. There are plenty of tools that can handle a single database, but does anyone know of any tools that could handle multiple databases where many of the objects (stored procs and views) are dependent on objects in others of the databases (tables)? This is a home-grown ETL suite, so making changes to the code to remove these dependencies would take way too much effort. I am looking for something that can either extract the schema for all 3 databases and handle the object creation ordering to account for the dependencies (a simple method would be to extract by object type across all databases, e.g. tables for all dbs before views before procs), or a backup/restore tool that allows you to restore the objects only without data. Worst case we could write something to generate the DDL or use SQL-DMO, but ideally we would prefer to purchase a (relatively inexpensive) tool to do it.
Thanks,
Simon