Consolidation - Changing Replicated Data In A Central Subscribing Site
Sep 25, 2006
Hi all,
I am new to replication and have a few questions.
1) Are there any "hooks" available to insert processing when a subscriber is about to copy data from a replicating site?
2) Is it possible for a subscriber to change only his local copy of the data - without replicating the changes back to the publisher?
I realise that once the data changes in one place it isn't really replicated anymore, and I realise that my limited knowledge of the subject might well mean I'm not even asking the right questions. Therefore, I shall try to describe as best I can my scenario.
I wish to use many servers for transactional input (to distribute the workload) and use replication to publish the inputted data to a subscribing central site. One of the tables I wish to replicate has an identity column as primary key, but the records should otherwise be unique - i.e. no two records should differ only in the value of the key. Another table, which should also be replicated, uses this id value as a foreign key.
I can use the identity increment and seed to guarantee no key violations will occur when copying data to the central server. However, there is another issue: Several servers can create the same record but with different id values.
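For illustration, the seed/increment scheme might look like the sketch below; this is only a sketch, and the column definitions are assumptions based on the METADATA example further down.

-- Sketch only: distinct seeds with a shared increment keep the identity ranges
-- disjoint across input servers.
-- On INPUT_1:
CREATE TABLE dbo.METADATA (
    id       int IDENTITY(1, 10) PRIMARY KEY,   -- generates 1, 11, 21, ...
    Activity nvarchar(50) NOT NULL,
    Country  nvarchar(50) NOT NULL
);
-- On INPUT_2 the same table is created with IDENTITY(2, 10), generating 2, 12, 22, ...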
I need to "merge" such records by deleting duplicate entries in the table with the identifier as primary key, and update the foreign keys correspondingly. To clarify (I hope!), here's an example of what data I might have on the central site after copying data from two input sites:
TRANSACTION table
amount = 200, metadata_id = 1001 // Replicated from server INPUT_1
amount = -117, metadata_id = 2001 // Replicated from server INPUT_2
METADATA table:
id=1001 Activity=Sales, Country=USA
id=2001 Activity=Sales, Country=USA
What I would like is basically for the central site to identify that metadata 2001 is really the same as metadata 1001, update the foreign key in the TRANSACTION record accordingly and not import (or delete, if this "merging" is done in a post-treatment) the duplicate metadata record.
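For instance, a post-treatment on the central site might look roughly like the sketch below. It assumes the TRANSACTION and METADATA tables from the example above, and that "duplicate" means the same Activity and Country.

-- Sketch only: repoint TRANSACTION rows at the surviving (lowest-id) METADATA
-- row per (Activity, Country), then delete the duplicate METADATA rows.
;WITH Ranked AS (
    SELECT id,
           MIN(id) OVER (PARTITION BY Activity, Country) AS keep_id
    FROM dbo.METADATA
)
UPDATE t
SET    t.metadata_id = r.keep_id
FROM   dbo.[TRANSACTION] AS t
JOIN   Ranked AS r ON r.id = t.metadata_id
WHERE  r.id <> r.keep_id;

DELETE m
FROM   dbo.METADATA AS m
JOIN   (SELECT id,
               MIN(id) OVER (PARTITION BY Activity, Country) AS keep_id
        FROM   dbo.METADATA) AS d ON d.id = m.id
WHERE  d.id <> d.keep_id;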
If anyone can offer any advice on how to achieve this I would appreciate your input.
I need to drop and recreate indexes on some of my tables that are currently being replicated. I am not sure how this will affect my ongoing replication. Will this cause a problem for me? Please help.
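In general (for transactional replication at least), nonclustered indexes are not themselves replicated, so dropping and rebuilding one at the publisher should leave the publication untouched; the primary key / clustered index the publication relies on is a different story. A minimal illustration with placeholder names:

-- Placeholder table/index names; the index DDL is not forwarded to subscribers.
DROP INDEX IX_Orders_CustomerID ON dbo.Orders;
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);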
Hi guys, can you please tell me what steps we need to take in order to consolidate two data marts into one? My issue is that we have two data marts and the client wants the data in these two marts to be in one mart. Please help. I would be grateful if you could refer me to any documentation if possible.
There are several remote locations where SQL Server is running. My company has asked me to find a way to collect all the data from the remote locations into a central location automatically; for example, each day's data should be synced at night, between 2 AM and 7 AM, and it should be compressed automatically before the data transfers to the central location. NOTE: there is no domain, only standalone workstations.
Is it possible to replicate data from 3 publishers to a single/central subscriber transactionally? In other words I have Server A, Server B, Server C with databases A,B,C respectively. I need to replicate 2 articles from A,2 from B and 2 from C to a central Server D that hosts database D. D will have only 6 articles. The replication is Transactional Replication.
If it is possible what will be the drawbacks of such implementation? (if one server goes down will the whole replication break?) If not possible then what is the best way of implementing this?
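For what it's worth, this pattern is usually called a central subscriber: each publisher keeps its own publication with its two articles and pushes to the same destination database, so the three streams stay independent of one another (if one publisher goes down, only its own stream stops). A sketch with placeholder names, run at each publisher:

-- Run at Server A in database A; repeat the pattern at B and C with their own
-- publication names. All three push to the same central database D on Server D.
EXEC sp_addsubscription
     @publication       = N'PubA',
     @subscriber        = N'ServerD',
     @destination_db    = N'D',
     @subscription_type = N'Push';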
Greetings: I am trying to gather the missing index data from the system DMVs into a central location, for dynamic index creation in the next step. In trying to use a cursor, I get the following errors:
Msg 154, Level 15, State 3, Line 20
variable assignment is not allowed in a cursor declaration.
Msg 102, Level 15, State 1, Line 94
Incorrect syntax near 'Get_Data'.
Msg 16916, Level 16, State 1, Line 2
A cursor with the name 'Get_Server' does not exist.
Msg 16916, Level 16, State 1, Line 3
A cursor with the name 'Get_Server' does not exist.
Here is the SQL:
--CREATE PROCEDURE usp_Get_Missing_Index_Data
--AS
--Declare @Sql2 nvarchar(4000)
Declare @Sql nvarchar(4000)
DECLARE Get_Server Cursor -- gets a server name from a list of servers
for
Select MachineName from rsqlaudit1.DBStatistics.dbo.servers
Open Get_Server
Declare @Server nchar(20)
Fetch Next from Get_Server Into
@Server
While (@@FETCH_STATUS = 0) --and (@@FETCH_STATUS <> -2)
BEGIN
DECLARE Get_Data Cursor
FOR
select @sql= 'select distinct id.*
, gs.avg_total_user_cost
, gs.avg_user_impact
, gs.last_user_seek
,gs.unique_compiles
from '+@Server+'.master.sys.dm_db_missing_index_group_stats gs
,'+@Server+'.master.sys.dm_db_missing_index_groups g
,'+@Server+'.master.sys.dm_db_missing_index_details id
where gs.group_handle = g.index_group_handle
and id.index_handle = g.index_handle
order by gs.avg_user_impact desc'
exec (@Sql)
Open Get_Data
DECLARE @Handle int,
@database smallint,
@object int,
@equality nvarchar(4000),
@inequality nvarchar(4000),
@Included nvarchar(4000),
@statement nvarchar(4000),
@avg_user_cost float,
@avg_user_impact float,
@last_seek datetime,
@compiles bigint
Fetch NEXT FROM Get_Data INTO
@Handle,
@database,
@object,
@equality,
@inequality,
@Included,
@statement,
@avg_user_cost,
@avg_user_impact,
@last_seek,
@compiles
While (@@FETCH_STATUS = 0) --and (@@FETCH_STATUS <> -2)
BEGIN
insert into rsqlaudit1.DBStatistics.dbo.Missing_Index_data
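One way to sidestep the cursor errors listed above is to drop the inner cursor altogether and load each server's result set with INSERT ... EXEC. A rough sketch, assuming the Missing_Index_data table's columns line up with the select list:

-- Rough sketch: outer cursor over the server list only; the dynamic query's
-- results are inserted directly into the central table.
DECLARE @Server sysname,
        @Sql    nvarchar(4000)

DECLARE Get_Server CURSOR LOCAL FAST_FORWARD FOR
    SELECT MachineName FROM rsqlaudit1.DBStatistics.dbo.servers

OPEN Get_Server
FETCH NEXT FROM Get_Server INTO @Server

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'SELECT DISTINCT id.*,
                        gs.avg_total_user_cost,
                        gs.avg_user_impact,
                        gs.last_user_seek,
                        gs.unique_compiles
                 FROM ' + QUOTENAME(@Server) + N'.master.sys.dm_db_missing_index_group_stats gs
                 JOIN ' + QUOTENAME(@Server) + N'.master.sys.dm_db_missing_index_groups g
                   ON gs.group_handle = g.index_group_handle
                 JOIN ' + QUOTENAME(@Server) + N'.master.sys.dm_db_missing_index_details id
                   ON id.index_handle = g.index_handle'

    INSERT INTO rsqlaudit1.DBStatistics.dbo.Missing_Index_data
    EXEC (@Sql)

    FETCH NEXT FROM Get_Server INTO @Server
END

CLOSE Get_Server
DEALLOCATE Get_Server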
In our environment (SQL 2005) we have a database that uses Transactional Replication to sync data between two SQL 2005 servers. There is a web app that reads/writes data to the publisher server and the other server (that gets the replicated data) is used by some other internal applications.
At times, there is a need to delete some data from the publisher server...but this can ONLY happen once the data has been successfully replicated to the second server. Is there any way to determine if a row has been replicated successfully?
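One low-tech check, sketched below with placeholder names: before deleting at the publisher, confirm over a linked server that the row is already present on the subscriber. (Keep in mind the delete itself will also be replicated unless the article is configured not to replicate DELETE statements.)

-- Placeholder names: [SubscriberServer] is a linked server to the second box.
DECLARE @OrderID int
SET @OrderID = 12345   -- placeholder key

IF EXISTS (SELECT 1
           FROM [SubscriberServer].ReplicaDB.dbo.Orders
           WHERE OrderID = @OrderID)
BEGIN
    DELETE FROM dbo.Orders
    WHERE OrderID = @OrderID
END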
I have an application that uses web-based merge replication. My publisher is SQL 2005 and my subscriber is SQL 2005 Express. I control the replication with RMO code. If I make changes to the data in both databases using SQL Server Management Studio Express, my RMO code correctly syncs the two databases. However if I make changes to the data at the subscription through my application, these changes are not picked up by the replication process, even though the changes are present if you check the tables through Management Studio. What would cause these changes to not be recognized? Any ideas would be appreciated.
I am subscribing to the report from Report Manager in file share mode, but the report is not getting stored in the specified location after the scheduled time. This could be a problem with the reporting configuration, because the trigger run status is blank in Report Manager. Could anyone help me out with this issue?
I have a report in SQL Server that I would like users in SharePoint to be able to subscribe to; however, when I try to subscribe to the report I get the following error:
The current action cannot be completed because the user data source credentials that are required to execute this report are not stored in the report server database. (rsInvalidDataSourceCredentialSetting).
I have two publications on a SQL Server 2000 database. I am able to create two subscriptions from another SQL Server 2000 database and synchronize both in succession.
However, when I try to repeat this going from SQL Server 2000 to SQLCE 2.0 it fails. The first goes OK; the second fails with error 80004005, 28521 ("The SQL Server CE database is already enabled for publication."). Is it possible to do what I am trying to do on CE?
The reason I have 2 publications is because the first is non-filtered and goes very fast via bcp files when reinitialized. The second is dynamically filtered and not as fast. Breaking them up makes reinits go much faster.
I have seven SQL Servers with 1 or 2 databases on each. I need to move the databases and consolidate them down to one or two servers. I am new to this and have heard about detaching and attaching the databases. Is this the way to go? If so, can someone tell me how? I am sure it's not as simple as copying files the way things are done in the Windows environment.
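Detach/copy/attach is indeed a common route; a minimal sketch (database name and file paths are placeholders):

-- On the source server:
EXEC sp_detach_db @dbname = N'SalesDB'

-- Copy SalesDB.mdf and SalesDB_log.ldf to the target server's data folder,
-- then on the target server:
CREATE DATABASE SalesDB
ON (FILENAME = N'D:\SQLData\SalesDB.mdf'),
   (FILENAME = N'D:\SQLData\SalesDB_log.ldf')
FOR ATTACH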
We are using push subscriptions with transactional replication. Is there a recommended value for the retention period on the distribution database? We are using the default value of 72 hours, and recently we saw an issue where data was not replicated, with an error that the subscription was inactive. When I searched, I found that this is related to the retention period setting on the distribution database.
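If you need more headroom, the retention can be adjusted on the distribution database; a sketch ('distribution' is the default distribution database name, and the values are in hours):

EXEC sp_changedistributiondb
     @database = N'distribution',
     @property = N'max_distretention',
     @value    = N'96'   -- raise the maximum retention above the 72-hour default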
I need some advice, and I think I have a possible solution; I just need some confirmation that it is the right thing to do!
I have 73 databases which I look after, all of them SQL 2000 and SQL 2005 DBs, and backing up these DBs is becoming an administration nightmare. So what I was planning is to recreate all the backup jobs on a SQL 2005 server and monitor them from there!
Is this a good idea or should I think of something else?
Can anyone point me in the direction of some best practice resources for consolidating SQL 2005 on a 64-bit Unisys platform?
The web seems a bit sparse unfortunately, and most of the best practice information seems oriented around 2000. This is a great article but I'm not sure how much has changed in 2005: http://www.microsoft.com/technet/prodtechnol/sql/2000/deploy/64bitconsolidation.mspx
Here is the idea: we are trying to automate our weekly import process. We are running the sub-packages, each with its own configuration file, in sequence. I want to integrate the process into a main package.
I also want to create a job to run this main package. Do I need any configuration file for the main package?
I know how to do this in SQL 2000, but I don't know how in SQL 2005. If anyone can point me in the right direction, that would be great.
Package1 and Configfile1, Package2 and Configfile2, and so on...
Currently I am running them manually, one by one, in Management Studio using Integration Services. I have been asked to automate the process or consolidate it into a single package. I can automate it using Execute Package Tasks without a config file, but there is no option in the Execute Package Task to point to the right config file.
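One workaround, sketched below with placeholder paths and names: instead of Execute Package Tasks, use a single SQL Agent job with one CmdExec step per package, each step calling dtexec with that package's own /ConfigFile option.

USE msdb
EXEC sp_add_job     @job_name  = N'Weekly Import'
EXEC sp_add_jobstep @job_name  = N'Weekly Import',
                    @step_name = N'Run Package1',
                    @subsystem = N'CmdExec',
                    @command   = N'dtexec /File "C:\Packages\Package1.dtsx" /ConfigFile "C:\Configs\Configfile1.dtsConfig"'
-- Add a step per remaining package (Package2/Configfile2, ...); set
-- @on_success_action = 3 on all but the last step so the job continues.
-- sp_add_jobserver and the schedule are omitted for brevity.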
Dear All, I have one database replicated from the production server. Now I need to change the datatype of one particular table column. What steps do I need to follow to do this?
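On SQL Server 2005 and later, if the publication's "replicate schema changes" option is left at its default, an ALTER COLUMN run at the publisher is generally forwarded to subscribers as part of the normal replication flow. For illustration only (placeholder table, column, and datatype):

ALTER TABLE dbo.Orders
ALTER COLUMN Amount decimal(18, 2) NULL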
Thank you very much.
Arnav Even you learn 1%, Learn it with 100% confidence.
Hello all, I was wondering if there's a way to monitor/measure the data transfer taking place between 2 servers in a replicated environment. I cannot see any counters, etc. to monitor this. Thanks, Sunit
Is there somewhere I can tell when the last time either the publication, or better yet a subscription, replicated data (in a system table or view maybe)?
I want to set up monitoring to make sure I am aware if something, for some reason, does not replicate.
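For transactional replication, one place to look is the agent history in the distribution database; a sketch (assuming the distribution database is named 'distribution'):

-- Most recent activity per distribution agent, from the agent history tables.
SELECT a.name        AS agent_name,
       a.publication,
       MAX(h.[time]) AS last_activity
FROM distribution.dbo.MSdistribution_history AS h
JOIN distribution.dbo.MSdistribution_agents  AS a
  ON a.id = h.agent_id
GROUP BY a.name, a.publication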
How would I best go about changing a published table's column from smallint to int? I could not find anything about it in BOL or MS.com. I do not think EM/Replication Properties allows the change. I suspect I have to run "Alter Table/Column" on the Publisher and each Subscriber the old-fashioned way. Is that true?
We have a master database (SQL 2014 Std) into which data is imported from XML files (sent by an ERP system) using SSIS. There are about 12 other servers (SQL 2014 Express) located in remote warehouses. People will use Pocket PCs to scan product barcodes in the warehouses, and all operations must be forwarded to the master DB to be exported in an XML file for the ERP system. Each warehouse is independent. How can I set up the replication so that only data belonging to a specific warehouse is replicated to its corresponding DB? I thought about creating views, one for each warehouse, and setting up a replication for each warehouse, so there would be 12 merge replications configured. Is that fine?
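An alternative to 12 separate publications is a single merge publication whose articles use a parameterized row filter, so each subscriber only receives the rows for its own warehouse. A sketch with placeholder table and column names (HOST_NAME() can be overridden per subscription to carry the warehouse code):

EXEC sp_addmergearticle
     @publication         = N'WarehouseOps',
     @article             = N'ScanOperations',
     @source_owner        = N'dbo',
     @source_object       = N'ScanOperations',
     @subset_filterclause = N'WarehouseCode = HOST_NAME()'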
I'm trying to use 2005 Integration Services to import data from a web address into a SQLServer 2005 database.
The address I want to download data from is http://www.nymerc.com/futures/innf.txt
I'm not sure how I am supposed to access the data on the website. What kind of connection manager do I use? Flat File? HTTP? When I try to use a flat file connection manager, I set the connection string to 'http://www.nymerc.com/futures/innf.txt', but when I click OK, the connection string gets changed to 'c:\documents and settings\...\Temporary Internet Files\Content.IE5\V01H744)innf.txt'
Is this expected?
What's the best practice for using a web page as a data source?