I'm working with two databases on two physically separated servers: #1 is on a local machine and I can only connect to #2 via an ADSL connection.
I need to synchronise the data. This works fine via the wizard from the Enterprise Manager. But I want to get this done via a script from the Query Analyzer that I can run whenever I want to. Something like:
insert into srv1.database.dbo.table1
select * from srv2.database.dbo.table1
When I do this I get an error about "linked servers". That surprises
me, because I have no linked servers configured and yet the wizard
works fine?
I am in the position where I have to transfer data from an old database schema to a new database schema. During the transfer process a lot of logic has to be performed so that the old data gets inserted into the new tables efficiently, all foreign keys are preserved, and the newly created keys (as the new schema is normalised a lot more) are correct.
Is it best if I perform all this logic in a Stored Procedure or in C# code (where the queries will also be run)?
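For illustration only: whichever side ends up owning the logic, one common T-SQL pattern for keeping the newly created keys consistent is to carry the old key alongside the new rows and join through it when filling the child tables. A minimal sketch, where OldCustomer, NewCustomer, OldOrder, NewOrder and the temporary LegacyID column are all hypothetical names, not from the original schemas:

-- Copy parent rows, keeping the old key in a temporary LegacyID column
INSERT INTO NewCustomer (Name, LegacyID)
SELECT Name, OldID
FROM OldCustomer

-- Child rows pick up the new identity keys by joining through the legacy key
INSERT INTO NewOrder (CustomerID, OrderDate)
SELECT nc.NewID, oo.OrderDate
FROM OldOrder oo
JOIN NewCustomer nc ON nc.LegacyID = oo.OldCustomerID

-- Once every child table is populated, the LegacyID column can be dropped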
I have been giving myself a headache over deploying and synchronising SQL databases.
Can anybody point me in the right direction of a tool that can help me to synchronise changes of structure / data between two databases? I have used the demo version of SQL Compare, but I think it's way too expensive. Does Microsoft produce a tool that can perform similar functions? If not, can anybody point me towards a tutorial that can help me with migration / deployment / synchronisation of SQL databases?
For our application we want two Publisher models: 1) running SQL Server 2005 Standard on Windows Server 2003, and 2) running SQL Server 2005 Standard (or Workgroup) on Windows XP Pro.
The reason is simply cost. The first is for clients that may have 10s / 100s of SQLce subscribers, the second for just 1-5 SQLce subscribers.
We want to use web synchronisation. The problem is that when setting up on the XP box I get to the final bits of Microsoft's setup document and it starts telling you to set up an application pool. As far as I can tell this is only available in IIS6 on Windows Server 2003.
Searching the Internet, I don't seem to be able to find a definitive 'don't be a pratt, you can't run web synchronisation from an XP Pro box'.
Could someone let me know whether it is or is not possible? If I could get SQL Express to be a Publisher (to SQLce subscribers) I would prefer not to have the cost of a full version of SQL Server for our smaller clients.
Hi everybody, please help me out on this issue, since I am at the final stage of finishing up the module.
Previously I had done web synchronisation using RMO and it was working fine at my office, where I have a 2003 server that is a domain controller. The publisher is configured on the 2003 server, and I had subscribers on XP machines where it was working fine. The clients were not on the domain; each is an independent machine. We had a security error which was resolved through this same forum: the client's Windows login ID has to exist on the server too. Only then does synchronisation happen at our office.
Now the problem is that I have the same setup, but the 2003 server is hosted at a remote location and is not a domain controller. Now if I try to synchronise from our office I get "A security error occurred". Is it the case that the publisher should be configured as a domain controller? Can anybody please resolve this issue?
I am able to open the web sync diagnostic information using the URL https://itneersrv.abitta.com:443/Synchronise/replisapi.dll in Internet Explorer, but while running the C# code to run the Merge Agent I am getting the following error:
[0%] InProgress The Merge Agent could not connect to the URL 'https://itneersrv.abitta.com:443/Synchronise/replisapi.dll' during Web synchronization. Please verify that the URL, Internet login credentials and proxy server settings are correct and that the Web server is reachable. A security error occurred
I am trying to synchronise through the Internet, where the publisher is in a different country and the subscriber is at my office in India. I have given all privileges but I still get the same error. Can anybody help out on this? I have posted several times and a few people responded, but I couldn't resolve this issue.
We have a central server, with offices from other countries synchronising with our main DB. Some are fine, but many for some reason keep getting the same error message time after time:
"The process could not log conflict information"
You then go into the problem and it says it cannot read the file of a particular table due to Error 53.
I have no idea what this means. If anyone can help, I'd be very appreciative!
We have a fair number of databases in different countries and we synchronise with them every day. Apart from all the problems we have had, recently it has gone well. In HQ, we have a local database and also one which the countries synch with. Therefore, the local database needs to synch with the synch database. That is successful every time. The only problem we have is that after synchronisation is complete, there are differences in the data in the two databases and we don't understand why - after all, synchronisation is meant to keep the databases the same. If anyone has any ideas why this is happening, I would be very grateful if you could get back to me. Thanks.
I have a question regarding replication and or database synchronisation:
I am doing some work on a SCADA control system in .NET (SCADA = Supervisory Control And Data Acquisition).
Obviously, these are mission-critical systems that must have uptimes of greater than 99%.
Traditionally, we have used custom-written databases in assembler or C, but these are getting rather long in the tooth and, in my opinion, represent a massive investment of man-hours developing something that already exists 'off the shelf' (for example, our custom-written database, the first version of which rolled out nearly 20 years ago, doesn't support SQL).
So, I'm looking to see if there is an off-the-shelf DB that can do what we need it to do. I lean towards Microsoft because of its integration with the .NET platform.
However, probably the most important criterion for us is that of database synchronisation - as data from the plant arrives it is stored in a database (obviously) - however, at the SAME TIME, it is sent to a backup (standby) database which is identical in every respect, except that it does not 'answer' queries; it only listens and keeps its copy of the database up to date. Then, should the 'primary' database server fall over, the standby will become active, and it is up to date and ready to run.
Does SQL Server offer such a facility, so that changes to a database can be immediately and automatically 'echoed' to a standby database? Automatic promotion from standby to main is not an issue, as this decision would be taken by the application software 'in front' of the database - if it cannot contact the main DB server, it will elect to start talking to the standby; thus, philosophically, the standby is now the main server.
Hope I haven't confused anyone here with my rambling?
The problem is that my hunch (not being a DB guru) is that replication is NOT what I am looking for - I *think* replication does sync data and schema (correct me if I'm wrong) but not immediately - it is scheduled? Yes?
I am looking more for synchronisation - but it has to be 'on the fly' - not every 15 minutes or 8 hours etc.
If SQL Server cannot do it, then it's not a major problem, since the DB access layers of the applications themselves can simply write their data to two DBs, and queue writes in a buffer if one of the partners is offline, such that they will be written to the partner when it comes online - however, I'm all for an easy life, and if the boys at Microsoft have already settled this issue, then I'm all for that!
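For what it's worth, SQL Server 2005 has a feature aimed at exactly this pattern: database mirroring, where each committed transaction is shipped to a standby server that cannot be queried directly but stays current, with promotion left to whoever (or whatever) decides to fail over. A minimal sketch, assuming hypothetical server names and a database called ScadaDb that has already been restored on the standby WITH NORECOVERY:

-- On each partner: create an endpoint for mirroring traffic (port 5022 is an arbitrary choice)
CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (ROLE = PARTNER)

-- On the standby (mirror) server, point at the primary
ALTER DATABASE ScadaDb SET PARTNER = 'TCP://primary-host:5022'

-- On the primary (principal) server, point at the standby
ALTER DATABASE ScadaDb SET PARTNER = 'TCP://standby-host:5022'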
I am using a SQL Server database for storing GPS data from a remote location. The average data stored in the DB is around 2 GB/day. For security reasons, my application works on another server which has the same DB. I need to update the data in the second database within an interval of 2 seconds. The user entries on the application server have to be synchronised with the first server as well. Is replication possible at such a small interval? Does anybody have any suggestions?
We have a SQL Server 2005 database supporting our main ecommerce site. It is hosted remotely by our ISP. We currently maintain a local copy of this database by running daily stored procedures to keep the local copy up to date. However we'd like to explore the use of replication over HTTPS as an alternative means of keeping the local copy in sync with the ISP database.
Could anyone comment on the technical feasibility of doing this, given that transaction throughput on the ISP server can be several hundred hits per hour? Is web-based synchronisation likely to suffer from poor performance, or to adversely affect users of our ecommerce site?
I have done web synchronisation using RMO, where I have a remote server at a remote location and the virtual directory is configured with SSL. For testing purposes, if I access http://72.17.246.214/SyncAbita/replisapi.dll?diag from the client computer in the IE address bar, I get the error message "Unable to Reach Remote Host". If I remove SSL on the server, I am able to access the "SQL Websync diagnostic information" page from the client machine. Is it not possible to perform web synchronisation with SSL? I have installed a 7-day trial version of an SSL certificate on the server. Can anybody help me out on this issue?
I hope this is the right forum for my question. I am very new to databases. I have the following problem:
There is a database on a SQL Server in a country X. This database has tables like
Table 1, Table 2, ..., Table 10.
I am developing an application with Visual C# .NET 2.0. I use a local SQL Express database, which has tables like
Table 1, Table 2, ..., Table 6, Table A, Table B.
As you see, Table 1 to Table 6 are the same in both databases (not the whole database). And these tables must always hold the same information (when the server database changes, the local database must pick up the changes). Because of that there must be synchronisation between them.
How can I solve this problem? Do I need to write code, or is there a simple solution?
I'm looking for some recommendations on tools I can use to keep my dev/test/prod databases in sync. I'm tired of doing this manually. I use SQL Server 99% of the time, but if there is a tool that can also manage other databases I would be interested in hearing about it too.
Can anyone guide me on which is the best method for real-time synchronisation of my production server? Is it transactional replication or a standby server?
Yes, I know synchronisation to alternate partners is deprecated in SQL2005 but....
In SQL2000 there is a Sync Partners tab in the publication properties dialog that allows you to tick a checkbox for each co-publisher to be enabled as an alternate synchronisation partner. What is the equivalent in SQL2005?
I've set up replication in SQL2000 following these instructions http://support.microsoft.com/?kbid=321176 and it works. Now I'm trying to do the same thing in SQL2005 but I can't find a substitute for steps 10 & 11 in the section "Set Up the Alternate Synchronisation Partner". What's the answer?
I am developing an application on a PDA using the C# Compact Framework. I would like to synchronise a SQL Mobile 2005 database with a SQL Server 2005 database using a web service.
The reason for going with a web service is the client's request.
Will anybody help me on this - how to perform conflict management, etc.?
We are using HTTPS anonymous merge subscriptions....
Sometimes when trying to synchronise, we will get the following error messages returned to the subscriber:
The upload message to be sent to Publisher '**thewebserver**' is being generated The merge process is using Exchange ID '0F65CFCB-AF17-47DC-8D98-493A44C243E0' for this web synchronization session. The Merge Agent could not connect to the URL 'https://**thewebserver**/client/replisapi.dll' during Web synchronization. Please verify that the URL, Internet login credentials and proxy server settings are correct and that the Web server is reachable. The Merge Agent could not connect to the URL 'https://**thewebserver**/client/replisapi.dll' during Web synchronization. Please verify that the URL, Internet login credentials and proxy server settings are correct and that the Web server is reachable. The Merge Agent received the following error status and message from the Internet Information Services (IIS) server during Web synchronization: [401 :'Unauthorized']. When troubleshooting, ensure that the Web synchronization settings for the subscription are correct, and increase the internet timeout setting at the Subscriber and the connection timeout at the Web server.
If I then go to a web browser and put in the HTTPS address, it brings up the logon dialog - I put in the admin username and password to confirm the connection and that's fine.
We try and synchronise again, and this time it works - it's as though I have 'woken' it up again and it's happy to play.
Is increasing the timeouts as suggested by the error message the way to go? If so, where does one set the 'internet timeout setting at the subscriber' and the 'connection timeout at the web server'?
I downloaded Orcas and created an application which also has SQL Server synchronisation. Now if I wish to run it on another machine, where can I get the runtime for 3.5? I searched but could not find it.
If my application updates a SQL Express database at the same time as my service is performing synchronisation of the same database via RMO, intermittently some of the rows updated will not be included the next time synchronisation is performed.
Does anyone know what's happening and what I can do about it?
We have about 70 clients, each with their own databases synching to a copy of their database on a remote webserver. They synchronise using a Windows Service which we wrote using RMO.
For completeness, our setup is as follows:
subscription.CreateSyncAgentByDefault = False -- I wrote a windows service to synchronise
subscription.UseWebSynchronization = True
subscription.InternetSecurityMode = AuthenticationMethod.BasicAuthentication
subscription.SubscriberType = MergeSubscriberType.Anonymous
Sometimes, one of our clients' synchronisation goes up the spout - it might take hours to synchronise - despite their internet connection being fine. And once this starts to happen, nothing will fix it. We've restarted their server, rebooted their router, etc.
Well, the only way of fixing it is to recreate the publication, copy the backup, reinitialise and start again, which costs us a day's work and inconveniences the client greatly.
What would be the likely trigger for this? How can we stop this happening again?
If it helps we could create a trace to help resolve this.....
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly into my data mining models, without saving them somewhere since they occupy too much space? I really need guidance on that.
I have used both data readers and data adapters (with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly but I want to be sure I am developing good habits.
As the name might suggest, it seems like a datareader is for only reading data. I have read that the data adapter and dataset are for a disconnected architecture. Or, that they can be used for this type of set up. I have been using the data adapter and datasets when writing to a database and the datareader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?
Thanks in advance.
We have already integrated different clients' data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
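For illustration only: that message (error 8152) is raised when an INSERT or UPDATE pushes a value into a column declared smaller than the data, so something between the application and the ntext column - a trigger, an audit table, a replication article - may be defined as nvarchar(4000) or similar. A hypothetical sketch (the table names are placeholders, not from the original post):

-- A hypothetical audit table capped at 4000 characters
CREATE TABLE NoteAudit (NoteID int, Body nvarchar(4000))

-- Inserting a 4001-character value raises: string or binary data would be truncated
INSERT INTO NoteAudit (NoteID, Body)
VALUES (1, REPLICATE(CAST(N'x' AS nvarchar(max)), 4001))

-- Widening the column to nvarchar(max) (or fixing whatever trigger/table has the small column) removes the cap
ALTER TABLE NoteAudit ALTER COLUMN Body nvarchar(max)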
I have a requirement to implement CDC for 50+ tables to capture incremental data changes for warehousing/reporting, rather than exporting the whole table data. The largest table has more than half a billion records.
The warehouse uses a daily copy of the OLTP DB (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on OLTP db?
Can we make use of the CDC tables on the environment we do daily db refresh so that the queries don't hit OLTP database?
What is the best way to implement CDC to take incremental changes for reporting?
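For reference, enabling CDC is a per-database and then per-table operation, and the generated change functions are what the warehouse load would read instead of the full tables. A minimal sketch, assuming a hypothetical dbo.Orders table (the database, schema and table names are placeholders):

-- Enable CDC at the database level (requires sysadmin)
USE SalesOLTP
EXEC sys.sp_cdc_enable_db

-- Enable CDC for one table; repeat for each of the 50+ tables
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'Orders',
    @role_name = NULL,              -- no gating role
    @supports_net_changes = 1       -- requires a primary key or unique index

-- Incremental changes are then read between two LSNs from the generated function, e.g.
-- SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all')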
Hi, this is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
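Not knowing the exact schema, here is a hedged sketch of getting readable text back out of an IMAGE column in SQL Server 2000, reusing the placeholder table and column names from the snippet above. IMAGE cannot be cast straight to character data, but going through varbinary works for values up to 8000 bytes:

-- Cast image -> varbinary -> varchar (each step is limited to 8000 bytes)
SELECT CAST(CAST(BITS_data AS varbinary(8000)) AS varchar(8000)) AS NoteText
FROM mytable_BINARY
WHERE ID = 'RB215'

-- For longer values, SUBSTRING can pull the column out in 8000-byte chunks
SELECT CAST(SUBSTRING(BITS_data, 1, 8000) AS varchar(8000)) AS FirstChunk
FROM mytable_BINARY
WHERE ID = 'RB215'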
I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found that it is missing some data during the transmission. Please see the screenshot below:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to Data Flow -> Excel connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns. It seems that somehow the 3 columns are not read in from the source file, and also Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?
When I execute the below stored procedure I get the error that "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25 Arithmetic overflow error converting expression to data type int. The statement has been terminated. (1 row(s) affected)
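The procedure body itself is elided above, but as a general illustration: error 8115 converting to int usually means an expression (often a SUM or a multiplication over int columns) has exceeded the int range of roughly 2.1 billion, and widening the operand to bigint before the arithmetic avoids it. A hypothetical sketch (the FileTransfer table and Bytes column are placeholders, not from xlaAFSsp_reports):

-- SUM over an int column is computed as int and overflows past 2,147,483,647
CREATE TABLE FileTransfer (TransferID int IDENTITY PRIMARY KEY, Bytes int NOT NULL)

-- Fails with error 8115 once the running total passes the int limit
SELECT SUM(Bytes) AS TotalBytes FROM FileTransfer

-- Casting the operand first makes the aggregate compute in bigint
SELECT SUM(CAST(Bytes AS bigint)) AS TotalBytes FROM FileTransfer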