I am running merge replication (SQL 2000 with SP2) with an anonymous pull subscription. The application vendor has come out with an update that requires adding a table to the database. The vendor has created scripts that will add the table, as well as some stored procedures. If I apply the scripts to both servers and add the table as a new article to the publication, am I going to have to apply a snapshot of the entire database (which is very large)?
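For reference, the plan I have in mind is roughly this (publication and table names are hypothetical):

    -- run the vendor's scripts on both servers first, then publish the new table
    EXEC sp_addmergearticle
        @publication   = N'MyMergePub',
        @article       = N'NewVendorTable',
        @source_object = N'NewVendorTable',
        @source_owner  = N'dbo';
    -- then run the Snapshot Agent; the question is whether the subscriber
    -- pulls just this article's snapshot or the whole database again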
I added a new table to a database with an existing publication using T-SQL, then opened the publication properties and ticked the new table/article so that it would be added at the subscriber. It did not show up.
I did not use a snapshot to initialize the subscription.
immediate_sync is 0; allow_anonymous is 0.
I marked the subscription to be reinitialized, but when I start the Snapshot Agent I get '0% A snapshot was not generated because no subscriptions needed initialization'.
What could I be doing wrong, or have missed? Do I need to drop and re-create the subscription to get the article to show up?
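One thing I am wondering: since immediate_sync is 0, perhaps the new article needs an explicit subscription entry before the Snapshot Agent will consider anything out of date. A sketch of what I mean (publication, subscriber, and table names are hypothetical):

    -- subscribe the existing subscriber to just the new article
    EXEC sp_addsubscription
        @publication    = N'MyPub',
        @article        = N'NewTable',
        @subscriber     = N'SubServer',
        @destination_db = N'SubDB';
    -- with immediate_sync = 0, the Snapshot Agent should then generate
    -- a snapshot covering only the new article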
First of all: my Oracle publication works fine when I don't explicitly specify the schema_option parameter for the articles I'm adding to the publication. The reason I want to specify the parameter explicitly is as follows.
I'm developing a replication solution to get data from our production server (Oracle) into our Data Warehouse (SQL Server). The SQL Server (and the Data Warehouse code) uses the SQL_Latin1_General_CP1_CI_AS collation. When I don't explicitly specify the schema_option, the nvarchar columns of the replicated tables are created with the SQL_Latin1_General_CP1_CS_AS collation, and this results in comparison errors when, for instance, a SELECT statement tries to compare two nvarchar strings that use different collations.
I've tried specifying the schema_option parameter as "@schema_option = 0x80" (replicates primary key constraints) to avoid the SQL_Latin1_General_CP1_CS_AS collation when the destination tables are created, though I'm not sure that's enough. In any case, I get an error when I do it (see below).
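The call I'm making looks roughly like this (publication and owner names changed):

    EXEC sp_addarticle
        @publication   = N'OraclePub',
        @article       = N'ITEMTRANSLATION',
        @source_object = N'ITEMTRANSLATION',
        @source_owner  = N'SCOTT',          -- placeholder for the Oracle schema
        @schema_option = 0x80;              -- primary key constraints only

Reading the documented bit values, I do wonder whether 0x80 on its own is the problem: it does not include 0x01, the bit that generates the CREATE TABLE script, which might explain the 'Invalid object name' error below (the table is never created, so the bulk copy has nothing to load into). Something like 0x91 (0x01 creation script + 0x10 clustered index + 0x80 primary keys), deliberately leaving out 0x1000 (replicate column-level collation), may be closer to what I need.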
Message
2006-07-13 12:00:15.529 Applied script 'ITEMTRANSLATION_2.sch'
2006-07-13 12:00:15.544 Bulk copying data into table 'ITEMTRANSLATION'
2006-07-13 12:00:15.544 Agent message code 20037. The process could not bulk copy into table '"ITEMTRANSLATION"'.
2006-07-13 12:00:15.591 Category: NULL  Source: Microsoft SQL Native Client  Number: 208  Message: Invalid object name 'ITEMTRANSLATION'.
2006-07-13 12:00:15.591 Category: NULL  Source:  Number: 20253
The questions now are: is there actually a schema_option alternative for Oracle publishing? If so, what is the solution, and how can I avoid the error stated above?
If I'm not able to prevent the article columns from being created with the "wrong" collation, is there any other obvious solution to the problem?
Hi everyone, I am new to SQL Server 2005. I set up SQL Server 2005 peer-to-peer replication, but somehow replication stopped working in one of the two directions. I tried to delete the publication and could not; both servers have the same problem. When I try to delete the publication, I get: the publication " " does not exist. [SQL Server error: 20026]. I tried sp_droppublication, and it gave me the error "the database is not enabled for publication". Nevertheless, I can see the publication in SQL Server Management Studio and in Replication Monitor with an OK status. I could not find the distribution database either.
Does anyone have ideas on how to delete this publication? I am sorry, I am not a programmer, so please give me as detailed an explanation as you can. Thanks.
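For anyone who hits the same wall: a commonly suggested last resort, assuming the metadata really is orphaned, is to force-remove all replication settings from the publication database. Be aware that this removes every publication in that database, so treat the following as a sketch (the database name is a placeholder), not a recommendation:

    -- run on the publisher; wipes all replication metadata in that database
    EXEC sp_removedbreplication N'MyDatabase';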
MS has posted a KB article on how to post a question on an online forum...
http://support.microsoft.com/kb/q555375
At first I thought this was hilarious, but there is actually some good information in here that is applicable to this forum as well as anywhere else. Perhaps Brett could add this as a sticky?
Dear all, I'm in a transactional replication environment on SQL Server 2005, and we need to add one new table to the publisher. Please explain the steps to me.
Also, in another table, I need to change the data type of a column.
Please guide me.
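A minimal sketch of the usual steps for a transactional publication, with hypothetical names:

    -- 1. add the new table as an article
    EXEC sp_addarticle
        @publication   = N'MyPub',
        @article       = N'NewTable',
        @source_owner  = N'dbo',
        @source_object = N'NewTable';

    -- 2. subscribe the existing subscriber to the new article
    EXEC sp_addsubscription
        @publication    = N'MyPub',
        @article        = N'NewTable',
        @subscriber     = N'SubServer',
        @destination_db = N'SubDB';

    -- 3. start the Snapshot Agent to deliver the new article

As for the data-type change, SQL Server 2005 replicates ALTER TABLE ... ALTER COLUMN to subscribers automatically when the publication was created with @replicate_ddl = 1 (the default), so a plain ALTER on the publisher may be all that's needed.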
Arnav
Even you learn 1%, learn it with 100% confidence.
If I want to unpublish an article through Enterprise Manager, it does not allow me to uncheck it in the publication properties. It only allows me to uncheck articles if I delete the subscription first. Does anybody know how to do this in Enterprise Manager without deleting the subscription?
So I execute sp_dropsubscription and sp_droparticle for the table, for every publication in the database that contains that article.
Usually I just alter the table and then execute sp_addarticle and sp_addsubscription afterwards, and everything is cool.
However, in this case, after I drop the subscriptions for the article, when I try to alter the table it says it is being replicated.
I query sysobjects and see that the table has replinfo = 1, which is snapshot replication, but I only have transactional replication? I used to have snapshot replication, but that was dropped long ago.
I then query sysarticles and find the table; however, the pubid (publication ID) for the table is equal to the publication ID of another database's publication??? I have also found articles in sysarticles with publication IDs belonging to publications that do not exist; for example, an article will have a pubid of 2, but if I run sp_helppublication in every user database there is no publication with an ID of 2.
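For example, this is the kind of sanity check I'm running in the publication database to spot the orphans (SQL 2000 system tables):

    -- articles whose pubid has no matching publication in this database
    SELECT a.name AS article, a.pubid
    FROM   sysarticles a
    LEFT   JOIN syspublications p ON p.pubid = a.pubid
    WHERE  p.pubid IS NULL;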
Please help. Even after I drop all subscriptions to an article, it seems that SQL Server thinks the article belongs to another database's publication, or to publications that no longer exist.
Maybe there is some sort of cleanup stored procedure I can run, but it seems to me SQL Server has gotten articles confused with old, deleted publications that no longer exist.
Therefore, even after dropping all subscriptions, I still cannot alter the tables, because SQL Server thinks they are still being published when they are not!
I need to filter an article based on a user-supplied datetime filter (the datetime parameter is specified by the subscriber just before replication). At the same time, I need to filter again by user (different subscribers get different rows).
I already did the user-based filter using HOST_NAME(). But the difficulty here (at least I think so) lies in passing two parameters to the filter. I cannot rely on SUSER_SNAME() to pass the user filter, because no one will want to create 500 user accounts. So I guess the only solution here is to pass both parameters through HOST_NAME() and then write two splitting functions that take HOST_NAME() as their parameter. Am I right?
The publisher/distributor is SQL Server 2005; all subscribers use SQL Mobile.
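A sketch of the splitting approach, assuming each subscriber overrides HOST_NAME() with a string like 'user42|2006-07-13T00:00:00' (the format and function names are my own invention):

    CREATE FUNCTION dbo.fn_HostUser (@host nvarchar(128))
    RETURNS nvarchar(128)
    AS
    BEGIN
        -- everything before the '|' is the user id
        RETURN LEFT(@host, CHARINDEX('|', @host) - 1);
    END;
    GO
    CREATE FUNCTION dbo.fn_HostDate (@host nvarchar(128))
    RETURNS datetime
    AS
    BEGIN
        -- everything after the '|' is the ISO datetime
        RETURN CAST(SUBSTRING(@host, CHARINDEX('|', @host) + 1, 30) AS datetime);
    END;
    GO
    -- the subset filter clause would then look something like:
    -- UserID = dbo.fn_HostUser(HOST_NAME()) AND ModifiedDate >= dbo.fn_HostDate(HOST_NAME())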
I'm setting up transactional replication between SQL Server 2000 and SQL Server 2005. I have published tables, views, and stored procedures as articles. When I modify a published stored procedure, the changes are not replicated.
When I reinitialize the subscription and start the Snapshot Agent, the changes are copied. But all articles are reinitialized as well, so it takes a huge amount of time.
Is there any way to reinitialize only the changed article? Or is there any workaround for this problem?
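One thing that might be worth trying: sp_reinitsubscription takes an @article parameter (it defaults to 'all'), so in principle a single article can be marked for reinitialization. A sketch with hypothetical names:

    EXEC sp_reinitsubscription
        @publication    = N'MyPub',
        @article        = N'MyStoredProc',
        @subscriber     = N'SubServer',
        @destination_db = N'SubDB';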
The chapter from my book that talks about upgrading packages from DTS to SSIS has been posted here: http://www.quepublishing.com/articles/article.asp?p=605035&rl=1
Didn't even know it until I happened to be doing a search on upgrading and the article popped up in the search...
The MSDN article explains how to insert new data into a database and return the primary key of the inserted row. It works perfectly. However, there is a problem with how it inserts a value: it sets the value to be submitted in the VB code instead of taking the value from the user's input in the textbox. Look for the following line in the code below: newRow("ClientFileNumber") = "ClientFileNumber". It inserts the literal text "ClientFileNumber" instead of what was actually entered in the textbox. Does anyone know how to get around this, so the words "ClientFileNumber" are not inserted?

Dim sqlconn As SqlConnection = New SqlConnection("Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Client.mdf;Integrated Security=True;User Instance=True")
Dim catDA As SqlDataAdapter = New SqlDataAdapter("SELECT * FROM Orders", sqlconn)

catDA.InsertCommand = New SqlCommand("JobInsert", sqlconn)
catDA.InsertCommand.CommandType = CommandType.StoredProcedure
catDA.InsertCommand.Parameters.Add("@ClientFileNumber", SqlDbType.VarChar, 50, "ClientFileNumber")

Dim NewJobNumber As SqlParameter = catDA.InsertCommand.Parameters.Add("@Identity", SqlDbType.Int, 0, "OrdersID")
NewJobNumber.Direction = ParameterDirection.Output

sqlconn.Open()
Dim catDS As DataSet = New DataSet()
catDA.Fill(catDS, "Orders")

Dim newRow As DataRow = catDS.Tables("Orders").NewRow()
newRow("ClientFileNumber") = ""
catDS.Tables("Orders").Rows.Add(newRow)
catDA.Update(catDS, "Orders")
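The fix should simply be to assign the textbox's Text property instead of a literal or empty string. Assuming the textbox is named txtClientFileNumber (a hypothetical name; use whatever the control is actually called):

    ' take the value the user typed, rather than a hard-coded literal
    newRow("ClientFileNumber") = txtClientFileNumber.Text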
Hi, I'm building a straightforward-ish news site. Currently, the story details (e.g. headline, description, author, date) are stored in a SQL DB. The news is categorised, and the category details are also stored in a SQL table.

However, what is the best way to store the actual news article? At the moment, when the user enters a story, they input the details and the article text. The details are saved in the DB, and the text is saved as an XML file whose filename corresponds to the article's DB primary key. When the story is "read", the app pulls the details from the DB and loads the appropriate XML file from disk.

For various reasons, I need to keep the article stored as XML, either in a file or in the DB. The DB provides faster sorting and retrieval, but I don't want to store large amounts of data (i.e. the article itself) in the DB if avoidable. I guess there are a few ways to do it:

1. Store details in the DB and the article as an XML file.
2. Store details and article in the same DB table.
3. Store details in one DB table and articles in another (see the sketch below).

I would imagine that 3 would be the best, but would there be a performance hit? What is the maximum size of field (i.e. article size) I can have in a table?

Cheers
Graham Wilson.
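To frame the question, here is a sketch of what I mean by option 3 (all names invented; on SQL 2000 the article column would be ntext, which can hold up to about 2 GB):

    CREATE TABLE StoryDetails (
        StoryID    int IDENTITY PRIMARY KEY,
        Headline   nvarchar(200),
        Author     nvarchar(100),
        Published  datetime,
        CategoryID int
    );

    CREATE TABLE StoryText (
        StoryID    int PRIMARY KEY REFERENCES StoryDetails (StoryID),
        ArticleXml ntext   -- the XML document itself
    );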
We're replicating columns of a table from a 7.0 server to a SQL2K server. Once we had the data on SQL2K, we were able to take advantage of DTS functionality that was not available in version 7.0.
I recently added a column to the article and changed the DTS package accordingly (transformation columns). However, I'm getting this error even though I added the new column in the DTS package.
-------
The number of columns in the bcp file does not match what is defined in the DTS package. Regenerate the package.
-------
I've recreated the package with no luck. This is becoming urgent, as it's supposed to go into production soon.
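In case it matters: as I understand it, the supported way to add a column to a published table in SQL 2000 is sp_repladdcolumn, which adds the column to the table and to the publication in one step. A sketch with hypothetical names:

    EXEC sp_repladdcolumn
        @source_object      = N'MyTable',
        @column             = N'NewColumn',
        @typetext           = N'int NULL',
        @publication_to_add = N'MyPublication';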
I have a server where replication is set up between two different databases, and it is currently running. I want to add a couple of tables to the replication. I tried using sp_addarticle; after executing it, the publication properties show the new tables, but at the database level the tables are missing.
I tried sp_addsubscription, but I am getting a strange error:
Server: Msg 14100, Level 16, State 1, Procedure sp_addsubscription, Line 240
Specify all articles when subscribing to a publication using concurrent snapshot processing.
What can I do to publish the tables into the target database?
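Given the wording of the error, the workaround may simply be to subscribe to all articles at once rather than to a single article. A sketch with hypothetical names:

    EXEC sp_addsubscription
        @publication    = N'MyPub',
        @article        = N'all',        -- what the error message asks for
        @subscriber     = N'SubServer',
        @destination_db = N'TargetDB';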
I posted a link to a prior article in here, the one about high-performance hierarchies, and have the first two parts of a new series. Hopefully this is of value to someone.

http://www.yafla.com/papers/SQL_Ser..._sql_server.htm

Thanks.
I've uploaded a new version of my article on Dynamic Search Conditions to http://www.sommarskog.se/dyn-search.html. I've revised the article to cover SQL 2005, and made a general overhaul of the content. There was a *very* embarrassing error that I've corrected.

I've also added a new, interesting method for static SQL. I've found that if you say:

SELECT ...
FROM   tbl
WHERE  (key1 = @key1 AND @key1 IS NOT NULL)
   OR  (key2 = @key2 AND @key2 IS NOT NULL)
   OR  (key3 = @key3 AND @key3 IS NOT NULL)

this will use indexes if all columns are indexed, and furthermore SQL Server will decide at run time which index(es) to access. The article includes a trick where you can combine this with the normal conditions for dynamic searches, for very good performance under some circumstances.

I also cover the new OPTION (RECOMPILE) to force statement recompile. I was hoping that it could lead to just as good query plans as dynamic SQL, but it's a far cry from that.

--
Erland Sommarskog, SQL Server MVP

Books Online for SQL Server 2005 at http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at http://www.microsoft.com/sql/prodin...ions/books.mspx
In our replication environment, the subscriber is initially set up with a snapshot of the publisher database. After that, however, the subscriber and publisher differ, and we can never re-initialize from a snapshot again (we purge data on the publisher to reduce the database size but do not purge the same data on the subscriber; we do this by stubbing out, on the subscriber, the stored procedures that purge data on the publisher).
If an article is added to or dropped from the publication, then using snapshot and synchronize, just these changes are propagated to the subscriber (without an entire new snapshot).
However, if an article property is changed (e.g. changing SCALL to MCALL under the Statement Delivery options for the UPDATE statement), the interface REQUIRES an entire new snapshot. Is there any way I can avoid the new snapshot? It overwrites the subscriber database, and that cannot happen!
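One avenue that might avoid the interface's behaviour is scripting the change instead: sp_changearticle distinguishes between invalidating the snapshot and reinitializing subscriptions, so something like the following (names hypothetical) should regenerate the snapshot without overwriting the subscriber:

    EXEC sp_changearticle
        @publication               = N'MyPub',
        @article                   = N'MyTable',
        @property                  = N'upd_cmd',
        @value                     = N'MCALL sp_MSupd_MyTable',
        @force_invalidate_snapshot = 1,  -- the property change invalidates the snapshot
        @force_reinit_subscription = 0;  -- but do not reinitialize (overwrite) subscribers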
I have set up transactional replication in SQL 2005 between two servers, with about 200 tables being replicated. The problem is that every time I add a table to or drop a table from replication and start the Snapshot Agent, it re-initializes and re-delivers every article. This process takes an hour to complete, and CPU usage goes to 100% during that time.
This behaviour seems very different from SQL 2000, where I would start the Snapshot Agent and only the relevant tables were added or dropped.
Has that functionality changed from 2000 to 2005? Am I not doing something right? Thanks, Amir
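In case anyone else hits this: the usual explanation is that the SQL 2005 publication was created with immediate_sync (and allow_anonymous) set to true, which makes the Snapshot Agent produce a full snapshot of every article each time it runs. If that is the cause here, something like this (publication name hypothetical) should make the agent process only the new or changed articles:

    -- allow_anonymous must be turned off before immediate_sync can be
    EXEC sp_changepublication @publication = N'MyPub',
         @property = N'allow_anonymous', @value = 'false';
    EXEC sp_changepublication @publication = N'MyPub',
         @property = N'immediate_sync',  @value = 'false';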