Deadlock: Could Not Perform Retention-based Meta Data Cleanup

Sep 8, 2006

Hi SQL Replication Gurus:

I've got some issues in my production environment, so please help me out. The following is the message I got from the Replication Monitor, and I don't know what to do at this point.

Appreciate your help.

Yong

==========================================================================================

Command attempted:


{call sp_mergemetadataretentioncleanup(?, ?, ?)}

Error messages:


The merge process could not perform retention-based meta data cleanup in database 'TT'. (Source: Merge Replication Provider, Error number: -2147199467)
Get help: http://help/-2147199467

Transaction (Process ID 73) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction. (Source: ply-db-svr1, Error number: 1205)
Get help: http://help/1205
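(For reference: error 1205 is a transient deadlock, so one option is simply to let the merge agent retry, or to run the cleanup manually during a quiet window in the database named in the error ('TT' here). A minimal sketch, using the documented parameters of the procedure shown in the failed command above:)

-- Run in database 'TT' (from the error text) while merge and user activity is low.
DECLARE @genhistory_rows int, @contents_rows int, @tombstone_rows int;

EXEC sp_mergemetadataretentioncleanup
     @num_genhistory_rows = @genhistory_rows OUTPUT,
     @num_contents_rows   = @contents_rows   OUTPUT,
     @num_tombstone_rows  = @tombstone_rows  OUTPUT;

SELECT @genhistory_rows AS genhistory_rows,
       @contents_rows   AS contents_rows,
       @tombstone_rows  AS tombstone_rows;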

View 8 Replies



Retention-based Meta Data Cleanup

Apr 18, 2005

Hi

I am running a couple of SQL 2000 SP3a servers with merge and snapshot replication, one server acting as publisher and distributor and the rest subscribers. On one of the servers I got the error below and have tried most of the suggestions on MSDN. This server has never crashed before and has no hardware problems. It has been running for a couple of months with no problems, and this has not happened on any of the other servers. Any suggestions would be greatly appreciated, as the only resolution I have left is to bring up a new instance, set up replication, and see if that resolves the issue. Stopping and starting the agents doesn't work.


[4/18/2005 11:59:20 AM]BRAAMPDC1ICAS2000.HO_Master: {call sp_MSgetversion }
[4/18/2005 11:59:20 AM]BRAAMPDC1ICAS2000.distribution: {call sp_MShelp_subscriber_info (N'BRAAMPDC1ICAS2000', N'EASTSRV3')}
Connecting to Subscriber 'EASTSRV3.ICASData'

Server: EASTSRV3
DBMS: Microsoft SQL Server
Version: 08.00.0760
user name: dbo
API conformance: 2
SQL conformance: 1
transaction capable: 2
read only: N
identifier quote char: "
non_nullable_columns: 1
owner usage: 31
max table name len: 128
max column name len: 128
need long data len: Y
max columns in table: 1024
max columns in index: 16
max char literal len: 524288
max statement len: 524288
max row size: 524288

[4/18/2005 11:59:27 AM]EASTSRV3.ICASData: {call sp_MSgetversion }
Percent Complete: 2
Connecting to Subscriber 'EASTSRV3'
Percent Complete: 3
Retrieving publication information
Percent Complete: 4
Retrieving subscription information
Percent Complete: 4
The merge process is cleaning up meta data in database 'HO_Master'.
Percent Complete: 4
The merge process cleaned up 0 row(s) in MSmerge_genhistory, 0 row(s) in MSmerge_contents, and 0 row(s) in MSmerge_tombstone.
Percent Complete: 4
The merge process is cleaning up meta data in database 'ICASData'.
The merge process could not perform retention-based meta data cleanup in database 'ICASData'.
Percent Complete: 0
The merge process could not perform retention-based meta data cleanup in database 'ICASData'.
Percent Complete: 0
Category:NULL
Source: Merge Replication Provider
Number: -2147199467
Message: The merge process could not perform retention-based meta data cleanup in database 'ICASData'.
Percent Complete: 0
Category:COMMAND
Source: Failed Command
Number: 0
Message: {call sp_mergemetadataretentioncleanup(?, ?, ?)}
Percent Complete: 0
Category:SQLSERVER
Source: EASTSRV3
Number: 11
Message: General network error. Check your network documentation.

View 1 Replies View Related

Increasing Publication Retention: How To Avoid Missing Metadata After Cleanup

Mar 29, 2007

I currently use 7 days for the subscription expiration setting on my two merge publications, which causes metadata to be cleaned up every 7 days. Now I need to increase the retention period to 14 days. How can I avoid missing metadata after cleanup? Microsoft article ms151188 (http://msdn2.microsoft.com/en-us/library/ms151188.aspx) warns that the publisher may not have enough metadata, which may lead to non-convergence. I want to change this setting without causing any data loss.



Thanks much,
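(For reference, a minimal sketch of changing retention on a merge publication; the publication name 'MyMergePub' is hypothetical. Run at the Publisher in the publication database, ideally while all subscribers have recently synchronized, and repeat for the second publication:)

EXEC sp_changemergepublication
     @publication = N'MyMergePub',   -- hypothetical publication name
     @property    = N'retention',
     @value       = N'14';           -- days (the default retention unit)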

View 1 Replies View Related

Recovery :: How To Perform Server Database Backup Retention Periods

Sep 22, 2015

Following best practice, I take full SQL Server database, differential, and transaction log backups. I have set up a process to back up to local disks and then also copy the files to a centralized set of storage. On a weekly basis the centralized file system is backed up to a tape backup device. I know I can get data off of the tapes, but that process is time consuming, not well tested from my perspective, and I am not in control of the overall process. Can you offer some recommendations from a SQL Server backup retention perspective?

View 6 Replies View Related

How To Catch An Error Raised In The Foreach Loop Container And Perform Cleanup Jobs Accordingly?

Nov 29, 2007

Hello everyone,

I have a package that should accomplish the following task:
- Loop on all files in a given directory.
- Before processing the current file in the Enumerator, it should be moved to a folder called "importing"
- Load the content of the file into a destination DB.
- After a successful load, the file is moved from the "importing" to the "success" folder.
- If anything went wrong during the load process, the file should be moved from the "importing" to the "error" folder and an e-mail should be sent out to a certain admin account.

What I have done so far:
My control flow looks like this:
Foreach Loop containing the following tasks:
- File System Task: moving the current file from its original path to the "importing" folder
- Data Flow Task1: performing some transformations and insertions into the DB and raw files.
- Data Flow Task2: performing some transformations and insertions into the DB and raw files.
- Data Flow Task3: performing some transformations and final insertions into the DB.
- File System Task: moving the current file from the "importing" to the "success" folder

Question:
I do not know how to catch the event that one of my three Data Flow Tasks has failed and in this case perform two simple tasks, namely...
- File System Task: moving the current file from the "importing" to the "error" folder
- Send Mail Task: sending a configured e-mail message to a certain administrator.

Thanks in advance...

Regards,
Samar

View 8 Replies View Related

SQL Server 2008 :: Perform Checksum Based On The Source Table Data?

Aug 10, 2015

I'm trying to load data from an old SQL Server 2000 instance to a new SQL Server 2014 instance. I need to do a checksum to check whether all the source data is loaded in the target database (SQL Server 2014). I've created the insert statement for this, which works; now I need to use a checksum to make sure all the source rows are loaded in the target table. I haven't done checksums before.

Here is my insert statement:

INSERT INTO [Test].[dbo].[Order_tab]
([rec_id]
,[date_loaded]
,[Name1]
,[Name2]
,[Address1]
,[Address2]

[code]....
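(A minimal sketch of a checksum comparison, assuming the column names from the INSERT above; run the same statement against the source table on SQL 2000 and the target table on SQL 2014 and compare the results. Checksums can collide, so pair them with a row count; this is a verification aid, not a proof of identical data:)

SELECT COUNT(*) AS row_cnt,
       -- date_loaded is excluded on the assumption it is set at load time; adjust as needed
       CHECKSUM_AGG(BINARY_CHECKSUM(rec_id, Name1, Name2, Address1, Address2)) AS data_checksum
FROM [Test].[dbo].[Order_tab];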

View 2 Replies View Related

Backup Data Retention Time?

Nov 12, 2007

I have just started in the scary world of SQL Server admin and am trying to unravel the mysteries of backups etc.
If I run 'BACKUP DATABASE xxx TO DISK = 'D:\DB_Backups\xxx.bak' WITH RETAINDAYS = 7' each day, each db backup is appended to the same '.bak' file and the RETAINDAYS protects the backup from being deleted by SQL Server. OK so far. But does anyone understand what criteria are used to decide when to overwrite the older backups? My backup file is getting bigger every day, with no sign of any of the old data being deleted! Do I have to wait for the entire disk to become full before they start to get overwritten? Or should I just not worry and trust that it will do it all correctly?
Any ideas would be much appreciated.
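(One point that often explains this: RETAINDAYS only controls whether a backup set may be overwritten when you back up WITH INIT; with the default NOINIT, every backup is appended and SQL Server never deletes the old sets, so the file just grows until you reinitialize it or manage the files yourself. A sketch of the two forms:)

-- Default behaviour (NOINIT): appends a new backup set every day; nothing is ever removed.
BACKUP DATABASE xxx TO DISK = 'D:\DB_Backups\xxx.bak' WITH NOINIT, RETAINDAYS = 7

-- WITH INIT: overwrites the existing sets, but only if none of them is still within
-- its RETAINDAYS window; otherwise the backup errors out (unless SKIP is also specified).
BACKUP DATABASE xxx TO DISK = 'D:\DB_Backups\xxx.bak' WITH INIT, RETAINDAYS = 7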

View 5 Replies View Related

Pipeline Error-excel Source-data Reader Does Not Read In Meta Data

Apr 16, 2008

Hi all, I got this error:


[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

and also this:

[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.


I tried going data flow->excel connection->advanced editor for excel source-> input and output properties and tried to refresh the columns affected.
It seems that somehow the 3 columns are not read in from the source file?
And also, fiscal year and fiscal week are not set up properly in my data destination?
Has anyone faced such errors before?

Thanks

View 13 Replies View Related

Meta Data

Jun 28, 2003

I am trying to get a grasp of Meta Data. Why is this important to application development?

View 7 Replies View Related

Meta-data

May 7, 2007

How can we extract metadata for database tables?

Do we have any SQL for that?
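(A minimal sketch using the INFORMATION_SCHEMA views; sp_help 'tablename' is another quick option for a single table:)

SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE,
       CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION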

View 2 Replies View Related

SQL 2012 :: Backup Cleanup Of Differential And Log Backups Based On Full Backup?

Feb 19, 2015

Using Ola Hallengren's scripts I do a full backup of a database on a Sunday. Then differential backups every 6 hours and log backups every hour. I would like to keep a full week of backups based off the full backup done on Sunday. Is there a way for me to clear out the diff and log folders after the successful full backup on Sunday nights?
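(One hedged option, assuming the differential and log backups are also taken with Ola Hallengren's DatabaseBackup procedure: set @CleanupTime to 168 hours on those jobs so files older than a week are removed after each successful backup. This keeps a rolling week rather than purging immediately after the Sunday full; an immediate purge would need an extra file-deletion step in the Sunday job. A sketch for the differential job; the directory is a placeholder:)

EXECUTE dbo.DatabaseBackup
    @Databases   = 'USER_DATABASES',
    @Directory   = N'D:\Backup',   -- hypothetical backup root
    @BackupType  = 'DIFF',
    @Verify      = 'Y',
    @CleanupTime = 168             -- hours: keep roughly one week of files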

View 2 Replies View Related

Meta Data Help Needed :(

Aug 1, 2007

Hi Guys,
I have a DataBase in which I have several Tables.
What I want is an SP or Query which takes as its parameter the "tablename".
The output should have three fields only:
Field name, DataType Of the Field, Length of the DataType.
For Example
Suppose the StoredProcedure Name is "SP_GetTables" 
if i have a table named "tbl_Users" with fields
UserName varchar(50)
UserPass varchar(20)
UserAge int
UserStatus bit
On the program side, if I pass the parameter "tbl_Users" to the stored procedure SP_GetTables,
I should get the O/P as
Field Name    DataType  Length
UserName      varchar      50
UserPass       varchar      20
UserAge         int
UserStatus     bit
Regards,
Naveen.
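(A minimal sketch of such a procedure, built on INFORMATION_SCHEMA.COLUMNS; CHARACTER_MAXIMUM_LENGTH comes back NULL for types like int and bit, which matches the blank Length column above:)

CREATE PROCEDURE SP_GetTables
    @tablename sysname
AS
BEGIN
    SELECT COLUMN_NAME              AS [Field Name],
           DATA_TYPE                AS [DataType],
           CHARACTER_MAXIMUM_LENGTH AS [Length]
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = @tablename
    ORDER BY ORDINAL_POSITION
END
GO

EXEC SP_GetTables 'tbl_Users'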
 

View 3 Replies View Related

Gather Meta Data

Jul 23, 2005

Hi,

I would like to prepare a data dictionary for my database (Northwind). I have framed the SQL below:

SELECT
'NAME ' = a.name,
'DESCRIPTION' = b.value,
'Type ' = type_name(a.xusertype),
' ' AS 'Values',
'NULL ' = case when a.isnullable = 0 then ' ' else 'X' end,
' ' AS 'PK',
' ' AS 'FK'
FROM
syscolumns a,
sysproperties b
WHERE
a.id = 2073058421 AND --- Customers Table
a.number = 0 AND
b.id = a.id AND
b.smallid = a.colid
ORDER BY a.colid

and the output would be:

NAME           DESCRIPTION     Type       Values NULL  PK   FK
-------------- --------------- ---------- ------ ----- ---- ----
CustomerID     Customer ID     nchar
CompanyName    Company Name    nvarchar
ContactName    Contact Name    nvarchar          X
ContactTitle   Contact Title   nvarchar          X
Address        Address         nvarchar          X
City           City            nvarchar          X
Region         Region          nvarchar          X
PostalCode     Postal Code     nvarchar          X
Country        Country         nvarchar          X
Phone          Phone #         nvarchar          X
Fax            Fax #           nvarchar          X

PK and FK is where I need to print whether the column is a Primary Key or Foreign Key. If CustomerID is defined as a Primary Key then PK should have X printed. That's the objective. How will I accomplish this?

Thanks in advance,
Anu
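(A hedged sketch for the PK/FK columns: the INFORMATION_SCHEMA constraint views work on SQL 2000 and the result can be joined back to the query above by column name. 'Customers' stands in for whichever table the data dictionary row belongs to:)

SELECT kcu.COLUMN_NAME,
       MAX(CASE WHEN tc.CONSTRAINT_TYPE = 'PRIMARY KEY' THEN 'X' ELSE '' END) AS PK,
       MAX(CASE WHEN tc.CONSTRAINT_TYPE = 'FOREIGN KEY' THEN 'X' ELSE '' END) AS FK
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu
JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc
  ON tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME
 AND tc.TABLE_NAME      = kcu.TABLE_NAME
WHERE kcu.TABLE_NAME = 'Customers'
GROUP BY kcu.COLUMN_NAME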

View 3 Replies View Related

Meta Data Catalogue

Aug 6, 2005

I am wanting to set up some kind of metadata catalogue to manage metadata records of the data we collect and create for my company's clients. I am thinking I want to do this using XML and SQL Server and have some type of web-based browser to search for records. Any suggestions on existing applications or resources that I could use? Thanks

View 1 Replies View Related

Expose SQL Meta Data Via ASP

Jul 20, 2005

Hi,

Apologies if this is better posted in an ASP group, but here goes anyway ...

Is it possible to work out what parameters a stored procedure expects, using ASP?

I would like to take the name of a stored procedure, work out what input parameters it has, and build a form based on them in ASP.

Thanks,
MB.
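(One hedged approach: have the ASP page query INFORMATION_SCHEMA.PARAMETERS for the procedure name and build the form from the result; classic ADO's Command.Parameters.Refresh is another route. The procedure name below is a placeholder:)

SELECT PARAMETER_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, PARAMETER_MODE
FROM INFORMATION_SCHEMA.PARAMETERS
WHERE SPECIFIC_NAME = 'usp_MyProcedure'   -- placeholder procedure name
ORDER BY ORDINAL_POSITION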

View 2 Replies View Related

AdHoc Reporting Against SQL 2K W/meta Data?

Feb 6, 2004

Okay... We have a SQL2K database that has about 500 tables or so. It is normalized to a reasonable level and enforces all relationships with PK/FKs, not triggers. Hence, for a database-minded person it is fairly easy to read (as easy as a 500+ table database can be!).

Our users need adhoc query capabilities. Our report writer is simply overwhelmed. He doesn't need to be spending time writing a report that is intended to be run once.

I expect the best alternative would be to use some sort of adhoc reporting tool that is based off meta data. We (the DBAs) could be responsible for maintaining the meta data and STILL have a manhour savings over developing all these reports.

Here's the catch... We are on a TIGHT budget (the aerospace industry is still reeling a bit). Is anyone using a product or aware of a product that might be just the ticket for us? We have been investigating a product by LogiXML called LGX AdHoc (http://www.logixml.com/products/AdHoc/adhoc.htm). Looks promising. Anyone using it or familiar with it?

View 1 Replies View Related

Snapshot Meta Data Problems

Sep 22, 2005

Hi There

View 9 Replies View Related

Getting Meta-data For Linked Servers

Oct 25, 2006

My customer has a .NET application that reads meta data from SQL Server, Oracle, DB2, and several proprietary databases. Because each DBMS stores the meta data using various techniques, they have written custom code for each DBMS. They are working on generic ODBC/OLEDB support, but in the interim I was trying to use SQL Server to link to an Access database. The Access linked server works fine for queries in Query Analyzer, but I would like to be able to programmatically read the metadata for an Access DB (tables, columns, types, etc.) via the linked server. SQL Server's usual mechanism for storing meta-data in the master database apparently is not used for linked servers.

Does SQL Server expose Linked Server meta data?
How would one retrieve this meta data if it is exposed?

Thanks,

Ben
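(Partially, yes: for linked servers the catalog procedures sp_tables_ex and sp_columns_ex pass the metadata request through to the provider. A minimal sketch, with 'ACCESS_LINKED' and 'MyTable' as hypothetical names:)

-- Tables exposed by the linked server:
EXEC sp_tables_ex @table_server = 'ACCESS_LINKED'

-- Column metadata for one table on the linked server:
EXEC sp_columns_ex @table_server = 'ACCESS_LINKED', @table_name = 'MyTable'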

View 3 Replies View Related

Cannot Open MSDB Database From DTS Meta Data

Oct 20, 2006

We are running SQL Server 2000, SP4. I recently noticed that my DTS packages were missing from the local server. Since I had saved them as Structure Storage Files, I imported them back into DTS Local Packages. That was yesterday. Today I opened the Meta Data browser to see what was in it. (We don't use Meta Data Services.) It displayed a message:"An error occurred while trying to access the database information. The msdb database could not be opened." These (restore of DTS pkgs and Meta Data error) may not be related, but I need to know why I am getting this error, because I am about to upgrade to SQL Server 2005 for Workgroups.

1. Does anyone know what causes this error message?

2. Are local DTS packages stored in the MSDB database?

View 1 Replies View Related

Saving DTS Package In Meta Data Services Error

Apr 27, 2004

I am running SQL Server 2000 SP3 and I am trying to save a DTS package into Meta Data Services and I am receiving the following "Package Error"

Error Source: Microsoft Data Transformation Services (DTS) Package
Error Description: General Error -2147217355 (80041035).

I have searched for this error and I cannot find anything related to it. Also, I saw some of the comments about right clicking "Data Transformation Services" and checking the box for allow save to Meta Data Services, however, I do not see that checkbox to allow for this. Has anyone else had this problem and resolved it? I'm beginning to get very frustrated with it.

View 1 Replies View Related

SQL Server 2000 Meta Data Services Packages

Dec 5, 2006

Would someone help me with how to move existing Meta Data Services packages to SQL Server storage (in the msdb database) or to structured storage files before upgrading from SQL Server 2000 to SQL Server 2005? This is a pre-upgrade action recommended by Upgrade Advisor.

View 1 Replies View Related

Power Pivot :: Meta Data Query Updates

Sep 1, 2015

I have been given a request by a business analyst to update the text 'old' to 'new' within the column names / measure names and associated calculations within a PowerPivot model. There are hundred of columns / measures / calculations, etc. which need to be renamed.

Is there any way of applying these changes to the model other than making them manually, or is there some way of doing the following type of operation in PowerPivot:

UPDATE tblColumnNames SET Column_name, etc REPLACE ('old','new', all columns),('old','new', all measures),('old','new', all calculations)
FROM
tblColumnNames

View 2 Replies View Related

Rebbrui.rll Is Missing: Meta Data Services Failed To Initialize After SP3

Jan 29, 2003

I installed sp3 on my sql2000 server and now when I open up enterprise manager I get the following:

1st message:
rebbrui.rll is missing

2nd message:
snap in failed to initialize
name: meta data services

Thanks in advance for your help!!!

View 1 Replies View Related

Meta Data Services SQL 2000 Msdb Database Could Not Be Opened

May 26, 2006



Hello anyone / everyone,

If you are having trouble accessing Meta Data Services on W2k3 and SQL 2000 and get the error that the "msdb database could not be opened" I found that hot fix 912812 on the operating system is the culprit. Remove it and Meta Data goes back to working, although I am being asked to approve the ActiveX control every time the page refreshes.

Hope this helps somebody; it took me long enough to track it down.

If anyone has a resolution to the ActiveX question I'd love to know the answer.

Thank you,

Uncle Davy


View 5 Replies View Related

The Merge Process Is Cleaning Up Meta Data In Database 'xDatabase'.

Jan 24, 2007



Hi all,

When I see the message "The merge process is cleaning up meta data in database 'xDatabase'." in the distributor's SQL 2005 Replication Monitor for one of its subscriptions, this process generates so much load on the subscriber that users notice a difference in performance. I wonder if there is a way to change some kind of parameter so that this process runs on a specific schedule?

Any help would be appreciated.

Other info: I have subscribers that only replicate one way; maybe this metadata is what is causing the extra load.



View 5 Replies View Related

Integration Services :: Perform Lookup On Large Dataset Based On A Small Dataset

Oct 1, 2015

I have a small number of rows in a dataset, Table 1.  There is a CLOB on a large dataset, Table 2.  They join on a PK.  I would like to retrieve this CLOB and add it to the data flow for Table1.  In short I want to emulate the following:

Table 1:  Small table without CLOB, 10 rows. 
Table 2: Large table with CLOB, 10,000,000 rows

select CLOB
from table2
where pk in (select pk from table1)

I want this to return the CLOBs for the small number of rows in Table 1.  The PK is indexed obviously so it should be a fast look up.

Table 1 and Table 2 live on different Oracle databases. How do I perform this operation efficiently in SSIS? It seems the Lookup and Merge Join won't do this.

View 2 Replies View Related

SP To Perform Query Based On Multiple Rows From Another Query's Result Set

Nov 7, 2007

I have two tables. In one (containing user data, let's call it u) the important fields are:

u.userName, u.userID (uniqueidentifier) and u.workgroupID (uniqueidentifier)

The second table (w) has fields:

w.delegateID (uniqueidentifier), w.workgroupID (uniqueidentifier)

The SP takes the delegateID, and I want to gather all the people from table u where any of the workgroupIDs for that delegate match in w. One delegateID may be tied to multiple workgroupIDs.

I know I can create a temporary table (@wgs) and do a:

INSERT INTO @wgs SELECT workgroupID FROM w WHERE delegateID = @delegateID

That creates a result set with all the workgroupIDs .. this may be one, none or multiple. I then want to get all u.userName, u.userID FROM u WHERE u.workgroupID matches. This query works on an individual workgroupID (using another temp table, @users, to aggregate the results was my thought, so that's included):

INSERT INTO @users
SELECT u.userName, u.userID
FROM tableU u
LEFT JOIN tableW w ON w.workgroupID = u.workgroupID
WHERE u.workgroupID = @workGroupID

I'm trying to avoid looping or using a CURSOR because of the performance hit (had to kick the development server after one of the cursor attempts yesterday). Essentially what I'm after is:

SELECT u.userName, u.userID
FROM tableU u
LEFT JOIN tableW w ON w.workgroupID = u.workgroupID
WHERE u.workgroupID = (SELECT workgroupID FROM w WHERE delegateID = @delegateID)

... but that syntax does not work and I haven't found another workaround yet. TIA!
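(A hedged sketch of the set-based form, reusing the poster's table and column names from above: the equality subquery only works when exactly one row comes back, so swap it for IN, which handles zero, one, or many workgroupIDs.)

SELECT u.userName, u.userID
FROM tableU u
WHERE u.workgroupID IN (SELECT w.workgroupID
                        FROM tableW w
                        WHERE w.delegateID = @delegateID)

-- or, equivalently, as a join (DISTINCT guards against duplicates if a
-- user's workgroup appears more than once for the delegate):
SELECT DISTINCT u.userName, u.userID
FROM tableU u
JOIN tableW w ON w.workgroupID = u.workgroupID
WHERE w.delegateID = @delegateID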

View 1 Replies View Related

Mirroring :: Email Deadlock Information When A Deadlock Occurs

Nov 10, 2015

Is there a way to send out an email with deadlock information (victim query, winner query, process IDs, and the resources on which the deadlock occurred) as soon as a deadlock occurs in a database or at the instance level? I currently have trace flag 1222 turned on, and I have also created an alert that sends me an email whenever a deadlock occurs, but it just says that a deadlock occurred and I then have to log in and review the SQL Server error log for the details.
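(A hedged sketch of one common approach: on SQL 2008 and later the built-in system_health Extended Events session already captures deadlock graphs, so a job fired by the existing 1205 alert can pull the latest graph and mail it with msdb.dbo.sp_send_dbmail. The query below only extracts the graphs; wiring it into the alert job and the mail call is the follow-up step:)

SELECT xed.event_data.value('(@timestamp)[1]', 'datetime2') AS deadlock_time,
       xed.event_data.query('(data/value/deadlock)[1]')     AS deadlock_graph
FROM (
      SELECT CAST(st.target_data AS xml) AS target_xml
      FROM sys.dm_xe_session_targets st
      JOIN sys.dm_xe_sessions s ON s.address = st.event_session_address
      WHERE s.name = 'system_health' AND st.target_name = 'ring_buffer'
     ) AS tab
CROSS APPLY tab.target_xml.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') AS xed(event_data)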

View 5 Replies View Related

Meta Data Synchronization With Excel File Source Once We Update The File

Dec 28, 2007



hi all;

1. Excel file Source--> monthly Revenue details
2. Derived Colum Transoformations
3. Oledb Destination

That's the flow in one of my packages (ETL job).
The Excel file contains monthly revenue details. I want to import the Excel data into my database staging table, so I've created the package.
It's working fine...

Problem
If we swap in the new data for the next month and run the package, it does not run;
same file, same format. We only delete the contents of the file, except the first row of the Excel sheet,
and paste in the new data.
The new data comes from an Oracle database in the form of an Excel sheet (they copy the data manually and send it to us).

I open that package in design mode, and when double-clicking the Excel file source it says <column name>'s metadata needs to be synchronized:
Do you want to fix this issue automatically with the available external columns' metadata?

It is clearly a data type issue; I have changed the corresponding data types to match the previous Excel sheet, which is equivalent to the table it's copying into.

Now the package runs with validation warnings: External column "Invoice Amount" needs to be updated, etc. I can see two or three such warning messages in the package execution wizard.

OK, I'm ready to accept these warnings, and I want my package to run from my server (packages have been deployed to the centralized server; every time we want to run the package, we have an ASP.NET web page that executes it in an On_Click event).

The package is not running from the server, and I guess it's due to the metadata change in the Excel file.

Please suggest some guidelines to resolve this metadata issue; I want my Excel sheet's metadata not to change when we have new updates in it.

Otherwise, suggest some way I can validate the Excel sheet before running the package and test whether the data is in the correct format or not; it's a kind of data profiling activity.

I know it's somewhat crazy, but I need to maintain the system with a permanent solution instead of facing this metadata mismatch issue every time.

Somewhat lengthy explanation, but it's needed for my dear powerful Microsoft responders. I think I've explained my problem clearly; if not, let me know your questions and I'll try my level best.
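(For the validation idea: one hedged option is to query the workbook from T-SQL before launching the package and check that the expected columns come back. This assumes the Jet OLE DB provider is available on the server and 'Ad Hoc Distributed Queries' is enabled; the file path and sheet name are placeholders:)

SELECT TOP 10 *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;HDR=YES;Database=D:\Import\MonthlyRevenue.xls',
                'SELECT * FROM [Sheet1$]')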

View 3 Replies View Related

A Word About Meta-data, Pass Through Columns And Derived Columns

Oct 13, 2006

Here's another one of my bitchfests about stuff which annoys the *** out of me in SSIS (no such problems in DTS):

Do you ever wonder how easy it was to set up a text file to DB transform in DTS? I had no problems at all. In SSIS I spent half a day trying to figure out how to get proper column data types for a text file - of course MS was brilliant enough to add a "Suggest Types" feature to the text file connection manager - BUT guess what - it samples ONLY 1000 rows - so I tried to change that number to 50000 and clicked OK - BUT MS changed it back to 1000 without me noticing it - so NO WONDER later on some of the data types did not match. And boy, what fun it is to change the source columns after you have created a few transforms.

This s**t just breaks... So, a word about Derived Columns - pretty useful feature, heh? It's not f***ing useful if it DELETES SOME of the code itself after there have been changes in the data flow. I can't say how pissed off I am that SSIS went ahead and deleted columns from the flow & messed up derived columns just because the lineageIDs don't match.

Meta-data - it would be useful if you could change it and refresh it - I'm just sick and tired of it showing warnings and errors when there's nothing wrong - so after a change I need to double-click all my transforms so that those red & yellow boxes disappear.

Oh, and why I passionately dislike Derived Columns - so you create new fields based on some data - you do some stuff - combine multiple columns into one - but you have no way of saying remove the columns from the pipeline. Why would you need that - well, if you have 50K+ rows with 30+ columns then it's EXTRA useless memory overhead for your package.

Hopefully one day I will understand how SSIS works (not an easy task, I say) - then I might be able to spend more time on development and less time on my bitchfest - UNTIL then --> Another Day - Another Hassle with SSIS

View 5 Replies View Related

Using SSIS To Perform A Data Import Of An Excel Spreadsheet

Oct 15, 2007

I am new to SSIS. 
I am interested in using SSIS to import an Excel spreadsheet into a SQL Server database. My biggest concern is how to handle/manage errors that might occur during the import process. Can anyone give me any guidance on this?
I could write some C# code to do the import and to create a custom .txt file listing errors that occur on import, but using C# code to do the import seems like I would just be reinventing the wheel, so to speak.

View 3 Replies View Related

ODBC-based To OLE DB-based Data Transfer

Oct 7, 2006

I would like to transfer selected data from an ODBC-based table to an OLE DB-based table. However, there isn't a data flow source on the Data Flow design screen to accommodate such an action. Please help!

View 1 Replies View Related

Errorlog Retention

Apr 20, 2004

I am trying to change the default number of SQL errorlogs from 6 to 12. Does anyone know how to change that?
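(A commonly used approach is to set the NumErrorLogs registry value for the instance; xp_instance_regwrite is undocumented but widely used for exactly this. A sketch, which takes effect the next time the log is cycled or the service restarts:)

USE master
GO
-- Writes NumErrorLogs under the instance's
-- HKLM\Software\Microsoft\MSSQLServer\MSSQLServer registry key.
EXEC xp_instance_regwrite
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'NumErrorLogs',
     REG_DWORD,
     12
GO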

View 7 Replies View Related






