SSIS: Package Contains Two Objects With The Duplicate Name
Feb 22, 2007
public static void CreateDestDFC1()
{
    destinationDataFlowComponent1 = dataFlowTask.ComponentMetaDataCollection.New();
    destinationDataFlowComponent1.Name = "SQL Server Destination 1";
    destinationDataFlowComponent1.ComponentClassID = "{5244B484-7C76-4026-9A01-00928EA81550}";

    managedOleInstance1 = destinationDataFlowComponent1.Instantiate();
    managedOleInstance1.ProvideComponentProperties();
    managedOleInstance1.SetComponentProperty("BulkInsertTableName", "Employee");
    managedOleInstance1.AcquireConnections(null);
    managedOleInstance1.ReinitializeMetaData();
    managedOleInstance1.ReleaseConnections();
}

// Second one here...
public static void CreateDestDFC2()
{
    destinationDataFlowComponent2 = dataFlowTask.ComponentMetaDataCollection.New();
    destinationDataFlowComponent2.Name = "SQL Server Destination 2";
    destinationDataFlowComponent2.ComponentClassID = "{5244B484-7C76-4026-9A01-00928EA81550}";

    managedOleInstance2 = destinationDataFlowComponent2.Instantiate();
    managedOleInstance2.ProvideComponentProperties();
    managedOleInstance2.SetComponentProperty("BulkInsertTableName", "Customer");
    managedOleInstance2.AcquireConnections(null);
    managedOleInstance2.ReinitializeMetaData();
    managedOleInstance2.ReleaseConnections();
}
And it's giving an error. Can anyone say why, or suggest a change?
The package contains two objects with the duplicate name of "component "SQL Server Destination" (50)" and "component "SQL Server Destination" (22)".
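A likely cause, for what it's worth: ProvideComponentProperties() resets the component's metadata, including its Name, back to the defaults, so both components end up with the default name "SQL Server Destination" (which matches the error text above, where the 1/2 suffixes are gone). A minimal sketch of the fix, assuming that behavior, is simply to assign the name after the reset:

// Sketch: assign Name *after* ProvideComponentProperties(), which resets it.
destinationDataFlowComponent1 = dataFlowTask.ComponentMetaDataCollection.New();
destinationDataFlowComponent1.ComponentClassID = "{5244B484-7C76-4026-9A01-00928EA81550}";
managedOleInstance1 = destinationDataFlowComponent1.Instantiate();
managedOleInstance1.ProvideComponentProperties();
destinationDataFlowComponent1.Name = "SQL Server Destination 1"; // now it is not overwritten
// ...and the same reordering in CreateDestDFC2() for "SQL Server Destination 2".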
I've begun to get the above error from my package. The error message refers to two output columns.
Anyone know how this could happen from within the Visual Studio 2005 UI? I've seen the other posts on this subject, and they all seemed to be creating the packages in code.
Is there any way to see all of the columns in the data flow? Or is there any other way to find out which columns it's referring to? Thanks!
I am creating a Data Flow custom component in SSIS, and my input/output columns are dynamically created depending on the value of a custom property of the component.
Then, as soon as I modify this custom property, I do some work in SetComponentProperty(), which I have overridden in my PipelineComponent, to create my input and output columns.
After validation, I have the following error message:
Error 3 Validation error. Data Flow Task: DTS.Pipeline: The package contains two objects with the duplicate name of "output column "AMOUNT" (4714)" and "input column "AMOUNT" (4711)". Package.dtsx 0 0
I create an input column named "AMOUNT" with an ID of 4711 and an output column named "AMOUNT" with an ID of 4712.
What is surprising is that when I manually create the same columns with my component's default advanced editor, using the menu "Show Advanced Editor...", there is no error!
I think I am doing something wrong, but I don't know what. Does anyone have any idea about this?
Here is the code I use to create new input columns:
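The code itself did not make it into the post. For context, a minimal sketch of how columns are typically created from an overridden SetComponentProperty() in a SQL 2005 PipelineComponent (the names and data types here are illustrative, not the poster's actual code):

// Assumes: using Microsoft.SqlServer.Dts.Pipeline;
//          using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
//          using Microsoft.SqlServer.Dts.Runtime.Wrapper;  // for DataType
public override IDTSCustomProperty90 SetComponentProperty(string propertyName, object propertyValue)
{
    // Let the pipeline assign the column IDs; hand-picking IDs such as
    // 4711/4714 is one plausible source of the duplicate-name validation error.
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
    outCol.Name = "AMOUNT";
    outCol.SetDataTypeProperties(DataType.DT_NUMERIC, 0, 18, 2, 0);

    // Input columns are normally not created directly: they are selected from
    // the upstream virtual input (e.g. via SetUsageType()), and the pipeline
    // wires up their IDs itself.
    return base.SetComponentProperty(propertyName, propertyValue);
}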
We are in the process of trying to automate our production releases (what a concept ;-)
The database is SQL Server 2005. All objects are stored in VSS, and we use NAnt and CruiseControl for the actual migrations. I have two directories: Create (for a brand-new database) and Change (DB object changes).
In my 'Change' script, I do the following -
1 - Take a backup of the database
2 - Migrate objects from the 'Change' directory to production
3 - Script out all objects of the database and save them in the 'Create' directory
For the #3, I was hoping I could create an SSIS package that would script out all database objects and save them on the VSS server.
I'm new to SSIS and want to verify it's something that can be done before I start down that path. If anyone has any examples or references, it would be much appreciated.
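Before committing to SSIS for this, it may be worth knowing that the scripting part is only a few lines of SMO (the management API that ships with SQL 2005), which could run from a plain console app called by NAnt or from an SSIS Script Task. A sketch, with the server, database, and output path as placeholders:

// Assumes references to Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo.
using System.IO;
using Microsoft.SqlServer.Management.Smo;

Server server = new Server("PRODSERVER");        // placeholder name
Database db = server.Databases["MyDatabase"];    // placeholder name
Scripter scripter = new Scripter(server);
scripter.Options.IncludeHeaders = true;

foreach (Table t in db.Tables)
{
    if (t.IsSystemObject) continue;
    // One .sql file per table, ready to check into VSS.
    using (StreamWriter w = new StreamWriter(Path.Combine(@"C:\Create", t.Name + ".sql")))
        foreach (string line in scripter.Script(new Urn[] { t.Urn }))
            w.WriteLine(line);
}
// Repeat for db.StoredProcedures, db.Views, etc.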
Hi all, I have to remove non-useful and duplicate records containing NULLs, blanks, and extra spaces (on either the left or right side of the column values) from all the tables in my server's database XYZ, weekly, through a scheduled job with the help of a stored proc; that is, I guess, what's called purging the DB. Please help me with how I can do it in T-SQL.
Also, I have to find and remove all the duplicate DB objects (tables) from the DB, e.g. a table named TABLE_TEST or TABLE_DEBUG left over alongside an original table TABLE, making sure none of the base tables are dropped.
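For the trimming/dedupe part, the usual T-SQL pattern is LTRIM/RTRIM plus ROW_NUMBER() to keep one row per duplicate group. A sketch for a single table, wrapped in the kind of .NET call a scheduled job could make (table, column, and connection names are placeholders; a real stored proc would loop over INFORMATION_SCHEMA to cover every table):

using System.Data.SqlClient;

const string cleanupSql = @"
UPDATE dbo.MyTable SET Col1 = LTRIM(RTRIM(Col1));          -- strip leading/trailing spaces
DELETE FROM dbo.MyTable WHERE Col1 IS NULL OR Col1 = '';   -- drop NULL/blank rows
WITH dups AS (
    SELECT ROW_NUMBER() OVER (PARTITION BY Col1, Col2 ORDER BY (SELECT 0)) AS rn
    FROM dbo.MyTable
)
DELETE FROM dups WHERE rn > 1;                             -- keep one copy of each duplicate
";

using (SqlConnection cn = new SqlConnection("Data Source=.;Initial Catalog=XYZ;Integrated Security=SSPI"))
using (SqlCommand cmd = new SqlCommand(cleanupSql, cn))
{
    cn.Open();
    cmd.ExecuteNonQuery();
}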
p.s. Does anyone have any needles I can borrow? I think sticking them in my eyes would be nicer than working with SSIS.
===================================
An error occurred while objects were being copied. SSIS Designer could not serialize the SSIS runtime objects. (Microsoft Visual Studio)
===================================
Could not copy object 'Preparation SQL Task' to the clipboard. (Microsoft.DataTransformationServices.Design)
------------------------------ For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%u00ae+Visual+Studio%u00ae+2005&ProdVer=8.0.50727.762&EvtSrc=Microsoft.DataTransformationServices.Design.SR&EvtID=SerializeComponentsFailed&LinkId=20476
------------------------------ Program Location:
at Microsoft.DataTransformationServices.Design.DtsClipboardCommandHelper.SerializeRuntimeObjects(ICollection logicalObjects) at Microsoft.DataTransformationServices.Design.ControlFlowClipboardCommandHelper.InternalMenuCopy(MenuCommand sender, CommandHandlingArgs args)
===================================
Invalid access to memory location. (Exception from HRESULT: 0x800703E6) (Microsoft.SqlServer.ManagedDTS)
------------------------------ Program Location:
at Microsoft.SqlServer.Dts.Runtime.PersistImpl.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events) at Microsoft.SqlServer.Dts.Runtime.DtsContainer.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events) at Microsoft.DataTransformationServices.Design.DtsClipboardCommandHelper.SerializeRuntimeObjects(ICollection logicalObjects)
Hey, I've a few jobs which call SSIS packages. If I run an SSIS package itself, it runs fine, but if I try to run the job which calls the package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work; all of them fail.
I create a new SSIS package, drag in an Execute DTS 2000 Package Task, select and embed a DTS package which consists of only one task (as above), and then change the source and destination details (server + user/pwd). Then, when I go to the Copy tab, I get the following error when I hit Select Objects to view the objects which the embedded DTS package should copy:
SQL-DMO error 21776: general error.
On further inspection, none of the objects selected for copy within the original DTS package remain selected for copy within the embedded DTS package.
I have Googled for an answer to this one, but to no avail. Any ideas would be greatly welcomed.
And there is a task (Execute Package) in the first package that calls the execution of the second package.
I'm continuously receiving the error "Failed to decrypt protected XML node "PackagePassword" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available."
As we run the first package from a job, the job completes successfully while logging the above error.
The protection level of the second package is set to "EncryptSensitiveWithUserKey".
The problem is this: in SQL 2000 DTS there was an option for "Copy Objects and Data Between SQL Servers". However, this option has been removed in SQL 2005 SSIS. Apparently the only way to do this in SQL 2005 is to create a .DTSX package in SQL Server Business Intelligence Development Studio or VS 2005. You do this by creating a new Integration Services project and using the Transfer SQL Server Objects task. Within the properties of this task you can select any of the options that were available in the SQL 2000 DTS export wizard. I have set up a test package that should copy a stored procedure from one DB to another, but I am unable to get it to work. It runs fine, but the result is that the SP is not copied.
I am new to Visual Studio and I think I probably just need help in knowing how to run a package in SQL Server Management Studio. I was able to import the package into SSIS in Management Studio and run it without errors, but not with the expected result (the copy of an SP from one DB to another). I'm sure there are people besides me who would like the ability to easily perform ad hoc copies of objects between SQL servers. If anyone has any experience with using an SSIS package to do this, please help. Thanks!
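For what it's worth, the Transfer SQL Server Objects task is a wrapper around the SMO Transfer class, so one way to isolate whether the task or its settings are the problem is to run the same copy directly in a few lines of code. A sketch, with server, database, and procedure names as placeholders:

// Assumes a reference to Microsoft.SqlServer.Smo.
using Microsoft.SqlServer.Management.Smo;

Server src = new Server("SOURCESERVER");                    // placeholder
Database db = src.Databases["SourceDb"];                    // placeholder

Transfer transfer = new Transfer(db);
transfer.CopyAllObjects = false;
transfer.CopySchema = true;                                 // copy the DDL, not just data
transfer.ObjectList.Add(db.StoredProcedures["usp_MyProc"]); // placeholder
transfer.DestinationServer = "DESTSERVER";                  // placeholder
transfer.DestinationDatabase = "DestDb";                    // placeholder
transfer.DestinationLoginSecure = true;                     // Windows authentication
transfer.TransferData();                                    // runs the transfer

In the task itself, the equivalent settings to double-check are the stored procedure options and CopySchema; CopySchema reportedly defaults to False, which would explain a run that "succeeds" without copying anything.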
Hi, I have an SSIS package that copies a database using the online method of the Database Transfer task. I realized that this task does not copy my keys and indexes, so I went and used the Transfer SQL Server Objects task instead, which transfers my keys and indexes but does not copy the default value of a column. Is there something I need to change/turn on for this to work?
What is the security requirement for running this task? I have no problem running this task as a system administrator, but all my users who are in db_dtsltduser cannot run the same task successfully. They are DBOs in both databases involved in the task. Thank you in advance for your help.
Challenge: data pumping. Copy daily production data from N x 100 SQL 2000 servers to one central SQL 2005 server. Sometimes the copying task might demand transferring schema objects, like temp tables and procedures, from SQL 2005 to SQL 2000.
Since the system organisation is different, what would be the best approach?
Can an SSIS server, i.e. a staging server, be used to create packages that update SQL Server 2000 Analysis Services objects on another server running SQL Server 2000 OLAP?
It appears that the OLAP connection manager in SSIS supports only connections (and thus updates) to 2005 OLAP objects. I work for a company that has a huge investment in a third-party DW that uses Analysis Services 2000. The DW tool vendor will not support an upgrade to SSAS 2005. We wish to extend the DW from other data sources. My thought was to use a staging server with SSIS, used solely as the ETL tool for all new development (thus no more DTS development), and to update the SQL Server 2000 operational data store on another box.
I can find no documentation on how to process SQL Server 2000 Analysis Services objects from within an SSIS package. Any ideas?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be a SQL 2000 DTS package, as it is rebuilding SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS fails, the SSIS job happily completes, showing success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need...but it is merely written to the log, not available to an event handler. It also looks unsafe to put a SQL task in as the next item to go look at the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up as an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there one?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure that the DTS package has completed before I start any other tasks that check the SQL 2000 log of its execution?
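If nothing native works, one fallback people use is to run the DTS 2000 package from a small console program via the DTS COM library and launch that from an Execute Process Task: the program blocks until the package finishes, inspects each step, and returns a non-zero exit code on failure, which the Execute Process Task can treat as task failure. A rough sketch, assuming a COM reference to the Microsoft DTSPackage Object Library (server and package names are placeholders):

using DTS; // COM interop: Microsoft DTSPackage Object Library

class RunDts
{
    static int Main()
    {
        DTS.Package2Class pkg = new DTS.Package2Class();
        object persist = null;
        pkg.LoadFromSQLServer("SQL2000SERVER", null, null,
            DTSSQLServerStorageFlags.DTSSQLStgFlag_UseTrustedConnection,
            null, null, null, "MyDtsPackage", ref persist);
        pkg.Execute();

        // Walk the steps: any failed step fails the whole run.
        foreach (DTS.Step step in pkg.Steps)
            if (step.ExecutionResult == DTSStepExecResult.DTSStepExecResult_Failure)
                return 1; // caught via FailTaskIfReturnCodeIsNotSuccessValue in SSIS
        return 0;
    }
}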
I have developed an SSIS package for ETL purposes. I invoke the package from a .NET console application by referencing the ManagedDTS assembly. I am able to execute the package on SQL Server 2005 Developer Edition and it runs fine to completion.
But when I try to execute the package on SQL Server 2005 Standard Edition, by invoking it through the .NET console application, the status of the package is failure.
Can anyone help me figure out how to overcome this problem?
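One difference to rule out first: Developer Edition carries the Enterprise feature set, so a package using Enterprise-only components (Fuzzy Lookup/Grouping, the Analysis Services processing destinations, etc.) will validate on Developer but fail on Standard. Either way, dumping the Errors collection after execution usually names the failing component. A sketch using the same ManagedDTS assembly (the path is a placeholder):

using System;
using Microsoft.SqlServer.Dts.Runtime;

Application app = new Application();
Package pkg = app.LoadPackage(@"C:\packages\Etl.dtsx", null);  // placeholder path
DTSExecResult result = pkg.Execute();

if (result == DTSExecResult.Failure)
    foreach (DtsError err in pkg.Errors)
        Console.WriteLine("{0}: {1}", err.SubComponent, err.Description);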
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Win 2003 SP2), and the process has been very frustrating, to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times, when I run a package that has run successfully before, I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point and then just stops. (The package takes about 17 seconds to run normally.) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
I would like to standardize SSIS development so that all developers start with the same basic template. I have set it up as an available template (http://support.microsoft.com/kb/908018), but I would like it to be the default when a new project or package is created. Is this an option?
I have a CSV file which contains some duplicate records, and I have to load this file into a SQL Server database using an SSIS package.
What I have to do is read the file, and if the same record occurs more than 10 times for a particular unique combination (like ID, Date, Time), take only one record for that combination.
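The set-based way inside SSIS would be an Aggregate transform (group by ID, Date, Time with a count) feeding a Conditional Split, but if a bit of code is acceptable, the rule is short to express. A sketch (the file path, header row, and column positions are assumptions, and LINQ needs .NET 3.5):

using System;
using System.IO;
using System.Linq;

// Keep one row per (ID, Date, Time) key when that key occurs more than 10 times;
// keys occurring 10 times or fewer pass through unchanged.
var rows = File.ReadAllLines(@"C:\data\input.csv")   // placeholder path
               .Skip(1)                              // assumes a header row
               .Select(line => line.Split(','));

var kept = rows
    .GroupBy(f => new { Id = f[0], Date = f[1], Time = f[2] })  // assumed column order
    .SelectMany(g => g.Count() > 10 ? g.Take(1) : g.AsEnumerable());

foreach (var fields in kept)
    Console.WriteLine(string.Join(",", fields));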
I am having a problem where duplicate log statements are being written to a log file (as defined by a log provider).
I believe that this is because, in the logging dialog box, I have ticked the checkbox next to a child task to override the logging functionality. I need to do this because it is a Script Task and I want to capture "ScriptTaskLogEntry" events (something that I cannot do at the parent level). However, by doing this I seem to get the script events written at the parent level as well as at the Script Task level.
Is there any way of avoiding this, but still capturing the log events from the script task?
Another issue that is possibly linked is that I am getting an error from the log provider:
The SSIS logging provider "SSIS log provider for Text files" failed with error code 0x800700EA ((null)). This indicates a logging error attributable to the specified log provider.
Could this be because the parent and child task are both attempting to write to the same log provider?
I would like to fetch the data flow component name while the package is executing, since the system variable [System::SourceName] only fetches the names of control flow tasks. Is there a way to capture data flow component names?
The master package has a configuration file specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers set up to use localhost. This is done deliberately, to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw):
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same, again against the dkms5253uedw instance, but now with exactly the same databases also defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package fails again, this time with:
Error - "timeout when connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
And then it continues as in the first case.
From all this we conclude that the child package tries to connect to the database before it knows the connection string passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no actual work against the default instance (no logging, etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
I am working on a project that is creating a new application; my area of the project is migrating data from the old application's database, transforming it, and populating the new database.
The transformations to the old data are done in a staging database.
At the end of the process, the staging database ends up with a lot of the new application's tables, populated with the migrated legacy data.
We need to move these tables from the staging database to (initially) our test databases, but ultimately what will be the live database.
We have tried using the "Transfer SQL Server Objects Task" in SSIS, but have run into a problem: a lot of the database tables have default values for columns.
These default values are not brought over.
Example: tables contain a "GUID" field, which has a default value of newid().
Right-clicking the table and generating the CREATE script produces:
[GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [DF_tbCRM_Client_GUID] DEFAULT (newid()),
However, the Transfer Objects task does not create this default of newid().
Examining the SQL generated by the Import/Export Wizard while investigating this shows that the wizard generates this column as:
[GUID] uniqueidentifier NOT NULL
and the column default value is lost.
Is there something I should be setting somewhere to force SSIS to bring these column definitions over correctly?
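If nothing in the task does it, one workaround is to script the tables via SMO with the DRI options switched on, since the ScriptingOptions expose defaults explicitly. A sketch (server and database names are placeholders):

// Assumes a reference to Microsoft.SqlServer.Smo.
using System;
using Microsoft.SqlServer.Management.Smo;

Server server = new Server("STAGINGSERVER");                   // placeholder
Transfer transfer = new Transfer(server.Databases["Staging"]); // placeholder
transfer.CopyAllTables = true;
transfer.CopySchema = true;
transfer.Options.DriDefaults = true;   // include DEFAULT constraints such as newid()
transfer.Options.DriAll = true;        // or: all declarative referential integrity at once

foreach (string stmt in transfer.ScriptTransfer())
    Console.WriteLine(stmt);           // run these against the destination instead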
I am interested in passing a value from a child package variable to the parent package that calls it in SSIS.
I am able to call the child package using the Execute Package task and use configurations to pass values from a parent variable to the child, but I am not able to pass a value from the child back to the parent.
I have a variable called datasetId in both the parent and the child. It gets computed in the child and needs to be passed to the parent...
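At runtime a parent's variables are visible inside the child, so the usual trick is to delete the child's own datasetId (a child-local variable of the same name would shadow the parent's) and let a Script Task in the child write straight into the parent's variable; no configuration is needed in that direction. A sketch in C# syntax (the 2005 Script Task actually uses VB.NET, and ComputeDatasetId() is a hypothetical stand-in for however the value is produced):

// Child package Script Task, with User::datasetId in ReadWriteVariables.
// Because the child no longer defines its own datasetId, this name resolves
// to the parent's variable, so the value survives after the child returns.
public void Main()
{
    Dts.Variables["User::datasetId"].Value = ComputeDatasetId(); // hypothetical helper
    Dts.TaskResult = (int)ScriptResults.Success;
}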
A deployed report having an SSIS package as its source does not work when indirect package configuration is used in the ETL package. It seems the ETL package, when called/executed from Report Manager, does not recognize the environment variable used to pick up the dtsconfig file.
The report works when direct package configuration points to the same dtsconfig file.
What could be the reason? Any solution for this? This will make our build/deployment to QA and Prod very difficult.
I now have two SSIS packages, "TESTING" and "LOADING". The "TESTING" package has an Execute Package task that calls the "LOADING" package. When I execute the TESTING package, how can I set up the connection string so that I can edit the password of the database connected to by the "LOADING" package?
I have 3 sources for an IS flow. One is a flat file, one is a DB table, and one is the bad-data output. There might be situations where I get a duplicate primary key, since records come from 3 sources (flat file, DB table, reject (output) table). Can anyone give me a suggestion for how to handle the duplicate primary key problem in this situation?
First post here. Anyway, I have a question regarding SSIS. I'm currently given a task that requires reading a flat file, applying duplicate removal as well as invalid data removal, processing it, and finally writing it to a SQL Server 2005 DB.
Part of the processing requires checking for partial duplicates in the batches of records provided in the text file. For example, a record contains a phone number, status, timestamp of creation, and various other entries. If a phone number is repeated (meaning a duplicate entry), a column called 'Status' must be checked, and only entries with a status of 'C' are allowed through.
Another part of the processing requires that if the phone number is repeated along with various other entries including status, the timestamp of creation is checked and only the entry with the most recent timestamp is accepted.
I would like to know how to implement this in SSIS without using table objects and scripts, as my experience tells me that doing this in a script can take a real hit on system performance. The task is expected to handle tens of thousands of records a day.
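To pin the rules down (whatever transforms end up implementing them), here is the same logic expressed in C# over an assumed record shape; in SSIS proper this maps roughly to a Conditional Split for the status rule and an Aggregate (max timestamp) merged back for the recency rule:

using System;
using System.Collections.Generic;
using System.Linq;

class Rec { public string Phone; public string Status; public DateTime Created; } // assumed shape

static IEnumerable<Rec> Deduplicate(IEnumerable<Rec> records)
{
    return records
        // Rule 1: among repeated phone numbers, only status 'C' passes.
        .GroupBy(r => r.Phone)
        .SelectMany(g => g.Count() > 1 ? g.Where(r => r.Status == "C") : g.AsEnumerable())
        // Rule 2: among otherwise-identical rows, keep the newest timestamp.
        // (The grouping key here is illustrative; the real one would include
        // the "various other entries" from the file.)
        .GroupBy(r => new { r.Phone, r.Status })
        .Select(g => g.OrderByDescending(r => r.Created).First());
}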
I've a dtsx package which runs nightly to do the following:
1. Select data from a SQL replicated table.
2. Do some lookups (Lookup, Derived Column, Multicast, Conditional Split, etc.).
3. Insert into another SQL table on another server using "Table or view - fast load", rows per batch = 10000, maximum insert commit size = 10000, and "redirect row" on the destination's error output to an error log text file.

Once in a while, I find duplicate records in the error log; these rows cannot be inserted into the destination table due to the primary key constraint. For example, transaction_id = 111000 appears twice in the error log, but it is a unique key in the source table.
My questions: 1. What could be the cause of duplicate rows during ETL in SSIS? I've asked this before and have spent so much time researching, but still could not find the reason. This link is from my previous post:
2. For a daily extract with millions of rows, what would be the best settings for rows per batch, maximum insert commit size, etc.? I've read some posts on this forum and decided to use 10000 for both, but once in a while there's just one duplicate row that causes the whole batch of 10000 rows not to be committed.