Programmatically Adding A Script Component To Data Flow Task
Feb 2, 2007
Dear all,
I am developing tools for automatic creation of data warehouse tables, cubes and SSIS packages. Generating the SSIS Data Flows works very well using the SSIS components for OLE DB Source, Derived Column, Lookup and OLE DB Destination.
However, for some of the advanced functionality I need to use the Script Component. I have managed to add it to the Data Flow with all inputs and outputs, but how do I populate it with my code? I've seen there is a component property called "SourceCode" and one called "BinaryCode". The "SourceCode" property contains the code, but also some extra metadata.
Questions:
Do you know if there is any programmatic support for generating the SourceCode property with the necessary metadata?
Do you know how to compile the source code and generate the BinaryCode property?
Example from my code below:
// Create script component
IDTSComponentMetaData90 script = dataFlowTask.ComponentMetaDataCollection.New();
script.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
CManagedComponentWrapper scriptWrapper = script.Instantiate();
script.InputCollection.New();
script.OutputCollection.New();
scriptWrapper.ProvideComponentProperties();
script.Name = "Logics";
// Create path
IDTSPath90 scriptPath = dataFlowTask.PathCollection.New();
scriptPath.AttachPathAndPropagateNotifications(lastComponent.OutputCollection[0], script.InputCollection[0]);
// Populate input and output columns
IDTSInput90 scriptInput = script.InputCollection[0];
IDTSVirtualInput90 scriptVInput = scriptInput.GetVirtualInput();
foreach (IDTSOutputColumn90 col in oledbSrc.OutputCollection[0].OutputColumnCollection)
{
scriptWrapper.SetUsageType(scriptInput.ID, scriptVInput, col.LineageID, DTSUsageType.UT_READONLY);
IDTSOutputColumn90 tmp = script.OutputCollection[0].OutputColumnCollection.New();
tmp.Name = col.Name;
tmp.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
}
// Make script asynchronous
script.OutputCollection[0].SynchronousInputID = 0;
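One approach (an untested sketch, not a documented API): rather than generating the SourceCode and BinaryCode values from scratch, build a template package once in the designer that contains a Script Component whose script has been written and compiled there (the designer's VSA engine is what produces BinaryCode; I am not aware of a public API that compiles it for you), then copy both properties onto the generated component. The template path, task name and component name below are placeholders; MainPipe and TaskHost come from the Wrapper and Runtime namespaces already used above.
// Load the hand-built template and locate its precompiled Script Component.
Package template = app.LoadPackage(@"C:\Templates\ScriptTemplate.dtsx", null);
MainPipe templatePipe = (MainPipe)((TaskHost)template.Executables["Data Flow Task"]).InnerObject;
IDTSComponentMetaData90 templateScript = null;
foreach (IDTSComponentMetaData90 c in templatePipe.ComponentMetaDataCollection)
{
    if (c.Name == "Logics") templateScript = c;
}
// Copy the script source and its compiled binary onto the generated component.
object sourceCode = templateScript.CustomPropertyCollection["SourceCode"].Value;
object binaryCode = templateScript.CustomPropertyCollection["BinaryCode"].Value;
scriptWrapper.SetComponentProperty("SourceCode", sourceCode);
scriptWrapper.SetComponentProperty("BinaryCode", binaryCode);
If each generated package needs different code, the SourceCode value (an object array holding file names, file contents and the extra project metadata the question mentions) could be modified before being written back, but its exact layout would have to be reverse-engineered from the template.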
Thanks for any assistance and Best Regards,
Johan Åhlén,
Business Intelligence consultant at IFS
View 2 Replies
Aug 1, 2007
Is it possible to add a Fuzzy Grouping transformation to a Data Flow task programmatically? If so, what is the C# or VB.NET code for that?
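Adding it should follow the same pattern as the other components discussed in this thread; a sketch, assuming an Application object (app), the data flow's MainPipe (dataFlowTask) and an existing OLE DB connection manager in the package, since Fuzzy Grouping stages its work in tempdb. The display name "Fuzzy Grouping" and the connection manager name are assumptions to verify against your installation.
IDTSComponentMetaData90 fuzzy = dataFlowTask.ComponentMetaDataCollection.New();
fuzzy.ComponentClassID = app.PipelineComponentInfos["Fuzzy Grouping"].CreationName;
CManagedComponentWrapper fuzzyWrapper = fuzzy.Instantiate();
fuzzyWrapper.ProvideComponentProperties();
fuzzy.Name = "Fuzzy Grouping";
// Point the transform's runtime connection at an OLE DB connection manager in the package.
ConnectionManager oleDbCm = package.Connections["MyOleDbConnection"];   // hypothetical name
fuzzy.RuntimeConnectionCollection[0].ConnectionManagerID = oleDbCm.ID;
fuzzy.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oleDbCm);
// Attach a path from the upstream component, select input columns with SetUsageType,
// then let the component build its output columns.
fuzzyWrapper.AcquireConnections(null);
fuzzyWrapper.ReinitializeMetaData();
fuzzyWrapper.ReleaseConnections();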
View 1 Replies
View Related
Sep 4, 2007
Has anyone done this? I can't find anything in the documentation
that describes this. The closest I get is to the InnerObject property
of the TaskHost class. There is an example of programming a bulk
insert task. But I can't find anything on programmatically setting
the column mappings (source to dest) of a simple data flow task. Any
help is appreciated!
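For an OLE DB destination at the end of a simple data flow, the column mappings live on the destination's input: each selected input column is mapped to one of the external metadata columns that ReinitializeMetaData pulls from the target table. A sketch, assuming destination is the IDTSComponentMetaData90 of an already-configured OLE DB destination, destWrapper its design-time wrapper, and that source and destination columns share names:
destWrapper.AcquireConnections(null);
destWrapper.ReinitializeMetaData();      // populates the external metadata columns from the table
destWrapper.ReleaseConnections();
IDTSInput90 destInput = destination.InputCollection[0];
IDTSVirtualInput90 destVInput = destInput.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vCol in destVInput.VirtualInputColumnCollection)
{
    // Select the upstream column, then map it to the table column of the same name.
    IDTSInputColumn90 inCol = destWrapper.SetUsageType(
        destInput.ID, destVInput, vCol.LineageID, DTSUsageType.UT_READONLY);
    IDTSExternalMetadataColumn90 extCol = destInput.ExternalMetadataColumnCollection[inCol.Name];
    destWrapper.MapInputColumn(destInput.ID, inCol.ID, extCol.ID);
}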
View 7 Replies
View Related
Feb 6, 2007
Dear Colleagues,
I'm trying to develop a custom Data Flow Transformation component in SSIS.
I compiled it without errors and installed it in the GAC and in the PipelineComponents folder; however, I always get the following message when I try to drag the component onto the designer surface:
The component could not be added to the Data Flow task.
Please verify that this component is properly installed.
------------------------------
ADDITIONAL INFORMATION:
The data flow object "RisikoKennzahlenKomponenten.MarktwertTransformation,
RisikoKennzahlenKomponenten, Version=1.0.0.0, Culture=neutral,
PublicKeyToken=cfa8722b8086ac2d" is not installed correctly on this computer.
(Microsoft.DataTransformationServices.Design)
Program Location:
at Microsoft.DataTransformationServices.Design.DtsBasePackageDesigner.GetPipelineInfo(String creationName, IServiceProvider serviceProvider)
at Microsoft.DataTransformationServices.Design.DesignUtils.GetNewPipelineComponentObjectName(IDTSComponentMetaDataCollection90 parentCollection, String clsid, IDTSComponentMetaData90 componentMetadata, PipelineComponentInfo& pipelineComponentInfo)
at Microsoft.DataTransformationServices.Design.PipelineTaskDesigner.AddNewComponent(String clsid, Boolean throwOnError)
This happens with EVERY custom component on my computer. The same components work fine on other machines.
Does anyone have an idea?
Regards
Arne Janning
View 1 Replies
View Related
Dec 18, 2007
I am trying to cleanly shut down a data flow task, which contains a script component, when RunningPackage.Stop() is called from the SSIS runtime.
I've been going in ever decreasing circles with no success - it looks like the cleanest way to find out whether RunningPackage.Stop() has been called is to call FireQueryCancel(). But I can't find any reference to anything useful in a dataflow task script component that gives me something that implements IDTSComponentEvents. The nearest thing seems to be Me.ComponentMetaData which gives a reference to IDTSComponentMetaData90, but this only has methods for calling FireError, FireInformation, FireProgress, FireWarning, and FireCustomEvent. But no FireQueryCancel.
Is there a way in a script component that I can find out the state of QueryCancel?
Any help would be appreciated.
View 2 Replies
View Related
Feb 19, 2008
Hello Helpers,
I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a Data Flow task to transform a two-digit month into a three-letter month abbreviation:
Example: '01' should be transformed into 'Jan'
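One way to sidestep calling the T-SQL scalar-valued function from the pipeline is to do the conversion inside the Script Component itself. A minimal sketch of the per-row logic with hypothetical MonthIn/MonthOut columns (SSIS 2005 Script Components are written in VB.NET; the C# below only illustrates the logic):
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // '01' -> 'Jan', '02' -> 'Feb', ... using the invariant culture's month abbreviations.
    string[] names = System.Globalization.CultureInfo.InvariantCulture.DateTimeFormat.AbbreviatedMonthNames;
    Row.MonthOut = names[int.Parse(Row.MonthIn) - 1];
}
Calling the SQL function per row from the script would also work, but it costs a database round trip for every row, so the in-process conversion is usually preferable.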
Many thanks for all your commitment and help!
Ulrike
View 4 Replies
View Related
Nov 19, 2006
Hi,
I have a flat file (pipe delimited), like below (only two rows shown):
1|001|B|C|002|A|003|
1|005|D|C|003|A|004|
I have to import column 2 (that is 001), column 5 (that is 002) and column 7 (that is 003) from the above flat file into columns in the database that have the data type numeric(4,3).
In the database the columns should have values like (.001), (.002), (.007). I have used a Flat File Source, a Script Component and an OLE DB Destination in my Data Flow task.
In the script component, I have added a "." to each column as a string.
I cannot import into the OLE DB Destination because of a data conversion issue.
Is there any way to solve this?
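If the conversion happens inside the Script Component, the destination can be handed a ready numeric value instead of a string with a prepended ".". A sketch of the per-row logic with hypothetical column names, assuming the script's output columns are typed DT_NUMERIC with precision 4 and scale 3 (SSIS 2005 scripts are VB.NET; C# is used here to match the rest of the thread):
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // "001" -> 0.001, "002" -> 0.002, ...: parse and shift three decimal places,
    // so there is no string concatenation and no conversion error at the destination.
    System.Globalization.CultureInfo inv = System.Globalization.CultureInfo.InvariantCulture;
    Row.Col2Out = decimal.Parse(Row.Col2In, inv) / 1000m;
    Row.Col5Out = decimal.Parse(Row.Col5In, inv) / 1000m;
    Row.Col7Out = decimal.Parse(Row.Col7In, inv) / 1000m;
}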
Thank you.
View 2 Replies
View Related
Oct 9, 2006
Hi
I am trying to develop a data flow task in C#. I need to add an expression tab to the task where I can write expressions on the input columns and map them to outputs. Please let me know how to go about it. I am new to SSIS coding, so I don't have much experience with it.
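If what is needed is effectively a Derived Column transformation driven from code, a rough sketch follows. The output-column custom properties "Expression" and "FriendlyExpression" are how I recall the Derived Column transform storing its expressions, so verify the names and the lineage-ID syntax against a package built in the designer; app and dataFlowTask follow the naming used elsewhere in this thread, and firstNameCol/lastNameCol stand for upstream output columns.
IDTSComponentMetaData90 derived = dataFlowTask.ComponentMetaDataCollection.New();
derived.ComponentClassID = app.PipelineComponentInfos["Derived Column"].CreationName;
CManagedComponentWrapper derivedWrapper = derived.Instantiate();
derivedWrapper.ProvideComponentProperties();
derived.Name = "Expressions";
// ... attach a path from the upstream component and call SetUsageType for every
// input column the expression refers to, as in the other snippets in this thread ...
// Let the component create the output column so it carries its custom properties.
IDTSOutputColumn90 outCol = derivedWrapper.InsertOutputColumnAt(
    derived.OutputCollection[0].ID, 0, "FullName", string.Empty);
derivedWrapper.SetOutputColumnDataTypeProperties(
    derived.OutputCollection[0].ID, outCol.ID, DataType.DT_WSTR, 100, 0, 0, 0);
// "Expression" references columns by lineage ID (#<LineageID>); "FriendlyExpression" by name.
// The designer also sets ErrorRowDisposition/TruncationRowDisposition on the column if needed.
derivedWrapper.SetOutputColumnProperty(derived.OutputCollection[0].ID, outCol.ID,
    "Expression", "#" + firstNameCol.LineageID + " + \" \" + #" + lastNameCol.LineageID);
derivedWrapper.SetOutputColumnProperty(derived.OutputCollection[0].ID, outCol.ID,
    "FriendlyExpression", "FirstName + \" \" + LastName");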
Thanks in advance,
Vipul
View 9 Replies
View Related
Aug 29, 2007
Hello,
Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?
Thanks,
Yoann
View 15 Replies
View Related
Dec 28, 2007
Hi,
I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog:
http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx
My development machine is decent: 1.86 GHz, Intel core 2 CPU, 3 GB of RAM.
However it seems the data flow task gets hung whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU resources continuously. I am not using fast load to insert new records into my sql target table. (I am right clicking Sequence Container 3 and executing this container NOT the entire package in the screenshots)
http://i248.photobucket.com/albums/gg168/boston_sql92/1.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/2.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/3.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/4.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/5.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/6.jpg
The same package works fine against a similar test table with 150k rows.
http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg
The weird thing is it only takes 24 minutes for a full refresh of the entire source table from Oracle to the SQL target table.
Any hints,advice would be appreciated.
View 18 Replies
View Related
Jan 9, 2007
Hi there,
I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component by which I may change the source-destination column mapping programmatically. So what I thought is that I need to add a data conversion component into the dataflow task. After adding this component (I found its component id as {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I have created a path which establishes the mapping between the output columns of the source component and the input columns of the data conversion component. Now I am not sure how to establish the mapping between the data conversion component's input column collection and output column collection.
I am giving my code snippet here,
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
... // Code for configuring the source data flow component
IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();// creating data conversion
conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";// This is the GUID for data conversion component
CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();//Instantiate
conversionInstance.ProvideComponentProperties();
// Now creating a path to connect the source and conversion components
IDTSPath90 fPath = dataFlowTask.PathCollection.New();
fPath.AttachPathAndPropagateNotifications(
sourceDataFlowComponent.OutputCollection[0],
conversionDataFlowComponent.InputCollection[0]);
// Should I acquire connections for data conversion? I'm not sure.
conversionInstance.AcquireConnections(null);
conversionInstance.ReinitializeMetaData();
// Get the input collection
IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection){
conversionInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
... // Well, here I am stuck. What do I need to do here to establish a map
// between conversionDataFlowComponent.InputCollection[0] and
// conversionDataFlowComponent.OutputCollection[0]?
As you can see, I am just short of creating the mapping between the input and output collections. Can anybody give me an idea how I can achieve this?
I will appreciate all kind of suggestions and comments.
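One approach that should close that gap (a sketch; the custom property name "SourceInputColumnLineageID" is, as far as I recall, how the Data Conversion transform links a converted output column back to its source column, and DT_WSTR/50 is only a placeholder type): let the component create each output column itself, then set its data type and point it at the corresponding input column.
IDTSOutput90 convOutput = conversionDataFlowComponent.OutputCollection[0];
int index = 0;
foreach (IDTSInputColumn90 inCol in input.InputColumnCollection)
{
    // The component adds its own custom properties when it creates the column.
    IDTSOutputColumn90 outCol = conversionInstance.InsertOutputColumnAt(
        convOutput.ID, index++, "Converted_" + inCol.Name, string.Empty);
    conversionInstance.SetOutputColumnDataTypeProperties(
        convOutput.ID, outCol.ID, DataType.DT_WSTR, 50, 0, 0, 0);
    // Tie the converted column back to the input column it is derived from.
    conversionInstance.SetOutputColumnProperty(
        convOutput.ID, outCol.ID, "SourceInputColumnLineageID", inCol.LineageID);
}
Downstream components then consume the new Converted_* columns, much as they would consume the columns you tick in the Data Conversion editor.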
Regards
Moim
View 11 Replies
View Related
Mar 6, 2007
HI All,
In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of the packages. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync.
We want to use a standard OLE DB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use that new connection manager is a daunting task. I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLE DB connection manager.
However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object, but rather a _ComObject. I can't seem to find any documentation/examples as to how to iterate the components within a data flow and change the connection manager. If you have any information, that would be quite helpful. If you reply with a code sample, if you would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.
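The trick with the _ComObject is to cast TaskHost.InnerObject to the pipeline wrapper interface (MainPipe, from Microsoft.SqlServer.Dts.Pipeline.Wrapper); from there the components and their runtime connections are reachable. A sketch, with the caveats that it only walks the top level of the package (Sequence and loop containers would need a recursive walk) and that it repoints every runtime connection, which assumes the data flows use a single connection manager:
void RetargetDataFlows(Package pkg, ConnectionManager newCm)
{
    foreach (Executable ex in pkg.Executables)
    {
        TaskHost th = ex as TaskHost;
        if (th == null) continue;
        MainPipe pipe = th.InnerObject as MainPipe;   // null for anything that is not a data flow task
        if (pipe == null) continue;
        foreach (IDTSComponentMetaData90 comp in pipe.ComponentMetaDataCollection)
        {
            foreach (IDTSRuntimeConnection90 rc in comp.RuntimeConnectionCollection)
            {
                // Repoint the component at the new OLE DB connection manager.
                rc.ConnectionManagerID = newCm.ID;
                rc.ConnectionManager = DtsConvert.ToConnectionManager90(newCm);
            }
        }
    }
}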
Thank you.
Steve.
View 1 Replies
View Related
Jul 26, 2007
I have an int value that I pulled from a table in a database and stored in a variable. I would like to increment this value in a script component and then insert it into a field in my OLE DB destination. Is there any example out there of using a script component to do something similar to this?
Do I want to select source, destination or transformation?
Can I access my variable from within the script component, and if so, how?
Any assistance or examples you can point me towards would be greatly appreciated.
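A transformation is the right choice here, with the seed variable listed under ReadOnlyVariables and a new output column added on the Inputs and Outputs page to feed the destination. A sketch with hypothetical names (User::LastId, output column GeneratedId); SSIS 2005 Script Components are VB.NET, so the C# below only shows the shape, and if the final value must go back to the package it has to be assigned to a ReadWriteVariables entry in PostExecute:
private int nextId;

public override void PreExecute()
{
    base.PreExecute();
    nextId = Variables.LastId;          // the int value pulled from the table
}

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    nextId += 1;
    Row.GeneratedId = nextId;           // lands in the OLE DB destination column
}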
View 3 Replies
View Related
Jul 2, 2007
There is a table with a column that contains Xml documents. For each record from my Data Flow Source, I want to pass in the Xml document and the node to interrogate, and return the value contained in the node. Like the Crm component, this is probably one I will have to write from scratch in C#, but I would like to avoid having to create the custom component if it already exists in the public arena.
Does anyone know of any Xml Ssis Data Flow Components that are downloadable for free?
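If the XPath handling is simple, the stock Script Component may be enough and no custom component is needed. A sketch of a transformation's per-row logic with hypothetical columns (XmlDoc holding the document, NodePath the node to interrogate, NodeValue a new output column); SSIS 2005 scripts are VB.NET, the C# here just shows the calls:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
    doc.LoadXml(Row.XmlDoc);
    System.Xml.XmlNode node = doc.SelectSingleNode(Row.NodePath);
    if (node == null)
        Row.NodeValue_IsNull = true;    // keep a NULL when the node is missing
    else
        Row.NodeValue = node.InnerText;
}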
View 3 Replies
View Related
Feb 28, 2008
Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,
View 4 Replies
View Related
May 4, 2006
Hi,
I am trying to create a simple BI application with SSIS. In Visual Studio 2005 I just drag a Data Flow Task from the Toolbox and add it to the project. When I double-click it I get the following error:
The task with the name "Data Flow Task" and the creation name "DTS.Pipeline.1" is not registered for use on this computer.
Then when I try to delete it, it gives this other error:
Cannot remove the specified item because it was not found in the specified Collection.
I am creating this application in an administrator account in this computer, so I doubt the problem is related to permissions. I am running SQL Server 2005 and Visual Studio 2005 in WinXP Tablet PC Edition.
Any suggestions why this is happening and how to fix it?
View 17 Replies
View Related
Mar 12, 2008
I am using SQL 2005 SSIS. I am joining several large tables and then moving the result into another table in the same database.
I would like know which method is faster:
Use Execute SQL Task to insert the result set to the target table
Use the Data Flow Task to insert the result set to the target table. (Use OLE DB source to execute SQL command and then use the SQL destination)
Could you tell me why one is slower than the other?
Thanks.
View 7 Replies
View Related
Oct 2, 2007
I have a stored procedure that is executed via an Execute SQL task and returns a full result set. I map this result set to a variable of Object type. Is there a way to use this variable as a data source in a subsequent data flow task?
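One common pattern is a Script Component configured as a source, with the Object variable listed under ReadOnlyVariables; an OleDbDataAdapter can unwrap the ADO recordset that the Execute SQL task stored there into a DataTable. A sketch with hypothetical names (User::ResultSet, columns CustomerId and CustomerName); SSIS 2005 scripts are VB.NET, C# shown to match the other snippets in this thread:
public override void CreateNewOutputRows()
{
    System.Data.DataTable table = new System.Data.DataTable();
    new System.Data.OleDb.OleDbDataAdapter().Fill(table, Variables.ResultSet);

    foreach (System.Data.DataRow row in table.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.CustomerId = (int)row["CustomerId"];
        Output0Buffer.CustomerName = row["CustomerName"].ToString();
    }
}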
A.
View 14 Replies
View Related
Dec 10, 2007
Is there a default destination component used when a new data flow is created? The reason I ask is simply curiosity. I have an xml file with 2 pieces of data: item A and item B. A should simply get copied out of the file. B should undergo a quick transform. I set up an XML source such that two columns are mapped correctly to the XML source data of A and B. I set up my data transform task as well. So, if I leave those two components on the .dtsx page with no other components, then will there be a default data flow destination already created? ...OR, do you always have to have a destination component?
Thanks for the input. I am just curious.
View 4 Replies
View Related
Jan 12, 2006
I have a package variable that I set via an ExecuteSQL task. I want to reference it in a data flow script component. In the Script component I enter the variable into the ReadOnlyVariables collection, then in the script I reference it as Me.Variables.var. (E.G. counter = Me.Variables.var)
I'm getting errors when the data flow starts:
Error: 0xC0047062 at Provider, Set Surrogate Key [4261]: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute.
I have no problem referencing other variables that I have in DerivedColumn transformations. I've tried putting the variable in the ReadWriteVariables collection but I get the same error. I don't understand why this is so difficult. Please help.
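As I understand the locking model, variables listed in ReadOnlyVariables are available through the Variables collection for the whole execution, while anything in ReadWriteVariables is only unlocked inside PostExecute. The usual workaround is to work with a local copy, sketched here with illustrative variable names (C# for consistency with the rest of the thread):
private int counter;

public override void PreExecute()
{
    base.PreExecute();
    counter = Variables.StartValue;     // listed under ReadOnlyVariables
}

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    counter += 1;                       // read and update the local during row processing
}

public override void PostExecute()
{
    Variables.EndValue = counter;       // listed under ReadWriteVariables; only writable here
    base.PostExecute();
}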
View 6 Replies
View Related
Apr 27, 2006
Hi,
I've read the various posts and articles regarding this matter, but I seem to have problems getting to work:
In my control flow, I start by declaring a variable named "LastJobLedgerEntryID", to identify the records I need to add to the stage. From there I would like to use this variable in the source component in my dataflow, i.e.:
"SELECT [Entry No_],[Job No_],[Posting Date],[Document No_],[Type],[No_],[Description],[Quantity],[Direct Unit Cost],[Unit Cost],[Unit Price],[Chargeable],[Job Posting Group],[Global Dimension 1 Code],[Global Dimension 2 Code],[Work Type Code] FROM mytable WHERE [Entry No_] > " + @[User::LastJobLedgerEntryID]
But this fails. I should note that the variable LastJobLedgerEntryID is stored as an Int32 with a default value of 0.
Could someone please help me with this?
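The concatenation itself is the usual stumbling block: inside an SSIS expression an Int32 variable has to be cast to a string before it can be appended to the query text. One way to wire it up, sketched programmatically to match the rest of the thread (in the designer the same expression goes on a string variable with EvaluateAsExpression set to True, and the OLE DB source is switched to "SQL command from variable"); User::SourceQuery and the shortened column list are illustrative:
Variable query = package.Variables["User::SourceQuery"];
query.EvaluateAsExpression = true;
query.Expression =
    "\"SELECT [Entry No_],[Job No_] FROM mytable WHERE [Entry No_] > \" + " +
    "(DT_WSTR, 12) @[User::LastJobLedgerEntryID]";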
Thanks in advance!
View 5 Replies
View Related
Dec 3, 2007
I'm wondering if it is possible to create a flat file source on the fly while bypassing the following step:
On the Connection Managers page, add or create the Flat File connection manager, using a descriptive name such as MyFlatFileSrcConnectionManager. Then close the Script Transformation Editor.
I want to create the connection totally in script, yet I'm having a hard time proving this out. Does anybody have any experience with this?
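If the file does not need a Flat File connection manager at all, a Script Component configured as a source can read it directly with a StreamReader; the path could come from a package variable rather than the literal used in this sketch, and the output columns are hypothetical (C# shown although SSIS 2005 scripts are VB.NET):
public override void CreateNewOutputRows()
{
    using (System.IO.StreamReader reader = new System.IO.StreamReader(@"C:\data\input.txt"))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] parts = line.Split('|');
            Output0Buffer.AddRow();
            Output0Buffer.Col1 = parts[0];
            Output0Buffer.Col2 = parts[1];
        }
    }
}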
Ryan
View 4 Replies
View Related
Oct 30, 2007
Hello-
I have an SSIS package which I've been using for nearly a year now. Basically the package is responsible for looping through a directory, and importing pipe delimited files into a database.
The issue I'm having is with the UnDoubleError data flow component. I've been using it to remove the qualifiers from the data being imported, in this case double-quotes, ex: { " " }. I have found that the component will insert a single double-quote { " } when it finds a null string, or two consecutive qualifiers for that matter. If there are two qualifiers with a space separating them, then it will insert an empty string, or rather a string with one space in it, and the qualifiers will be removed.
The dilemma is that since the log files have some columns which are technically NULL, by which I mean columns where there are only two qualifiers, ex: { "" }, the UnDoubleError component then inserts a single double-quote. I want to retain the NULL value if it exists, though still use the component to remove the qualifiers when there actually is data there.
Any suggestions?
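One workaround, assuming a lone double-quote is never legitimate data in those columns, is a small Script Component placed after UnDoubleError that turns the sentinel back into a NULL; the column name is hypothetical and the logic is shown in C#:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // A column that held only two qualifiers comes through as a single double-quote.
    if (!Row.MyCol_IsNull && Row.MyCol == "\"")
        Row.MyCol_IsNull = true;
}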
Thanks for the help...
View 1 Replies
View Related
Dec 20, 2007
Hi,
I'm trying to get a record count out of a database using an OLE DB Source and a Row Count task but keep getting an error. I set up a variable as Int32 and select the variable name in the Row Count task, and when I go to the Input Columns tab to select a field to count, it gives me this error:
Error at Data Flow Task[Row Count[505]]: The component "Row Count" (505) has forbidden the requested use of the input column with lineage ID 32.
I don't even know what this means?
thanks,
View 4 Replies
View Related
Jan 30, 2006
I am getting the following exception when attempting to read from a DataReaderDestination:
System.Exception was unhandled
Message="Could not obtain a DataReader object from the specified data flow component."
Source="Microsoft.SqlServer.Dts.DtsClient"
StackTrace:
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.internalPrepare(Boolean fReaderRequired)
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.ExecuteReaderInThread()
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.ExecuteReader(CommandBehavior behavior)
at CA3DataImportTool.ViewSSISOutput.btnRun_Click(Object sender, EventArgs e) in C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolCA3DataImportToolViewSSISOutput.cs:line 35
at System.Windows.Forms.Control.OnClick(EventArgs e)
at System.Windows.Forms.Button.OnClick(EventArgs e)
at System.Windows.Forms.Button.OnMouseUp(MouseEventArgs mevent)
at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks)
at System.Windows.Forms.Control.WndProc(Message& m)
at System.Windows.Forms.ButtonBase.WndProc(Message& m)
at System.Windows.Forms.Button.WndProc(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData)
at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
at System.Windows.Forms.Application.Run(Form mainForm)
at CA3DataImportTool.Program.Main() in C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolCA3DataImportToolProgram.cs:line 18
at System.AppDomain.nExecuteAssembly(Assembly assembly, String[] args)
at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.ThreadHelper.ThreadStart()
If I use the example from SQL Server BOL (http://msdn2.microsoft.com/en-us/library/ms135917.aspx), and make a new package for the sample in the same project, the sample works. The only thing that I can see that is significantly different between my code and the sample is that my DataReaderDestination has a lot more data in it, but here's the relevant code:
string dtexecArgs;
string dataReaderName;
DtsConnection dtsConnection;
DtsCommand dtsCommand; //IDbCommand
IDataReader dtsDataReader;
DataTable dtsTable;
dtexecArgs = @"/FILE ""C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolML3000_IntegrationProjectPackage.dtsx"" ";
dataReaderName = "DataReaderDest";
dtsConnection = new DtsConnection();
dtsConnection.ConnectionString = dtexecArgs;
dtsConnection.Open();
dtsCommand = new DtsCommand(dtsConnection);
dtsCommand.CommandText = dataReaderName;
dtsDataReader = dtsCommand.ExecuteReader(CommandBehavior.Default); // EXCEPTION HERE
Please help!
Richard Hein
View 5 Replies
View Related
Nov 9, 2007
Hi
After having read the forums I understand that it is not possible to use parameters in the "Ole Db Command", when connecting to Oracle.
What I have to do, is to delete a record in an Oracle table from within the "Data Flow", based on parameters.
It does not seem to be possible to create the SQL statement in a variable (with parameters generated) and use the variable in the "Ole Db Command"
Instead I probably need to use the "Script Component"?
So what I need is step by step guidance for how to connect to and interact with an Oracle database from a Script Component, what to import and where, what to declare and where, and generally a way to get this to work.
Any help appreciated.
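A sketch of the Script Component route, assuming an ADO.NET connection manager (named OracleAdoNet here) is added on the script's Connection Managers page, since an ADO.NET manager hands back a managed connection object while a plain OLE DB manager does not. Scripts in SSIS 2005 are VB.NET, so treat the C# below as the shape of the calls rather than paste-in code, and check the exact connection type your provider returns:
private System.Data.OleDb.OleDbConnection connection;

public override void AcquireConnections(object Transaction)
{
    connection = (System.Data.OleDb.OleDbConnection)
        Connections.OracleAdoNet.AcquireConnection(Transaction);
}

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    using (System.Data.OleDb.OleDbCommand cmd = connection.CreateCommand())
    {
        cmd.CommandText = "DELETE FROM MY_TABLE WHERE MY_KEY = ?";   // placeholder table and key
        cmd.Parameters.AddWithValue("p1", Row.MyKey);
        cmd.ExecuteNonQuery();
    }
}

public override void ReleaseConnections()
{
    Connections.OracleAdoNet.ReleaseConnection(connection);
}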
Olafur
View 4 Replies
View Related
Nov 14, 2007
Does anyone know how to hook up to data flow pipeline events via custom solution (C#)? I am trying to write code to log start and end times of components(lookup,merge joins etc) in a data flow task. I tried with a class inheriting from the EventsProvider class but it didn't work as this is only for container tasks. Any ideas will be greatly appreciated.
View 3 Replies
View Related
Aug 28, 2007
Hi,
I am creating an SSIS package programmatically. I am facing a problem setting the name of a data flow component.
Here is the sample code:
IDTSComponentMetaData90 oledbItemMasterSource;
oledbItemMasterSource = dataFlow.ComponentMetaDataCollection.New();
oledbItemMasterSource.ComponentClassID = "DTSAdapter.OLEDBSource";
oledbItemMasterSource.Name = "InventTable";
oledbItemMasterSource.Description = "Get Item Master";
After creating this package I found that 'InventTable' is not set as the name of the OLE DB Source component, whereas I can set the name of the Flat File destination component.
Kindly suggest a way to fix this issue.
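The likely cause is ordering: ProvideComponentProperties resets the component's metadata, including its Name, to the component defaults, so a name assigned before that call gets overwritten. Assigning Name after instantiating and calling ProvideComponentProperties should make it stick; a sketch based on the code above:
IDTSComponentMetaData90 oledbItemMasterSource = dataFlow.ComponentMetaDataCollection.New();
oledbItemMasterSource.ComponentClassID = "DTSAdapter.OLEDBSource";

CManagedComponentWrapper wrapper = oledbItemMasterSource.Instantiate();
wrapper.ProvideComponentProperties();             // resets Name and Description to defaults

oledbItemMasterSource.Name = "InventTable";       // set after the reset, so it is kept
oledbItemMasterSource.Description = "Get Item Master";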
Thanks-
aravind
View 1 Replies
View Related
Jan 1, 2008
How do I tell the destination of the dataflow to use the stage table that has not been created yet?
View 9 Replies
View Related
Feb 13, 2008
At our business we are getting a lot of PDF documents that are being hand-keyed into a database. Has anyone heard of or know of an SSIS Data Flow Source component that I could use to read those documents into a data stream and process them?
View 5 Replies
View Related
Jan 30, 2008
Hi,
I have an SSIS package with an OLE DB Source and then a Derived Column component to manage all the data from the OLE DB Source. I used to have default columns such as Create Date and Update Date set to a fixed date. Now we have decided to put these default column values into a table to manage them. I then have the problem of choosing which component I should use in order to have these columns selected from the default table.
For example: if Create Date is null, I have to select the default value from the default table; otherwise, use the Create Date value, and so on.
Thanks,
Megan
View 17 Replies
View Related
Jul 24, 2007
I have found some not-very-good behavior in the Union All data flow component.
I have a Script component and about 30 Union All data flow components. I have a string field named 'MyField'. The length of the field is 15 characters. The field appears first in the Script component (defined as a 'MyField' output column) and is present in all the Union All components. I need to change the field length from 15 to 10. In the Script Transformation Editor of the Script component I changed the 'MyField' output column Length property value and set it to 10. But the length of the field remains unchanged in all the Union All components downstream, and no errors or warnings appear. To fix this I recreated (deleted and added again) 'MyField' in all 30 Union All components.
If I extend the length from, say, 15 to 20 in the first of the Union All components, an error sign appears:
"Error 1 Validation error. Data Flow Task: Union All [22]: The metadata for "input column "MyField" (144)" does not match the metadata for the associated output column."
and only recreating the field in all the Union All components fixes it.
Maybe someone knows another solution to fix this behavior?
View 1 Replies
View Related
Dec 4, 2006
Hi,
I'm pretty new to SSIS and I have some problems with error logging. I want to get a detailed error description in a script component of a data flow. For the moment I use this line:
Row.ErrorDesc = ComponentMetaData.GetErrorDescription(Row.ErrorCode)
and for unique constraints on a SQL table I get this error: The data value violates integrity constraints.
For the same error, if I use an OnError event handler, I get more rows and the first of them is more explicit (variable System::ErrorDescription):
An OLE DB error has occurred. Error code: 0x80040E2F.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "Cannot insert duplicate key row in object 'dbo.dimDepot' with unique index 'IX_dimDepot'.".
Is it possible to get such a detailed error text in a script component of a data flow? If yes, how?
Or, if I use the error event, how can I let the data flow continue even if there is an error?
Thanks for your help
krest
View 1 Replies
View Related