Created A Stage Table For First Data Flow Component But...
Jan 1, 2008
How do I tell the destination of the data flow to use the stage table that has not been created yet?
Hello dear Forum,
I guess this should be rather simple to answer, but I just don't get how to do this. I just want to import some data from an Access .mdb file into a SQL Server 2005 database.
As I don't want to create all the tables in SQL Server 2005 beforehand, I'd like to create them from within my SSIS package. Therefore I use an 'Execute SQL Task' in the control flow before stepping into the data flow, where only an OLE DB source and an OLE DB destination exist...
VERY simple... My problem is that I can't select the table from the selection list in the OLE DB destination, because it does not exist before the package is executed...
If I create the table by hand just to be able to select it, that does not help: once everything is set up and I delete the table again (because it will be created by the package anyway), an error occurs before the package executes, in the validation phase:
Package Validation Error: "Invalid object name 'myToBeCreatedTable'".
Can't I create tables on the fly and then use them in my data flow tasks?
Kind regards,
Wolfgang
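A common way around this validation error is to set DelayValidation to True on the Data Flow Task (or ValidateExternalMetadata to False on the OLE DB destination), so the table only has to exist at run time, after the Execute SQL Task has created it. A minimal C# sketch of setting this through the SSIS 2005 object model; the package path and task name are examples only, and the same property can simply be set in the designer:
using Microsoft.SqlServer.Dts.Runtime;
Application app = new Application();
Package package = app.LoadPackage(@"C:\MyPackages\ImportFromAccess.dtsx", null);   // example path
// "Data Flow Task" is whatever the data flow is named in the package.
TaskHost dataFlow = (TaskHost)package.Executables["Data Flow Task"];
dataFlow.DelayValidation = true;   // defer validation until the Execute SQL Task has created the table
app.SaveToXml(@"C:\MyPackages\ImportFromAccess.dtsx", package, null);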
Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,
Hi,
I have an SSIS package which will have an OLE DB Source, and then a Derived Column component to manage all data from the OLE DB Source. I used to have default columns such as Create Date and Update Date set to a fixed date. Now we have decided to put these default column values into a table to manage them. I now have a problem choosing which component I should use in order to have these columns selected from the default table.
For example: if Create Date is null, I have to select the default value from the default table; otherwise, use the Create Date value, and so on.
Thanks,
Megan
Hello,
Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?
Thanks,
Yoann
I am new to the MDM Profisee tool and currently working on an address verification project for my organization. I wanted to clear up my doubts here about the unique identifier in the stage table and how it works. Here is what I understand so far:
Step 1) I created an entity using the MDM Profisee UI and it generated a stage table in the MDS database called stg.Address_leaf
Step 2) I have loaded data from an external source into the MDS stage table using ETL and passed the import type as 2 and the import status id as 0
Step 3) I have run the system-generated stored procedure stg.udp_Address_leaf to load the model and passed the version name as Version_1, the log flag as '1' and the batch tag as 'Address'
Now, below are my questions:
1) What is the field I can use in the MDS stage table to populate my unique identifier value coming from the source? (Let's say Address_Id is my unique value for all the records coming from the source.)
2) Where/how is the unique identifier useful in this process? Will it be useful in the next load from stage to model?
3) If I truncate and load my MDS stage table in the next run and a few earlier records have been updated, how will it update those records in the model? Will this process (the code present in the SP) recognize them by the unique identifier column present in the MDS stage table?
I have an int value that I pulled from a table in a database and it is stored in a variable. I would like to increment this value in a script component and then insert it into a field in my OLE DB destination. Is there any example out there of using a script component to do something similar to this?
Do I want to select source, destination or transformation?
Can I, and how do I, access my variable from within the script component?
Any assistance or examples you can point me towards would be greatly appreciated.
There is a table with a column that contains XML documents. For each record from my Data Flow Source, I want to pass in the XML document and the node to interrogate, and return the value contained in the node. Like the CRM component, this is probably one I will have to write from scratch in C#, but I would like to avoid having to create the custom component if it already exists in the public arena.
Does anyone know of any XML SSIS Data Flow Components that are downloadable for free?
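For reference, the per-row logic such a component (or a Script Component standing in for it) would need is small. A hedged sketch, shown in C# for brevity (the SSIS 2005 Script Component itself is authored in VB.NET); the input columns XmlDocumentText and XPathToFind and the output column NodeValue are made-up names:
using System.Xml;
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Load the XML held in the row and pull out the requested node's value.
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(Row.XmlDocumentText);
    XmlNode node = doc.SelectSingleNode(Row.XPathToFind);
    if (node == null)
        Row.NodeValue_IsNull = true;
    else
        Row.NodeValue = node.InnerText;
}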
Hi all,
We are running SQL Server 2005 Standard edition on a Vista Business machine. Whenever we try to use DTS to import or export data we get this message:
The SSIS Data Flow Task could not be created. Verify that DTSPipeline.dll is available and registered.
Of course we have explicitly registered this dll and others.
Also we reinstalled SQL Server (and did it again. And did it again).
And we did the same for Visual Studio Standard (and again).
Any ideas in this elaborate group?
Thanks,
Tjerk
Dear Colleagues,
I'm trying to develop a custom Data Flow Transformization component in SSIS.
I compiled it without errors, installed it in the GAC and in the PipelineComponents folder; however, I always get the following message when I try to drag the component onto the designer surface:
The component could not be added to the Data Flow task.
Please verify that this component is properly installed.
------------------------------
ADDITIONAL INFORMATION:
The data flow object "RisikoKennzahlenKomponenten.MarktwertTransformation, RisikoKennzahlenKomponenten, Version=1.0.0.0, Culture=neutral, PublicKeyToken=cfa8722b8086ac2d" is not installed correctly on this computer. (Microsoft.DataTransformationServices.Design)
Program Location:
at Microsoft.DataTransformationServices.Design.DtsBasePackageDesigner.GetPipelineInfo(String creationName, IServiceProvider serviceProvider)
at Microsoft.DataTransformationServices.Design.DesignUtils.GetNewPipelineComponentObjectName(IDTSComponentMetaDataCollection90 parentCollection, String clsid, IDTSComponentMetaData90 componentMetadata, PipelineComponentInfo& pipelineComponentInfo)
at Microsoft.DataTransformationServices.Design.PipelineTaskDesigner.AddNewComponent(String clsid, Boolean throwOnError)
This happens with EVERY custom component on my computer. The same components work fine on other machines.
Does anyone have an idea?
Regards
Arne Janning
Is there a default destination component used when a new data flow is created? The reason I ask is simply curiosity. I have an xml file with 2 pieces of data: item A and item B. A should simply get copied out of the file. B should undergo a quick transform. I set up an XML source such that two columns are mapped correctly to the XML source data of A and B. I set up my data transform task as well. So, if I leave those two components on the .dtsx page with no other components, then will there be a default data flow destination already created? ...OR, do you always have to have a destination component?
Thanks for the input. I am just curious.
I have a package variable that I set via an ExecuteSQL task. I want to reference it in a data flow script component. In the Script component I enter the variable into the ReadOnlyVariables collection, then in the script I reference it as Me.Variables.var. (E.G. counter = Me.Variables.var)
I'm getting errors when the data flow starts:
Error: 0xC0047062 at Provider, Set Surrogate Key [4261]: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute.
I have no problem referencing other variables that I have in DerivedColumn transformations. I've tried putting the variable in the ReadWriteVariables collection but I get the same error. I don't understand why this is so difficult. Please help.
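One workaround that avoids the locked-collection restriction is to lock the variable yourself through the VariableDispenser instead of listing it in ReadOnlyVariables/ReadWriteVariables. A hedged sketch, in C# for brevity (the 2005 Script Component editor uses VB.NET, but the same calls apply); the variable name follows the post above:
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
private int counter;
public override void PreExecute()
{
    base.PreExecute();
    // Lock, read and immediately unlock the variable; this works outside PostExecute.
    IDTSVariables90 vars = null;
    VariableDispenser.LockOneForRead("User::var", ref vars);
    counter = (int)vars["User::var"].Value;
    vars.Unlock();
}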
Hi,
I've read the various posts and articles regarding this matter, but I seem to have problems getting to work:
In my control flow, I start by declaring a variable named "LastJobLedgerEntryID", to identify the records I need to add to the stage. From there I would like to use this variable in the source component in my dataflow, i.e.:
"SELECT [Entry No_],[Job No_],[Posting Date],[Document No_],[Type],[No_],[Description],[Quantity],[Direct Unit Cost],[Unit Cost],[Unit Price],[Chargeable],[Job Posting Group],[Global Dimension 1 Code],[Global Dimension 2 Code],[Work Type Code] FROM mytable WHERE [Entry No_] > " + @[User::LastJobLedgerEntryID]
But this fails? I should note that the variable LastJobLedgerEntryID is stored as an Int32, with a default value of 0.
Could someone please help me with this?
Thanks in advance!
I'm wondering if it is possible to create a flat file source on the fly while bypassing the following step:
On the Connection Managers page, add or create the Flat File connection manager, using a descriptive name such as MyFlatFileSrcConnectionManager. Then close the Script Transformation Editor.
I want to create the connection entirely in script, yet I'm having a hard time proving this out... does anybody have any experience with this?
Ryan
Hello-
I have an SSIS package which I've been using for nearly a year now. Basically the package is responsible for looping through a directory, and importing pipe delimited files into a database.
The issue I'm having is with the UnDoubleError data flow component. I've been using it to remove the qualifiers from the data being imported, which in this case happen to be double-quotes, ex: { " " }. I have found that the component will insert a single double-quote { " } when it finds a null string, or two consecutive qualifiers for that matter. If there are two qualifiers with a space separating them, then it will insert an empty string, or rather a string with one space in it, and the qualifiers will be removed.
The dilemma is that since the log files have some columns which are technically NULL, by which I mean columns where there exist only two qualifiers, ex: { "" }, the UnDoubleError component then inserts a single double-quote. I want to retain the NULL value if it exists, though still use the component to remove the qualifiers when there actually is data there.
Any suggestions?
Thanks for the help...
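If the component itself cannot be changed, one option is a small Script Component transformation placed after it that turns the lone double-quote back into a true NULL. A sketch under that assumption, in C# for brevity; the column name LogColumn is a placeholder:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // UnDoubleError leaves a single double-quote where the source had "" (an empty,
    // i.e. NULL, field); convert that back to NULL and leave real values alone.
    if (!Row.LogColumn_IsNull && Row.LogColumn == "\"")
        Row.LogColumn_IsNull = true;
}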
I am getting the following exception when attempting to read from a DataReaderDestination:
System.Exception was unhandled
Message="Could not obtain a DataReader object from the specified data flow component."
Source="Microsoft.SqlServer.Dts.DtsClient"
StackTrace:
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.internalPrepare(Boolean fReaderRequired)
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.ExecuteReaderInThread()
at Microsoft.SqlServer.Dts.DtsClient.DtsCommand.ExecuteReader(CommandBehavior behavior)
at CA3DataImportTool.ViewSSISOutput.btnRun_Click(Object sender, EventArgs e) in C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolCA3DataImportToolViewSSISOutput.cs:line 35
at System.Windows.Forms.Control.OnClick(EventArgs e)
at System.Windows.Forms.Button.OnClick(EventArgs e)
at System.Windows.Forms.Button.OnMouseUp(MouseEventArgs mevent)
at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks)
at System.Windows.Forms.Control.WndProc(Message& m)
at System.Windows.Forms.ButtonBase.WndProc(Message& m)
at System.Windows.Forms.Button.WndProc(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData)
at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
at System.Windows.Forms.Application.Run(Form mainForm)
at CA3DataImportTool.Program.Main() in C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolCA3DataImportToolProgram.cs:line 18
at System.AppDomain.nExecuteAssembly(Assembly assembly, String[] args)
at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.ThreadHelper.ThreadStart()
If I use the example from SQL Server BOL (http://msdn2.microsoft.com/en-us/library/ms135917.aspx), and make a new package for the sample in the same project, the sample works. The only thing that I can see that is significantly different between my code and the sample is that my DataReaderDestination has a lot more data in it, but here's the relevant code:
string dtexecArgs;
string dataReaderName;
DtsConnection dtsConnection;
DtsCommand dtsCommand; //IDbCommand
IDataReader dtsDataReader;
DataTable dtsTable;
dtexecArgs = @"/FILE ""C:Documents and Settings
heinMy DocumentsVisual Studio 2005ProjectsCA3DataImportToolML3000_IntegrationProjectPackage.dtsx"" ";
dataReaderName = "DataReaderDest";
dtsConnection = new DtsConnection();
dtsConnection.ConnectionString = dtexecArgs;
dtsConnection.Open();
dtsCommand = new DtsCommand(dtsConnection);
dtsCommand.CommandText = dataReaderName;
dtsDataReader = dtsCommand.ExecuteReader(CommandBehavior.Default); // EXCEPTION HERE
Please help!
Richard Hein
Hi
After having read the forums I understand that it is not possible to use parameters in the "OLE DB Command" when connecting to Oracle.
What I have to do is delete a record in an Oracle table from within the "Data Flow", based on parameters.
It does not seem to be possible to create the SQL statement in a variable (with the parameters generated) and use the variable in the "OLE DB Command".
Instead I probably need to use the "Script Component"?
So what I need is step by step guidance for how to connect to and interact with an Oracle database from a Script Component, what to import and where, what to declare and where, and generally a way to get this to work.
Any help appreciated.
Olafur
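As a starting point, a Script Component can talk to Oracle directly through ADO.NET's OLE DB provider rather than the OLE DB Command. A hedged sketch in C# (the 2005 Script Component is authored in VB.NET, but the ADO.NET calls are the same); the provider, connection string, table and column names below are all placeholders:
using System.Data.OleDb;
private OleDbConnection connection;
public override void PreExecute()
{
    base.PreExecute();
    connection = new OleDbConnection("Provider=OraOLEDB.Oracle;Data Source=MyTnsAlias;User Id=myuser;Password=mypassword;");
    connection.Open();
}
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // One parameterised DELETE per incoming row.
    using (OleDbCommand cmd = new OleDbCommand("DELETE FROM MY_TABLE WHERE MY_ID = ?", connection))
    {
        cmd.Parameters.AddWithValue("MY_ID", Row.MyId);
        cmd.ExecuteNonQuery();
    }
}
public override void PostExecute()
{
    connection.Close();
    base.PostExecute();
}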
Does anyone know how to hook up to data flow pipeline events via a custom solution (C#)? I am trying to write code to log the start and end times of components (lookups, merge joins, etc.) in a data flow task. I tried with a class inheriting from the EventsProvider class, but it didn't work, as this is only for container tasks. Any ideas will be greatly appreciated.
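Short of writing a custom events class, the data flow task's PipelineComponentTime log event already records per-component phase timings (Validate, PreExecute, ProcessInput, PostExecute). A rough, unverified sketch of enabling it from C#; the package path, task name and connection manager name are examples:
using Microsoft.SqlServer.Dts.Runtime;
Application app = new Application();
Package package = app.LoadPackage(@"C:\MyPackages\MyPackage.dtsx", null);
// Log to a text file; "LogFileConnection" must be an existing File connection manager.
LogProvider provider = package.LogProviders.Add("DTS.LogProviderTextFile.1");
provider.ConfigString = "LogFileConnection";
package.LoggingMode = DTSLoggingMode.Enabled;
TaskHost dataFlow = (TaskHost)package.Executables["Data Flow Task"];
dataFlow.LoggingMode = DTSLoggingMode.Enabled;
dataFlow.LoggingOptions.SelectedLogProviders.Add(provider);
dataFlow.LoggingOptions.EventFilterKind = DTSEventFilterKind.Inclusion;
dataFlow.LoggingOptions.EventFilter = new string[] { "PipelineComponentTime" };
package.Execute();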
Hi,
I am creating an SSIS package programmatically. I am facing a problem setting the name of a data flow component.
Here is the sample code:
IDTSComponentMetaData90 oledbItemMasterSource;
oledbItemMasterSource = dataFlow.ComponentMetaDataCollection.New();
oledbItemMasterSource.ComponentClassID = "DTSAdapter.OLEDBSource";
oledbItemMasterSource.Name = "InventTable";
oledbItemMasterSource.Description = "Get Item Master";
After creating this package I found that 'InventTable' is not set on the OLE DB Source component, whereas I can set the name on the OLE DB Flat File destination component.
Kindly suggest a way to fix this issue.
Thanks-
aravind
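One likely cause: ProvideComponentProperties resets the component metadata (including Name and Description) to the adapter's defaults, so a name assigned before that call is lost. Assuming the package code later instantiates and initialises the component, setting the name after initialisation should stick, as in this sketch based on the snippet above:
IDTSComponentMetaData90 oledbItemMasterSource = dataFlow.ComponentMetaDataCollection.New();
oledbItemMasterSource.ComponentClassID = "DTSAdapter.OLEDBSource";
// Initialise the component first...
CManagedComponentWrapper instance = oledbItemMasterSource.Instantiate();
instance.ProvideComponentProperties();
// ...then set the name, so ProvideComponentProperties cannot overwrite it.
oledbItemMasterSource.Name = "InventTable";
oledbItemMasterSource.Description = "Get Item Master";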
Dear all,
I am developing tools for automatic creation of data warehouse tables, cubes and SSIS packages. Generating the SSIS Data Flows works very well using the SSIS components for OLE DB Source, Derived Column, Lookup and OLE DB Destination.
However for some of the advanced functionality I need to use Script Component. I have managed to add it in the Data Flow with all inputs and outputs, but how do I populate it with my code? I've seen there is a component property called "SourceCode" and one called "BinaryCode". The "SourceCode" contains the code, but also some extra metadata.
Questions:
Do you know if there is any programmatic support to generate the Source Code property with the metadata necessary?
Do you know how to compile the Source Code and generate the property BinaryCode?
Example from my code below:
// Create script component
IDTSComponentMetaData90 script = dataFlowTask.ComponentMetaDataCollection.New();
script.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
CManagedComponentWrapper scriptWrapper = script.Instantiate();
script.InputCollection.New();
script.OutputCollection.New();
scriptWrapper.ProvideComponentProperties();
script.Name = "Logics";
// Create path
IDTSPath90 scriptPath = dataFlowTask.PathCollection.New();
scriptPath.AttachPathAndPropagateNotifications(lastComponent.OutputCollection[0], script.InputCollection[0]);
// Populate input and output columns
IDTSInput90 scriptInput = script.InputCollection[0];
IDTSVirtualInput90 scriptVInput = scriptInput.GetVirtualInput();
foreach (IDTSOutputColumn90 col in oledbSrc.OutputCollection[0].OutputColumnCollection)
{
scriptWrapper.SetUsageType(scriptInput.ID, scriptVInput, col.LineageID, DTSUsageType.UT_READONLY);
IDTSOutputColumn90 tmp = script.OutputCollection[0].OutputColumnCollection.New();
tmp.Name = col.Name;
tmp.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
}
// Make script asynchronous
script.OutputCollection[0].SynchronousInputID = 0;
Thanks for any assistance and Best Regards,
Johan Åhlén,
Business Intelligence consultant at IFS
I am trying to cleanly shutdown a dataflow task, which contains a script component, when RunningPackage.Stop() is called from the SSIS runtime.
I've been going in ever decreasing circles with no success - it looks like the cleanest way to find out whether RunningPackage.Stop() has been called is to call FireQueryCancel(). But I can't find any reference to anything useful in a dataflow task script component that gives me something that implements IDTSComponentEvents. The nearest thing seems to be Me.ComponentMetaData which gives a reference to IDTSComponentMetaData90, but this only has methods for calling FireError, FireInformation, FireProgress, FireWarning, and FireCustomEvent. But no FireQueryCancel.
Is there a way in a script component that I can find out the state of QueryCancel?
Any help would be appreciated.
At our business we are getting a lot of PDF documents that are being hand keyed into a database. Has anyone heard of, or does anyone know of, an SSIS Data Flow Source component that I could use to read those documents into a data stream and process them?
I have found some not-very-good behavior of the Union All Data Flow Component.
I have a Script component and about 30 Union All Data Flow Components. I have a string field named 'MyField'. The length of the field is 15 characters. The field appears first in the Script component (defined as a 'MyField' output column) and is present in all the Union All Data Flow Components. I need to change the field length from 15 to 10. In the Script Transformation Editor of the Script component I changed the 'MyField' output column Length property value and set it to 10. But the length of the field remains 15 in all the Union All Data Flow Components downstream, and no errors or warnings appear. To fix this I recreated (deleted and added again) 'MyField' in all 30 Union All Data Flow Components.
If I extend the length from, say, 15 to 20 in the first of the Union All Data Flow Components, an error sign appears:
"Error 1 Validation error. Data Flow Task: Union All [22]: The metadata for "input column "MyField" (144)" does not match the metadata for the associated output column."
and only recreating the field in all the Union All Data Flow Components can fix it.
Maybe someone knows another solution to fix such behavior?
Hi,
I'm pretty new to SSIS and I have some problems with error logging. I want to get a detailed error description in a script component of a data flow. For the moment I use these lines:
Row.ErrorDesc = ComponentMetaData.GetErrorDescription(Row.ErrorCode)
and for a unique constraint violation on a SQL table I get this error: "The data value violates integrity constraints."
For the same error, if I use an OnError event handler, I get more rows and the first of them is more explicit (variable System::ErrorDescription):
An OLE DB error has occurred. Error code: 0x80040E2F.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "Cannot insert duplicate key row in object 'dbo.dimDepot' with unique index 'IX_dimDepot'.".
Is it possible to have such a detailed error text in a script component of a data flow? If yes, how?
Or, if I use the error event, how can I allow the data flow to go ahead even if there is an error?
Thanks for your help
krest
Hi,
I'm having my first go at developing a destination adapter which will send data to an update Web Service.
I've got some rather big gaps in my understanding. I've been following the various samples I've found on the net and have validated my mapping and picked up all the available column names and data types, which appear in the Input and Output Properties tab of the Advanced Editor, but I only have a tab for "Input Columns" and not "Column Mappings".
Which method defines the available columns for the user to map?
Let me know if I haven't given enough information.
cheers
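For what it's worth, the Advanced Editor only shows a Column Mappings tab when the destination's input exposes external metadata columns, i.e. when ExternalMetadataColumnCollection.IsUsed is true and the collection is populated (typically in ProvideComponentProperties or ReinitializeMetaData). A hedged C# sketch; the column name and data type are placeholders:
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
IDTSInput90 input = ComponentMetaData.InputCollection[0];
input.ExternalMetadataColumnCollection.IsUsed = true;
// One external metadata column per field the web service can accept.
IDTSExternalMetadataColumn90 extCol = input.ExternalMetadataColumnCollection.New();
extCol.Name = "CustomerName";
extCol.DataType = DataType.DT_WSTR;
extCol.Length = 100;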
Hello Helpers,
I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a script component (here a transformation is used) in a data flow task, to transform a two-digit month into a three-letter month abbreviation:
Example: '01' should be transformed into 'Jan'
Many thanks for all your commitment and help!
Ulrike
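If calling back into the database is not a hard requirement, this transformation can be done entirely in the script with the .NET globalization classes. A sketch, in C# for brevity (the 2005 Script Component uses VB.NET); the column names MonthCode and MonthAbbrev are examples:
using System.Globalization;
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // '01' -> 'Jan', '02' -> 'Feb', ...
    int month = int.Parse(Row.MonthCode, CultureInfo.InvariantCulture);
    Row.MonthAbbrev = CultureInfo.InvariantCulture.DateTimeFormat.GetAbbreviatedMonthName(month);
}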
Hi,
I developed an SSIS Data Flow Component and placed the DLL file into the DTS PipelineComponents folder. Then I registered the component in the GAC.
But when I try to add the component to the toolbox, it is not in the list of SSIS Data Flow Items. What does this mean?
Thanks in advance.
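Besides checking that the DLL sits in the exact folder the toolbox scans (...\90\DTS\PipelineComponents), it is worth checking that the class carries the DtsPipelineComponent attribute; without it the assembly is ignored by the SSIS Data Flow Items list. A minimal sketch, with made-up names:
using Microsoft.SqlServer.Dts.Pipeline;
[DtsPipelineComponent(DisplayName = "My Transformation",
    ComponentType = ComponentType.Transform)]
public class MyTransformation : PipelineComponent
{
    // ProvideComponentProperties, ProcessInput, etc. go here.
}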
Hi all,
I am creating a custom data flow component for SSIS for use in a package. I've got some custom properties that I am exposing using the supplied advanced editor (no custom property editor here).
Some of my properties are enumerated types, and I have deciphered how to get those properties to show as dropdown lists of their respective enumerations. (For those of you who may be looking as hard as I did as to how to accomplish this, see the end of this post.)
I also have a few properties which request SSIS package variable names - such as a file name variable. However, I can't figure out how to tell the advanced editor that the property is looking for an SSIS variable, so that it can show a dropdown list of package variables, much like virtually any other Microsoft-supplied Data Flow component can.
Is there a Type Converter I could specify for those custom properties? Is there another way to instruct SSIS that my custom property is expecting a variable? Or do I need to code a custom UI for editing my Data Flow Task?
To create a dropdown list of values for a custom property that represents an enum, do the following (a consolidated sketch follows the list):
1. Create your enum definition, such as "public enum ThisIsMyEnum { one, two }"
2. Create a new class that inherits from TypeConverter, such as "public class MyEnumConverter : TypeConverter"
3. Override "CanConvertFrom", and return true if "sourceType == typeof(string)"
4. Override "CanConvertTo", and return true if "destinationType == typeof(string)"
5. Override "ConvertFrom", and return the enum value (such as "one" or "two" in my example) that corresponds to the string passed in the parameter "value"
6. Override "ConvertTo", and return a string that corresponds to the enum value passed in the parameter "value"
7. Override "GetStandardValuesSupported" and return true
8. Override "GetStandarValuesExclusive" and return true to indicate that ONLY the enum values should be accepted
9. Override "GetStandardValues", and return a new StandardValuesCollection constructed with Enum.GetValues() of your enum, such as "return new StandardValuesCollection(Enum.GetValues(typeof(ThisIsMyEnum)));"
10. Just above your "public enum" declaration, add a "TypeConverter" attribute to link your type converter to your enum, such as "[TypeConverter(typeof(MyEnumConverter))]"
11. In "ProvideComponentProperties", after you've created your custom property like this: "IDTSCustomProperty90 propEnum = ComponentMetaData.CustomPropertyCollection.New()", add another line to specify the TypeConverter property of the property to the full assembly name of your type converter, like so: "propEnum.TypeConverter = typeof(MyEnumConverter).AssemblyQualifiedName;"
In my Script Component properties I have included "ClientReportGroupId" as a ReadWrite variable. This variable is declared as a Package Variable.
I have a Data Flow Script Component (Destination Type) and in the properties I have a read/write variable called User::giRowCount
User::giRowCount is populated by a Row Count Component previously in the Data Flow.
After reading http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=97494&SiteID=1 it is very clear that you can actually only use variables in the PostExecute of a Data Flow Script Component or you will get an error
"Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute."
What I need to do is actually create a file in PreExecute and write the number of records (= User::giRowCount) as the second line, as part of the header. I also need to pass a read/write variable such as gsFilename to save me hardcoding the path:
(Me.Variables.gsFilename.ToString), (Me.Variables.giRowCount.ToString)
They must go in the PreExecute sub - workarounds please. Here is the complete script component that creates a file with header, data and trailer. Is there any workaround?
Thanks in advance Dave
Imports System
Imports System.Data
Imports System.Math
Imports System.IO
Imports System.Text
Imports System.Configuration
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Public Class ScriptMain
Inherits UserComponent
'Dim fs As FileStream
Dim fileName As String = "F:FilePickUpMyfilename.csv"
'Dim fileName = (Me.Variables.gsFilename.ToString)
Dim myFile As FileInfo = New FileInfo(fileName)
Dim sw As StreamWriter = myFile.CreateText
Dim sbRecord As StringBuilder = New StringBuilder
Public Overrides Sub PreExecute()
sbRecord.Append("RECORD_START").Append(vbNewLine)
End Sub
Public Overrides Sub ParsedInput_ProcessInputRow(ByVal Row As ParsedInputBuffer)
sbRecord.Append(Row.ProjectID.ToString)
sbRecord.Append(Row.TransactionRefNum.ToString)
sbRecord.Append(Row.BillToCustomerNum.ToString)
sbRecord.Append(Row.BillToAccountNum.ToString)
sbRecord.Append(Row.BillToLineNum.ToString)
sbRecord.Append(Row.BillToReassignmentNum.ToString)
sbRecord.Append(Row.ChargeCode.ToString)
sbRecord.Append(Row.NotificationMethod.ToString)
sbRecord.Append(Row.AdjustmentAmount.ToString)
sbRecord.Append(Row.AdjustmentDate.ToString)
sbRecord.Append(Row.ReparationGivenFlag)
sbRecord.Append(Row.BillingSystemProcessingErrorCode.ToString).Append(vbNewLine)
End Sub
Public Overrides Sub PostExecute()
sbRecord.Append("RECORD_COUNT").Append((vbTab))
sbRecord.Append(Me.Variables.giRowCount.ToString).Append(vbNewLine)
sbRecord.Append("RECORD_END").Append(vbNewLine)
'Now write to file before next record extract
sw.Write(sbRecord.ToString)
'Clear contents of String Builder
sbRecord.Remove(0, sbRecord.Length)
'Close file
sw.Close()
End Sub
End Class
Has anyone got a workaround
thanks in advance
Dave
I have a data flow that consists of
OLE DB source which calls a stored proc that returns a result set
data conversion
Excel destination
I am in design mode in Business Intelligence Development Studio. My Excel destination (with an Excel connection) shows no sheet name, though I have an Execute SQL task before the data flow to create the Excel table called SHEET1. Needless to say, there are no output columns visible to do any mappings. I did go to the Excel connection to set the OpenRowset property to SHEET1, but it seems to have no effect.
I can do the export in SQL Server Management studio and that works fine, but it is basic and does not meet my requirements. I have to customize the package to allow dynamic Excel filenames based on account names and have to split my result set into multiple excel sheets because excel 2003 has a max of 65536 rows per sheet. Also when I use the export wizard, I have the source as a table and eventually the source has to be a stored proc with input parms.
What am I missing or doing wrong? Thanks in advance
Hi,
I've created a Custom Data Flow Component and added some Custom Properties.
I want the user to set the contents using an expression. I did some research and came up with the following:
Code Snippet
IDTSCustomProperty90 SourceTableProperty = ComponentMetaData.CustomPropertyCollection.New();
SourceTableProperty.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
SourceTableProperty.Name = "SourceTable";
But it doesn't work. If I enter @[System::PackageName] in the field, it comes out as "@[System::PackageName]" instead of the actual package name.
I'm also unable to find how I can tell the designer to show the expression editor. I would like to see the ellipsis (...) next to my field.
Any help would be greatly appreciated!
Thank you
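For what it's worth, with ExpressionType set to CPET_NOTIFY the expression is not typed into the property value itself; it is attached as a property expression on the containing Data Flow Task, under a name of the form [ComponentName].[PropertyName], and is only evaluated at run time. The ellipsis button in the property grid normally requires a custom UI; the designer does not add it automatically. A hedged C# sketch of setting such an expression through the object model, where the task and component names are examples and package is an already-loaded Microsoft.SqlServer.Dts.Runtime.Package:
using Microsoft.SqlServer.Dts.Runtime;
TaskHost dataFlowTask = (TaskHost)package.Executables["Data Flow Task"];
// The custom property "SourceTable" on the component named "My Source" gets its
// value from this expression at run time.
dataFlowTask.SetExpression("[My Source].[SourceTable]", "@[System::PackageName]");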