Programmatically Creating Transformation Script Component
Nov 20, 2006
Does anyone have any examples of programmatically creating a Transformation Script Component (or Source/Destination) in the dataflow? I have been able to create other Transforms for the dataflow like Derived Column, Sort, etc. but for some reason the Script Component doesn't seem to work the same way.
I have done it as below, trying many ways to get the ComponentClassID, including the AssemblyQualifiedName and the GUID. No matter what I do, when it hits ProvideComponentProperties, I get Exception from HRESULT: 0xC0048021.
IDTSComponentMetaData90 scriptPropType = dataFlow.ComponentMetaDataCollection.New();
scriptPropType.Name = "Transform Property Type";
scriptPropType.ComponentClassID = "DTSTransform.ScriptComponent";
// have also tried scriptPropType.ComponentClassID =typeof(Microsoft.SqlServer.Dts.Pipeline.ScriptComponent).AssemblyQualifiedName;
scriptPropType.Description = "Transform Property Type";
CManagedComponentWrapper instance2 = scriptPropType.Instantiate();
instance2.ProvideComponentProperties();
Any help or examples would be greatly appreciated! Thanks!
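For what it's worth, a later post on this page resolves the ComponentClassID by looking it up in Application.PipelineComponentInfos instead of hard-coding a ProgID, which avoids guessing the registered name. A minimal sketch of that approach (the "Script Component" display name and the dataFlow variable are assumed to match the setup above):
// Look up the Script Component's registered creation name by its toolbox display name.
Application app = new Application();
IDTSComponentMetaData90 scriptMetadata = dataFlow.ComponentMetaDataCollection.New();
scriptMetadata.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
scriptMetadata.Name = "Transform Property Type";
CManagedComponentWrapper instance = scriptMetadata.Instantiate();
instance.ProvideComponentProperties();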
View 24 Replies
Sep 7, 2007
Does anyone know how to create a Source Script Component programmatically? I can only seem to create a Transformation Script Component. I have this:
PipeLineWrapper.IDTSComponentMetaData90 sourceComponent =
((dataflowTask as TaskHost).InnerObject as PipeLineWrapper.MainPipe).ComponentMetaDataCollection.New();
sourceComponent.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
Is there anything extra I have to do to make it a source script component, seeing how it defaults to a transformation?
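One hedged suggestion, not verified: after ProvideComponentProperties() the Script Component comes back with one input and one output, so it behaves as a transformation. Removing the default input should leave only the output so it acts as a source; this assumes IDTSInputCollection90 exposes RemoveObjectByID like the other pipeline collections.
CManagedComponentWrapper wrapper = sourceComponent.Instantiate();
wrapper.ProvideComponentProperties();
// Drop the default input so only the output remains (source-style component).
if (sourceComponent.InputCollection.Count > 0)
{
    sourceComponent.InputCollection.RemoveObjectByID(sourceComponent.InputCollection[0].ID);
}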
View 1 Replies
Apr 10, 2006
Microsoft published a "Creating a Custom Transformation Component" walkthrough on
http://www.microsoft.com/downloads/details.aspx?FamilyID=1c2a7dd2-3ec3-4641-9407-a5a337bea7d3&DisplayLang=en
Does anyone know where to get the Hands-On Lab Files mentioned?
Thanks
Alex
View 4 Replies
Mar 30, 2006
I am writing a custom data flow transformation component and I need to get the name of the preceding component.
I have been trying to find a way to get a reference to the Package object, MainPipe object or IDTSPath90 object (connecting to the IDTSInput90 of my component) from my component because I think from there I can get to the information I want.
Does anyone have any suggestions?
TIA . . . Ed
View 7 Replies
May 10, 2006
Hello,
Does anyone know of a good example of how to programmatically add a Data Conversion transformation to a package? I have come so far that I have the component in my data flow task, but I don't know how to set its properties. That is, how to determine the columns that should be converted, and to what data type, etc.
I (as far as I understand) need a data conversion transformation since I have not managed to create (via the designer) a package that reads data from an AS400 via DB2OLEDB and stores it in a SQL Server 2005 Database. I keep getting the error "string cannot be converted from unicode to non-unicode" (or something like that) and by searching this forum I learned that the data conversion component might do the trick.
I added this conversion to a manually designed package and it solved the error, but I don't know how to re-create this package programmatically. I would really appreciate some help on this.
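For reference, a minimal, unverified sketch of the step that is hard to find in the documentation: after selecting an input column on the Data Conversion component with SetUsageType, create the converted output column through the component's design-time interface and point it back at the input column. All variable names here are hypothetical, and the "SourceInputColumnLineageID" property name is what a designer-built package typically contains, so verify it against your own saved .dtsx.
// conversionInstance = CManagedComponentWrapper, conversionComponent = IDTSComponentMetaData90,
// inputColumn = the DT_WSTR input column already selected with SetUsageType (all assumed).
IDTSOutput90 convOutput = conversionComponent.OutputCollection[0];
IDTSOutputColumn90 convertedCol = conversionInstance.InsertOutputColumnAt(
    convOutput.ID, 0, inputColumn.Name + "_ansi", "non-unicode copy");
// Ask for DT_STR of the same length; code page 1252 is an assumption.
convertedCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR, inputColumn.Length, 0, 0, 1252);
// Tell the component which input column feeds this converted output column.
conversionInstance.SetOutputColumnProperty(convOutput.ID, convertedCol.ID,
    "SourceInputColumnLineageID", inputColumn.LineageID);
The OLE DB destination would then be mapped to the converted column's lineage ID instead of the original unicode column.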
Also, if someone has managed to import data from AS400 to SQL server 2005 via OLEDB and NOT got the string conversion error, please let me know!
Regards,
Annika
View 3 Replies
Jun 19, 2007
Hello,
I have created an SSIS package programmatically and I want to add a Lookup transformation.
How can I add a column from the reference dataset to the transformation?
I have tried to add a new output column but it gives me a validation error. I wrote the following code to add a new output column to the lookup.
IDTSOutputColumn90 outputColumn = this.lookup.OutputCollection[0].OutputColumnCollection.New();
outputColumn.Name = col.Name;
outputColumn.Description = "Staging table output";
outputColumn.TruncationRowDisposition = DTSRowDisposition.RD_FailComponent;
outputColumn.ErrorOrTruncationOperation = "Copy Column";
outputColumn.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
Please suggest another way to add a column from the reference dataset to the transformation output.
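A hedged alternative sketch: instead of calling OutputColumnCollection.New() directly on the metadata (which bypasses the component and tends to fail validation), go through the Lookup's design-time interface so the component creates the column together with its custom properties. The "CopyFromReferenceColumn" property name mirrors what the designer writes for the 2005 Lookup / Fuzzy Lookup; verify it against a designer-built package before relying on it.
// this.lookup and col are the same objects used in the snippet above; if a wrapper already
// exists in your code, reuse it instead of calling Instantiate() again.
CManagedComponentWrapper lookupWrapper = this.lookup.Instantiate();
IDTSOutput90 lookupOutput = this.lookup.OutputCollection[0];
IDTSOutputColumn90 refCol = lookupWrapper.InsertOutputColumnAt(
    lookupOutput.ID, 0, col.Name, "Staging table output");
refCol.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
// Name of the reference-table column to copy into this output column.
lookupWrapper.SetOutputColumnProperty(lookupOutput.ID, refCol.ID, "CopyFromReferenceColumn", col.Name);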
View 10 Replies
Jan 9, 2007
Hi there,
I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component by which I may change the source-destination column mapping programmatically. So I thought I need to add a data conversion component to the data flow task. After adding this component (I found its component id as {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I created a path which establishes the mapping between the output columns of the source component and the input columns of the data conversion component. Now I am not sure how to establish the mapping between the data conversion component's input column collection and output column collection.
I am giving my code snippet here,
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
// ... code for configuring the source data flow component
IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();// creating data conversion
conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";// This is the GUID for data conversion component
CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();//Instantiate
conversionInstance.ProvideComponentProperties();
// Now creating a path to connect the source and conversion
IDTSPath90 fPath = dataFlowTask.PathCollection.New(); fPath.AttachPathAndPropagateNotifications(
sourceDataFlowComponent.OutputCollection[0],
conversionDataFlowComponent.InputCollection[0]);
// Should I acquire connections for the data conversion? I'm not sure
conversionInstance.AcquireConnections(null);
conversionInstance.ReinitializeMetaData();
// Get the input collection
IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection){
conversionInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
// ... Well, here I am stuck. What do I need to do here to establish a map
// between conversionDataFlowComponent.InputCollection[0] and
// conversionDataFlowComponent.OutputCollection[0]?
As you can see, I am just one step away from creating the mapping between the input and output collections. Can anybody give me an idea how I can achieve this?
I will appreciate all kind of suggestions and comments.
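A hedged sketch of the missing step, building on the code above: for each input column you selected with SetUsageType, ask the Data Conversion component (through its design-time interface, conversionInstance) to create an output column and point it at the input column's lineage ID. The "SourceInputColumnLineageID" property name and the DT_STR/1252 target type are assumptions to verify against a designer-built package.
IDTSOutput90 convOutput = conversionDataFlowComponent.OutputCollection[0];
foreach (IDTSInputColumn90 inCol in conversionDataFlowComponent.InputCollection[0].InputColumnCollection)
{
    IDTSOutputColumn90 outCol = conversionInstance.InsertOutputColumnAt(
        convOutput.ID, 0, "Converted_" + inCol.Name, string.Empty);
    // Example target type only: DT_WSTR converted to DT_STR, code page 1252.
    outCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR, inCol.Length, 0, 0, 1252);
    // This is what links the output column back to its source input column.
    conversionInstance.SetOutputColumnProperty(convOutput.ID, outCol.ID,
        "SourceInputColumnLineageID", inCol.LineageID);
}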
Regards
Moim
View 11 Replies
Jan 15, 2007
Hi,
I'm writing a custom data flow component (transformation). For this I need access to an external data source. As a best practice, it seems that you should use a connection manager in the package.
But I don't really know how to access them.
I've overridden AcquireConnections() to look up the connection managers in the RuntimeConnectionCollection, but it doesn't seem to get called. If I call it myself, for example in Validate (just for debugging purposes), the collection is empty, although there are two connection managers in the package.
What am I doing wrong, or does someone have a code snippet for me?
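A hedged sketch of the usual pattern inside a class derived from PipelineComponent: RuntimeConnectionCollection only contains entries that the component itself declares in ProvideComponentProperties, and a connection manager has to be assigned to that entry (in the designer or by the calling code) before AcquireConnections will hand anything back. The connection name and the ADO.NET/SqlConnection cast below are illustrative assumptions.
public override void ProvideComponentProperties()
{
    base.ProvideComponentProperties();
    // Declare a runtime connection; it appears on the component's Connection Managers tab.
    IDTSRuntimeConnection90 conn = ComponentMetaData.RuntimeConnectionCollection.New();
    conn.Name = "SourceConnection";
}

private System.Data.SqlClient.SqlConnection sqlConnection;

public override void AcquireConnections(object transaction)
{
    IDTSRuntimeConnection90 conn = ComponentMetaData.RuntimeConnectionCollection["SourceConnection"];
    if (conn.ConnectionManager == null)
        throw new System.InvalidOperationException("No connection manager assigned to SourceConnection.");
    // For an ADO.NET (SqlClient) connection manager the acquired object is the SqlConnection itself.
    sqlConnection = conn.ConnectionManager.AcquireConnection(transaction) as System.Data.SqlClient.SqlConnection;
}

public override void ReleaseConnections()
{
    if (sqlConnection != null)
        ComponentMetaData.RuntimeConnectionCollection["SourceConnection"].ConnectionManager.ReleaseConnection(sqlConnection);
}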
Thanks T.
View 1 Replies
Jun 22, 2007
Hi all,
I am missing something simple. I have added a new Transformation Script, put in my code to read the input rows, and defined my outputs. I have tried to change the SynchronousInputID to 0, but I only get the option of None or input "Input 0" (91). What have I missed?
View 1 Replies
Feb 13, 2006
dear experts,
I'm trying to build a package programmatically from a client C# application. I'm working to create three data flow components:
- OleDB Source
- Derived Column Transformations
- OleDb Destination.
My package works fine, but I have several problems when inserting an expression as the Value of an IDTSCustomProperty90 object, like this one:
LEN(#firstname.lineageID) > 5 ? <<if true>> : <<if false>>
Simple string expressions, for example "#firstname.lineageID" + "#lastname.lineageID", seem to work fine.
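For comparison, a heavily hedged sketch of how a conditional expression is usually wired onto a Derived Column output column. The "Expression"/"FriendlyExpression" property names and the "#<lineageID>" reference syntax are what the designer writes into a saved .dtsx; the variable names and the example condition are hypothetical, so treat this as a starting point rather than a confirmed recipe.
// derivedInstance / derivedComponent: wrapper and metadata of the Derived Column (assumed);
// firstNameCol: an upstream output column already selected as READONLY (assumed).
IDTSOutputColumn90 newCol = derivedInstance.InsertOutputColumnAt(
    derivedComponent.OutputCollection[0].ID, 0, "NameFlag", string.Empty);
newCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 10, 0, 0, 0);
// Lineage-ID form used by the engine, and the friendly form shown in the designer.
string expression = "LEN(#" + firstNameCol.LineageID + ") > 5 ? \"long\" : \"short\"";
string friendly = "LEN([" + firstNameCol.Name + "]) > 5 ? \"long\" : \"short\"";
derivedInstance.SetOutputColumnProperty(derivedComponent.OutputCollection[0].ID, newCol.ID, "Expression", expression);
derivedInstance.SetOutputColumnProperty(derivedComponent.OutputCollection[0].ID, newCol.ID, "FriendlyExpression", friendly);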
Thanks a lot in advance.
Paganelli Francesco
View 19 Replies
Jun 8, 2006
I am developing a custom transformation component, where I am building a custom object and want it to be transferred from the component UI to the component. I explored this issue and came to know that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is I have not been able to make use of this interface. If anybody has faced the same issue and found a solution, please let me know.
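One hedged alternative, in case the persistence interface stays stubborn: serialize the custom object to a string and keep it in a component-level custom property, which the runtime saves and loads with the package. The type and property names below are illustrative only.
// Create the property once, typically in ProvideComponentProperties (shown inline for brevity).
IDTSCustomProperty90 stateProp = ComponentMetaData.CustomPropertyCollection.New();
stateProp.Name = "CustomObjectState";

// In the UI (or wherever the object is edited): serialize it into the property value.
System.Xml.Serialization.XmlSerializer serializer =
    new System.Xml.Serialization.XmlSerializer(typeof(MyCustomObject));   // MyCustomObject is hypothetical
System.IO.StringWriter writer = new System.IO.StringWriter();
serializer.Serialize(writer, myCustomObject);
stateProp.Value = writer.ToString();

// In the component: read it back and deserialize.
string state = (string)ComponentMetaData.CustomPropertyCollection["CustomObjectState"].Value;
MyCustomObject restored = (MyCustomObject)serializer.Deserialize(new System.IO.StringReader(state));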
Thanks in advance
Karun
View 1 Replies
Mar 12, 2007
Hello,
Using the following documentation as a guide:
http://msdn2.microsoft.com/zh-cn/library/aa337080.aspx
I instantiated a new script component into an existing Data Flow in my SSIS project.
In the Script Transformation Editor, under the Connection Managers section, I associated the name dbConnManager to an already existing Connection Manager in the project.
My Connection Manager is of type OLE DB.
I then opened up the script designer and added the following lines of code where it said "Add your code here"
Dim myConnManager As IDTSConnectionManager90 = _
Me.Connections.ECFconnection
Dim dbConn As OleDb.OleDbConnection = _
CType(myConnManager.AcquireConnection(Nothing), OleDb.OleDbConnection)
When I test run the project I get the following error and the new script component is red:
Unable to cast COM object of type 'System.__ComObject' to class type 'System.Data.OleDb.OleDbConnection'. Instances of types that represent COM components cannot be cast to types that do not represent COM components; however they can be cast to interfaces as long as the underlying COM component supports QueryInterface calls for the IID of the interface.
I know the database connection works since I am using it in a component that executes before this new script component.
I am stuck...Any suggestions?
View 3 Replies
May 27, 2008
I've been trying to figure this out on my own for pretty much all of today, and part of last week. I've downloaded samples, searched this forum, blogs, etc. So I figured I would post, since it's the end of the day, and I'm not much further along.
I'm working on a custom transformation component, whose main function is to use SQL encryption/decryption to encrypt/decrypt data from the input columns, into the output columns. The component needs two strings, a key name and a certificate name, as well as the connection manager it should use to connect to SQL which will do the encryption/decryption.
Here's where I'm stuck:
1) How can I provide the key/certificate names via properties? What I'm expecting/looking for is a way to add these two properties at the component-level, which would show up under the "Custom Properties" section of the properties pane (currently, this only has one property, "UserComponentTypeName"). These key/certificate values will be used for all input columns.
2) How do I access the connection managers from within the component? What is the best way to go about using a connection manager from within my component to connect to SQL and perform the encryption/decryption? In a custom task, this was fairly simple, but it seems that same concept won't work on a transformation component.
3) Is there a better way to go about accomplishing this (column encryption via SQL from within SSIS)? Am I going about this all wrong?
As I said, I've searched for direction, but there seems to be next to nothing in the way of a good reference for creating custom transformation components. I've looked at two MS samples, but can't seem to make any sense out of them.
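On question 1, a hedged sketch: custom properties declared on ComponentMetaData in ProvideComponentProperties show up under "Custom Properties" in the properties pane, which is where KeyName/CertificateName would live. A runtime connection declared the same way covers question 2 (it is then read back in AcquireConnections). Names below are illustrative.
public override void ProvideComponentProperties()
{
    base.ProvideComponentProperties();

    IDTSCustomProperty90 keyName = ComponentMetaData.CustomPropertyCollection.New();
    keyName.Name = "KeyName";
    keyName.Description = "Name of the SQL Server symmetric key used for encryption";
    keyName.Value = string.Empty;

    IDTSCustomProperty90 certName = ComponentMetaData.CustomPropertyCollection.New();
    certName.Name = "CertificateName";
    certName.Description = "Name of the certificate protecting the key";
    certName.Value = string.Empty;

    // Question 2: declare a runtime connection here and read it back in AcquireConnections
    // via ComponentMetaData.RuntimeConnectionCollection["EncryptionConnection"].
    IDTSRuntimeConnection90 conn = ComponentMetaData.RuntimeConnectionCollection.New();
    conn.Name = "EncryptionConnection";
}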
Thanks in advance.
Jerad
View 3 Replies
Aug 14, 2007
Hi all,
Is there any tutorial on how a custom transformation component works? Maybe a blog, PDF or something...
Specifically, I need to learn how to generate an output column composed from 3 input columns. The problem is I don't know how to set the column value... does anyone have some sample code?
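A hedged sketch of the usual pattern in a custom synchronous transformation: map the design-time columns to buffer positions once in PreExecute, then compose and set the value per row in ProcessInput. The column names ("FirstName", "MiddleName", "LastName", "FullName") are hypothetical, and error handling is left out.
private int firstIdx, middleIdx, lastIdx, fullNameIdx;

public override void PreExecute()
{
    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    // Translate lineage IDs into positions in the pipeline buffer.
    firstIdx = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection["FirstName"].LineageID);
    middleIdx = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection["MiddleName"].LineageID);
    lastIdx = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection["LastName"].LineageID);
    // With a synchronous output the new column lives in the same buffer as the input columns.
    fullNameIdx = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection["FullName"].LineageID);
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    while (buffer.NextRow())
    {
        // Compose the output value from the three input columns of the current row.
        string full = buffer.GetString(firstIdx) + " " + buffer.GetString(middleIdx) + " " + buffer.GetString(lastIdx);
        buffer.SetString(fullNameIdx, full);
    }
}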
Thanks!
View 6 Replies
Feb 2, 2007
Dear all,
I am developing tools for automatic creation of data warehouse tables, cubes and SSIS packages. Generating the SSIS Data Flows works very well using the SSIS components for OLE DB Source, Derived Column, Lookup and OLE DB Destination.
However for some of the advanced functionality I need to use Script Component. I have managed to add it in the Data Flow with all inputs and outputs, but how do I populate it with my code? I've seen there is a component property called "SourceCode" and one called "BinaryCode". The "SourceCode" contains the code, but also some extra metadata.
Questions:
Do you know if there is any programmatic support to generate the Source Code property with the metadata necessary?
Do you know how to compile the Source Code and generate the property BinaryCode?
Example from my code below:
// Create script component
IDTSComponentMetaData90 script = dataFlowTask.ComponentMetaDataCollection.New();
script.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
CManagedComponentWrapper scriptWrapper = script.Instantiate();
script.InputCollection.New();
script.OutputCollection.New();
scriptWrapper.ProvideComponentProperties();
script.Name = "Logics";
// Create path
IDTSPath90 scriptPath = dataFlowTask.PathCollection.New();
scriptPath.AttachPathAndPropagateNotifications(lastComponent.OutputCollection[0], script.InputCollection[0]);
// Populate input and output columns
IDTSInput90 scriptInput = script.InputCollection[0];
IDTSVirtualInput90 scriptVInput = scriptInput.GetVirtualInput();
foreach (IDTSOutputColumn90 col in oledbSrc.OutputCollection[0].OutputColumnCollection)
{
scriptWrapper.SetUsageType(scriptInput.ID, scriptVInput, col.LineageID, DTSUsageType.UT_READONLY);
IDTSOutputColumn90 tmp = script.OutputCollection[0].OutputColumnCollection.New();
tmp.Name = col.Name;
tmp.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
}
// Make script asynchronous
script.OutputCollection[0].SynchronousInputID = 0;
Thanks for any assistance and Best Regards,
Johan Åhlén,
Business Intelligence consultant at IFS
View 2 Replies
Mar 7, 2007
Hi,
I am trying to apply the sample provided by Microsoft in the following article:
http://msdn2.microsoft.com/en-us/library/ms403355.aspx
I am trying to call a SSIS package from a web service hosted on the same machine as the package file is sitting. The package is running fine from the Agent and also by the "Integration Services Project" in VS.NET.
I had a lot of problems with permissions but they are resolved, at least I have no error messages to point to that direction. Now I am getting these results:
1. Error: -1073659874 / Description: The file name "\Diver-svrInputDataFilesdn_cust.txt" specified in the connection was not valid.
2. Error: -1073659875 / Description: Connection "bdn_cust" failed validation.
3. Error: -1073659874 / Description: The file name "\Diver-svrInputDataFilesdn_cust.txt" specified in the connection was not valid.
4. Error: -1073659875 / Description: Connection "SourceConnectionFlatFile" failed validation.
Where \DiverMInputFilesdn_cust.txt" is a file processed by the package.
Is there anybody who can give me some direction? Thank you in advance.
View 4 Replies
Feb 26, 2007
Hi There,
I am loading/executing packages from C# and I need to populate a temp table from user input and pass this table as a variable to the DataReader source component's SQL command. I am using an expression to build this query, but I get a design-time error with this command:
"select id, (SysDate + 28) as ExpiresDate from Table1 where id in (Select Id from" +@[User::Table2]+")"..
I have declared Table2 as a variable of type Object, and I am creating Table2 in C# and assigning that table to the user variable. But in design mode I get an error: the expression cannot be evaluated.
Can anybody please tell me why I cannot do this?
Thanks,
View 3 Replies
Feb 14, 2007
How could I comment out my script component? It is isolated; I just want to check something without losing the long script written inside.
Thanks,
Fahad
View 6 Replies
Mar 26, 2007
Hi,
I have 56 fields coming into the input of a script component. The need for the script component was just to check whether one of those 56 columns has a valid date or not. If valid, it will parse it and put it in an output date column; if not, it will put in NULL.
The other 55 fields should just be passed through. I don't really want to write code and define output columns for them. How do I do this?
Any input in this would be appreciated.
Thanks,
View 5 Replies
Feb 19, 2008
Hello Helpers,
I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a data flow task to transform a two-digit month into a three-letter month abbreviation:
Example: '01' should be transformed into 'Jan'
Many thanks for all your commitment and help!
Ulrike
View 4 Replies
Aug 30, 2006
I've seen several posts asking about this possibility, but nothing I've read has helped me. Some use SQL-DMO, others suggest SMO, and others only explain how to connect to a server and then run "CREATE DATABASE". I want to do this from .NET. Connecting to SQL Server 2005 and executing "CREATE DATABASE" I could manage. But how do I do this with SQL Express? Trying to do SqlConnection.Open with a non-existing DB does not work; it says "file not exists". Or do I just have the wrong connection string? Can someone post an example connection string that works with a non-existing database? Some hints I've read make me consider using SMO, but I don't have it on my computer. Where do I get it from? Any links would be nice.
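For what it's worth, a minimal sketch of the usual workaround: you cannot open a connection to a database that does not exist yet, so connect to the instance's master database first and run CREATE DATABASE from there. The ".\SQLEXPRESS" instance name is the Express default and is an assumption about your setup.
using (System.Data.SqlClient.SqlConnection conn =
    new System.Data.SqlClient.SqlConnection(@"Server=.\SQLEXPRESS;Integrated Security=SSPI;Database=master"))
{
    conn.Open();
    using (System.Data.SqlClient.SqlCommand cmd =
        new System.Data.SqlClient.SqlCommand("CREATE DATABASE MyNewDatabase", conn))
    {
        cmd.ExecuteNonQuery();
    }
}
// After this, reconnect with Database=MyNewDatabase in the connection string.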
View 6 Replies
Apr 17, 2007
Hello. I'm trying to create a SqlDataSource control programmatically. I need to do this because I want to do some stuff in my MasterPage's Page_Init event.
Here's my code (Master.master.vb):
Protected Sub Page_Init(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Init
lblUser.Text = Page.User.Identity.Name
Dim PUser As New ControlParameter
PUser.ControlID = "lblUser"
PUser.Name = "LoginName"
PUser.PropertyName = "Text"
PUser.Type = TypeCode.String
PUser.DefaultValue = Page.User.Identity.Name
Dim SQLDS_Login As New SqlDataSource
SQLDS_Login.ID = "SQLDS_Login"
SQLDS_Login.ConnectionString = "I put the connection string here. How do I use the one in my web.config?"
SQLDS_Login.SelectCommand = "SELECT [LoginID], [LoginName], [Role], [Status] FROM [myLogin] WHERE ([LoginName] = @LoginName)"
SQLDS_Login.SelectParameters.Add(PUser)
SQLDS_Login.SelectCommandType = SqlDataSourceCommandType.Text
GridView1.DataSource = SQLDS_Login
GridView1.DataBind()
End Sub
When I run it, I get this error message:
The SqlDataSource control 'SQLDS_Login' does not have a naming container. Ensure that the control is added to the page before calling DataBind.
I never had any problem with Inserts, Updates and Deleting, but I have never made it work for Select when doing it programmatically.
Can you help me with this?
View 1 Replies
Jan 13, 2006
Hi there,
I got a user who is requesting a weekly report to be exported in CSV (comma delimited) format. But this process will run weekly using a scheduled job, and he wants the file saved to a certain directory on the network. A two-part question...
1. Is there a way to create a .csv file programmatically after running the query (as sketched below)?
2. How would I save the .csv file to a specified directory on the network?
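A hedged sketch of one way to do both from a small .NET job step: run the query, then stream the rows into a comma-delimited file on a UNC path. Server, query, and share names are placeholders, the account running the job needs write permission on the share, and no quoting of embedded commas is done here.
using (System.Data.SqlClient.SqlConnection conn =
    new System.Data.SqlClient.SqlConnection("Server=MyServer;Database=MyDb;Integrated Security=SSPI"))
using (System.Data.SqlClient.SqlCommand cmd =
    new System.Data.SqlClient.SqlCommand("SELECT * FROM WeeklyReport", conn))
{
    conn.Open();
    using (System.Data.SqlClient.SqlDataReader reader = cmd.ExecuteReader())
    using (System.IO.StreamWriter writer = new System.IO.StreamWriter(@"\\fileserver\reports\weekly.csv"))
    {
        // Header row taken from the result set's column names.
        string[] names = new string[reader.FieldCount];
        for (int i = 0; i < reader.FieldCount; i++) names[i] = reader.GetName(i);
        writer.WriteLine(string.Join(",", names));
        while (reader.Read())
        {
            string[] values = new string[reader.FieldCount];
            for (int i = 0; i < reader.FieldCount; i++) values[i] = System.Convert.ToString(reader[i]);
            writer.WriteLine(string.Join(",", values));
        }
    }
}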
TIA
View 3 Replies
Feb 15, 2007
Hi!
I am trying to write a script in VB.NET that will run a report that already exists in the system and export the results to my local machine. We are using MS Reporting Services to manage and manipulate the reports. So here's my question:
Is it possible to programmatically run a report in VB.NET based on an existing report? I noticed that Crystal Reports has a nice export method, but I have not been able to find anything similar for my situation. Basically, I believe I would need some sort of Reporting Services object in .NET that would allow me to run the report and export the results. Does anyone know of such a structure, or if this is even possible? Thanks!!
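One hedged option that avoids writing a full SOAP client: Reporting Services URL access can render an existing report straight to a file. The sketch below is C# for consistency with the rest of this page, but the same two calls work from VB.NET; server, folder, report, and parameter names are placeholders.
// rs:Format also accepts EXCEL, CSV, XML, etc.; report parameters are passed as ordinary query items.
string url = "http://reportserver/ReportServer?/SalesReports/WeeklySummary" +
             "&rs:Command=Render&rs:Format=PDF&RegionParam=West";
using (System.Net.WebClient client = new System.Net.WebClient())
{
    client.Credentials = System.Net.CredentialCache.DefaultCredentials;
    client.DownloadFile(url, @"C:\Reports\WeeklySummary.pdf");
}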
Best regards,
Josh
View 1 Replies
Jan 4, 2007
Hi guys,
I intended to write a program that will create an SSIS package which will import data from a CSV file to SQL Server 2005. But I did not find any good example of this on the internet. I found some examples which export data from SQL Server 2005 to CSV files, and following those examples I have tried to write my own. But I am facing some problems with it. What I am doing here is creating two connection manager objects, one for the flat file and another for OLE DB, and creating a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() for the flat file source data flow component, no output columns are found. Why is it not fetching the output columns from the CSV file? And after that, when it invokes ReinitializeMetaData() for the destination data flow component, it simply throws an exception.
Can anybody help me get around this problem? Can anyone give me a link to a useful article to accomplish this goal?
I am giving my code here too.
I will appreciate any kind of suggestion on this.
Code snippet:
public void CreatePackage()
{
string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
Package pkg = new Package();
pkg.PackageType = DTSPackageType.DTSDesigner90;
ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
ConnectionManager flatfileConnectionManager =
CreateFileConnection(pkg);
// creating the SQL Task for table creation
Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
execSqlTask.Connection = oledbConnectionManager.Name;
execSqlTask.SqlStatementSource =
"CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50)) GO";
// creating the Data flow task
Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;
// Put a precedence constraint between the tasks.
PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
pcTasks.Value = DTSExecResult.Success;
pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;
// Now adding the data flow components
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.Name = "Source Data from Flat file";
// Here is the component class id for flat file source data
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
managedInstance.ProvideComponentProperties();
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
managedInstance.AcquireConnections(null);
managedInstance.ReinitializeMetaData();
managedInstance.ReleaseConnections();
// Get the destination's default input and virtual input.
IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0];
// Here I dont find any columns at all..why??
// Now adding the data flow components
IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
destinationDataFlowComponent.Name =
"Destination Oledb compoenent";
// Here is the component class id for Oledvb data
destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
managedOleInstance.ProvideComponentProperties();
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
// Set the custom properties.
managedOleInstance.SetComponentProperty("AccessMode", 2);
managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");
managedOleInstance.AcquireConnections(null);
managedOleInstance.ReinitializeMetaData(); // Throws exception
managedOleInstance.ReleaseConnections();
// Create the path.
IDTSPath90 path = dataFlowTask.PathCollection.New(); path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0],
destinationDataFlowComponent.InputCollection[0]);
// Get the destination's default input and virtual input.
IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
// Iterate through the virtual input column collection.
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
managedOleInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
DTSExecResult res = pkg.Execute();
}
public ConnectionManager CreateOLEDBConnection(Package p)
{
ConnectionManager ConMgr;
ConMgr = p.Connections.Add("OLEDB");
ConMgr.ConnectionString =
"Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
ConMgr.Name = "SSIS Connection Manager for Oledb";
ConMgr.Description = "OLE DB connection to the Test database.";
return ConMgr;
}
public ConnectionManager CreateFileConnection(Package p)
{
ConnectionManager connMgr;
connMgr = p.Connections.Add("FLATFILE");
connMgr.ConnectionString = @"D:MyCSVFile.csv";
connMgr.Name = "SSIS Connection Manager for Files";
connMgr.Description = "Flat File connection";
connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
return connMgr;
}
And my CSV file is as follows:
NAME, AGE, GENDER
Jon,52,MALE
Linda, 26, FEMALE
That's all. Thanks.
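A hedged guess at the cause of the empty output column collection: a flat file connection manager created in code starts with no columns defined, so the Flat File source has nothing to expose after ReinitializeMetaData(). Something like the sketch below, run before AcquireConnections, defines the columns on the connection manager; the interface names come from the runtime wrapper, and the data types and widths are assumptions matching the sample CSV.
// Inside CreateFileConnection, after setting the Format property on connMgr:
connMgr.Properties["ColumnNamesInFirstDataRow"].SetValue(connMgr, true);

Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManagerFlatFile90 flatFile =
    connMgr.InnerObject as Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManagerFlatFile90;
string[] names = new string[] { "NAME", "AGE", "GENDER" };
for (int i = 0; i < names.Length; i++)
{
    Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManagerFlatFileColumn90 col = flatFile.Columns.Add();
    col.ColumnType = "Delimited";
    // The last column ends at the row delimiter, the others at the comma.
    col.ColumnDelimiter = (i == names.Length - 1) ? Environment.NewLine : ",";
    col.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR;
    col.MaximumWidth = 50;
    ((Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSName90)col).Name = names[i];
}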
View 4 Replies
Jan 6, 2005
I am creating a page that generates a report based on a SQL statement built dynamically from user input.
Everything is good except for the WHERE section, which is created from values in a list box.
For Example:
lstCriteria.items(1).value = "COMPANY = 'foo'"
lstCriteria.items(2).value = "DAY= 2"
I build my SQL statement with these values like so:
SELECT * FROM POO WHERE COMPANY = 'foo' AND DAY = 2
The problem I am having is when there are multiple values of the same type in the list box. Say:
lstCriteria.items(1).value = "COMPANY = 'foo'"
lstCriteria.items(2).value = "DAY= 2"
lstCriteria.items(1).value = "COMPANY = 'moo'"
My employer wants this to be valid, but I am having a tough time coming up with a solution.
I know that my SQL statement needs to now read:
SELECT * FROM POO WHERE COMPANY = 'foo' AND DAY = 2 OR COMPANY = 'poo' AND DAY = 2
I have code set up to read the values of each list box item up to the "=", and I know that I need to compare this value with the others in the list box... but I am not coming up with any good solutions.
Any HELP?
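A hedged sketch of one way to build that WHERE clause: group the criteria by field name, OR the values within a group, and AND the groups together. The parsing is simplified, the item values are assumed to look exactly like "COMPANY = 'foo'" (criteriaFromListBox is a hypothetical list of those strings), and parameterizing the values would be safer than raw concatenation if any of this comes from user input.
System.Collections.Generic.Dictionary<string, System.Collections.Generic.List<string>> groups =
    new System.Collections.Generic.Dictionary<string, System.Collections.Generic.List<string>>();
foreach (string criterion in criteriaFromListBox)   // e.g. "COMPANY = 'foo'", "DAY = 2", "COMPANY = 'moo'"
{
    // The field name is whatever precedes the "=".
    string field = criterion.Split('=')[0].Trim();
    if (!groups.ContainsKey(field)) groups.Add(field, new System.Collections.Generic.List<string>());
    groups[field].Add(criterion.Trim());
}
System.Collections.Generic.List<string> clauses = new System.Collections.Generic.List<string>();
foreach (System.Collections.Generic.KeyValuePair<string, System.Collections.Generic.List<string>> g in groups)
    clauses.Add("(" + string.Join(" OR ", g.Value.ToArray()) + ")");
string sql = "SELECT * FROM POO" +
    (clauses.Count > 0 ? " WHERE " + string.Join(" AND ", clauses.ToArray()) : "");
// Yields: SELECT * FROM POO WHERE (COMPANY = 'foo' OR COMPANY = 'moo') AND (DAY = 2)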
View 2 Replies
Jan 3, 2008
Hi all
This is my code, which I found on Microsoft's site.
If I run it with a SQL Server connection it works, but if I try to use it with SQL Express it gives me this error:
CREATE FILE encountered operating system error 5(access denied) while attempting to open or create the physical file 'c://mydatabase.mdf'
It seems like a permissions error but it isn't. Do I have to set something in SQL Express that is already set in SQL Server?
static void WriteDB()
{
String str;
//sql server connection (commented out - only one myConn declaration can be active)
//SqlConnection myConn = new SqlConnection("Server=localhost;Integrated security=SSPI;database=master");
//sql express connection
SqlConnection myConn = new SqlConnection("Server=localhost;Integrated security=SSPI;database=master");
str = "CREATE DATABASE MyDatabase ON PRIMARY " +
"(NAME = MyDatabase_Data, " +
"FILENAME = 'C:\MyDatabaseData.mdf', " +
"SIZE = 2MB, MAXSIZE = 10MB, FILEGROWTH = 10%) " +
"LOG ON (NAME = MyDatabase_Log, " +
"FILENAME = 'C:\MyDatabaseLog.ldf', " +
"SIZE = 1MB, " +
"MAXSIZE = 5MB, " +
"FILEGROWTH = 10%)";
SqlCommand myCommand = new SqlCommand(str, myConn);
try
{
myConn.Open();
myCommand.ExecuteNonQuery();
MessageBox.Show("DataBase is Created Successfully", "MyProgram", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
catch (System.Exception ex)
{
MessageBox.Show(ex.ToString(), "MyProgram", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
finally
{
if (myConn.State == ConnectionState.Open)
{
myConn.Close();
}
}
}
thanks
Marco
View 3 Replies
Nov 9, 2007
Below is C# code used to create a FuzzyLookup SSIS package programmatically. It does 95% of what I need it to. The only thing missing that I cannot figure out is how to take a Fuzzy Lookup Input column (OLE DB Output Column) and make it "pass through" the fuzzy lookup component to the OLE DB Destination. In the example below, that means I need the QuarantinedEmployeeId to make it into the destination.
Look in the "Test Dependencies" region below to get instructions and scripts used to set assembly references, create the sample tables used for this example, and insert test data.
Can anyone help me get past this last hurdle? You will see at the end of my Fuzzy Lookup region a bunch of commented out code that I've used to try to accomplish this last problem.
Code Block
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
namespace CreateSsisPackage
{
public class TestFuzzyLookup
{
public static void Test()
{
#region Test Dependencies
// Assembly references:
// Microsoft.SqlServer.DTSPipelineWrap
// Microsoft.SQLServer.DTSRuntimeWrap
// Microsoft.SQLServer.ManagedDTS
// First create a database called TestFuzzyLookup
// Next, create tables:
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[EmployeeMatch](
// [RecordId] [int] IDENTITY(1,1) NOT NULL,
// [EmployeeId] [int] NOT NULL,
// [QuarantinedEmployeeId] [int] NOT NULL,
// [_Similarity] [real] NOT NULL,
// [_Confidence] [real] NOT NULL,
// CONSTRAINT [PK_EmployeeMatch] PRIMARY KEY CLUSTERED
//(
// [RecordId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
//GO
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[QuarantinedEmployee](
// [QuarantinedEmployeeId] [int] IDENTITY(1,1) NOT NULL,
// [QuarantinedEmployeeName] [varchar](50) NOT NULL,
// CONSTRAINT [PK_QuarantinedEmployee] PRIMARY KEY CLUSTERED
//(
// [QuarantinedEmployeeId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
//GO
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[Employee](
// [EmployeeId] [int] IDENTITY(1,1) NOT NULL,
// [EmployeeName] [varchar](50) NOT NULL,
// CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED
//(
// [EmployeeId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
// Next, insert test data
//insert into employee values ('John Doe')
//insert into employee values ('Jane Smith')
//insert into employee values ('Ryan Johnson')
//insert into quarantinedemployee values ('John Dole')
#endregion Test Dependencies
#region Create Package
// Create a new package
Package package = new Package();
package.Name = "FuzzyLookupTest";
// Add a Data Flow task
TaskHost taskHost = package.Executables.Add("DTS.Pipeline") as TaskHost;
taskHost.Name = "Fuzzy Lookup";
IDTSPipeline90 pipeline = taskHost.InnerObject as MainPipe;
// Get the pipeline's component metadata collection
IDTSComponentMetaDataCollection90 componentMetadataCollection = pipeline.ComponentMetaDataCollection;
#endregion Create Package
#region Source
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbSourceMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Source Adapter
oledbSourceMetadata.ComponentClassID = "DTSAdapter.OLEDBSource";
// Instantiate the OLE DB Source adapter
IDTSDesigntimeComponent90 oledbSourceComponent = oledbSourceMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbSourceComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerSource = package.Connections.Add("OLEDB");
connectionManagerSource.Name = "OLEDBSource";
// Set the connection string
connectionManagerSource.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the OLE DB Source adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionSource = oledbSourceMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionSource.ConnectionManagerID = connectionManagerSource.ID;
// Tell the OLE DB Source adapter to use the source table
oledbSourceComponent.SetComponentProperty("OpenRowset", "QuarantinedEmployee");
oledbSourceComponent.SetComponentProperty("AccessMode", 0);
// Set up the connection manager object
runtimeConnectionSource.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerSource);
// Establish the database connection
oledbSourceComponent.AcquireConnections(null);
// Set up the column metadata
oledbSourceComponent.ReinitializeMetaData();
// Release the database connection
oledbSourceComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionSource.ReleaseConnectionManager();
#endregion Source
#region Fuzzy Lookup
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 fuzzyLookupMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the Fuzzy Lookup object
fuzzyLookupMetadata.ComponentClassID = "DTSTransform.BestMatch.1";
// Instantiate
IDTSDesigntimeComponent90 fuzzyLookupComponent = fuzzyLookupMetadata.Instantiate();
// Ask the component to set up its component metadata object
fuzzyLookupComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerFuzzy = package.Connections.Add("OLEDB");
connectionManagerFuzzy.Name = "OLEDBFuzzy";
// Set the connection string
connectionManagerFuzzy.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the fuzzy lookup component's runtime connection
IDTSRuntimeConnection90 runtimeConnectionFuzzy = fuzzyLookupMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionFuzzy.ConnectionManagerID = connectionManagerFuzzy.ID;
// Set up the connection manager object
runtimeConnectionFuzzy.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerFuzzy);
// Establish the database connection
fuzzyLookupComponent.AcquireConnections(null);
// Set up the external metadata column
fuzzyLookupComponent.ReinitializeMetaData();
// Release the database connection
fuzzyLookupComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionFuzzy.ReleaseConnectionManager();
// Get the standard output of the OLE DB Source adapter
IDTSOutput90 oledbSourceOutput = oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
// Get the input of the Fuzzy Lookup component
IDTSInput90 fuzzyInput = fuzzyLookupMetadata.InputCollection["Fuzzy Lookup Input"];
// Create a new path object
IDTSPath90 path = pipeline.PathCollection.New();
// Connect the source to Fuzzy Lookup
path.AttachPathAndPropagateNotifications(oledbSourceOutput, fuzzyInput);
// Get the output column collection for the OLE DB Source adapter
IDTSOutputColumnCollection90 oledbSourceOutputColumns = oledbSourceOutput.OutputColumnCollection;
// Get the external metadata column collection for the fuzzy lookup component
IDTSExternalMetadataColumnCollection90 externalMetadataColumns = fuzzyInput.ExternalMetadataColumnCollection;
// Get the virtual input for the fuzzy lookup component
IDTSVirtualInput90 virtualInput = fuzzyInput.GetVirtualInput();
// Loop through output columns and relate columns that will be fuzzy matched on
foreach (IDTSOutputColumn90 outputColumn in oledbSourceOutputColumns)
{
IDTSInputColumn90 col = fuzzyLookupComponent.SetUsageType(fuzzyInput.ID, virtualInput, outputColumn.LineageID, DTSUsageType.UT_READONLY);
if (outputColumn.Name == "QuarantinedEmployeeName")
{
// column name is one of the columns we'll match with
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "JoinToReferenceColumn", "EmployeeName");
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "MinSimilarity", 0.6m);
// set to be fuzzy match (not exact match)
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "JoinType", 2);
}
}
fuzzyLookupComponent.SetComponentProperty("MatchIndexOptions", 1);
fuzzyLookupComponent.SetComponentProperty("MaxOutputMatchesPerInput", 100);
fuzzyLookupComponent.SetComponentProperty("ReferenceTableName", "Employee");
fuzzyLookupComponent.SetComponentProperty("WarmCaches", true);
fuzzyLookupComponent.SetComponentProperty("MinSimilarity", 0.6);
IDTSOutput90 fuzzyLookupOutput = fuzzyLookupMetadata.OutputCollection["Fuzzy Lookup Output"];
// add output columns that will simply pass through from the reference table (Employee)
IDTSOutputColumn90 outCol = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, "EmployeeId", "");
outCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_I4, 0, 0, 0, 0);
fuzzyLookupComponent.SetOutputColumnProperty(fuzzyLookupOutput.ID, outCol.ID, "CopyFromReferenceColumn", "EmployeeId");
// add output columns that will simply pass through from the oledb source (QuarantinedEmployeeId)
//IDTSOutput90 sourceOutputCollection = oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
//IDTSOutputColumnCollection90 sourceOutputCols = sourceOutputCollection.OutputColumnCollection;
//foreach (IDTSOutputColumn90 outputColumn in sourceOutputCols)
//{
// if (outputColumn.Name == "QuarantinedEmployeeId")
// {
// IDTSOutputColumn90 col = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, outputColumn.Name, "");
// col.SetDataTypeProperties(
// outputColumn.DataType, outputColumn.Length, outputColumn.Precision, outputColumn.Scale, outputColumn.CodePage);
// //fuzzyLookupComponent.SetOutputColumnProperty(
// // fuzzyLookupOutput.ID, col.ID, "SourceInputColumnLineageId", outputColumn.LineageID);
// }
//}
// add output columns that will simply pass through from the oledb source (QuarantinedEmployeeId)
//IDTSInput90 fuzzyInputCollection = fuzzyLookupMetadata.InputCollection["Fuzzy Lookup Input"];
//IDTSInputColumnCollection90 fuzzyInputCols = fuzzyInputCollection.InputColumnCollection;
//foreach (IDTSInputColumn90 inputColumn in fuzzyInputCols)
//{
// if (inputColumn.Name == "QuarantinedEmployeeId")
// {
// IDTSOutputColumn90 col = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, inputColumn.Name, "");
// col.SetDataTypeProperties(
// inputColumn.DataType, inputColumn.Length, inputColumn.Precision, inputColumn.Scale, inputColumn.CodePage);
// fuzzyLookupComponent.SetOutputColumnProperty(
// fuzzyLookupOutput.ID, col.ID, "SourceInputColumnLineageId", inputColumn.LineageID);
// }
//}
#endregion Fuzzy Lookup
#region Destination
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbDestinationMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Destination Adapter
oledbDestinationMetadata.ComponentClassID = "DTSAdapter.OLEDBDestination";
// Instantiate the OLE DB Destination adapter
IDTSDesigntimeComponent90 oledbDestinationComponent = oledbDestinationMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbDestinationComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerDestination = package.Connections.Add("OLEDB");
connectionManagerDestination.Name = "OLEDBDestination";
// Set the connection string
connectionManagerDestination.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the OLE DBDestination adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionDestination = oledbDestinationMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionDestination.ConnectionManagerID = connectionManagerDestination.ID;
// Tell the OLE DB Destination adapter to use the destination table
oledbDestinationComponent.SetComponentProperty("OpenRowset", "EmployeeMatch");
oledbDestinationComponent.SetComponentProperty("AccessMode", 0);
// Set up the connection manager object
runtimeConnectionDestination.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerDestination);
// Establish the database connection
oledbDestinationComponent.AcquireConnections(null);
// Set up the external metadata column
oledbDestinationComponent.ReinitializeMetaData();
// Release the database connection
oledbDestinationComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionDestination.ReleaseConnectionManager();
// Get the standard output of the fuzzy lookup component
IDTSOutput90 fuzzyLookupOutputCollection = fuzzyLookupMetadata.OutputCollection["Fuzzy Lookup Output"];
// Get the input of the OLE DB Destination adapter
IDTSInput90 oledbDestinationInput = oledbDestinationMetadata.InputCollection["OLE DB Destination Input"];
// Create a new path object
IDTSPath90 ssisPath = pipeline.PathCollection.New();
// Connect the source and destination adapters
ssisPath.AttachPathAndPropagateNotifications(fuzzyLookupOutputCollection, oledbDestinationInput);
// Get the output column collection for the OLE DB Source adapter
IDTSOutputColumnCollection90 fuzzyLookupOutputColumns = fuzzyLookupOutputCollection.OutputColumnCollection;
// Get the external metadata column collection for the OLE DB Destination adapter
IDTSExternalMetadataColumnCollection90 externalMetadataCols = oledbDestinationInput.ExternalMetadataColumnCollection;
// Get the virtual input for the OLE DB Destination adapter.
IDTSVirtualInput90 vInput = oledbDestinationInput.GetVirtualInput();
// Loop through our output columns
foreach (IDTSOutputColumn90 outputColumn in fuzzyLookupOutputColumns)
{
// Add a new input column
IDTSInputColumn90 inputColumn = oledbDestinationComponent.SetUsageType(oledbDestinationInput.ID,
vInput, outputColumn.LineageID, DTSUsageType.UT_READONLY);
// Get the external metadata column from the OLE DB Destination
// using the output column's name
IDTSExternalMetadataColumn90 externalMetadataColumn = externalMetadataCols[outputColumn.Name];
// Map the new input column to its corresponding external metadata column.
oledbDestinationComponent.MapInputColumn(oledbDestinationInput.ID, inputColumn.ID, externalMetadataColumn.ID);
}
#endregion Destination
// Save the package
Application application = new Application();
application.SaveToXml(@"c:TempTestFuzzyLookup.dtsx", package, null);
}
}
}
View 6 Replies
Nov 15, 2006
Hi All,
I have been stuck on this problem for a few days and need help with it. I am enclosing the problem description and the possible solutions that I have found.
Can anyone please help me out here?
Thanks and regards,
Virat
Problem Description:
I have a requirement for which I have created a data driven subscription in SQL Server 2005. The whole thing works like this: I have a report on Report Server which executes a stored procedure to get its parameters; then it calls another stored procedure to get data for the report; then it creates the report and copies it to a file share. This is done using a data driven subscription, and the time set for repeating this process is 5 minutes.
You can assume that the following are working fine:
1. I have deployed the report on the Report Manager (uploaded the report, created a data source, linked the report to the data source) - manually, the report works fine.
2. Created a data driven subscription.
3. The data driven subscription calls a stored procedure, say GetReportParameters, which returns all the parameters required for the report to execute.
4. The Report Manager executes the report by calling a stored procedure, say GetReportData, with the parameters provided by the GetReportParameters stored procedure; after it has been generated, the report file (PDF) is copied to a file share.
For each row that the GetReportParameters stored procedure returns, a report (PDF file) will be created and copied to the file share.
Now, my questions are:
1. How do I get a notification that this file was successfully created or that an error occurred?
2. The only message that Reporting Services shows on 'Report Manager > My Subscriptions' is something like "Done: 5 processed of 10 total; 2 errors." How do I find out which record was processed successfully and which ones resulted in an error?
Based on the above results (success or failure), I have to perform further operations.
Solutions or workarounds that I have found:
1. Create a Windows service which will monitor the file share folder and look for the file name (each record has a unique file name) for the reports that were picked up for PDF creation. If the file is not found, this service will report an error. Now, there's a glitch there; if a report takes a very long time to execute it will also be reported as an error (i.e. when this service checks for the PDF file, the report is still being generated). So I can't go with this solution.
2. I have also looked at the following tables in the ReportServer database:
a. Catalog - information regarding all the reports, folders, data source information, etc.
b. Subscriptions - all the subscription information.
c. ExecutionLog - information regarding execution of the subscriptions and also manual execution of reports.
d. Notifications - information regarding the errors that occurred during subscription execution.
For this solution, I was thinking of writing a Windows service which will monitor these tables and do further operations as required. This looks like the most feasible solution so far.
3. The third option is to look at delivery extensions, but in that case I will have to manually call SSRS APIs and will have to manage report invocation and subscription information. What is your opinion on this?
My environment details:
Windows XP SP2
SQL Server 2005
Reporting Services 2005
Please let me know if I am missing something somewhere...
View 9 Replies
Aug 14, 2007
I would like my transformation to automatically create an output column for each input column. Any tips? I can't seem to determine which event to listen to or method to override.
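A hedged sketch of one way to do this in a custom component: override OnInputPathAttached, which fires when an upstream path is connected, and mirror the virtual input columns as output columns there. This rebuilds the outputs wholesale each time; a production component would want smarter reconciliation.
public override void OnInputPathAttached(int inputID)
{
    base.OnInputPathAttached(inputID);
    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    // Start clean, then create one output column per upstream column.
    output.OutputColumnCollection.RemoveAll();
    foreach (IDTSVirtualInputColumn90 vCol in input.GetVirtualInput().VirtualInputColumnCollection)
    {
        IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
        outCol.Name = vCol.Name;
        outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage);
    }
}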
View 3 Replies
Feb 15, 2007
Hi,
I'm having trouble with a Script Component in a data flow task. I have code that does a SqlCommand.ExecuteReader() call that throws an 'Object reference not set to an instance of an object' error. Thing is, the SqlCommand.ExecuteReader() call is already inside a Try..Catch block. Essentially I have two questions regarding this error:
a) Why doesn't my Catch block catch the exception?
b) I've made sure that my SqlCommand object and the SqlConnection property that it uses are properly instantiated, and the query is correct. Any ideas on why it is throwing that exception?
Hope someone could help.
View 3 Replies
Sep 25, 2007
Good morning everybody,
I made a Java application to pre-process Portuguese texts (stopwords, stemming, BOW creation, etc.).
I want to turn this application into an Integration Services component. I understand I will have to code this new component from scratch, but I have no idea how to start.
I'm reading and testing several tutorials on Integration Services that came with the SQL Server install package, but none of them has clues on developing new components. These tutorials seem more focused on demonstrating the (awesome) capabilities of Integration Services.
Are there any tutorials on how to implement new components for Integration Services?
Any help will be very appreciated.
Thanks a lot,
-Renan Souza
View 3 Replies
Mar 13, 2006
Hi
I am currently trying to write a custom transform component in C# that will take a row of data and perform a look-up via an external system.
If there is a match, it should send the data from the external system down the matched output (which will have different columns to the input) and drop the data that was read; otherwise it should send the data down the unmatched output, which will be the same as the input.
So I would like to write a synchronous transform, because I don't need to read all the rows from the input buffer before I start processing, and I don't want millions of rows loaded in memory.
Can this be done? Also, does anyone have example code of how to do this? I can't see how to send data down the matched output buffer, as it will hold the lookup results, which have different columns to the input data, nor how to discard the input data.
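For what it's worth, a hedged sketch of the usual asynchronous pattern (a matched output whose columns differ from the input generally has to be asynchronous, but it can still be row-based rather than blocking): both outputs get SynchronousInputID = 0, PrimeOutput hands over their buffers, and ProcessInput copies each row to whichever output applies. The column indexes, the LookupExternalSystem helper, and the output ordering are all illustrative assumptions.
private PipelineBuffer matchBuffer;
private PipelineBuffer noMatchBuffer;

public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
    // Assumes output 0 is "Matched" and output 1 is "Unmatched"; match them by ID in real code.
    matchBuffer = buffers[0];
    noMatchBuffer = buffers[1];
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    while (buffer.NextRow())
    {
        string key = buffer.GetString(0);            // index would come from FindColumnByLineageID in PreExecute
        string external = LookupExternalSystem(key); // hypothetical helper for the external call
        if (external != null)
        {
            matchBuffer.AddRow();
            matchBuffer.SetString(0, external);      // columns defined on the matched output
        }
        else
        {
            noMatchBuffer.AddRow();
            noMatchBuffer.SetString(0, key);         // unmatched output mirrors the input columns
        }
    }
    if (buffer.EndOfRowset)
    {
        matchBuffer.SetEndOfRowset();
        noMatchBuffer.SetEndOfRowset();
    }
}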
Thanks Steve
View 7 Replies