Programmatically Creating Source Script Component
Sep 7, 2007
Does anyone know how to create a source Script Component programmatically? I can only seem to create a transformation Script Component. I have this:
PipeLineWrapper.IDTSComponentMetaData90 sourceComponent =
((dataflowTask as TaskHost).InnerObject as PipeLineWrapper.MainPipe).ComponentMetaDataCollection.New();
sourceComponent.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
Is there anything extra I have to do to make it a source Script Component, seeing how it defaults to a transformation?
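I have not verified this, but since the Script Component's layout seems to be driven by its inputs and outputs (a source has no input), one approach is to drop the default input after ProvideComponentProperties and define the output columns by hand. A sketch continuing from the code above:
Code Snippet
// Unverified sketch: remove the default input so only the output remains,
// which is the layout of a source-style Script Component.
PipeLineWrapper.CManagedComponentWrapper wrapper = sourceComponent.Instantiate();
wrapper.ProvideComponentProperties();
sourceComponent.Name = "My Script Source";
sourceComponent.InputCollection.RemoveObjectByID(sourceComponent.InputCollection[0].ID);
// Define the output columns the script will populate.
PipeLineWrapper.IDTSOutputColumn90 column =
    sourceComponent.OutputCollection[0].OutputColumnCollection.New();
column.Name = "MyColumn";
column.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 50, 0, 0, 0);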
View 1 Replies
Nov 20, 2006
Does anyone have any examples of programmatically creating a Transformation Script Component (or Source/Destination) in the dataflow? I have been able to create other Transforms for the dataflow like Derived Column, Sort, etc. but for some reason the Script Component doesn't seem to work the same way.
I have done it as below, trying many ways to get the ComponentClassID, including the AssemblyQualifiedName and the GUID. No matter what I do, when it hits ProvideComponentProperties I get Exception from HRESULT: 0xC0048021
IDTSComponentMetaData90 scriptPropType = dataFlow.ComponentMetaDataCollection.New();
scriptPropType.Name = "Transform Property Type";
scriptPropType.ComponentClassID = "DTSTransform.ScriptComponent";
// have also tried scriptPropType.ComponentClassID = typeof(Microsoft.SqlServer.Dts.Pipeline.ScriptComponent).AssemblyQualifiedName;
scriptPropType.Description = "Transform Property Type";
CManagedComponentWrapper instance2 = scriptPropType.Instantiate();
instance2.ProvideComponentProperties();
Any help or examples would be greatly appreciated! Thanks!
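For what it's worth, one thing to try (a sketch, not verified against this exact error) is to let the runtime resolve the Script Component's CreationName through an Application object instead of hard-coding the class ID, the same way the post above does:
Code Snippet
// Sketch: resolve the Script Component's CreationName via the Application object
// rather than hard-coding "DTSTransform.ScriptComponent".
Microsoft.SqlServer.Dts.Runtime.Application app = new Microsoft.SqlServer.Dts.Runtime.Application();
IDTSComponentMetaData90 scriptPropType = dataFlow.ComponentMetaDataCollection.New();
scriptPropType.Name = "Transform Property Type";
scriptPropType.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
CManagedComponentWrapper instance2 = scriptPropType.Instantiate();
instance2.ProvideComponentProperties();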
View 24 Replies
View Related
Feb 26, 2007
Hi There,
I am loading/executing packages from C# and I need to populate a temp table from user input and pass this table as a variable to the DataReader source component's SQL command. I am using an expression to build this query, but I am getting a design-time error with this command:
"select id, (SysDate + 28) as ExpiresDate from Table1 where id in (Select Id from" +@[User::Table2]+")"..
I have declared Table2 as a variable of type Object, I am creating Table2 in C#, and I am assigning that table to the user variable. But in design mode I am getting an error: the expression cannot be evaluated.
Can anybody please tell me why I cannot do this?
Thanks,
View 3 Replies
View Related
Jan 15, 2008
I would like to write a custom data source component for SSIS. I was wondering if there are any tutorials / examples for this that are available.
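For reference, a minimal sketch of what such a component can look like (assuming the SSIS 2005 interfaces; a real component also needs validation, error handling and a strong-named assembly deployed to the GAC and the PipelineComponents folder):
Code Snippet
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
// Minimal custom source sketch: one output, one string column, one row.
[DtsPipelineComponent(DisplayName = "My Custom Source", ComponentType = ComponentType.SourceAdapter)]
public class MyCustomSource : PipelineComponent
{
    private int columnIndex;
    public override void ProvideComponentProperties()
    {
        base.RemoveAllInputsOutputsAndCustomProperties();
        IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
        output.Name = "Source Output";
        IDTSOutputColumn90 column = output.OutputColumnCollection.New();
        column.Name = "Value";
        column.SetDataTypeProperties(DataType.DT_WSTR, 50, 0, 0, 0);
    }
    public override void PreExecute()
    {
        // Map the output column to its position in the pipeline buffer.
        IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
        columnIndex = BufferManager.FindColumnByLineageID(output.Buffer, output.OutputColumnCollection[0].LineageID);
    }
    public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
    {
        PipelineBuffer buffer = buffers[0];
        buffer.AddRow();
        buffer.SetString(columnIndex, "Hello from the custom source");
        buffer.SetEndOfRowset();
    }
}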
Thanks
View 1 Replies
View Related
Oct 26, 2007
Hello,
I have a package that has a data flow task. This task imports data from a DB2 database (using the IBM OLE DB provider for DB2) and adds it to a SQL Server database table. This package was created on the server. Then, through version control (using TFS source control), I checked out the package on my local machine, and when I open the package I get the following 3 errors.
Error 1 Validation error. Import Account Num from BMGP_BDR: DTS.Pipeline: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
Error 2 Error loading BMAG Download Xref Tables - bmag.dtsx: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of component "DataReader Source" (1113) is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]] at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper90 wrapper, Int32 lPipelineVersion)
Error 3 Error loading BMAG Download Xref Tables - bmag.dtsx: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
Please advise.
Thank you.
View 7 Replies
View Related
Jan 23, 2007
Hi,
I have a package which reads an Access file from a folder. My connection manager for this file uses the .NET Providers for OleDb\Microsoft Jet 4.0 OLE DB Provider.
Package works from my computer. But when I execute it on the server as a SQL Agent job, I get
The component metadata for "component "DataReader Source" (1) could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I copied the mdb file to a folder on the server which my packages have no problem reading data from.
My packages run under the same domain account as defined in proxies.
I'd appreciate any help.
Gulden
View 4 Replies
View Related
Jan 9, 2007
Hi there,
I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component by which I may change the source-destination column mapping programmatically. So I thought I need to add a data conversion component into the data flow task. After adding this component (I found its component id to be {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I have created a path which establishes the mapping between the output columns of the source component and the input columns of the data conversion component. Now I am not sure how to establish the mapping between the data conversion component's input column collection and output column collection.
I am giving my code snippet here,
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
... // Code for configuring the source data flow component
IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();// creating data conversion
conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";// This is the GUID for data conversion component
CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();//Instantiate
conversionInstance.ProvideComponentProperties();
// Now creating a path to connect the source and conversion
IDTSPath90 fPath = dataFlowTask.PathCollection.New(); fPath.AttachPathAndPropagateNotifications(
sourceDataFlowComponent.OutputCollection[0],
conversionDataFlowComponent.InputCollection[0]);
// Should I acquire connections for the data conversion? I'm not sure
conversionInstance.AcquireConnections(null);
conversionInstance.ReinitializeMetaData();
// Get the input collection
IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection){
conversionInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
// Well, here I am stuck. What do I need to do here to establish a map
// between conversionDataFlowComponent.InputCollection[0] and
// conversionDataFlowComponent.OutputCollection[0]?
As you can see, I am just one step away from creating the mapping between the input and output collections. Can anybody give me an idea how I can achieve this?
I will appreciate all kind of suggestions and comments.
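If I remember correctly, the Data Conversion transform ties each output column back to an input column through an output-column custom property named SourceInputColumnLineageID. A sketch along those lines (property name and details from memory, not verified), continuing from the code above:
Code Snippet
// Sketch: create one converted output column per selected input column and point
// it back at the input column via SourceInputColumnLineageID.
IDTSOutput90 conversionOutput = conversionDataFlowComponent.OutputCollection[0];
foreach (IDTSInputColumn90 inputColumn in input.InputColumnCollection)
{
    IDTSOutputColumn90 outputColumn = conversionInstance.InsertOutputColumnAt(
        conversionOutput.ID, conversionOutput.OutputColumnCollection.Count,
        "Converted_" + inputColumn.Name, string.Empty);
    // Target type of the conversion, e.g. Unicode string of length 50.
    conversionInstance.SetOutputColumnDataTypeProperties(
        conversionOutput.ID, outputColumn.ID,
        Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 50, 0, 0, 0);
    conversionInstance.SetOutputColumnProperty(
        conversionOutput.ID, outputColumn.ID,
        "SourceInputColumnLineageID", inputColumn.LineageID);
}
The downstream component (for example an OLE DB destination) is then mapped to these new output columns rather than to the original source columns.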
Regards
Moim
View 11 Replies
View Related
Feb 13, 2006
dear experts,
I'm trying to build a package programmatically from a client C# application. I'm working to create three dataflow components:
- OleDB Source
- Derived Column Transformations
- OleDb Destination.
My package works fine, but I have several problems when inserting an expression as the Value of an IDTSCustomProperty90 object, like this one:
LEN(#firsname.lineageID) > 5 ? <<if true>> : <<if false>>
Simple expressions like string concatenation, for example "#firstname.lineageID" + "#lastname.lineageID", seem to work fine.
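One pattern that may help (a sketch, not verified): create the output column through the design-time instance so the component attaches its custom properties, then set both FriendlyExpression (column-name form) and Expression (#lineageID form). derivedColumnMetadata, derivedColumnInstance and firstNameLineageId are placeholders for objects created earlier in the package-building code.
Code Snippet
IDTSOutput90 dcOutput = derivedColumnMetadata.OutputCollection[0];
IDTSOutputColumn90 dcColumn = derivedColumnInstance.InsertOutputColumnAt(dcOutput.ID, 0, "FirstNameLongEnough", string.Empty);
dcColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_BOOL, 0, 0, 0, 0);
dcColumn.ErrorRowDisposition = DTSRowDisposition.RD_FailComponent;
dcColumn.TruncationRowDisposition = DTSRowDisposition.RD_FailComponent;
// "FriendlyExpression" uses column names, "Expression" uses #lineageID references.
derivedColumnInstance.SetOutputColumnProperty(dcOutput.ID, dcColumn.ID, "FriendlyExpression", "LEN(FirstName) > 5 ? TRUE : FALSE");
derivedColumnInstance.SetOutputColumnProperty(dcOutput.ID, dcColumn.ID, "Expression", "LEN(#" + firstNameLineageId + ") > 5 ? TRUE : FALSE");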
Thanks a lot in advance.
Paganelli Francesco
View 19 Replies
View Related
Feb 2, 2007
Dear all,
I am developing tools for automatic creation of data warehouse tables, cubes and SSIS packages. Generating the SSIS Data Flows works very well using the SSIS components for OLE DB Source, Derived Column, Lookup and OLE DB Destination.
However for some of the advanced functionality I need to use Script Component. I have managed to add it in the Data Flow with all inputs and outputs, but how do I populate it with my code? I've seen there is a component property called "SourceCode" and one called "BinaryCode". The "SourceCode" contains the code, but also some extra metadata.
Questions:
Do you know if there is any programmatic support to generate the Source Code property with the metadata necessary?
Do you know how to compile the Source Code and generate the property BinaryCode?
Example from my code below:
// Create script component
IDTSComponentMetaData90 script = dataFlowTask.ComponentMetaDataCollection.New();
script.ComponentClassID = app.PipelineComponentInfos["Script Component"].CreationName;
CManagedComponentWrapper scriptWrapper = script.Instantiate();
script.InputCollection.New();
script.OutputCollection.New();
scriptWrapper.ProvideComponentProperties();
script.Name = "Logics";
// Create path
IDTSPath90 scriptPath = dataFlowTask.PathCollection.New();
scriptPath.AttachPathAndPropagateNotifications(lastComponent.OutputCollection[0], script.InputCollection[0]);
// Populate input and output columns
IDTSInput90 scriptInput = script.InputCollection[0];
IDTSVirtualInput90 scriptVInput = scriptInput.GetVirtualInput();
foreach (IDTSOutputColumn90 col in oledbSrc.OutputCollection[0].OutputColumnCollection)
{
scriptWrapper.SetUsageType(scriptInput.ID, scriptVInput, col.LineageID, DTSUsageType.UT_READONLY);
IDTSOutputColumn90 tmp = script.OutputCollection[0].OutputColumnCollection.New();
tmp.Name = col.Name;
tmp.SetDataTypeProperties(col.DataType, col.Length, col.Precision, col.Scale, col.CodePage);
}
// Make script asynchronous
script.OutputCollection[0].SynchronousInputID = 0;
Thanks for any assistance and Best Regards,
Johan Åhlén,
Business Intelligence consultant at IFS
View 2 Replies
View Related
Mar 7, 2007
Hi,
I am trying to apply the sample provided by Microsoft in the following article:
http://msdn2.microsoft.com/en-us/library/ms403355.aspx
I am trying to call an SSIS package from a web service hosted on the same machine where the package file sits. The package runs fine from the Agent and also from the "Integration Services Project" in VS.NET.
I had a lot of problems with permissions but they are resolved; at least I have no error messages pointing in that direction. Now I am getting these results:
1. Error: -1073659874 / Description: The file name "\Diver-svrInputDataFilesdn_cust.txt" specified in the connection was not valid.
2. Error: -1073659875 / Description: Connection "bdn_cust" failed validation.
3. Error: -1073659874 / Description: The file name "\Diver-svrInputDataFilesdn_cust.txt" specified in the connection was not valid.
4. Error: -1073659875 / Description: Connection "SourceConnectionFlatFile" failed validation.
Where \DiverMInputFilesdn_cust.txt" is a file processed by the package.
Is there anybody who can give me some direction? Thank you in advance.
View 4 Replies
View Related
Sep 8, 2006
Hi, I am trying without luck to load a package which contains a ScriptTask and read the source code of that task.
I can load the package and get the ScriptTask no problem.
However, I am not sure how to get the source code.
I know i have to use the ScriptTaskCodeProvider and i assume the GetSourceCode() method.
This is what i have so far
ScriptTask scriptTask = taskHost.InnerObject as ScriptTask;
ScriptTaskCodeProvider codeProvider = new ScriptTaskCodeProvider();
codeProvider.LoadFromTask(scriptTask);
string sourceCode = codeProvider.GetSourceCode(scriptTask.VsaProjectName);
Any assistance greatly appreciated.
Cheers
Richard.
View 4 Replies
View Related
Aug 30, 2006
I've seen several posts asking about this, but nothing I've read helped me. Some use SQLDMO, others suggest SQLSMO, and others only explain how to connect to a server and then run "CREATE DATABASE". I want to do this from .NET. I managed to connect to SQL Server 2005 and execute "CREATE DATABASE", but how do I do this with SQL Express? Trying SqlConnection.Open against a non-existing database does not work; it says the file does not exist. Or do I just have the wrong connection string? Can someone post an example connection string that works when the database does not exist yet? Some hints I've read make me consider using SQLSMO, but I don't have it on my computer. Where do I get it from? Any links would be nice.
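One approach is to open the connection against master (which always exists) and create the new database from there; the connection string only points at the new database after it has been created. A sketch, assuming the default Express instance name .\SQLEXPRESS:
Code Snippet
using System.Data.SqlClient;
static void CreateDatabase()
{
    // Connect to master; the target database does not exist yet.
    string connStr = @"Data Source=.\SQLEXPRESS;Initial Catalog=master;Integrated Security=SSPI;";
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand("CREATE DATABASE MyNewDatabase", conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}
SMO (SQL Server Management Objects) ships with the SQL Server 2005 client components and the Feature Pack (Microsoft.SqlServer.Smo) and can do the same thing through a Database object.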
View 6 Replies
View Related
Apr 17, 2007
Hello. I'm trying to create a SqlDataSource control programmatically. I need to do this because I want to do some stuff in my MasterPage's Page_Init event.
Here's my code (Master.master.vb):
Protected Sub Page_Init(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Init
lblUser.Text = Page.User.Identity.Name
Dim PUser As New ControlParameter
PUser.ControlID = "lblUser"
PUser.Name = "LoginName"
PUser.PropertyName = "Text"
PUser.Type = TypeCode.String
PUser.DefaultValue = Page.User.Identity.Name
Dim SQLDS_Login As New SqlDataSource
SQLDS_Login.ID = "SQLDS_Login"
SQLDS_Login.ConnectionString = "I put conection string here. How do I use the one on my web.config?"
SQLDS_Login.SelectCommand = "SELECT [LoginID], [LoginName], [Role], [Status] FROM [myLogin] WHERE ([LoginName] = @LoginName)"
SQLDS_Login.SelectParameters.Add(PUser)
SQLDS_Login.SelectCommandType = SqlDataSourceCommandType.Text
GridView1.DataSource = SQLDS_Login
GridView1.DataBind()
End Sub
When I run it, I get this error message:
The SqlDataSource control 'SQLDS_Login' does not have a naming container. Ensure that the control is added to the page before calling DataBind.
I never had any problems with Inserts, Updates and Deletes, but I have never made it work for a Select when doing it programmatically.
Can you help me with this?
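Two things may help here (a sketch in C#; the same members exist in VB.NET). First, read the web.config connection string through ConfigurationManager instead of hard-coding it. Second, the error suggests the SqlDataSource must be placed in the control tree (for example in a PlaceHolder) so it has a naming container before DataBind is called. phDataSources and "MyConnectionString" are placeholder names.
Code Snippet
SqlDataSource sqldsLogin = new SqlDataSource();
sqldsLogin.ID = "SQLDS_Login";
sqldsLogin.ConnectionString = System.Configuration.ConfigurationManager.ConnectionStrings["MyConnectionString"].ConnectionString;
sqldsLogin.SelectCommand = "SELECT [LoginID], [LoginName], [Role], [Status] FROM [myLogin] WHERE ([LoginName] = @LoginName)";
sqldsLogin.SelectCommandType = SqlDataSourceCommandType.Text;
phDataSources.Controls.Add(sqldsLogin);  // gives the control a naming container
GridView1.DataSource = sqldsLogin;
GridView1.DataBind();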
View 1 Replies
View Related
Jan 13, 2006
Hi there,
I have a user requesting a weekly report exported in CSV (comma-delimited) format. The process will run weekly as a scheduled job, and he wants the file saved to a certain directory on the network. Two-part question...
1. Is there a way to create a .csv file programmatically after running the query?
2. How would I save the .csv file to a specified directory on the network?
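One lightweight option for a scheduled job is bcp with the queryout argument (something like bcp "SELECT ..." queryout \\FileServer\Reports\weekly.csv -c -t, -T -S MyServer). Alternatively, a small .NET program along these lines can run the query and write the file; the server, query and share path below are placeholders:
Code Snippet
using System;
using System.Data.SqlClient;
using System.IO;
static void ExportCsv()
{
    string connStr = "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;";
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM WeeklyReport", conn))
    using (StreamWriter writer = new StreamWriter(@"\\FileServer\Reports\weekly.csv"))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Note: values containing commas or quotes would need quoting.
                string[] fields = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = Convert.ToString(reader[i]);
                writer.WriteLine(string.Join(",", fields));
            }
        }
    }
}
The account the scheduled job runs under needs write permission on the network share.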
TIA
View 3 Replies
View Related
Feb 15, 2007
Hi!
I am trying to write a script in VB.NET that will run a report that already exists in the system and export the results to my local machine. We are using MS Reporting Services to manage and manipulate the reports. So here's my question:
Is it possible to programmatically create a report in VB.NET based on an existing report? I noticed that Crystal Reports has a nice export method, but I have not been able to find anything similar for my situation. Basically I believe I would need some sort of Reporting Services object in .NET that would allow me to run the report and export the results. Does anyone know of such a structure, or if this is even possible? Thanks!!
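The Reporting Services web service can do this. A sketch, assuming a web reference to ReportExecution2005.asmx that generates a ReportExecutionService proxy (shown in C#; the same web methods are called from VB.NET), with placeholder report path and output file:
Code Snippet
ReportExecutionService rs = new ReportExecutionService();
rs.Url = "http://myserver/ReportServer/ReportExecution2005.asmx";
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
rs.ExecutionHeaderValue = new ExecutionHeader();
string extension, mimeType, encoding;
Warning[] warnings;
string[] streamIds;
rs.LoadReport("/MyFolder/MyReport", null);
byte[] result = rs.Render("PDF", null, out extension, out mimeType, out encoding, out warnings, out streamIds);
System.IO.File.WriteAllBytes(@"C:\Exports\MyReport.pdf", result);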
Best regards,
Josh
View 1 Replies
View Related
Jan 4, 2007
Hi guys,
I intended to write a program that creates an SSIS package which imports data from a CSV file into SQL Server 2005, but I did not find any good example of this on the internet. I found some examples which export data from SQL Server 2005 to CSV files, and following those I tried to write my own, but I am facing some problems. What I am doing is creating two connection manager objects, one for the flat file and another for OLE DB, and creating a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() on the flat file source component, no output columns are found. Why is it not fetching the output columns from the CSV file? And after that, when ReinitializeMetaData() is invoked on the destination component, it simply throws an exception.
Can anybody help me get around this problem? Or can anyone give me a link to a useful article on accomplishing this goal?
I am giving my code here too.
I will appreciate any kind of suggestion on this.
Code snippet:
public void CreatePackage()
{
string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
Package pkg = new Package();
pkg.PackageType = DTSPackageType.DTSDesigner90;
ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
ConnectionManager flatfileConnectionManager =
CreateFileConnection(pkg);
// creating the SQL Task for table creation
Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
execSqlTask.Connection = oledbConnectionManager.Name;
execSqlTask.SqlStatementSource =
"CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50)) GO";
// creating the Data flow task
Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;
// Put a precedence constraint between the tasks.
PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
pcTasks.Value = DTSExecResult.Success;
pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;
// Now adding the data flow components
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.Name = "Source Data from Flat file";
// Here is the component class id for flat file source data
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
managedInstance.ProvideComponentProperties();
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
managedInstance.AcquireConnections(null);
managedInstance.ReinitializeMetaData();
managedInstance.ReleaseConnections();
// Get the destination's default input and virtual input.
IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0];
// Here I dont find any columns at all..why??
// Now adding the data flow components
IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
destinationDataFlowComponent.Name =
"Destination Oledb compoenent";
// Here is the component class id for the OLE DB destination
destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
managedOleInstance.ProvideComponentProperties();
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
// Set the custom properties.
managedOleInstance.SetComponentProperty("AccessMode", 2);
managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");
managedOleInstance.AcquireConnections(null);
managedOleInstance.ReinitializeMetaData(); // Throws exception
managedOleInstance.ReleaseConnections();
// Create the path.
IDTSPath90 path = dataFlowTask.PathCollection.New(); path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0],
destinationDataFlowComponent.InputCollection[0]);
// Get the destination's default input and virtual input.
IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
// Iterate through the virtual input column collection.
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
managedOleInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
DTSExecResult res = pkg.Execute();
}
public ConnectionManager CreateOLEDBConnection(Package p)
{
ConnectionManager ConMgr;
ConMgr = p.Connections.Add("OLEDB");
ConMgr.ConnectionString =
"Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
ConMgr.Name = "SSIS Connection Manager for Oledb";
ConMgr.Description = "OLE DB connection to the Test database.";
return ConMgr;
}
public ConnectionManager CreateFileConnection(Package p)
{
ConnectionManager connMgr;
connMgr = p.Connections.Add("FLATFILE");
connMgr.ConnectionString = @"D:\MyCSVFile.csv";
connMgr.Name = "SSIS Connection Manager for Files";
connMgr.Description = "Flat File connection";
connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
return connMgr;
}
And my CSV files is as follows
NAME, AGE, GENDER
Jon,52,MALE
Linda, 26, FEMALE
That's all. Thanks.
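One likely cause of the missing output columns (not verified against this exact code): the flat file connection manager above never defines any columns, so the flat file source has nothing to build its output from. Defining the columns on the connection manager before AcquireConnections/ReinitializeMetaData is called on the source, roughly like this, may help; the interfaces come from the Runtime.Wrapper namespace.
Code Snippet
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;
// Sketch: describe the three CSV columns on the flat file connection manager so
// the flat file source can generate output columns from them.
RuntimeWrapper.IDTSConnectionManagerFlatFile90 flatFile =
    flatfileConnectionManager.InnerObject as RuntimeWrapper.IDTSConnectionManagerFlatFile90;
flatFile.ColumnNamesInFirstDataRow = true;
string[] columnNames = new string[] { "NAME", "AGE", "GENDER" };
for (int i = 0; i < columnNames.Length; i++)
{
    RuntimeWrapper.IDTSConnectionManagerFlatFileColumn90 column = flatFile.Columns.Add();
    column.ColumnType = "Delimited";
    // The last column is delimited by the row delimiter, the others by a comma.
    column.ColumnDelimiter = (i == columnNames.Length - 1) ? Environment.NewLine : ",";
    column.DataType = RuntimeWrapper.DataType.DT_WSTR;
    column.MaximumWidth = 50;
    (column as RuntimeWrapper.IDTSName90).Name = columnNames[i];
}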
View 4 Replies
View Related
Feb 1, 2006
Hello,
I've been working on an application that uploads an RDL to Reporting Services (through the SOAP webservice method CreateReport) programmatically. I'm having difficulty setting up the data source properties for my uploaded report. In particular the Data Source Credentials property.
The datasource for my report doesn't require credentials. By default after I upload the report to Reporting Services, the Data Source Credentials property is set to "Credentials supplied by the user running the report". How do I go about setting the Data Source Credentials property to "Credentials are not required" programmatically through the webservice?
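A sketch of one way to do it, assuming a web reference to ReportService2005.asmx generating a ReportingService2005 proxy and a placeholder report path; setting CredentialRetrieval to None corresponds to "Credentials are not required" in Report Manager:
Code Snippet
ReportingService2005 rs = new ReportingService2005();
rs.Url = "http://myserver/ReportServer/ReportService2005.asmx";
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
string reportPath = "/MyFolder/MyReport";
DataSource[] dataSources = rs.GetItemDataSources(reportPath);
foreach (DataSource ds in dataSources)
{
    // Only report-embedded data source definitions carry the credential setting.
    DataSourceDefinition definition = ds.Item as DataSourceDefinition;
    if (definition != null)
    {
        definition.CredentialRetrieval = CredentialRetrievalEnum.None;
    }
}
rs.SetItemDataSources(reportPath, dataSources);
For a shared data source, the equivalent calls are GetDataSourceContents and SetDataSourceContents on the same proxy.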
Thanks in advance
View 6 Replies
View Related
Jan 6, 2005
I am creating a page that builds a report from a SQL statement generated dynamically from user input.
Everything is good except for the WHERE section, which is created from values in a list box.
For Example:
lstCriteria.items(1).value = "COMPANY = 'foo'"
lstCriteria.items(2).value = "DAY= 2"
I build my SQL statement with these values like so:
SELECT * FROM POO WHERE COMPANY = 'foo' AND DAY = 2
The problem I am having is when there are multiple values of the same type in the list box. Say:
lstCriteria.items(1).value = "COMPANY = 'foo'"
lstCriteria.items(2).value = "DAY= 2"
lstCriteria.items(1).value = "COMPANY = 'moo'"
My employer wants this to be valid, but I am having a tough time coming up with a solution.
I know that my SQL statement needs to now read:
SELECT * FROM POO WHERE COMPANY = 'foo' AND DAY = 2 OR COMPANY = 'poo' AND DAY = 2
I have code set up to read the value of each list box item up to the "=", and I know that I need to compare this value with the others in the list box... but I am not coming up with any good solutions.
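One approach (a C# sketch of the idea; the VB.NET version is analogous) is to group the criteria by the column name in front of the "=", OR the values within a group, and AND the groups together, keeping parentheses so the precedence stays correct. Note that concatenating user input into SQL invites injection, so parameterizing the values is worth considering.
Code Snippet
using System.Collections.Generic;
using System.Web.UI.WebControls;
Dictionary<string, List<string>> groups = new Dictionary<string, List<string>>();
foreach (ListItem item in lstCriteria.Items)          // e.g. "COMPANY = 'foo'"
{
    string column = item.Value.Split('=')[0].Trim();  // text up to the "="
    if (!groups.ContainsKey(column))
        groups[column] = new List<string>();
    groups[column].Add(item.Value);
}
List<string> clauses = new List<string>();
foreach (KeyValuePair<string, List<string>> g in groups)
    clauses.Add("(" + string.Join(" OR ", g.Value.ToArray()) + ")");
string where = string.Join(" AND ", clauses.ToArray());
string sql = "SELECT * FROM POO" + (where.Length > 0 ? " WHERE " + where : "");
For the example above this produces SELECT * FROM POO WHERE (COMPANY = 'foo' OR COMPANY = 'moo') AND (DAY= 2).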
Any HELP?
View 2 Replies
View Related
Jan 3, 2008
Hi all
This is my code, which I found on Microsoft's site.
If I run it with a SQL Server connection it works, but if I try to use it with SQL Express it gives me this error:
CREATE FILE encountered operating system error 5(access denied) while attempting to open or create the physical file 'c://mydatabase.mdf'
It seems like a permission error but it isn't. Do I have to set something in SQL Express that is already set in SQL Server?
static void WriteDB()
{
String str;
//sql server connection
//SqlConnection myConn = new SqlConnection("Server=localhost;Integrated security=SSPI;database=master");
//sql express connection (only one myConn can be declared at a time; the default Express instance name is usually ".\SQLEXPRESS")
SqlConnection myConn = new SqlConnection("Server=localhost;Integrated security=SSPI;database=master");
str = "CREATE DATABASE MyDatabase ON PRIMARY " +
"(NAME = MyDatabase_Data, " +
"FILENAME = 'C:\\MyDatabaseData.mdf', " +
"SIZE = 2MB, MAXSIZE = 10MB, FILEGROWTH = 10%) " +
"LOG ON (NAME = MyDatabase_Log, " +
"FILENAME = 'C:\\MyDatabaseLog.ldf', " +
"SIZE = 1MB, " +
"MAXSIZE = 5MB, " +
"FILEGROWTH = 10%)";
SqlCommand myCommand = new SqlCommand(str, myConn);
try
{
myConn.Open();
myCommand.ExecuteNonQuery();
MessageBox.Show("DataBase is Created Successfully", "MyProgram", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
catch (System.Exception ex)
{
MessageBox.Show(ex.ToString(), "MyProgram", MessageBoxButtons.OK, MessageBoxIcon.Information);
}
finally
{
if (myConn.State == ConnectionState.Open)
{
myConn.Close();
}
}
}
thanks
Marco
View 3 Replies
View Related
Nov 9, 2007
Below is C# code used to create a FuzzyLookup SSIS package programmatically. It does 95% of what I need it to. The only thing missing that I cannot figure out is how to take a Fuzzy Lookup Input column (OLE DB Output Column) and make it "pass through" the fuzzy lookup component to the OLE DB Destination. In the example below, that means I need the QuarantinedEmployeeId to make it into the destination.
Look in the "Test Dependencies" region below to get instructions and scripts used to set assembly references, create the sample tables used for this example, and insert test data.
Can anyone help me get past this last hurdle? You will see at the end of my Fuzzy Lookup region a bunch of commented out code that I've used to try to accomplish this last problem.
Code Block
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
namespace CreateSsisPackage
{
public class TestFuzzyLookup
{
public static void Test()
{
#region Test Dependencies
// Assembly references:
// Microsoft.SqlServer.DTSPipelineWrap
// Microsoft.SQLServer.DTSRuntimeWrap
// Microsoft.SQLServer.ManagedDTS
// First create a database called TestFuzzyLookup
// Next, create tables:
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[EmployeeMatch](
// [RecordId] [int] IDENTITY(1,1) NOT NULL,
// [EmployeeId] [int] NOT NULL,
// [QuarantinedEmployeeId] [int] NOT NULL,
// [_Similarity] [real] NOT NULL,
// [_Confidence] [real] NOT NULL,
// CONSTRAINT [PK_EmployeeMatch] PRIMARY KEY CLUSTERED
//(
// [RecordId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
//GO
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[QuarantinedEmployee](
// [QuarantinedEmployeeId] [int] IDENTITY(1,1) NOT NULL,
// [QuarantinedEmployeeName] [varchar](50) NOT NULL,
// CONSTRAINT [PK_QuarantinedEmployee] PRIMARY KEY CLUSTERED
//(
// [QuarantinedEmployeeId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
//GO
//SET ANSI_NULLS ON
//GO
//SET QUOTED_IDENTIFIER ON
//GO
//CREATE TABLE [dbo].[Employee](
// [EmployeeId] [int] IDENTITY(1,1) NOT NULL,
// [EmployeeName] [varchar](50) NOT NULL,
// CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED
//(
// [EmployeeId] ASC
//)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
//) ON [PRIMARY]
// Next, insert test data
//insert into employee values ('John Doe')
//insert into employee values ('Jane Smith')
//insert into employee values ('Ryan Johnson')
//insert into quarantinedemployee values ('John Dole')
#endregion Test Dependencies
#region Create Package
// Create a new package
Package package = new Package();
package.Name = "FuzzyLookupTest";
// Add a Data Flow task
TaskHost taskHost = package.Executables.Add("DTS.Pipeline") as TaskHost;
taskHost.Name = "Fuzzy Lookup";
IDTSPipeline90 pipeline = taskHost.InnerObject as MainPipe;
// Get the pipeline's component metadata collection
IDTSComponentMetaDataCollection90 componentMetadataCollection = pipeline.ComponentMetaDataCollection;
#endregion Create Package
#region Source
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbSourceMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Source Adapter
oledbSourceMetadata.ComponentClassID = "DTSAdapter.OLEDBSource";
// Instantiate the OLE DB Source adapter
IDTSDesigntimeComponent90 oledbSourceComponent = oledbSourceMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbSourceComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerSource = package.Connections.Add("OLEDB");
connectionManagerSource.Name = "OLEDBSource";
// Set the connection string
connectionManagerSource.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the OLE DB Source adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionSource = oledbSourceMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionSource.ConnectionManagerID = connectionManagerSource.ID;
// Tell the OLE DB Source adapter to use the source table
oledbSourceComponent.SetComponentProperty("OpenRowset", "QuarantinedEmployee");
oledbSourceComponent.SetComponentProperty("AccessMode", 0);
// Set up the connection manager object
runtimeConnectionSource.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerSource);
// Establish the database connection
oledbSourceComponent.AcquireConnections(null);
// Set up the column metadata
oledbSourceComponent.ReinitializeMetaData();
// Release the database connection
oledbSourceComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionSource.ReleaseConnectionManager();
#endregion Source
#region Fuzzy Lookup
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 fuzzyLookupMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the Fuzzy Lookup object
fuzzyLookupMetadata.ComponentClassID = "DTSTransform.BestMatch.1";
// Instantiate
IDTSDesigntimeComponent90 fuzzyLookupComponent = fuzzyLookupMetadata.Instantiate();
// Ask the component to set up its component metadata object
fuzzyLookupComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerFuzzy = package.Connections.Add("OLEDB");
connectionManagerFuzzy.Name = "OLEDBFuzzy";
// Set the connection string
connectionManagerFuzzy.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the fuzzy lookup component's runtime connection
IDTSRuntimeConnection90 runtimeConnectionFuzzy = fuzzyLookupMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionFuzzy.ConnectionManagerID = connectionManagerFuzzy.ID;
// Set up the connection manager object
runtimeConnectionFuzzy.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerFuzzy);
// Establish the database connection
fuzzyLookupComponent.AcquireConnections(null);
// Set up the external metadata column
fuzzyLookupComponent.ReinitializeMetaData();
// Release the database connection
fuzzyLookupComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionFuzzy.ReleaseConnectionManager();
// Get the standard output of the OLE DB Source adapter
IDTSOutput90 oledbSourceOutput = oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
// Get the input of the Fuzzy Lookup component
IDTSInput90 fuzzyInput = fuzzyLookupMetadata.InputCollection["Fuzzy Lookup Input"];
// Create a new path object
IDTSPath90 path = pipeline.PathCollection.New();
// Connect the source to Fuzzy Lookup
path.AttachPathAndPropagateNotifications(oledbSourceOutput, fuzzyInput);
// Get the output column collection for the OLE DB Source adapter
IDTSOutputColumnCollection90 oledbSourceOutputColumns = oledbSourceOutput.OutputColumnCollection;
// Get the external metadata column collection for the fuzzy lookup component
IDTSExternalMetadataColumnCollection90 externalMetadataColumns = fuzzyInput.ExternalMetadataColumnCollection;
// Get the virtual input for the fuzzy lookup component
IDTSVirtualInput90 virtualInput = fuzzyInput.GetVirtualInput();
// Loop through output columns and relate columns that will be fuzzy matched on
foreach (IDTSOutputColumn90 outputColumn in oledbSourceOutputColumns)
{
IDTSInputColumn90 col = fuzzyLookupComponent.SetUsageType(fuzzyInput.ID, virtualInput, outputColumn.LineageID, DTSUsageType.UT_READONLY);
if (outputColumn.Name == "QuarantinedEmployeeName")
{
// column name is one of the columns we'll match with
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "JoinToReferenceColumn", "EmployeeName");
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "MinSimilarity", 0.6m);
// set to be fuzzy match (not exact match)
fuzzyLookupComponent.SetInputColumnProperty(fuzzyInput.ID, col.ID, "JoinType", 2);
}
}
fuzzyLookupComponent.SetComponentProperty("MatchIndexOptions", 1);
fuzzyLookupComponent.SetComponentProperty("MaxOutputMatchesPerInput", 100);
fuzzyLookupComponent.SetComponentProperty("ReferenceTableName", "Employee");
fuzzyLookupComponent.SetComponentProperty("WarmCaches", true);
fuzzyLookupComponent.SetComponentProperty("MinSimilarity", 0.6);
IDTSOutput90 fuzzyLookupOutput = fuzzyLookupMetadata.OutputCollection["Fuzzy Lookup Output"];
// add output columns that will simply pass through from the reference table (Employee)
IDTSOutputColumn90 outCol = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, "EmployeeId", "");
outCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_I4, 0, 0, 0, 0);
fuzzyLookupComponent.SetOutputColumnProperty(fuzzyLookupOutput.ID, outCol.ID, "CopyFromReferenceColumn", "EmployeeId");
// add output columns that will simply pass through from the oledb source (QuarantinedEmployeeId)
//IDTSOutput90 sourceOutputCollection = oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
//IDTSOutputColumnCollection90 sourceOutputCols = sourceOutputCollection.OutputColumnCollection;
//foreach (IDTSOutputColumn90 outputColumn in sourceOutputCols)
//{
// if (outputColumn.Name == "QuarantinedEmployeeId")
// {
// IDTSOutputColumn90 col = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, outputColumn.Name, "");
// col.SetDataTypeProperties(
// outputColumn.DataType, outputColumn.Length, outputColumn.Precision, outputColumn.Scale, outputColumn.CodePage);
// //fuzzyLookupComponent.SetOutputColumnProperty(
// // fuzzyLookupOutput.ID, col.ID, "SourceInputColumnLineageId", outputColumn.LineageID);
// }
//}
// add output columns that will simply pass through from the oledb source (QuarantinedEmployeeId)
//IDTSInput90 fuzzyInputCollection = fuzzyLookupMetadata.InputCollection["Fuzzy Lookup Input"];
//IDTSInputColumnCollection90 fuzzyInputCols = fuzzyInputCollection.InputColumnCollection;
//foreach (IDTSInputColumn90 inputColumn in fuzzyInputCols)
//{
// if (inputColumn.Name == "QuarantinedEmployeeId")
// {
// IDTSOutputColumn90 col = fuzzyLookupComponent.InsertOutputColumnAt(fuzzyLookupOutput.ID, 0, inputColumn.Name, "");
// col.SetDataTypeProperties(
// inputColumn.DataType, inputColumn.Length, inputColumn.Precision, inputColumn.Scale, inputColumn.CodePage);
// fuzzyLookupComponent.SetOutputColumnProperty(
// fuzzyLookupOutput.ID, col.ID, "SourceInputColumnLineageId", inputColumn.LineageID);
// }
//}
#endregion Fuzzy Lookup
#region Destination
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbDestinationMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Destination Adapter
oledbDestinationMetadata.ComponentClassID = "DTSAdapter.OLEDBDestination";
// Instantiate the OLE DB Destination adapter
IDTSDesigntimeComponent90 oledbDestinationComponent = oledbDestinationMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbDestinationComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerDestination = package.Connections.Add("OLEDB");
connectionManagerDestination.Name = "OLEDBDestination";
// Set the connection string
connectionManagerDestination.ConnectionString = "Data Source=localhost;Initial Catalog=TestFuzzyLookup;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;";
// Set the connection manager as the OLE DBDestination adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionDestination = oledbDestinationMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionDestination.ConnectionManagerID = connectionManagerDestination.ID;
// Tell the OLE DB Destination adapter to use the destination table
oledbDestinationComponent.SetComponentProperty("OpenRowset", "EmployeeMatch");
oledbDestinationComponent.SetComponentProperty("AccessMode", 0);
// Set up the connection manager object
runtimeConnectionDestination.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerDestination);
// Establish the database connection
oledbDestinationComponent.AcquireConnections(null);
// Set up the external metadata column
oledbDestinationComponent.ReinitializeMetaData();
// Release the database connection
oledbDestinationComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionDestination.ReleaseConnectionManager();
// Get the standard output of the fuzzy lookup componenet
IDTSOutput90 fuzzyLookupOutputCollection = fuzzyLookupMetadata.OutputCollection["Fuzzy Lookup Output"];
// Get the input of the OLE DB Destination adapter
IDTSInput90 oledbDestinationInput = oledbDestinationMetadata.InputCollection["OLE DB Destination Input"];
// Create a new path object
IDTSPath90 ssisPath = pipeline.PathCollection.New();
// Connect the source and destination adapters
ssisPath.AttachPathAndPropagateNotifications(fuzzyLookupOutputCollection, oledbDestinationInput);
// Get the output column collection for the OLE DB Source adapter
IDTSOutputColumnCollection90 fuzzyLookupOutputColumns = fuzzyLookupOutputCollection.OutputColumnCollection;
// Get the external metadata column collection for the OLE DB Destination adapter
IDTSExternalMetadataColumnCollection90 externalMetadataCols = oledbDestinationInput.ExternalMetadataColumnCollection;
// Get the virtual input for the OLE DB Destination adapter.
IDTSVirtualInput90 vInput = oledbDestinationInput.GetVirtualInput();
// Loop through our output columns
foreach (IDTSOutputColumn90 outputColumn in fuzzyLookupOutputColumns)
{
// Add a new input column
IDTSInputColumn90 inputColumn = oledbDestinationComponent.SetUsageType(oledbDestinationInput.ID,
vInput, outputColumn.LineageID, DTSUsageType.UT_READONLY);
// Get the external metadata column from the OLE DB Destination
// using the output column's name
IDTSExternalMetadataColumn90 externalMetadataColumn = externalMetadataCols[outputColumn.Name];
// Map the new input column to its corresponding external metadata column.
oledbDestinationComponent.MapInputColumn(oledbDestinationInput.ID, inputColumn.ID, externalMetadataColumn.ID);
}
#endregion Destination
// Save the package
Application application = new Application();
application.SaveToXml(@"c:\Temp\TestFuzzyLookup.dtsx", package, null);
}
}
}
View 6 Replies
View Related
Jun 15, 2015
I am looking for a command (cmd/PowerShell/C#) for setting/updating the credentials of an SSRS data source.
View 2 Replies
View Related
Nov 15, 2006
Hi All,
I have been stuck on this problem for a few days and need help with it. I am enclosing the problem description and the possible solutions that I have found.
Can anyone please help me out here?
Thanks and regards,
Virat
Problem Description:
I have a requirement for which I have created a data driven subscription in SQL Server 2005. The whole thing works like this:
I have a report on Report Server which executes a stored procedure to get its parameters; then it calls another stored procedure to get data for the report; then it creates the report and copies it to a file share. This is done using a data driven subscription, and the time set for repeating this process is 5 minutes.
You can assume that the following are working fine:
1. I have deployed the report on the Report Manager (uploaded the report, created a data source, linked the report to the data source) - manually, the report works fine.
2. Created a data driven subscription.
3. The data driven subscription calls a stored procedure, say GetReportParameters, which returns all the parameters required for the report to execute.
4. The Report Manager executes the report by calling a stored procedure, say GetReportData, with the parameters provided by the GetReportParameters stored procedure; after it has generated the report, the file (PDF) is copied to a file share.
For each row that the GetReportParameters stored procedure returns, a report (PDF file) will be created and copied to the file share.
Now, my questions are:
1. How do I get a notification that this file was successfully created or that an error occurred?
2. The only message that Reporting Services shows on 'Report Manager > My Subscriptions' is something like "Done: 5 processed of 10 total; 2 errors." How do I find out which records were processed successfully and which ones resulted in an error?
Based on the above results (success or failure), I have to perform further operations.
Solutions or workarounds that I have found:
1. Create a Windows service which will monitor the file share folder and look for the file name (each record has a unique file name) for the reports that were picked up for PDF creation. If the file is not found, this service will report an error. Now, there's a glitch there; if a report takes a very long time to execute it will also be reported as an error (i.e. when this service checks for the PDF file, the report is still being generated). So I can't go with this solution.
2. I have also looked at the following tables in the ReportServer database:
a. Catalog - information regarding all the reports, folders, data source information, etc.
b. Subscriptions - all the subscription information.
c. ExecutionLog - information regarding execution of the subscriptions and also manual execution of reports.
d. Notifications - information regarding the errors that occurred during subscription execution.
For this solution, I was thinking of writing a Windows service which will monitor these tables and do further operations as required. This looks like the most feasible solution so far.
3. The third option is to look at DeliveryExtensions, but in that case I will have to manually call SSRS APIs and will have to manage report invocation and subscription information. What is your opinion on this?
My environment details:
Windows XP SP2
SQL Server 2005
Reporting Services 2005
Please let me know if I am missing something somewhere...
View 9 Replies
View Related
Dec 3, 2007
Hi,
I have an XML data file and an associated XSD file with properly defined datatypes. However, the XML Source treats all the data elements as the "string" datatype. For example, in my current XML file all the data elements are of the Decimal datatype, which is properly defined in the XSD file, yet the datatype of all the output columns is string.
Is it a bug or am I doing something wrong?
Thanks
Navnish
View 1 Replies
View Related
Apr 30, 2008
Hi,
I have a performance problem with my XML source component. In my XSD file I have over 300 outputs, and for each output of the XML component I generate a raw file (so I generate 300 raw files in the same data flow) that I load into a table in another data flow. When I debug just the data flow with the XML source, it takes a very long time (over 10 hours), the XML source component stays yellow, and it generates empty raw files; but when I interrupt the debug process it generates the right raw files. For information, I use a machine with 8 processors and 4 GB of RAM, and my XML file is 850 KB. So my question is, how can I increase the performance of my package without splitting the XML file?
Thank you in advance
View 7 Replies
View Related
Apr 12, 2008
Does anyone know of an SSIS source pipeline component which reads JSON, the data interchange format?
JSON looks pretty tempting for heavy data interchange (somewhat human-readable, name/value pairs plus arrays, nesting, lighter weight than most XML serializers), and if it's gaining momentum I should think a source component would follow (most likely third party).
View 12 Replies
View Related
Apr 4, 2007
I want to use the Script Component as a source, but I don't know how to code the output rows. Can someone give me a clue or some sample code? Thanks in advance.
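A sketch of the method a source-style Script Component needs to override, assuming an output named "Output 0" with Name and Age columns defined on the Inputs and Outputs page. It is shown in C# (supported from SSIS 2008 on; in SSIS 2005 the Script Component uses VB.NET with the same object model):
Code Snippet
public override void CreateNewOutputRows()
{
    // Each AddRow() emits one row; the buffer properties match the output columns.
    Output0Buffer.AddRow();
    Output0Buffer.Name = "John Doe";
    Output0Buffer.Age = 42;
    Output0Buffer.AddRow();
    Output0Buffer.Name = "Jane Smith";
    Output0Buffer.Age = 37;
}
In a real source the rows would typically come from whatever the script reads (a file, a web service, and so on) inside this method.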
View 11 Replies
View Related
Apr 6, 2008
I need to iterate through a bunch of packages and replace the source component in all data flows with a different source component type. That much I've got figured out, but I'm wondering the best way to fix all the LineageID references downstream.
I figured there would be some helper classes to do that for you, but the best I could come up with so far is to build a dictionary: the keys being the LineageIDs from the source component I just deleted, the values being the corresponding new columns' LineageIDs. Then I run the following code. Is there a better way?
Also, derived column transforms end up broken because the expression is still referencing the old LineageIDs, so I had to write a RefreshExpression function and run it on each input and output column downstream. That function replaces the Expression (which uses the LineageID) with the FriendlyExpression (which uses the column name). Also, I had to name the new source component the same name as the old so that two-part names (i.e. [My Source].[My Col]) in the derived column expressions would still be valid. There's got to be a better way! And I'm worried that derived column transform isn't going to be the only special case.
Any better way than the following?
Code Snippet
IDTSComponentMetaData90 rawSourceComponent = pipeline.ComponentMetaDataCollection.New();
rawSourceComponent.ComponentClassID = "DTSAdapter.RawSource";
CManagedComponentWrapper inst = rawSourceComponent.Instantiate();
inst.ProvideComponentProperties();
inst.SetComponentProperty("FileName", sRawFilePath);
inst.AcquireConnections(null);
inst.ReinitializeMetaData();
inst.ReleaseConnections();
IDTSPath90 path = pipeline.PathCollection.New();
path.AttachPathAndPropagateNotifications(rawSourceComponent.OutputCollection[0], input);
//fix all lineageIDs and other lineageID-dependent properties
foreach (IDTSComponentMetaData90 componentToFix in pipeline.ComponentMetaDataCollection)
{
foreach (IDTSInput90 inputToFix in componentToFix.InputCollection)
{
foreach (IDTSInputColumn90 inputCol in inputToFix.InputColumnCollection)
{
if (lineageIdReplacements.ContainsKey(inputCol.LineageID))
{
inputCol.LineageID = lineageIdReplacements[inputCol.LineageID];
RefreshExpression(inputCol);
}
}
}
foreach (IDTSOutput90 outputToFix in componentToFix.OutputCollection)
{
foreach (IDTSOutputColumn90 outputCol in outputToFix.OutputColumnCollection)
{
RefreshExpression(outputCol);
}
}
}
and RefreshExpression looks like:
private static void RefreshExpression(IDTSInputColumn90 inputCol)
{
IDTSCustomProperty90 friendlyExpression = null;
IDTSCustomProperty90 expression = null;
foreach (IDTSCustomProperty90 prop in inputCol.CustomPropertyCollection)
{
if (prop.Name == "FriendlyExpression")
friendlyExpression = prop;
else if (prop.Name == "Expression")
expression = prop;
}
if (friendlyExpression != null && expression != null)
{
expression.Value = friendlyExpression.Value;
}
}
View 7 Replies
View Related
Apr 6, 2006
While working on a recent project, I needed to import a number of old mainframe-generated text reports. Since they don't follow the one-record/one-row construct, the Flat File Source won't work.
So, I created a custom component that allows you to specify a Regular Expression pattern, and it will parse a text file and return columns. Each capture in the RegEx pattern is returned as a column, and you can also specify column names in the regex with the standard (?<colname>) syntax. The code is fairly basic really, but if you are comfortable with regular expression syntax, it can handle a huge variety of unusual text file formats. It also includes a UI editor for the RegEx pattern property that will allow you to test the pattern against a sample text file; it will highlight each row in blue and each column in green (see screenshot)
Since this might have applications for other folks, I added it to SourceForge under GPL. So if you might find that functionality useful, please check out http://sourceforge.net/projects/textregexsource/ and send me some feedback. There isn't much there yet, so check out the screenshots/news/release notes for the basics.
If there is demand, I'll create a more robust install package and documentation. Also looking for feature suggestions.
Cheers...Geof
View 2 Replies
View Related
May 6, 2008
Hi,
I am using the XML source component to integrate an XML file into a SQL Server database. To start, I tried to test only the XML component with a small XML file (43 KB) and its XSD (434 KB, not generated using the XML component), so my package contains a data flow with only the XML source component. When I execute the package, in the progress window I get:
1- Validation step 100%, (generate warning, because the outputs are not used)
2-Preparation of execution step 100%
3-Excution step
But when the "Execution step" starts it does not stop and it does not fail, so the XML component stays yellow indefinitely and generates temporary files in the "Temp" directory.
I run this package in a box with SQL server 2005 SP2 with, 8 Processor and 4GO of RAM.
So is there any solution or explanation for this problem? Is it a limitation of SSIS, and how can I increase the RAM available to SSIS?
I also generated an XSD file using the XML component, but I get the same problem.
Thank you in advance
View 6 Replies
View Related
Sep 26, 2007
A colleague of mine has discovered some behaviour in the XML Source component that I am having trouble understanding or explaining. Here is the (obfuscated) XML document that we are looking to parse:
<?xml version="1.0"?>
<ArrayOfWellPatternAssociation xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" >
<WellPatternAssociation OfficeCode="Office ABC" PatternName="18 0500" API14="04029597380001" ActiveIndicator="I" AllocationToPattern="0.50" />
<WellPatternAssociation OfficeCode="Office ABC" PatternName="18 0500" API14="04029632710001" ActiveIndicator="I" AllocationToPattern="1.00" />
<WellPatternAssociation OfficeCode="Office ABC" PatternName="18 0500" API14="04029632910001" ActiveIndicator="I" AllocationToPattern="0.50" />
<WellPatternAssociation OfficeCode="Office ABC" PatternName="18 0500" API14="04029632930001" ActiveIndicator="I" AllocationToPattern="0.33" />
</ArrayOfWellPatternAssociation>
When we use the 'Generate XSD...' button it comes up with the following schema:
Code Snippet
<?xml version="1.0"?>
<xsd:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsd="http://www.w3.org/2001/XMLSchema" attributeFormDefault="unqualified" elementFormDefault="qualified" targetNamespace="http://Chevron.UpstreamSolutions.Nau/CommonReferenceData/CommonReferenceData/">
<xs:element name="ArrayOfWellPatternAssociation">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" maxOccurs="unbounded" name="WellPatternAssociation">
<xs:complexType>
<xs:attribute name="OfficeCode" type="xs:string" use="optional" />
<xs:attribute name="PatternName" type="xs:string" use="optional" />
<xs:attribute name="API14" type="xs:unsignedLong" use="optional" />
<xs:attribute name="ActiveIndicator" type="xs:string" use="optional" />
<xs:attribute name="AllocationToPattern" type="xs:decimal" use="optional" />
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xsd:schema>
I'd like to draw your attention to the bit I've highlighted in red. SSIS has incorrectly defined this attribute as an unsignedLong. if you look at the data above you'll see that it has leading zeros therefore we need it to be interpreted as a string. Unfortunately (and here's the problem) there doesn't seem to be a way for us to change the generated xsd and thus refresh the external column metadata. The only way to change it is to go into the Advanced Editor and manually change the external column and output column metadata.
Is this by design? It seems very limiting if you ask me to not let us have control of what the metadata should be.
If I'm missing something please let me know.
Thanks
Jamie
View 16 Replies
View Related
Sep 21, 2007
Hello Readers,
I would like to use an OLAP Connection Manager in a script task.
I have found this link: http://msdn2.microsoft.com/en-us/library/ms136060.aspx, where a SQLConnection is used:
Dim connMgr As IDTSConnectionManager90
Dim sqlConn As SqlConnection
Dim sqlReader As SqlDataReader
Public Overrides Sub AcquireConnections(ByVal Transaction As Object)
connMgr = Me.Connections.MyADONETConnection
sqlConn = CType(connMgr.AcquireConnection(Nothing), SqlConnection)
End Sub
Does anyone know which code has to be used to call AcquireConnection for an OLAP connection manager?
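One workaround that is often suggested (not verified here): instead of casting whatever AcquireConnection returns, read the connection manager's connection string and open an AdomdConnection directly. A sketch in C# (the VB.NET version is a direct translation), assuming a connection manager named "MyOlapConnection" and a reference to Microsoft.AnalysisServices.AdomdClient:
Code Snippet
using Microsoft.AnalysisServices.AdomdClient;
string connStr = Dts.Connections["MyOlapConnection"].ConnectionString;
using (AdomdConnection olapConnection = new AdomdConnection(connStr))
{
    olapConnection.Open();
    // ... run MDX with an AdomdCommand here ...
}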
Thanks in advance!
Cheers,
Markus
View 1 Replies
View Related
Apr 29, 2008
I am debugging a Data Flow task in my SSIS package. When I run the package in debug mode, one of the OLEDB Data Sources turns red. I have rerouted all Error Output to a flat file, and put a Data Viewer on that path: no rows get sent. When I click the Preview button on this component in Design mode, I see the expected data and get no error messages. The connection does a simple table access...no SQL command. I don't see anything different between this component and other OLEDB sources in the same package that don't trigger any errors. I've tried dropping and re-creating the component with the same results.
What else can I do to debug this?
View 7 Replies
View Related
May 10, 2006
Hi
In the AcquireConnections method, using the statement below, I can get a connection object:
oledbConnection = cmado.AcquireConnection(transaction) as OleDbConnection;
From the connection object I can get the connection string by calling the
oledbConnection.ConnectionString property, which has all the details like database, user name and other information, but there is no password info.
How do I get the password information? I need it because I will use it to make OCI calls to fetch the data from the Oracle database in my custom source component.
Any help is much appreciated.
Thanks in advance.
View 10 Replies
View Related