Integration Services :: Element Not Exist In Collection Properties Error When Trying To Edit Data Flow Expressions
May 14, 2015
I'm trying to edit the Expressions of a Data Flow task. This seems to happen when I rename some of the Data Flow components but not always. The error I get is:
Element "[ADO Net Source].[SqlCommand]" does not exist in the collection "Properties"
However, if you look at the XML, this property does exist. So I'm not sure why this should occur.
I'm using SSIS 2008 R2 with Visual Studio 2008 V 9.0.30729.4462 QFE.
<component id="1" name="ADO Net Source" componentClassID="{2E42D45B-F83C-400F-8D77-61DDE6A7DF29}" description="Extracts data from a relational database by using a .NET provider." localeId="-1" usesDispositions="true" validateExternalMetadata="True" version="4" pipelineVersion="0" contactInfo="Extracts data from a relational database by using a .NET provider.;
[Code] ....
View 3 Replies
May 19, 2008
Hi,
This is Sanjeev.
I have an SSIS package, and from my C# program I want to add an Execute Package Task to this package's sequence container.
It creates the new package without any problem, but when I open the package and try to move the newly created Execute Package Task, it gives the following error:
"The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package."
This is my code:
{
    // entry, parentPackageBody and ddlApplication come from the surrounding page code.
    string str = (string)entry.Key;
    ArrayList alEntity = (ArrayList)entry.Value;
    ConnectionManager conMgr;
    Executable chPackage;
    TaskHost executePackageTask;
    Microsoft.SqlServer.Dts.Runtime.Application app = new Microsoft.SqlServer.Dts.Runtime.Application();
    //string packagePath = @"c:\Genesis.dtsx";
    //p = app.LoadPackage(packagePath, null);

    // Load the parent package from its XML definition.
    Package p = new Package();
    p.LoadFromXML(parentPackageBody, null);
    p.Name = str;

    // Locate the sequence container that will hold the Execute Package Tasks.
    //IDTSSequence seqContainer = (Sequence)p.Executables["Extract Genesis Data"];
    IDTSSequence seqContainer = (Sequence)p.Executables[0];

    string packageLocation = @"\Geneva Packages\";
    conMgr = p.Connections["SQLChildPackagesConnectionString"];

    foreach (string val in alEntity)
    {
        if (seqContainer.Executables.Contains("Load_" + val) == false)
        {
            // Add an Execute Package Task to the sequence container and configure it.
            chPackage = seqContainer.Executables.Add("STOCK:ExecutePackageTask");
            executePackageTask = (TaskHost)chPackage;
            executePackageTask.Name = "Load_" + val;
            executePackageTask.Description = "Execute Package Task";
            executePackageTask.Properties["Connection"].SetValue(executePackageTask, conMgr.Name);
            executePackageTask.Properties["PackageName"].SetValue(executePackageTask,
                packageLocation + ddlApplication.SelectedItem.Text + @"\" + executePackageTask.Name);
        }
    }

    app.SaveToXml(Server.MapPath("../SynchronizeScript/Packages/" + ddlApplication.SelectedItem.Text + @"/") + str + ".dtsx", p, null);
}
Please let me know what is wrong in my code.
Thanks in advance.
Regards,
Sanjeev Bolllina
sanjay.bollina@gmail.com
View 14 Replies
View Related
Apr 25, 2006
When executing my data flow package, which contains only one source and one destination:
OLE DB source -> SQL Server destination
the following errors occur in my output:
Error: 0xC0202009 at Data Flow Task(infraction action), SQL Server Destination [3600]: An OLE DB error has occurred. Error code: 0x80040E14.
Error: 0xC0202071 at Data Flow Task(infraction action), SQL Server Destination [3600]: Unable to prepare the SSIS bulk insert for data insertion.
Error: 0xC004701A at Data Flow Task(infraction action), DTS.Pipeline: component "SQL Server Destination" (3600) failed the pre-execute phase and returned error code 0xC0202071.
I've checked the structure of my source and destination tables but nothing seems to be wrong.
If someone has ever faced these errors, please help me. :D
View 22 Replies
View Related
Aug 24, 2015
I "upgraded" to Windows 10 (I was installing a new c: drive anyway). I installed SQL Server & SSIS, Visual Studio 2012, SQL Server Data Tools 2012, etc. When I try to load up my project (.sln) in SQL Data Tools I get the following warnings/errors:
Warning 1
Warning loading DataImport.dtproj: Warning: Failed to decrypt an encrypted XML node. Verify that the project was created by the same user. Project load will attempt to continue without the encrypted information.
Warning 2
Warning loading DataImport.dtproj: Warning: Failed to decrypt sensitive data in project with a user key. You may not be the user who encrypted this project, or you are not using the same machine that was used to save the project. If the sensitive
data is a parameter value, the value may be required to run the package on the Integration Services server.
Error 3
Error loading ImportFiles.dtsx: The version number in the package is not valid. The version number cannot be greater than current version number.
Error 4
Error loading ImportFiles.dtsx: Package migration from version 8 to version 6 failed with error 0xC001700A "The version number in the package is not valid. The version number cannot be greater than current version number.".
Error 5
Error loading ImportFiles.dtsx: Error loading value "<DTS:Property xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:Name="PackageFormatVersion">8</DTS:Property>" from node "DTS:Property".
Error 6
Error loading 'ImportFiles.dtsx' : The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return
value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.
As well as installing Windows 10, I had also renamed my computer. I have tried renaming it back (I noticed some references to the computer name in the XML), but no difference. Have I installed the wrong version of some of the software? If so, how could I check which one I need to install (to match the VS project/DTS package)?
View 3 Replies
View Related
Nov 24, 2006
Hi all,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, since that would occupy too much space? I really need guidance on that.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
View Related
May 9, 2012
I'm a beginner in SSIS. Use of pointers in a Data Flow task (Transformations)?
Royal PS
View 11 Replies
View Related
Dec 7, 2007
We are using an OLE DB Source for the Data Flow Source and an OLE DB Destination for the Data Flow Destination. The amount of data being moved is about 30 million rows, and it is gathered using a SQL command. There are no other transformations in between; it goes straight from one to the other. The flow starts amazingly fast, but after 5 million rows it slows considerably. I wonder if anyone has experienced anything similar with large loads.
View 6 Replies
View Related
May 11, 2015
Data Flow A takes data from Excel File A, Data Flow B from Excel File B, and Data Flow C from Excel File C. What I'd like is that if something goes wrong in Data Flow A I am alerted, but the package continues running. The same for Data Flow B: if A is OK, go on; if B fails, send me the mail but continue until the end (so Data Flow C still runs).
View 2 Replies
View Related
Feb 11, 2014
I have a requirement to read an encrypted file as a data source. I am not allowed to save an unencrypted text file version on disc at any time, for any length of time, so I created a custom source component that reads an encrypted CSV file, decrypts it, and then passes each row of data to the pipeline and ultimately to an OLE DB destination. Basically it is just a text file reader with an added class that decrypts the file before the component sets columns or reads rows.
The custom component, "Encrypted File Source", has a custom property "EncryptionKey" with the encryption-required flag set to true (code below) and is declared as eligible to be set in expressions.
IDTSCustomProperty100 EncryptionKey = ComponentMetaData.CustomPropertyCollection.New();
EncryptionKey.Name = "EncryptionKey";
EncryptionKey.Description = "Secure String key value to decrypt the file";
EncryptionKey.Value = string.Empty;
EncryptionKey.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
EncryptionKey.EncryptionRequired = true;
I want to be able to set the password for the encrypted file in the SQL Agent job that executes the SSIS project. This means I have an environment with a variable, "DataPassword", that is set to sensitive. It maps to a project parameter in the SQL Agent job that is also set to sensitive. And now I want to access that sensitive project parameter inside my data flow, specifically in the Encrypted File Source task that I created, and set my EncryptionKey to that project parameter.
The problem is that SSIS says:
"Expression cannot be evaluated. ... The expression will not be evaluated because it contains sensitive parameter value "$Project::DataFilePassord". Verify that the expression is used properly and that it protects sensitive information." (Microsoft.DataTransformationsServices.Controls)
[Code] ....
I am using SQL Server 2012, on a windows 7 box with VS2010 premium.
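For anyone hitting the same wall: the expression engine simply refuses to evaluate sensitive parameters, so one workaround is to read the value in script code with GetSensitiveValue() instead of binding it through an expression. A minimal sketch, assuming a Script Task and a sensitive project parameter named DataFilePassword (the names here are illustrative, not from the original package):
public void Main()
{
    // List $Project::DataFilePassword under the Script Task's ReadOnlyVariables.
    // GetSensitiveValue() is needed because .Value throws for sensitive parameters.
    string key = Dts.Variables["$Project::DataFilePassword"].GetSensitiveValue().ToString();

    // Hand the key to whatever needs it, e.g. a package variable the custom source reads;
    // note the copy is no longer marked sensitive, so treat it accordingly.
    Dts.Variables["User::EncryptionKeyRuntime"].Value = key;

    Dts.TaskResult = (int)ScriptResults.Success;
}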
View 4 Replies
View Related
Jul 2, 2010
In my SSIS Data Flow Task, I have a query that retrieves data based on a couple of date parameters. Is there a way we can pass/use the variables defined in the SSIS package in the query?
(I am assigning values to those variables from C# code)
The query should look like this:
select ordernumber, customerid from salesorder
where statecode=3 and datefulfilled between @variable1 and @variable2
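One approach that might work here (a sketch under assumed names, not necessarily the original setup): keep the whole statement in a string variable, build it from the date variables, and point the source at it with "SQL command from variable"; with an OLE DB Source you could instead keep ? parameter markers and map them on the Parameters page. The build-the-string route from a Script Task could look roughly like:
public void Main()
{
    // Dates assigned elsewhere (e.g. from the calling C# code) into package variables.
    DateTime from = (DateTime)Dts.Variables["User::Variable1"].Value;
    DateTime to = (DateTime)Dts.Variables["User::Variable2"].Value;

    // The data flow source then reads User::SourceQuery via "SQL command from variable".
    Dts.Variables["User::SourceQuery"].Value = string.Format(
        "select ordernumber, customerid from salesorder " +
        "where statecode = 3 and datefulfilled between '{0:yyyy-MM-dd}' and '{1:yyyy-MM-dd}'",
        from, to);

    Dts.TaskResult = (int)ScriptResults.Success;
}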
View 8 Replies
View Related
Apr 29, 2015
I have a Data Flow Task within a ForEach Loop Container. The source of the flow is an ADO.NET connection and the destination is a Flat File connection. I loop through a collection of strings in the ForEach loop. Based on the string content, I write some data to the same destination file in each iteration, overwriting the previous version. I am running into the following errors:
[Flat File Destination [38]] Warning: The process cannot access the file because it is being used by another process.
[Flat File Destination [38]] Error: Cannot open the datafile "Example.csv".
[SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
I know what's happening but I don't know how to fix it. The first time through the ForEach loop, the destination file is updated. The second time is when this error pops up. I think it's because the first iteration is not closing the destination file. How do I force a close of the file within the Data Flow task or through a subsequent Script Task? This works within a SQL 2008 package on one server but not within a SQL 2012 package on a different server.
View 5 Replies
View Related
Oct 19, 2015
I am using the SharePoint adapters from Codeplex that allow me to use SharePoint source and destination tasks in SSIS for SQL Server 2008 and SharePoint 2010. I am able to pull the data from the SQL Server and insert it into the SharePoint List.
However, I prefer to just have fresh data every time, so I'd like to add a step to delete all the items in the list before inserting the new ones. Is there a way I can configure the SharePoint SSIS destination task to clear all the items before I insert new ones?
View 3 Replies
View Related
Jun 1, 2015
Using SSIS 2012 (within Visual Studio) on Windows 7.
Before allowing my Data Flow task to fire, I'd like to check the target table (OLE DB Destination) for a specific date value in a specific field. I've seen how the Lookup Task is commonly used to check for dupes before inserting, but I'm not able to use that method because the data value I want to search the table for is contained in a Global Variable (let's say "MyVariableDate").
Is there any way to check for any records in a target table where Date1 = MyVariableDate (i.e. scanning the entire table for any occurrence of MyVariableDate in the Date1 field)?
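One way this is commonly done (a sketch only; the connection, table and variable names below are assumptions): run a parameterized COUNT before the Data Flow, store the result in a variable, and test that variable in the precedence constraint expression. An Execute SQL Task with "SELECT COUNT(*) ... WHERE Date1 = ?" mapped to User::MyVariableDate does it declaratively; the same check from a Script Task looks roughly like:
// Assumes an ADO.NET connection manager named "TargetDb" and a package variable User::RowsFound.
using (var conn = new System.Data.SqlClient.SqlConnection(
           Dts.Connections["TargetDb"].ConnectionString))
using (var cmd = new System.Data.SqlClient.SqlCommand(
           "SELECT COUNT(*) FROM dbo.TargetTable WHERE Date1 = @d", conn))
{
    cmd.Parameters.AddWithValue("@d", Dts.Variables["User::MyVariableDate"].Value);
    conn.Open();
    int matches = (int)cmd.ExecuteScalar();

    // A precedence constraint into the Data Flow can then test @[User::RowsFound] > 0 (or == 0).
    Dts.Variables["User::RowsFound"].Value = matches;
}
Dts.TaskResult = (int)ScriptResults.Success;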
View 12 Replies
View Related
Nov 3, 2015
Suppose I have a "Foreach Loop Container" that iterates over a list. Is it possible to execute different Data Flow Tasks based on the input?
Example: the list contains elements L1, L2 & L3.
The ForEach Loop Container checks the input. If it's L1 then it should execute DF Task 1, if L2 then execute DF Task 2, and similarly for L3.
Is it possible to achieve this?
View 4 Replies
View Related
Sep 25, 2015
I have created an event handler that contains a Data Flow Task with an OLE DB source and an Excel destination.
This event is executed/triggered based on an execute SQL task failure in the control flow Sequence container.
However, when I execute the Data Flow task of the Event Handler, it runs successfully but fails when I execute the whole package.
I get the below error message:
[OLE DB Source [21]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "TK463DW" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I have tried setting the property 'DelayValidation' to 'True' on all the Control Flow and Data Flow tasks in the package and on the Event Handler, but still I could not fix this. Not sure what I am missing.
View 4 Replies
View Related
Feb 3, 2014
I recently upgraded to 2012 SP1 CU5 and have found the SSDT GUI for SSIS to be almost unusable. I can't drag or resize items. Any time I try, they either automagically shrink to the tiniest possible size, shoot off to some extreme, or just shake uncontrollably. I didn't have these problems on previous versions (don't remember what it was).
Is there a fix for this?
View 9 Replies
View Related
Jun 4, 2015
I have a huge amount of data and I am loading it from Excel to a database table. After loading 80 percent of the data I get an error. My package failed; it has lots of transformations and took around 6 hours to process completely, so I don't want it to reload from the start. If I run it again it should start from the next record after the one where I got the error.
View 3 Replies
View Related
Oct 18, 2015
We have a single generic SSIS package that is used to import several hundred iSeries tables into SQL. I am not looking to rewrite the process. But I am looking for ways to improve performance.
I have tried Retain Same Connection, Maximum Insert Commit Size, table lock (TABLOCK), removed some large columns, played with the log file location and size, and now I am working to tweak DefaultBufferMaxRows.
To describe the data flow task - there are six data flow tasks (dft) working at the same time. Each dft has its own list of iSeries tables and columns and the corresponding generic SQL table names. Each dft determines its list of tables based on the number of columns to import. So there is dft30 (iSeries table has 1-30 columns to import), dft60 (iSeries table has 31-60 columns to import), etc. The destination SQL tables are generically called Staging30, Staging60, etc. Each column in the generic Staging tables is varchar(100). The dfts are comprised of an OLE DB Source and an OLE DB Destination.
The OLE DB Source uses a SQL Command from Variable to build a SELECT statement. The OLE DB Source uses a connection manager that uses an IBM iAccess IBMDA400 provider. The SQL Command ends up looking like this for dft30. This specific example is importing from the iSeries table TDACLR, and it only has two columns, so it will be copied to the Staging30 table.
select TCREAS AS C1,TCDESC AS C2,0 AS C3,0 AS C4,0 AS C5,0 AS C6,0 AS C7,0 AS C8,0 AS C9,0 AS C10,0 AS C11,0 AS C12,0 AS C13,0 AS C14,0 AS C15,0 AS C16,0 AS C17,0 AS C18,0 AS C19,0 AS C20,0 AS C21,0 AS C22,0 AS C23,0 AS C24,0 AS C25,0 AS C26,0 AS C27,0 AS C28,0 AS C29,0 AS C30,''TDACLR'' AS T0 from Store01.TDACLR
The OLE DB Source variable value looks like the following, but I am not showing the full 30 columns:
select cast(0 AS varchar(100)) AS C1,cast(0 AS varchar(100)) AS C2,cast(0 AS varchar(100)) AS C3,cast(0 AS varchar(100)) AS C4,cast(0 AS varchar(100)) AS C5, ... cast(0 AS varchar(100)) AS C30.
The OLE DB Destination uses OpenRowSet Using FastLoad From Variable. The insert into Staging30 ends up looking like this.
insert bulk STAGE30([C1] varchar(100) ,[C2] varchar(100) ,[C3] varchar(100) ,[C4] varchar(100) ,[C5] varchar(100) , ... ,[C30] varchar(100) ,[T0] varchar(20)
Of course we then copy and transform the Staging30 data to the SQL table that equals T0.
But back to DefaultBufferMaxRows. Previously the dfts had default values of 10000 for DefaultBufferMaxRows and 10485760 for DefaultBufferSize. I added a SQL task to SUM the iSeries column sizes, TCREAS and TCDESC in this example, and set the DefaultBufferMaxRows by dividing the SUM of the columns' max_length into 10485760. But I did not see a performance improvement. Do you think that redefining the columns as varchar(100) for the insert is significant? Should I possibly SUM the actual number of columns (2) as 2x100, or SUM the 30x100?
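One thing worth noting (back-of-the-envelope only, not a measured result): the buffer is sized against the row layout the pipeline actually carries, so with everything cast to varchar(100) the relevant width is the full 30x100 staging layout rather than the two source columns:
// Rough rows-per-buffer estimate for the generic Staging30 layout.
int defaultBufferSize = 10485760;            // 10 MB DefaultBufferSize
int rowWidthBytes = 30 * 100 + 20;           // 30 x varchar(100) + T0 varchar(20) = 3020 bytes worst case
int rowsPerBuffer = defaultBufferSize / rowWidthBytes;   // roughly 3,472 rows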
View 4 Replies
View Related
Sep 22, 2015
Basically I'm trying to create an SSIS workflow to download SharePoint List data to SQL Server on a schedule of some kind. Do we actually have to use the GAC install approach in order to get the SharePoint List Destination and SharePoint List Source entries to appear among the SSIS project workflow entities?
View 4 Replies
View Related
Jul 26, 2006
I'm trying to set the value of the SelectedDatabases collection property of an Update Statistics Task using an expression for the task, basing it on a user variable. Ultimately, I'll set that variable's value from a configuration file at run time and thereby specify the database whose statistics I want to update.
When I set this up in the designer, I get an error at the time I try to save the package:
TITLE: Microsoft Visual Studio
------------------------------
Nonfatal errors occurred while saving the package:
Error at Update Statistics Task: The result of the expression "@[User::SelectedDatabase]" on property "SelectedDatabases" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
Error at Update Statistics Task: The result of the expression "@[User::SelectedDatabase]" on property "SelectedDatabases" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
------------------------------
BUTTONS:
OK
------------------------------
I used the expression editor's expression builder to select the user variable, so I assume the syntax is correct. However, I suspect there's some additional syntax required to store that value in the collection's array. I cannot find any references to any special formats and am wondering if anyone knows how to do this correctly. On the other hand, maybe it is telling me that setting this property from an expression is not even allowed. If it's not allowed, why is it in the list of properties to set? In any case, none of this is clear.
Any ideas or suggestions are appreciated. Thanks,
Joe
View 8 Replies
View Related
Jul 2, 2015
Is it possible to do this? I'm getting lock violations when I try to execute several tasks in parallel.
View 4 Replies
View Related
Jun 16, 2015
We have created an SSIS package to load text files into a table. The source system shares 10 text files, and recently they stopped generating data for one of the files (it is coming in empty); after a few months they will start generating data for that file again as part of batch processing.
The issue here is that the Data Flow task fails while loading the empty text file into the table. How do I handle this empty file load issue in the SSIS package?
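One way to handle it (a sketch only; the variable names are made up): put a small Script Task before the Data Flow that checks whether the current file actually has data and sets a Boolean variable, then let the precedence constraint into the Data Flow test that variable:
public void Main()
{
    // User::SourceFile holds the full path of the current text file;
    // User::FileHasData gates the Data Flow via a precedence constraint expression.
    var file = new System.IO.FileInfo(Dts.Variables["User::SourceFile"].Value.ToString());

    // Treat a missing or zero-byte file as empty; adjust the threshold if the file always has a header row.
    Dts.Variables["User::FileHasData"].Value = file.Exists && file.Length > 0;

    Dts.TaskResult = (int)ScriptResults.Success;
}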
View 3 Replies
View Related
Sep 18, 2006
Hi,
I'm creating a custom data flow transformation in c#.
I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.
So I've added a custom property ("Expression") on my output column and set its expression type to CPET_NOTIFY:
IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();
exp.Name = "Expression";
exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
But in the ProcessInput method I don't manage to get the evaluated expressions; when I use exp.Value I get my expression definition and not its evaluation.
Is there a way to get these evaluated expressions?
Thanks,
Stéphane
View 6 Replies
View Related
Jan 21, 2008
Hello,
I use an XML data flow item transformation in SSIS (XML Source). Here is an example of the input XML file content:
<extract date="2007-12-05">
  <counters>
    <counter category="dispatcher" name="server1">
      <runtime>6</runtime>
      <queue>3</queue>
      <maxrequest>8</maxrequest>
      <color>blue</color>
      <host>
        <name>svo2555</name>
        <path>\dispatcher</path>
        <lastaccessed>2007-02-03</lastaccessed>
      </host>
    </counter>
    <counter category="gateway" name="server1">
      <runtime>1</runtime>
      <queue>10</queue>
      <maxrequest>10</maxrequest>
      <color>purple</color>
      <host>
        <name>svo2555</name>
        <path>\gateway</path>
        <lastaccessed>2007-02-03</lastaccessed>
      </host>
    </counter>
  </counters>
</extract>
How do I catch the extract date in the transformation? SSIS only shows me the <counters> element and not the <extract> one. Why? Is there a way to get it?
Regards,
View 1 Replies
View Related
May 1, 2007
I've got a package which reads a text file into a table and updates another. I set up configurations so that I could import it into the SSIS store on both my dev and live servers. Now, I'm getting this error. I tried removing the configs and am still getting it.
I've been through each step and everything looks okay. Does anyone have any idea (a) what's wrong, (b) how to localise the error or (c) get any additional information? Or do I just have to recreate the package from scratch?
TITLE: Package Validation Error
------------------------------
Package Validation Error
------------------------------
ADDITIONAL INFORMATION:
Error at PartnerLinkFlatFileImporter: The connection "" is not found. This error is thrown by Connections collection when the specific connection element is not found.
Error at PartnerLinkFlatFileImporter [Log provider "SSIS log provider for SQL Server"]: The connection manager "" is not found. A component failed to find the connection manager in the Connections collection.
(Microsoft.DataTransformationServices.VsIntegration)
------------------------------
BUTTONS:
OK
------------------------------
View 20 Replies
View Related
May 16, 2008
Hi,
How do I retrieve the connections (connection managers) collection from a custom Data Flow destination? ComponentMetaData.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow component.
I came across code in which it was possible to access the Connections collection using the IDtsConnectionService for a custom task (destination). A custom task has access to the serviceProvider, which can be used to get access to the IDtsConnectionService interface, but a custom data flow component does not.
Any help appreciated.
Thanks
Naveen
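As far as I know, a pipeline component only sees the connections it declares in its own RuntimeConnectionCollection; the package's other connection managers are not handed to it. A rough sketch of the documented pattern (SSIS 2008 interface names; on 2005 the 90-suffixed equivalents apply, and "SourceConnection" is just a placeholder name):
public override void ProvideComponentProperties()
{
    base.RemoveAllInputsOutputsAndCustomProperties();

    // Declare a connection slot; the user binds one of the package's connection managers to it at design time.
    IDTSRuntimeConnection100 conn = ComponentMetaData.RuntimeConnectionCollection.New();
    conn.Name = "SourceConnection";
}

public override void AcquireConnections(object transaction)
{
    if (ComponentMetaData.RuntimeConnectionCollection[0].ConnectionManager != null)
    {
        // Wrap the native connection manager so the runtime ConnectionManager API can be used.
        ConnectionManager cm = Microsoft.SqlServer.Dts.Runtime.DtsConvert.GetWrapper(
            ComponentMetaData.RuntimeConnectionCollection[0].ConnectionManager);
        object connection = cm.AcquireConnection(transaction);
        // Cast 'connection' to the expected type (e.g. SqlConnection) and keep it for PrimeOutput/ProcessInput.
    }
}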
View 5 Replies
View Related
Apr 28, 2015
I found a problem in the SQL that's part of an SSIS package written by a third party, which is in our SQL Server 2008 R2 database. I've discussed this with the third-party person; she's given me the necessary changes to the SQL. But I don't know what to do next. We've got this package on a test database server. I've got SSIS open to Stored Packages and have drilled down to the necessary folders to find it. But beyond that I don't know what to do. I understand that BIDS is involved, but that's as far as my understanding of it goes.
View 3 Replies
View Related
Mar 21, 2008
Hi,
I have SQL Server 2005/BIDS installed on a 64-bit server. When I open an SSIS package, the properties window for each of the Data Flow tasks is empty. The properties window is there and displays correctly for other types of task, but for the Data Flows it only shows the name of the task. To further compound my misery, if I try to open a package that dynamically sets a property of a Data Flow task (using an expression), the package fails to open with errors concerning reading the XML of the package.
Packages containing Data Flow tasks still run on this server (both in BIDS and using dtexec) as long as they don't contain expressions that set any Data Flow properties.
Any ideas?
Thanks
View 3 Replies
View Related
Sep 21, 2015
I have an SSIS package that is pretty simple.
An Execute SQL task returns one row with two values that are correctly stored into variables.
Based off those two variables, a sequence container is chosen to execute.
That sequence container then does magic. The last step of the container has an Execute SQL task that runs and stores the result in a variable - let's call this [User::result] with type Int32.
Based off the value in [User::result], one of two final tasks runs. A non-zero value executes one task; a zero value executes another. Basically, SOMETHING runs after this.
Simple? I thought so... except Step 4 doesn't work. If I set a post-execute breakpoint on the sequence container, the variables are populated as expected, but nothing else runs. If I add another task (without a conditional expression) to run after the sequence container completes, the pre-execute breakpoint shows the data looking exactly as I expect.
Those script tasks are just MessageBox.Show calls and the expression, as you can see, doesn't use variables at all.
View 5 Replies
View Related
Oct 13, 2015
I am facing an issue where the Data Flow task fails after loading 29,000 rows out of 2 lakh (200,000) rows.
I am loading data from .csv file to OLE DB Destination.
This data flow task is placed inside For each loop container.
Is this issue because of a performance problem in the SSIS package, such as buffer size?
Find the error below:
DFT Load Data from FlatFile:Error: The conditional operation failed.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.
The "DER Add Calc Columns" failed because error code 0xC0049063 occurred, and the error row disposition on "DER Add Calc Columns.Outputs[Derived Column Output].Columns[M_VALUE_NUM]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "DER Add Calc Columns" (48) failed with error code 0xC0209029 while processing input "Derived Column Input" (49). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
[code]....
View 8 Replies
View Related
Jul 20, 2015
I've deployed my SSIS package to the server and created a SQL job to run this package. So far, everything is fine. Today, I got a request to change some variables inside the package, which are part of the .dtsConfig. I want to edit the deployed .dtsConfig but it won't allow me and always complains that this file has been opened by another program. I am sure I've closed my SSIS designer and any Notepad windows, so why can't I edit and save the .dtsConfig file?
View 4 Replies
View Related
Jul 7, 2015
I have declared one parameter at the project level (project param) with some value.
I want to edit that variable through a Script Task using C# / VB code.
I am looking for C#/VB code to be used in a Script Task to edit a project-param-level variable [not a package-level variable].
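As far as I'm aware, project parameters are read-only while the package is executing, so a Script Task can read a $Project:: value but cannot write it back; anything that needs to change at runtime has to live in an ordinary variable. A small sketch (the parameter and variable names are made up):
public void Main()
{
    // List $Project::MyProjectParam under ReadOnlyVariables; reading it works fine.
    string configuredValue = Dts.Variables["$Project::MyProjectParam"].Value.ToString();

    // Assigning to the parameter itself fails because parameters are read-only during execution,
    // so copy the value into a normal package variable if it has to be modified.
    Dts.Variables["User::WorkingValue"].Value = configuredValue + "_modified";

    Dts.TaskResult = (int)ScriptResults.Success;
}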
View 3 Replies
View Related
May 14, 2015
I would like to store file properties (title, created by, created time, modified time) and load them into an audit table along with the file name. I know how to design this process inside a ForEach loop with a Script Task and an Execute SQL Task.
I'm trying to achieve this by editing my existing code which captures the most recent file based on its file properties.
public void Main()
{
// TODO: Add your code here
var directory = new DirectoryInfo(Dts.Variables["User::Csv_Source"].Value.ToString());
FileInfo[] files = directory.GetFiles("*.csv");
DateTime lastModified = DateTime.MinValue;
[Code] ....
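For the audit columns themselves, most of what is wanted comes straight off FileInfo, and "created by" can usually be taken as the file's NTFS owner. A rough continuation of the snippet above (requires using System.Linq; the owner lookup assumes the full .NET Framework, and the variable names are illustrative):
// Pick the newest file and capture its properties for the audit table.
FileInfo newest = files.OrderByDescending(f => f.LastWriteTime).First();

string createdBy = System.IO.File.GetAccessControl(newest.FullName)
    .GetOwner(typeof(System.Security.Principal.NTAccount)).ToString();

Dts.Variables["User::AuditFileName"].Value = newest.Name;
Dts.Variables["User::AuditCreatedBy"].Value = createdBy;
Dts.Variables["User::AuditCreatedTime"].Value = newest.CreationTime;
Dts.Variables["User::AuditModifiedTime"].Value = newest.LastWriteTime;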
View 3 Replies
View Related