Unable To Execute Package Programmatically With Expression Based Tasks
Dec 8, 2006
Hi,
I am trying to programmatically execute a package that contains an Execute SQL Task component bound to a variable for its "SqlStatementSource" property (via an expression). The variable is of type String and contains a simple value of "SELECT 1". The Execute SQL Task contains an expression that sets the SqlStatementSource property to the value of this variable.
The package runs fine when I execute it via dtexec or BIDS, but when I attempt to run it via the object model, I receive the following error message:
The result of the expression ""@[User::Sql]"" on property "SqlStatementSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
I did a search on this forum and noticed quite a few threads about this same issue, but no explanation/solution. We have quite a few packages that have dynamically constructed SQL statements for Execute SQL Tasks, and they are all failing to run via the object model. Is there something that I am missing?
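For reference, here is roughly how I am loading and executing the package through the object model (a minimal sketch; the path and class names are just placeholders):

using System;
using Microsoft.SqlServer.Dts.Runtime;

class PackageRunner
{
    static void Main()
    {
        Application app = new Application();

        // Load the .dtsx from disk and execute it in-process.
        Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);
        DTSExecResult result = pkg.Execute();

        // The "cannot be written to the property" message shows up in the Errors collection.
        foreach (DtsError err in pkg.Errors)
            Console.WriteLine(err.Description);

        Console.WriteLine("Result: " + result);
    }
}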
I'm trying to keep track of the ETL process by inserting/updating a row in one table for each package that finishes in my ETL process. So far, I have created a Script Task that increments a counter variable by one and then opens a connection to my database and inserts/updates my table. What I want to see is Step 1/30, Step 2/30 and so on. Right now I can display Step 1, Step 2, but how can I get the overall number of tasks within a package?
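In case it helps to show what I mean, here is a rough, untested sketch of how the total could be counted by walking the Executables collection recursively through the runtime API. It assumes whatever does the counting has a reference to the Package object (for example a small utility or the parent process); the path is only illustrative.

using System;
using Microsoft.SqlServer.Dts.Runtime;

class TaskCounter
{
    // Counts TaskHosts in a container, including those nested in Sequence,
    // For Loop and Foreach Loop containers (they all implement IDTSSequence).
    static int CountTasks(IDTSSequence container)
    {
        int count = 0;
        foreach (Executable e in container.Executables)
        {
            if (e is TaskHost)
                count++;

            IDTSSequence nested = e as IDTSSequence;
            if (nested != null)
                count += CountTasks(nested);
        }
        return count;
    }

    static void Main()
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\MyEtlPackage.dtsx", null); // illustrative path
        Console.WriteLine("Total tasks: " + CountTasks(pkg));                  // Package implements IDTSSequence
    }
}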
I have come across a situation where there are 10 tasks. The second task in the flow is a Script Task which disables all further tasks based on a condition. I thought the logic would be better if we force-terminated the package successfully at that stage itself. How can this be done?
In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of the packages. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync. We want to use a standard OLE DB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use that new connection manager is a daunting task. I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLE DB data source. However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object, but rather a COM object. I can't seem to find any documentation or examples of how to iterate the components within a data flow and change the connection manager. If you have any information, that would be quite helpful. If you reply with a code sample, and if you would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.
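For what it's worth, here is the rough shape of what I am attempting: casting the data flow task's InnerObject to MainPipe from the pipeline wrapper and walking each component's runtime connections. This is only a sketch (it does not recurse into Sequence or Loop containers, and the names are placeholders), not working code.

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline;           // for DtsConvert
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class ConnectionSwapper
{
    // Points every runtime connection of every data flow component at newConnMgr.
    public static void RepointDataFlows(Package pkg, ConnectionManager newConnMgr)
    {
        foreach (Executable e in pkg.Executables)
        {
            TaskHost host = e as TaskHost;
            if (host == null)
                continue;

            MainPipe pipe = host.InnerObject as MainPipe;   // non-null only for Data Flow Tasks
            if (pipe == null)
                continue;

            foreach (IDTSComponentMetaData90 component in pipe.ComponentMetaDataCollection)
            {
                foreach (IDTSRuntimeConnection90 conn in component.RuntimeConnectionCollection)
                {
                    conn.ConnectionManagerID = newConnMgr.ID;
                    conn.ConnectionManager = DtsConvert.ToConnectionManager90(newConnMgr);
                }
            }
        }
    }
}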
I am making some changes to a tracking SP that I'm using in about 100 SSIS packages. I don't want to go into each package and add the additional parameter (A "?") and map a variable to some ordinal position. I'd like to do this programmatically.
I have some experience in validating variables and manipulating connection objects in packages through a utility that I wrote. However, I cannot find any guidance on how to create/edit Execute SQL Tasks via code in the package. I would like to do something like this:
Code Snippet:
private Microsoft.SqlServer.Dts.Runtime.Application app = new Microsoft.SqlServer.Dts.Runtime.Application();
Package p = app.LoadPackage(DTSPackage, null);
p.Executables["My Execute SQL Task"]
However, the Executables collection returns an Executable object that is a DtsObject. Do I have to add a reference to the task and cast this executable to that type? Any help is appreciated.
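To illustrate what I am after, I imagine something along these lines; the cast needs a reference to Microsoft.SqlServer.SQLTask.dll, and the variable name, ordinal and data type below are only illustrative guesses, not tested code.

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Tasks.ExecuteSQLTask;

class SqlTaskEditor
{
    public static void AddParameter(string packagePath)
    {
        Application app = new Application();
        Package p = app.LoadPackage(packagePath, null);

        // The Executables collection hands back a DtsObject; the concrete task
        // sits behind TaskHost.InnerObject.
        TaskHost host = (TaskHost)p.Executables["My Execute SQL Task"];
        ExecuteSQLTask sqlTask = (ExecuteSQLTask)host.InnerObject;

        // Append the extra "?" and map a variable to it.
        sqlTask.SqlStatementSource = sqlTask.SqlStatementSource + ", ?";

        IDTSParameterBinding binding = sqlTask.ParameterBindings.Add() as IDTSParameterBinding;
        binding.DtsVariableName = "User::NewParam";          // hypothetical variable
        binding.ParameterDirection = ParameterDirections.Input;
        binding.ParameterName = "3";                         // ordinal position for an OLE DB connection
        binding.DataType = 3;                                // OLE DB type code; match the variable's type
        binding.ParameterSize = -1;

        app.SaveToXml(packagePath, p, null);
    }
}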
I have a Package and a DataFlow Task. The Package has TransactionOption=Required. The DataFlow Task has an OLE DB Source and an OLE DB Destination. The DataFlow Task has TransactionOption=Supported. The package executes on a Workstation and DataSources for the OLE DB Source and the OLE DB Destination are on a Server.
After the package was launched, the following error messages were shown:
[OLE DB Destination [43]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DWH_Destination" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. [DTS.Pipeline] Error: component "OLE DB Destination" (43) failed the pre-execute phase and returned error code 0xC020801C.
[Connection manager "DWH_Destination"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 "The transaction manager has disabled its support for remote/network transactions.". [Connection manager "DWH_Destination"] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8004D024.
If I set TransactionOption=NotSupported on the DataFlow Task then the package executes successfully.
I have some "Execute T-SQL Statement Tasks" in a package. I would like to run this same package on another SQL Server without having to change it on the other server. Since the server name can be given when setting up the connection, I think if I leave the server name out then the package could run on any server? Is my assumption correct?
I have created a logging test package that I am attempting to execute from SQL Agent. It uses an environment variable in Package Configurations to set a variable which is the expression value for the connection manager 'ole_src_Admin' ConnectionString. The next three Package Configurations use that connection to retrieve SQL Server configuration types.
Using the same account I can execute the package successfully from the command line. The execution string is dtexec /SQL "Parent Package" /SERVER "BLAHSQL-DEV" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING V
Attempting to execute this via SQL Agent (job step type CmdExec) I receive the following error:

Date: 9/26/2007 9:47:45 AM
Log: Job History (Logging Test)
Step ID: 3
Server: QRISQL-DEV
Job Name: Logging Test
Step Name: Execute 2005 Package
Duration: 00:00:01
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0

Message:
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 9:47:45 AM
Info: 2007-09-26 09:47:46.28 Code: 0x40016038 Source: Parent Package Description: The package is attempting to configure from the environment variable "SQL_Parameter_Connect_String". End Info
Info: 2007-09-26 09:47:46.28 Code: 0x40016040 Source: Parent Package Description: The package is attempting to configure from SQL Server using the configuration string ""ole_src_Admin";"[dbo].[Parameter]";"sv_ssis_package_store_connectionstring";". End Info
Error: 2007-09-26 09:47:46.30 Code: 0xC0202009 Source: Parent Package Connection manager "ole_src_Admin" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21. End Error
Info: 2007-09-26 09:47:46.30 Code: 0x40016040 Source: Parent Package Description: The package is attempting to configure from SQL
I have also successfully executed this package in the same manner locally, using both the command line and SQL Agent.
I'm creating an SSIS package that will execute legacy dts packages. The package to be executed is decided at runtime using a sql query task.
Executing a dts package statically works fine, but when I try to set the details of the dts to be run via expressions, I get the error below.
To make it dynamic, I created a variable of type string and put the package name in it. I also have a string variable for the PackageID. Then I set up expressions on the Execute DTS 2000 Package Task that set the PackageName and PackageID properties to these variables.
The PackageId is a string variable I've retrieved using:
select top 1 name, cast(id AS varchar(50)) id, cast(versionid AS varchar(50)) versionid from sysdtspackages where name = @PackageName order by createdate desc
When I use the task to set the package ID it works fine (by selecting a DTS package, then changing the name), but when I try to provide the package ID I get this message:
but the method signature specifies String PackageGuid, and it is a string...
Any ideas?
I tried casting the variable like so:
(DT_GUID) @[User::TSPkgId], as the versionid is a uniqueidentifier in sysdtspackages. It didn't like that at all (cannot cast from type DT_WSTR to DT_GUID, error code 0xC00470C2).
I would like to evaluate expressions from within my execute function in a custom task. I saw this post from about 8 months ago detailing how it would be possible:
I've been using SSIS for a month or two and now find I need to create some custom tasks to perform some performance logging. In the overloaded ProvideComponentProperties section I am trying to create a property which has the same look as the Expressions properties you find elsewhere (a little + on the left and a group of sub-properties when expanded).
I have played with creating an IDTSCustomPropertyCollection90 collection and then adding my sub-properties to it, but I can't seem to then add my new collection to the ComponentMetaData.CustomPropertyCollection.
I'm assuming the Expressions property is a collection added to the properties collection, but I can't figure out how. Any help would be much appreciated.
I have experienced a problem while trying to use a variable with an expression based on several other variables in tasks running in parallel.
The details are as following:
There is an SSIS package with a simple control flow: one Script Task which actually does nothing, and two Execute Process Tasks, which run after the Script Task in parallel. Then there are three simple (EvaluateAsExpression = False) string variables ServerName, Folder and JobNumber with values ServerName = "\test", Folder = "test" and JobNumber = "12345". And there is one variable FullPath with the expression @[User::ServerName] + "\" + @[User::Folder] + "_" + @[User::JobNumber]. All the variables are of package scope. Then in the Execute Process Tasks I have similar expressions based on the FullPath variable: Execute Process Task 1 has the expression @[User::FullPath] + "\date.bat" and Execute Process Task 2 has @[User::FullPath] + "\time.bat". As you understand, these expressions define what exactly each task should execute.
Then I'm going to execute the package from the command line, so an appropriate XML configuration file has been created. The file contains the following values for the variables described above: ServerName = "\LiveServer", Folder = "Job" and JobNumber = "33091".
After a series of consecutive executions I got the following log file:
Execute Process Task 1: Executing the process "\LiveServer\Job_33091\date.bat"
Execute Process Task 2: Executing the process "\Test\test_12345\time.bat"
Execute Process Task 1: Executing the process "\Test\test_12345\date.bat"
Execute Process Task 2: Executing the process "\LiveServer\Job_33091\time.bat"
Execute Process Task 1: Executing the process "\LiveServer\Job_33091\date.bat"
Execute Process Task 2: Executing the process "\Test\test_12345\time.bat"
Execute Process Task 1: Executing the process "\LiveServer\Job_33091\date.bat"
Execute Process Task 2: Executing the process "\LiveServer\Job_33091\time.bat"
...
As you can see, one of the Execute Process Tasks usually receives the correct value of the expression (based on the values of the variables from the configuration file) while the other receives an incorrect one (based on the "default" values of the variables set directly in the package). Sometimes the wrong value appears in Task 1, next time in Task 2. Situations where both expressions in the tasks are evaluated correctly are very rare.
Then if you add some more Execute Process Tasks with similar expressions to the package (for example, simply by copying the existing tasks), you'll get a good chance of catching an error like this:
OnError,,,Execute Process Task 1,,,8/17/2007 2:07:12 PM,8/17/2007 2:07:12 PM,-1073450774,0x,Reading the variable "User::FullPath" failed with error code 0xC0047084.
OnError,,,Execute Process Task 1,,,8/17/2007 2:07:12 PM,8/17/2007 2:07:12 PM,-1073647613,0x,The expression "@[User::FullPath] + "\time.bat"" on property "Executable" cannot be evaluated. Modify the expression to be valid.
It seems the variable with the expression, FullPath, is locked during evaluation by one of the parallel tasks in such a way that the other task can't read its value correctly. Can someone help me with this issue? Maybe there are some options I missed which could prevent such behavior? Please let me know how I can make the package work correctly.
I have a For Loop Container with two SQL tasks inside. I'm unable to add a precedence constraint between these two tasks inside the container. What's wrong? I want to execute one task in the container first, then the other. So, I guess, I have to link the two tasks, but it's not possible.
Hi, I have a table with queries. I need to execute those queries and pass the results into a variable, then use that variable/result to execute other queries to make business decisions.

EXAMPLE:
TASK #1: TABLE-A has the queries below:
select count(*) from employee
select count(*) from managers

I want to execute those queries and store the results in @counts. How do I execute all the queries in TABLE-A and pass the results to a variable?

TASK #2: Then I have another SQL task (maybe an Execute SQL Task) which uses the value of @counts to make some decisions: if @counts > 1 then pass, if @counts < 1 then fail.
How can I do that?
I am still new to SSIS and not very familiar with variables. Any advice would be appreciated.
I have a simple Error row redirection (from an OLEDB Command) to redirect all rows in error to a Transform script and thereafter to a Flat file destination. This is via the red arrow (DF path) output from the OLEDB Command.
I don't understand why this leg executes even when there are no rows in error. Zero-byte flat files get written out when there are no errors.
How come? Why would a path with a red arrow execute even when there are no errors?
Part 2: When I introduce some errors in the data to cause an integrity violation, and I hook up an OnError event handler, it is never raised even though the error rows are successfully redirected and written out to the flat file destination.
So what constitutes an error for a Data Flow Task? Does an error raised by SQL Server for an integrity violation bubble up as an error in the SSIS package?
Maybe I'm missing something, but I can't find how to run multiple tasks in sequence while in Visual Studio debug mode. In DTS design mode I grew accustomed to right-clicking tasks one at a time, but in SSIS I find the additional step of having to exit debugging mode after every task gets old after a while.
There must be a way to start execution at a certain task and have the package continue all the way to some other specified task. It would also be nice to have every task in a Group execute in sequence and stop (even if connections continue beyond the group). I could even settle for repeatedly clicking the Continue button in Debug mode, but it's always grayed out when the current task is finished!
I have two servers, A for dev and B for production.
On server A I developed a project containing a SSIS package using SQL Server business intelligence development studio. The package runs fine from the BIDS and also when I save it to SQLServer itself and run it as a scheduled job using the SQLServer Agent.
All ready to roll out to Server B I thought, so I then saved the .dtsx file to a shared network drive.
On Server B, I created an empty project with the same name as it had on Server A. I then imported the .dtsx file into the project using Project > Add Existing Item.
The package appeared to import OK, but I now cannot execute any of the data flow tasks in isolation. If I right-click on them, there is no option to 'Execute Task' as there should be; it is not greyed out, it's just not there at all.
Also, if I attempt to debug the whole package I get a message saying 'This document is opened by another project'.
Can anyone help with this as my deployment to live isn't going very well to say the least!
Both server A + B are 32-bit 2005 std edition SP1 on W2003 Server std edition SP1.
In my ETL project, I need to extract raw data using WinZip, and the DOS command I use is c:\program files\Winzip\winzip32.exe -min -e c:\rawdata\test.zip c:\rawdata, where -min is for minimize and -e for extract, c:\rawdata\test.zip is the source file and c:\rawdata the destination. It works fine when run as a DOS command.
I configured the Execute Process Task with the following parameters:
RequireFullFileName: TRUE
Executable: c:\program files\Winzip\winzip32.exe
Arguments: -min -e
WorkingDirectory: c:\program files\Winzip
StandardInputVariable: User::gsRawFile
StandardOutputVariable: User::gsDestDir
where User::gsRawFile = c:\rawdata\test.zip and User::gsDestDir = c:\rawdata.
I added tasks to refresh two tables (delete from, insert into, update) to an SSIS project. I have them running from the Windows XP scheduler. The issue:
In dev, the tasks integrate and execute successfully from the scheduler.
In prod, I can right-click and execute each of the three tasks without any problem, but the same tasks cause my project to fail when executed from the scheduler.
Questions: Any ideas about what I am failing to see? How do I get meaningful log messages from the tasks that are failing?
Thanks for your ideas...
Installed Edition: IDE Standard
SQL Server Analysis Services Microsoft SQL Server Analysis Services Designer Version 9.00.1399.00
SQL Server Integration Services Microsoft SQL Server Integration Services Designer Version 9.00.1399.00
SQL Server Reporting Services Microsoft SQL Server Reporting Services Designers Version 9.00.1399.00
Microsoft Visual Studio 2005 Version 8.0.50727.42 (RTM.050727-4200) Microsoft .NET Framework Version 2.0.50727
Variable 1 is created as Int32 and variable 2 is created as DateTime.
When I execute the SQL task, I get the error:
[Execute SQL Task] Error: Executing the query "Exec mysp 'table1',OUTPUT,OUTPUT" failed with the following error: "Error converting data type nvarchar to int.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
What am I missing? I tried changing the data types and adding the input variable as a variable in the mapping. Nothing seems to work. Any ideas, please?
I'm looking for a way to refer to a package variable within any Transact-SQL code included in either an Execute SQL or Execute T-SQL task. If this can be done, I need to know the technique to use - whether it's something similar to a parameter placeholder question mark or something else.
FYI - I've been able to successfully execute Transact-SQL statements within the Execute SQL task, so I don't think the Execute T-SQL task is even necessary for this purpose.
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set the parameter in the master package's ExecutePackageTask to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
Can a package executed from code be aborted in code? The Package class has an Execute() method but no Abort() method. Clearly the debugger can stop a package at any point, so it must be possible somehow.
I built a small application that helps me create new packages by loading a template, modifying some parameters, and eventually saving the updated package to a user-defined location. This application uses the SSIS programming API.
I know all the packages created this way from the same template will have the same package ID. Is there a way to regenerate the package ID programmatically? I don't want to use dtutil.exe, but rather a nice package level method in the API.
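In the meantime, one workaround I am considering (not an official API call) is to rewrite the DTSID property in the saved .dtsx XML with a fresh GUID, something like the sketch below, which assumes the SQL Server 2005 .dtsx format:

using System;
using System.Xml;

class PackageIdFixer
{
    public static void RegeneratePackageId(string dtsxPath)
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(dtsxPath);

        XmlNamespaceManager ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts");

        // The package ID lives in the DTSID property under the root DTS:Executable element.
        XmlNode idNode = doc.SelectSingleNode(
            "/DTS:Executable/DTS:Property[@DTS:Name='DTSID']", ns);
        if (idNode != null)
            idNode.InnerText = "{" + Guid.NewGuid().ToString().ToUpper() + "}";

        doc.Save(dtsxPath);
    }
}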
Hi, I was wondering if it is possible to change the password on a SQL Server 2000 DTS package programmatically instead of just manually using the interface? I have over 150 DTS packages used for building BI data cubes, and our password policy requires a password change every 60 days. I would like to write a Windows VB.NET application that loops through all the DTS packages and changes the password. Please advise. Thanks, Chris
I am trying to write a program that creates packages on the fly depending on the variables you pass. For example, it should create connection managers on the fly specific to a certain file in the variable (e.g. sample.csv). The package has a data flow task with a flat file source and an OLE DB destination. The problem I am facing is that when the flat file source is assigned (dynamically) to a flat file connection manager, it does not give any source output columns; i.e., the value of DFSource.OutputCollection(0).OutputColumnCollection.Count is zero. But when I use the same code and reverse the source and destination (OLE DB as source and flat file as destination), it works fine. I searched everywhere for resources on how to develop packages programmatically, but could not find any except one example on MSDN. If anyone knows about this problem or any useful resources on this subject, it would be really helpful.
I intended to write a program that will create an SSIS package to import data from a CSV file into SQL Server 2005, but I did not find any good example of this on the internet. I found some examples which export data from SQL Server 2005 to CSV files, and following those examples I have tried to write my own, but I am facing some problems with it. What I am doing here is creating two connection manager objects, one for the flat file and another for OLE DB, and creating a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() on the flat file source data flow component, no output columns are found. Why is it not fetching the output columns from the CSV file? And after that, when ReinitializeMetaData() is invoked on the destination data flow component, it simply throws an exception. Can anybody help me get around this problem? Can anyone give me a link to a useful article on accomplishing this goal? I am giving my code here too, and I will appreciate any kind of suggestion on this.
Code snippet:
public void CreatePackage()
{
    string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
    Package pkg = new Package();
    pkg.PackageType = DTSPackageType.DTSDesigner90;

    ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
    ConnectionManager flatfileConnectionManager = CreateFileConnection(pkg);

    // creating the SQL Task for table creation
    Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
    ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
    execSqlTask.Connection = oledbConnectionManager.Name;
    execSqlTask.SqlStatementSource =
        "CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50)) GO";

    // creating the Data flow task
    Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
    TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
    MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;

    // Put a precedence constraint between the tasks.
    PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
    pcTasks.Value = DTSExecResult.Success;
    pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;

    // Now adding the source data flow component.
    IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
    sourceDataFlowComponent.Name = "Source Data from Flat file";
    // Here is the component class id for the flat file source.
    sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
    CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
    managedInstance.ProvideComponentProperties();
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);

    managedInstance.AcquireConnections(null);
    managedInstance.ReinitializeMetaData();
    managedInstance.ReleaseConnections();

    // Get the source's default output.
    IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0]; // Here I don't find any columns at all.. why??

    // Now adding the destination data flow component.
    IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
    destinationDataFlowComponent.Name = "Destination Oledb component";
    // Here is the component class id for the OLE DB destination.
    destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
    CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
    managedOleInstance.ProvideComponentProperties();
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);

    // Set the custom properties.
    managedOleInstance.SetComponentProperty("AccessMode", 2);
    managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");

    managedOleInstance.AcquireConnections(null);
    managedOleInstance.ReinitializeMetaData(); // Throws exception
    managedOleInstance.ReleaseConnections();

    // Create the path.
    IDTSPath90 path = dataFlowTask.PathCollection.New();
    path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0], destinationDataFlowComponent.InputCollection[0]);

    // Get the destination's default input and virtual input.
    IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
    IDTSVirtualInput90 vInput = input.GetVirtualInput();

    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        managedOleInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
    }

    DTSExecResult res = pkg.Execute();
}

public ConnectionManager CreateOLEDBConnection(Package p)
{
    ConnectionManager ConMgr = p.Connections.Add("OLEDB");
    ConMgr.ConnectionString = "Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
    ConMgr.Name = "SSIS Connection Manager for Oledb";
    ConMgr.Description = "OLE DB connection to the Test database.";
    return ConMgr;
}

public ConnectionManager CreateFileConnection(Package p)
{
    ConnectionManager connMgr = p.Connections.Add("FLATFILE");
    connMgr.ConnectionString = @"D:\MyCSVFile.csv";
    connMgr.Name = "SSIS Connection Manager for Files";
    connMgr.Description = "Flat File connection";
    connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
    connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
    return connMgr;
}
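One suggestion I have come across for this symptom is that a flat file connection manager created programmatically has no columns defined yet, so ReinitializeMetaData has nothing to expose on the source. The sketch below (column names are hypothetical, and I have not verified it) shows how the connection's column collection might be populated through the runtime wrapper before the source component is initialized:

using System;
using Microsoft.SqlServer.Dts.Runtime;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;

class FlatFileColumnHelper
{
    public static void AddFlatFileColumns(ConnectionManager flatFileConnMgr, string[] columnNames)
    {
        RuntimeWrapper.IDTSConnectionManagerFlatFile90 ff =
            flatFileConnMgr.InnerObject as RuntimeWrapper.IDTSConnectionManagerFlatFile90;

        for (int i = 0; i < columnNames.Length; i++)
        {
            RuntimeWrapper.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();
            col.ColumnType = "Delimited";
            // The last column's delimiter doubles as the row delimiter.
            col.ColumnDelimiter = (i == columnNames.Length - 1) ? Environment.NewLine : ",";
            col.DataType = RuntimeWrapper.DataType.DT_WSTR;
            col.ColumnWidth = 0;
            col.MaximumWidth = 255;
            ((RuntimeWrapper.IDTSName90)col).Name = columnNames[i];
        }
    }
}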
I am working on modifying a VB6 app that dynamically creates DTS packages to copy data from one database to another depending on the selections made in UI. The project currently uses DTSPackage object library and DTSDataDump Scripting object library. We are in the process of upgrading the server to SQL 2005. I am exploring the possibility of replacing code that generates DTS packages on the fly with SSIS packages.
Is it feasible to do this in VB6? I have referred to similar posts, which focus mainly on VB.NET or C#. Any help with a white paper or sample code would be appreciated.
I have a data flow task within a package. I want to be able to stop the whole package if I come across a certain condition based on data.
I've got it to the point where I can detect the condition, but all I'm able to do right now is then insert a row into a log table, and the rest of the package continues running. What I really want to do is stop the whole package in its tracks.
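If the detection is happening inside a Script Component, one approach I have seen suggested is to raise an error from the detection point so the component, and therefore the Data Flow Task, fails; with FailPackageOnFailure set to True on that task, the whole package stops. A rough sketch (SSIS 2005 scripts are authored in VB.NET; C# is shown here only for consistency with the other snippets):

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class StopHelper
{
    // Call from the script's row-processing logic when the stop condition is detected.
    public static void FailDataFlow(IDTSComponentMetaData90 componentMetaData)
    {
        bool cancel;
        componentMetaData.FireError(0, "Stop-condition check",
            "Stop condition met, failing the Data Flow Task", string.Empty, 0, out cancel);
    }
}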
I have developed a big SSIS package to extract data from flat files (200+ data flows).
The situation is the following: inside the SSIS package, there are a lot of validations before extracting and loading the flat files. I'm running these validations in parallel, so that when a file arrives, it enters the "validation process" and the file starts being extracted.
When I run the SSIS package from BIDS it works the way I designed it... but when I run it on the server, the tables that are loaded through the process are only "available" when the SSIS package ends. It is imperative that throughout the process, when a table receives new data, it becomes ready, and doesn't just become available when the SSIS package finishes.
I have attached an illustrative .jpeg.
It is important for the tables to be available, so that the stored procedures (outside the SSIS package) that depend on some of the tables can start working before the SSIS package ends.