Create A Data Flow In SSIS Which Updates Some Rows.
Aug 14, 2006
Hi,
I have a table Customer which has the columns phone_number (char type) and ok_to_call (bit type). There is already data in the table, and the ok_to_call column currently contains the value false for every row.
Now I want to update the latter column. I have a text file with a list of phone numbers, and for every row in the Customer table whose phone_number matches a number in the text file, I want ok_to_call set to true.
This is to be done in SSIS (Integration Services). I'm new at this and I've looked around the tool, but there are a lot of items, packages and other pieces, so I don't know where to begin.
I would appreciate help on how to solve this in SSIS: which control flow/data flow elements to use, how to configure them, and how to link them together?
Regards
/Tomas
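A minimal sketch of one common approach, assuming a staging table named dbo.PhoneNumbersToCall and a char(20) phone column (names and length are placeholders): a Data Flow Task with a Flat File Source and an OLE DB Destination loads the text file into the staging table, and an Execute SQL Task that runs after it performs the update.

-- Hypothetical staging table that the Data Flow Task loads the text file into.
CREATE TABLE dbo.PhoneNumbersToCall (
    phone_number char(20) NOT NULL
);

-- Execute SQL Task, run after the Data Flow Task:
-- flag every customer whose phone number appears in the file.
UPDATE c
SET    c.ok_to_call = 1
FROM   dbo.Customer AS c
JOIN   dbo.PhoneNumbersToCall AS p
       ON p.phone_number = c.phone_number;

An alternative that stays entirely inside the data flow is a Lookup transformation against Customer followed by an OLE DB Command issuing the UPDATE per row, but the staged set-based update is usually simpler and faster.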
View 3 Replies
Sep 20, 2007
Hi all,
I am looking for a way to leave a Data Flow Task destination table name as-is, and have SSIS auto-create the table if it doesn't exist already.
I searched the forums for this, but based on how the question is phrased it's difficult to know whether it has been answered or not.
Details:
I am writing some SSIS packages that need to be executable on another server. Many of the Data Flow Tasks copy data (such as from a Fuzzy Grouping transformation, and lots of other stuff) into a new table. But the other server will not have these tables set up for the first run.
My current solution is to check information_schema.tables and DROP the table IF EXISTS. But then the Data Flow Task will not work (because the table does not exist). So I script a CREATE TABLE statement to a new window based on the existing table in my dev environment. This is a hack and I want to find a better method.
It is quite possible (although unlikely) that the source columns could be changed in the future, or that some query used to pull the data might be modified. If this happens, I would need to change the CREATE TABLE Execute SQL Task. I want my package to accommodate that without my having to modify it.
When I use the Import/Export Wizard, I can select a table name from the drop down list OR type in a new name. When I type in the new name, it assumes I want to create the table. NOW, is there a way to mimic this in BI Developer Studio? Yep, I saved the Wizard version of the SSIS package and all it does is run a CREATE TABLE statement first.
Any ideas?
Brian Pulliam
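One pattern that avoids hand-maintained DDL scripts is a single Execute SQL Task placed before the Data Flow Task that creates the table only when it is missing; a sketch, where the table and column names below are placeholders standing in for whatever the data flow actually produces:

-- Create the destination only if it is not already there (hypothetical columns).
IF OBJECT_ID(N'dbo.FuzzyGroupingResults', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.FuzzyGroupingResults (
        _key_in      int            NULL,
        _key_out     int            NULL,
        _score       real           NULL,
        CustomerName nvarchar(255)  NULL
    );
END;

This keeps the destination table name in the OLE DB Destination as-is and makes the first run on a new server self-initializing. It will not pick up future column changes automatically, though - which is essentially the same limitation as the CREATE TABLE statement the Import/Export Wizard generates.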
View 12 Replies
View Related
May 17, 2007
Hi everyone,
Primary platform is 64 bit cluster.
How do I move information held in SSIS variables from the Data Flow layer back to the Control Flow layer?
We've got an SSIS package which loads a value into a variable inside a Data Flow. Back in the Control Flow, how can we retrieve that value again?
Thanks in advance and regards,
View 4 Replies
View Related
Oct 13, 2015
I am facing an issue where a Data Flow Task fails after loading 29,000 rows out of 200,000.
I am loading data from .csv file to OLE DB Destination.
This data flow task is placed inside For each loop container.
Is this issue caused by some performance setting in the SSIS package, such as the buffer size?
The error is below:
DFT Load Data from FlatFile:Error: The conditional operation failed.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.
The "DER Add Calc Columns" failed because error code 0xC0049063 occurred, and the error row disposition on "DER Add Calc Columns.Outputs[Derived Column Output].Columns[M_VALUE_NUM]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "DER Add Calc Columns" (48) failed with error code 0xC0209029 while processing input "Derived Column Input" (49). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
[code]....
View 8 Replies
View Related
Dec 19, 2007
I have a project that consists of a SQL db with an Access front end as the user interface. Here is the structure of the table on which this question is based:
Code Block
create table #IncomeAndExpenseData (
    recordID       nvarchar(5)    NOT NULL,
    itemID         int            NOT NULL,
    itemvalue      decimal(18, 2) NULL,   -- entered by the user via Access
    monthitemvalue decimal(18, 2) NULL    -- calculated $/mo value
)
The itemvalue field is where the user enters his/her numbers via Access. There is an IncomeAndExpenseCodes table as well which holds item information, including the itemID and entry unit of measure. Some itemIDs have an entry unit of measure of $/mo, while others are entered in terms of $/yr, others in %/yr.
For itemvalues of itemIDs with entry units of measure that are not $/mo, a stored procedure performs calculations which convert them into numbers with a unit of measure of $/mo and updates IncomeAndExpenseData, putting these numbers in the monthitemvalue field. This stored procedure is written to calculate values only for monthitemvalue fields which are null, in order to avoid recalculating every single row in the table.
If the user edits the itemvalue field there is a trigger on IncomeAndExpenseData which sets the monthitemvalue to null so the stored procedure recalculates the monthitemvalue for the changed rows. However, it appears this trigger is also setting monthitemvalue to null after the stored procedure updates the IncomeAndExpenseData table with the recalculated monthitemvalues, thus wiping out the answers.
How do I write a trigger that sets the monthitemvalue to null only when the user edits the itemvalue field, not when the stored procedure puts the recalculated monthitemvalue into the IncomeAndExpenseData table?
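A sketch of one way to write it, assuming the real (non-temp) table is dbo.IncomeAndExpenseData and is keyed on recordID and itemID (both assumptions): the UPDATE() function reports which columns appear in the SET list, so the trigger can reset monthitemvalue only when itemvalue was touched and monthitemvalue was not.

CREATE TRIGGER trg_IncomeAndExpenseData_ItemValueEdit
ON dbo.IncomeAndExpenseData
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- The Access edit updates itemvalue only; the stored procedure updates
    -- monthitemvalue only. Reset the calculated column just for the first case.
    IF UPDATE(itemvalue) AND NOT UPDATE(monthitemvalue)
    BEGIN
        UPDATE d
        SET    d.monthitemvalue = NULL
        FROM   dbo.IncomeAndExpenseData AS d
        JOIN   inserted AS i
               ON  i.recordID = d.recordID
               AND i.itemID   = d.itemID;
    END;
END;

Note that UPDATE() only says whether a column appears in the SET list, not whether its value actually changed, so this relies on the Access form updating itemvalue alone and the stored procedure updating monthitemvalue alone.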
View 4 Replies
View Related
Nov 2, 2006
I'm using the Data Flow Task to load data from a flat file into a SQL table, and I'm missing rows. There doesn't seem to be any consistent or obvious reason why.
When I use the Bulk Insert Task I import the correct number of rows from the flat file. But when I use the Data Flow Task with a Flat File Source connected to an OLE DB Destination, I get about 1/3 of the right number of rows. Comparing the two loaded tables side by side, I notice that the Data Flow Task method just skips rows sometimes.
Why does this happen??
View 3 Replies
View Related
Nov 21, 2007
Dear all,
I built a package that loads data from a flat file into a temp table, then executes a stored procedure to transform the data from the temp table into the real table. Since the real table has primary keys, the load fails when there are duplicate rows in the temp table (the temp table itself has no primary key). If I had to check the temp table for duplicates with a cursor over the data, the package would get slow. Is there a task in SSIS to identify duplicate data and eliminate it without using a cursor?
Thanks in advance,
Best regards,
Hery
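Inside the data flow itself, a Sort transformation with "Remove rows with duplicate sort values" checked (or an Aggregate grouping on the key columns) removes duplicates without any cursor. If the de-duplication should stay in the stored procedure instead, a set-based ROW_NUMBER() works too; a sketch, where the table and key column names are placeholders:

-- Keep one row per key from the staging table and insert it into the real table.
;WITH Ranked AS (
    SELECT  t.*,
            ROW_NUMBER() OVER (PARTITION BY t.KeyCol1, t.KeyCol2
                               ORDER BY     t.KeyCol1) AS rn
    FROM    dbo.TempStage AS t
)
INSERT INTO dbo.RealTable (KeyCol1, KeyCol2, SomeValue)
SELECT  KeyCol1, KeyCol2, SomeValue
FROM    Ranked
WHERE   rn = 1;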
View 3 Replies
View Related
Mar 7, 2008
Hello all,
I am creating an SSIS package that takes data from a SQL Server 2005 table, adds some columns, programmatically changes some values based on business requirements, and then writes the output to an Excel template which I've already prepared. Everything seems to work fine, but the package always errors out immediately after it hits the Excel Destination component, with the following errors:
Error: 0xC0202009 at Process Quarterly Data, Export to Excel [12621]: An OLE DB error has occurred. Error code: 0x80040E21.
Error: 0xC0202025 at Process Quarterly Data, Export to Excel [12621]: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
Error: 0xC004701A at Process Quarterly Data, DTS.Pipeline: component "Export to Excel" (12621) failed the pre-execute phase and returned error code 0xC0202025.
I have verified that the Excel destination is writing the correct headers to the specified file. I thought the issue might be that one of the dynamically created columns isn't matching the Excel file, so I manually checked each and every one of them (there are 248 columns, although a lot of them aren't really used - however, we are required to use the template provided, so we must include the fields in the specified order whether or not they have any data) and made sure that I selected the exact same data type for each column. However, I still get the error and no rows are written to the Excel file.
Here is the code generated by the Excel Destination Manager:
Generated Code - Excel Destination (SSIS)
CREATE TABLE `xxx_LoadData` (
`PART NUMBER` NVARCHAR(255),
`PART NAME` NVARCHAR(255),
`PRICE TBD` INTEGER,
`PRICE` MONEY,
`UOI` NVARCHAR(4),
`Items per UOI` INTEGER,
`NSN` NVARCHAR(255),
`OEM NAME` NVARCHAR(255),
`OEM PN` NVARCHAR(200),
`UPC` NVARCHAR(255),
`DESCRIPTION` NVARCHAR(255),
`EXPANDED DESCRIPTION` ntext,
`CLASSIFICATION CODE` NVARCHAR(255),
`DAYS ARO` INTEGER,
`IMAGE DESCRIPTION` NVARCHAR(255),
`IMAGE URL IF SELF HOSTED OR IMAGE NAME MANTECH HOSTED` NVARCHAR(255),
`SHIPPING WEIGHT` REAL,
`SHIPPING WEIGHT UNIT OF MEASURE` NVARCHAR(4),
`SHIPPING LENGTH` REAL,
`SHIPPING WIDTH` REAL,
`SHIPPING HEIGHT` REAL,
`SHIPPING UNIT OF MEASURE` NVARCHAR(4),
`PRODUCT WEIGHT` REAL,
`PRODUCT WEIGHT UNIT OF MEASURE` NVARCHAR(4),
`PRODUCT LENGTH` REAL,
`PRODUCT WIDTH` REAL,
`PRODUCT HEIGHT` REAL,
`PRODUCT UNIT OF MEASURE` NVARCHAR(4),
`FEDERAL SUPPLY CODE` NVARCHAR(255),
`ENAC CODE` NVARCHAR(255),
`PACKAGE UNIT OF ISSUE` NVARCHAR(4),
`PACKAGE UNITIP OF ISSUE` NVARCHAR(4),
`PACKAGE PRICE` MONEY,
`CERTIFIED NSN` NVARCHAR(255),
`COG CODE` NVARCHAR(255),
`HAZMAT` NVARCHAR(255),
`UNSPSC` NVARCHAR(255),
`SALE_START_DATE` DATETIME,
`SALE_END_DATE` DATETIME,
`PB1 Quantity` INTEGER,
`PB1 Zone 1 Price` MONEY,
`PB1 Zone 1 Sale Price` money,
`PB1 Zone 2 Price` money,
`PB1 Zone 2 Sale Price` money,
`PB1 Zone 3 Price` money,
`PB1 Zone 3 Sale Price` money,
`PB1 Zone 4 Price` money,
`PB1 Zone 4 Sale Price` money,
`PB1 Zone 5 Price` money,
`PB1 Zone 5 Sale Price` money,
`PB1 Zone 6 Price` money,
`PB1 Zone 6 Sale Price` money,
`PB1 Zone 7 Price` money,
`PB1 Zone 7 Sale Price` money,
`PB1 Zone 8 Price` money,
`PB1 Zone 8 Sale Price` money,
`PB1 Zone 9 Price` money,
`PB1 Zone 9 Sale Price` money,
`PB1 Zone 10 Price` money,
`PB1 Zone 10 Sale Price` money,
/* Repeated through PB10 Zone 10 - code not shown for brevity */
)
Does anyone have any suggestions as to what I'm doing wrong? I'm making an attempt to set up a proper process for the company instead of just throwing something together; while that would be much quicker (I've spent pretty much the whole day working on this), the lack of processes is a big detriment to our current operations.
Any help would be greatly appreciated - I'm not that familiar with SSIS or its nuances just yet.
View 2 Replies
View Related
Jan 14, 2008
All,
I'm having an issue with the Flat File Data Flow Source returning only a limited set of the rows that are in the flat file. Basically, I connect to the flat file fine, it goes to retrieve the data (tab-delimited file) and only returns 190 of 392 rows. Is there a limitation on the number of rows this data flow source can retrieve, or something? I've looked all through the settings and properties of the task as well as the connection manager, and nothing is obvious as to what is causing this. Hopefully someone out there has run into this before and can help me retrieve all rows. Thanks in advance!
bakerz
View 4 Replies
View Related
Apr 3, 2014
I need to see a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding a data source/data destination in the Choose Toolbox Items pane.
View 4 Replies
View Related
Jan 29, 2008
I need to call a stored procedure to insert data into a table in SQL Server from an SSIS data flow task.
I am currently trying to use an OLE DB Destination, but I am not sure how to map the inputs of the OLE DB Destination to my stored procedure's insert.
Thanks
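An OLE DB Destination can only write straight to a table or view, so it cannot call a procedure; the usual transformation for that is the OLE DB Command. A sketch, where the procedure, parameter, and column names are all placeholders:

-- Hypothetical insert procedure the data flow will call once per row.
CREATE PROCEDURE dbo.usp_InsertTargetRow
    @ID    int,
    @Name  nvarchar(100),
    @Phone char(20)
AS
    INSERT INTO dbo.TargetTable (ID, Name, Phone)
    VALUES (@ID, @Name, @Phone);
GO

-- SqlCommand text of the OLE DB Command transformation. Each ? shows up as
-- Param_0, Param_1, ... on the Column Mappings tab, where it is mapped to an input column.
EXEC dbo.usp_InsertTargetRow ?, ?, ?;

Keep in mind the OLE DB Command executes once per row, so for large volumes a plain OLE DB Destination with fast load (with any extra logic handled elsewhere) is usually much faster.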
View 6 Replies
View Related
Jul 2, 2007
There is a table with a column that contains Xml documents. For each record from my Data Flow Source, I want to pass in the Xml document and the node to interrogate, and return the value contained in the node. Like the Crm component, this is probably one I will have to write from scratch in C#, but I would like to avoid having to create the custom component if it already exists in the public arena.
Does anyone know of any Xml Ssis Data Flow Components that are downloadable for free?
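If the column is (or can be cast to) the xml data type, one way to avoid a custom component entirely is to extract the node in the source query itself with the value() method; a sketch, where the table, column, and node path are all placeholders:

-- Pull one node's value per record straight from an xml column.
SELECT  s.RecordId,
        s.XmlDoc.value('(/Order/CustomerId)[1]', 'nvarchar(50)') AS CustomerId
FROM    dbo.SourceTable AS s;

-- If the documents are stored as nvarchar rather than xml, cast first.
SELECT  s.RecordId,
        CAST(s.XmlText AS xml).value('(/Order/CustomerId)[1]', 'nvarchar(50)') AS CustomerId
FROM    dbo.SourceTable AS s;

If the node to interrogate genuinely varies per row, a Script Component (or the custom component route) is still the fallback; for a fixed path, the query above keeps the data flow to stock components.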
View 3 Replies
View Related
Sep 7, 2007
I was working all day making changes to my 3MB package. I was adding a large number of transforms that were copied-and-pasted from elsewhere in the same data flow task.
All was going well. I even took the time to have SSIS lay out the task again (1/2 hour). Suddenly I started receiving some strange errors:
After the layout, I noticed two stray components way off in the upper right corner. I found that one of them had the same name as a component which had been added hours ago. Even after deleting it, I got "duplicate name" errors.
I copied three components in one selection, and when I tried to paste them, I got the error "can't initialize component on paste". I tried them one at a time, but got the same error.
I got errors about COM failures due to marshalling to another thread.
I then exited Visual Studio and started it again. To my great surprise, the data flow task I was working on was still there, but was completely empty.
Comparing what I'm left with to my last version in source control, I find that the entire pipeline element is missing from the DTS:ObjectData element!
I'm developing a real love/hate relationship with SSIS. It varies from one day to the next. Guess what kind of day this is!
View 3 Replies
View Related
Jun 1, 2006
Hi,
I am using SSIS in SQL Server 2005 and want to have a query like this in my data flow task
Select a.*
from abc as a
inner join (Select max(b.id) as ID from xyz as b inner join pqr as c on b.id = c.id and b.id > ?) as t1
on t1.id = a.id
SSIS fails to detect the parameter (?) in the inner query and gives this message:
"Parameters cannot be extracted from the SQL command. The provider might not help to parse parameter information from the command. In that case, use the "SQL command from variable" access mode, in which the entire SQL command is stored in a variable."
So, assuming this is the problem, there should be a way to work around it.
The idea is to parameterize the inner query...
(so if the above query doesn't make sense, ignore it)
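Besides the "SQL command from variable" route the message suggests, one workaround that keeps a normal parameter mapping is to wrap the statement in a stored procedure, so the provider never has to parse the nested ?; a sketch (the procedure name is made up, the body is the posted query):

CREATE PROCEDURE dbo.usp_GetLatestAbc
    @MinId int
AS
    SELECT a.*
    FROM   abc AS a
    INNER JOIN (SELECT MAX(b.id) AS ID
                FROM   xyz AS b
                INNER JOIN pqr AS c
                        ON  b.id = c.id
                        AND b.id > @MinId) AS t1
           ON t1.ID = a.id;
GO

-- OLE DB Source, data access mode "SQL command":
--   EXEC dbo.usp_GetLatestAbc ?
-- The single ? now sits at the top level and maps cleanly to the package variable.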
View 1 Replies
View Related
Sep 28, 2006
Hi, there,
I am having some problems loading a tab-delimited text file (source) into a SQL Server table (destination) using an SSIS data flow task. The package executes successfully with no error message, and the number of rows in the text file matches the number of rows in the SQL table. But when I check the content of the table, I notice that some columns contain NULL where they are supposed to have a value. This happens only to some rows, not all. I did some testing by removing rows from the beginning, middle and end of the text file and re-running the package, but the results are inconsistent: sometimes the field gets filled, and sometimes it just contains NULL where it is supposed to have a value.
What will be the possible causes to this?
Thanks in advance.
Regards,
Yong Hwee
View 2 Replies
View Related
May 26, 2008
Hi,
I am experiencing an issue where an SSIS data flow task freezes and stops exporting data from an OLE DB source to a text file. It doesn't generate any errors; the SSIS package just hangs. This only happens when I run it in 64-bit mode. When I change the mode to 32-bit, SSIS never freezes and runs fine. Has anyone experienced this? Is there a fix so I can run my jobs in 64-bit mode?
Thanks,
View 5 Replies
View Related
Sep 19, 2007
Hi ,
Is there any option to use OLE DB Source in Script Component?
Thanks
Aravind.
View 5 Replies
View Related
Feb 29, 2008
Hi!
When I add a Data Flow Task to a package, I save it and then close it. When I open it again, I get 3 errors and the detail of the Data Flow Task is empty.
I have tried uninstalling and reinstalling my SQL Server 2005, but the problem continues.
Can some one help me on this?
Sérgio Cardoso
View 4 Replies
View Related
Sep 29, 2007
Hi,
I have an SSIS package which I would like to modify using the SSIS API. I need to put a new component between two existing data flow components. During this process I need to disconnect the two data flow components using the SSIS API. How can I do that?
Thanks,
Rafal
View 1 Replies
View Related
May 2, 2007
I am loading a lot of Excel and CSV files into SQL Server. Some loads may fail for various reasons. I want each file to be loaded either as a whole or not at all. Currently I keep a list of failed file names and remove their rows at the end (I add a column for the source file name).
Is there a better way to make sure a file is loaded as a whole or not at all?
Thanks,
View 3 Replies
View Related
Feb 16, 2007
I have a table that I'm loading as part of a control flow, which is in turn copied to a target table by using a data flow task. I am doing this because a different set of fields may be used from the source entry to create the target entry, based on a field in the source table. That same field may indicate that multiple entries need to be created in the target table from one source table entry, for which I use a Multicast transformation.
The problem I'm having is that no matter how many entries there are in the source table, the OLE DB Source during execution only shows 7,532 entries being taken from the source table. If there are less than 7,532 entries in the source table, everything processes fine. More than 7,532 and the data flow task only takes 7,532 and then seems to hang. It also seems as though only one path of the multi-cast transformation is taken when the conditional split directs a source entry down that path.
Seems like a strange problem I know, but any insight anyone could provide is appreciated. Thanks.
View 1 Replies
View Related
Apr 6, 2006
Environment:
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
Problem & Info:
I am bringing in an Excel file. I need to first strip out any non-detail rows, such as the breaks you see with totals and whatnot, so that in the end only detail rows are left before I start moving them into my SQL table. I'm not sure how to strip this information out in SSIS - specifically, how to narrow down to the right component and how to actually configure it to do this, based on my Excel file here: http://www.webfound.net/excelfile.xls
Then, I assume I just use a Flat File Source component or something to take the columns in the Excel file and feed each column into a corresponding column in my SQL Server table via an OLE DB destination. I have used a Flat File Source in the past to do this with a comma-delimited txt file, but never tried it with an Excel file.
Desired Help:
How to perform
1) stripping out all undesired rows
2) importing each column into sql table
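For item 1, the usual tool inside the data flow is a Conditional Split placed right after the Excel Source, routing away rows whose key column is empty. If it is easier to reason about in SQL, the same filter can run after landing everything in a staging table first; a sketch, assuming the staging table is dbo.ExcelStage and that a blank part number marks the total/break rows (both assumptions):

-- Keep only the detail rows; totals and break rows leave the part number empty.
INSERT INTO dbo.OrderDetail (PartNumber, Description, Quantity, Price)
SELECT  PartNumber,
        Description,
        Quantity,
        Price
FROM    dbo.ExcelStage
WHERE   PartNumber IS NOT NULL
  AND   LTRIM(RTRIM(PartNumber)) <> '';

Item 2 is then just the ordinary column mapping from an Excel Source (not a Flat File Source) to an OLE DB Destination.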
View 1 Replies
View Related
Oct 29, 2014
I would like to know how I can add the following sample code to my source data in a Data Flow in SSIS, or what other options there are. The main issue is time, as we are talking about hundreds of millions of rows.
select Sample,
CASE
WHEN Sample IS NULL
THEN NULL
WHEN SUBSTRING(Sample, 1, 6) IS NULL
THEN ' '
ELSE RTRIM(SUBSTRING(Sample, 1, 6))
END AS [Sample_1_6]
from TestTable
What I have done at this stage is just to create a SQL Task with an INSERT INTO:
INSERT INTO [dbo].[TestTable1]
([Sample]
,[Sample_1_6])
select Sample,
CASE WHEN Sample IS NULL THEN NULL
WHEN SUBSTRING(Sample, 1, 6) IS NULL THEN ' '
ELSE RTRIM(SUBSTRING(Sample, 1, 6))
END AS [Sample_1_6]
from TestTable
If there is a way of adding this to a data flow so I can use fast load, that would really be the best solution. I know there are derived columns, but would this really be faster than the straight INSERT INTO in a SQL Task? If this is the way to go, what is the code I would use in the derived column, or what other option is there?
View 7 Replies
View Related
Apr 18, 2006
I have a relatively simple SSIS package that I'm building for a data mining process. The package starts with an OLE DB data source, passes the results of a SQL Command (query) along to a conversion step, which then gets sent to a Term Lookup task. The Term Lookup then writes the result to an OLE DB Data Destination. Pretty simple. The OLE DB data source query returns about 80,000 rows if you run it through SQL WB. The SSIS editor shows 9,557 rows make it out of the source, and into the conversion step, 9,557 make it out of the conversion and into the lookup, and about 60,000 rows make it out of the lookup and are written to the results table. Then the package fails with the following errors listed on the progress screen. I was assuming that the 9,557 was some type of batching that was occurring in the process, but now I'm not so sure.
Thoughts?
Frank
[DTS.Pipeline] Error: The ProcessInput method on component "My Component" (117) failed with error code 0xC02090E5. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC02090E5.
[DTS.Pipeline] Error: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
[DTS.Pipeline] Error: Thread "WorkThread1" has exited with error code 0xC0047039.
[My Data Source] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: The PrimeOutput method on component "My Component" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.
View 2 Replies
View Related
Feb 12, 2007
Does anyone have a helpful link for using the partition processing data flow task in SSIS? I am trying to process a monthly partition from within my package and am getting the following error:
Error: 0xC113000A Errors in the high-level relational engine. Pipeline processing can only reference a single table in the data source view.
If anyone has used this before and could point me in the right direction, I would appreciate it.
Thanks,
Nick
View 3 Replies
View Related
Apr 13, 2007
I have a package that loads staging tables from an Oracle source DB. In the data flow tab I have 30+ read table/write table task combinations. When I run the package, 3-4 of the read/write combos execute at a time. What I'm trying to control is the priority order of the combo execution. My goal is to minimize the total load time by having the larger table transfers run first and the smaller table transfers fill in until they are all complete. Currently, the largest table (16 million) transfers last (because it was the last combo that I created?).
Thanks,
Dave
View 1 Replies
View Related
Nov 28, 2005
I am creating a staging database in which I am loading required tables from 2 different sources.
I have 30 different tables to load from source 1 and 10 different tables from source 2.
This is the way I am doing it: in the control flow I am using a Sequence Container, and inside it I include the Data Flow Tasks. Each Data Flow Task has an OLE DB source connection from which I select the table, and then an OLE DB destination connection where I load the data.
So for the 30 tables I have one Sequence Container with 30 different Data Flow Tasks, and each Data Flow Task has an OLE DB source and an OLE DB destination.
I wanted to find out whether this is an efficient way to do it, or whether there is a better way.
And for source 2, should I put it in another package, or should I use the same package with a different Sequence Container and follow the same steps as for the source 1 tables?
Please advice.
Thanks,
View 19 Replies
View Related
Aug 1, 2006
Has anyone come up with or determined a generic way to capture and log indicative information within a data flow in SSIS - e.g., the number of rows selected from the source, transformed, rejected, loaded, various timestamps around these events, etc.? I am trying to avoid having to build a custom solution for each of the packages that I will have (of which there will be dozens). Ideally, I'd like to have some sort of generic component (such as a custom transformation) that will hide the implementation details and provide a generic interface to the package.
It is not too difficult to achieve something similar on the control flow level, but once you get into data flows things get complicated.
Any ideas will be greatly appreciated.
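One generic pattern is a shared audit table plus Row Count transformations: each data flow gets a Row Count on its source, rejected, and loaded paths to populate package variables, and a single Execute SQL Task after the data flow writes one audit row. A sketch of the table and the insert, with all names hypothetical; a custom logging transformation would essentially just wrap this same pattern:

CREATE TABLE dbo.PackageAudit (
    AuditID      int IDENTITY(1,1) PRIMARY KEY,
    PackageName  nvarchar(200) NOT NULL,
    TaskName     nvarchar(200) NOT NULL,
    RowsSelected int           NULL,
    RowsRejected int           NULL,
    RowsLoaded   int           NULL,
    StartedAt    datetime      NOT NULL,
    FinishedAt   datetime      NULL
);

-- Execute SQL Task after the data flow. The ? parameters are mapped (in order) to
-- System::PackageName, System::TaskName, the three Row Count variables, and System::StartTime.
INSERT INTO dbo.PackageAudit
    (PackageName, TaskName, RowsSelected, RowsRejected, RowsLoaded, StartedAt, FinishedAt)
VALUES (?, ?, ?, ?, ?, ?, GETDATE());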
View 5 Replies
View Related
Jul 26, 2006
Hi all...
I'm creating a SSIS in the designer view of SQL Server BI Dev. Studio (SQL Server 2005)
I need to import a whole table from MS Access into my local SQL Server. (This task will be performed weekly, so once it's working I'll schedule a job for it.)
I've created a 'FILE' connection to MS Access in the 'Connection Managers'.
When I'm on the 'Data Flow' tab I can't find a Data Flow Item to use as a MS Access connection.
(available on the 'Data Flow Sources' are only: DataReader, Excel, Flat File, OLE DB, Raw File and XML Sources)
What am I doing wrong/missing?
Thanks for your help.
View 4 Replies
View Related
May 17, 2007
I have just installed Service Pack 2 on my SQL 2005 Standard Edition.
However, now all my SSIS packages will not allow me to open my Data Flow Tasks. I get the following error:
TITLE: Microsoft Visual Studio
------------------------------
Cannot show the editor for this task.
------------------------------
ADDITIONAL INFORMATION:
The task returned an unsupported control editor type. (Microsoft.DataTransformationServices.Design)
If I try to create a new Data Flow task I get:
TITLE: Microsoft Visual Studio
------------------------------
Failed to create the task.
------------------------------
ADDITIONAL INFORMATION:
The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
I have tried to install the latest hotfixes after this but they had no effect.
Can anybody help me???? Please?
View 10 Replies
View Related
Jul 14, 2006
Hello.
I am using the "SSIS Log Provider for SQL Server" to log events to a table for "OnError" and "OnPostExecute" events of a package. This works as expected and provides a nice clean output on the execution steps of the package.
I am curious as to why I do not see any detail for the tasks that fall under the "Data Flow" section of the package, though. For instance, on my "Control Flow" tab I added a "Data Flow" task that simply loads a few tables from a source to a destination server. However, there is nothing shown in the logging output beyond the fact that a Data Flow task was initiated. And when I'm configuring this logging under "SSIS-->Logging", in the checkbox area on the left you cannot "drill into" data flow steps.
Is there a reason why there is no detailed logging for Data Flow tasks? Would getting to that require me to create a custom log provider?
Thanks for the help.
Greg
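The Data Flow Task does expose its own pipeline events (OnPipelineRowsSent, PipelineComponentTime, and so on), but in the SSIS > Logging dialog they only appear when the Data Flow Task node itself is selected in the tree on the left and those events are ticked on the Details tab; they are not listed at the package level, which is why the drill-down seems to be missing. Once enabled, the SQL Server log provider writes them to the same log table, so no custom log provider should be needed. A quick query against the SQL Server 2005 log table as a sketch:

-- Pipeline-level detail written by the SSIS Log Provider for SQL Server.
SELECT  source, event, starttime, endtime, message
FROM    dbo.sysdtslog90
WHERE   event IN ('OnPipelineRowsSent', 'PipelineComponentTime')
ORDER BY starttime;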
View 1 Replies
View Related
Jul 25, 2007
What are the advantages and disadvantages of having multiple data flow tasks in one SSIS package?
Is this a good idea at all considering the workflow may be similar now but may change in the future? Should it be left as one data flow per package?
View 1 Replies
View Related
Nov 14, 2007
Does anyone know how to hook up to data flow pipeline events via custom solution (C#)? I am trying to write code to log start and end times of components(lookup,merge joins etc) in a data flow task. I tried with a class inheriting from the EventsProvider class but it didn't work as this is only for container tasks. Any ideas will be greatly appreciated.
View 3 Replies
View Related