SSIS & Script Transformation (Debug vs. Normal)
Mar 10, 2008
I just ran across an interesting problem that makes no sense. I built an SSIS package that updates a column using a script transformation. When I test in debug mode everything runs perfectly, but when SQL Server Agent runs the package it inserts NULL into the new column.
Any thoughts or suggestions would be greatly appreciated.
I have followed the instructions for SSIS Lesson 1 exactly, but I get these four errors when I come to debug at the "Lookup Date Key" lookup transformation, the last step in the lesson:
1. [Lookup Date Key [66]] Error: Row yielded no match during lookup.
2. [Lookup Date Key [66]] Error: The "component "Lookup Date Key" (66)" failed because error code 0xC020901E occurred, and the error row disposition on "output "Lookup Output" (68)" specifies failure on error. An error occurred on the specified object of the specified component.
3. [DTS.Pipeline] Error: The ProcessInput method on component "Lookup Date Key" (66) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
4. [DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029.
I have tried this on 2 hardware setups and recreated the package several times on both. The only thing I can think of is that the collation on both servers is SQL_Latin1_CP1_CI_AS with British English as the language (I have huge legacy databases from SQL 7.0 and cannot get my tech support to change the server's language settings to UK English). Is it possible that this is causing the lookup failure mentioned above (Q1)? If this is the issue, how can I change the collation/language settings within the package so that the text file matches the AdventureWorksDW database settings (Q2)?
Are the error codes listed anywhere, and if not, can they be added to BOL (Q3)? I have read other threads suggesting that 0xC0209029 means the lookup failed due to differing lengths. Can dates have differing lengths (Q4)?
I am executing a single package that references 180 other packages. After executing the first 90-100 packages, SSIS Designer completely freezes and I have to kill the session using Task Manager. Is this a limitation of SSIS or a system constraint?
If you have any suggestions or workarounds for this, please reply.
SSIS 2008, when I develop and debug in BIDS, sometimes ignores debug breakpoints.
The script component is in the main control flow, and at some point the breakpoint did work.
If, for example, I create a new project and copy my script component there, the debug breakpoint will work. So it's absolutely *random* when it works and when it does not.
Below are my BIDS details:
Microsoft Visual Studio 2008 Version 9.0.30729.4462 QFE
Microsoft .NET Framework Version 3.5 SP1
Installed Edition: IDE Standard
Enterprise Library v5 Configuration Editor 4.0
When running a package in VS you can see something like this in the output window:
SSIS package "logging.dtsx" starting.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Flat File Source [1]: The processing of file "C:\test\ssis logging\bad_data1.txt" has started.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Warning: 0x8020200F at Data Flow Task, Flat File Source [1]: There is a partial row at the end of the file.
Information: 0x402090DE at Data Flow Task, Flat File Source [1]: The total number of data rows processed for file "C:\test\ssis logging\bad_data1.txt" is 477.
Information: 0x402090DF at Data Flow Task, OLE DB Destination [1011]: The final commit for the data insertion has started.
Information: 0x402090E0 at Data Flow Task, OLE DB Destination [1011]: The final commit for the data insertion has ended.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Flat File Source [1]: The processing of file "C:\test\ssis logging\bad_data1.txt" has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "DataReaderDest" (87)" wrote 0 rows.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "OLE DB Destination" (1011)" wrote 1 rows.
SSIS package "logging.dtsx" finished: Success.
This is exactly what I need to see while a package is running, but I want to be able to see it without using Visual Studio. I would do it in Reporting Services, but I need to find out how to get the information. The SSIS logging feature in a package does not provide that kind of info.
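One way to get at those messages outside Visual Studio (a sketch, not the only route): run the package from your own code and subscribe to the same events the designer prints. A minimal VB.NET console sketch using the Dts.Runtime API; the package path is hypothetical and error handling is omitted:

    Imports Microsoft.SqlServer.Dts.Runtime

    ' Receives pipeline events while the package runs and echoes the
    ' informational messages that BIDS shows in its output window.
    Class ConsoleEvents
        Inherits DefaultEvents

        Public Overrides Sub OnInformation(ByVal source As DtsObject, ByVal informationCode As Integer, ByVal subComponent As String, ByVal description As String, ByVal helpFile As String, ByVal helpContext As Integer, ByVal idofInterfaceWithError As String, ByRef fireAgain As Boolean)
            Console.WriteLine("Information: {0}: {1}", subComponent, description)
        End Sub
    End Class

    Module RunPackage
        Sub Main()
            Dim app As New Application()
            ' Hypothetical path; point this at the package you want to watch.
            Dim pkg As Package = app.LoadPackage("C:\packages\logging.dtsx", Nothing)
            Dim result As DTSExecResult = pkg.Execute(Nothing, Nothing, New ConsoleEvents(), Nothing, Nothing)
            Console.WriteLine("Result: {0}", result)
        End Sub
    End Module

The same event-sink idea could feed a table that Reporting Services reads, though that part is left out here.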
This would actually be funny if I weren't under serious time constraints right now. I have an SSIS project with several script tasks in the control flow. I put breakpoints in one of them to allow me to step through and see what's going on. However, when I start the project, the IDE brings up the script editor for a different script task. The breakpoints in that editor are on the same line numbers as in the task I need to debug, but the code is all wrong. I checked the task whose code is being displayed, and there are no breakpoints set there.
I have begun using SSIS, and I am a little taken aback by its complexity, especially since I just want to do a simple data transformation as in DTS. Are there any tutorials for data transformation in SSIS on the web or on this forum? And what if I want to do a simple transformation from Access to SQL Server?
Hey all - got a problem that seems like it would be simple (and probably is : )
I'm importing a CSV file into a SQL 2005 table and would like to add 2 columns that exist in the table but not in the CSV file. I need these 2 columns to contain the current month and year (the columns are named CM and CY, respectively). How do I go about adding this data to each row during the transformation? A Derived Column task? A Script task? None of these seem to be able to do this for me.
Here's a portion of the transformation script I was using to accomplish this when we were using SQL 2000 DTS jobs:
' Copy each source column to the destination column
Function Main()
    DTSDestination("CM") = Month(Now)
    DTSDestination("CY") = Year(Now)
    DTSDestination("Comments") = DTSSource("Col031")
    DTSDestination("Manufacturer") = DTSSource("Col030")
    DTSDestination("Model") = DTSSource("Col029")
    DTSDestination("Last Check-in Date") = DTSSource("Col028")
    Main = DTSTransformStat_OK
End Function

Hopefully this question isn't answered somewhere else, but I did a quick search and came up with nothing. I've actually tried to utilize the script component and the "Row" object, but the only properties I'm given with that are the ones from the source data.
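One likely explanation, stated as an assumption rather than a confirmed diagnosis: the Row object only exposes columns defined on the Script Component's Inputs and Outputs page, so CM and CY have to be added there as output columns before they appear. A minimal VB.NET sketch, assuming CM and CY were added as four-byte signed integer (DT_I4) output columns; a Derived Column with the expressions MONTH(GETDATE()) and YEAR(GETDATE()) would be an even simpler route:

    Public Class ScriptMain
        Inherits UserComponent

        ' Runs once per input row. CM and CY only show up on Row after
        ' being added as output columns on the Inputs and Outputs page.
        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            Row.CM = DateTime.Now.Month   ' current month, e.g. 3
            Row.CY = DateTime.Now.Year    ' current year, e.g. 2008
        End Sub
    End Class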
My issue is with the inner join transformation in SSIS. Let me explain my problem clearly.
I am checking whether the inner join performed in Business Intelligence Development Studio using the join transformation and an inner join performed in Management Studio using queries produce the same result. Logically both result sets should match, but in my case they do not. It is very important for me to figure out where the problem is, because I am going to use many inner join transformations in my current project.
I would appreciate it if someone could help me figure out this problem. Maybe you can also tell me the detailed steps for adding the inner join transformation and how it works.
I have a flat file source to which different flat files will be passed as input; this is connected to an OLE DB destination that changes along with the source file. But when a new file is given as input, the OLE DB mappings are not refreshed, and it shows an error.
This was implemented in DTS using an ActiveX script for the transformation. What should I use in SSIS?
Hi, I have migrated a DTS package that had some ActiveX transformation tasks within Data Pump tasks.
Those parts were migrated as "DTS 2000 tasks"... so ActiveX transformation tasks aren't possible in SSIS? I know ActiveX Script tasks are, but what about transformations?
1. If I leave these encapsulated DTS 2000 tasks in the migrated SSIS package, will it run independently of the original DTS package, or does it need the old DTS package around to "call" that part from? (I hope I'm making sense here.) Is it possible to load this functionality internally into the new SSIS package?
2. If I can't do ActiveX transformation tasks, how could I achieve this in SSIS? Can I achieve it using the script tasks in SSIS?
Can someone help me by providing the steps to solve this problem? My scenario: I have a table with 2 fields, and 5 default rows of values have been filled in. Now, during package runtime, it needs to dynamically create an additional field and store values like, e.g., (0001 America). I'm getting the following errors while executing the SSIS package.
1. [DTS.Pipeline] Warning: Component "Derived Column" (1170) has been removed from the Data Flow task because its output is not used and its inputs have no side effects. If the component is required, then the HasSideEffects property on at least one of its inputs should be set to true, or its output should be connected to something.
2. [DTS.Pipeline] Warning: Source "OLE DB Source Output" (87) will not be read because none of its data ever becomes visible outside the Data Flow Task.
Please suggest a solution at your earliest convenience.
Is there a way to rename the parameters Param_0, Param_1 in the OLE DB Command transformation? I am trying to create table-driven packages using BIML, and I am using the OLE DB Command transformation to update rows. But since I cannot be sure how many parameters there will be, or of their order, I was planning to rename the parameters programmatically so that I can build the update statement and add the filter condition accordingly.
First let me say, I really can't believe this chain of events myself, and yet it is happening to me.
I am upgrading several DTS packages to SSIS on what will be my new production server. These packages create tables, export them to a flat file, and FTP them off to other locations.
What is happening (on the SSIS side) is that the OLE DB Source is reordering some of the columns on its own, moving them to the end of the table/file. Then, when my pickup/load routines run, the data is out of place and they fail.
Can anyone explain what is happening here with the mapping? I have evaluated the table, and the columns are in the order that I expect. When I preview the source table in the OLE DB Source Editor the columns are in the correct order/alignment, but when I view them in the Columns section of the OLE DB Source Editor within BIDS the order is changed arbitrarily.
I have been somewhat successful (2 out of 3) in being able to re-map the data, but this last table just doesn't want to change.
Thanks in advance for any help and/or information you can provide.
I've made an SSIS package that takes source columns from a plain text file and copies them to a SQL table. A long time ago I did this process with DTS, which used a Data Pump task whose ActiveX Script transformation held the VBScript logic. How do I do the same with SSIS?
I've got a couple of components, a Flat File Source and an OLE DB Destination, but by themselves they are useless for that goal.
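For what it's worth, the usual SSIS stand-in for a DTS ActiveX transformation is a Script Component configured as a transformation, dropped between the Flat File Source and the OLE DB Destination. A minimal VB.NET sketch, assuming a source column RawName and an output column CleanName (both hypothetical names) defined on the component's Inputs and Outputs page:

    Public Class ScriptMain
        Inherits UserComponent

        ' Plays the role of the old DTSDestination(...) = DTSSource(...)
        ' logic: each row passes through once and the output column is
        ' derived from the input column.
        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            Row.CleanName = Row.RawName.Trim().ToUpper()
        End Sub
    End Class

Straight column-to-column copies need no script at all; direct mappings in the destination, or a Derived Column transformation, cover those cases.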
SELECT a.TestID, a.TestCode
FROM TableA a
WHERE UPPER(RTRIM(a.TestCode)) IN (SELECT UPPER(RTRIM(b.TestCode)) FROM TableB b)
Of course the above query is missing a few things, but the point is the UPPER(RTRIM(...)) in the WHERE clause: that does not appear to be something with an object or property I can use in the Lookup.
Hi, I have an example situation that seems like it should have a super easy solution, but my jobs keep failing. Here we go...
I have a SQL Server 2005 table as my source in a data flow task. This table contains raw data; we'll call it FACT_Product_Raw. It contains a field called ProductType varchar(1). Let's say that ProductType contains values of "A", "B", or "C", or for that matter some null and garbage values.
I have a lookup table, LOV_Product_Types. This table contains 3 fields that will transform my raw data table; we'll call these fields ProdTypeID smallint, ProdTypeRaw varchar(1), and ProdType smallint. It contains pairs such that A = 1, B = 2, and so on.
Here's what I want to do: I want to ADD a field to FACT_Product_Raw that contains the "looked up" value from LOV_Product_Types. Let's say that I want to add the ProdTypeID field to my _Raw table.
I have used the _Raw table as both my source and destination, and it blows up every time. Help. Thanks, David
I have a table A with a KEY column and an SSN column. KEY is 12 digits (the first 3 digits are a department id and the last 9 digits are the SSN).
I have a table B with an SSN column only.
Both the KEY and SSN columns are primary keys, so duplicate entries must be avoided.
Table A is to be populated weekly from a TXT file (an SSIS package run). I want to achieve something like this:
P-Code sample:
For each row in the TXT file
    If TXTfile.KEY = TableA.KEY Then
        Skip and read/go to the next row in the TXT file
    Else
        Insert TXTfile.KEY into TableA.KEY
        SSN_Var = extract the SSN part of the key
        If SSN_Var exists in any row of TableB Then
            Skip
        Else
            Insert SSN_Var into TableB
        End If
    End If
End For
-----------------------------------------------------------------
Using SSIS components, what would be the best flow and logic to achieve this? Any sample scripting code?
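One plausible shape for this, offered as a sketch rather than a tested design: Flat File Source, then a Lookup against TableA.KEY with non-matching rows redirected, then a Derived Column or Script Component that extracts the SSN, then a second Lookup against TableB.SSN, with OLE DB Destinations for TableA and TableB on the no-match paths. The SSN extraction itself is a one-liner in a VB.NET Script Component, assuming an input column KEY and an added output column SSN (hypothetical names matching the tables above):

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' KEY is 12 digits: a 3-digit department id followed by a 9-digit
        ' SSN, so the SSN is everything after the first 3 characters.
        Row.SSN = Row.KEY.Substring(3)
    End Sub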
OLEDB SOURCE --> LOOKUP TRANSFORMATION
    lookup output (updated records) --> DERIVED COLUMN TRANSFORMATION 1 --> OLEDB COMMAND TRANSFORMATION (updating records in the destination table)
    error output (for new inserts) --> DERIVED COLUMN TRANSFORMATION 1 --> OLEDB DESTINATION (inserting new records into the destination)
In this scenario the new records are inserted properly, but although the package runs without error, records do not get updated in the destination. In the OLE DB Command my query is like below.
In the Advanced Editor of the OLE DB Command I have created 16 additional parameter columns. Although I assign the numeric data type to those columns, when I press Refresh it automatically changes to DT_STR. My destination table columns are numeric.
I thought the error came from this datatype mismatch, so I changed the destination data type to varchar to make it compatible with the OLE DB Command transformation. Even then, no use: NO UPDATES.
The package runs without error, but records do not get updated.
If I change the flow like below:
OLEDB SOURCE --> LOOKUP TRANSFORMATION
    lookup output (updated records) --> DERIVED COLUMN TRANSFORMATION 1 --> OLEDB COMMAND TRANSFORMATION (updating records in the destination table) --> UNION ALL TRANSFORMATION
    error output (for new inserts) --> DERIVED COLUMN TRANSFORMATION 1 --> UNION ALL TRANSFORMATION
UNION ALL TRANSFORMATION --> OLEDB DESTINATION
In this case the updated record gets inserted into the target while the old one remains as it is, meaning I get one additional record.
Kindly help me figure out the bug; I am frustrated with this issue.
Hi, I am pretty new to SSIS. I am transferring rows from 2 source tables to 1 destination table. The 2 source tables have 1000 rows each. They act as the 2 inputs to a Merge Join transformation, where I perform the join between the 2 tables based on a couple of fields. But for some reason the output of the Merge Join gives me about 1018 rows. Shouldn't the destination also have only 1000 rows? How do I solve this problem?
I'm from Russia; sorry if my English is not very good.
Here's the case:
1) -------------------------------
I made a DTS package in SQL 2000 that transfers a [sql table] into a [dbf file] via Jet 4. First I create (in Delphi) an empty DBF with the same name and the same columns as the SQL table. Second, I run my DTS package with variables for the source and destination table names.
In the DTS package there is a source, a destination, and a transformation. After I pass in the variables (the table names), the transformation "arrow" needs to be "refreshed" so that the column names in both tables correspond to each other. For that, in the transformation I choose ActiveX Script mode and write this VBScript:
' Copy each source column to the destination column
Function Main()
    Dim i
    For i = 1 To DTSSource.Count
        DTSDestination(i) = DTSSource(i)
    Next
    Main = DTSTransformStat_OK
End Function
And it works.
2) ------------------------------
I want to do the same thing in SQL 2005 SSIS but can't figure out how. I managed to make a package that receives table names in variables and runs correctly. But after I change those variables to any other table names, it crashes with: Description: "component "OLE DB Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
Of course this happens because I didn't "refresh" the transformation (and maybe also the source and destination), but I don't know how.
We are using the Cache transformation in our project. While it runs, our disk space drops to 0 MB free and the SSIS package execution does not complete even after 3 hours. Initially we have around 34 GB of free space on the C: drive, and the server has 64 GB of RAM. We are caching data from a table that contains around 21 million records. We changed the paths in the properties (BLOBTempStoragePath, BufferTempStoragePath) of the Data Flow task in which we are using the Cache transformation.
I have too many DTS packages to migrate to SSIS, and while examining a DTS package in BIDS (converted with the migration utility) I tried to edit the resulting migrated package. It opened the DTS interface with the two connection icons joined by the big fat arrow with a gear on it... not exactly what I had in mind. In other words, it looks like SSIS on the outside, but it's still DTS on the inside. So I stripped a series of components out of a more complex package, hoping that simplifying it would reveal the contents of the old DTS Transformations tab at least partially set up in a Derived Column transformation. Can I get there from here, or must I recreate every stinking definition in a Derived Column manually from the ground up? Thanks very much for your help.
I need to call a function to calculate a value. This function accepts a varchar parameter and returns a boolean value. I need to call this function for each row in the data flow task. I thought I would use an OLE DB Command transformation, but for some reason if I use
'select functioname(?)' as the SqlCommand, it gives me an error at design time. In the Input/Output properties, I have mapped Param_0 (external column) to an input column.
I get this error: "syntax error, permission violation or other nonspecific error". Can somebody please suggest what's wrong with this and how I should deal with it?
If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting those rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?
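For illustration, one reading of the pipeline model (an interpretation, not an authoritative answer): data moves between components a buffer at a time, and the VB.NET wrapper that SSIS generates for a synchronous Script Component makes that shape visible; the whole buffer is looped over inside ProcessInput before the method returns, so the downstream component sees rows a buffer at a time, not one by one:

    Public Overrides Sub Input0_ProcessInput(ByVal Buffer As Input0Buffer)
        ' The pipeline hands the component a whole buffer of rows. Every
        ' row is processed in this loop, and only after the method returns
        ' does the buffer travel on to the next component.
        While Buffer.NextRow()
            Input0_ProcessInputRow(Buffer)
        End While
    End Sub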