Transform Data Task (DTS2000) And SSIS
Jun 21, 2005
I am trying to read in a flat file, transform the fields, and store them in a destination database.
In DTS, this works using Transform Data Task Properties. I define the columns and then have a VB script on the Transformations tab that changes any bad data.
Is there a way to do this in SSIS where I can define the column transformations and reuse my VB scripts?
Linda
I have too many DTS packages to migrate to SSIS. While examining a DTS package in BIDS (converted with the migration utility), I tried to edit the resulting migrated package, which opened the DTS interface with the two connection icons joined by the big fat arrow with a gear on it... not exactly what I had in mind. In other words, it looks like SSIS on the outside, but it's still DTS on the inside.
So I stripped out a series of components from a more complex package, hoping that simplifying it would reveal the contents of the old DTS Transformations tab at least partially set up in a Derived Column transformation.
Can I get there from here, or must I recreate every stinking definition in a Derived Column transformation manually, from the ground up?
Thanks very much for your help.
Hi JayH (or anyone). Another week... a new set of problems. I obviously need to learn .NET syntax, but because of project deadlines in converting from DTS to SSIS it is hard for me to stop and do that. So, if someone could help me with some easy syntax, I would really appreciate it.
In DTS, there was a VBScript that copied a set of flat files from one directory to an archive directory after modifying the file name. In SSIS, the directory and archive directory will be specified in the config file. So, I need a .NET script that retrieves a file, renames it, and copies it to a different directory.
Linda
Here is the old VBScript Code:
Option Explicit

Function Main()
    Dim MovementDataDir
    Dim MovementArchiveDataDir
    Dim MovementDataFile
    Dim MovementArchiveDataFile
    Dim FileNameRoot
    Dim FileNameExtension, DecimalLocation
    Dim CurMonth, CurDay
    Dim FileApplicationDate
    Dim fso ' File System Object
    Dim folder
    Dim FileCollection
    Dim MovementFile

    '======================================================================
    ' Create text strings of today's date to be appended to the archived file.
    FileApplicationDate = Now
    CurMonth = Month(FileApplicationDate)
    CurDay = Day(FileApplicationDate)
    If Len(CurMonth) = 1 Then
        CurMonth = "0" & CurMonth
    End If
    If Len(CurDay) = 1 Then
        CurDay = "0" & CurDay
    End If
    FileApplicationDate = CurMonth & CurDay & Year(FileApplicationDate)

    '======================================================================
    ' Set the movement data directories from the global variables.
    MovementDataDir = DTSGlobalVariables("gsMovementDataDir").Value
    MovementArchiveDataDir = DTSGlobalVariables("gsMovementDataArchiveDir").Value

    Set fso = CreateObject("Scripting.FileSystemObject")
    Set folder = fso.GetFolder(MovementDataDir)
    Set FileCollection = folder.Files

    ' Loop through all files in the data directory.
    For Each MovementFile In FileCollection
        ' Get the full path name of the current data file.
        MovementDataFile = MovementDataDir & "\" & MovementFile.Name
        ' Get the full path name of the archive data file.
        MovementArchiveDataFile = MovementArchiveDataDir & "\" & MovementFile.Name
        ' Insert today's date between the file name root and the extension.
        DecimalLocation = InStr(1, MovementArchiveDataFile, ".")
        FileNameExtension = Mid(MovementArchiveDataFile, DecimalLocation, Len(MovementArchiveDataFile) - DecimalLocation + 1)
        FileNameRoot = Mid(MovementArchiveDataFile, 1, DecimalLocation - 1)
        MovementArchiveDataFile = FileNameRoot & "_" & FileApplicationDate & FileNameExtension
        If fso.FileExists(MovementDataFile) Then
            fso.CopyFile MovementDataFile, MovementArchiveDataFile
            ' If the archive file was copied, then delete the old copy.
            If fso.FileExists(MovementArchiveDataFile) Then
                fso.DeleteFile MovementDataFile
            End If
        End If
    Next

    Set fso = Nothing
    Set folder = Nothing
    Set FileCollection = Nothing

    Main = DTSTaskExecResult_Success
End Function
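For comparison, here is a minimal sketch of the same logic as an SSIS Script Task in VB.NET (the language SSIS 2005 Script Tasks use). It assumes the two directories are exposed as package variables named gsMovementDataDir and gsMovementDataArchiveDir, populated from the config file and listed in the task's ReadOnlyVariables; adjust the names to whatever your package actually uses. The Imports lines go at the top of the script, and the Main sub replaces the generated one in the ScriptMain class.

Imports System
Imports System.IO

Public Sub Main()
    ' Read the source and archive directories from package variables
    ' (assumed to be set from the SSIS configuration file).
    Dim dataDir As String = CStr(Dts.Variables("gsMovementDataDir").Value)
    Dim archiveDir As String = CStr(Dts.Variables("gsMovementDataArchiveDir").Value)

    ' Date stamp in MMddyyyy form, matching the old script.
    Dim stamp As String = DateTime.Now.ToString("MMddyyyy")

    For Each sourcePath As String In Directory.GetFiles(dataDir)
        ' Insert the date stamp between the file name root and its extension.
        Dim root As String = Path.GetFileNameWithoutExtension(sourcePath)
        Dim ext As String = Path.GetExtension(sourcePath)
        Dim archivePath As String = Path.Combine(archiveDir, root & "_" & stamp & ext)

        ' Copy first, then delete the original only if the copy exists.
        File.Copy(sourcePath, archivePath, True)
        If File.Exists(archivePath) Then
            File.Delete(sourcePath)
        End If
    Next

    Dts.TaskResult = Dts.Results.Success
End Sub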
In my current project I have a requirement to assign the value of an Aggregate transform to a variable, but I need to accomplish it without using a script task.
Hi everyone,
Right now we are trying to replace all stored procedures with SSIS logic.
One of the stored procedures does the following,
1. For every record in PSTREELEAF, find the hierarchy from PSTREENODE and store it in local variables.
First the stored procedure pulls all the data from the PSTREELEAF table, based on a join condition, into local variables using a cursor. Then, for every record in the cursor, it finds the corresponding records from the PSTREENODE table. This is just a part of the stored procedure.
Can anyone tell me how I can do this in SSIS? And which task would be a good replacement for cursors in SSIS?
Thanks,
Praveen
Hi
I need to transform data from a Transoft data source (ODBC) into an OLE DB connection.
The solution that I currently have is to transform using the Execute DTS 2000 Package Task.
Is there a way that I can perform this task using SSIS?
Regards
Que
Hello, I am working on a module to extract data from a Teradata server to a SQL Server database. I am using a DTS package to extract the data, and I need to make the data source name (database name and object name) configurable at runtime, to be read from a config table in the SQL database. What do you suggest is the simplest and most efficient method to do this?
I am trying to use a dynamic SQL query in the Transform Data task with the data source as a query, but it's giving me a strange syntax error. Can we use dynamic SQL in the query option of the DTS Transform Data task source?
Thanks, Meriya
I am importing data from an .xls file to a database table with a DTS package. At DTS creation time I use the 'Transform Data Task Properties' GUI window to map incoming .xls fields (source) to the table columns (destination).
Question: Is there any way to invoke the 'Transform Data Task Properties' GUI window at DTS run time and use it to change the mapping dynamically?
Thanks, Vadim.
Hi everybody
I'm still trying to learn the advantages of having stored procedures. I have a DTS package that uses a Transform Data Task to append the result of a view into a table. All operations are done locally on the server.
Do I gain any advantage if I write a stored procedure to insert the view's results into the table, and then call the stored procedure from the DTS package, instead of using the Transform Data Task?
Thanks in advance for your thoughts
ds9
Hi All,
I'm stuck on a strange problem.
When I try to set up a Transform Data task in DTS, the table drop-down shows the fully qualified table name,
i.e. <database name>.<schema name>.<table name>
as you can see in the attached screenshot. Because of this I can't see the full table name and am not able to make the correct selection.
Where can I change the properties so that it displays only the table name?
Thanks
Rohit
I have an Access 2.0 database that holds call data on a mapped drive. I am running MS SQL Server 2000. I can open the database and view the records inside. I can even run the query below and get results, if I remove the CallDate and CallTime parameters.
SELECT CallDate, CallTime, Mid(CallRecordData, 68, 3) AS Extension, 'I' AS Direction, Mid(CallRecordData, 34, 11) AS Called,
Val(Mid(CallRecordData, 18, 2)) + Val(Mid(CallRecordData, 21, 2))/ 60 AS Minutes, Val(Mid(CallRecordData, 21, 2)) AS Seconds
FROM CallRecords
WHERE (CallDate = ?) AND (CallTime >= ?) AND (CallTime < ?) AND (Mid(CallRecordData, 30, 1) <> '9')
When I preview in the Transform Data Task, I get:
Package Error
Error Source: Microsoft JET Database Engine
Error Description: No value given for one or more required parameters.
When I look at the parameters, they are listed. I check their values, and they have the appropriate values (DateCalled, String, 07/14/2005) (StartTime, String, 06:30) (EndTime, String, 07:00)
When I run it in the build query or in Access with a linked table to the source, I can enter the values when asked for them and it works.
Thanks for any help you can provide.
Hi all,
I have a very simple SSIS package that is moving data from a DB2 database to a Teradata box. I've run it around 10 times; twice it pushed data over, but the rest of the time it executes with no error and moves nothing over. In the "incomplete" runs, a command-line box pops up for half a second, then the package ends.
Does anyone have ideas as to why this behavior is occurring?
Thanks,
Mark
Hello,
I have an OLE DB Source and I want to transform the data types of the table's fields before I export the table to an OLE DB Destination.
Is there a way to transform a numeric value to float, and numeric to nvarchar?
Thank you in advance.
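The usual no-code answer here is a Data Conversion or Derived Column transform, casting with DT_R8 for float and DT_WSTR for nvarchar. If the built-in transforms are not flexible enough, a minimal Script Component sketch in VB.NET might look like the following; the column names Amount, AmountAsFloat, and AmountAsText are illustrative, and the two new columns must be added to the component's output by hand.

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Convert the incoming numeric column into the two new output columns.
    If Not Row.Amount_IsNull Then
        Row.AmountAsFloat = Convert.ToDouble(Row.Amount)  ' numeric -> float (DT_R8)
        Row.AmountAsText = Row.Amount.ToString()          ' numeric -> nvarchar (DT_WSTR)
    End If
End Sub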
I can't figure out how to put nested tables into the Data Mining Model Training Transform (SSIS). I can do a simple case table, but how do you get those nested tables with DM Training Transformation? Any ideas? Samples?
Thanks in advance,
-Young K
I am trying to use a DTS package to get data from DB2 in an S/390 environment. I am able to use the Import task, run a query on DB2, save the package, and execute the package. But when I try to edit the Transform Data task I get an mmc.exe application error... it says that the instruction at address "" tried referencing memory at address "". The memory could not be read...
I installed an IBM ODBC driver on my client... obviously the connection seems to work, since the package executes... but then the edit issue...
If anyone has faced this problem or knows what I am doing wrong, I appreciate your time and effort...
Thanks
The thing is, I am using a DTS package to get data from DB2 residing on an S/390 mainframe and put it in a SQL 2000 database on Windows 2000 Server.
I have the ODBC driver that the server guys gave me. I installed it and then ran a batch program, which I think does the database setup... in short, I set up the DSNs on my local machine. I can use the DSN when I use the SQL client...
I can't create a new package and then complete the task. The problem is when I try to set up a Transform Data task, I get a memory error... it says that the instruction referenced a memory location that can't be read...
So what I do is use the Import task in SQL, which in turn uses the same ODBC connection, runs the query on DB2, and inserts the results into a table in SQL; then I save the package... The only problem is that I can't edit the Transform Data task... same memory error...
Any idea why this happens? I would like to see how I can get past this using ODBC...
I need to call a stored procedure to insert data into a table in SQL Server from an SSIS data flow task.
I am currently trying to use an OLE DB Destination, but I am not sure how to map the inputs of the OLE DB Destination to my stored procedure's insert.
Thanks
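The usual no-code route for this is the OLE DB Command transform, whose SqlCommand property can be set to something like EXEC dbo.usp_InsertMyTable ?, ? so that input columns map to the parameters. If more control is needed, a Script Component configured as a destination can call the procedure per row; below is a minimal sketch in VB.NET, where the connection string, procedure name, parameters, and column names are all illustrative.

Imports System.Data
Imports System.Data.SqlClient

Public Class ScriptMain
    Inherits UserComponent

    Private _conn As SqlConnection

    Public Overrides Sub PreExecute()
        ' Open one connection for the life of the data flow.
        _conn = New SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI")
        _conn.Open()
    End Sub

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Call the stored procedure once per input row.
        Dim cmd As New SqlCommand("dbo.usp_InsertMyTable", _conn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.AddWithValue("@Id", Row.Id)
        cmd.Parameters.AddWithValue("@Name", Row.Name)
        cmd.ExecuteNonQuery()
    End Sub

    Public Overrides Sub PostExecute()
        _conn.Close()
    End Sub
End Class

Note that row-by-row procedure calls are much slower than a set-based load, so bulk-loading a staging table with an OLE DB Destination and then calling the procedure from an Execute SQL Task is often the better design.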
Some time ago I had to edit a DTS package in SSIS. To do that I installed the SQL 2005 DTS components. It worked fine at the time. But now I'm trying to open a DTS package in SSIS after installing SP2 and it doesn't work. Every time I try to open a DTS package I get the following error.
Error HRESULT E_FAIL has been returned from a call to a COM component. (Microsoft Visual Studio)
------------------------------
Program Location:
at DTS.CDTSLegacyDesignerClass.ShowDesigner()
at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.GeneralView.btnEdit_Click(Object sender, EventArgs args)
I don't understand why this is happening. Just one thing: I used to have Windows Server 2003, and now I have XP SP2. I don't know if this is important, but...
Thanks in advance.
Hi,
I want to insert the contents of a data file into a SQL Server table with an XML datatype column using SSIS. I am fairly new to SSIS and script tasks.
Example:
Salary Data file:
EmpID, Name, Salary
100, John, 100000
200, Scott, 200000
Database Table:
create table xmlTable (id int identity(1,1), xmlColumn XML)
The contents of the file need to be written to xmlTable. For each row in the salary data file there will be a corresponding row in xmlTable.
xmlTable (Result):
1, <EmpID 100...>
2,<EmpID 200...>
Any help with details is greatly appreciated.
Thanks,
Rob
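One way to sketch this: a Flat File Source reads the salary file, a Script Component (transformation) builds one XML fragment per row into a new output column, and an OLE DB Destination maps that column to xmlColumn. Assuming the input columns are EmpID, Name, and Salary, and the new output column is called XmlRow (an illustrative name), the script body might look like this; for real data you would also escape the values or build the fragment with System.Xml.

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Build one XML fragment per input row; the destination maps
    ' XmlRow to the xmlColumn column of xmlTable.
    Row.XmlRow = String.Format( _
        "<Employee><EmpID>{0}</EmpID><Name>{1}</Name><Salary>{2}</Salary></Employee>", _
        Row.EmpID, Row.Name, Row.Salary)
End Sub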
I was working all day making changes to my 3MB package. I was adding a large number of transforms that were copied-and-pasted from elsewhere in the same data flow task.
All was going well. I even took the time to have SSIS lay out the task again (1/2 hour). Suddenly I started receiving some strange errors:
After the layout, I noticed two stray components 'way off in the upper right corner. I found that one of them had the same name as a component which had been added hours ago. Even after deleting it, I got "duplicate name" errors.
I copied three components in one selection, and when I tried to paste them, got the error "can't initialize component on paste". I tried them one at a time, but got the same error.
I got errors about COM failures due to marshalling to another thread.
I then exited Visual Studio and started it again. To my great surprise, the data flow task I was working on was still there, but was completely empty.
Comparing what I'm left with to my last version in source control, I find that the entire pipeline is missing from the DTS:ObjectData element!
I'm developing a real love/hate relationship with SSIS. It varies from one day to the next. Guess what kind of day this is!
Hi,
I need some urgent help on a conceptual question.
Requirement:
I have a simple package with one dataflow task. In that I need to read from a sql table and for every row in that table loop through n times and generate new output rows based on certain conditions (which are best evaluated in custom script as they are rather complex). Hence, if I have 100 rows in the table as my input, I may end up with 100*n rows as output.
My Design:
To implement this I have used an OLE DB Source which outputs to a Script Transform (ST). In the ST I intend to loop through in custom code and generate new rows using the .AddRow feature when I need new rows. This ST then feeds into an OLE DB Destination which writes the data to the table. Simple!
I am using the default buffer settings. All I have tweaked is the Synchronous... property on the script transform (otherwise I do not get to the Output0Buffer within the script!).
Problem:
I wish to do as much as possible in parallel. So I would expect the OLE DB Source to provide more than one row at a time to the script transform, and that it should process more than one input row simultaneously. It seems the script component is serializing input: it takes one row at a time from the OLE DB Source, loops through, and processes it in the script transform.
AM I RIGHT IN THINKING THAT THE SCRIPT TRANSFORM IS EXECUTING THE INPUT IN A SEQUENTIAL MANNER?
CAN I PARALLELISE THIS?
If so, how?
Thanks,
Tirthankar
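For reference, here is a minimal sketch of the asynchronous pattern being described, assuming the input has a column InValue, Output0 has a column OutValue (both names illustrative), and SynchronousInputID is set to None so every output row is created explicitly:

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim n As Integer = 3  ' stand-in for however many rows this input row should yield

    ' Emit n output rows for each input row.
    For i As Integer = 1 To n
        Output0Buffer.AddRow()
        Output0Buffer.OutValue = Row.InValue * i  ' stand-in for the complex per-row rule
    Next
End Sub

Public Overrides Sub Input0_ProcessInput(ByVal Buffer As Input0Buffer)
    MyBase.Input0_ProcessInput(Buffer)
    ' Once the last input buffer has been seen, tell downstream
    ' components that no more output rows are coming.
    If Buffer.EndOfRowset() Then
        Output0Buffer.SetEndOfRowset()
    End If
End Sub

As for the parallelism question: a single script transform instance does consume its input buffers sequentially, so heavy per-row work inside one component will serialize. One common workaround is to split the source rows (for example, with a Conditional Split on a key range) across several identical script transforms that run in parallel.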
Hi there,
I am having some problems loading a tab-delimited text file (source) into a SQL Server table (destination) using the SSIS data flow task. The package executes successfully with no error message, and the number of rows in the text file matches the number of rows in the SQL table. But when I check the content of the table, I notice some of the columns contain NULL where they are supposed to have values. This happens not to all the rows, but only to some rows. I did some testing by removing some rows from the beginning, middle, and end of the text file and re-running the package, but the result is quite inconsistent. Sometimes the field gets filled; sometimes it just contains NULL where it is supposed to have a value.
What will be the possible causes to this?
Thanks in advance.
Regards,
Yong Hwee
Hi, thanks.
I can resolve a KPI's data value via ADOMD.NET from any .NET application. Now I want to do the same thing from the SSIS Script Task. Can I do that?
The SSIS Script Task uses a VB.NET script. I can use ADO.NET in it by adding a reference to the XML .dll.
Thanks.
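The same ADOMD.NET code should work from the Script Task, provided you add a reference to Microsoft.AnalysisServices.AdomdClient in the script project. Here is a minimal sketch; the server, catalog, cube, KPI, and package variable names are all illustrative, and the variable would need to be listed in the task's ReadWriteVariables:

Imports Microsoft.AnalysisServices.AdomdClient

Public Sub Main()
    ' Connect to the Analysis Services database (connection string is illustrative).
    Using conn As New AdomdConnection( _
            "Data Source=localhost;Initial Catalog=Adventure Works DW")
        conn.Open()

        ' Fetch the KPI's value with an MDX query.
        Dim cmd As New AdomdCommand( _
            "SELECT KPIValue(""Revenue"") ON COLUMNS FROM [Adventure Works]", conn)
        Dim cs As CellSet = cmd.ExecuteCellSet()

        ' Hand the value back to the package through a variable.
        Dts.Variables("KpiValue").Value = cs.Cells(0).Value
    End Using

    Dts.TaskResult = Dts.Results.Success
End Sub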
Hi,
I am experiencing an issue where the SSIS data flow task freezes and stops exporting data from an OLE DB source to a text file. It doesn't generate any errors; the SSIS package just hangs. This only happens when I run it in 64-bit mode. When I change the mode to 32-bit, the SSIS package never freezes and runs fine. Has anyone experienced this? Is there a fix so I can run my jobs in 64-bit mode?
Thanks,
Hi!
When I add a Data Flow Task to a package, I save it and then close it. When I open it again, I get 3 errors and the detail of the Data Flow Task is empty.
I have tried to uninstall and reinstall my SQL Server 2005, but the problem continues.
Can someone help me with this?
Sérgio Cardoso
I am trying to write an SSIS surrogate key data transform. My problem is I can't find an example of how to add a column to the incoming columns and populate it with data. If anyone has a sample, could you please post it? I found a script option that works, but I would like an actual transform.
Thanks
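An actual transform means writing a custom pipeline component. Below is a skeleton of the idea in VB.NET for SSIS 2005 (hence the 90-suffixed interfaces); it is a sketch only, with no validation or error handling, and the component and column names are illustrative. It adds an Int32 SurrogateKey column to a synchronous output and stamps each row with an incrementing value.

Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

<DtsPipelineComponent(DisplayName:="Surrogate Key", ComponentType:=ComponentType.Transform)> _
Public Class SurrogateKeyTransform
    Inherits PipelineComponent

    Private _keyColumnIndex As Integer
    Private _nextKey As Integer = 1

    Public Overrides Sub ProvideComponentProperties()
        ' One input, one synchronous output, plus the new key column.
        Dim input As IDTSInput90 = ComponentMetaData.InputCollection.New()
        input.Name = "Input"

        Dim output As IDTSOutput90 = ComponentMetaData.OutputCollection.New()
        output.Name = "Output"
        output.SynchronousInputID = input.ID

        Dim col As IDTSOutputColumn90 = output.OutputColumnCollection.New()
        col.Name = "SurrogateKey"
        col.SetDataTypeProperties(DataType.DT_I4, 0, 0, 0, 0)
    End Sub

    Public Overrides Sub PreExecute()
        ' Locate the new column in the buffer by its lineage ID.
        Dim output As IDTSOutput90 = ComponentMetaData.OutputCollection(0)
        Dim col As IDTSOutputColumn90 = output.OutputColumnCollection(0)
        _keyColumnIndex = BufferManager.FindColumnByLineageID( _
            ComponentMetaData.InputCollection(0).Buffer, col.LineageID)
    End Sub

    Public Overrides Sub ProcessInput(ByVal inputID As Integer, ByVal buffer As PipelineBuffer)
        ' Stamp each row with the next key value.
        While buffer.NextRow()
            buffer.SetInt32(_keyColumnIndex, _nextKey)
            _nextKey += 1
        End While
    End Sub
End Class

The assembly would then be compiled, strong-named, placed in the PipelineComponents folder, and added to the toolbox before it can be dropped into a data flow.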
I am migrating DTS packages to SSIS (recreating all the logic).
I have a Data Driven Query task in DTS with
Source query - select x,y from table1 (from database db1)
Binding - table2, which contains columns that match table1 x,y (from database db2)
Transformation - mapping from source table1 x,y to binding table2 x,y
Queries - type update: update table2 set x=? where y=?
I know that there is no similar task in SSIS. Can someone tell me how to replicate this in SSIS?
Thanks in advance
Hello,
I'm designing SQL Server 2005 SSIS packages.
According to my requirements I have a sequence container with a few data flow tasks; on success of one, the next one runs. If any one of them fails, then all the transactions should be rolled back. Each data flow task transfers data from one server to another server into a similar table.
Thanks.
I have a relatively simple SSIS package that I'm building for a data mining process. The package starts with an OLE DB data source, passes the results of a SQL Command (query) along to a conversion step, which then gets sent to a Term Lookup task. The Term Lookup then writes the result to an OLE DB Data Destination. Pretty simple. The OLE DB data source query returns about 80,000 rows if you run it through SQL WB. The SSIS editor shows 9,557 rows make it out of the source, and into the conversion step, 9,557 make it out of the conversion and into the lookup, and about 60,000 rows make it out of the lookup and are written to the results table. Then the package fails with the following errors listed on the progress screen. I was assuming that the 9,557 was some type of batching that was occurring in the process, but now I'm not so sure.
Thoughts?
Frank
[DTS.Pipeline] Error: The ProcessInput method on component "My Component" (117) failed with error code 0xC02090E5. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC02090E5.
[DTS.Pipeline] Error: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
[DTS.Pipeline] Error: Thread "WorkThread1" has exited with error code 0xC0047039.
[My Data Source] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: The PrimeOutput method on component "My Component" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.
I have a package that loads staging tables from an Oracle source DB. In the data flow tab I have 30+ read table/write table task combinations. When I run the package, 3-4 of the read/write combos execute at a time. What I'm trying to control is the priority order of the combo execution. My goal is to minimize the total load time by having the larger table transfers run first and the smaller table transfers fill in until they are all complete. Currently, the largest table (16 million) transfers last (because it was the last combo that I created?).
Thanks,
Dave
I am creating a staging database in which I am loading required tables from 2 different sources.
I have 30 different tables to load from source 1 and 10 different tables from source 2.
This is the way I am doing it: in the control flow I am using a Sequence container, and in it I included the data flow tasks. Each data flow task has an OLE DB source connection, from which I select the table, and then an OLE DB destination connection, where I load the data.
So for the 30 tables I have one Sequence container with 30 different data flow tasks, and each data flow task has an OLE DB source and an OLE DB destination.
I wanted to find out if this is an efficient way to do it, or if there is a better way.
And for source 2, shall I put it in another package, or shall I use the same package with a different sequence container, following the same steps as for the source 1 tables?
Please advise.
Thanks,
I have just installed Service Pack 2 on my SQL 2005 Standard Edition.
However, now all my SSIS packages will not allow me to open my Data Flow Tasks. I get the following error:
TITLE: Microsoft Visual Studio
------------------------------
Cannot show the editor for this task.
------------------------------
ADDITIONAL INFORMATION:
The task returned an unsupported control editor type. (Microsoft.DataTransformationServices.Design)
If I try to create a new Data Flow task I get:
TITLE: Microsoft Visual Studio
------------------------------
Failed to create the task.
------------------------------
ADDITIONAL INFORMATION:
The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
I have tried to install the latest hotfixes after this but they had no effect.
Can anybody help me???? Please?