In good old-fashioned DTS there was the ability to perform custom transformations using an ActiveX/VBScript-type language - does this still exist, or are we stuck with the Derived Column editor?
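For what it's worth, SSIS does still have a scripting option in the data flow: the Script Component, which uses VB.NET (in 2005) instead of VBScript. A rough sketch of a custom transformation, assuming a synchronous Script Component with hypothetical input columns FirstName/LastName and an added output column FullName:

' Script Component (transformation type). SSIS generates the Input0Buffer
' class from whatever columns you check on the Input Columns page.
Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Any .NET logic can go here, much like VBScript in a DTS pump task.
        Row.FullName = (Row.FirstName & " " & Row.LastName).Trim()
    End Sub
End Class

So you are not limited to the Derived Column editor for anything beyond simple expressions.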
I saw something called custom properties for the Derived Column transformation on the MSDN site. I tried to use them in a simple package, but I am getting an error: "can't write to derivedoutputcolumnname.friendlyexpression". FriendlyExpression is one of the custom properties available on the Derived Column transformation's output columns.
The steps I followed to get to this error are as follows:
1) Get data from a table using an OLE DB source. Suppose I am getting FirstName, LastName, etc.
2) The Derived Column transformation's input is the set of values from the above OLE DB source.
3) I added a new column called "ConcatenatedName", which is the concatenated value of the first and last names.
4) Then, in the properties editor of this data flow task, under the Expressions option, I clicked the ellipsis button. I got the property expressions editor, which contains two columns called "Property" and "Expression". The Property column is a dropdown listing the FriendlyExpression property for the derived output columns, and the Expression column is a text box where we can enter the expression to be evaluated for the corresponding FriendlyExpression property.
5) Now, when I click OK and try to debug, it gives the error "Can't write to ConcatenatedName.FriendlyExpression".
If anybody has already faced this problem and solved it, please let me know, because I have been stuck here for a long time.
I have been attempting to implement one of our numerous ETL processes in SSIS but hit a brick wall when I tried replacing a complex stored procedure with a series of Merge Join components.
In the end, I had to settle for an Execute SQL task that merely calls the stored procedure, and this proved to be the better option: the other version, which used only SSIS components, took forever to run.
How do people cope with complex transformations?! Do you opt for pure T-SQL to perform complex transformations and use SSIS components for control flow plus simple(ish) data flow tasks?
I've read that SSIS tries to do all transformations in memory as a way of enhancing processing speed. What happens though if the amount of data processed exceeds the available RAM? Are raw files then used (similar to staging tables) or is an error generated?
I'm trying to find out whether there is a combination of data flow transformations that will produce the following result:
SELECT
    period,
    project,
    task,
    employee = CASE
                   WHEN empid IN (SELECT DISTINCT empid FROM EmpTable) THEN empid
                   ELSE 'Deleted Employee'
               END
FROM ProjectTable
I know I can create a data flow task with this query as the data source and then send it to a destination, but I was wondering whether that is the best way to do it, or whether there is a better way using the transformations available in SSIS.
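For what it's worth, the single-query source is a perfectly reasonable approach. If you want to build it from components instead, one common pattern (a sketch, not the only way) is a Lookup transformation against EmpTable configured to ignore lookup failures, followed by a Derived Column such as ISNULL(LookupEmpId) ? "Deleted Employee" : (DT_WSTR,50)empid, where LookupEmpId is the hypothetical column returned by the Lookup. The source query is often the faster option, though, since the work stays in the database engine.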
I am new to SSIS and have the following problem. I used a script to clear data in columns of any CR/LF/commas and char(0)'s. Can I just transfer this to SSIS, and how exactly do I do that? Any help or advice would be appreciated.
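Since the script itself didn't make it into the post, here is a hedged sketch of how that kind of cleanup usually looks when moved into an SSIS Script Component (VB.NET); the column name RawText is hypothetical:

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Strip CRs, LFs, commas and embedded char(0)s, as the original script did.
    If Not Row.RawText_IsNull Then
        Row.RawText = Row.RawText.Replace(vbCr, "").Replace(vbLf, "") _
                                 .Replace(",", "").Replace(vbNullChar, "")
    End If
End Sub

You would repeat the line (or loop) for each column that needs cleaning.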
I migrated the DTS package from SQL 2000; the migrated SSIS package, which includes the ActiveX script, is 11 KB and takes 00:00:00.125 to run. I rewrote the package using some of the new features provided in BIDS, and the new package is 50 KB and takes 00:00:06.016 to run. Is that normal, or might it be down to the efficiency of my code?
Can you assist me with converting the code below to VB?
Function Main()
    Dim objExcel
    Set objExcel = CreateObject("Excel.Application")
    objExcel.Workbooks.Open "C:FTPOUTGOINGFTP_MarkOff.xls", , , , ""
    objExcel.Workbooks(1).SaveAs "C:FTPOUTGOINGFTPMarkoff.xls", , "password"
    objExcel.Workbooks.Close
    Set objExcel = Nothing
    Main = DTSTaskExecResult_Success
End Function
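A hedged VB.NET conversion for an SSIS Script Task, using late binding so no Excel interop reference is needed (the script project must keep Option Strict off). The paths are copied verbatim from the post - the forum appears to have stripped the backslashes, so restore them before running. On SSIS 2005 the last line is written Dts.TaskResult = Dts.Results.Success:

Public Sub Main()
    ' Late-bound Excel automation, mirroring the original DTS function.
    Dim objExcel As Object = CreateObject("Excel.Application")
    objExcel.Workbooks.Open("C:FTPOUTGOINGFTP_MarkOff.xls", , , , "")
    objExcel.Workbooks(1).SaveAs("C:FTPOUTGOINGFTPMarkoff.xls", , "password")
    objExcel.Workbooks.Close()
    objExcel.Quit()   ' unlike the VBScript original, explicitly quit Excel
    objExcel = Nothing
    Dts.TaskResult = ScriptResults.Success
End Sub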
I have a flat file source to which different flat files will be passed as input; this is connected to an OLE DB destination which changes along with the source file. But when a new file is given as input, the OLE DB mappings do not get refreshed and it shows an error.
This was originally implemented in DTS, where an ActiveX script was used for the transformation. What should I use in SSIS?
Within a SQL 2000 DTS package I have an ActiveX script that goes into my transform tasks and updates the queries by concatenating a WHERE clause with a date from a database table. This way I can keep track of when I last updated the table, so that I only bring down the rows added since the last run. How can this be done within SSIS? I've been looking, and I'm getting confused. Any help would be greatly appreciated.
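One common SSIS pattern for this (a sketch; the variable, table, and column names are all hypothetical): an Execute SQL Task reads the last-run date into a package variable, a Script Task builds the query string into a second variable, and the OLE DB Source in the data flow is set to "SQL command from variable". The Script Task body might look like:

Public Sub Main()
    ' User::LastRunDate is filled earlier by an Execute SQL Task; both
    ' variables must be listed in the task's ReadOnly/ReadWriteVariables.
    Dim lastRun As Date = CType(Dts.Variables("User::LastRunDate").Value, Date)
    Dts.Variables("User::SourceQuery").Value = _
        "SELECT * FROM dbo.SourceTable " & _
        "WHERE ModifiedDate > '" & lastRun.ToString("yyyyMMdd HH:mm:ss") & "'"
    Dts.TaskResult = ScriptResults.Success
End Sub

You can also skip the Script Task entirely and put an equivalent property expression on the query variable.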
My ActiveX code wrote:

Function Main()
    MsgBox "Hello" & DTSTaskExecResult_Failure & DTSTaskExecResult_Success
    Main = DTSTaskExecResult_Failure
End Function
However - it displays "Hello10" - but SSIS will not throw an error - it executes successfully. Any ideas?
Hi experts, I have two servers (SQL 2000 & SQL 2005). There is already a DTS package on the SQL 2000 server, and I have migrated it to an SSIS package. The problem is that the DTS package contains an ActiveX script which executed perfectly on the 2000 server, but when I execute the converted SSIS package on the 2005 server it gives an error, shown here.
Error 1 Validation error. DTSTask_DTSDataPumpTask_1: OLE DB Destination [181]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Microsoft OLE DB Provider for SQL Server Copy" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. SearsCDCC_Transfer (1).dtsx 0 0
I am also including the ActiveX code; here, oma11pngrdb02 is the 2000 server and sant01pngrdb is the 2005 server.
I've made an SSIS package which takes source columns from a plain text file and copies them to a SQL table. A long time ago, the same process was done with DTS, which included a Data Pump task with an ActiveX Script transform column containing VBScript. How do I do the same with SSIS?
I've got a couple of components - a Flat File source and an OLE DB destination - but on their own they're useless for that goal.
I have an ActiveX component in my SSIS package, and it is written in VB. Something is going wrong, and I'm not sure what. It works fine when run from Visual Studio, but when I move it to our server and try to run it from a job there, it fails. I'd like to know WHERE in the component it is failing, so I'd like to send output to something as it goes along, so I can see how far it is getting.
How do I send text to an output or log file?
I can't use MsgBox, because, of course, when it runs as a job, it is putting that message box up on the server, where there is no-one to respond to it, so it hangs. I'm in the process of converting it from a DTS to SSIS, and it does have several instances of MsgBox now. And it's locking up -- on one of them.
I have logging turned on for the job step, and it is writing to a dbo.sysdtslog90 table, but all it tells me is that it is starting that ActiveX script task.
What can I replace the MsgBox with, so that it outputs somewhere to a file? Is there a simple command, like WScript.Echo or Console.WriteLine (neither of which I can get to work)?
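Once the task is converted to a VB.NET Script Task (the legacy ActiveX Script Task has no good output option), two approaches work server-side; the log path here is hypothetical, and the script needs Imports System.IO at the top:

Public Sub Main()
    ' Option 1: append trace lines to a plain text file.
    File.AppendAllText("C:\Logs\PackageTrace.log", _
        Now.ToString("s") & "  reached step 1" & Environment.NewLine)

    ' Option 2: raise an information event; any enabled log provider
    ' (such as the dbo.sysdtslog90 one you already use) records it.
    Dim fireAgain As Boolean = True
    Dts.Events.FireInformation(0, "Trace", "reached step 1", _
                               String.Empty, 0, fireAgain)

    Dts.TaskResult = ScriptResults.Success
End Sub

Either one replaces MsgBox without hanging an unattended job.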
I am rewriting old DTS packages (from SQL 2005) to convert them to SSIS packages (SQL 2014), and in one of the script tasks the old ActiveX script does not run.
The script is:
'************************
' Visual Basic ActiveX Script
'************************
Function Main()
    mydate = Now()
    yrs = ""
    mth = ""
    mth = Month(mydate) - 1
[Code] ...
I am not sure how to proceed. What is the SSIS counterpart of the above script, step by step?
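Only the first few lines of the script are visible above, so this is necessarily a partial, hedged translation: the usual SSIS counterpart is a Script Task written in VB.NET rather than VBScript. The visible fragment maps over roughly like this:

Public Sub Main()
    ' VB.NET equivalents of the visible VBScript lines.
    Dim mydate As Date = Now
    Dim yrs As String = ""
    Dim mth As Integer = mydate.Month - 1
    ' ... the remainder of the original logic ([Code] ...) goes here ...
    Dts.TaskResult = ScriptResults.Success
End Sub

Step by step: add a Script Task, set its language to Visual Basic, declare any package variables it needs in ReadOnly/ReadWriteVariables, and port the ActiveX body line by line.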
I need to retrieve the Global Variables set in my package configuration file within an ActiveX Script Task in an SSIS package. In DTS, I could access the Global Variables to execute a SQLXMLBulkLoad, ending with the following statement:
    Main = DTSTaskExecResult_Success
    Set objBL = Nothing
End Function
I have tried using the Script Task to write this in VB.NET; however, MSXML 4.0 is not exposed within the limited object model of the Script Task designer. I have also written a data flow using the XML Source; however, it requires quite a bit of effort to have the data flow components parse the XML (with 10 hierarchical nodes), transform each node, and provide a SQL Server destination. This works, but the XML Source component requires a hardcoded reference to the XSD schema file and does not allow a Global Variable to be used. (They do provide this functionality for the XML source file itself, though.)
My requirement is to allow the Global Variable for the schema file to be passed at runtime. The only way I can think of is to recreate what I was doing in DTS, where I could simply pull in the XML and XSD Global Variables and execute the SQLXMLBulkLoad in VBScript.
Any ideas on how to write this in VBScript within the ActiveX Script Task in SSIS?...
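One hedged way around the missing MSXML reference: create the SQLXML bulk load object late-bound from a VB.NET Script Task, which sidesteps the object-model limitation entirely. This assumes SQLXML 4.0 is installed on the server and that the variable and file names below (User::XsdFile, User::XmlFile, the connection string, the error log path) are adapted to your package:

Public Sub Main()
    ' Late-bound SQLXMLBulkLoad; list both variables in ReadOnlyVariables.
    Dim objBL As Object = CreateObject("SQLXMLBulkLoad.SQLXMLBulkLoad.4.0")
    objBL.ConnectionString = "Provider=SQLOLEDB;Data Source=(local);" & _
                             "Initial Catalog=MyDb;Integrated Security=SSPI"
    objBL.ErrorLogFile = "C:\Logs\BulkLoadErrors.xml"
    objBL.Execute(CStr(Dts.Variables("User::XsdFile").Value), _
                  CStr(Dts.Variables("User::XmlFile").Value))
    objBL = Nothing
    Dts.TaskResult = ScriptResults.Success
End Sub

That keeps the schema file fully driven by the Global Variable, just as the DTS version was.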
I have a SQL 2000 DTS package that executes VBScript to loop through a recordset, which:
- runs a stored procedure and populates tables
- builds a recordset from the populated tables to write records to an Excel file
- writes status to text files with either the error or success notices
I use FSO to set up the success and error files, but the scheduled job in SQL2005 which calls the SSIS package returns the following error:
"Retrieving the file name for a component failed with error code 0x0015F74C"
I can successfully run this VBScript both in the SSIS package via the BI Development Studio and in MS Access (exactly the same code in both) - but not when the SSIS package is called from a scheduled job in SQL 2005.
I am at an impasse with this ... any and ALL assistance would be GREATLY appreciated.
Right, so I'm currently trudging through the SQL video tutorials and such, so it may be that I get to this sooner or later, but as I'm under a deadline, I thought I'd post this question beforehand so I can use that info with what I'm learning now.

Here's my situation: I have an ASP.NET 2.0 site in which I currently use XML files to display the text on the page, and I transform that text using an XSL stylesheet. I want to move that data to a database, but I'm not sure what is the best way to do that. Basically what I'm most concerned with is storing the main text (paragraphs with embedded hyperlinks).

Currently, I can get the XSL to pick out the links and transform them from plain XML data to live links when they display on the page, but would I be able to do the same if I were pulling these paragraphs out of a database? Or should I just store the XML data in the database, and still pull that out so I can transform it appropriately with the XSL sheet I already have? (For that matter, can I dynamically write XML content to a database? Or am I just better off keeping my XML files?) What's the best approach for something like this?

Thanks for the help!
I want to change the rows of the following table to columns to avoid repeatability:
Manufacturer     AOpen
Model            s661FXm s661FXm Intel P4
System Type      Motherboard
Standard Memory  N/A
Maximum Memory   2 GB
Sockets          2
Slots/Banks      2

Manufacturer     HP/COMPAQ
Model            Presario SR1917FR AMD Athlon 64 X2 3.06 GHz
System Type      Desktop
Standard Memory  1024 (1024MB x1 Removable)
Maximum Memory   4 GB
Sockets          4
Slots/Banks      4
Can this be done using the Pivot transformation? If yes, then which columns should get which PivotUsage, and which should have the PivotKeyValue? I am getting a little confused here. Please help! Thanks.
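For what it's worth (verify against the docs for your version): in the 2005 Pivot transformation each input column gets a PivotUsage value - 1 for the column(s) that identify a row in the result (the set key), 2 for the column whose values become the new column names (the pivot key), 3 for the column supplying the values, and 0 for pass-through columns. So if the source is reshaped as (RecordID, Attribute, Value), RecordID would get PivotUsage 1, the Attribute column (Manufacturer, Model, System Type, ...) PivotUsage 2, and Value PivotUsage 3; each output column then gets a PivotKeyValue equal to the attribute name it represents.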
I am importing four large flat files and have some formatting issues with dates. I've figured out how to process the dates within a package exactly the way I desire. Unfortunately, the process has several steps that I don't want to have to repeat for each date field. Is it possible to call a reusable sequence of transformations that takes parameters and has a return value? Is there any other way to achieve similar results?
Hi all. In my data flow I have an "OLE DB Source" pointing to a table on my extract server; I need to do some transformations and stage the table, which will become a dimension in the staging DB.

Q1: I need only 3 columns from the source table. Which transformation do I need to use to extract just those 3 columns?

Q2: Two of the 3 columns I need to pass through as-is - no changes at all. The remaining column has values like "BOSTON....". My plan (as forum member Spirit1 advised - thanks!) is to use a Replace function on this city column to take out the dots, then write a condition: if BOSTON, assign the code "BOS", which becomes City_Code. This City_Code then has to be looked up in City_Dimension to get the City_Key_Number for Boston, and finally both City_Code and City_Key_Number have to be sent to the destination dimension.
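For Q2, a hedged sketch: the dot-stripping and code assignment can live in one Script Component (or a Derived Column using REPLACE), followed by a Lookup against City_Dimension to fetch City_Key_Number. Column names here are hypothetical:

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Strip the trailing dots, then map the city name to its code.
    Dim city As String = Row.City.Replace(".", "").Trim()
    If city.ToUpper() = "BOSTON" Then
        Row.CityCode = "BOS"
    Else
        Row.CityCode = city   ' handle the other cities as needed
    End If
End Sub

(For Q1, you don't need a transformation at all - just tick only the 3 columns you want in the OLE DB Source, or name them in its SELECT list.)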
I have a DTS package that is creating a table without a fixed number of columns. The number of columns depends on a couple of factors based on the data that I'm pulling from other tables.
After some processing I need to dump all the data in the "dynamic" table into an Excel doc. My problem is with the transformations within the Transform Data task. I don't know how many fields I will have in my table, and these need to be mapped to columns within the Excel doc. Is it possible to programmatically define the transformations within an ActiveX script, or what else can I do?
The following statement is from Microsoft documentation:
If you use the ExclusionGroup property to specify that rows should only go to one or another of a group of outputs, as in the Conditional Split transformation, you must call the DirectRow method to select the appropriate destination for each row. When you have an error output, you must call DirectErrorRow to send rows with problems to the error output instead of the default output.
I have a question about this because I have never used the "ExclusionGroup" property. For example, I have a script component where I specify 4 separate outputs, because I am sending different groups of rows to each output. I accomplish this programmatically using a lot of conditionals and it works fine.
I did not have to use the "ExclusionGroup" property to do this, so I'm not sure why I would ever need it, or DirectRow. I'm trying to understand this better, because I feel like I'm not understanding DirectRow, or how/when to use it.
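For what it's worth, ExclusionGroup is what makes that pattern explicit in a script component. A minimal sketch: give the component two synchronous outputs (hypothetically named PositiveRows and NegativeRows), set both outputs' ExclusionGroup to the same non-zero value, and SSIS then generates a DirectRowTo<OutputName> method per output; every row must be explicitly directed or it goes nowhere:

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' With a shared non-zero ExclusionGroup, each row goes only to the
    ' output we direct it to, instead of being copied to all outputs.
    If Row.Amount >= 0 Then
        Row.DirectRowToPositiveRows()
    Else
        Row.DirectRowToNegativeRows()
    End If
End Sub

If your four-output component already routes each row to exactly one output, the designer has almost certainly set ExclusionGroup for you behind the scenes - with the default of 0, synchronous outputs receive every row.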
I am using SQL 2005 SSIS. I need to do a data conversion for a date field in a txt file. I used the Import Wizard to bring my txt file into SQL 2005, but it didn't convert the date. The date is displayed in the flat file as 20070612. Can someone help me convert the date? I did add an OLE DB Source to the data flow and selected SQL command - what do I do next, and what do I write?
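If the column arrives as the 8-character string 20070612, one hedged option is to skip the SQL command and use a Derived Column instead, with an expression along the lines of (DT_DBDATE)(SUBSTRING(DateCol,1,4) + "-" + SUBSTRING(DateCol,5,2) + "-" + SUBSTRING(DateCol,7,2)), where DateCol is your (hypothetical) input column name. If you'd rather do it in the source's SQL command, T-SQL's CONVERT(datetime, DateCol, 112) parses the yyyymmdd style directly.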
I have a column which uses GETDATE as its DEFAULT in one of my tables. When I execute a DTS package to insert data into it, the column values are all the same; but if I use SSIS, the dates differ slightly (they jump by a few ticks every several rows, though not at a consistent row count).
Is there an explanation for this difference, and how can I correct this problem?
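The difference is plausibly down to batching: GETDATE() is evaluated when each insert statement runs, so rows committed in separate batches pick up slightly different times, and the two tools batch their inserts differently. A hedged workaround: rather than relying on the column DEFAULT, stamp the rows yourself with a Derived Column set to @[System::StartTime], which is fixed for the entire package execution, so every row receives an identical value no matter how the destination batches its inserts.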
I am trying to get the row counts of the source and target databases and insert them into a RowCounts table, and also to capture data about the package (name, start time, etc.) and insert it into a Logs table. How can I do this?
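A common hedged recipe (table, column, and variable names illustrative): drop a Row Count transformation into each data flow and point it at a package variable such as User::SourceRows, then finish the control flow with an Execute SQL Task running a parameterized statement like INSERT INTO dbo.PackageLog (PackageName, StartTime, SourceRows) VALUES (?, ?, ?), mapping the system variables @[System::PackageName] and @[System::StartTime] plus User::SourceRows to the three parameters.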
What is the difference between 'Fuzzy Lookup' and 'Lookup' transformations in SSIS? Any real-time scenario for better understanding?
Hello all. I am trying to write a query that "just" switches some data around so it is shown in a slightly different format. I am already able to do what I want in Oracle 8i, but I am having trouble making it work in SQL Server 2000. I am not a database newbie, but I can't seem to figure this one out, so I am turning to the newsgroup. I am thinking that some of the SQL gurus out there have done this very thing a thousand times before and the answer will be obvious to them. This message is pretty long but hopefully gives you enough information to replicate the issue.

There are 3 tables involved in my scenario. Potentially a lot more in the real application, but I'm trying to keep this example as simple as possible. In my database I have many "things". Let's call them "User Records" (table: users) for this example. My app allows the customer to create any number of custom "Extra Fields" (XF's) for a given User Record. The Extra Field definitions are stored in a table which we can call attribs. The actual XF values for a given user record are stored in a third table; let's call it users_attribs.

users_attribs will look something like this (actual DDL below):

UserID | ExtraFieldID | Value
--------------------------------------
User_1 | XF_1         | ham
User_1 | XF_2         | eggs
User_2 | XF_1         | bacon
User_2 | XF_2         | cheese
User_3 | XF_2         | onions

The end result is that I want a SQL query that returns something like this:

UserID | XF_1  | XF_2
-------------------------------------
User_1 | ham   | eggs
User_2 | bacon | cheese
User_3 | NULL  | onions

Potentially there would be one column for each extra field definition. One interesting question is how to get a dynamic number of columns to show up in the results (so new XF's show up automatically), but I'm not worried about that for now. Assume I will hard-code a specific set of extra fields into my query.

The key here is that all users must show up in the final result EVEN IF they don't have some extra field value defined. Since User_3 in the example above doesn't have an XF_1 record, we see a NULL in that column in the final result.

With Oracle I am able to accomplish this via an Outer Join, and I know SQL Server supports Outer Joins, but I can't seem to make it work. In every version I have tried so far, if any user is missing any extra field value, the entire row for the user goes "missing", and that is my problem.

It seems like one possible solution would be to just go ahead and populate the users_attribs table with a NULL value for that combination of user ID and extra field ID, basically adding a new row like this:

UserID | ExtraFieldID | Value
--------------------------------------
User_3 | XF_1         | NULL

I would like to avoid that if possible, for a number of reasons, particularly the question of *when* that NULL would be added. I don't want my report to touch the database and add stuff at reporting time if at all possible.
In Oracle, I seemingly don't have to, and I want to get to that point on SQL Server.

So, here is some specific DDL to recreate this scenario:

CREATE TABLE users (user_id varchar(60), username varchar(60));

-- Extra Field (attribs) definitions
CREATE TABLE attribs (xf_id varchar(60), xf_name varchar(60));

-- Extra Field values for Users
CREATE TABLE users_attribs (user_id varchar(60), xf_id varchar(60), val varchar(60));

-- populate the sample tables
-- sample User recs
INSERT INTO users VALUES ('U_1', 'John Smith');
INSERT INTO users VALUES ('U_2', 'Mary Rogers');

-- sample extra field definitions
INSERT INTO attribs VALUES ('XF_1', 'Extra Field 1');
INSERT INTO attribs VALUES ('XF_2', 'Extra Field 2');
INSERT INTO attribs VALUES ('XF_3', 'Extra Field 3');

-- sample values for User Extra Fields (XF's)
-- U_1 ("John Smith") has complete values for each XF
INSERT INTO users_attribs VALUES ('U_1', 'XF_1', 'XF_1 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_2', 'XF_2 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_3', 'XF_3 value for U_1');

-- U_2 ("Mary Rogers") only has one value, missing the other two..
INSERT INTO users_attribs VALUES ('U_2', 'XF_2', 'XF_2 value for U_2');

Now, I can get what I want on Oracle, provided that I define a new view that joins the three tables together, then do a separate query on that view that does an outer join. I could dispense with the view, but I don't want to hard-code the XF ID's into the query. I am fine with hardcoding the XF names, though. (Long story.)

-- Create a User Extra Field view that joins Users,
-- extra field definitions (attribs),
-- and values (users_attribs).
CREATE VIEW u_xf_view AS
SELECT u.user_id, at.xf_name, uxf.val
FROM
    users u,
    attribs at,
    users_attribs uxf
WHERE
    uxf.user_id = u.user_id AND
    uxf.xf_id = at.xf_id

-- Oracle-only outer join syntax works if you use the view:
SELECT
    t.username as "User Name",
    uxf1.val as "Extra Field 1 Value",
    uxf2.val as "Extra Field 2 Value",
    uxf3.val as "Extra Field 3 Value"
FROM
    users t,
    u_xf_view uxf1,
    u_xf_view uxf2,
    u_xf_view uxf3
WHERE
    uxf1.user_id(+) = t.user_id AND
    uxf1.xf_name(+) = 'Extra Field 1' AND
    uxf2.user_id(+) = t.user_id AND
    uxf2.xf_name(+) = 'Extra Field 2' AND
    uxf3.user_id(+) = t.user_id AND
    uxf3.xf_name(+) = 'Extra Field 3';

-- RESULTS (correct):
User Name    Extra Field 1 Value   Extra Field 2 Value   Extra Field 3 Value
-----------  --------------------  --------------------  --------------------
John Smith   XF_1 value for U_1    XF_2 value for U_1    XF_3 value for U_1
Mary Rogers  NULL                  XF_2 value for U_2    NULL
2 Row(s)

So far I have not been able to get the equivalent result in SQL Server. Like I said, I am really hoping to avoid populating those NULL values. Can anyone think of a way to replicate Oracle's behavior here? I have tried a number of variations on the ANSI join syntax instead of Oracle's (+) operator, but everything I have tried so far has only yielded a row when ALL extra fields are populated (or even worse behavior).

I greatly appreciate any assistance you may be able to give. I would be happy to provide any additional information if I forgot to mention something important. I apologize in advance for any broken / wrapped lines. Thank you for taking the time to read this. I'm going to be out of town for the next week or so, so I won't check for a response until then, but as soon as I get back home I will check back in the newsgroup.

Thank you!!

Preston Landers
pibble (at) yahoo (dot) com
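For what it's worth, the usual SQL Server fix is to keep the ANSI LEFT OUTER JOINs but move each xf_name filter out of the WHERE clause and into its join's ON clause - e.g. LEFT JOIN u_xf_view uxf1 ON uxf1.user_id = t.user_id AND uxf1.xf_name = 'Extra Field 1' - because a filter on the outer-joined table placed in WHERE discards the NULL-extended rows and quietly turns the outer join back into an inner join, which matches the symptom described.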
We receive thousands of files every week from various clients, and we attempt to clean the columns using the same technique over and over so the data is consistent. The problem is I don't see a way to reuse complex column transformations in different packages. I would hate to have to go change every package if we change the rules for cleaning a column.
So #1: Can you create some kind of script or .NET function that cleans a column and reuse it in multiple packages (or even in the same package)?
#2: Is it possible to call functions from the Derived Column expression builder?
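On #2: as far as I know, the Derived Column expression builder only exposes SSIS's built-in expression functions; it cannot call user-defined code. On #1: one hedged approach is to compile the cleaning rule into a small strong-named .NET assembly, install it in the GAC, and reference it from a Script Component in every package, so the rule lives in exactly one place. A sketch (names hypothetical):

' Shared library, compiled separately and installed in the GAC.
Public Class ColumnCleaner
    Public Shared Function Clean(ByVal text As String) As String
        If text Is Nothing Then Return Nothing
        ' The one, central cleaning rule - change it here, not per package.
        Return text.Replace(vbCr, "").Replace(vbLf, "") _
                   .Replace(",", "").Replace(vbNullChar, "")
    End Function
End Class

' In each package's Script Component:
'   Row.CustomerName = ColumnCleaner.Clean(Row.CustomerName)

Changing the rule then means updating one assembly rather than editing every package.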