Data Transformations, Or Something Like That, With SQL
Jul 30, 2007
Right, so I'm currently trudging through the SQL video tutorials and such, and I may well get to this sooner or later on my own, but as I'm under a deadline I thought I'd post this question beforehand so I can use the answer alongside what I'm learning now.
Here's my situation: I have an ASP.NET 2.0 site in which I currently use XML files to hold the text displayed on the page, and I transform that text using an XSL stylesheet. I want to move that data to a database, but I'm not sure of the best way to do it. What I'm most concerned with is storing the main text (paragraphs with embedded hyperlinks). Currently the XSL picks out the links and turns them from plain XML data into live links when they display on the page, but would I be able to do the same if I were pulling these paragraphs out of a database? Or should I just store the XML data in the database, and still pull it out so I can transform it with the XSL sheet I already have? (For that matter, can I dynamically write XML content to a database? Or am I just better off keeping my XML files?) What's the best approach for something like this?
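One hedged sketch of the keep-the-XML route, assuming SQL Server 2005's xml data type is available (the table and column names below are invented): the markup is stored intact, so the page can pull it back out and run the existing XSL sheet over it unchanged.

CREATE TABLE dbo.PageContent (
    PageId  int IDENTITY(1,1) PRIMARY KEY,
    BodyXml xml NOT NULL  -- paragraphs with embedded hyperlink markup
);

INSERT INTO dbo.PageContent (BodyXml)
VALUES ('<section><para>Read <link href="http://example.com">this</link>.</para></section>');

-- pull the stored XML back out and hand it to the existing XSLT step
SELECT BodyXml FROM dbo.PageContent WHERE PageId = 1;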
Thanks for the help!
View 4 Replies
Jun 18, 2008
I am using SQL 2005 SSIS. I need to do a data conversion for a date field in a txt file. I used the Import Wizard to bring my txt file into SQL 2005, but it didn't convert the date. The date is displayed in the flat file as 20070612. Can someone help me convert the date? I did add an OLE DB Source to the Data Flow screen and selected SQL command; what do I do next, and what do I write?
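For the conversion itself, a minimal sketch, assuming the value arrives as text: 20070612 is the yyyymmdd layout that CONVERT style 112 parses (the staging table and column names below are hypothetical).

SELECT CONVERT(datetime, '20070612', 112);  -- 2007-06-12 00:00:00.000

-- applied to a staged column:
SELECT CONVERT(datetime, DateText, 112) AS DateValue
FROM dbo.StagedFile;  -- assumed names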
View 9 Replies
Jul 20, 2005
Hello all. I am trying to write a query that "just" switches some data around so it is shown in a slightly different format. I am already able to do what I want in Oracle 8i, but I am having trouble making it work in SQL Server 2000. I am not a database newbie, but I can't seem to figure this one out, so I am turning to the newsgroup. I am thinking that some of the SQL gurus out there have done this very thing a thousand times before and the answer will be obvious to them. This message is pretty long but hopefully gives you enough information to replicate the issue.
There are 3 tables involved in my scenario. Potentially a lot more in the real application, but I'm trying to keep this example as simple as possible.
In my database I have many "things". Let's call them "User Records" (table: users) for this example. My app allows the customer to create any number of custom "Extra Fields" (XF's) for a given User Record. The Extra Field definitions are stored in a table which we can call attribs. The actual XF values for a given user record are stored in a third table, let's call it users_attribs.
users_attribs will look something like this (actual DDL below):
UserID | ExtraFieldID | Value
--------------------------------------
User_1 | XF_1         | ham
User_1 | XF_2         | eggs
User_2 | XF_1         | bacon
User_2 | XF_2         | cheese
User_3 | XF_2         | onions
The end result is that I want a SQL query that returns something like this:
UserID | XF_1  | XF_2
-------------------------------------
User_1 | ham   | eggs
User_2 | bacon | cheese
User_3 | NULL  | onions
Potentially there would be one column for each extra field definition. One interesting question is how to get a dynamic number of columns to show up in the results (so new XF's show up automatically), but I'm not worried about that for now. Assume I will hard-code a specific set of extra fields into my query.
The key here is that all users must show up in the final result EVEN IF they don't have some extra field value defined. Since User_3 in the example above doesn't have an XF_1 record, we see a NULL in that column in the final result.
With Oracle I am able to accomplish this via an Outer Join, and I know SQL Server supports Outer Joins, but I can't seem to make it work. In every version I have tried so far, if any user is missing any extra field value, the entire row for the user goes "missing", and that is my problem.
It seems like one possible solution would be to just go ahead and populate the users_attribs table with a NULL value for that combination of user ID and extra field ID, basically adding a new row like this:
UserID | ExtraFieldID | Value
--------------------------------------
User_3 | XF_1         | NULL
I would like to avoid that if possible, for a number of reasons, particularly the question of *when* that NULL would be added. I don't want my report to touch the database and add stuff at reporting time if at all possible.
In Oracle I seemingly don't have to, and I want to get to that point on SQL Server.
So, here is some specific DDL to recreate this scenario:
CREATE TABLE users (user_id varchar(60), username varchar(60));
-- Extra Field (attribs) definitions
CREATE TABLE attribs (xf_id varchar(60), xf_name varchar(60));
-- Extra Field values for Users
CREATE TABLE users_attribs (user_id varchar(60), xf_id varchar(60), val varchar(60));
-- populate the sample tables
-- sample User recs
INSERT INTO users VALUES ('U_1', 'John Smith');
INSERT INTO users VALUES ('U_2', 'Mary Rogers');
-- sample extra field definitions
INSERT INTO attribs VALUES ('XF_1', 'Extra Field 1');
INSERT INTO attribs VALUES ('XF_2', 'Extra Field 2');
INSERT INTO attribs VALUES ('XF_3', 'Extra Field 3');
-- sample values for User Extra Fields (XF's)
-- U_1 ("John Smith") has complete values for each XF
INSERT INTO users_attribs VALUES ('U_1', 'XF_1', 'XF_1 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_2', 'XF_2 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_3', 'XF_3 value for U_1');
-- U_2 ("Mary Rogers") only has one value, missing the other two...
INSERT INTO users_attribs VALUES ('U_2', 'XF_2', 'XF_2 value for U_2');
Now, I can get what I want on Oracle, provided that I define a new view that joins the three tables together, then do a separate query on that view that does an outer join. I could dispense with the view, but I don't want to hard-code the XF ID's into the query. I am fine with hardcoding the XF names, though. (Long story.)
-- Create a User Extra Field view that joins Users,
-- extra field definitions (attribs),
-- and values (users_attribs).
CREATE VIEW u_xf_view AS
SELECT u.user_id, at.xf_name, uxf.val
FROM users u, attribs at, users_attribs uxf
WHERE uxf.user_id = u.user_id
  AND uxf.xf_id = at.xf_id
-- Oracle-only outer join syntax works if you use the view:
SELECT
    t.username AS "User Name",
    uxf1.val   AS "Extra Field 1 Value",
    uxf2.val   AS "Extra Field 2 Value",
    uxf3.val   AS "Extra Field 3 Value"
FROM users t, u_xf_view uxf1, u_xf_view uxf2, u_xf_view uxf3
WHERE uxf1.user_id(+) = t.user_id AND uxf1.xf_name(+) = 'Extra Field 1'
  AND uxf2.user_id(+) = t.user_id AND uxf2.xf_name(+) = 'Extra Field 2'
  AND uxf3.user_id(+) = t.user_id AND uxf3.xf_name(+) = 'Extra Field 3';
-- RESULTS (correct):
User Name     Extra Field 1 Value   Extra Field 2 Value   Extra Field 3 Value
-----------   -------------------   -------------------   -------------------
John Smith    XF_1 value for U_1    XF_2 value for U_1    XF_3 value for U_1
Mary Rogers   NULL                  XF_2 value for U_2    NULL
2 Row(s)
So far I have not been able to get the equivalent result in SQL Server. Like I said, I am really hoping to avoid populating those NULL values. Can anyone think of a way to replicate Oracle's behavior here? I have tried a number of variations on the ANSI join syntax instead of Oracle's (+) operator, but everything I tried so far has only yielded a row when ALL extra fields are populated (or even worse behavior).
I greatly appreciate any assistance you may be able to give. I would be happy to provide any additional information if I forgot to mention something important. I apologize in advance for any broken / wrapped lines. Thank you for taking the time to read this. I'm going to be out of town for the next week or so, so I won't check for a response until then, but as soon as I get back home I will check back in the newsgroup.
Thank you!!
Preston Landers
pibble (at) yahoo (dot) com
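The ANSI equivalent that reproduces this in SQL Server 2000 is a LEFT OUTER JOIN against the view once per extra field, keeping each xf_name filter inside its ON clause; moving those filters to the WHERE clause is what makes a user's whole row disappear. A sketch (untested) against the DDL above:

SELECT
    t.username AS "User Name",
    uxf1.val   AS "Extra Field 1 Value",
    uxf2.val   AS "Extra Field 2 Value",
    uxf3.val   AS "Extra Field 3 Value"
FROM users t
LEFT OUTER JOIN u_xf_view uxf1
    ON uxf1.user_id = t.user_id AND uxf1.xf_name = 'Extra Field 1'
LEFT OUTER JOIN u_xf_view uxf2
    ON uxf2.user_id = t.user_id AND uxf2.xf_name = 'Extra Field 2'
LEFT OUTER JOIN u_xf_view uxf3
    ON uxf3.user_id = t.user_id AND uxf3.xf_name = 'Extra Field 3';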
View 2 Replies
Jan 10, 2007
Hello,
I created a slowly changing dimension object and used an OLE DB Source object with a SQL Command to feed it. After all the SCD objects are created, I get a warning about a truncation which might happen.
I adjust the source query with the right converts, so the query uses data types and lengths which match the target table. I update the OLE DB Source with the new query. Column names are unchanged.
Now how do I get this data flow to reflect the new column lengths? Do I have to throw away all the objects and recreate the SCD? The refresh button in the SCD object doesn't do it. This is also a problem when adding columns to your dimension table. Because I modify the stuff that the SCD generates, it's VERY tedious to always have to throw it all away and start over again.
Thnx. Jeroen.
View 7 Replies
Jan 2, 2007
Hi Folks,
I'm currently looking into SSIS, to establish whether or not it can improve on an existing stored procedure.
We have an sp that performs standard ETL functions: it extracts new (or newly updated) data out of A, transforms it, and then loads it into B. It runs as part of a scheduled job and takes approx 60 seconds to complete. Fine. But we want it to go faster, and this is where (we hope) SSIS comes in...
I'm approaching this area of SQL Server 2005 for the first time, and have been looking towards the data flow task and its transformations to provide an equivalent, faster solution. Before I continue down this road, however, I would welcome some peer feedback on whether SSIS, and its data flow transformations, are indeed the best tools to leverage when looking to accomplish such an ETL function quickly.
I guess the fundamental question here is: 'Why transforms over script?' I am reading Brian Knight's book, and I'd like to quote a few passages:
'...the nicest thing about transforms in SSIS is that it is all done in-memory and it no longer requires elaborate scripting as in SQL Server 2000 DTS...'
I guess this means that it doesn't need to be compiled/interpreted, which I suppose all DML does?
'...one of the overriding themes of SSIS is that you shouldn't have to write any code to create your transformation...'
Is this because writing code is considered a more complex task than creating + configuring a transformation, or is it (at least in part) because a transformation is necessarily going to be quicker than its DML equivalent?
Hope some of you guys can respond with some interesting thoughts.
Cheers,
Tamim Sadikali.
View 3 Replies
Mar 5, 2008
Hi, hope someone can help or point me down the right track.
I have to load 50+ tables (all with different file layouts) using a DataReader source (the source only allows an ODBC connection) into a SQL Server 2005 database. Rather than create 50+ ETL packages that are identical in process terms, is it possible to create a single package that dynamically re-maps the source-destination column mappings? I know I can set the source and destination using expressions, but how do I ensure that the mappings are updated, and can I automate the error outputs so that they redirect? Is this possible using a script task, or by some other means?
Many thanks for any help
View 6 Replies
May 25, 2006
I've read that SSIS tries to do all transformations in memory as a way of enhancing processing speed. What happens though if the amount of data processed exceeds the available RAM? Are raw files then used (similar to staging tables) or is an error generated?
Barkingdog
View 1 Replies
Nov 1, 2006
Hi
I am new to SSIS and have the following problem. I used the script below in DTS to clear data in columns of any CR/LF, commas, and char(0)'s. Can I just transfer this to SSIS, and how exactly do I do that? Any help or advice would be appreciated.
Function Main()
    Dim x
    ' loop over every column in the source row
    For x = 1 To DTSSource.Count
        If IsNull(DTSSource(x)) = False Then
            ' strip CR, LF and char(0); turn commas into spaces
            DTSDestination(x) = Replace(Replace(Replace(Replace(DTSSource(x), Chr(13), ""), Chr(10), ""), Chr(0), ""), ",", " ")
        Else
            DTSDestination(x) = DTSSource(x)
        End If
    Next
    Main = DTSTransformStat_OK
End Function
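If the data originates in SQL Server anyway, one alternative sketch is to push the same cleanup into the source query rather than a transformation (CleanMe and dbo.SourceTable are placeholder names); within SSIS itself, the equivalent would be a Derived Column or Script Component applying the same nested REPLACE per column.

-- strip CR, LF and char(0); turn commas into spaces
SELECT REPLACE(REPLACE(REPLACE(REPLACE(CleanMe,
       CHAR(13), ''), CHAR(10), ''), CHAR(0), ''), ',', ' ') AS CleanMe
FROM dbo.SourceTable;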
Andre
View 8 Replies
Jul 11, 2006
Hi all,
Can anyone tell me some links which can give me good insight into the SSIS transformations?
Thanks,
Praveen kumar Dayanithi
View 4 Replies
Jan 10, 2008
I want to change the rows of the following table to columns, to avoid repetition:
Manufacturer      AOpen
Model             s661FXm s661FXm Intel P4
System Type       Motherboard
Standard Memory   N/A
Maximum Memory    2 GB
Sockets           2
Slots/Banks       2
Manufacturer      HP/COMPAQ
Model             Presario SR1917FR AMD Athlon 64 X2 3.06 GHz
System Type       Desktop
Standard Memory   1024 (1024MB x1 Removable)
Maximum Memory    4 GB
Sockets           4
Slots/Banks       4
Can this be done using the Pivot transformation? If yes, which column should have which PivotUsage, and which column supplies the pivot key values? I am getting a little confused here.
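For orientation: in the Pivot transformation, PivotUsage 1 marks the column that identifies a record (this data would first need a machine/record ID), 2 marks the attribute-name column whose values become the new column names, and 3 marks the value column. The same reshaping in T-SQL, assuming a hypothetical dbo.Specs(MachineId, Attribute, Value) layout:

SELECT MachineId,
       Manufacturer, Model, [System Type], [Standard Memory],
       [Maximum Memory], Sockets, [Slots/Banks]
FROM (SELECT MachineId, Attribute, Value FROM dbo.Specs) AS s
PIVOT (MAX(Value) FOR Attribute IN
      (Manufacturer, Model, [System Type], [Standard Memory],
       [Maximum Memory], Sockets, [Slots/Banks])) AS p;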
Please help!!!
Thanks
View 6 Replies
Aug 3, 2006
I am importing four large flat files and have some formatting issues with dates. I've figured out how to process the dates within a package exactly the way I desire. Unfortunately, the process has several steps that I don't want to have to repeat for each date field. Is it possible to call a reusable sequence of transformations that takes parameters and has a return value? Is there any other way to achieve similar results?
View 1 Replies
May 23, 2007
Hi all,
In my Data Flow I have an OLE DB Source, which is a table on my Extract server. I need to do some transformations and stage the table, which will be a dimension in the staging DB.
Q1 - I need only 3 columns from the source table. Which transformation do I use to extract just those 3 columns?
Q2 - Two of the 3 columns need to pass through as-is, with no changes at all. The third column has values like "BOSTON....". I have a vague idea of what I need to do, but need some solid suggestions/advice to kick off. The plan is to use a Replace function on this city column (as forum member Spirit1 advised, thanks!) to take out the dots, then write a condition: if BOSTON, assign the code "BOS" as City_Code. This City_Code then has to be looked up in City_Dimension to get the City_Key_Number for Boston, and lastly both City_Code and City_Key_Number have to be passed on to the destination dimension.
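In data flow terms that plan is: trim the column list in the OLE DB Source query itself (Q1), then a Derived Column for the REPLACE and the code assignment, then a Lookup against City_Dimension for the key (Q2). A rough T-SQL rendering of the same logic, with guessed source names:

SELECT x.ColA, x.ColB,                -- the two pass-through columns (names assumed)
       x.City_Code, d.City_Key_Number
FROM (SELECT s.ColA, s.ColB,
             CASE WHEN REPLACE(s.City, '.', '') = 'BOSTON'
                  THEN 'BOS' END AS City_Code
      FROM dbo.SourceTable s) AS x    -- hypothetical source
LEFT JOIN dbo.City_Dimension d
       ON d.City_Code = x.City_Code;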
Any ideas /suggestions will be appreciated.
Thanks in advance...!!
ravi
View 5 Replies
Feb 7, 2006
Hi
I have a DTS package that is creating a table without a fixed number of columns; the number of columns depends on a couple of factors based on the data that I'm pulling from other tables.
After some processing I need to dump all the data in the "dynamic" table into an Excel doc. My problem is with the transformations within the Transform Data task: I don't know how many fields I will have in my table, and these need to be mapped to columns within the Excel doc. Is it possible to programmatically define the transformations within an ActiveX script, or what else can I do?
Thanks
Johnnie
View 1 Replies
Jan 5, 2008
The following statement is from Microsoft documentation:
If you use the ExclusionGroup property to specify that rows should only go to one or another of a group of outputs, as in the Conditional Split transformation, you must call the DirectRow method to select the appropriate destination for each row. When you have an error output, you must call DirectErrorRow to send rows with problems to the error output instead of the default output.
I have a question about this because I have never used the "ExclusionGroup" property. For example, I have a script component where I specify 4 separate outputs, because I am sending different groups of rows to each output. I accomplish this programmatically using a lot of conditionals and it works fine.
I did not have to use the "ExclusionGroup" property to do this, so I'm not sure why I would ever need it, or DirectRow. I'm trying to understand this better, because I feel like I'm missing what DirectRow is for, or how/when to use it.
Thanks
View 1 Replies
Jul 2, 2007
I have a column which uses GETDATE as its DEFAULT in one of my tables. When I execute a DTS package to insert data into it, the column values are all the same, but if I use SSIS, the dates differ slightly (by a few ticks after several rows, though not at a consistent number of rows).
Is there an explanation for this difference, and how can I correct this problem?
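If every row in a batch should share one timestamp, one workaround (a sketch, with hypothetical names) is to stamp the batch explicitly instead of relying on the column default; in SSIS the analogue is a Derived Column fed from @[System::StartTime], which is fixed for the whole run.

DECLARE @LoadTime datetime;
SET @LoadTime = GETDATE();  -- evaluated once for the whole batch

INSERT INTO dbo.TargetTable (SomeColumn, CreatedDate)  -- assumed names
SELECT SomeColumn, @LoadTime
FROM dbo.StagingTable;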
View 2 Replies
Jan 2, 2008
Hi,
I am trying to get the row counts of the source and target databases and insert them into a RowCounts table, and also to capture data about the package (name, start time, etc.) and insert it into a Logs table.
How can I do it?
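One common pattern, sketched with assumed names: Row Count transformations capture the counts into package variables, and an Execute SQL task then writes the audit row, mapping System::PackageName, System::StartTime and the two user variables to the ? placeholders.

-- statement for the Execute SQL task (OLE DB connection)
INSERT INTO dbo.PackageLog (PackageName, StartTime, SourceRows, TargetRows)
VALUES (?, ?, ?, ?);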
thanks,
Gokhan
View 31 Replies
May 29, 2007
In good old-fashioned DTS there was the ability to perform custom transformations using an ActiveX/VBScript-type language. Does this still exist, or are we stuck with the Derived Column editor?
View 3 Replies
Mar 5, 2007
What is the difference between 'Fuzzy Lookup' transformations and 'Lookup' transformations in SSIS? Any real-world scenario for better understanding?
View 1 Replies
Jul 27, 2007
We receive thousands of files every week from various clients, and we attempt to clean the columns using the same technique over and over so the data is consistent. The problem is I don't see a way to reuse complex column transformations in different packages. I would hate to have to go change every package if we change the rules for cleaning a column.
So #1: Can you create some kind of script or .net function that cleans a column and reuse it in multiple packages (or even in the same package)?
#2: Is it possible to call functions from the Derived Column expression builder?
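On #1, one hedged option when the cleanup can run in the database rather than mid-stream: centralize the rule as a scalar UDF and call it from every package's source or staging query, so a rule change happens in exactly one place. The function below is only a skeleton; the trim stands in for whatever the real cleaning rule is.

CREATE FUNCTION dbo.CleanColumn (@val varchar(8000))
RETURNS varchar(8000)
AS
BEGIN
    -- the house cleaning rule lives here, in one place;
    -- trimming is shown purely as a stand-in
    RETURN LTRIM(RTRIM(@val));
END;

-- each package's staging query then becomes, e.g.:
-- SELECT dbo.CleanColumn(SomeColumn) AS SomeColumn FROM dbo.Staging;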
Thanks!
View 3 Replies
Dec 2, 2003
I am at my wits end here. For Replication the Books Online clearly state:
"The option to allow transformations is set at the time you create a publication"
However, I cannot find any options that allow me to do this in the Create Publication Wizard.
Once the Publication has been created I see in the Properties in the Subscription Options tab that "Use DTS to transform data before distributing it to a Subscriber" is set to No and there is no way to change it.
Where am I going wrong?
View 1 Replies
Mar 8, 2001
In SQL 7 DTS I'm creating Data Driven Query tasks. When I get to writing the transformations in VBScript and try using the Test button to test the script, I get the following message:
Error Source: Microsoft Data Transformation Services Flat File Rowset
Provider
Error opening datafile. The system cannot find the file specified.
The documentation says:
[The transformation is tested] by executing it against a part of the source data and copying the results to a temporary text file, for preview purposes.
I can find no place to specify what temporary file is used for the test result and, because it is temporary, have assumed the system creates and deletes it as needed.
Is this true?
I get this test error for every transformation, even those that work correctly when the DTS package is actually executed.
What can I do to get the transformations to test properly?
Any help or hints will be greatly appreciated.
===============================
William J Brown
Systems Analyst
College of Human Medicine
A114 East Fee Hall
Michigan State University
East Lansing, MI 48824
Voice: (517) 432-7490
Fax: (517) 355-0342
Email: brownwj@msu.edu
View 3 Replies
Feb 19, 2007
Hi, I'm trying to port a pivot query from Access to SQL Server. I'm trying this query:
SELECT IDMerce,
       [1] AS [Department-1], [2] AS [Department-2],
       [3] AS [Department-3], [4] AS [Department-4]
FROM (SELECT IDMerce, Pezzi, IDMagazzino
      FROM Disponibilita) p
PIVOT (SUM(Pezzi) FOR IDMagazzino IN ([1], [2], [3], [4])) AS pvt;
This works, but in my case I don't know in advance how many transformations I need, so is there a solution?
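When the column list isn't known up front, the usual workaround is to build the PIVOT statement dynamically. A sketch using the names from the post (SQL Server 2005, untested):

DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- collect the distinct IDMagazzino values as a bracketed column list
SELECT @cols = ISNULL(@cols + ', ', '')
             + QUOTENAME(CONVERT(varchar(10), IDMagazzino))
FROM (SELECT DISTINCT IDMagazzino FROM Disponibilita) AS d;

SET @sql = N'SELECT IDMerce, ' + @cols + N'
FROM (SELECT IDMerce, Pezzi, IDMagazzino FROM Disponibilita) p
PIVOT (SUM(Pezzi) FOR IDMagazzino IN (' + @cols + N')) AS pvt;';

EXEC sp_executesql @sql;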
Thanks
View 1 Replies
May 10, 2006
Hi,
I saw something called custom properties for the Derived Column transformation on the MSDN site. I tried to use them in a simple package, but I am getting an error: "can't write to derivedoutputcolumnname.friendlyexpression". FriendlyExpression is one of the custom properties available for the Derived Column transformation's output columns.
The steps I followed to get to this error are as follows:
1) Get data from a table using OLEDB Source. Suppose I am getting firstName, LastName etc.
2) Derived column input is values from the above OLEDB Source.
3) I have added a new column called "Concatenated name", which is the concatenated value of the first and last names.
4) Then in the properties editor of this data flow task, under the Expressions option, I clicked the ellipsis. I got a property expressions editor containing two columns, "Property" and "Expression". The Property column contains a dropdown with the FriendlyExpression property for the derived columns, and the Expression column is a text box where we can enter the expression to be evaluated for the corresponding FriendlyExpression property.
5) Now when I click OK and try to debug, it gives an error: "Can't write to concatenatedname.friendlyexpression".
If anybody has already faced this problem and solved it, please let me know, because I have been stuck here a long time.
Thanks & Regards,
Sreekanth Ammisetty
View 1 Replies
Jun 13, 2006
Greetings SSIS friends,
I have been attempting to implement one of our numerous ETL processes in SSIS but hit a brick wall when I tried replacing a complex stored procedure with a series of Merge Join components.
In the end I had to settle for using a SQL task which merely calls the stored procedure, and this proved to be the better option, as the version using only SSIS components took forever to run.
How do people cope with complex transformations? Do you opt for pure T-SQL for complex transformations, and use SSIS components for control flow plus simple(ish) data flow tasks?
I am confused.
View 23 Replies
Apr 20, 2007
I'm trying to find out if there is a combination of data flow transformations that will produce the following result:
SELECT
period,
project,
task,
employee = CASE
when empid in (SELECT DISTINCT empid FROM EmpTable) then empid
else 'Deleted Employee'
end
FROM ProjectTable
I know I can create a dataflow task with this query as a data source and then send it to a destination, but I was wondering if that is the best way to do it or if there was a better way to do this using the data transformations available in SSIS.
Any insight would be most appreciated.
Regards,
Bill Webster
View 4 Replies
Dec 5, 2005
Until there's an Integration Services 2.0, what custom components would you most like to see examples of? The documentation team is starting work on the 2nd Web refresh of Books Online and SQL Server samples, anticipated for release around April, and may be able to incorporate some requests as samples or BOL topics.
View 26 Replies
Sep 16, 2006
Hi all. I am trying to convert the string "(null)" in the [PASSWORD] column of my table to an actual NULL value. I have tried two different forms of a conditional operator to achieve this end, but I am getting the errors below; both can be summed up with the same statement:
a DT_STR operand cannot be used with the conditional operation. The first expression below, however, uses a type DT_I4 in the conditional clause, as this is what FINDSTRING returns, hence the equivalency test against the literal integer 0. So I must say I am somewhat confused by this. Does anyone know why neither of the statements below is working?
Also, is there an easy way to accomplish what I am trying to do: convert the string "(null)" in the [PASSWORD] column of my table to an actual NULL value?
FINDSTRING([PASSWORD], "(null)", 1) == 0 ? [PASSWORD] : NULL(DT_STR, 255, 1252)
Error at Administrator Data Flow Task [Derived Column [1985]]: For operands of the conditional operator, the data type DT_STR is supported only for input columns and cast operations. The expression "FINDSTRING(PASSWORD,"(null)",1) == 0 ? PASSWORD : NULL(DT_STR,255,1252)" has a DT_STR operand that is not an input column or the result of a cast, and cannot be used with the conditional operation. To perform this operation, the operand needs to be explicitly cast with a cast operator.
LOWER( TRIM( [PASSWORD] ) ) != "(null)" ? [PASSWORD] : NULL(DT_STR, 255, 1252)
Error at Administrator Data Flow Task [Derived Column [1985]]: For operands of the conditional operator, the data type DT_STR is supported only for input columns and cast operations. The expression "LOWER(TRIM(PASSWORD)) != "(null)" ? PASSWORD : NULL(DT_STR,255,1252)" has a DT_STR operand that is not an input column or the result of a cast, and cannot be used with the conditional operation. To perform this operation, the operand needs to be explicitly cast with a cast operator.
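Setting the expression errors aside, one hedged alternative sidesteps the SSIS expression language entirely: if the rows come from a SQL source, NULLIF in the source query returns NULL exactly when the column equals the literal '(null)' (the table name below is a guess).

SELECT NULLIF([PASSWORD], '(null)') AS [PASSWORD]
FROM dbo.Administrators;  -- hypothetical source table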
View 4 Replies
Mar 31, 2008
I'm trying to populate a table with fields of date type [DT_DATE] using the Slowly Changing Dimension transformation component. When I add the date fields to the component it will not build the stream. The wizard fails and tells me the date fields are not of the same type. The fields in the destination table are of type "date" and the input columns are of type [DT_DATE]. Am I missing something?
View 3 Replies
Nov 24, 2006
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they would occupy too much space? I really need guidance on that.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
Dec 14, 2005
After testing the application I wrote on my local PC, I deployed it to the web server to test it out, and I get this error:
System.Data.SqlClient.SqlException: The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.
Notes: all pages that have this error have either a Repeater or a DataGrid which loads data during page load.
At first I thought the problem was with the date, but then I can see that some other pages that have a DataGrid (with a date field) work just fine.
Has anyone had this problem before? Hopefully you guys can help.
Thanks,
View 4 Replies
Dec 4, 2007
I have used both data readers and data adapters(with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly but I want to be sure I am developing good habits.
As the name might suggest, it seems like a DataReader is for only reading data. I have read that the data adapter and DataSet are for a disconnected architecture, or that they can be used for that type of setup. I have been using the data adapter and DataSets when writing to a database, and the DataReader when reading from a database.
Is this how these should be used? Is the DataReader the best choice for reading data? Am I doing this the optimal way from a performance standpoint?
......................................................thanks in advance
View 1 Replies
Nov 2, 2015
We have already integrated different client data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
View 4 Replies
Oct 18, 2006
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
View 7 Replies