How Can I Use Audit And Row Count Transformations?
Jan 2, 2008
Hi,
I am trying to capture the row counts of the source and target databases and insert them into a row-counts table, and also to capture data such as the package name, start time, etc. and insert it into a logs table.
How can I do this?
thanks,
Gokhan
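A minimal sketch of the kind of logging this describes, with hypothetical table names; in SSIS the row counts would typically be captured into variables by Row Count transformations, the package name and start time come from system variables such as System::PackageName and System::StartTime, and an Execute SQL Task runs the parameterized inserts:

-- Hypothetical logging tables
CREATE TABLE dbo.RowCounts (
    RowCountID  int IDENTITY(1,1) PRIMARY KEY,
    TableName   varchar(128),
    SourceRows  int,
    TargetRows  int,
    LoggedAt    datetime DEFAULT GETDATE()
);

CREATE TABLE dbo.PackageLogs (
    LogID       int IDENTITY(1,1) PRIMARY KEY,
    PackageName varchar(128),
    StartTime   datetime,
    EndTime     datetime
);

-- Statements an Execute SQL Task could run, with ? mapped to package variables
INSERT INTO dbo.RowCounts (TableName, SourceRows, TargetRows) VALUES (?, ?, ?);
INSERT INTO dbo.PackageLogs (PackageName, StartTime, EndTime) VALUES (?, ?, ?);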
View 31 Replies
Apr 22, 2015
Can someone tell me the difference between the Audit transformation and the Row Count transformation? Both expose environment (system) variables.
The only difference I am finding is that Row Count returns the count of rows it processes.
Apart from that, is there any other difference?
Please describe a scenario where I would need to use the Audit transformation.
View 3 Replies
View Related
Nov 1, 2014
I have created a server security audit and, in a database audit specification, specified auditing of SELECT for a certain user on a certain table. I logged in as this user and ran a SELECT statement. When I run this query:
select * from sys.fn_get_audit_file('d:\Audit\audit1*', null, null)
it returns a value showing the time at which the query was run.
After 15 minutes I repeated the same action, ran the audit query again, and the same result was shown. Isn't it supposed to return a list of values showing each time this user ran a SELECT statement on that table, for example at 5:00 PM, then 6:00 PM, and so on?
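For reference, sys.fn_get_audit_file returns one row per audited event, so a query like the sketch below (event_time, action_id, and statement are columns the function actually returns) should list each audited SELECT separately; if only one row comes back, the later SELECTs may not have been audited at all:

SELECT event_time,
       action_id,                      -- 'SL' = SELECT
       session_server_principal_name,  -- who ran it
       statement                       -- the audited statement text
FROM sys.fn_get_audit_file('d:\Audit\audit1*', NULL, NULL)
ORDER BY event_time;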
View 1 Replies
View Related
Jan 18, 2006
I need help... here is the problem. Last weekend, the servers in our datacenter were moved around. After this move, and maybe coincidentally, one server is performing very poorly. After running a trace with SQL Profiler, I saw the problem, which was later confirmed with another tool for SQL Server performance monitoring. It seems that all connections to the SQL server (between 200 and 400) are doing a login/logout for each command that they process. For example, the user's connection will log in, perform a SELECT, and then log out. This is not a .NET application. The client software was not changed; it is still the same. The vendor has said that it is not supposed to do that: it is supposed to use one connection that logs on in the morning and logs off at the end of the day or whenever the user exits. One user may have several connections to the database.
At times, the server is processing over 250 login/logouts (averaged over a 30-second period). Has anyone seen this problem? I have the server set to audit failures only. The server has become very unresponsive; things that took 3 seconds now take over 15 seconds.
Any ideas???
View 6 Replies
View Related
Jul 11, 2006
Hi all,
Can anyone point me to some links that give good insight into SSIS transformations?
Thanks,
Praveen kumar Dayanithi
View 4 Replies
View Related
Jul 30, 2007
Right, so I'm currently trudging through the SQL video tutorials and such, so it may be that I get to this sooner or later, but as I'm under a deadline, I thought I'd post this question beforehand so I can use that info with what I'm learning now.
Here's my situation: I have an ASP.NET 2.0 site in which I currently use XML files to display the text on the page, and I transform that text using an XSL stylesheet. I want to move that data to a database, but I'm not sure what the best way to do that is. Basically what I'm most concerned with is storing the main text (paragraphs with embedded hyperlinks).
Currently, I can get the XSL to pick out the links and transform them from plain XML data to live links when they display on the page, but would I be able to do the same if I were pulling these paragraphs out of a database? Or should I just store the XML data in the database, and still pull that out so I can transform it appropriately with the XSL sheet I already have? (For that matter, can I dynamically write XML content to a database? Or am I just better off keeping my XML files?) What's the best approach for something like this?
Thanks for the help!
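One option the post circles around is to keep the markup as-is and store it in SQL Server 2005's xml column type, then feed the fragment to the existing XSL transform; a minimal sketch with hypothetical table and column names:

-- Hypothetical storage for the page fragments the XSL stylesheet consumes
CREATE TABLE dbo.PageContent (
    PageID int IDENTITY(1,1) PRIMARY KEY,
    Body   xml NOT NULL
);

INSERT INTO dbo.PageContent (Body)
VALUES ('<page><para>Some text with an <a href="http://example.com">embedded link</a>.</para></page>');

-- Pull the fragment back out and hand it to the XSL transform in ASP.NET
SELECT Body FROM dbo.PageContent WHERE PageID = 1;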
View 4 Replies
View Related
Jan 10, 2008
I want to change the rows of the following table to columns, to avoid repetition:
Manufacturer:     AOpen
Model:            s661FXm s661FXm Intel P4
System Type:      Motherboard
Standard Memory:  N/A
Maximum Memory:   2 GB
Sockets:          2
Slots/Banks:      2

Manufacturer:     HP/COMPAQ
Model:            Presario SR1917FR AMD Athlon 64 X2 3.06 GHz
System Type:      Desktop
Standard Memory:  1024 (1024MB x1 Removable)
Maximum Memory:   4 GB
Sockets:          4
Slots/Banks:      4
Can this be done using the Pivot transformation? If yes, then which columns get which PivotUsage values, and which column supplies the PivotKeyValue? I am getting a little confused here.
Please help!!!
Thanks
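For what it's worth, in the SSIS Pivot transformation PivotUsage 1 marks the set-key column (a record identifier), 2 marks the pivot-key column (the attribute name, e.g. "Maximum Memory"), 3 marks the pivoted value, and 0 passes a column through; the PivotKeyValue on each output column is the attribute name it should pick up. The same reshaping in T-SQL, assuming a hypothetical dbo.Specs (RecordID, Attribute, Value) layout for the data above:

SELECT RecordID,
       [Manufacturer], [Model], [System Type], [Standard Memory],
       [Maximum Memory], [Sockets], [Slots/Banks]
FROM (SELECT RecordID, Attribute, Value FROM dbo.Specs) s
PIVOT (MAX(Value) FOR Attribute IN
       ([Manufacturer], [Model], [System Type], [Standard Memory],
        [Maximum Memory], [Sockets], [Slots/Banks])) AS p;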
View 6 Replies
View Related
Aug 3, 2006
I am importing four large flat files and have some formatting issues with dates. I've figured out how to process the dates within a package exactly the way I want. Unfortunately, the process has several steps that I don't want to have to repeat for each date field. Is it possible to call a reusable sequence of transformations that takes parameters and has a return value? Is there any other way to achieve similar results?
View 1 Replies
View Related
May 23, 2007
Hi all,
In my data flow I set up an OLE DB Source, which reads a table on my extract server. I need to do some transformations and stage the table, which will become a dimension in the staging DB.
Q1 - I need only 3 columns from the source table. Which transformation do I use to extract just those 3 columns?
Q2 - Two of the 3 columns I need to pass through as-is, with no changes at all. The third column has values like "BOSTON....".
I have a vague idea of what I need to do, but I need some solid suggestions/advice to kick off. The plan is to take this city column and use a Replace function (as forum member Spirit1 advised, thanks!) to strip out the dots, then write a condition: if the value is BOSTON, assign the code "BOS" as City_Code. This City_Code then has to be looked up in City_Dimension to get the City_Key_Number for Boston, and finally both City_Code and City_Key_Number have to be sent to the destination dimension.
Any ideas /suggestions will be appreciated.
Thanks in advance...!!
ravi
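A rough T-SQL sketch of what Q2 describes, with hypothetical source and dimension layouts; in SSIS this would typically be a Derived Column using REPLACE to strip the dots plus a Lookup against City_Dimension:

SELECT s.Col1,
       s.Col2,
       c.City_Code,
       d.City_Key_Number
FROM dbo.SourceTable s
CROSS APPLY (SELECT CASE REPLACE(s.City, '.', '')
                        WHEN 'BOSTON' THEN 'BOS'
                    END AS City_Code) c
LEFT JOIN dbo.City_Dimension d
       ON d.City_Code = c.City_Code;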
View 5 Replies
View Related
Feb 7, 2006
Hi
I have a DTS package that creates a table without a fixed number of columns. The number of columns depends on a couple of factors based on the data that I'm pulling from other tables.
After some processing I need to dump all the data in the "dynamic" table into an Excel document. My problem is with the transformations within the Transform Data task. I don't know how many fields I will have in my table, and these need to be mapped to columns within the Excel document. Is it possible to programmatically define the transformations within an ActiveX script, or what can I do?
Thanks
Johnnie
View 1 Replies
View Related
Jan 5, 2008
The following statement is from Microsoft documentation:
If you use the ExclusionGroup property to specify that rows should only go to one or another of a group of outputs, as in the Conditional Split transformation, you must call the DirectRow method to select the appropriate destination for each row. When you have an error output, you must call DirectErrorRow to send rows with problems to the error output instead of the default output.
I have a question about this because I have never used the "ExclusionGroup" property. For example, I have a script component where I specify 4 separate outputs, because I am sending different groups of rows to each output. I accomplish this programmatically using a lot of conditionals and it works fine.
I did not have to use the "ExclusionGroup" property to do this, so I'm not sure why I would ever need it, or need to use DirectRow. I'm trying to understand this better, because I feel like I'm not understanding DirectRow, or how and when to use it.
Thanks
View 1 Replies
View Related
Jun 18, 2008
I am using SQL 2005 SSIS. I need to do a data conversion for a date field in a txt file. I used the Import Wizard to bring my txt file into SQL 2005, but it didn't convert the date. The date is displayed in the flat file as 20070612. Can someone help me convert the date? I added an OLE DB Source to the Data Flow screen and selected SQL command: what do I do next, and what do I write?
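If the conversion is pushed into the SQL command, style 112 is the built-in yyyymmdd format; a minimal sketch, with hypothetical table and column names:

SELECT CONVERT(datetime, DateText, 112) AS ImportDate  -- '20070612' -> 2007-06-12
FROM dbo.ImportedRows;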
View 9 Replies
View Related
Jul 2, 2007
I have a column that uses a DEFAULT of GETDATE() in one of my tables. When I execute a DTS package to insert data into it, the column values are all the same, but if I use SSIS, the dates differ slightly (by a few ticks after several rows, though not at a consistent number of rows).
Is there an explanation for this difference, and how can I correct the problem?
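If every row of a load should share one timestamp, one workaround is to capture the time once and insert it explicitly instead of relying on the column default; a sketch with hypothetical table names (the SSIS-side equivalent is a Derived Column fed from the System::StartTime variable):

DECLARE @BatchTime datetime;
SET @BatchTime = GETDATE();  -- evaluated once for the whole load

INSERT INTO dbo.TargetTable (SomeColumn, LoadDate)
SELECT SomeColumn, @BatchTime
FROM dbo.SourceTable;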
View 2 Replies
View Related
May 29, 2007
In good old-fashioned DTS there was the ability to perform custom transformations using an ActiveX/VBScript-type language. Does this still exist, or are we stuck with the Derived Column editor?
View 3 Replies
View Related
Mar 5, 2007
What is the difference between 'Fuzzy Lookup' transformations and 'Lookup' transformations in SSIS? Any real-world scenario for better understanding?
View 1 Replies
View Related
Jul 20, 2005
Hello all. I am trying to write a query that "just" switches some data around so it is shown in a slightly different format. I am already able to do what I want in Oracle 8i, but I am having trouble making it work in SQL Server 2000. I am not a database newbie, but I can't seem to figure this one out, so I am turning to the newsgroup. I am thinking that some of the SQL gurus out there have done this very thing a thousand times before and the answer will be obvious to them.
This message is pretty long but hopefully gives you enough information to replicate the issue.
There are 3 tables involved in my scenario. Potentially a lot more in the real application, but I'm trying to keep this example as simple as possible.
In my database I have many "things". Let's call them "User Records" (table: users) for this example. My app allows the customer to create any number of custom "Extra Fields" (XF's) for a given User Record. The Extra Field definitions are stored in a table which we can call attribs. The actual XF values for a given user record are stored in a third table, let's call it users_attribs.
users_attribs will look something like this (actual DDL below):

UserID | ExtraFieldID | Value
--------------------------------------
User_1 | XF_1         | ham
User_1 | XF_2         | eggs
User_2 | XF_1         | bacon
User_2 | XF_2         | cheese
User_3 | XF_2         | onions

The end result is that I want a SQL query that returns something like this:

UserID | XF_1  | XF_2
-------------------------------------
User_1 | ham   | eggs
User_2 | bacon | cheese
User_3 | NULL  | onions

Potentially there would be one column for each extra field definition. One interesting question is how to get a dynamic number of columns to show up in results (so new XF's show up automatically), but I'm not worried about that for now. Assume I will hard-code a specific set of extra fields into my query.
The key here is that all users must show up in the final result EVEN IF they don't have some extra field value defined. Since User_3 in the example above doesn't have an XF_1 record, we see a NULL in that column in the final result.
With Oracle I am able to accomplish this via an Outer Join, and I know SQL Server supports Outer Joins, but I can't seem to make it work. In every version I have tried so far, if any user is missing any extra field value, the entire row for the user goes "missing", and that is my problem.
It seems like one possible solution would be to just go ahead and populate the users_attribs table with a NULL value for that combination of user ID and extra field ID, basically adding a new row like this:

UserID | ExtraFieldID | Value
--------------------------------------
User_3 | XF_1         | NULL

I would like to avoid that if possible, for a number of reasons, particularly the question of *when* that NULL would be added. I don't want my report to touch the database and add stuff at reporting time if at all possible.
In Oracle, I seemingly don't have to, and I want to get to that point on SQL Server. So, here is some specific DDL to recreate this scenario:

CREATE TABLE users (user_id varchar(60), username varchar(60));

-- Extra Field (attribs) definitions
CREATE TABLE attribs (xf_id varchar(60), xf_name varchar(60));

-- Extra Field values for Users
CREATE TABLE users_attribs (user_id varchar(60), xf_id varchar(60), val varchar(60));

-- populate the sample tables
-- sample User recs
INSERT INTO users VALUES ('U_1', 'John Smith');
INSERT INTO users VALUES ('U_2', 'Mary Rogers');

-- sample extra field definitions
INSERT INTO attribs VALUES ('XF_1', 'Extra Field 1');
INSERT INTO attribs VALUES ('XF_2', 'Extra Field 2');
INSERT INTO attribs VALUES ('XF_3', 'Extra Field 3');

-- sample values for User Extra Fields (XF's)
-- U_1 ("John Smith") has complete values for each XF
INSERT INTO users_attribs VALUES ('U_1', 'XF_1', 'XF_1 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_2', 'XF_2 value for U_1');
INSERT INTO users_attribs VALUES ('U_1', 'XF_3', 'XF_3 value for U_1');
-- U_2 ("Mary Rogers") only has one value, missing the other two..
INSERT INTO users_attribs VALUES ('U_2', 'XF_2', 'XF_2 value for U_2');

Now, I can get what I want on Oracle, provided that I define a new view that joins the three tables together, then do a separate query on that view that does an outer join. I could dispense with the view, but I don't want to hard-code the XF ID's into the query. I am fine with hardcoding the XF names, though. (Long story.)

-- Create a User Extra Field view that joins Users,
-- extra field definitions (attribs),
-- and values (users_attribs)
CREATE VIEW u_xf_view AS
SELECT u.user_id, at.xf_name, uxf.val
FROM
    users u,
    attribs at,
    users_attribs uxf
WHERE
    uxf.user_id = u.user_id AND
    uxf.xf_id = at.xf_id

-- Oracle-only outer join syntax works if you use the view:
SELECT
    u.username as "User Name",
    uxf1.val as "Extra Field 1 Value",
    uxf2.val as "Extra Field 2 Value",
    uxf3.val as "Extra Field 3 Value"
FROM
    users u,
    u_xf_view uxf1,
    u_xf_view uxf2,
    u_xf_view uxf3
WHERE
    uxf1.user_id(+) = u.user_id AND
    uxf1.xf_name(+) = 'Extra Field 1' AND
    uxf2.user_id(+) = u.user_id AND
    uxf2.xf_name(+) = 'Extra Field 2' AND
    uxf3.user_id(+) = u.user_id AND
    uxf3.xf_name(+) = 'Extra Field 3';

-- RESULTS (correct):
User Name    Extra Field 1 Value  Extra Field 2 Value  Extra Field 3 Value
-----------  -------------------  -------------------  -------------------
John Smith   XF_1 value for U_1   XF_2 value for U_1   XF_3 value for U_1
Mary Rogers  NULL                 XF_2 value for U_2   NULL
2 Row(s)

So far I have not been able to get the equivalent result in SQL Server. Like I said, I am really hoping to avoid populating those NULL values. Can anyone think of a way to replicate Oracle's behavior here? I have tried a number of variations on the ANSI join syntax instead of Oracle's (+) operator, but everything I tried so far has only yielded a row when ALL extra fields are populated (or even worse behavior).
I greatly appreciate any assistance you may be able to give. I would be happy to provide any additional information if I forgot to mention something important. I apologize in advance for any broken/wrapped lines. Thank you for taking the time to read this.
I'm going to be out of town for the next week or so, so I won't check for a response until then, but as soon as I get back home I will check back in the newsgroup.
Thank you!!
Preston Landers
pibble (at) yahoo (dot) com
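For reference, the usual ANSI-join answer is to move the extra-field-name filters into the ON clauses of LEFT OUTER JOINs; putting them in the WHERE clause is what turns the outer join back into an inner join and makes rows go "missing". A sketch against the poster's own view:

SELECT
    u.username AS "User Name",
    uxf1.val   AS "Extra Field 1 Value",
    uxf2.val   AS "Extra Field 2 Value",
    uxf3.val   AS "Extra Field 3 Value"
FROM users u
LEFT OUTER JOIN u_xf_view uxf1
    ON uxf1.user_id = u.user_id AND uxf1.xf_name = 'Extra Field 1'
LEFT OUTER JOIN u_xf_view uxf2
    ON uxf2.user_id = u.user_id AND uxf2.xf_name = 'Extra Field 2'
LEFT OUTER JOIN u_xf_view uxf3
    ON uxf3.user_id = u.user_id AND uxf3.xf_name = 'Extra Field 3';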
View 2 Replies
View Related
Jul 27, 2007
We receive thousands of files every week from various clients, and we attempt to clean the columns using the same technique over and over so the data is consistent. The problem is I don't see a way to reuse complex column transformations in different packages. I would hate to have to change every package if we change the rules for cleaning a column.
So #1: Can you create some kind of script or .NET function that cleans a column and reuse it in multiple packages (or even in the same package)?
#2: Is it possible to call functions from the Derived Column expression builder?
Thanks!
View 3 Replies
View Related
Dec 2, 2003
I am at my wit's end here. For replication, Books Online clearly states:
"The option to allow transformations is set at the time you create a publication"
However, I cannot find any options that allow me to do this in the Create Publication Wizard.
Once the Publication has been created I see in the Properties in the Subscription Options tab that "Use DTS to transform data before distributing it to a Subscriber" is set to No and there is no way to change it.
Where am I going wrong?
View 1 Replies
View Related
Mar 8, 2001
In SQL 7 DTS I'm creating a Data Driven Query Task. When I get to writing the transformations in VBScript and try to use the Test button to test the script, I get the following message:
Error Source: Microsoft Data Transformation Services Flat File Rowset
Provider
Error opening datafile. The system cannot find the file specified.
The documentation says:
[The transformation is tested] by executing it against a part of the
source data and copying the results to a temporary text file, for
preview purposes.
I can find no place to specify what temporary file is used for the test result and, because it is temporary, have assumed the system creates and
deletes it as needed.
Is this true?
I get this test error for every transformation, even those that work correctly when the DTS package is actually executed.
What can I do to get the transformations to test properly?
Any help or hints will be greatly appreciated.
===============================
William J Brown
Systems Analyst
College of Human Medicine
A114 East Fee Hall
Michigan State University
East Lansing, MI 48824
Voice: (517) 432-7490
Fax: (517) 355-0342
Email: brownwj@msu.edu
View 3 Replies
View Related
Feb 19, 2007
Hi, I'm trying to port a pivot query from Access to SQL Server.
I'm trying this query:
SELECT IDMerce,
       [1] AS [Department-1],
       [2] AS [Department-2],
       [3] AS [Department-3],
       [4] AS [Department-4]
FROM (SELECT IDMerce, Pezzi, IDMagazzino
      FROM Disponibilita) p
PIVOT (SUM(Pezzi) FOR IDMagazzino IN ([1], [2], [3], [4])) AS pvt
This works, but in my case I don't know in advance how many transformations I need. Is there a solution?
Thanks
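When the column list isn't known in advance, the standard workaround is to build the PIVOT statement dynamically from the distinct key values; a sketch against the poster's tables, using SQL Server 2005's FOR XML PATH trick to assemble the column list (note the dynamic version names output columns after the key values rather than the Department-N aliases):

DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build "[1],[2],[3],..." from whatever IDMagazzino values exist
SELECT @cols = STUFF((SELECT ',' + QUOTENAME(CONVERT(varchar(10), IDMagazzino))
                      FROM (SELECT DISTINCT IDMagazzino FROM Disponibilita) d
                      ORDER BY IDMagazzino
                      FOR XML PATH('')), 1, 1, '');

SET @sql = N'SELECT IDMerce, ' + @cols + N'
FROM (SELECT IDMerce, Pezzi, IDMagazzino FROM Disponibilita) p
PIVOT (SUM(Pezzi) FOR IDMagazzino IN (' + @cols + N')) AS pvt;';

EXEC sp_executesql @sql;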
View 1 Replies
View Related
Jan 10, 2007
Hello,
I created a slowly changing dimension object and used an OLE DB Source object with a SQL Command to feed it. After all the SCD objects are created, I get a warning about a truncation which might happen.
I adjust the source query with the right converts, so the query uses data types and lengths which match the target table. I update the OLE DB Source with the new query. Column names are unchanged.
Now how do I get this data flow to reflect the new column lengths? Do I have to throw away all the objects and recreate the SCD? The refresh button in the SCD object doesn't do it. This is also a problem when adding columns to your dimension table. Because I modify the objects that the SCD generates, it's VERY tedious to always have to throw it all away and start over again.
Thnx. Jeroen.
View 7 Replies
View Related
May 10, 2006
Hi,
I saw something called custom properties for the Derived Column transformation on the MSDN site. I tried to use them in a simple package, but I am getting an error: "can't write to derivedoutputcolumnname.friendlyexpression". FriendlyExpression is one of the custom properties available on the Derived Column transformation's output columns.
The steps I followed to get to this error are as follows:
1) Get data from a table using OLEDB Source. Suppose I am getting firstName, LastName etc.
2) The Derived Column input is the set of values from the OLE DB Source above.
3) I added a new column called "Concatenated name", which is the concatenated value of the first and last names.
4) Then, in the properties editor of this data flow task, under the Expressions option, I clicked the ellipsis. I got a property expressions editor containing two columns, "Property" and "Expression". The Property column contains a dropdown with the FriendlyExpression property for the derived columns, and the Expression column is a text box where we can enter the expression to be evaluated for the corresponding FriendlyExpression property.
5) Now when I click OK and try to debug, it gives an error: "Can't write to concatenatedname.friendlyexpression".
If anybody has already faced this problem and solved it, please let me know, because I have been stuck here a long time.
Thanks & Regards,
Sreekanth Ammisetty
View 1 Replies
View Related
Jan 2, 2007
Hi Folks,
I'm currently looking into SSIS, to establish whether or not it can improve on an existing stored procedure.
We have an sp that performs standard ETL functions: it extracts new (or newly updated) data out of A, transforms it, and then loads it into B. It runs as part of a scheduled job and takes approximately 60 seconds to complete. Fine. But we want it to go faster, and this is where (we hope) SSIS comes in...
I'm approaching this area of SQL Server 2005 for the first time, and have been looking towards the data flow task and its transformations to provide such an equivalent, faster solution. Before I continue down this road, however, I would welcome some peer feedback/comment on whether SSIS - and its data flow transformations - are indeed the best tools to leverage when looking to accomplish such an ETL function, and quickly.
I guess the fundamental question here is: 'Why transforms over script?' I am reading Brian Knight's book, and I'd like to quote a few passages:
'...the nicest thing about transforms in SSIS is that it is all done in-memory and it no longer requires elaborate scripting as in SQL Server 2000 DTS...'
I guess this means that it doesn't need to be compiled/interpreted, which I suppose all DML does?
'...one of the overriding themes of SSIS is that you shouldn't have to write any code to create your transformation...'
Is this because writing code is considered a more complex task than creating + configuring a transformation, or is it (at least in part) because a transformation is necessarily going to be quicker than its DML equivalent?
Hope some of you guys can respond with some interesting thoughts.
Cheers,
Tamim Sadikali.
View 3 Replies
View Related
Mar 5, 2008
Hi, hope someone can help or point me down the right track.
I have to load 50+ tables (all with different file layouts) using a DataReader source (the source only allows an ODBC connection) into a SQL Server 2005 database. Rather than create 50+ ETL packages that are identical in process terms, is it possible to create a single package that dynamically re-maps the source-to-destination mappings? I know I can set the source and destination using expressions, but how do I ensure that the mappings are updated, and can I automate error handling so that failing rows are redirected? Is this possible using a script task, or by some other means?
Many thanks for any help
View 6 Replies
View Related
Jun 13, 2006
Greetings SSIS friends,
I have been attempting to implement one of our numerous ETL processes in SSIS but hit a brick wall when I tried replacing a complex stored procedure with a series of Merge Join components.
In the end, I had to settle for using a SQL task that merely calls the stored procedure, and this proved to be the better option, as the version that used only SSIS components took forever to run.
How do people cope with complex transformations?! Do you opt for pure T-SQL to perform complex transformations and use SSIS components for control flow plus simple(ish) data flow tasks?
I am confused.
View 23 Replies
View Related
May 25, 2006
I've read that SSIS tries to do all transformations in memory as a way of enhancing processing speed. What happens, though, if the amount of data processed exceeds the available RAM? Are raw files then used (similar to staging tables), or is an error generated?
Barkingdog
View 1 Replies
View Related
Aug 6, 2006
With the function below, I receive this error:
Error: Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 1, current count = 0.
Function:

Public Shared Function DeleteMesssages(ByVal UserID As String, ByVal MessageIDs As List(Of String)) As Boolean
    Dim bSuccess As Boolean
    Dim MyConnection As SqlConnection = GetConnection()
    Dim cmd As New SqlCommand("", MyConnection)
    Dim i As Integer
    Dim fBeginTransCalled As Boolean = False
    'messagetype 1 = internal messages
    Try
        ' Start transaction
        MyConnection.Open()
        cmd.CommandText = "BEGIN TRANSACTION"
        cmd.ExecuteNonQuery()
        fBeginTransCalled = True
        Dim obj As Object
        For i = 0 To MessageIDs.Count - 1
            bSuccess = False
            ' clear parameters left over from the previous iteration
            ' (the original re-added the same parameter names each pass, which throws)
            cmd.Parameters.Clear()
            ' delete userid-message reference
            cmd.CommandText = "DELETE FROM tblUsersAndMessages WHERE MessageID=@MessageID AND UserID=@UserID"
            cmd.Parameters.Add(New SqlParameter("@UserID", UserID))
            cmd.Parameters.Add(New SqlParameter("@MessageID", MessageIDs(i).ToString))
            cmd.ExecuteNonQuery()
            ' then delete the message itself if no other user has a reference
            cmd.CommandText = "SELECT COUNT(*) FROM tblUsersAndMessages WHERE MessageID=@MessageID1"
            cmd.Parameters.Add(New SqlParameter("@MessageID1", MessageIDs(i).ToString))
            obj = cmd.ExecuteScalar()
            If (Not obj Is Nothing) AndAlso (TypeOf obj Is Integer) AndAlso (CType(obj, Integer) > 0) Then
                ' more references exist, so do not delete the message
            Else
                ' this is the only reference to the message, so delete it permanently
                cmd.CommandText = "DELETE FROM tblMessages WHERE MessageID=@MessageID2"
                cmd.Parameters.Add(New SqlParameter("@MessageID2", MessageIDs(i).ToString))
                cmd.ExecuteNonQuery()
            End If
        Next i
        ' End transaction
        cmd.CommandText = "COMMIT TRANSACTION"
        cmd.ExecuteNonQuery()
        bSuccess = True
        fBeginTransCalled = False
    Catch ex As Exception
        ' LOG ERROR
        GlobalFunctions.ReportError("MessageDAL:DeleteMessages", ex.Message)
    Finally
        If fBeginTransCalled Then
            Try
                cmd = New SqlCommand("ROLLBACK TRANSACTION", MyConnection)
                cmd.ExecuteNonQuery()
            Catch e As System.Exception
            End Try
        End If
        MyConnection.Close()
    End Try
    Return bSuccess
End Function
View 5 Replies
View Related
Apr 20, 2007
I'm trying to find out whether there is a combination of data flow transformations that will produce the following result:
SELECT
period,
project,
task,
employee = CASE
when empid in (SELECT DISTINCT empid FROM EmpTable) then empid
else 'Deleted Employee'
end
FROM ProjectTable
I know I can create a data flow task with this query as a data source and then send it to a destination, but I was wondering if that is the best way to do it, or if there is a better way using the data transformations available in SSIS.
Any insight would be most appreciated.
Regards,
Bill Webster
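One equivalent way to write this, which also maps naturally onto SSIS components (a Lookup against EmpTable with lookup failures ignored, then a Derived Column substituting 'Deleted Employee' for the resulting NULL):

SELECT p.period,
       p.project,
       p.task,
       employee = COALESCE(e.empid, 'Deleted Employee')
FROM ProjectTable p
LEFT JOIN (SELECT DISTINCT empid FROM EmpTable) e
       ON e.empid = p.empid;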
View 4 Replies
View Related
Dec 5, 2005
Until there's an Integration Services 2.0, what custom components would you most like to see examples of? The documentation team is starting work on the 2nd Web refresh of Books Online and SQL Server samples, anticipated for release around April, and may be able to incorporate some requests as samples or BOL topics.
View 26 Replies
View Related
Sep 16, 2006
Hi all. I am trying to convert the string "(null)" in the [PASSWORD] column of my table to an actual NULL value. I have tried two different forms of a conditional operator to achieve this, but I am getting the errors below; both can be summed up with the statement:
"A DT_STR operand cannot be used with the conditional operation." The first expression below, however, uses a DT_I4 value in the conditional clause, as that is what FINDSTRING returns, hence the equivalence test against the literal integer 0. So I must say I am somewhat confused by this. Does anyone know why neither of the statements below is working?
Also, is there an easy way to accomplish what I am trying to do: convert the string "(null)" in the [PASSWORD] column of my table to an actual NULL value?
FINDSTRING([PASSWORD], "(null)", 1) == 0 ? [PASSWORD] : NULL(DT_STR, 255, 1252)
Error at Administrator Data Flow Task [Derived Column [1985]]: For operands of the conditional operator, the data type DT_STR is supported only for input columns and cast operations. The expression "FINDSTRING(PASSWORD,"(null)",1) == 0 ? PASSWORD : NULL(DT_STR,255,1252)" has a DT_STR operand that is not an input column or the result of a cast, and cannot be used with the conditional operation. To perform this operation, the operand needs to be explicitly cast with a cast operator.
LOWER( TRIM( [PASSWORD] ) ) != "(null)" ? [PASSWORD] : NULL(DT_STR, 255, 1252)
Error at Administrator Data Flow Task [Derived Column [1985]]: For operands of the conditional operator, the data type DT_STR is supported only for input columns and cast operations. The expression "LOWER(TRIM(PASSWORD)) != "(null)" ? PASSWORD : NULL(DT_STR,255,1252)" has a DT_STR operand that is not an input column or the result of a cast, and cannot be used with the conditional operation. To perform this operation, the operand needs to be explicitly cast with a cast operator.
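For what it's worth, the error text points at the likely fix: the DT_STR result operands of the conditional must be explicit casts. A sketch of the first expression with the cast added, keeping the poster's length and code page (255, 1252):

FINDSTRING([PASSWORD], "(null)", 1) == 0 ? (DT_STR, 255, 1252)[PASSWORD] : NULL(DT_STR, 255, 1252)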
View 4 Replies
View Related
Mar 31, 2008
I'm trying to populate a table with fields of date type [DT_DATE] using the Slowly Changing Dimension transformation. When I add the date fields to the component, it will not build the stream. The wizard fails and tells me the date fields are not of the same type. The fields in the destination table are of type "date" and the input columns are of type [DT_DATE]. Am I missing something?
View 3 Replies
View Related
Nov 1, 2006
Hi
I am new to SSIS and have the following problem. In DTS I used the following script to clear columns of any CR/LF/commas and char(0)'s. Can I just transfer this to SSIS, and how exactly do I do that? Any help or advice would be appreciated.
Function Main()
    Dim x
    For x = 1 To DTSSource.Count
        If IsNull(DTSSource(x)) = False Then
            DTSDestination(x) = Replace(Replace(Replace(Replace(DTSSource(x), Chr(13), ""), Chr(10), ""), Chr(0), ""), ",", " ")
        Else
            DTSDestination(x) = DTSSource(x)
        End If
    Next
    Main = DTSTransformStat_OK
End Function
Andre
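The closest one-to-one SSIS port is a Script Component, but if cleaning in the source query is acceptable, the same scrubbing can be done per column in T-SQL; a sketch with hypothetical table and column names (CHAR(0) handling can vary by collation, so test that case):

SELECT REPLACE(
           REPLACE(
               REPLACE(
                   REPLACE(SomeColumn, CHAR(13), ''),  -- strip CR
                   CHAR(10), ''),                      -- strip LF
               CHAR(0), ''),                           -- strip null characters
           ',', ' ') AS CleanColumn                    -- replace commas with spaces
FROM dbo.SourceTable;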
View 8 Replies
View Related
May 25, 2015
Below is my data:
Country | parentid | CustomerSkId | sales
--------|----------|--------------|------
A       | 29097    | 29097        | 10
A       | 29465    | 29465        | 30
A       | 30492    | 30492        | 40
[code]....

Output:

Country | parentCount
--------|------------
A       | 8
B       | 3
c       | 3
In my count function, my code looks like:

set buyerset as exists(dimcustomer.leval02.allmembers, custoertypeisRetailers, "Sales")
set saleset(buyerset)
set custdimensionfilter as {custdimensionmemb1, custdimensionmemb2, custdimensionmemb3, custdimensionmemb4}
set finalset as exists(salest, custdimensionfilter, "Sales")
set ProdIP as dimproduct.dimproduct.prod1
set Othersset as (cyears, ProdIP)
(exists(([FINALSET], Othersset, dimension2.dimension2.item3), [DimCustomerBuyer].[ParentPostalCode].currentmember, "factsales")).count

It takes 12 to 15 minutes to execute.
View 3 Replies
View Related