Simple IF Or CASE Statement In Data Flow Transformation

May 11, 2007

Using SSIS to import from Excel to SQL Server.

In Excel, a sale is shown as a negative quantity and a purchase as a positive quantity.

The database always stores quantity as a positive figure, with a separate column "isPurchase" set to true or false depending on whether the row is a purchase or a sale.



So I need a Derived Column to return a boolean (or int) depending on whether the quantity is positive or negative. I tried each of the following in the Expression, but all of them were invalid expressions.

CASE WHEN [TotalQuantity] > 0 THEN 0 ELSE 1 END

If [TotalQuantity] > 0 THEN 1 ELSE 0 END

IIF ([TotalQuantity] > 0, 1,0)



Can anyone help me with the correct syntax, or with the correct Data Flow transformation if Derived Column is the wrong one?



Thanks



Richard
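For reference, the SSIS expression language has no CASE or IIF; it uses the conditional operator (? :), and a comparison already returns a DT_BOOL. A minimal sketch of a Derived Column expression for the scenario above, assuming the column name shown and that a positive quantity means a purchase:

[TotalQuantity] > 0 ? 1 : 0

or simply [TotalQuantity] > 0 for a boolean column, with ABS([TotalQuantity]) giving the always-positive quantity for the database.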

View 3 Replies



Simple Custom Data Flow Transformation Doesn't Produce Any Output

Aug 30, 2006

I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).

All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.

I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.

Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?

Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
/// <summary>
/// Column indexes for faster processing.
/// </summary>
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;

/// <summary>
/// The GID web service.
/// </summary>
private GID.WS_PDF.PDFProcessService gidService = null;

/// <summary>
/// Called to initialize/reset the component.
/// </summary>
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}

/// <summary>
/// Only 1 input and 1 output with 1 column is supported.
/// </summary>
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}

/// <summary>
/// Called before executing, to cache the buffer column indexes.
/// </summary>
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}

/// <summary>
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
/// </summary>
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}

/// <summary>
/// Called after executing, to clean up.
/// </summary>
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}

View 1 Replies View Related

Lookup Task Data Flow Transformation Causes Data Flow Task To Hang?

Dec 28, 2007

Hi,
I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog:
http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx

My development machine is decent: 1.86 GHz, Intel core 2 CPU, 3 GB of RAM.
However, the data flow task seems to hang whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached, nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU continuously. I am not using fast load to insert new records into my SQL target table. (In the screenshots I am right-clicking Sequence Container 3 and executing just that container, NOT the entire package.)

http://i248.photobucket.com/albums/gg168/boston_sql92/1.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/2.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/3.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/4.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/5.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/6.jpg


The same package works fine against a similar test table with 150k rows.
http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg

The weird thing is it only takes 24 minutes for a full refresh of the entire source table from Oracle to the SQL target table.
Any hints or advice would be appreciated.

View 18 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.

View 7 Replies View Related

Data Flow Transformation: Count And Sum The Column Value

Nov 8, 2007

Hi,

I have one data flow task. The source is SQL Server and the destination is a flat file destination. I have one Derived Column placed in between these two, and this functionality works fine. I would like to sum one column's data and count the total number of columns, and put the results in global variables. How can I achieve this?

Thanks,
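A minimal sketch of one way to do this with a Script Component (transformation) placed after the Derived Column, assuming the intended count is a row count: accumulate during ProcessInputRow and write to package variables in PostExecute, since that is the only point a data flow script may write variables. The column name Amount and the variables User::TotalAmount / User::RowTotal are hypothetical and would be listed under ReadWriteVariables. (C# as in SSIS 2008+ script components; SSIS 2005 would use the same structure in VB.NET.)

public class ScriptMain : UserComponent
{
    private double runningTotal;   // sum of the Amount column
    private int rowCount;          // number of rows seen

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        if (!Row.Amount_IsNull)
            runningTotal += (double)Row.Amount;
        rowCount++;
    }

    public override void PostExecute()
    {
        base.PostExecute();
        Variables.TotalAmount = runningTotal;   // hypothetical User::TotalAmount
        Variables.RowTotal = rowCount;          // hypothetical User::RowTotal
    }
}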

View 7 Replies View Related

Using Expressions In Custom Data Flow Transformation

Sep 18, 2006

Hi,

I'm creating a custom data flow transformation in c#.

I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.

So I've added a custom expression on my output column, and set its expression type to CPET_NOTIFY:

IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();

exp.Name = "Expression";

exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;

But in the ProcessInput method I can't get at the evaluated expressions; when I use exp.Value I get my expression definition, not its evaluation.

Is there a way to get these evaluated expressions ?

Thanks,

Stéphane

View 6 Replies View Related

Help Needed With Simple Case Statement In SQL

Mar 1, 2007

Hello,
 
I am looking to modify this Case Statement.  Where it says ELSE '' I need it to display the actual contents of the cell.  1 = Yes , 0 = No, (any other integer) = actual value. 
Right now if the value is anything other than 1 or 0,  it will leave the cell blank.
CASE dbo.Training.TrainingStatus WHEN 1 THEN 'Yes' WHEN 0 THEN 'No' ELSE '' END AS TrainingStatus
Thank You.
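One possible rewrite (a sketch, assuming TrainingStatus is an integer column): the ELSE branch must return the same string type as the other branches, so the integer is converted before being returned.

CASE dbo.Training.TrainingStatus
    WHEN 1 THEN 'Yes'
    WHEN 0 THEN 'No'
    ELSE CAST(dbo.Training.TrainingStatus AS varchar(10))
END AS TrainingStatus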

View 1 Replies View Related

Is There A Way To Set A Variable In A Data Flow From A SQL Statement (like In Control Flow)

Jan 12, 2006

I'm currently setting variables at the package level with an ExecuteSQL task.  This works fine.  However, I'm now starting to think about restartability midway through a package.  It would be nice to have the variable(s) needed in a data flow set within the data flow so that I only have to restart that task. 

Is there a way to do that using an SQL statement as the source of the value in a data flow? 

OR, when using checkpoints will it save variable settings so that they are available when the package is restarted?  This would make my issue a moot point.

View 2 Replies View Related

Custom Data Flow Transformation - Predefined Inputs

May 16, 2008

Hi,

I've created my own custom data flow transformation task (using C#) that will parse a fullname and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.

I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create and I haven't been able to find very many examples online.

Any help is greatly appreciated!

Thank you,

Marshall
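One common pattern (a sketch under assumptions, not the poster's code): expose a custom property such as "FullNameColumn" that the user sets to the name of their input column, then resolve that name to a buffer index in PreExecute instead of assuming a position.

private int fullNameBufferIndex;

public override void PreExecute()
{
    base.PreExecute();
    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    // "FullNameColumn" is a hypothetical custom property holding the user's choice.
    string mappedName = (string)ComponentMetaData.CustomPropertyCollection["FullNameColumn"].Value;
    foreach (IDTSInputColumn90 column in input.InputColumnCollection)
    {
        if (column.Name == mappedName)
        {
            fullNameBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
            break;
        }
    }
}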

View 1 Replies View Related

Custom Data Flow Transformation Losing CustomProperties

May 12, 2006

I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to Ok and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right clicking on the transform and going to the advanced editor. If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties now are gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow then save, close and reopen the package and then go through the wizard. Just saving after I add the transform does not help.

Does anyone know what might be causing this very strange problem?

View 7 Replies View Related

Functions In A Transformation Of A Script Component In A Data Flow Task

Feb 19, 2008

Hello Helpers,

I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a data flow task, to transform a two-digit month into a three-character month abbreviation:

Example: '01' should be transformed into 'Jan'

Many thanks for all your commitment and help!

Ulrike
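As an aside, if round-tripping to the scalar UDF proves awkward, the same conversion can be done inline in the Script Component; a sketch with hypothetical column names MonthCode (string) and MonthAbbrev (C# as in SSIS 2008+; SSIS 2005 would be VB.NET):

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // "01" -> 1 -> "Jan"
    int month = int.Parse(Row.MonthCode);
    Row.MonthAbbrev = new DateTime(2000, month, 1)
        .ToString("MMM", System.Globalization.CultureInfo.InvariantCulture);
}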

View 4 Replies View Related

How Can I Check Variables Coming From The Data Flow In Precedence Constraints And Conditional Transformation?

May 1, 2008



Hi,

I have a variable, a data source, and a conditional transformation which checks count(*). If the count == 0, I connect a Script Component, change the variable to False (initially it is True), and write to a log file...

Then I check that variable in a precedence constraint in the workflow: if variable == True then success. BUT whenever I run the package my data flow goes green even when the condition (count == 0) is not met, and so my variable's value becomes "False". Actually, if the condition is not met then my script shouldn't run.
Am I missing something???

View 7 Replies View Related

Is There A Way To Reference Global Variables In A Lookup Transformation That Are Set Outside A Data Flow Task?

Mar 31, 2008



The logic I am trying to recreate via SSIS is the following SQL statement:

insert into db3.dbo.targettable1 -- Target database table
(SiteC,
Objecte,
Attrib1)

select distinct ?,
?,
from ? -- Source database table
join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)
where not exists (select * from dbo.targettable2 c -- Target database table
where c.Alias = ? and
c.FacID = ? and
c.CSetID = ?)


I have an OLE DB Source that consists of an expression to approximate the following portion of the Above Select statement:

Select ?,
from ? -- Source database table and

The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop container outside of the Data Flow Task.

I was trying to reference the 2 global variables within the Lookup transformation to recreate the following portion of the SQL statement, but I encounter errors:



join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)


In the Advanced Editor window of the Lookup transformation:

select * from
(select * from [dbo].[targettable2 ]) as refTable
where [refTable].[Alias] = ? and [refTable].[FacID] = ? and
[refTable].[CSetID] = ?



Is there a way to reference global variables in a Lookup transformation that are set outside a Data Flow Task?

View 3 Replies View Related

Load Fact Table, Very Slow With Lookup - Data Flow Transformation

Apr 9, 2008



Hello,


I have developed some packages to load data into "Fact" tables in the data warehouse.
Some packages are OK, others are not. What is the problem? Some packages load fact tables using lots of "Lookup - Data Flow Transformation" components in the data flow task (lookups against dimension tables), but they are very, very slow: too slow to be chosen as a solution.

Do you have any other solutions to avoid using "Lookup - Data Flow Transformation"? Any other solution (SSIS, TSQL and so on....) is welcome to speed up the Fact table loading process.

Thanks in advance
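One alternative worth sketching (table and column names are hypothetical): resolve the dimension surrogate keys with set-based joins in T-SQL instead of per-row Lookup transformations, and let SSIS just orchestrate the statement.

INSERT INTO dbo.FactSales (DateKey, ProductKey, Quantity)
SELECT d.DateKey, p.ProductKey, s.Quantity
FROM staging.Sales AS s
JOIN dbo.DimDate AS d ON d.DateValue = s.SaleDate
JOIN dbo.DimProduct AS p ON p.ProductCode = s.ProductCode;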

View 7 Replies View Related

Simple Data Flow Hangs

Jul 26, 2007

Hi

I have a simple data flow task that has an OLE DB source and an OLE DB target.

The source is a SQL query that returns 200m records; this is then written straight out to a table. I have used a data flow task so that I can chunk up the inserts rather than using an INSERT INTO.....SELECT FROM.

I have run it multiple times and it hangs once it reaches 18,961,020 records.

There is no locking on the database and I have even restarted the SQL instance to ensure that there was nothing else contending for resources.

I have changed the buffer size and rows per buffer to 100m and 100,000. Now it hangs before the 18.9m mark, presumably because of the increased buffer size.

I notice that the SELECT statement continues to clock up IO and CPU cycles but the bulk insert process has gone.

any ideas on where to start looking?

Thanks for your help

Marcus

View 16 Replies View Related

Ordering Columns In Data Flow (simple?)

Jan 6, 2007

Hello,
I am new to SSIS.
I am trying to write a simple package to export data from some SQL 2005 tables and into a flat file.
In my data flow, I am using the OLE-DB data source and then the flat file destination.

This all works fine except that I can't get the package to write the columns out in the order I want. Even when I drive the OLE DB source with a query, the columns are written to the flat file in a different order than I want.

How is SSIS determining what order to write the columns in and, more importantly, how can I change it to do it in the order I want?
Please help if you can. As mentioned I am new to SSIS so please give clear+simple answers.

Thanks
Mgale1

View 5 Replies View Related

SQL 2012 :: SSIS Data Flow With Case Statements

Oct 29, 2014

I would like to know how I can add the following sample code to my source data in the SSIS data flow, or what other options there are. The main issue is time, as we are talking about hundreds of millions of rows.

select Sample,
CASE
WHEN Sample IS NULL
THEN NULL
WHEN SUBSTRING(Sample, 1, 6) IS NULL
THEN ' '
ELSE RTRIM(SUBSTRING(Sample, 1, 6))
END AS [Sample_1_6]
from TestTable

What I have done at this stage is just to create a SQL task with an INSERT INTO:

INSERT INTO [dbo].[TestTable1]
([Sample]
,[Sample_1_6])
select Sample,
CASE WHEN Sample IS NULL THEN NULL
WHEN SUBSTRING(Sample, 1, 6) IS NULL THEN ' '
ELSE RTRIM(SUBSTRING(Sample, 1, 6))
END AS [Sample_1_6]
from TestTable

If there is a way of adding this to a data flow so I can use fast load, that would really be the best solution. I know there are derived columns, but would this really be faster than the straight INSERT INTO in a SQL task? If this is the way to go, what is the code I would use in the derived column, or any other option?
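For what it's worth, a Derived Column equivalent of that CASE could look like the expression below (a sketch: the second WHEN branch only fires when Sample is non-null, and SUBSTRING of a non-null value is never NULL, so that branch is collapsed away):

ISNULL(Sample) ? NULL(DT_WSTR, 6) : RTRIM(SUBSTRING(Sample, 1, 6))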

View 7 Replies View Related

How To Programmatically Set Column Mappings Of A Simple Data Flow Task?

Sep 4, 2007

Has anyone done this? I can't find anything in the documentation
that describes this. The closest I get is to the InnerObject property
of the TaskHost class. There is an example of programming a bulk
insert task. But I can't find anything on programmatically setting
the column mappings (source to dest) of a simple data flow task. Any
help is appreciated!
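A sketch of the usual approach (assumptions: destComponent is the destination's IDTSComponentMetaData90, and ReinitializeMetaData has already populated its external metadata columns): select the upstream columns onto the destination's input via the design-time wrapper, then map each one to the external column of the same name.

CManagedComponentWrapper instance = destComponent.Instantiate();
IDTSInput90 input = destComponent.InputCollection[0];
IDTSVirtualInput90 virtualInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in virtualInput.VirtualInputColumnCollection)
{
    // Make the upstream column available on the destination's input.
    IDTSInputColumn90 inputColumn = instance.SetUsageType(
        input.ID, virtualInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

    // Map it to the external metadata column with the same name, if one exists.
    foreach (IDTSExternalMetadataColumn90 extColumn in input.ExternalMetadataColumnCollection)
    {
        if (extColumn.Name == inputColumn.Name)
        {
            instance.MapInputColumn(input.ID, inputColumn.ID, extColumn.ID);
            break;
        }
    }
}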

View 7 Replies View Related

Try Catch Doesn't Catch Errors Inside A Data Flow Transformation Script Component

Feb 15, 2007

Hi,

I'm having trouble with a Script Component in a data flow task. I have code that does a SqlCommand.ExecuteReader() call that throws an 'Object reference not set to an instance of an object' error. Thing is, the SqlCommand.ExecuteReader() call is already inside a Try..Catch block. Essentially I have two questions regarding this error:

a) Why doesn't my Catch block catch the exception?
b) I've made sure that my SqlCommand object and the SqlConnection property that it uses are properly instantiated, and the query is correct. Any ideas on why it is throwing that exception?

Hope someone could help.
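A minimal diagnostic sketch, assuming hypothetical cmd/conn members and the System.Data / System.Data.SqlClient namespaces: a NullReferenceException on the ExecuteReader line usually means cmd or its connection was never assigned, and if the failing assignment happens in another method (PreExecute, AcquireConnections) the try/catch around ExecuteReader never gets a chance to see it.

try
{
    if (cmd == null)
        throw new InvalidOperationException("SqlCommand was never created.");
    if (cmd.Connection == null || cmd.Connection.State != ConnectionState.Open)
        throw new InvalidOperationException("SqlConnection is missing or not open.");

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // ... consume the rows ...
        }
    }
}
catch (Exception ex)
{
    bool cancel = false;
    ComponentMetaData.FireError(0, "Script Component", ex.Message, string.Empty, 0, out cancel);
}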

View 3 Replies View Related

Insert / Update In Master Table And Also Save A History Of Changed Records : Using Data Flow/simple Sql Queries

Feb 9, 2007

Hi,

My scenario:

I have a master securities table which has 7 fields. As a part of the daily process I am uploading flat files into database tables. The flat files contain the master (static) security data as well as the analytics (transaction) data. I need to:

1) separate the master (static) data from the flat files,

2) check whether that data is present in the master table, if not then insert that data into the master table

3) If the data is present, then move the existing record to a history table and then update the main master table.

All the 7 fields need to be checked to uniquely identify a single record in the master table.

How can this be done? Can we use a combination of data flow items, or should we write a SQL procedure to do all this?

Thanks in advance for your help.

Regards,

$wapnil

View 4 Replies View Related

Execute Parameterized Select Statement From Data Flow

Aug 25, 2006

I have the following requirement. From an OLE DB source I am getting IDs, then doing a lookup against some master data, so now I have only matching IDs. Next I need to find some field (say Frequency) from some table for each of the above IDs. I have already written a stored procedure for this which takes the ID as a parameter, and it works fine when I run it in SQL Server Management Studio.

Query is sort of

Select field1,fiel2... from table 1 where id = @id

@id is each ID from lookup

Now I want to call this stored procedure in the data flow. I tried it using the OLE DB Command, but it did not return the output of the stored procedure; the output is the same as whatever I pass in as input.

Is there a way to do this? In short, my requirement is to execute a parameterized SELECT statement using a data flow transformation component.

View 8 Replies View Related

HOW? Access Results Of Computed SQL Statement In A Data Flow

Apr 24, 2007

The SQL computed is complex enough that I can't see a way to make it a parameterized query. The obvious approach seems to be to compute the SQL in a CONTROL FLOW SCRIPT TASK and then use it to load a variable to set the VARIABLE SOURCE of a CONTROL FLOW EXECUTE SQL TASK.



I see that I can return a resultset to a variable.



But getting the rows of the results into a data flow is not obvious. I have heard it mentioned that a Derived Column can do this. I can see using a dummy SCRIPT COMPONENT as a DATA SOURCE with nothing in it and then dropping into a DERIVED COLUMN. But when setting up the DERIVED COLUMN I don't see how to pull the columns out of the RESULTSET variable.



If it makes a difference I think the columns of the resultset will always be the same in this scenario.



Maybe this is totally the wrong approach? Any clues would be appreciated.
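For reference, one workable pattern (a sketch; the variable name User::ResultSet and the output columns Col1/Col2 are assumptions): skip the Derived Column and make the Script Component itself the source, reading the Object variable filled by the Execute SQL Task and emitting its rows.

public override void CreateNewOutputRows()
{
    // With an OLE DB connection the Execute SQL Task stores an ADO recordset in the
    // Object variable; OleDbDataAdapter.Fill can load that recordset into a DataTable.
    System.Data.DataTable table = new System.Data.DataTable();
    System.Data.OleDb.OleDbDataAdapter adapter = new System.Data.OleDb.OleDbDataAdapter();
    adapter.Fill(table, Variables.ResultSet);

    foreach (System.Data.DataRow row in table.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.Col1 = row["Col1"].ToString();
        Output0Buffer.Col2 = row["Col2"].ToString();
    }
}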

View 1 Replies View Related

Converting Data Like A Case Statement

Oct 10, 2006

Hello.

I have data in a SSIS package that I need to alter to something else.

The source column is a VARCHAR(3) column and it only contains two possible values, "ACT" or "CLS".

The destination column is a CHAR(1) column. Where the value of the source column is 'ACT' I want to put '1' in the destination and where the value of the source column is 'CLS' I want to put '0'.

I can do this easily in T-SQL using a CASE statement but the source data is an Ingres database and CASE isn't a valid SQL keyword.

Can I use a data conversion task to do this in SSIS? and if so, what's the syntax?

Thanks
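For reference, a Derived Column expression (rather than the Data Conversion transformation, which only changes data types) could do this mapping; a sketch, with the source column name as a placeholder:

[SourceStatus] == "ACT" ? "1" : "0"

or, cast to a single character for the CHAR(1) destination:

(DT_STR, 1, 1252)([SourceStatus] == "ACT" ? "1" : "0")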









View 10 Replies View Related

Little Urgent.....Advice On Logic Flow In SSIS Transformation......!

May 19, 2008

Dear All,

I have a table A with a KEY column and SSN column.
KEY = 12 digits ( first 3 digits are Department Id , and last 9 digits are SSN)

I have a table B with SSN column only.

Both the KEY and SSN columns are primary keys, so duplicate entries must be avoided.

Table A is intended to be populated weekly from a TXT file (SSIS package run). I want to achieve something like this:

P-Code sample:

for each Row in TXT file
if TXTfile.KEY = TableA.KEY then
skip and Read/Go to next Row in TXT file
else
INSERT TXTfile.KEY into TABLEA.KEY
SSN_Var = EXTRACT the SSN part (SSNpart.READ)
if SSN_VAR.Exists In TableB.AnyRow then
skip
else
Insert into TableB
End If.
End If
End For Loop.

-----------------------------------------------------------------
Using SSIS controls, what would be the best flow and logic to achieve this?
any sample scripting code ????

Many Thanks.
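A set-based sketch of the same logic (assuming the TXT file is first landed in a staging table, here called stg.Import, via a flat file source; all names are placeholders). The Table B insert runs first so the "new KEY" test still reflects the state before this load:

-- SSNs for rows whose KEY is new, skipping SSNs already in Table B.
INSERT INTO dbo.TableB (SSN)
SELECT DISTINCT RIGHT(s.[KEY], 9)
FROM stg.Import AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TableA AS a WHERE a.[KEY] = s.[KEY])
  AND NOT EXISTS (SELECT 1 FROM dbo.TableB AS b WHERE b.SSN = RIGHT(s.[KEY], 9));

-- New keys into Table A (the SSN is the last 9 characters of the 12-digit KEY).
INSERT INTO dbo.TableA ([KEY], SSN)
SELECT s.[KEY], RIGHT(s.[KEY], 9)
FROM stg.Import AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TableA AS a WHERE a.[KEY] = s.[KEY]);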

View 9 Replies View Related

Reuse Existing Data Flow Components In A Custom Data Flow Component

Aug 29, 2007

Hello,

Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?

Thanks,

Yoann

View 15 Replies View Related

Assigning User Name And Password From The Database To SSIS Package For Data Flow Operations And Execute SQL Statement

May 8, 2008

Hi, in my package some SQL operations need a special user name and admin privileges. So how do I create my SSIS package so that, when it executes, it takes the given username and password from a table in some database?
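One common pattern (a sketch; the variable names are assumptions): read the credentials into package variables with an Execute SQL Task, then put a property expression on the connection manager's ConnectionString so every later data flow and Execute SQL statement uses them, e.g.:

"Data Source=" + @[User::ServerName] + ";Initial Catalog=" + @[User::DatabaseName]
    + ";Provider=SQLNCLI.1;User ID=" + @[User::UserName] + ";Password=" + @[User::Password] + ";"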

View 8 Replies View Related

Using Case To Control Program Flow ...

Apr 22, 2004

I'm writing my first serious stored procedure.

Essentially I have an incoming file, each line in the file is a record.
The records share the same initial key fields for the first 10 columns, then the field structure varies depending on a rectype and sequence number.

My initial plan was load the keys into fields, and load the remaining data into a long varchar field.

Then the stored procedure would evaluate the Rectype and Seqno of each record and chop up the Varchar accordingly.

So I set up a cursor to read the temporary table, do a fetch into variables, and go to evaluate the variables.

I want to be able to use a CASE statement to evaluate the fields and then perform various logic, but it's giving me fits because it seems like CASE only really works in Select statements, and won't really allow you to do any sort of GOTO logic.

I chopped the following SQL up and put in a rough cut of what I thought I was doing.

DROP TABLE #FOO
GO
CREATE TABLE #FOO (
[planno] [char] (6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[ssn] [char] (9) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[location] [char] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[eligdate] [char] (8) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[rectype] [char] (3) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[seqno] [char] (3) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[empno] [char] (13) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[payrollcode] [char] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[companycode] [char] (3) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[department] [char] (13) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[filler] [char] (34) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[data] [varchar] (366) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO

BULK INSERT #FOO FROM '\l32b0021foo.dat'
WITH (FORMATFILE = '\l32b0021foo.fmt')

declare @sessionid varchar(12)
declare @dedcode varchar(5)
declare @eeamt money
declare @eepct decimal
declare @effdt datetime

set @effdt = getdate()
set @sessionid = 'PENDING_BI'


DECLARE Transaction_Cursor CURSOR FOR
SELECT ssn,rectype,seqno,data
from #foo

OPEN Transaction_Cursor

declare @eepssn varchar(9)
declare @rectype varchar(3)
declare @seqno varchar(3)
declare @data varchar(366)
declare @eeccoid varchar(6)
declare @eeceeid varchar(12)
declare @eecempno varchar(9)
declare @companycode varchar(5)
declare @type varchar(20)


FETCH NEXT FROM Transaction_Cursor into @eepssn,@rectype,@seqno,@data

WHILE @@FETCH_STATUS = 0
BEGIN
select @companycode = cmpcompanycode,
@eeccoid = eeccoid,
@eeceeid = eeceeid,
@eecempno = eecempno
from company,empcomp
where eeceeid = (select eepeeid from emppers where eepssn = @eepssn )
and eecemplstatus = 'A'
and cmpcoid = eeccoid




CASE @RECTYPE+@SEQNO
when '001001' then goto parse_pretax
when '002001' then goto parse_LOAN
else select @RECTYPE+@SEQNO+' not recognized!'
end
process_it:
insert into foo2 (empno,companycode,amt,pct)
values (@eecempno,@companycode,@eeamt,@eepct)
FETCH NEXT FROM Transaction_Cursor into @eepssn,@rectype,@seqno,@data
END

CLOSE Transaction_Cursor
DEALLOCATE Transaction_Cursor

goto bypass

parse_pretax:
let @eepct = substring(@data,1,5)

goto process_it

parse_loan:
let @eeamt = substring(@data,27,11)
goto process_it

bypass:


I could sketch it out a little bit better in Northwind or Pubs, but I think I just need a smack upside the head and a little edification.
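For the record, T-SQL's CASE is an expression rather than a control-of-flow statement, so the branch-and-GOTO section above would usually be written with IF ... ELSE IF inside the loop; a sketch reusing the variables declared above:

IF @RECTYPE + @SEQNO = '001001'
    SET @eepct = SUBSTRING(@data, 1, 5)      -- pretax record
ELSE IF @RECTYPE + @SEQNO = '002001'
    SET @eeamt = SUBSTRING(@data, 27, 11)    -- loan record
ELSE
    SELECT @RECTYPE + @SEQNO + ' not recognized!'

INSERT INTO foo2 (empno, companycode, amt, pct)
VALUES (@eecempno, @companycode, @eeamt, @eepct)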

View 1 Replies View Related

How To Pass Parameter From Control Flow To Data Flow

Feb 14, 2006

Hi, All,

I need to pass a parameter from the control flow to the data flow. The data flow will use this parameter to get data from an Oracle source.

I have an Execute SQL task in the control flow to assign a value to the parameter; the next step is a data flow which needs to take a parameter in the SQL statement to query the Oracle source.

The SQL Looks like this:

select * from ccst_acctsys_account

where to_char(LAST_MODIFIED_DATE, 'YYYYMMDD') >?

The problem is the OLE DB Source Editor doesn't have anything for mapping the parameter.

Thanks in Advance
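One workaround sketch, common when the provider cannot map ? parameters for the source query (the variable names are assumptions): build the whole statement in a string variable whose EvaluateAsExpression is set, and point the OLE DB Source at it with the "SQL command from variable" access mode:

"select * from ccst_acctsys_account where to_char(LAST_MODIFIED_DATE, 'YYYYMMDD') > '" + @[User::LastModifiedDate] + "'"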





View 2 Replies View Related

Problem Using Result From CASE In Another CASE Statement

Nov 5, 2007

I have a view where I'm using a series of conditions within a CASE statement to determine a numeric shipment status for a given row. In addition, I need to bring back the corresponding status text for that shipment status code.

Previously, I had been duplicating the CASE logic for both columns, like so:




...beginning of SQL view...
shipment_status =
CASE
WHEN [logic for condition 1]
THEN 1
WHEN [logic for condition 2]
THEN 2
WHEN [logic for condition 3]
THEN 3
WHEN [logic for condition 4]
THEN 4
ELSE 0
END,
shipment_status_text =
CASE
WHEN [logic for condition 1]
THEN 'Condition 1 text'
WHEN [logic for condition 2]
THEN 'Condition 2 text'
WHEN [logic for condition 3]
THEN 'Condition 3 text'
WHEN [logic for condition 4]
THEN 'Condition 4 text'
ELSE 'Error'
END,
...remainder of SQL view...






This works, but the logic for each of the case conditions is rather long. I'd like to move away from this for easier code management, plus I imagine that this isn't the best performance-wise.

This is what I'd like to do:



...beginning of SQL view...
shipment_status =
CASE
WHEN [logic for condition 1]
THEN 1
WHEN [logic for condition 2]
THEN 2
WHEN [logic for condition 3]
THEN 3
WHEN [logic for condition 4]
THEN 4
ELSE 0
END,


shipment_status_text =

CASE shipment_status

WHEN 1 THEN 'Condition 1 text'

WHEN 2 THEN 'Condition 2 text'

WHEN 3 THEN 'Condition 3 text'

WHEN 4 THEN 'Condition 4 text'

ELSE 'Error'

END,
...remainder of SQL view...


This runs as a query; however, all of the rows now show "Error" as the value for shipment_status_text.

Is what I'm trying to do even currently possible in T-SQL? If not, do you have any other suggestions for how I can accomplish the same result?

Thanks,

Jason
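A sketch of the usual fix: a column alias defined in a SELECT list cannot be referenced by other expressions in that same list, so the second CASE likely binds to an underlying column of the same name (or fails). Computing the code once in a derived table (or CTE) and deriving the text in the outer query avoids duplicating the condition logic:

SELECT
    x.shipment_status,
    shipment_status_text =
        CASE x.shipment_status
            WHEN 1 THEN 'Condition 1 text'
            WHEN 2 THEN 'Condition 2 text'
            WHEN 3 THEN 'Condition 3 text'
            WHEN 4 THEN 'Condition 4 text'
            ELSE 'Error'
        END
    -- ...remainder of the view's columns...
FROM (
    SELECT
        shipment_status =
            CASE
                WHEN [logic for condition 1] THEN 1
                WHEN [logic for condition 2] THEN 2
                WHEN [logic for condition 3] THEN 3
                WHEN [logic for condition 4] THEN 4
                ELSE 0
            END,
        t.*
    FROM dbo.Shipments AS t   -- hypothetical base table of the view
) AS x;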

View 1 Replies View Related

Simple Question - Control Flow Related

Jan 11, 2007

Hi Everyone:

I have a Foreach Loop container, and inside the container I have an Execute SQL task, which runs first, and then I need to kick off 5 Data Flow Tasks. Do I need to connect the 5 DFTs to each other using the green arrows (precedence constraints)? How would you usually do this? Thx.

View 5 Replies View Related

HELP: How Do I Pass Variables From Control Flow To Data Flow

Mar 9, 2007

I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.

I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.

Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table where I retrieved data in the Execute SQL task (I have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination on the Data Flow?

Also, how would I update the original source table where source.id = objRecs.id?

Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.

Thanks,

Steve
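A sketch of one way to wire this up (column and variable names are assumptions): inside the Data Flow, a Derived Column can surface the foreach variables as pipeline columns feeding the OLE DB Destination, using one expression per column such as @[User::CustomerID]; the write-back to the original SQL Server table can then be an OLE DB Command with a parameterized statement like:

UPDATE dbo.SourceTable SET processed = 1 WHERE id = ?

with the ? mapped to the id column in the OLE DB Command's column mappings.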

View 17 Replies View Related

Handle Tasks In Control Flow Tab From Data Flow Tab

Jan 17, 2008

Dear All!
My package has a Data Flow Task. In the Data Flow Task, I use a Script Component and an OLE DB Destination to transform data from a txt file into the database.
Within the Data Flow Task, I want to call a File System Task to move the file to a folder, or any other task from the "Control Flow" tab. Does SSIS support this? Please show me how, if so.
Thanks

View 3 Replies View Related

SSIS Variables Between Data Flow And Control Flow... How To????

May 17, 2007

Hi everyone,

Primary platform is 64 bit cluster.

How to move information allocated in SSIS variables from Data Flow to Control Flow layers??

We've got an SSIS package which loads a value into a variable inside a Data Flow. Going back to the Control Flow, how could we retrieve that value again????

Thanks in advance and regards,

View 4 Replies View Related






