Load Fact Table, Very Slow With Lookup - Data Flow Transformation
Apr 9, 2008
Hello,
I have developed some packages to load data into "Fact" tables in the data warehouse.
Some packages run fine, but others do not. The problem: the slow packages load fact tables through many Lookup transformations inside the data flow task (lookups against dimension tables), and they are far too slow to be chosen as a solution.
Do you have any solutions that avoid the Lookup transformation? Any approach (SSIS, T-SQL, and so on) that speeds up the fact table loading process is welcome.
Thanks in advance
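One common alternative, for what it's worth: stage the source rows and resolve all surrogate keys in a single set-based statement, instead of one row-by-row lookup per dimension. A rough sketch, where Stage_Sales, DimCustomer, DimDate, and their columns are hypothetical names:
insert into dbo.FactSales (CustomerKey, DateKey, SalesAmount)
select c.CustomerKey, d.DateKey, s.SalesAmount
from dbo.Stage_Sales s                                    -- staged source rows
join dbo.DimCustomer c on c.CustomerID = s.CustomerID     -- business key -> surrogate key
join dbo.DimDate d on d.FullDate = s.SaleDate;
Note that inner joins silently drop facts with missing dimension members; left joins plus a default "unknown member" key are the usual guard.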
View 7 Replies
Mar 31, 2008
The logic I am trying to recreate via SSIS is the following SQL statement:
insert into db3.dbo.targettable1 -- Target database table
(SiteC,
Objecte,
Attrib1)
select distinct ?,
?,
from ? -- Source database table
join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)
where not exists (select * from dbo.targettable2 c -- Target database table
where c.Alias = ? and
c.FacID = ? and
c.CSetID = ?)
I have an OLE DB Source whose expression approximates the following portion of the above SELECT statement:
Select ?,
from ? -- Source database table
The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop Container outside of the Data Flow Task.
I was trying to reference the 2 global variables within the Lookup transformation to recreate the following portion of the SQL statement, but encounter errors:
join dbo.targettable2 c1 -- Target database table
on c1.Alias = ? and
c1.CSetID = ? and
c1.FacID = (select f.PFacID
from dbo.Fac f
where f.FacID = ?)
In the Advanced Editor window of the Lookup transformation I tried:
select * from
(select * from [dbo].[targettable2 ]) as refTable
where [refTable].[Alias] = ? and [refTable].[FacID] = ? and
[refTable].[CSetID] = ?
Is there a way to reference global variables in a Lookup transformation when they are set outside the Data Flow Task?
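For what it's worth, the Lookup transformation's reference query cannot be parameterized from package variables directly; the parameterized statement on the Advanced tab maps its ? placeholders to input columns only. Two common workarounds: add the variable values into the pipeline with a Derived Column (expressions @[User::CSetID] and @[User::FacID]) and include those columns in the Lookup join, or run the whole statement set-based in an Execute SQL Task inside the Foreach Loop, with the variables mapped as parameters. A sketch of the latter, where the source table and its columns are assumptions:
insert into db3.dbo.targettable1 (SiteC, Objecte, Attrib1)
select distinct s.SiteC, s.Objecte, s.Attrib1      -- source columns are assumptions
from dbo.sourcetable s
join dbo.targettable2 c1
on c1.Alias = s.SiteC
and c1.CSetID = ?                                  -- maps to User::CSetID
and c1.FacID = (select f.PFacID from dbo.Fac f where f.FacID = ?)   -- maps to User::FacID
where not exists (select * from dbo.targettable2 c
where c.Alias = s.SiteC and c.CSetID = ? and c.FacID = ?)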
View 3 Replies
View Related
Dec 28, 2007
Hi,
I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog:
http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx
My development machine is decent: 1.86 GHz, Intel core 2 CPU, 3 GB of RAM.
However it seems the data flow task gets hung whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it uses 1-6 percent of CPU resources continuously. I am not using fast load to insert new records into my sql target table. (I am right clicking Sequence Container 3 and executing this container NOT the entire package in the screenshots)
http://i248.photobucket.com/albums/gg168/boston_sql92/1.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/2.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/3.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/4.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/5.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/6.jpg
The same package works fine against a similar test table with 150k rows.
http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg
http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg
The weird thing is that a full refresh of the entire source table from Oracle to the SQL target table takes only 24 minutes.
Any hints or advice would be appreciated.
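A possible reading of the symptom: the screenshots suggest the cache load completes, after which ~6 million rows are being inserted without fast load - that is, one INSERT statement per row - which can look like a hang while actually making very slow progress. Enabling fast load on the OLE DB Destination (and committing in batches via the Maximum insert commit size setting) is usually the first thing to try at this volume.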
View 18 Replies
View Related
Sep 24, 2015
I have a transaction table with about 40 crore (400 million) rows in the source. It doesn't have timestamp or unique key columns; it has only Bill_Month and Bill_Year columns. To load this table into staging, I added a new datetime column derived from those two, using 01 as the default day. Then:
* First we delete the last 3 months of data from the staging tables.
* Get the last 3 months of data from the source table.
* Load those 3 months of data from the source to the staging table.
We do this because we only receive updates for the last three months of data. Now I have to include this transaction table as a fact table in the DW. What will be the best practice for loading the fact table from the staging table? We also have to look up dimensions for the foreign keys.
* Should I implement the same method of deleting the last 3 months of records and loading them again?
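For the fact load, the same rolling-window pattern usually carries over. A rough T-SQL sketch, where FactTransaction, StageTransaction, Bill_Date, and Dim_Date are assumed names:
declare @WindowStart datetime;
set @WindowStart = dateadd(month, -3, getdate());
delete from dbo.FactTransaction where Bill_Date >= @WindowStart;
insert into dbo.FactTransaction (DateKey, Bill_Date, Amount)   -- other dimension keys omitted
select d.DateKey, s.Bill_Date, s.Amount
from dbo.StageTransaction s
join dbo.Dim_Date d on d.CalendarDate = s.Bill_Date            -- surrogate key lookup
where s.Bill_Date >= @WindowStart;
If the fact table is partitioned by month, switching the three affected partitions out and back in avoids the large delete.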
View 3 Replies
View Related
Aug 16, 2007
Hi,
I use lookups to map the surrogate keys of type 1 dimensions to my fact tables in SSIS.
But how do I handle a type 2 dimension with ValidFrom and ValidUntil date fields?
I do not use an IsCurrent column, because that causes problems with late-arriving facts.
In DTS I used a SQL statement like this:
update SA
SET SA.DimProdRef = Dim.RecordID
FROM SAWarenEingang SA, DimProd Dim
where SA.ProduktNumber = Dim.ProduktNumber
and SA.ArtikelkontoBewegungsdatum between Dim.ValidFrom and Dim.ValidUntil
Now in SSIS I want to handle the whole thing in the data flow without using a staging table:
- Using Lookups: I would have to pass each fact row's date column into the lookup, and the Lookup's equi-join cannot express the BETWEEN condition.
- Using Execute SQL in the data flow: this would be very slow, because the statement would be executed for every row in the data flow.
Any ideas?
Best regards,
Stefoon
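One possible in-data-flow option, sketched: put the Lookup into partial cache mode ("Enable memory restriction" in the 2005 advanced editor) and modify the SQL statement so its parameters, which map to input columns, express the range condition:
select RecordID, ProduktNumber, ValidFrom, ValidUntil
from dbo.DimProd
where ProduktNumber = ? and ? between ValidFrom and ValidUntil
with the first ? mapped to the fact's ProduktNumber and the second to ArtikelkontoBewegungsdatum. When the per-row queries are too slow, a Merge Join on ProduktNumber followed by a Conditional Split on the date range is the usual alternative.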
View 10 Replies
View Related
Jun 16, 2015
We have created an SSIS package to load a text file into a table. The source system shares 10 text files; recently they stopped generating data for one of them (it now arrives empty), and in a few months they will start generating data for that file again.
The issue is that the Data Flow task fails while loading the empty text file into the table. How can I handle this empty-file load in the SSIS package?
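One approach worth trying: test the file before the data flow runs. A Script Task sets a boolean variable, and a precedence constraint with the expression @[User::FileHasData] == TRUE guards the Data Flow task. A sketch, where User::SourceFile and User::FileHasData are assumed package variables (listed in the task's ReadOnlyVariables/ReadWriteVariables):
// Script Task body (C#): record whether the file exists and has any content.
var info = new System.IO.FileInfo(Dts.Variables["User::SourceFile"].Value.ToString());
Dts.Variables["User::FileHasData"].Value = info.Exists && info.Length > 0;
Dts.TaskResult = (int)ScriptResults.Success;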
View 3 Replies
View Related
Aug 25, 2006
Hi All,
I am just curious how I can load data from a data warehouse into an Analysis Services cube (both the fact tables and the dimensions).
Does anybody have a way to achieve this?
I would appreciate any good material that describes this scenario.
Sincerely,
--Amde
View 1 Replies
View Related
Oct 20, 2014
I have a stored procedure that attempts to perform a WHERE NOT EXISTS check to insert new records. If the table is empty, the procedure loads it. However, an insert does not occur when a change to one or more source fields occurs against an existing record. The following is my code:
declare
@Created_By nchar(50)
,@Created_Date datetime
,@Updated_By nchar(50)
,@Updated_Date datetime
select @Created_By = system_user
[code]....
I expected that when one of the source values of any field in the second WHERE clause changes, the procedure would insert a new record. Why is this not happening? One other note: I am not 'allowed' to use MERGE.
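A guess, since the body of the procedure is truncated above: if the NOT EXISTS subquery correlates only on the key columns, any existing key blocks the insert no matter what else changed; and if it does compare the other fields, NULLable columns make field = field evaluate to UNKNOWN, while nchar(50) padding can make visibly different values compare equal. Every change-tracked column needs a NULL-safe comparison, along these lines (table and column names are placeholders):
where not exists (
select 1 from dbo.Target t
where t.BusinessKey = s.BusinessKey
and (t.Col1 = s.Col1 or (t.Col1 is null and s.Col1 is null))
and (t.Col2 = s.Col2 or (t.Col2 is null and s.Col2 is null))
)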
View 1 Replies
View Related
Feb 14, 2006
I need help from you data warehouse / SSIS experts out there! I have a Transaction Fact Table with dollar amounts as the measurements. The grain is one row per transaction. I want to roll this up into a Monthly Periodic Snapshot based on 5 keys. I am having no problem where there is transaction data for each month.
However, the problem I am having is this: how do I gracefully insert the monthly rows for the five keys where there was no activity in the transaction fact table? I am sure there is a slick way to do this with SSIS, but I am definitely having a mental block on how to accomplish it. Any help would be appreciated!
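One set-based pattern that may fit: build the month-by-key scaffold first, then left join the aggregated transactions so inactive months land as zero-valued rows. DimMonth, FactTransaction, and KeyA below stand in for the real names and the five keys:
insert into dbo.FactMonthlySnapshot (MonthKey, KeyA, Amount)
select m.MonthKey, k.KeyA, coalesce(sum(f.Amount), 0)
from dbo.DimMonth m
cross join (select distinct KeyA from dbo.FactTransaction) k
left join dbo.FactTransaction f
on f.MonthKey = m.MonthKey and f.KeyA = k.KeyA
group by m.MonthKey, k.KeyA;
In SSIS this maps naturally to an OLE DB Source running the scaffold query, or simply an Execute SQL Task.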
View 2 Replies
View Related
Mar 31, 2008
I have a lookup transformation that retrieves a key for a certain column of values, in this case a name. So I go into the lookup table with a name and come out with its key. I had it working, and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting matches for the new names, but I am still getting matches for the names that existed before I added the new ones.
I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?
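For what it's worth, a full-cache Lookup reads the reference table once at pre-execute, so rows added mid-execution are invisible until the next run; the cache is rebuilt on every execution, so it should not stay stale across runs. If the new names still fail to match on a fresh run, the usual culprits are case, trailing spaces, or data-type differences, because the cached comparison is exact rather than collation-aware like SQL Server's. Normalizing both sides is a common guard, e.g. a reference query such as:
select NameKey, upper(ltrim(rtrim(Name))) as Name from dbo.NameLookup   -- hypothetical names
paired with the same UPPER/TRIM treatment on the input column in a Derived Column.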
View 4 Replies
View Related
Dec 20, 2006
I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and its query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow item, but not in any data flow item.
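One common workaround, offered tentatively: the OLE DB Command prepares its statement through the provider, and CTE syntax often fails that preparation step even though it parses fine elsewhere. Wrapping the CTE in a stored procedure and having the OLE DB Command call that instead usually works:
exec dbo.MyCteProc ?, ?   -- hypothetical procedure wrapping the CTE; each ? maps to an input column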
View 7 Replies
View Related
Nov 8, 2007
Hi,
I have one data flow. The source is SQL Server and the destination is a flat file destination, with one Derived Column placed in between. This works fine. I would also like to sum one column's data and count the total number of rows, putting both into global variables. How can I achieve this?
Thanks,
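A sketch of one approach: data-flow variables are only writable in a Script Component's PostExecute, so accumulate per row and write once at the end. Amount, User::TotalAmount, and User::TotalRows are assumed names, and both variables must be listed in the component's ReadWriteVariables (C# shown; the 2005 Script Component uses the same members in VB):
private decimal total;
private int rows;
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    if (!Row.Amount_IsNull) total += Row.Amount;   // running sum of the column
    rows++;                                        // running row count
}
public override void PostExecute()
{
    base.PostExecute();
    Variables.TotalAmount = total;   // variables may only be assigned here
    Variables.TotalRows = rows;
}
The stock Row Count transformation covers the counting half with no code at all.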
View 7 Replies
View Related
Sep 18, 2006
Hi,
I'm creating a custom data flow transformation in c#.
I would like to use expressions within this component in the same way as in the derived column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using this evaluated expression to populate my output column values.
So I've added a custom expression on my output column and set its expression type to CPET_NOTIFY:
IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();
exp.Name = "Expression";
exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
But in the ProcessInput method I don't manage to get the evaluated expressions: when I use exp.Value I get my expression definition, not its evaluation.
Is there a way to get these evaluated expressions?
Thanks,
Stéphane
View 6 Replies
View Related
Oct 4, 2006
Here is the deal - I have a flat file with 2 fields:
ProdNum FeatureCodes
1 A01, B22, F09
2 C13,C24,E05,G02,G09,J07,J09,M03,M17
3 J07,M01,M17,N02,N11,N13,X15
Then I have an Excel file like:
Code Description
A01 Handicap features
B22 Smoke-Free
F09 Tinted Windows
C13 Picnic Area
C24 Extra Storage
J07 Tile Flooring
M01 Central Airconditioning
The result in a database needs to be like:
ProdNum Features
1 Handicap features, Smoke-Free, Tinted Windows
2 Picnic Area, Extra Storage, .....
3 Tile Flooring, Central Airconditioning, .....
How can I do this in a data flow - any ideas?
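Assuming both files are first staged to tables - Stage_Prod(ProdNum, FeatureCodes) and Stage_Code(Code, Description) are assumed names - here is a rough set-based sketch that matches each code against the delimited list and re-concatenates the descriptions with FOR XML PATH:
select p.ProdNum,
stuff((select ', ' + c.Description
from dbo.Stage_Code c
where ',' + replace(p.FeatureCodes, ' ', '') + ',' like '%,' + c.Code + ',%'
for xml path('')), 1, 2, '') as Features
from dbo.Stage_Prod p;
The delimiter padding guards against partial-code collisions (A01 vs A011). Inside a pure data flow, the equivalent is a Script Component that splits FeatureCodes into one row per code, a Lookup against the code table, and an aggregation that reassembles the list.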
View 8 Replies
View Related
Jan 15, 2007
(Just getting started with SSIS.) I have a data flow task with source S1 and destination D. The mappings from S1 to D are straightforward, but additionally, I need to look up a value from source S2 one time and map that value to a column in D for every row of S1. Note that this lookup is based on a constant value independent of S1, and therefore no column mappings are involved (this lookup is essentially "standalone").
I would expect I could do the lookup, assign the value to a data flow level variable, then reference that variable in a derived column transform or some such thing, but I'm having trouble figuring out how to do the lookup in the data flow task and assign it to a variable. Am I on the right track here, or...?
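That is the usual pattern, roughly: run the standalone query in an Execute SQL Task before the data flow, with ResultSet set to "Single row" and the value mapped to a variable (User::S2Value is an assumed name), e.g.:
select ConfigValue from dbo.S2 where ConfigKey = 'fixed-key'   -- hypothetical names
then add the column inside the data flow with a Derived Column whose expression is simply @[User::S2Value]. A Lookup cannot assign to a variable mid-flow, so the pre-flow Execute SQL Task is the conventional route.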
View 9 Replies
View Related
May 16, 2008
Hi,
I've created my own custom data flow transformation task (using C#) that will parse a full name and output the various name parts. In the ProvideComponentProperties method, I create 5 output columns (prefix, first, middle, last, and suffix). In the ProcessInput method, I parse the input and add the name parts to the buffer. The bad thing is that I'm making an assumption on the position of the Full Name input column within the buffer.
I would like the "user" to be able to map their "full name" input column to a known Full Name column so I don't have to make any assumptions. This is the first SSIS task I've tried to create, and I haven't been able to find very many examples online.
Any help is greatly appreciated!
Thank you,
Marshall
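One approach, mirroring the PreExecute pattern in the longer component posted further down this page: let the user attach whichever column they like, require exactly one selected input column, and resolve its buffer position by lineage ID rather than assuming an index:
private int fullNameIndex;   // cached buffer index of the mapped column
// In PreExecute(): locate the user-mapped column wherever it sits in the buffer.
IDTSInput90 input = ComponentMetaData.InputCollection[0];
IDTSInputColumn90 fullNameColumn = input.InputColumnCollection[0];
fullNameIndex = BufferManager.FindColumnByLineageID(input.Buffer, fullNameColumn.LineageID);
Validate() can then enforce that exactly one string input column is selected; a custom property storing the chosen column's lineage ID is the usual refinement when several input columns are allowed.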
View 1 Replies
View Related
May 11, 2007
Using SSIS to import from Excel to SQL Server.
In Excel they are showing Sale as -ve quantity and purchase as +ve quantity.
The database has quantity always as +ve figure and a separate column "isPurchase" set to true or false depending on whether purchase or sale.
So I need a Derived Column to return a boolean (or int) depending on whether the quantity is positive or negative. I tried each of the following in the expression, but all of them were invalid expressions.
CASE WHEN [TotalQuantity] > 0 THEN 0 ELSE 1 END
If [TotalQuantity] > 0 THEN 1 ELSE 0 END
IIF ([TotalQuantity] > 0, 1,0)
Can anyone help me with the correct syntax, or with the correct data flow transformation if Derived Column is the wrong choice?
Thanks
Richard
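For the record, the SSIS expression language supports neither CASE nor IIF; it uses the C-style conditional operator. A sketch of the two derived columns:
IsPurchase : [TotalQuantity] > 0 ? TRUE : FALSE
Quantity   : ABS([TotalQuantity])
The comparison alone, [TotalQuantity] > 0, already yields DT_BOOL, so the ternary is optional for the boolean column.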
View 3 Replies
View Related
May 12, 2006
I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to Ok and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard. I've verified this by right clicking on the transform and going to the advanced editor. If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties now are gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow then save, close and reopen the package and then go through the wizard. Just saving after I add the transform does not help.
Does anyone know what might be causing this very strange problem?
View 7 Replies
View Related
Feb 19, 2008
Hello Helpers,
I need to know how to use my private function - created as a scalar-valued function in SQL Server 2005 - in a Script Component (used here as a transformation) in a data flow task, to transform a two-digit month into a three-letter month abbreviation:
Example: '01' should be transformed into 'Jan'
Many thanks for all your commitment and help!
Ulrike
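A sketch: open the connection once in PreExecute and call the function per row with ExecuteScalar. The connection string, dbo.ufn_MonthAbbrev, and the column names are assumptions (C# shown; the 2005 Script Component uses the same members in VB, and needs Imports/using for System.Data and System.Data.SqlClient):
private SqlConnection conn;
private SqlCommand cmd;
public override void PreExecute()
{
    base.PreExecute();
    conn = new SqlConnection(@"Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI");
    conn.Open();
    cmd = new SqlCommand("select dbo.ufn_MonthAbbrev(@m)", conn);
    cmd.Parameters.Add("@m", SqlDbType.NChar, 2);
}
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    cmd.Parameters["@m"].Value = Row.MonthTwoDigit;   // '01' in ...
    Row.MonthAbbrev = (string)cmd.ExecuteScalar();    // ... 'Jan' out
}
public override void PostExecute()
{
    conn.Close();
    base.PostExecute();
}
A database round trip per row is slow on large flows; a simple switch on '01'..'12' inside the script avoids it entirely.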
View 4 Replies
View Related
Jul 30, 2007
I'm creating a new Integration Services project that copies data out of a SQL Server 7 server, transforms it, and places the data on a SQL Server 2005 (SP2) server. When defining a lookup transformation, if I specify an OLE DB connection to my server running SQL Server 7 as the reference table, then as soon as I click on the Columns tab, Visual Studio closes / crashes and dumps me to Windows. I don't get an error message. If however I specify a connection to a server running SQL Server 2000 or SQL Server 2005, there are no problems.
Is this supposed to happen?
My workstation is running Windows XP Pro SP2, Visual Studio 2005 Pro.
Microsoft SQL Server Integration Services Designer
Version 9.00.1399.00
The server that doesn't work for a reference table is running Windows 2000 Server SP4
SQL 7.00.623
Thanks for your help,
Kirk
View 6 Replies
View Related
May 1, 2008
Hi,
I have a variable, a data source, and a Conditional Split transformation that checks count(*); when the count == 0, the rows go to a Script Component that changes the variable to False (initially it is True) and writes to a log file...
Then I check that variable in a precedence constraint in the control flow: if the variable == True, then success. BUT
whenever I run the package my data flow turns green even when the count == 0 condition is not met, so
my variable's value ends up "False". If the condition isn't met, my script shouldn't run.
Am I missing something?
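A guess at the cause: a Script Component's PostExecute fires even when zero rows reached the component, and since PostExecute is the only place a data-flow variable can be written, an unconditional assignment there flips the variable on every run. Guarding the write with a row-seen flag is the usual fix (Variables.MyFlag stands in for the real variable, C# shown; in a 2005-era Script Component the same members apply in VB):
private bool sawRows;
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    sawRows = true;   // at least one row came through the Conditional Split output
}
public override void PostExecute()
{
    base.PostExecute();
    if (sawRows) Variables.MyFlag = false;   // only flip the variable when rows arrived
}
The data flow turning green is expected either way: a path that carries zero rows is still a successful execution, and success or failure is judged at the task level, not by the split condition.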
View 7 Replies
View Related
Aug 30, 2006
I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).
All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.
I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.
Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?
Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
/// <summary>
/// Column indexes for faster processing.
/// </summary>
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;
/// <summary>
/// The GID web service.
/// </summary>
private GID.WS_PDF.PDFProcessService gidService = null;
/// <summary>
/// Called to initialize/reset the component.
/// </summary>
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}
/// <summary>
/// Only 1 input and 1 output with 1 column is supported.
/// </summary>
/// <returns></returns>
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}
/// <summary>
/// Called before executing, to cache the buffer column indexes.
/// </summary>
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}
/// <summary>
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
/// </summary>
/// <param name="inputID"></param>
/// <param name="buffer"></param>
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}
/// <summary>
/// Called after executing, to clean up.
/// </summary>
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}
View 1 Replies
View Related
Aug 3, 2007
When I copy an SSIS package I have been developing from my laptop to my desktop with Windows File Sharing (shared folders) across a home network, the moved package fails to load properly. I can see the Control Flow tasks but not the Data Flow tasks - they simply disappear! I have updated the Connection Managers to point to the new machine, and tested them (OK).
It's the same as this issue posted here, and here.
Creating an empty package and clicking to create a new data flow cause this error:
TITLE: Microsoft Visual Studio
------------------------------
Failed to create the task.
------------------------------
ADDITIONAL INFORMATION:
The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
Seems the only way is to uninstall and reinstall Client Components.
How do I uninstall and reinstall the client components? When I try to do that, SQL Server Setup tells me "None of the selected features can be installed or upgraded. Setup cannot proceed since there is no effective change being made to the machine..."
I had selected "Workstation components, Books Online and development tools". This is already installed - so how can I reinstall it if the message above does not let me proceed?
View 2 Replies
View Related
Jul 1, 2015
My source has 2.2 million records and I'm performing an incremental load. In the Lookup transformation I used the destination table as the reference, in full cache mode. The first time, the package executed successfully, but when I executed it a second time the package suddenly hung while running. I truncated the data from the destination table and restarted the SQL Server services; after doing all this the package worked again, but on its second execution it hung again. I have 8 GB RAM and an i5 2.5 GHz processor laptop.
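A possible reading of the symptom: on the first run the destination table is empty, so the full-cache Lookup has nothing to load and starts instantly; on the second run it must pull all 2.2 million freshly inserted rows into memory before a single source row flows, which can look like a hang. Caching only the columns the Lookup actually needs (a SELECT of the business key and surrogate key rather than the whole table), or switching to partial cache, are the usual remedies.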
View 7 Replies
View Related
Dec 5, 2006
I have a Fuzzy Lookup in a data flow, but the data flow never gets past the pre-execute phase. What is the problem?
Thanks
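For what it's worth, Fuzzy Lookup builds its match index during the pre-execute phase, so a large reference table can hold the component there for a very long time while appearing stuck; persisting the match index and reusing it on later runs (the MatchIndexOptions / reuse existing index settings) is the usual mitigation.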
View 2 Replies
View Related
Feb 2, 2008
Slow OLEDB Source in Data Flow
Hi All,
I have a simple data flow task composed of an OLEDB Source, a Conditional Split, and two insert statements executed one after the other. When I run my package in Visual Studio for debugging, I noticed that after around ~9,800 records flow through the first insert and another ~9,800 through the second, the OLEDB Source takes 3 or 4 minutes to fetch the next set of ~9,800 records. I have set the DefaultBufferMaxRows property of the data flow to 10000. The query that retrieves the 700,000 records runs for about 2-3 minutes on its own (which I think is decent enough). Is this expected SSIS behavior? The transfer of these 700,000 records takes forever to finish. Please help.
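A possible explanation: the source is probably not slow at all. Row-by-row insert statements downstream fill the pipeline's buffers, and once every buffer is full the OLEDB Source simply waits (backpressure) until the inserts drain them. Replacing the per-row inserts with OLE DB Destinations using fast load, or raising their commit/batch sizes, is the usual way to let the source stream freely.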
View 4 Replies
View Related
Jun 11, 2007
Hello,
I developed an SSIS package doing a nightly load into a data warehouse. We have an 8 hour loading window - currently the package takes 16 hours to complete.
I isolated the problem to a Data Flow task where +-35% of the time is spent. This task is pretty straightforward:
- OLE DB source, reading +- 800,000 rows from a SQL server database
- 13 Lookups in sequence, to get surrogate keys from dimension tables. The lookups are all on GUIDs.
- An aggregation
- OLEDB target, fact table in a SQL server database.
It seems unreasonable for this task to take over 5 hours. The majority of the time is spent on the lookups - not so much at the target, source, or aggregation.
Any comments and advice will be greatly appreciated.
Thanks.
(PS some machine details:
OS Name Microsoft(R) Windows(R) Server 2003, Standard Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name ARK-SQL
System Manufacturer HP
System Model ProLiant DL380 G5
System Type X86-based PC
Processor x86 Family 6 Model 15 Stepping 6 GenuineIntel ~1866 Mhz
Processor x86 Family 6 Model 15 Stepping 6 GenuineIntel ~1866 Mhz
BIOS Version/Date HP P56, 9/18/2006
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone South Africa Standard Time
Total Physical Memory 3,327.30 MB
Available Physical Memory 938.20 MB
Total Virtual Memory 1.10 GB
Available Virtual Memory 2.78 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys)
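Two tentative suggestions: first, make sure each of the 13 lookups caches only its GUID business key and the surrogate key, not the whole dimension row, since a full-cache Lookup loads every column in its reference query, e.g.:
select DimCustomerKey, SourceGUID from dbo.DimCustomer   -- hypothetical dimension
second, with about 3.3 GB of physical memory and 13 caches alive at once, watch for paging; if memory is the constraint, resolving the keys set-based in the database (stage the 800,000 rows and join all 13 dimensions in one INSERT...SELECT) tends to beat per-row probes on 16-byte GUID keys.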
View 6 Replies
View Related
Oct 26, 2015
Say you have a fact table with a few columns that all reference the same key column in a dimension table; how do you write a view to return the information for those keys?
USE MyTestDB;
GO
SET NOCOUNT ON;
IF OBJECT_ID ('dbo.FactTemp' ,'U') IS NOT NULL
DROP TABLE dbo.FactTemp;
[Code] ....
I'm using very small data at the moment, and the query plan and statistics don't really favor one approach over the other.
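The conventional shape, for what it's worth, is one join per referencing column - the role-playing dimension pattern. A sketch against the tables above (DimTemp and the column names are assumptions):
create view dbo.vFactTemp
as
select f.FactKey,
       d1.Name as Key1Name,   -- first reference to the dimension
       d2.Name as Key2Name    -- second reference, same dimension, new alias
from dbo.FactTemp f
join dbo.DimTemp d1 on d1.DimKey = f.Key1
join dbo.DimTemp d2 on d2.DimKey = f.Key2;
At small volumes the plans will look interchangeable; the alias-per-role joins scale predictably because each is a simple key seek.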
View 2 Replies
View Related
Jan 2, 2007
Hi Guys,
I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.
The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc. present in the middle.
Previously I used SQL 2000 DTS to load the files in, and it was just a column transformation with Col001 from the text file loading straight to my table column. After the load, if I select len(col) it gives me 750 for all rows.
Once I started to migrate this to SSIS, I set up the data flow task, specified the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the data flow task it copies only 181 or 231 characters out of the 750 required. I feel it stops where it finds the SPACES and skips the rest.
I specified row delimiters of CR and LF. I checked the file under UltraEdit and there were no special characters in the file that would cause the problem.
Any suggestions on how I can get it to load the full data?
Thanks
View 26 Replies
View Related
Aug 14, 2007
Hi All,
I have to load my fact table using data from my stage table, joining it against dimensions. Most of my dimensions are straight joins that I can implement using Lookups; however, in one join I am using something like this:
SELECT T.Dim_Time_ID,
SUM(Measure)
FROM [dbo].[Stage_Table] S
INNER JOIN Dim_Time T
ON SUBSTRING(T.MonthYear,1,3) = SUBSTRING(S.Time,1,3)
AND SUBSTRING(T.MonthYear,6,2) = SUBSTRING(S.Time,6,2)
GROUP BY T.Dim_Time_ID   -- needed for the aggregate
Currently I am using a Copy Column transformation to make a copy of this column, then using the substring function in my SQL code and the Lookup transformation joining the data to get the desired output. I was wondering if there is a better way of accomplishing this inside the data flow.
Thanks
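One alternative that stays inside the data flow, sketched: compute the match key once in a Derived Column instead of Copy Column plus SQL, and give the Lookup a reference query that pre-computes the same key, so the join becomes a plain equi-join:
MonthKey : SUBSTRING([Time],1,3) + SUBSTRING([Time],6,2)
with the Lookup's reference query (column names from the post, otherwise assumed):
select Dim_Time_ID, substring(MonthYear,1,3) + substring(MonthYear,6,2) as MonthKey
from dbo.Dim_Time
Materializing a proper month key in both tables would remove the string surgery altogether.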
View 4 Replies
View Related
Feb 15, 2007
Hi,
I'm having trouble with a Script Component in a data flow task. I have code that does a SqlCommand.ExecuteReader() call that throws an 'Object reference not set to an instance of an object' error. Thing is, the SqlCommand.ExecuteReader() call is already inside a Try..Catch block. Essentially I have two questions regarding this error:
a) Why doesn't my Catch block catch the exception?
b) I've made sure that my SqlCommand object and the SqlConnection property that it uses are properly instantiated, and the query is correct. Any ideas on why it is throwing that exception?
Hope someone could help.
View 3 Replies
View Related
May 8, 2007
Hi all,
Maybe someone here can help me out: I have a Kimball type II dimension, where I track changes in a hierarchy. Each row has RowStartDate and RowEndDate properties to indicate from when to when a certain row should be used.
Now I want to load facts to that table. So each fact will have a certain date associated with it that I can use to look up the right Id (a certain SourceId can have multiple integer Ids when there are historic changes) and then load the facts.
Is there a building block I can use for that? I could do this with SQL scripts, but the client would prefer to have as much as possible done in SSIS. The Lookup transformation will only let me specify an equi-join (A = B), but I need equality on one column (SourceId) plus >= and <= (RowStart and RowEnd) to find the right row version.
Any ideas? Thanks in advance,
John
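The partial-cache workaround sketched for the ValidFrom/ValidUntil question earlier on this page may apply here too: a Lookup with memory restriction enabled accepts a parameterized statement on the Advanced tab that can express the range, e.g.:
select Id, SourceId, RowStartDate, RowEndDate
from dbo.DimHierarchy   -- hypothetical dimension table
where SourceId = ? and ? between RowStartDate and RowEndDate
with the first ? mapped to the fact's SourceId and the second to its date column; a Merge Join on SourceId followed by a Conditional Split on the date range is the set-oriented alternative.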
View 3 Replies
View Related
Sep 16, 2014
I have heard from my client that they are facing a duplicate data issue on one of the fact tables.
Basically there is a view built on the fact table, and clients access the data through the views.
The warehouse is loaded daily through SSIS packages. The duplicate-records issue occurs only when the views are queried during the data loading process; the duplicates are gone once the load is completed.
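A guess at the mechanism: if the views or their callers read with NOLOCK / read uncommitted, queries running mid-load can see both old and new versions of in-flight rows, and the "duplicates" evaporate once the load commits. Making readers see a consistent point in time, e.g.:
ALTER DATABASE MyDW SET READ_COMMITTED_SNAPSHOT ON;   -- MyDW is a placeholder name
or loading into a shadow table and swapping it in (partition switch or sp_rename) at the end, are the usual fixes.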
View 1 Replies
View Related