Doing A Calculation Using Data From Two Different Source Tables

May 6, 2008



I use the following report expression for my SSRS report.


=SUM(Fields!CBNonAccr.Value) / SUM(Fields!TotNonAccr.Value)*ReportItems!HE__Non_Accruals.Value


When I try to preview the report, I get an error message saying "Report item expressions can only refer to other report items within the same grouping scope."

Fields!CBNonAccr.Value and Fields!TotNonAccr.Value come from one table, but HE_Non_Accruals.Value comes from another table that has no relationship with the first one. It's a separate table.
So I'm wondering, how do I do this now?

Thanks

View 1 Replies



YTD Calculation When Using SSAS As Data Source

Jun 14, 2007

I have a cube that tracks sales by sales rep. In this cube, I have dimensions for SalesRep, Product, and Region hierarchies. I also have a Time dimension that provides Fiscal Year, Fiscal Quarter, Fiscal Month, and Calendar Date; the Time dimension also has an attribute showing the first day of the year for any given date (our fiscal year starts on a different date every year).



I have a report that passes StartDate and EndDate parameters back to the cube, and provides sales numbers by Rep, Product, and Region for that given date range. What I would also like is a field that provides YTD sales through the EndDate parameter.



This is a piece of cake for me to implement against the SQL tables, but I am pulling my hair out trying to determine the best way to implement with a cube. Does anyone have any suggestions?
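For context, a hedged T-SQL sketch of the relational YTD logic being described (dbo.DimDate, dbo.FactSales, and all column names are assumptions, not objects from the post); the open question remains how to express the same thing against the cube:

Code:

-- Illustrative only: YTD sales from the fiscal-year start that applies
-- to @EndDate through @EndDate.
DECLARE @EndDate datetime;
SET @EndDate = '20070614';

DECLARE @FYStart datetime;
SELECT @FYStart = FiscalYearStartDate
FROM   dbo.DimDate
WHERE  CalendarDate = @EndDate;

SELECT f.SalesRepKey,
       SUM(f.SalesAmount) AS YTDSales
FROM   dbo.FactSales AS f
WHERE  f.SalesDate BETWEEN @FYStart AND @EndDate
GROUP BY f.SalesRepKey;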



TIA



- Steve

View 1 Replies View Related

Integration Services :: How To Do Data Profiling On 3 Source Tables

Sep 29, 2015

I need to do data profiling on 3 source tables. Can I use the Data Profiling task for it?

I am mapping the XML output to an Excel file using a Data Flow task.

View 2 Replies View Related

Importing Data From One Source In Two Destination Tables Linked By A Foreign Key

Nov 14, 2006

Hi,

I have a new problem when I import data from an XML source file into two destination tables. The two tables are linked by a foreign key. For example:

table MOTHER (MOTHER_ID, MOTHER_NAME)

table CHILD (CHILD_ID, MOTHER_ID, MOTHER_NAME)

After a lot of transformations, data is inserted into the MOTHER table, and I want to insert other fields of the data flow into the CHILD table. To do this, I need the MOTHER_ID field, which is auto-incremented in the MOTHER table.

My problem is chaining the insertion into the CHILD table after the insertion into the MOTHER table, to be sure that the related row in the MOTHER table has really been inserted. I haven't found any way to chain another transformation task after my data flow destination "Insert into MOTHER table".

The only solution I have found is to create a new data flow to insert data into the CHILD table, using a Lookup transformation to bind to the MOTHER table... but with this solution all my data flow transforms are performed twice...

Is there a solution to chain two insertions with a foreign key constraint in a data flow?
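Purely for illustration (the staging table name is an assumption), the set-based T-SQL equivalent of what the Lookup achieves is to capture the auto-incremented MOTHER_ID at insert time with the OUTPUT clause and reuse it for the child rows:

Code:

DECLARE @NewMothers TABLE (MOTHER_ID int, MOTHER_NAME varchar(100));

-- Insert the parents and capture the IDENTITY values they were given.
INSERT INTO dbo.MOTHER (MOTHER_NAME)
OUTPUT inserted.MOTHER_ID, inserted.MOTHER_NAME INTO @NewMothers
SELECT MOTHER_NAME
FROM   dbo.XmlStaging;            -- hypothetical staging of the XML rows

-- Insert the children, reusing the captured MOTHER_ID.
INSERT INTO dbo.CHILD (MOTHER_ID, MOTHER_NAME)
SELECT nm.MOTHER_ID, nm.MOTHER_NAME
FROM   @NewMothers AS nm;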

Thanks

Regards,

Arnaud Gervais.

View 4 Replies View Related

How To Replace Data Of Tables From Source To Destination Database Using SSIS

Apr 29, 2008

I would like to replace the data of some tables from the STG database to the DEV database daily using an SSIS package. Should I use the "Transfer SQL Server Objects Task" to do that? Thanks.
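One hedged alternative (assuming STG and DEV are databases on the same instance; Customer is just a stand-in table name): an Execute SQL Task that truncates and reloads each table instead of the Transfer SQL Server Objects Task:

Code:

-- Illustrative refresh of one table; repeat (or loop) per table.
-- TRUNCATE assumes no foreign keys reference the target; otherwise use DELETE.
TRUNCATE TABLE DEV.dbo.Customer;

INSERT INTO DEV.dbo.Customer (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM   STG.dbo.Customer;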

View 5 Replies View Related

Integration Services :: Get Data From Source By Executing Set Of Queries That Have Temp Tables

Jul 29, 2015

I need to grab data from Teradata (using an ODBC connection). I have no issues if it's just a bunch of joins and WHERE conditions, but now I have a challenge. Simple scenario: I have to create a volatile table, dump data into it, and then grab data from that volatile table. (I don't want to modify the query so that it avoids the volatile table; it's a pretty big query and I have no choice but to create a bunch of volatile tables. The scenario above just describes the simple one-volatile-table case.)

So I created a proc and am trying to pass this string into Teradata; I'm not sure if it works, or what options I have. (I don't have the luxury of creating a proc in Teradata, executing it whenever I want, and then grabbing data from the table.)

View 2 Replies View Related

An Error Has Occurred During Report Processing. A Data Source Instance Has Not Been Supplied For The Data Source DetailDS_get_o

Mar 13, 2008

Hi,

I am trying to build a drill-through report (.rdlc). I have written the following code in the Drillthrough event of the ReportViewer. Whenever I click on the first report, I get an error like:

An error has occurred during report processing.

A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".







The code is:



using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = new DataSet();
        ds = obj.get_order();

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues = e.Report.GetParameters();

        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }

        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = new DataSet();
        ds = obj.get_orderdetail(order_id);

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}

The code in the method get_orderdetail(order_id) is:

public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    DataSet ds = new DataSet();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return (ds);
}

Please help me.

View 1 Replies View Related

Power Pivot :: One Slicer To Control Two Pivot Tables That Have Different Source Data And Common Key

Jul 8, 2015

I have two data tables:

1) Production data with column headers: Key, Facility, Line, Time, Output
2) Costs data with column headers: Key, Site, Cost Center, Time, Cost

The tables have a common key column named, obviously, Key.

I would like to have two pivot tables that I can filter with ONE slicer based on the Key column. The first pivot table shows row labels Facility and Line and column label Time; the value field is Output. The second pivot table shows row labels Site and Cost Center and column label Time; the value field is Cost. How can I do this with Power Pivot? I tried linking both tables to a table of unique Keys in Power Pivot and then creating a PivotTable where I would have used the Key from the Keys table.

View 5 Replies View Related

Analysis :: SSAS Calculation With Division Combined With A Time Calculation?

Sep 17, 2015

I have created calculated measures in a SQL Server 2012 SSAS multidimensional model by creating empty measures in the cube and using SCOPE statements to fill in the calculation

(so I can use measure security on the calculations, as explained here).

SCOPE [Measures].[C];

THIS = IIF([B]=0,0,[Measures].[A]/[Measures].[B]);

END SCOPE;

View 2 Replies View Related

Amo And Creating Data Source And Data Source View Code

Feb 2, 2008

Hi,
In this code, how can I create a new data source, a new data source view, a model, and a structure so that it runs dynamically?
In this code I get a lot of errors; they are about the server and database, which are not defined in the current code.
In this code, should I first define the server, or not?

Database dbNew = new Database (databaseName,
Utils.GetSyntacticallyValidID(databaseName, typeof(Database)));
srv.Databases.Add(dbNew);
dbNew.Update(true);
***********************************************************

How can I create the data source, data source view, model, and structure?
Please show me the code for that and guide me.
databaseName and srv are unknown.
Do I need to add another reference besides Analysis Services?
Please explain these pieces of code:
************************************************************************
1)
RelationalDataSource dsNew = new RelationalDataSource(
datasourceName,
Utils.GetSyntacticallyValidID(
datasourceName,
typeof(RelationalDataSource)));

db.DataSources.Add(dsNew);
dsNew.ConnectionString = connectionString;

dsNew.Update();
2)

RelationalDataSourceView rdsv;

rdsv = db.DataSourceViews.Add(
datasourceviewName,
Utils.GetSyntacticallyValidID(
datasourceviewName,
typeof(RelationalDataSourceView)));

rdsv.DataSourceID = ds.ID;
***************************************************************
3)
OleDbConnection cn = new OleDbConnection(ds.ConnectionString);
OleDbCommand cmd = new OleDbCommand(
"SELECT * FROM [" + tableName + "] WHERE 0=1", cn);
OleDbDataAdapter ad = new OleDbDataAdapter(cmd);

DataSet dss = new DataSet();
ad.FillSchema(dss, SchemaType.Source);

*************************************************************
4)

// Make sure we have the name we thought

dss.Tables[0].TableName = tableName;

// Clone here - the original DataTable already belongs to a DataSet
rdsv.Schema.Tables.Add(dss.Tables[tableName].Clone());
rdsv.Update();


5)

MiningStructure ms = db.MiningStructures.Add(miningstructureName, Utils.GetSyntacticallyValidID(miningstructureName,
typeof(MiningStructure)));
ms.Source = new DataSourceViewBinding(dsv.ID);
ms.CaseTableName = "Customer";

Add columns:
ScalarMiningStructureColumn smsc;

// From table "Customer" we will add a couple of columns
// CustomerID - key
smsc = new ScalarMiningStructureColumn("Customer ID",
Utils.GetSyntacticallyValidID("Customer ID", typeof(ScalarMiningStructureColumn)));
smsc.IsKey = true;
smsc.Content = "Key";
smsc.KeyColumns.Add("Customer", "customer_id", OleDbType.Integer);
ms.Columns.Add(smsc);

*******************************************
6)

MiningModel mm = ms.MiningModels.Add(miningmodelName,
Utils.GetSyntacticallyValidID(miningmodelName,
typeof(MiningModel)));


mm.Algorithm = "Microsoft_Decision_Trees";
mm.Parameters.Add("COMPLEXITY_PENALTY", 0.3);


MiningModelColumn mc = new MiningModelColumn("Customer ID",
Utils.GetSyntacticallyValidID("CustomerID",
typeof(MiningModelColumn)));
mc.SourceColumnID = ms.Columns["Customer ID"].ID;
mc.Usage = "Key";
mm.Columns.Add(mc);


mm.Update();

Please tell me exactly what I need.
Thanks a lot for your answer.
Please don't move this question, because I don't know where else I should post it.

View 1 Replies View Related

How Do I Add An ODBC Connection Data Source As A Data Flow Source

Mar 2, 2007

I have set up a new connection as a connection from data source, but I cannot see how to use this connection to create my Data Flow Source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14 - 15 minutes. The same process in Access using SQL on a linked table via DSN takes 45 seconds.

Have I missed something in my set up of the OLE DB source / connection? Will a DSN source be faster?

Thanks in advance

ADG

View 2 Replies View Related

Converting Oracle Calculation To Sql Server 2005 Calculation

Jul 19, 2007

Hi, I am having to convert some Oracle reports to Reporting Services. Where I am having difficulty is with the calculations.

Oracle

TO_DATE(TO_CHAR(Visit Date + Visit Time/24/60/60, 'DD-Mon-YYYY HH24:MI:SS'), 'DD-Mon-YYYY HH24:MI:SS')



This is as far as I have got with the SQL version:

SQLSERVER2005

= DateAdd("s", Fields!VISIT_DATE.Value, Fields!VISIT_TIME.Value/24/60/60)



visit_date is a date datatype and visit_time is a number datatype.



using:

VS 2005 BI Tools

SQLServer 2005
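A hedged alternative is to build the combined datetime in the dataset query instead of in a report expression, for example:

Code:

-- Assumes VISIT_TIME holds seconds past midnight, as the /24/60/60
-- in the Oracle expression implies; the table name is made up.
SELECT DATEADD(second, VISIT_TIME, VISIT_DATE) AS VISIT_DATETIME
FROM   dbo.Visits;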



View 5 Replies View Related

Create Data Source, Data Source View From XML?

May 8, 2007

Hi everybody,
I want to create a data source and data source view for data mining using C#. I have created a data source and data source view and exported them to XML files, but when I move to another computer and run those XML files, they return an error when I run the statement to create and build the mining model. What can I change in the XML, or how can I run the XML on another computer successfully?
And if I have to build the data source and data source view there, how do I do it?

Thanks for your help.

View 1 Replies View Related

Cannot Create A Connection To Data Source 'data Source Name'

Dec 11, 2007

Today I was making a few reports.
When I tested the reports in Visual Studio, they worked great: I got the expected results.
But when I deployed the reports to our report server, the problems started.
When I click on the directory in which my reports are deployed, I see my 4 reports.
Up to that point everything works correctly.
But when I click on a report to view the results, it goes wrong.
I get an error:
"Cannot create a connection to data source 'Live'"
(Live is the name of our data source).

We are using Windows logons and I am sure that I have all the rights on the server; I gave myself 'sysadmin' rights, so it should work.
I have also tried it with all the roles assigned to my account, but it still won't work.

When I modify the data source and point it to another server and database, it works.
The data source 'Live' is on an x64 MS SQL server, and the other data source is on an x86 MS SQL server.
Maybe that is the problem?

Can someone tell me what is wrong?

View 1 Replies View Related

HELP: A Data Source Instance Has Not Been Supplied For The Data Source

May 20, 2007

Hi there,



I'm trying to build a report for Windows Forms (C#) using .rdlc files.

But every time I run it, I get this error message. I've followed the step-by-step instructions from MSDN... but it doesn't work!



Could anyone help me?!





Tks a lot!



Luis Antonio - Brazil

View 1 Replies View Related

Loading Data Using Ole Db Source With Input Source Being A View

Dec 13, 2007

I was trying to load data using SSIS with a Data Flow Task: an OLE DB Source (the source was a view) to an OLE DB Destination (SQL Server). This view returns 420,591 rows in Query Analyzer in 21 seconds; the row length is 925. When I tried to execute the Data Flow Task from SSIS, I had to stop the process after 30 minutes because only 2,000 rows had been retrieved. I modified the view to return TOP 440,000 and reran; this time all 420,591 rows were retrieved and written in 22 seconds. Next, I tried TOP 100 PERCENT; again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 GB free, the databases are on a separate drive with 200 GB free, the server has 13 GB of memory, and no other processes were executing.

The only way I could populate the table was by using an Execute SQL Task from SSIS and hard-coding an INSERT into the table selecting data from the view (35 seconds).
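For reference, the Execute SQL Task workaround amounts to something like this (table, view, and column names are placeholders):

Code:

-- Illustrative only: the hard-coded, set-based load described above.
INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM   dbo.SourceView;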

Has anyone else experienced this or a similar issue? Does anyone have a solution/explanation?

View 13 Replies View Related

Excluded Data In A Calculation

Nov 17, 2007

In the following query, when an es.evstrname DOES have data with os.activity in (5,7), the query returns the expected results. However, if there was no out-of-service break, the query still returns results, but there is no data in the last column of the select statement. This makes sense to me, but I was hoping there is some way to get data returned anyway. For instance, is there a way to make the query read (os.actualdeparttime - os.actualarrivetime) as zero, so the column would still consider the first part of the equation? Or is there a way to write a subquery that would work around this?

Select s.ldate,
es.evstrname as 'Run',
d.lastname+', '+d.firstname as 'Driver',
po.actualarrivetime as 'PullOut',
os.actualarrivetime as 'OOS',
os.actualdeparttime as 'IS',
pi.actualdeparttime as 'PullIn',
os.actualdeparttime - os.actualarrivetime as 'Break',
(pi.actualdeparttime - po.actualarrivetime) - (os.actualdeparttime - os.actualarrivetime) as 'PayTime'

From Schedules S

Join eventstrings es
On s.schid=es.schid

Join employees d
On d.employeeid=es.employeeid

Join events po
on po.evstrid=es.evstrid
and po.schid=es.schid
and po.activity=4

Join events pi
on pi.evstrid=es.evstrid
and pi.schid=es.schid
and pi.activity=3

Left Outer Join events OS
on os.evstrid=es.evstrid
and os.schid=es.schid
and os.activity in (5,7)


Where es.evstrname>=?
AND es.evstrname<=?
AND s.ldate>=?
AND s.ldate<=?

Order by s.ldate, es.evstrname, po.actualarrivetime, os.actualarrivetime, os.actualdeparttime, pi.actualdeparttime

Every es.evstrname will always have 'event activity' rows (pi.activity, po.activity, etc.) of 3 and 4. Only an es.evstrname that has clocked out of service will have data with os.activity in (5,7) (this is an out-of-service break).

I'm not sure if this is a format that would help, but here is some sample data. Currently, it returns the following:

Date Run Pullout OOS IS PullIn Break PayTime
10-1 101 10:00 12:00 12:30 16:00 :30 5:30
10-1 102 11:00 ---- ---- 17:00 ---- -----

I would like it to return this:

Date Run PI OOS IS PullIn Break PayTime
10-1 101 700 900 930 945 30 215
10-1 102 700 --- --- 945 -- 215
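One hedged way to get Break and PayTime back for runs with no out-of-service event is to compute the differences with DATEDIFF and wrap the outer-joined part in COALESCE, so a missing break counts as zero minutes (this expresses both columns in minutes rather than as time values). The last two items of the SELECT list would become something like:

Code:

-- Sketch: a missing out-of-service row contributes a zero-minute break.
COALESCE(DATEDIFF(minute, os.actualarrivetime, os.actualdeparttime), 0) as 'Break',
DATEDIFF(minute, po.actualarrivetime, pi.actualdeparttime)
    - COALESCE(DATEDIFF(minute, os.actualarrivetime, os.actualdeparttime), 0) as 'PayTime'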

Thanks! Craig

View 2 Replies View Related

Optimizing Data Calculation Inside SP

Oct 1, 2007

Hi,
I created an SP which is supposed to calculate some values for me and return them as a resultset. I have a RequestTime field and a ResponseTime field; this SP should calculate how much time it takes us to respond to a customer's request. If it is more than a specific limit, the SP should calculate the extra time and its fine, based on a constant fine-per-extra-minute value.
It should also calculate the total fine for all records.
To do so, I wrote it like this:


Code:

CREATE PROCEDURE up_responsetime
@sdate smalldatetime,
@edate smalldatetime,
@TotalFine int OUTPUT
AS
DECLARE @Fine_Per_Min int
DECLARE @Max_ResponseTime int
SET @Fine_Per_Min = 3
SET @Max_ResponseTime = 120

SELECT ID,RequestDate,RequestTime,ResponseDate,ResponseTime,
-- Calculate exceeded amount of time for each record
DATEDIFF(minute,RequestDate+RequestTime,ResponseDate+ResponseTime) - @Max_ResponseTime as ExtraTime,
-- Calculate fine for each record
(DATEDIFF(minute,RequestDate+RequestTime,ResponseDate+ResponseTime) - @Max_ResponseTime) * @Fine_Per_Min as Fine
FROM CusRequests
WHERE RequestDate BETWEEN @sdate AND @edate
--Calculate sum of all fines and return it in TotalFine variable.
SELECT @TotalFine = SUM((DATEDIFF(minute,RequestDate+RequestTime,ResponseDate+ResponseTime) - @Max_ResponseTime) * @Fine_Per_Min) FROM CusRequests
GO



But I have two concerns about this:
1- The calculated Fine field returns a negative figure if the response falls within the desired period of time. I want it to return zero in such cases, and a positive figure only when extra time was spent responding to the request.

2- As you can see, there are many redundant calculations in this code, and this will affect its performance. I want to know if there is a more optimized way to write such code.
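For illustration, a hedged rewrite of the detail SELECT (reusing the procedure's own table and variables) that computes the elapsed minutes once via CROSS APPLY and clamps the extra time at zero; @TotalFine could then be summed from the same clamped expression over the filtered rows:

Code:

SELECT  r.ID, r.RequestDate, r.RequestTime, r.ResponseDate, r.ResponseTime,
        x.ExtraTime,
        x.ExtraTime * @Fine_Per_Min AS Fine
FROM    CusRequests AS r
-- Compute the elapsed minutes exactly once per row.
CROSS APPLY (SELECT DATEDIFF(minute,
                             r.RequestDate + r.RequestTime,
                             r.ResponseDate + r.ResponseTime) AS ElapsedMin) AS e
-- Clamp the exceeded time at zero so no negative fines appear.
CROSS APPLY (SELECT CASE WHEN e.ElapsedMin > @Max_ResponseTime
                         THEN e.ElapsedMin - @Max_ResponseTime
                         ELSE 0
                    END AS ExtraTime) AS x
WHERE   r.RequestDate BETWEEN @sdate AND @edate;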

I appreciate any help on this.
Thanks a lot.

P.S. Sorry if the post was lengthy.

View 2 Replies View Related

Power Pivot :: Using Data From Slicers In DAX Calculation

May 13, 2015

I have a formula that should use data from 2 slicers:

Spend per period (changing currencies & dates):=[Spend per period]*CALCULATE([Sum of Value],FILTER(Currency,Currency[Date]=[End Date]),FILTER(Currency,Currency[Attribute]=CurrencySlicer[Attribute]))

I managed to link the [End Date] from the slicer to the formula, however the [Attribute] field is not numeric so I can't duplicate the same methodology. 

{FYI:    End Date:=LASTDATE('Finish Date Slicer'[Column1])    }

I assume that I need to build a formula to extract the data chosen in the slicer, and can't connect it directly to the slicer. 

View 2 Replies View Related

Data Mining :: Calculation Not Working Trigger

Apr 20, 2015

In my table, I have three columns, Qty, Price, and Total, and I created a trigger for this.

Table:
CREATE TABLE [dbo].[tbl_SO](
[SoNo] [varchar](50) NULL,
[Qty] [int] NULL,
[price] [numeric](18, 2) NULL,
[total] [numeric](18, 2) NULL
) ON [PRIMARY]

[code]...

Input:
1) UPDATE TBL_SO SET QTY='10', Price='100' WHERE SONo='10'
2) UPDATE TBL_SO SET QTY='10', Price='100', Total=Qty*Price WHERE SONo='10'

Output:
What do I get? I tried both 1 and 2; the query does not work the first time, but when I execute it a second time it works. Why? What I need is for it to work the first time.
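The trigger body itself was lost in the [code]... block above. Purely as an illustration of one common way such a trigger is written (an assumption, not the original code), an AFTER trigger that recomputes total from the just-updated Qty and price looks roughly like this:

Code:

-- Hypothetical trigger body: recompute total for every row touched,
-- joining back on SoNo (the table as posted has no primary key).
CREATE TRIGGER dbo.trg_tbl_SO_Total
ON dbo.tbl_SO
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE s
    SET    s.total = s.Qty * s.price
    FROM   dbo.tbl_SO AS s
    JOIN   inserted   AS i ON i.SoNo = s.SoNo;
END;

A computed column (total AS Qty * price) would be another way to keep total consistent without any trigger at all.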

View 6 Replies View Related

Simple Simple Linking Tables & Perform Calculation

Mar 22, 2007

found it

View 3 Replies View Related

SQL Server 2012 :: Day Wise And Date Range Calculation With Looping Or Dynamic Data?

May 14, 2015

I am using Sql Server 2012.

This is how I calculate the ratio of failures in an order:

31 Days (Table 1) query:
SUM(CASE
        WHEN DATEDIFF(dd, serDATE, '2015-01-21') >= 31 THEN 31
        WHEN DATEDIFF(dd, serDATE, '2015-01-21') < 0 THEN 0
        ELSE DATEDIFF(dd, serDATE, '2015-01-21')
    END) AS [31days1]

How do I loop and pass dates dynamically into the DATEDIFF?

31 Failures (Table 2) query:
SUM(CASE WHEN sometable.FAILUREDATE BETWEEN DATEADD(DAY, -31, CONVERT(DATETIME, '2015-01-21 23:59:00.0', 102))
                                        AND CONVERT(DATETIME, '2015-01-21 23:59:00.0', 102)
         THEN 1 ELSE 0 END) AS Failures31

31 Day Cal (formula), combining both Table 1 and Table 2:
(365 * (CONVERT(decimal(8,1), T2.Failures31) / T1.[31day])) AS [31dayCal]

This works fine when done for a specific order.

I want a similar kind of calculation done day-wise and month-wise.

2. What approach should I be using to achieve the day-wise and month-wise calculation?

I also have a table called Calender with the list of dates that I can use.
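Rather than looping and passing dates one at a time, one hedged approach is to cross join the Calender table so the same expression is evaluated once per date (the table and column names here are assumptions):

Code:

-- Evaluate the 31-day window for every date in Calender instead of
-- hard-coding '2015-01-21'.
SELECT c.CalendarDate,
       SUM(CASE
               WHEN DATEDIFF(dd, t1.serDATE, c.CalendarDate) >= 31 THEN 31
               WHEN DATEDIFF(dd, t1.serDATE, c.CalendarDate) < 0  THEN 0
               ELSE DATEDIFF(dd, t1.serDATE, c.CalendarDate)
           END) AS [31days1]
FROM   dbo.Calender AS c
CROSS JOIN dbo.Table1 AS t1
WHERE  c.CalendarDate BETWEEN '2015-01-01' AND '2015-01-31'
GROUP BY c.CalendarDate;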

View 3 Replies View Related

Data Warehousing :: Copy Data From Staging Tables To Other Instance Master Tables?

Aug 14, 2015

I need to copy data from warehouse tables to master tables on different SQL instances. The refresh needs to be done once an hour. What is the best way to do this? SQL Agent jobs or SSIS packages?
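Either can work. As a hedged sketch, the core of each hourly refresh is just a set-based upsert per table (all names below are assumptions), which a SQL Agent job step or an SSIS Execute SQL Task can run:

Code:

-- Runs on the master (target) instance; pulls from the warehouse
-- through a linked server.
MERGE dbo.Customer AS tgt
USING [WarehouseServer].WarehouseDB.dbo.Customer AS src
      ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.CustomerName = src.CustomerName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, CustomerName)
    VALUES (src.CustomerID, src.CustomerName);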

View 3 Replies View Related

Join Some Tables From XML Source

Oct 2, 2007


Hello everyone.

I read one XML file with an XML Source. It represents this file as many tables.
I need to save some data from the different XML tables into one table of the SQL Server database. Between the XML Source and the OLE DB Destination I use a Derived Column transformation.
How can I join two XML tables into one? For the Derived Column I can specify only one XML table.

If I use this structure :
XML Source -> Derived Column -> OLE DB Destination

I can specify only one XML table for the Derived Column.

How can I add another Derived Column object that uses the data from another table of the XML Source and saves it into the OLE DB Destination?

Thanks a lot for your help!

View 4 Replies View Related

SQL 2012 :: Replication To A Database With More Tables Than Source

Apr 14, 2015

We need to replicate data from a live database to a reporting database nightly.

There is data which is only needed for the reporting function - namely the selection criteria for reports which are to be run nightly or monthly.

Is it possible to have extra tables in the target database, holding the selection criteria?

If not, we'll have the reports running heterogeneous queries between the target database and a separate database with the selection criteria in it.

View 1 Replies View Related

Dynamically Selecting Tables In OLEDB Source

Mar 17, 2008

hello guys,

I have 10 tables: table1, table2, table3, table4, ..., table10. All these tables have different structures.
From each of these tables I want to extract data and dump it into a flat file (CSV).

So I have an OLE DB Source and a Flat File Destination.

If I write a separate data flow task for each of the tables, there is no issue.

But I want to use a single data flow task for all these tables. So for this, I use a variable @SQLStr and dynamically set its value to select * from table1, select * from table2, ..., select * from table10.

So in the OLE DB Source I set the data access mode to "SQL command from variable" and use @SQLStr as the variable.

For the destination, I dynamically generate the flat file for each of the tables.

But this doesn't work; I get validation errors.

So, first, can I use a single data flow task to dynamically change the source (different source tables) and destination? If so, what am I missing in the above process?

Any help appreciated.

Thanks

View 8 Replies View Related

Dynamically Pick The Source And Destination Tables

Feb 27, 2007



I want to write an SSIS package that picks up the source and destination tables at runtime. Is it possible? We have an SSIS package that is used to pull data from Oracle, but the source and destination table names change.

View 5 Replies View Related

No Columns When Using Temp Tables In T-SQL In OLEDB Source

Mar 14, 2006

We have a complicated select query that needs to build a couple of temporary work tables that are then used in the final select statement (in an OLE DB Source data flow component). We can click Preview and see the result set, but if we click on the Columns view there are no columns. We can save and close the OLE DB Source component, but downstream from it there are messages saying that there are no input columns. The T-SQL looks something like this (abbreviated):

SELECT fieldlist INTO #temp1 FROM table

SELECT fieldlist INTO #temp2 FROM table

SELECT fieldlist FROM table INNER JOIN #temp1 INNER JOIN #temp2

DROP TABLE #temp1; DROP TABLE #temp2

Has anyone been able to use temp tables in a source SQL statement in a data flow? Are we doing something wrong or incomplete?
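One hedged workaround is to restructure the statement with common table expressions so the OLE DB Source can derive its column metadata without temp tables; a sketch with made-up object names:

Code:

-- Same shape as the temp-table version, but as CTEs.
WITH Temp1 AS (
    SELECT OrderID, SUM(Amount) AS TotalAmount
    FROM   dbo.OrderLines
    GROUP BY OrderID
),
Temp2 AS (
    SELECT OrderID, MAX(ShipDate) AS LastShipDate
    FROM   dbo.Shipments
    GROUP BY OrderID
)
SELECT o.OrderID, t1.TotalAmount, t2.LastShipDate
FROM   dbo.Orders AS o
JOIN   Temp1 AS t1 ON t1.OrderID = o.OrderID
JOIN   Temp2 AS t2 ON t2.OrderID = o.OrderID;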

Thanks, Gordy

View 3 Replies View Related

Query:Source From Multiple Tables To A Fact Table

Jan 19, 2006

Greetings,

I am new to SQL 2005. I am using DTS to transfer data from my source to the warehouse. I have a couple of tables in my source where I have to join the two tables' fields and insert the result into the warehouse fact table. I have used a join query in my OLE DB Source component; what other component needs to be used to insert the data into the fact table?
I also need to extract the same data with aggregation and insert it into another fact table.
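As a hedged sketch (all names invented), in plain T-SQL the join feeds the detail fact table directly, and the aggregate fact table uses the same join with a GROUP BY; in the data flow the equivalent is the OLE DB Source join plus an Aggregate transform feeding a second destination:

Code:

-- Detail fact load from the two joined source tables.
INSERT INTO dbo.FactSalesDetail (OrderID, ProductID, Quantity, Amount)
SELECT o.OrderID, l.ProductID, l.Quantity, l.Quantity * l.UnitPrice
FROM   src.Orders AS o
JOIN   src.OrderLines AS l ON l.OrderID = o.OrderID;

-- Aggregated fact load built on the same join.
INSERT INTO dbo.FactSalesSummary (ProductID, TotalQuantity, TotalAmount)
SELECT l.ProductID, SUM(l.Quantity), SUM(l.Quantity * l.UnitPrice)
FROM   src.Orders AS o
JOIN   src.OrderLines AS l ON l.OrderID = o.OrderID
GROUP BY l.ProductID;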

Kindly help.

View 21 Replies View Related

Integration Services :: Load Different Source Files Into Tables

Oct 12, 2015

I have one small requirement: I want to load different types of files (.txt, .csv, .tsv, .xlsx).

Using one Foreach Loop container, how can I load the files into the database without using a Script task to split the file names? Is there any other way to load all the files using a Foreach Loop container and an Execute SQL task?

View 2 Replies View Related

Disable Automatic Renaming Of Source Tables By View Editor?

Sep 29, 2012

Is there a way to disable the automatic renaming of source tables by the view editor? (SQL Server 2008 R2)

I have a view that uses several sub-queries (including a rank/partition), and even though each sub-query is contained in parentheses and given an alias, the view designer automatically adds _1, _2 to all of the tables it thinks are duplicates, which then invalidates the explicit field references in CASE WHEN statements...

View 1 Replies View Related

Pipeline Error-excel Source-data Reader Does Not Read In Meta Data

Apr 16, 2008

Hi all, I got this error:


[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

and also this:

[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.


I tried going to Data Flow -> Excel connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file,
and also Fiscal Year and Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?

Thanks

View 13 Replies View Related

XML Data Source .. Expression? Variable? Connection? Error: Unable To Read The XML Data.

Feb 23, 2008

RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.

I want my XML data source to be an expression, as I will be looping through a directory of XML files.

I don't see the expression property or the connection property?

I tried setting the XMLData property to @[User::filename], but that results in:

Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).


Thanks for any help or information.

View 3 Replies View Related






