Common DTS Source & Multiple Destinations

Feb 2, 2004

I want to run multiple DTS packages which export data into text files.
There is only one Data Source and multiple destinations.
When I write code for this in VB, for each package I need to define the source connection individually. Can't I use the same source connection which I used for the first package in the subsequent packages?

View 1 Replies



Multiple OLE DB Destinations

Feb 28, 2008

Can you have more than one OLE DB destination connected to the Slowly Changing Dimension task for the New Output scenario? I know I can have more than one if they have different outputs, like New Output for one and the other having the Fixed Attribute output or Unchanged output.

I have a query that gets data, but I want to insert that data into 2 different tables at the same time. I cannot re-query to get the data, since it may change the data I get the second time I query.

So if I have a query that returns a data set, how can I insert into 2 different tables?
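
If the two inserts can be done in T-SQL rather than inside the data flow, one way to avoid running the query twice is to capture the result once and insert it from the captured copy. A minimal sketch; the table and column names below are placeholders:

-- Run the volatile query exactly once and keep its rows.
SELECT col1, col2
INTO #Captured
FROM dbo.SourceTable;   -- placeholder for the real query

-- Insert the same captured rows into both targets.
INSERT INTO dbo.TargetA (col1, col2)
SELECT col1, col2 FROM #Captured;

INSERT INTO dbo.TargetB (col1, col2)
SELECT col1, col2 FROM #Captured;

DROP TABLE #Captured;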

View 1 Replies View Related

Multiple Sources & Destinations...

Apr 13, 2006

Hello,

I am working on a typical data conversion project where we are migrating data from an old data model to a new data model, using SSIS. Both the DBs are in SQL.

Now we have a situation where say there are 25 source tables and 20 odd target tables.

For transporting data, we are using OLE DB Source & OLE DB Destination transforms. However, each transform maps to one view or one table. As a result, the Data Flow is really messed up with 45+ transforms in it. Is there an elegant way of doing this? With, say, just one data source or maybe fewer transforms?

Thanks,

Satya

View 1 Replies View Related

Data Flow Task Multiple Destinations

Sep 21, 2006

Hi,

The further I get with my current SSIS package, the more I am starting to wonder about best practices and performance.

My current package loops through CSV files in a specified location and extracts events from these files. Each file contains multiple events which are a mixture of different types, and depending on the event there are a different number of comma-separated values. In the package I first read each event as a single column using a comma delimiter. I then create an array for the event by splitting on the delimiter. In a script I weed out all elements of the array that are common to all events and set the remaining elements to another array. After some processing I come to my Conditional Split transformation, which splits the processing of each event based on the EventID. This is where I'm having doubts on whether I have approached the package correctly. There are approximately 60 different events, so each one of these has a separate pipeline to process the remaining parameters in the array and output them to the destination table. The destination table is different for each ID. Is it viable to have this many conditions and paths in the package, and is this likely to have any detrimental effect on performance? Is there possibly another way that I could approach this problem?

Many thanks, I hope that made sense.

Grant

View 8 Replies View Related

Multiple Initiators To Common Target

Apr 3, 2006

Information for configuring Service Broker when you have multiple initiators seems to be thin on the ground.
The examples are all point-to-point and seem to require separate logins for each initiator and corresponding exchange of certificates using a multi-pass installation.
E.g. Rushi's ServiceListing API.

We have an application with several hundred workstations which need to send transaction data to department servers and then on to corporate. There is also a need to replicate small amounts of reference data to all levels.

The examples seem to generate an installation and administrative nightmare.

I have a question - would the following scheme work?

1. All our machines are on the same domain so transport security can be handled by Windows Security (One less set of certificates to exchange).

2. Generate 2 certificates on the corporate database and export them to the department and workstation databases; use these pre-installed certificates in each database to handle dialog security for all conversations. I.e. multiple workstations feed into one user ID on the
department server using a single certificate. Ditto for all the department servers feeding into corporate.

3. Use Service Broker IDs to determine where messages are sent (as the same services would be installed at multiple locations).

4. When a database is installed/restored at a workstation, does this invalidate any certificates that are shipped with it?

View 5 Replies View Related

Select From Multiple Tables No Common Field

Oct 17, 2005

I know there is some kind of rule against the following SQL statement, but I was wondering what to do to get around this problem (some kind of grouping). Sorry for the stupid question.

SELECT * FROM Table1, Table2 WHERE Table1.ID IS NOT NULL AND Table2.ID IS NOT NULL

Basically I want to select all records from the two tables (they have the same fields, but are just different specialties) and then output them, but there is nothing in common between the two to reference one another, and it ends up in some kind of loop. Thanks for the help.
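
Since the two tables reportedly have the same fields, one hedged sketch is to stack them with UNION ALL instead of joining them (the column list here is an assumption):

-- Return every qualifying row from both tables as one result set.
SELECT 'Table1' AS SourceTable, ID, Name, Specialty
FROM Table1
WHERE ID IS NOT NULL
UNION ALL
SELECT 'Table2', ID, Name, Specialty
FROM Table2
WHERE ID IS NOT NULL;

Listing the tables with a comma in the FROM clause, as in the original statement, produces a cross join (every row of Table1 paired with every row of Table2), which is the "loop" being described.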

View 2 Replies View Related

Multiple Tables Grouped By Common Field

Jan 4, 2008

Here is my situation. I am building a report that has three different tables, each with its own dataset. Example: Opportunities, Leads, Activities. All three of these datasets/tables have a common field - SalesID. I would like the report to show the first SalesID, then all Opportunities for that SalesID in the Opportunities table, followed by all Leads for that SalesID in the Leads table, followed by all Activities for that SalesID in the Activities table, and then roll over to the next SalesID and repeat that for all SalesIDs. Any suggestions on how I could achieve this? Thanks in advance for all help!

View 6 Replies View Related

Multiple Operations On A Common Table Expression

Jan 28, 2008

Hi,
I'd like to perform a number of different operations on my Common Table expression but I seem to be limited to only one operation. For example I cannot both delete duplicate rows and then perform a select statement. I can only execute one of the statements referencing the common table expression.

What is wrong with my syntax?


;With OrderedTable AS
(
    select Row_number() OVER (partition BY SSNumber order by Department_Id desc) AS ROWID, * from Employee
)
delete from OrderedTable where RowId != 1

SELECT COUNT(*), SSNumber FROM OrderedTable group by Department_Id order by count(*) desc
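
A common table expression is visible only to the single statement that immediately follows it, so each statement needs its own CTE (or the rows must be captured into a temp table first). A minimal sketch of the two-statement version against the Employee table above; note the second statement groups by SSNumber so that the selected column is valid (the original groups by Department_Id while selecting SSNumber, which would not compile):

-- Statement 1: remove duplicate rows per SSNumber.
;WITH OrderedTable AS
(
    SELECT ROW_NUMBER() OVER (PARTITION BY SSNumber ORDER BY Department_Id DESC) AS RowId, *
    FROM Employee
)
DELETE FROM OrderedTable WHERE RowId <> 1;

-- Statement 2: redefine the CTE for the follow-up query.
;WITH OrderedTable AS
(
    SELECT ROW_NUMBER() OVER (PARTITION BY SSNumber ORDER BY Department_Id DESC) AS RowId, *
    FROM Employee
)
SELECT COUNT(*) AS RowCnt, SSNumber
FROM OrderedTable
GROUP BY SSNumber
ORDER BY COUNT(*) DESC;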

View 1 Replies View Related

Grouping Common Functionality In Multiple Stored Procedures

Oct 22, 2007

Hi, I have always used views in my code to group common functionality in my SQL expressions, and then I can simply call these views in my data access layer by saying:
SqlCommand cmd = new SqlCommand("SELECT * FROM vw_Documents WHERE CategoryID = @CategoryID", cn);
However my view has become so complicated that I had to convert it to a stored procedure called sp_Documents. The problem now though is that I wish to do queries against the data returned, but I can't simply say:
SqlCommand cmd = new SqlCommand("SELECT * FROM sp_Documents WHERE CategoryID = @CategoryID", cn);
The only way I can see to do it is to create a stored procedure for every single scenario I have, passing in the appropriate values as parameters. This seems a pretty messy solution to me because I would have repeated logic in all my stored procedures. Therefore I was wondering if there's a simpler way for me to do this, or am I just being lazy :).
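
One common workaround (a sketch, not the only option) is to materialize the procedure's result set with INSERT ... EXEC into a temp table and then filter that; the column list below is a placeholder for whatever sp_Documents actually returns:

-- Capture the stored procedure's result set so it can be queried like a table.
CREATE TABLE #Documents
(
    DocumentID INT,
    CategoryID INT,
    Title      NVARCHAR(200)
);

INSERT INTO #Documents (DocumentID, CategoryID, Title)
EXEC dbo.sp_Documents;

DECLARE @CategoryID INT;
SET @CategoryID = 1;

SELECT *
FROM #Documents
WHERE CategoryID = @CategoryID;

DROP TABLE #Documents;

Another route, if the logic allows it, is to rewrite sp_Documents as an inline table-valued function, which can be referenced directly in a FROM clause and filtered with WHERE.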
Appreciate if someone could help,

View 2 Replies View Related

Sql Query Which Uses Multiple Tables But No Common Field To Join

Jan 29, 2004

Hello-

I have a SQL query that I am using to populate a datagrid. The problem is that one of the tables is a month table and the other tables are full of data, so there is no common column name to match on using an inner join.

How do I do this?
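
One hedged sketch of how a month table is commonly joined when there is no shared column name: build the join condition from an expression over the data table's date column. The table and column names below are assumptions:

-- Hypothetical tables: dbo.Months(MonthNumber, MonthName) and dbo.Sales(SaleDate, Amount).
SELECT m.MonthName, SUM(s.Amount) AS TotalSales
FROM dbo.Months AS m
LEFT JOIN dbo.Sales AS s
    ON MONTH(s.SaleDate) = m.MonthNumber   -- join on a derived value, not a shared column name
GROUP BY m.MonthNumber, m.MonthName
ORDER BY m.MonthNumber;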

View 6 Replies View Related

Encountered Problems With Multiple Common Table Expression

Feb 20, 2006

Hi guys,

I'm trying to have two common table expressions in my stored procedure, but I'm receiving errors when executing it. I found that they can't exist side by side; once I removed one of them, the stored procedure executed successfully.

The following are the errors

Code:


Msg 156, Level 15, State 1, Procedure GetProductsByCategory, Line 27
Incorrect syntax near the keyword 'With'.
Msg 319, Level 15, State 1, Procedure GetProductsByCategory, Line 27
Incorrect syntax near the keyword 'with'. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.
Msg 319, Level 15, State 1, Procedure GetProductsByCategory, Line 33
Incorrect syntax near the keyword 'with'. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.



I'm using SQL Server Express
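
For reference, two CTEs can share a single WITH clause when they are separated by a comma, and the statement before the WITH must end with a semicolon (exactly what Msg 319 is pointing at). A minimal sketch; the Products columns used here are assumptions, not taken from the original procedure:

;WITH CategoryTotals AS
(
    SELECT CategoryID, COUNT(*) AS ProductCount
    FROM Products
    GROUP BY CategoryID
),
ExpensiveProducts AS
(
    SELECT ProductID, ProductName, CategoryID
    FROM Products
    WHERE UnitPrice > 50
)
SELECT ep.ProductName, ct.ProductCount
FROM ExpensiveProducts AS ep
JOIN CategoryTotals AS ct ON ct.CategoryID = ep.CategoryID;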

View 4 Replies View Related

Using Multiple Report Viewers On MOSS With Common Parameters

May 17, 2007

Hi!

I'm using Reporting Services Integration Mode and I have a page that has a number of report viewer webparts on it to display reports. Is there a way that I can add another webpart to pass parameters to all the reports at once instead of individually? I am trying to build a dashboard and this is one of the requirements. I see on the Report Viewer web part, that there is an option in Connections to 'Get report parameters from' , but it is greyed out. Any ideas on how to do this?



Thanks!

Brian

View 6 Replies View Related

Power Pivot :: One Slicer To Control Two Pivot Tables That Have Different Source Data And Common Key

Jul 8, 2015

I have two data tables:

1) Production data with column headers: Key, Facility, Line, Time, Output
2) Costs data with column headers: Key, Site, Cost Center, Time, Cost

The tables have a common key, named obviously as Key.
I would like to have two pivot tables which I can filter with ONE slicer based on the column Key. The first pivot table shows row labels Facility, Line and column labels Time; the value field is Output. The second pivot table shows row labels Site, Cost Center and column labels Time; the value field is Cost. How can I do this with Power Pivot? I tried linking both tables above to a table with unique Keys in Power Pivot and then creating a PivotTable where I would have used the Key from the Keys table.

View 5 Replies View Related

How To Combine Multiple Rows Data Into Single Record Or String Based On A Common Field.

Nov 18, 2007

Hello Folks.
Here is the original data in my single SQL 2005 table:
Department:    Sells:
1              Meat
1              Rice
1              Orange
2              Orange
2              Apple
3              Pears
The data I would like to read, separated by semicolons:
Department:    Sells:
1              Meat;Rice;Orange
2              Orange;Apple
3              Pears
I would like to read my data via an SP or a Visual Studio 2005 page. Any help will be appreciated. Thanks.
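
For SQL Server 2005, one common pattern is FOR XML PATH('') combined with STUFF. A sketch assuming a hypothetical table dbo.DepartmentSells(Department INT, Sells VARCHAR(50)) holding the rows above:

-- Build one semicolon-separated string of Sells values per Department.
SELECT d.Department,
       STUFF((SELECT ';' + s.Sells
              FROM dbo.DepartmentSells AS s
              WHERE s.Department = d.Department
              FOR XML PATH('')), 1, 1, '') AS Sells
FROM dbo.DepartmentSells AS d
GROUP BY d.Department;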
 
 

View 2 Replies View Related

Raw File As Source For Multiple Packages

May 23, 2007

I have a question regarding Raw Files. I am breaking a large package into more modular components for better processing and debugging.



The process will start with a preparatory dataflow that will create a Raw File(s). This Raw File will then be used as the source in possibly 6 data flows and/or packages.



My question is whether one Raw File can be read concurrently by multiple jobs and how this would affect processing. I'm assuming that this would slow processing.



My other option is to Multicast the writing of the Raw File to 5 other versions of the file. All would be identical except for filename. Obviously this would use more disk space but this is not a concern as we have lots of disk space. Our concern is for speedy processing.



If you have experience with Raw Files, please let me know how you approached this issue. As always, blogs and specific examples are always great!



Thanks in advance.

View 4 Replies View Related

One Package, Many Destinations

Aug 1, 2006

Hello,



I have made a package to import data from a flat file datasource to our development SQL Server database and am very pleased about how well it works.

Now, it's time to import these data into our pre-production database using the same package. I am still wondering how to parametrize the connection string so that I can switch configuration easily. I specifically want to avoid creating 2 similar packages to do the same job but with different destinations.

I have read about variables and configurations but I am still confused. They say that the package has to be reloaded or something for a change to happen. At best, it would be perfect to have a drop down list somewhere, select the correct database and hit a button to execute the package. I would be very interested in knowing if .NET could be of any use for this kind of job.

Where can I get information about this topic?



Thanks!

Marc Lacoursiere

View 1 Replies View Related

Sources And Destinations

Nov 13, 2007

I have around 120 tables. I am using a Script Component to pull the values from Oracle stored procedures. I do not want to create 120 sources and destinations in my data flow. Please advise how this can be possible with one Script Component and one OLE DB destination.


I noticed I can add multiple outputs in one Script Component, which makes the Script Component work like a dataset (a container of different recordsets/tables), if I am correct. Can this be redirected to a dynamic OLE DB connection?

View 14 Replies View Related

Excel Destinations?

May 8, 2006

Hi, I have a question: which is the best provider to use for connection managers that map .xls files?

The default I have on my server is

Native OLE DB Microsoft Jet 4.0 OLE DB Provider

but is there any other?

View 1 Replies View Related

Need Example Of Source Component With Multiple Non-Error Outputs

Aug 21, 2007

Can someone please point me to one or more examples of a Source component that has two or more outputs?

I wrote a source, which works with a single output, but after adding a second output, I see no rows there. I've single-stepped the code in the debugger, and it looks like it should be adding rows to the second output, but the downstream component never sees any.

Thanks.

View 7 Replies View Related

Problem With Dynamic Source For Multiple File

May 1, 2008

I have a problem while creating a dynamic source connection.

I have four files which come with different extensions; apparently the names of the files are the same. For example:

9500.txt, 9500.rtf, 9500.dat, 9500.map. They are all text files but with different formats. I have a problem selecting a dynamic source connection for each of these files.

When I create a variable for the input file name, the file connection gets confused and throws an error with the input file.

Please help.

Thank you




View 13 Replies View Related

Single Source, Multiple Lookups Against Same Table

Nov 16, 2006

Let's say I have 4 columns coming from my OLE DB source.

Column1
Column2
Column3
Column4

I also have a table that I'll be using in a lookup, LUPTable. In LUPTable, I have two fields, LUPField, ReplaceField.

In my data flow, I need to take columns, 2-4, and look them up against LUPField in LUPTable. I then need to add the value of ReplaceField (when a match is found) into the data flow.

The problem that I'm running into is that I don't want to sequentially do the lookups in the dataflow, because that's just a waste of time/memory. I only need to build the in-memory lookup table once, because that exact same data (it is static, for the most part) will be used for the remaining lookups.

What is the best way to achieve this?

The goal is to have the following columns remaining in the dataflow:
Column1
NewColumn2 (containing value from ReplaceField)
NewColumn3 (containing value from ReplaceField)
NewColumn4 (containing value from ReplaceField)
Column2-4 can be dropped from the dataflow after the lookups.
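
If the lookups could be pushed into the source query instead of separate Lookup transforms, one sketch (dbo.SourceTable stands in for whatever the OLE DB source reads) is to join LUPTable once per column:

-- One pass over the source, LUPTable referenced three times with different join keys.
SELECT src.Column1,
       l2.ReplaceField AS NewColumn2,
       l3.ReplaceField AS NewColumn3,
       l4.ReplaceField AS NewColumn4
FROM dbo.SourceTable AS src
LEFT JOIN dbo.LUPTable AS l2 ON l2.LUPField = src.Column2
LEFT JOIN dbo.LUPTable AS l3 ON l3.LUPField = src.Column3
LEFT JOIN dbo.LUPTable AS l4 ON l4.LUPField = src.Column4;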

Thanks,
Phil

View 7 Replies View Related

OLE DB Source Executing SQL Command Multiple Times

Jan 2, 2008



Happy new year everyone!!!


Ok, now for my not so happy new year problem ... here's the scenario:

I have a simple task of exporting data retrieved by a parameterized stored procedure to a delimited text file ... task done in 30 mins max using dataflow task from OLE DB Source with SQL Command to Text File Destination ... thank you very much SSIS!

But the problem is that the OLE DB Source executed the SP MULTIPLE TIMES (3 to be exact).
The actual configuration is:

1. OLE DB Source
2. SQL Command access mode
3. The following command written in the command text box:

SET FMTONLY OFF;
exec sp_SampleSP @param1=?, @param2=?
4. Configured parameter mapping.

The stored procedure logs to a database table every time it is executed, to report how many rows were selected (please don't crucify me for this method, I'm re-using procs from an old system). The log shows that during execution of this package the SP was executed 3 times.

Help anyone? Please ...

View 5 Replies View Related

Query:Source From Multiple Tables To A Fact Table

Jan 19, 2006

Greetings,

I am new to SQL 2005. I am using DTS to transfer data from my source to the warehouse. I have a couple of tables in my source where I have to join these two tables' fields and insert the result into the warehouse fact table. I have used a join query in my OLE DB source component; what other component needs to be used to insert the data into the fact table?
I also need to extract the same data with aggregation and insert it into another fact table.

Kindly help.

View 21 Replies View Related

Cannot Have Two Flat File Destinations In A Package (possible Bug)?

Jan 4, 2007

During my development of an SSIS package I've noticed that when creating two control flows that pull data from separate tables, each going to its own flat file, the second keeps the column-name attributes from the first. So when I create my second flat file, not only does it have the names of its correct columns, but it also has the column names of the first flat file.

I'm hoping that I've explained this correctly. I'll provide more info, or I can provide the package code, if anyone would like.

View 3 Replies View Related

Order Of Destinations In A Data Flow

Dec 15, 2005

Hi,

A long long time ago I asked for the ability to define an order in which you insert data into multiple destinations in a single data-flow. This was to get around the problem of loading tables in the same data-flow that have FK relationships between them.

My chosen alternative at the moment is to load the table with the unique key first and drop the data for the table with the FK into a raw file and load it in a separate data flow. Another alternative is to disable FK constraints and re-enable them after, but I've chosen the raw file method.
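
For reference, the "disable FK constraints and re-enable after" alternative mentioned above looks roughly like this in T-SQL; table and constraint names are placeholders:

-- Before the load: stop the FK from being checked.
ALTER TABLE dbo.ChildTable NOCHECK CONSTRAINT FK_ChildTable_ParentTable;

-- ... run the data flow that loads parent and child ...

-- After the load: re-enable the FK and re-validate existing rows so it stays trusted.
ALTER TABLE dbo.ChildTable WITH CHECK CHECK CONSTRAINT FK_ChildTable_ParentTable;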



Now that I'm using SSIS on a real project this is becoming a much bigger problem than I ever envisaged it might be. It's rare that I'm building stuff where I don't have to use raw files for this very reason, and this means that I have god knows how many raw files hanging around all over the place.

I suppose the reason for this post is to flag the importance of this requested feature. I really hope that this makes it into Katmai. Or into a service pack would be even better!

It's a bit of a failing at the moment.



Is there any chance that this will make it into Katmai? It's one of my biggest bugbears about SSIS v1.



-Jamie





[SSIS Team Followup]

View 4 Replies View Related

Two Destinations In A Data Flow Task

Aug 13, 2007

Hi All,

I want to know if we can have more than one "OLE DB Destination" within a Data Flow Task. I want to use the same data flow and write to two different tables in a database with some changes. If we cannot do this within the same data flow, what is the best way to do this?

Thanks

View 3 Replies View Related

FastLoadMaxInsertCommitSize For Data Flow Destinations

May 19, 2006

Is there a way to programmatically set (using expressions or variables) the FastLoadMaxInsertCommitSize property of an OLE DB destination in a data flow for fast load operations?
Basically, I want to set FastLoadMaxInsertCommitSize based on the number of records that are going to be inserted.

View 1 Replies View Related

BI Dev Studio I See No DATAFLOW DESTINATIONS In The Toolbox

May 17, 2007

I just installed sql2005 and bi dev studio on a vista box. Then VS 2005 and all svc packs.
I create a new SSIS package but don't see any Data Flow Destinations in the toolbox while on the dataflow tab. I choose "show all" and don't see any data flow destinations anywhere.
I can use the wizard to create packages, and it adds destinations just fine.
Anyone have any idea what may be wrong?

View 1 Replies View Related

HOW TO: Create Dynamic Excel Destinations

Jun 22, 2006

I have a ForEach Loop Container that is running from a Foreach ADO Enumerator with records telling me which companies have records to export. As I loop through, I use a data flow task to export the records to Excel, and I want to create separate Excel files using some of the parameters from my recordset as parts of the name.

I have DelayValidation=True for my DFT and my Excel Connection Manager, ValidateExternalMetaData=False for my Excel Destination Adapter, and an expression setting the ExcelFilePath and ServerName properties to the dynamic path & file name from variables.

The layout will be the same (i.e. metadata) for each file. The files are just getting broken up by company and service type and I want to use that in naming the files.

I am currently getting the following errors:

[EX_DST New Enrollments File [238]] Error: An OLE DB error has occurred. Error code: 0x80040E37.

[EX_DST New Enrollments File [238]] Error: Opening a rowset for "NewEnrollments$" failed. Check that the object exists in the database.

[DTS.Pipeline] Error: component "EX_DST New Enrollments File" (238) failed the pre-execute phase and returned error code 0xC02020E8.

What do I have to do to create the new Excel file? I thought it would do it when the properties were set. Do I have to create the "table" for the worksheet named "NewEnrollments"? If so, how do I accomplish it?
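
One approach that is often suggested for this (hedged, not confirmed by this thread) is to run an Execute SQL Task against the Excel connection manager before the data flow; a CREATE TABLE statement issued through the Jet provider typically creates both the workbook (if it does not already exist) and the worksheet. The column list below is a placeholder for the NewEnrollments layout:

-- Hypothetical DDL executed via an Execute SQL Task that uses the Excel connection manager.
CREATE TABLE NewEnrollments
(
    MemberID    NVARCHAR(50),
    CompanyName NVARCHAR(100),
    ServiceType NVARCHAR(50),
    EnrollDate  DATETIME
)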

Thanks in advance.

sk

View 1 Replies View Related

Should Conditional Split Destinations Distinct?

May 13, 2008

Source
||
Conditional Split
||
dest, dest, dest (same table)

Debugging stops with a yellow-filled box; it gets stuck and does not proceed further.
I removed the two other destinations and it works.
What is this issue? Any solution for it?

View 16 Replies View Related

Spliting Flat Files Into Two Destinations

May 18, 2007

For some reason I am having a really hard time grasping IS and I have a task that I would imagine is easy.

I have a flat file source with 6 columns, and I would like to import this file into two flat files: one file containing columns 1, 2, 3, 5 and the second containing 2, 4, 5, 6. I created the connection managers for both destination files, but I can't determine what transformation tool I need to accomplish this task. Could you help?

View 3 Replies View Related

DTS - Split Single Source Record (text) To Multiple Target (sql)

Aug 31, 2000

I am using DTS and VBScript in DataPump tasks in order to transfer large amounts of data from text files to an SQL database.

As the database uses a normalized schema, there is often the case of inserting multiple records in a destination table from various fields of the same record of the source text file.

For example, if the source record contains information about goods sold like date, customer, item code, item name and total amount, and does so for a maximum of 3 goods per sale (row), therefore has the structure:

[date], [custid], [code1], [name1], [amount1], [code2], [name2], [amount2], [code3], [name3], [amount3]

trying to transfer that record to a [SALES] target table (in a normalized database), we would have to split each source record as follows:

[date], [custid], [code1], [name1], [amount1]
[date], [custid], [code2], [name2], [amount2]
[date], [custid], [code3], [name3], [amount3]

What is the best way to do this using DTS?

I have tried using a datapump task and VBScript, and I guess it has to do with the DTSTransformStat_**** constants, but none of those I used seems to work
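
If the split can happen after a straight load into a staging table rather than inside the DataPump script, a T-SQL alternative (the staging table and SALES column names are assumptions) is:

-- Turn each wide staging row into up to three normalized SALES rows.
INSERT INTO dbo.SALES ([date], [custid], [code], [name], [amount])
SELECT [date], [custid], [code1], [name1], [amount1] FROM dbo.StagingSales WHERE [code1] IS NOT NULL
UNION ALL
SELECT [date], [custid], [code2], [name2], [amount2] FROM dbo.StagingSales WHERE [code2] IS NOT NULL
UNION ALL
SELECT [date], [custid], [code3], [name3], [amount3] FROM dbo.StagingSales WHERE [code3] IS NOT NULL;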

Vasilis Siatravanis,
siatravanisv@interamerican.gr , vasilliss@hotmail.com

View 6 Replies View Related

1 SP With Dynamic Input Parameters And Multiple Rows As The Source Of The Query

Dec 4, 2005

How can I run a single SP asking multiple sales questions, either by using the logical operator AND for all the questions, or using the logical operator OR for all the questions? So it's always either AND or OR but never mixed together. We can use the Northwind database for my question; it is very similar to the structure of the problem on the database I am working on.

IF (SELECT OBJECT_ID('REPORT_SELECTION')) IS NOT NULL
DROP TABLE REPORT_SELECTION
GO
CREATE TABLE REPORT_SELECTION
(
AUTOID INT IDENTITY(1, 1) NOT NULL,
REPSELNO INT NOT NULL, -- Identifies which report query this "sales question" is part of
SupplierID INT NOT NULL, -- from the Suppliers table
ProductID INT NOT NULL, -- from the Products table; if you choose a ProductID, SupplierID is selected also by inheritance
CategoryID INT NOT NULL, -- from the Categories table
SOLDFROM DATETIME NULL, -- Sold from which date
SOLDTO DATETIME NULL, -- Sold to which date
MINSALES INT NOT NULL, -- The minimum amount of sales
MAXSALES INT NOT NULL, -- The maximum amount of sales
OPERATOR TINYINT NOT NULL -- 1 is logical operator AND, 2 is OR
)
GO
INSERT INTO REPORT_SELECTION
SELECT 1, 1, 2, 1, '1/1/1996', '1/1/2000', 10, 10000, 1 UNION ALL
SELECT 1, -1, -1, 1, '1/1/1996', '1/1/2000', 10, 1000, 1

You can ask all kinds of sales questions like:

1. I want all employees that sold products from SupplierID 1 (Exotic Liquids), specifically ProductID 2 (Chang) from CategoryID 1 (Beverages), between Jan 1 1996 and Jan 1 2000, and sold between $10 and $10000 - AND, for my 2nd sales question
2. I want all employees that sold CategoryID 1 (Beverages) between Jan 1 1996 and Jan 1 2000 and sold between $10 and $1000

I want to get the common result of both questions and find out which employee(s) are in this list.

Here are some of the points:

1. I want my query to return the list of employees fitting the result of my sales question(s).
2. If I ask three questions with the logical operator AND, I want the list of employees that are common to all three questions.
3. If I ask 2, 3, 4... questions with the logical operator OR, I want the list of employees that are in the list of the 1st "successful" sales question (the first question that returns any employee is good enough).
4. You can ask all kinds of sales questions you want, even if they contradict each other. The SP should still run and return nothing if that is the case.
5. Let's assume you can have the same product name from the same supplier but under different categories. So entering a ProductID should not automatically enter the CategoryID also; whereas entering the ProductID should automatically enter its SupplierID.
6. SOLDFROM, SOLDTO, MINSALES, MAXSALES and OPERATOR are mandatory fields; you can't leave them NULL.
7. SupplierID, ProductID and CategoryID are the dynamic input parameters; there can be 5 different combinations to choose from:
a. SupplierID only
b. SupplierID and a ProductID
c. SupplierID and a CategoryID
d. SupplierID, ProductID and a CategoryID
e. CategoryID only
f. Any time you choose a ProductID, the SupplierID value will be filled automatically based on the ProductID's relationship.
g. Any of the three values here that is not chosen by the user will take a default value of -1 (meaning return ALL for this column, in other words don't filter by this column).

The major problem I have is that I can't use dynamic SQL for choosing the three dynamic columns, as the 2nd row of records could have a different selection of dynamic columns (at least I don't know how, if the solution is dynamic SQL). The only solution I can think of looks pretty bad to me.
I would use a cursor, run each row at a time, store a TRUE/FALSE value to decide whether to stop processing or not, and store the result in another detail table. Then if all AND questions have ended with TRUE, do a union of all the results and return the common list of employees. It sounds pretty awful as an approach. I am hoping there's a simpler method for achieving this. Does anyone know if any SQL book has a topic on this type of query? If so I'll definitely buy the book. I appreciate any help you can provide. Thank you.
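
One set-based sketch (an illustration against Northwind, not a full solution): evaluate every question row of REPORT_SELECTION for one REPSELNO, then keep employees that satisfy all rows for AND, or at least one row for OR. Note this treats OR as "any question matches", which simplifies point 3 above:

DECLARE @RepSelNo INT;
SET @RepSelNo = 1;

;WITH QuestionHits AS
(
    -- For each question row, which employees satisfy it (including the sales range)?
    SELECT rs.AUTOID, o.EmployeeID
    FROM REPORT_SELECTION AS rs
    JOIN Orders          AS o  ON o.OrderDate BETWEEN rs.SOLDFROM AND rs.SOLDTO
    JOIN [Order Details] AS od ON od.OrderID = o.OrderID
    JOIN Products        AS p  ON p.ProductID = od.ProductID
    WHERE rs.REPSELNO = @RepSelNo
      AND (rs.SupplierID = -1 OR p.SupplierID = rs.SupplierID)
      AND (rs.ProductID  = -1 OR p.ProductID  = rs.ProductID)
      AND (rs.CategoryID = -1 OR p.CategoryID = rs.CategoryID)
    GROUP BY rs.AUTOID, rs.MINSALES, rs.MAXSALES, o.EmployeeID
    HAVING SUM(od.UnitPrice * od.Quantity) BETWEEN rs.MINSALES AND rs.MAXSALES
)
SELECT qh.EmployeeID
FROM QuestionHits AS qh
GROUP BY qh.EmployeeID
HAVING COUNT(DISTINCT qh.AUTOID) >=
       CASE WHEN (SELECT MIN(OPERATOR) FROM REPORT_SELECTION WHERE REPSELNO = @RepSelNo) = 1
            THEN (SELECT COUNT(*) FROM REPORT_SELECTION WHERE REPSELNO = @RepSelNo)  -- AND: must match every question
            ELSE 1                                                                   -- OR: any one question is enough
       END;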

View 7 Replies View Related






