How To Replace Data Of Tables From Source To Destination Database Using SSIS
Apr 29, 2008
I would like to replace the data in some tables, from the STG database to the DEV database, daily using an SSIS package. Should I use the "Transfer SQL Server Objects Task" to do that? Thanks.
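If the goal is only to replace rows (not object definitions), a plain truncate-and-reload per table is often enough; a minimal T-SQL sketch, assuming both databases sit on the same instance and a hypothetical table dbo.Customer:
Code Snippet
-- Daily full refresh of one table: empty the DEV copy, then reload it from STG.
-- dbo.Customer is a hypothetical name; repeat per table.
-- (Use DELETE instead of TRUNCATE if foreign keys reference the table.)
TRUNCATE TABLE DEV.dbo.Customer;

INSERT INTO DEV.dbo.Customer
SELECT * FROM STG.dbo.Customer;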
View 5 Replies
Nov 14, 2006
Hi,
I have a new problem when I import data from an XML source file into two destination tables. The two tables are linked by a foreign key; for example:
table MOTHER (MOTHER_ID, MOTHER_NAME)
table CHILD (CHILD_ID, MOTHER_ID, MOTHER_NAME)
After a lot of transformations, data is inserted into the MOTHER table, and I want to insert the remaining fields of the data flow into the CHILD table. To do this, I need the MOTHER_ID value that is auto-incremented in the MOTHER table.
My problem is chaining the insertion into the CHILD table after the insertion into the MOTHER table, so that I can be sure the related row really exists in the MOTHER table. I haven't found any way to chain another transformation task after my data flow destination "Insert into MOTHER table".
The only solution I have found is to create a second data flow that inserts into the CHILD table, using a Lookup transformation to bind to the MOTHER table... but with that solution, all of my data flow transformations are performed twice...
Is there a solution for chaining two insertions linked by a foreign key constraint within a data flow?
Thanks
Regards,
Arnaud Gervais.
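For illustration, one way to avoid re-running the whole flow is to land the transformed child rows in a staging table and resolve MOTHER_ID afterwards with a join on the natural key; a T-SQL sketch, with STG_CHILD and the join column purely hypothetical:
Code Snippet
-- Second step, run after the MOTHER load has completed:
-- recover the auto-incremented MOTHER_ID via the natural key, then insert the children.
INSERT INTO CHILD (MOTHER_ID, MOTHER_NAME)   -- CHILD_ID assumed auto-incremented
SELECT m.MOTHER_ID, s.MOTHER_NAME
FROM STG_CHILD AS s
JOIN MOTHER AS m ON m.MOTHER_NAME = s.MOTHER_NAME;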
View 4 Replies
View Related
Sep 1, 2014
I am a complete newbie to SSIS. I can create a simple package to transfer data between SQL instances, and that's about it.
I have tableA (source data) and tableB (destination data). TableA has 4 columns and tableB has 5. I want to transfer all of the columns from tableA into tableB, but the 5th column in tableB needs to be populated with the server instance name of the server tableA sits on. Do I need multiple data sources to achieve this? I have tried, but no matter how I set it up, the column in the destination is set to Ignore.
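For illustration, one option is to return the instance name as a constant column in the source query itself, so no second data source is needed; a sketch assuming hypothetical column names col1-col4:
Code Snippet
-- @@SERVERNAME evaluates on the server the source connection points at,
-- so every row carries the instance tableA was read from;
-- map SourceInstance to tableB's fifth column in the destination.
SELECT col1, col2, col3, col4,
       @@SERVERNAME AS SourceInstance
FROM dbo.tableA;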
View 2 Replies
View Related
Apr 3, 2014
I need to use a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding data sources/destinations in the Choose Toolbox Items pane.
View 4 Replies
View Related
Apr 16, 2014
I have a source file and I have to load it into the database, changing the datatype of the columns, in SSIS.
View 1 Replies
View Related
May 9, 2007
Hi,
I have a query regarding data transfer using SSIS. If we use DTS packages with distributed transactions enabled for the transfer between two databases, then the source database and the destination database must be on different database servers or instances.
I need to know whether this constraint has been lifted for SSIS packages or whether it still applies.
One more thing: while the transfer is in progress, can we view/insert/update the source database, or is it locked?
Please reply..........
Thanks and Regards,
Rajesh
View 5 Replies
View Related
May 14, 2008
I need to write a T-SQL script that will insert, update, and delete rows in a production database from new, updated, and deleted data in a staging database. They are on different servers and I can only use T-SQL. Basically, the production DB needs to be synchronized with the staging DB.
I was thinking of using dynamic SQL, cursors, and linked servers. Any ideas?
Jim
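For illustration, a T-SQL sketch of the three set-based statements, assuming a linked server named STAGING and a hypothetical table dbo.Item with key ItemID; no cursors needed:
Code Snippet
-- Update rows that changed (NULL-safe comparison omitted for brevity):
UPDATE p
SET    p.ItemName = s.ItemName
FROM   dbo.Item AS p
JOIN   STAGING.StageDb.dbo.Item AS s ON s.ItemID = p.ItemID
WHERE  s.ItemName <> p.ItemName;

-- Insert rows that exist only in staging:
INSERT INTO dbo.Item (ItemID, ItemName)
SELECT s.ItemID, s.ItemName
FROM   STAGING.StageDb.dbo.Item AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Item AS p WHERE p.ItemID = s.ItemID);

-- Delete rows that no longer exist in staging:
DELETE p
FROM   dbo.Item AS p
WHERE  NOT EXISTS (SELECT 1 FROM STAGING.StageDb.dbo.Item AS s WHERE s.ItemID = p.ItemID);
On SQL Server 2008 or later the three statements can be collapsed into a single MERGE, but the pattern above also runs on 2005.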
View 2 Replies
View Related
Feb 27, 2007
I want to write an SSIS package that picks up the source and destination tables at runtime. Is that possible? We have an SSIS package that is used to pull data from Oracle, but the source and destination table names change.
View 5 Replies
View Related
Jun 5, 2007
Hi,
I have an SSIS package which calls a web service and returns a Dataset object in the form of an xml file (it does this successfully - and stores it successfully).
The Data Flow then picks up the file using an XML Source (with "use inline schema" specified) and passes it to a SQL Server Destination task for bulk load.
I've checked the mappings and everything seems to be ok, the package gives no warnings or errors when run.
The destination table is created on the SQL Server without problem, but at that stage the task finishes "successfully", without actually loading any table rows into the destination table.
Also in the SQL Server Destination task - when I select "preview" in the connection manager section - I can see the column definitions, but again - no rows of data.
Thanks to Simon Sabin for pointing me here - I did post on the managed msdn newsgroups in sql server programming but have had no replies as yet.
Below is the beginning of the XML DataSet object, output as an XML file, for information. If you need any more information just let me know.
Not quite sure where I'm going wrong here - thanks in advance
Crispin.
Code Snippet
<?xml version="1.0" encoding="utf-16"?>
<DataSet>
<xs:schema id="callStatisticsDataSet" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
<xs:element name="callStatisticsDataSet" msdata:IsDataSet="true" msdata:UseCurrentLocale="true">
<xs:complexType>
<xs:choice minOccurs="0" maxOccurs="unbounded">
<xs:element name="Table">
<xs:complexType>
<xs:sequence>
<xs:element name="date" type="xs:dateTime" minOccurs="0" />
<xs:element name="callID" type="xs:int" minOccurs="0" />
<xs:element name="cli" type="xs:string" minOccurs="0" />
<xs:element name="itemClosed" type="xs:string" minOccurs="0" />
<xs:element name="custProdId" type="xs:string" minOccurs="0" />
<xs:element name="itemSaleId" type="xs:string" minOccurs="0" />
<xs:element name="phoneNumber" type="xs:string" minOccurs="0" />
<xs:element name="lastXml" type="xs:string" minOccurs="0" />
<xs:element name="followedPath" type="xs:string" minOccurs="0" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:choice>
</xs:complexType>
</xs:element>
</xs:schema>
<diffgr:diffgram xmlns:msdata="urn:schemas-microsoft-com:xml-msdata" xmlns:diffgr="urn:schemas-microsoft-com:xml-diffgram-v1">
<callStatisticsDataSet>
<Table diffgr:id="Table1" msdata:rowOrder="0">
<date>2007-06-04T08:03:10.02+01:00</date>
<callID>85560695</callID>
<cli>648477292</cli>
<itemClosed />
<custProdId>-1</custProdId>
<itemSaleId />
<phoneNumber />
<lastXml />
<followedPath>Initialize #waitList;|tvShop.CallInitiated;|Is not closed?;|No item on sale;|Store Items;</followedPath>
</Table>
View 4 Replies
View Related
Dec 1, 2006
Hi I have an SSIS package that retrieves table data from a database using an OLE DB source component.
This then passes the rows of data to a script destination component.
Within this script my code takes the row data and inserts it into the database (amongst other things).
However, I have noticed that if any of the row columns contain NULLs, the component falls over with errors like the following when run:
[Back Up Prize Banks [2875]] Error: System.Data.SqlClient.SqlException: Parameterized Query '(@PASId int,@ScriptName nvarchar(17),@LastModified datetime,@Pri' expects parameter @RegenerateXML, which was not supplied. at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e) at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
I have solved the problem by checking the column contents before inserting it as follows:
Before:
' One line per parameter, but this throws when the pipeline column is NULL,
' so the parameter is never supplied:
Command.Parameters.AddWithValue("@RegenerateXML", Row.RegenerateXML)
After:
' Check the column's _IsNull flag and pass DBNull explicitly when it is set:
If (Row.RegenerateXML_IsNull()) Then
    Command.Parameters.AddWithValue("@RegenerateXML", System.DBNull.Value)
Else
    Command.Parameters.AddWithValue("@RegenerateXML", Row.RegenerateXML)
End If
Which is great, but it does turn 1 line into 5. I have 20 parameters, which takes 20 lines of code that will now be 20 x 5 = 100 lines.
My question is :
Is there a better way to write this code without having to take up so many lines?
Thanks
Matt.
View 2 Replies
View Related
Dec 7, 2007
Hi,
I am new to SSIS programming and am trying to export data from a flat file source to a SQL Server destination table dynamically. I need to get the table schema info (column length, data type, etc.) from the SQL Server table and then map the source columns from the flat file to the destination table columns.
I am referring to one of the programming samples from Microsoft and another excellent article by Moim Hossain. Can someone help me understand how to map the source columns to destination table columns depending on the table schema? Please help.
Thanks
View 5 Replies
View Related
Oct 6, 2015
How can I get the table names and schema names of the source and destination in an SSIS package, to insert into an audit table?
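For illustration, one common pattern is an Execute SQL Task that writes the names (held in package variables) together with SSIS system variables into the audit table; a sketch with all names hypothetical:
Code Snippet
-- Hypothetical audit table:
CREATE TABLE dbo.PackageAudit (
    PackageName sysname,
    SourceTable sysname,
    DestTable   sysname,
    LoadedAt    datetime DEFAULT GETDATE()
);

-- Execute SQL Task statement; the ? markers map to the variables
-- System::PackageName, User::SourceTable and User::DestTable.
INSERT INTO dbo.PackageAudit (PackageName, SourceTable, DestTable)
VALUES (?, ?, ?);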
View 3 Replies
View Related
Apr 19, 2007
Hi all,
I am passing the flat file source as a variable to the DTExec utility (like \package.variables[User::varFileName].Value;"D:\sourcedata.txt").
The destination table has one extra column.
I want to add a custom value to that column at run time, passed as a parameter to DTExec (User::varDate).
I don't know how to do it; please help me.
Madhukar
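For illustration, a second /set switch can pass the date the same way the file name is already passed, and a Derived Column inside the data flow (expression @[User::varDate]) can then feed the extra destination column; a sketch, with the variable name hypothetical:
Code Snippet
DTExec /file package.dtsx /set \package.variables[User::varFileName].Value;"D:\sourcedata.txt" /set \package.variables[User::varDate].Value;"2007-04-19"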
View 4 Replies
View Related
Jun 22, 2015
I am using SSIS 2014 and installed the adapter for the SharePoint list source and destination, but when I refresh the toolbox I don't see them. Is there a way to manually add them?
View 4 Replies
View Related
Sep 22, 2013
I want to process data from a source table to a destination table without using a cursor.
At present, we create a temp table and insert data from the source into it; once the data is in the temp table, we process the records one by one into the destination table using a WHILE loop.
While executing the stored procedure, we noticed that there are a few invalid records in the source table, and the stored procedure terminates on such records.
Is there a better approach, to log the INVALID data and resume processing with the next record instead of terminating?
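For illustration, a set-based alternative is to split on a validation predicate: log the failures first, then load everything that passes in one statement; a sketch with hypothetical names and a TRY_CONVERT test (SQL Server 2012+):
Code Snippet
-- Log the invalid rows first (here 'invalid' means Amount is not a valid decimal):
INSERT INTO dbo.InvalidRowLog (SrcKey, Amount, LoggedAt)
SELECT SrcKey, Amount, GETDATE()
FROM   dbo.SourceTable
WHERE  TRY_CONVERT(decimal(18,2), Amount) IS NULL;

-- Then load everything that passes, in a single set-based statement:
INSERT INTO dbo.DestinationTable (SrcKey, Amount)
SELECT SrcKey, TRY_CONVERT(decimal(18,2), Amount)
FROM   dbo.SourceTable
WHERE  TRY_CONVERT(decimal(18,2), Amount) IS NOT NULL;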
View 7 Replies
View Related
Oct 17, 2006
Hi,
I have read-only permission in the source (OLTP) database. The source database runs on SQL Server 2000 and its size is more than 200 GB. I need to pull data from the source and load a target running on SQL Server 2005.
Following are the objectives I want to achieve.
Data should be loaded on an incremental basis.
Whatever changes take place (update/delete) in the source should be replicated to the already uploaded data.
Here I want to mention that the source database does not have any identification key or timestamp column (like Updated_Date) by which I can filter the recently inserted or updated records for upload. The source does not maintain any history data either, so I have no track of deleted records.
I don't have any scope to change the schema in the source. In this scenario, can anybody suggest the best approach to achieve the above objectives?
Can I retrieve only the recently updated or inserted data from transaction log backups? Can log shipping provide the solution?
One more question: say I have a table and I am exporting/importing all the data from/to my target table using SSIS or DTS. In this scenario, does using a query rather than the table directly affect performance?
Regards
Sudripta Rakshit
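For illustration, with no timestamp or key to filter on, a full compare with EXCEPT is one pure-SQL way to find differences in both directions; a sketch assuming the SQL 2000 source is reachable from the 2005 target as a linked server named SRC2000 (all names hypothetical):
Code Snippet
-- New or changed rows (candidates for insert/update on the target):
SELECT * FROM SRC2000.SourceDb.dbo.Orders
EXCEPT
SELECT * FROM dbo.Orders;

-- Rows that disappeared from the source (candidates for delete):
SELECT * FROM dbo.Orders
EXCEPT
SELECT * FROM SRC2000.SourceDb.dbo.Orders;
At 200 GB a full compare is expensive; comparing per-key checksums (e.g. BINARY_CHECKSUM) is a common compromise, and note that log shipping only produces a standby copy rather than incremental extracts.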
View 4 Replies
View Related
Nov 19, 2007
Hi
I want to compare source and destination data. My source is an OLE DB source and my destination is a SQL Server destination. I want to compare the data in the destination with the source, with validation like: if a row exists in the destination but not in the source, delete that row; if a row exists in the source but not in the destination, insert it; and, comparing source with destination on the primary key, if any column of a row has changed, update it, otherwise move to the next record. Please tell me what the suitable object is and what to do.
Please Help me Out
Thanks
Sandeep
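In the data flow this is usually a Lookup to split new from existing rows, with updates and deletes handled via an OLE DB Command or a staging table; if both tables are reachable from one server, the same logic is a single MERGE on SQL Server 2008 or later. A sketch with hypothetical names:
Code Snippet
MERGE dbo.TargetTable AS t
USING dbo.SourceTable AS s
   ON t.PKCol = s.PKCol                      -- compare on the primary key
WHEN MATCHED AND t.DataCol <> s.DataCol      -- row changed at the source
    THEN UPDATE SET t.DataCol = s.DataCol
WHEN NOT MATCHED BY TARGET                   -- row only in the source
    THEN INSERT (PKCol, DataCol) VALUES (s.PKCol, s.DataCol)
WHEN NOT MATCHED BY SOURCE                   -- row only in the destination
    THEN DELETE;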
View 10 Replies
View Related
Apr 24, 2008
Hi,
I am new to SSIS; can you help me out with this? I have a source (flat file) and the target is SQL Server 2005. The source has 2 columns, Col1 and Col2, and the target also has 2 columns, with Col1 being the PK. I have created a package that checks the target: if the record exists it updates it, and if it does not it inserts it. My package looks like:
Source - Lookup on target - Conditional Split - Derived Column - and then 2 OLE DB destinations, 1 for inserts and 1 for updates.
In the Lookup I created a relationship between Col1 from the source and Col1 from the target, with Col2 as the lookup column. I connected the red (error) output, with "redirect rows", to the OLE DB destination for inserts, and took the green output to the Conditional Split with a condition like "Col2 from source is not equal to lookup Col2". When I run the package it inserts new records but does not update.
I tried to add a Derived Column after the Conditional Split, but I'm stuck writing an expression that updates Col2 records if they changed at the source. Can you help me out with this scenario? I would appreciate it if you could get back to me ASAP.
Thanks
View 5 Replies
View Related
Dec 5, 2007
I've SSIS 2005 SP2 and Excel 2007 installed. How come I do not see Excel 2007 on the Excel version list?
Thanks,
Ash
View 7 Replies
View Related
Nov 9, 2006
Hi
Inside a data flow task, I have an OLE DB source and destination. In my situation, I need to pull data from a table in the source but also hard-code some columns myself, which means my source is a blend of table data and hard-coded data that then has to be mapped to columns in the OLE DB destination. Does anyone know which option to choose in the OLE DB source dropdown for the data access mode? Keep in mind I do need to run a SELECT query as well as get data from a table. Is it possible to use multiple OLE DB sources and connect them to one destination? That is really what I intend to do here, but I am not sure how it would work, or even if it's possible. Basically my source access mode needs to be a blend of SQL command and table columns; how would that be implemented? Any help or advice is appreciated.
MA
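For illustration, the "SQL command" access mode alone can cover this: literals and expressions in the SELECT list become ordinary pipeline columns next to the table columns, so one OLE DB source suffices; a sketch with hypothetical names:
Code Snippet
-- Table columns and hard-coded values side by side; every item in the
-- SELECT list becomes a mappable column in the OLE DB destination.
SELECT OrderID,
       OrderDate,
       'WebShop' AS SourceSystem,   -- hard-coded constant
       GETDATE() AS LoadedAt        -- computed at extract time
FROM dbo.Orders;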
View 4 Replies
View Related
Apr 9, 2007
I'm trying to build a DTSX package that FTP's an XML file to the local file system and then imports it into an existing table.
My "Data Flow" for the package starts with an XML Source component and then goes to Data Conversion component and then to an OLE DB Destination component.
It all executes without error and seems to work fine, but when I look at the data in the table after I've run the package, it has inserted the appropriate number of rows from the XML file but all of the column values are NULL.
All of the data in the XML file is surrounded by <![CDATA[ ]]> and I discovered that if I remove the CDATA wrapper by hand then it inserts the data properly. The only problem is that I'm not in a position to have the data provider remove the CDATA tags and some of the data in the XML file needs the CDATA wrapper or else it will not validate.
Anyone know of anything I can do to get the CDATA to import properly?
View 15 Replies
View Related
Jul 20, 2005
I'm working on an ASP.Net project where I want to test code on a local machine using a local database as a back-end, and then export it to the production machine where it uses the hosting provider's SQL Server database on the back-end. Is there a way to export tables from one SQL Server database to another in such a way that if a table already exists in the destination database, it will be updated to reflect the changes to the local table, without existing data in the destination table being lost? E.g. suppose I change some tables in my local database by adding new fields. Can I "export" these changes to the destination database so that the new fields will be added to the destination tables (and filled in with default values), without losing data in the destination tables?
If I run the DTS Import/Export Wizard that comes with SQL Server and choose "Copy table(s) and view(s) from the source database" and choose the tables I want to copy, there is apparently no option *not* to copy the data, and since I don't want to copy the data, that choice doesn't work. If instead I choose "Copy objects and data between SQL Server databases", then in the following options I can uncheck the "Copy Data" box to prevent data being copied. But for the "Create Destination Objects" choices, I have to uncheck "Drop destination objects first" since I don't want to lose the existing data. But when I uncheck that and try to do the copy, I get collisions between the properties of the local table and the existing destination table, e.g.:
"Table 'wbuser' already has a primary key defined on it."
Is there no way to do what I want using the DTS Import/Export Wizard? Can it be done some other way?
-Bennett
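For illustration, one way around the wizard is to script the schema change and run it on the destination, which leaves the existing rows intact; a sketch (column name hypothetical):
Code Snippet
-- Adds the new field to the existing table; current rows get the default value.
ALTER TABLE dbo.wbuser
    ADD NewField int NOT NULL DEFAULT 0;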
View 3 Replies
View Related
Aug 6, 2014
Does replication use a linked server?
If not, then how does data transfer from source to destination?
Note: as per my knowledge, it is not backup-and-restore like log shipping.
View 2 Replies
View Related
Sep 26, 2006
The DTS works perfectly when I run it manually. However, when I run it as a job it fails. Before you ask if I'm running it under a different security context: I have already made sure of that. I was logged into the server through remote viewer when I created and ran the package, as well as when scheduling the job. So the accounts they're running under are consistent; they're the same accounts the SQL Agent is running under, and it's the sysadmin account.
The data source is a FoxPro database with a pull of two tables. I am using the Microsoft OLE DB FoxPro driver as my source connection and an OLE DB connection for SQL Server as my destination. I am doing a simple table-to-table transformation. The path of my connection is a mapped drive: E:\Main. There are other packages and jobs within my job queue that point to the same database and they seem to run fine using the above mapped drive. The ONLY difference between this package and the other packages is that they're a few months old and this one was created last night. I have also enabled logging on this package, and below is the error when the job fails:
Package Steps execution information:
Step 'DTSStep_DTSDataPumpTask_1' failed
Step Error Source: Microsoft OLE DB Provider for Visual FoxPro
Step Error Description: Invalid path or file name.
Step Error code: 80040E21
Step Error Help File:
Step Error Help Context ID: 0
Step Execution Started: 9/23/2006 11:39:17 AM
Step Execution Completed: 9/23/2006 11:39:17 AM
Total Step Execution Time: 0.031 seconds
Progress count in Step: 0
View 1 Replies
View Related
Mar 7, 2008
Hi all,
I have a package that does simple exporting from an excel sheet to a table.
I used a Dataflow task with Excel Source and OLEDB Destination Components.
And I created package configurations for the source and destination components.
After that, when I execute the package, I get the following error.
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\LPL_Config2.dtsConfig".
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\DBCon2.dtsConfig".
SSIS package "ProductDetails_Import.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at ProductDetails_Import, Connection manager "Excel Connection Manager": SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at Data Flow Task, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Error: 0xC0047017 at Data Flow Task, DTS.Pipeline: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at Data Flow Task, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Data Flow Task: There were errors during task validation.
SSIS package "ProductDetails_Import.dtsx" finished: Failure.
The program '[2416] ProductDetails_Import.dtsx: DTS' has exited with code 0 (0x0).
I have been trying to troubleshoot the error message given above since yesterday evening, but could not figure out what is causing it.
Please help!!!!
Any pointers/suggestions would be highly appreciated.
Thanks & Regards
View 3 Replies
View Related
May 16, 2008
Hi,
How do I retrieve the connections (connection managers) collection from a custom data flow destination? ComponentMetaData.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow task.
I came across code in which it was possible to access the Connections collection using IDtsConnectionService for a custom task (destination). The custom task has access to a serviceProvider, which can be used to get access to the IDtsConnectionService interface, but the custom data flow task does not.
Any help appreciated.
Thanks
Naveen
View 5 Replies
View Related
Oct 18, 2015
We have a single generic SSIS package that is used to import several hundred iSeries tables into SQL. I am not looking to rewrite the process. But I am looking for ways to improve performance.
I have tried retain same connection, maximum insert commit size, lock table (tablock), removed some large columns, played with the log file location and size, and now I am working to tweak the defaultbuffermaxrows.
To describe the data flow task: there are six data flow tasks (dtf) working at the same time. Each dtf has its own list of iSeries tables and columns and the corresponding generic SQL table names. Each dtf determines its list of tables based on the number of columns to import. So there is dtf30 (iSeries tables with 1-30 columns to import), dtf60 (iSeries tables with 31-60 columns to import), etc. The destination SQL tables are generically called Staging30, Staging60, etc. Each column in the generic staging tables is varchar(100). The dtfs are comprised of an OLE DB Source and an OLE DB Destination.
The OLE DB Source uses a SQL Command from Variable to build a SELECT statement. The OLE DB Source uses a connection manager that uses an IBM iAccess IBMDA400 provider. The SQL Command ends up looking like this for the dtf30. This specific example is importing from the iSeries table TDACLR and it only has two columns so it will be copied to the Staging30 table.
select TCREAS AS C1,TCDESC AS C2,0 AS C3,0 AS C4,0 AS C5,0 AS C6,0 AS C7,0 AS C8,0 AS C9,0 AS C10,0 AS C11,0 AS C12,0 AS C13,0 AS C14,0 AS C15,0 AS C16,0 AS C17,0 AS C18,0 AS C19,0 AS C20,0 AS C21,0 AS C22,0 AS C23,0 AS C24,0 AS C25,0 AS C26,0 AS C27,0 AS C28,0 AS C29,0 AS C30,''TDACLR'' AS T0 from Store01.TDACLR
The OLE DB Source variable value looks like the following, but I am not showing the full 30 columns:
select cast(0 AS varchar(100)) AS C1,cast(0 AS varchar(100)) AS C2,cast(0 AS varchar(100)) AS C3,cast(0 AS varchar(100)) AS C4,cast(0 AS varchar(100)) AS C5, ... cast(0 AS varchar(100)) AS C30.
The OLE DB Destination uses OpenRowSet Using FastLoad From Variable. The insert into Staging30 ends up looking like this.
insert bulk STAGE30([C1] varchar(100) ,[C2] varchar(100) ,[C3] varchar(100) ,[C4] varchar(100) ,[C5] varchar(100) , ... ,[C30] varchar(100) ,[T0] varchar(20))
Of course we then copy and transform the Staging30 data to the SQL table that equals T0.
But back to defaultbuffermaxrows. Previously the dtfs had default values of 10000 for DefaultBufferMaxRows and 10485760 for DefaultBufferSize. I added a SQL task to SUM the iSeries column sizes, TCREAS and TCDESC in this example, and set the DefaultBufferMaxRows by dividing the SUM of the columns max_length into 10485760. But I did not see a performance improvement. Do you think that redefining the columns as varchar(100) for the insert is significant? Should I possibly SUM the actual number of columns (2) as 2x100 or SUM the 30x100?
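For what it's worth, the pipeline sizes buffer rows from the columns as the data flow defines them, so with the casts in place each row costs roughly the full 30 varchar(100) columns plus T0 even when only two carry data; a sketch that sizes against the staging definition instead of the iSeries widths (table name assumed):
Code Snippet
-- Rough rows-per-buffer estimate from the staging table's defined widths
-- (ignores pipeline overhead; dbo.Staging30 assumed):
SELECT 10485760 / SUM(max_length) AS SuggestedDefaultBufferMaxRows
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.Staging30');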
View 4 Replies
View Related
Sep 2, 2015
1. How can I get only the desired output columns into the Excel destination file, without 'Copy of column'/unwanted columns appearing in it?
2. How can I overwrite the existing file in the Excel destination?
View 2 Replies
View Related
Oct 6, 2015
We are using SQL Server 2014 and SSDT-BI 2013. We have a reporting environment where business users create objects which need to be persisted for fiscal year reporting. Let's say, for instance on SQLSERVER1\SRVR1, they create table objects like those below in the reporting environment:
Accounting2014, Accounting2015 in AccountingDB;
Sales2014, Sales2015 in SalesDB;
Products2014, Products2015 in ProductsDB;
Inventory2014, Inventory2015 in InventoryDB; etc.
These tables are persisted for auditing in a different environment, SQLSERVER2\SRVR2, for finance & audit folks. We want to automate this process using SSIS to create the tables in the corresponding database and load the data. I tried using a For Each Loop container, but the catch is that I can loop over the source or the destination; how do we loop over source & destination at the same time (i.e. when the source is AccountingDB the destination should be AccountingDB, when the source is SalesDB the destination should be SalesDB, and so on)?
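For illustration, since source and destination pair on the same database name, a single Foreach loop over the names can drive both: each connection manager takes its Initial Catalog from the same loop variable via a property expression; a sketch of the driver query (database names from the post, variable name hypothetical):
Code Snippet
-- Execute SQL Task (Full result set) feeding a Foreach ADO enumerator;
-- DbName maps to User::DbName, and both connection managers use an
-- expression like: "Data Source=SQLSERVER1\SRVR1;Initial Catalog=" + @[User::DbName] + ";..."
SELECT name AS DbName
FROM sys.databases
WHERE name IN ('AccountingDB', 'SalesDB', 'ProductsDB', 'InventoryDB');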
View 6 Replies
View Related
Feb 23, 2007
Hi,
I have a small problem. I've gone through the SSIS wizard and created a dtsx file which imports data from an access file into a SQL Server 2005 database. It has been set to delete existing rows and enable identity insert.
I then edited the .dtsx package in SQL Server Management Studio and added an environment variable configuration to allow me to change the destination database. In the script which runs the dtsx, here is what I have (it's an x64 system, so I have to use DTExec):
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /file e:\testimport.dtsx /set \Package.Connections[DestinationConnectionOLEDB].Properties[InitialCatalog];newdatabasename
and here is the error I get:
Description: The configuration environment variable was not found. The environment variable was: "InitialCatalog". This occurs when a package specifies an environment variable for a configuration setting but it cannot be found. Check the configurations collection in the package and verify that the specified environment variable is available and valid.
I got the package.connections.etc etc path from originally creating the environment variable as an xml config file, then I could open the config file and see what the path was...
Any help would be appreciated :)
View 11 Replies
View Related
Apr 1, 2014
I am stuck finding a solution to transpose source data from a system, via a metadata look-up table, into a destination table. I need a method to transpose/pivot the source data into columns (which are of various data types). The datatype for each column is listed in a metadata table.
Source Data Table:
Table Name: Source
SrcID AGE City Date
01 32 London 01-01-2013
02 35 Lagos 02-01-2013
03 36 NY 03-01-2013
Metadata Table:
Table Name:Metadata
MetaID Column_Name Column_type
11 AGE col_integer
22 City col_character
33 Date col_date
Destination table:
The source data to be loaded into the destination table(as shown below):
Table Name: Destination
SrcID MetaID col_int col_char col_date
01 11 32 - -
01 22 - London -
01 33 - - 01-01-2013
02 11 35 - -
02 22 - Lagos -
02 33 - - 02-01-2013
03 11 36 - -
03 22 - NY -
03 33 - - 03-01-2013
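For illustration, the transformation amounts to unpivoting each source column into (name, value) pairs, joining to Metadata on the column name, and routing each value into the matching typed column; a T-SQL sketch over the three tables above (TRY_CONVERT needs SQL Server 2012+; a fully metadata-driven version would build the VALUES list with dynamic SQL):
Code Snippet
INSERT INTO Destination (SrcID, MetaID, col_int, col_char, col_date)
SELECT s.SrcID,
       m.MetaID,
       CASE WHEN m.Column_type = 'col_integer'   THEN TRY_CONVERT(int, v.ColValue) END,
       CASE WHEN m.Column_type = 'col_character' THEN v.ColValue END,
       CASE WHEN m.Column_type = 'col_date'      THEN TRY_CONVERT(date, v.ColValue, 105) END
FROM Source AS s
CROSS APPLY (VALUES ('AGE',  CAST(s.AGE AS varchar(100))),
                    ('City', s.City),
                    ('Date', CONVERT(varchar(100), s.[Date], 105))) AS v(ColName, ColValue)
JOIN Metadata AS m
  ON m.Column_Name = v.ColName;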
View 7 Replies
View Related
Jan 1, 2007
Hello
I am performing ETL on an AS400 DB2 database using MS DTS/SSIS.
I have built the connection between the AS400 and the source to extract data from the AS400 to staging in the data flow. When I built the OLE DB connection for loading data to the destination (OLE DB destination), it connects successfully to DB2 as the destination, but when I execute the task it does not load data and gives a provider error.
What can be a good solution for this? Can you solve it?
Reply please.
View 2 Replies
View Related