Transfer Multiple Tables From SqlServer To Oracle Using SSIS
Jan 14, 2008
Hi,
I am new to SQL Server 2005 SSIS packages. I want to transfer data from multiple tables from SQL Server to an Oracle database. I cannot use the Export Wizard, as it creates new tables in the destination (Oracle) DB, and I already have the tables created in the destination DB. When I created an SSIS package, it only allowed me to create a package to transfer data for one table from source to destination. I have created DTS packages in 2000, where you have a source DB and a destination DB and just add links for multiple tables. Is there a way I can do this in SSIS? Please let me know.
I have to copy a large number (3000) of different tables from an Oracle machine to a SQL Server machine. I am able to do this using a (VB) script. I currently use several methods:
1) INSERT INTO TABLE1 SELECT * FROM SID1..DB.TABLE1 (SID1 is a linked server)
2) INSERT INTO TABLE1 SELECT * FROM OPENQUERY(SID1,'SELECT * FROM DB.TABLE1')
3) Also used OPENROWSET method (similar to 2)
For small tables this is fine; however, for BIG tables (15M rows / 150 cols) the methods above are too slow. If I compare the same copy action with a simple DTS package, the DTS is 3 times faster. Also, the DTS seems to bulk copy the data directly into the desired database, while the methods above first fill tempdb, then the transaction log of the desired database, and only then the desired table (requiring much extra space on your filesystem). The total size of the data is about 300GB.
Can anyone supply me with a simple example of how to copy data from an Oracle table into a SQL Server table in script (or SQL) that is as fast as the DTS and does not fill my log files? I have read about bcp (which I use for importing/exporting files) and the BULK INSERT command, but I do not understand how to use them for this problem.
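A minimal sketch of one alternative, assuming a linked server named SID1, a target database named TargetDb (a placeholder), and that the database can temporarily use the BULK_LOGGED recovery model: on SQL Server 2000/2005, SELECT ... INTO is a minimally logged operation, unlike INSERT ... SELECT, so it creates and loads the table without the transaction-log growth described above.

ALTER DATABASE TargetDb SET RECOVERY BULK_LOGGED;  -- minimize logging for the load

-- SELECT ... INTO creates dbo.TABLE1 and bulk-loads it in one minimally logged step
SELECT *
INTO   dbo.TABLE1
FROM   OPENQUERY(SID1, 'SELECT * FROM DB.TABLE1');

ALTER DATABASE TargetDb SET RECOVERY FULL;  -- restore the original recovery model

If that is still slower than DTS, the remaining route is the one already mentioned: unload the Oracle tables to flat files and load them with bcp or BULK INSERT, which are bulk operations like the DTS copy.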
How is it possible to set data to flow through development, test, and production environments?
I have created one package. Right now I am transferring data to tables in the development environment.
I want to use the same package for several different destinations, depending on the environment.
Is there a way for the destination component to set itself automatically, for example by passing the destination tables through the registry, or do I need to change it manually every time?
Hi all, I have multiple text files, say a1.txt, b1.txt, c1.txt. I have to port the data from these text files into SQL Server tables that have the same structure, i.e. x1, y2, z3 (SQL Server tables). So I have to transfer a1.txt file data ----to--- x1, b1.txt file data ----to--- y2, c1.txt file data ----to--- z3 using SSIS. I have to transfer more than 250 files like this at a time; manually binding 250 files into the package is a very cumbersome and time-consuming process. Can anyone give a valuable suggestion to solve this issue? A sketch of one possible approach follows.
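A minimal sketch of one metadata-driven alternative, assuming the files are comma-delimited and live in C:\Files (the mapping table, folder path, and delimiters are all assumptions): a control table maps each file to its target table, and a cursor builds one BULK INSERT per file, so file number 251 is just one more row in the table rather than another manual binding.

CREATE TABLE dbo.FileTableMap (FileName varchar(260) NOT NULL, TableName sysname NOT NULL);
INSERT INTO dbo.FileTableMap VALUES ('a1.txt', 'x1');
INSERT INTO dbo.FileTableMap VALUES ('b1.txt', 'y2');
INSERT INTO dbo.FileTableMap VALUES ('c1.txt', 'z3');

DECLARE @file varchar(260), @table sysname, @sql nvarchar(1000);
DECLARE map_cur CURSOR FOR SELECT FileName, TableName FROM dbo.FileTableMap;
OPEN map_cur;
FETCH NEXT FROM map_cur INTO @file, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- build and run one BULK INSERT per file/table pair
    SET @sql = N'BULK INSERT dbo.' + QUOTENAME(@table)
             + N' FROM ''C:\Files\' + @file + ''''
             + N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC (@sql);
    FETCH NEXT FROM map_cur INTO @file, @table;
END
CLOSE map_cur;
DEALLOCATE map_cur;

The same mapping table could equally drive an SSIS Foreach loop, with the file name and table name pushed into package variables.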
I'm using a Business Intelligence project to copy stored procedures and tables from one database to another across servers. I'm having trouble copying tables or stored procedures using the Management.SMO.Transfer class.
I tried copying stored procedures with the property transfer.CopyAllStoredProcedures = true. This didn't work, so as a workaround I used the StringCollection property and executed every string as SQL.
Now I'm having trouble copying tables. I don't want to copy all the tables in the database, so how do I go about selecting which tables to copy? I tried using the ObjectList property and provided the names of the tables in an ArrayList, but I get the error
"Transfer cannot process System.String. You need to pass an instance class object."
How can I pass an "instance class object" for something that's in the database? The ScriptTransfer method fails so I can't even see the script that is being generated. There is virtually no documentation for this class.
I have a requirement where I have around 15 different flat files; the file names are fixed, but the folder path can change (I think I should use a variable for the folder path). The data from these 15 files should go to their respective tables in the database.
Should I create a separate data flow task for each file, or a separate package? In addition, for example: while importing product data into the product table, if a product ID already exists, we need to ignore it and load only the new records. A sketch of that pattern follows.
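A minimal sketch of the "ignore existing product IDs" part, assuming the data flow first lands each file in a staging table (dbo.Product_Staging, with hypothetical columns ProductID, ProductName, and Price); an Execute SQL task can then move across only the rows whose key is not already present:

-- insert only the staged rows whose ProductID is not already in dbo.Product
INSERT INTO dbo.Product (ProductID, ProductName, Price)
SELECT s.ProductID, s.ProductName, s.Price
FROM dbo.Product_Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Product AS p WHERE p.ProductID = s.ProductID);

TRUNCATE TABLE dbo.Product_Staging;  -- ready for the next file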
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me from doing this.
I suppose I could make a different Data Flow Destination item for each file in the Data Flow. Would that be a reasonable solution, is there a simpler one, or should I just resign myself to making a separate Data Flow for every single file?
I need a clue about how to migrate the data from an Oracle Applications 11.03 system, with an underlying Oracle 8.05 database, to Navision 4.0 running on SQL Server 2000.
I am using the "Transfer SQL Server Objects Task" to copy some tables from database A to database B including data.
The tables, primary key constraints, foreign keys, and data all transfer nicely, except for the DEFAULT constraints on the tables.
I have failed to find any option in the Transfer SQL Server Objects Task to explicitly say "copy default constraints", so I guess it should logically happen automatically, but it doesn't. I hope it is not a bug :-)
Is there a way to transfer data from a SQL Server DB to a SQL Server Express DB? I tried to use a SQL Server backup file, but that file is not valid for SQL Server Express. Are there any alternatives?
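A minimal sketch of one commonly used alternative, detach/attach, assuming the Express instance is the same version as or newer than the source, the database fits within Express's size limit (4 GB for 2005 Express), and the file paths are placeholders:

-- on the source server
EXEC sp_detach_db @dbname = N'MyDb';

-- copy MyDb.mdf and MyDb_log.ldf to the Express machine, then on Express:
EXEC sp_attach_db
    @dbname    = N'MyDb',
    @filename1 = N'C:\Data\MyDb.mdf',
    @filename2 = N'C:\Data\MyDb_log.ldf';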
Does anyone know what's the best way to transfer SQL Server to a new server while keeping the same SQL Server name? The reason for this is that we have thousands of users out in the field, and it would be too much trouble to identify these users and update their ODBC entries to point to the new SQL Server.
We recently got a scenario where we need to get data from Oracle tables hosted on third-party servers. We have SQL Server installed on our servers, so they created a DBLINK on the Oracle server to our SQL Server and published the DBLINK name.
What are the next steps that I need to follow on my SQL Server in order to access the Oracle tables?
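One hedged note and sketch: an Oracle DB link works in the Oracle-to-SQL-Server direction, so to read Oracle tables from the SQL Server side you would normally create a linked server on SQL Server instead. A minimal sketch, assuming the Oracle client software and the OraOLEDB.Oracle provider are installed on the SQL Server machine, and that ORA_LINK, ORCL_TNS, and the credentials are placeholders:

EXEC sp_addlinkedserver
    @server     = N'ORA_LINK',          -- local name for the linked server
    @srvproduct = N'Oracle',
    @provider   = N'OraOLEDB.Oracle',
    @datasrc    = N'ORCL_TNS';          -- TNS alias from tnsnames.ora

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'ORA_LINK',
    @useself     = 'FALSE',
    @rmtuser     = N'scott',            -- hypothetical Oracle account
    @rmtpassword = N'tiger';

-- four-part name: linkedserver..SCHEMA.TABLE (Oracle schema names are usually uppercase)
SELECT * FROM ORA_LINK..SCOTT.EMP;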
Hi, I need to transfer data from a table in SQL 2005 to another one with the same structure and data type definitions in Oracle. I have made data flows in SSIS that retrieve data from Oracle into SQL, but none that go the other way.
1. Is there any special connection type I must use? 2. Do I need ODBC to connect to Oracle?
Where can I find information about this? Thanks for your help!
I need to export multiple tables from a database to multiple csv files (one for each table).
Rather than use SSIS and have multiple OLE DB sources and destinations (one for each table), is there a way to have a generic package that will export all the tables in the database?
One way I can see is to use BCP in a loop, with the loop driven by a SELECT statement against something like sys.tables (or against another table that I prepped with just the tables I want, if I don't want them all).
I.e., I would use a stored procedure that calls BCP via xp_cmdshell, so not via SSIS, although I could wrap the whole thing up in SSIS; there is no real need, though. A sketch of that loop follows.
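A minimal sketch of that loop, assuming xp_cmdshell is enabled, a database named MyDb, an output folder C:\Export, and a trusted connection (all placeholders):

DECLARE @table sysname, @cmd varchar(1000);
DECLARE tab_cur CURSOR FOR
    SELECT name FROM sys.tables;  -- or select from a prepped list of wanted tables
OPEN tab_cur;
FETCH NEXT FROM tab_cur INTO @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- character mode (-c), comma field terminator (-t,), trusted connection (-T)
    SET @cmd = 'bcp "MyDb.dbo.' + @table + '" out "C:\Export\' + @table
             + '.csv" -c -t, -T -S ' + @@SERVERNAME;
    EXEC master..xp_cmdshell @cmd;
    FETCH NEXT FROM tab_cur INTO @table;
END
CLOSE tab_cur;
DEALLOCATE tab_cur;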
I have written an SSIS package with a Transfer SQL Server Objects task which I want to use to copy database objects from a SQL Server 2000 database to SQL Server 2005.
When I run this task, I find the following error:
[Transfer SQL Server Objects Task] Error: Execution failed with the following error: "Version80 database compatibility level is not supported".
Is there a way around this aside from changing the compatibility level of my SQL Server 2005 database?
Hi, I am attempting to create SSIS packages to extract data from our AS/400 DB2 databases and populate tables for analysis and reporting in SQL Server. We have recently installed a new SQL Server 2005 server to replace our existing 2000 databases. I have several DTS packages set up to connect to the 400s and copy the needed data to SQL 2000, and these work great, but they do not transition to SSIS very well. I can get my SSIS packages to work if I create a separate data flow task for each table, but that does not seem appropriate for a tool like SSIS. I would think that what I am attempting is a very common thing to do. Maybe other enumerators could be used, but I have not been able to get it to work.
I am trying to use a Foreach Loop, currently with a Foreach Item enumerator, to first truncate the table in the SQL Server database using an Execute SQL task, and then to use a Data Flow task to refresh the data using an OLE DB source and destination. I have the truncate portion working, but when I try to set up the OLE DB source using a variable, it tells me I have no input columns. I have a variable from the Foreach loop that loops through the table names; this variable is used in a second variable which is set to EvaluateAsExpression, with the expression set to the following: "select * from " + @table_name
I have set DelayValidation to true and ValidateExternalMetadata to false. What do I do to get around this issue? Also, how should the OLE DB destination be set up so that it loads the data into the table specified in the variable?
I am trying to transfer data from a SQL Server 2000 database to a SQL Server 2005 database, and I just want to transfer around 2 tables. My SQL Server 2000 database is located on one machine and the 2005 database is located on another machine, so I am trying to insert data into the SQL Server 2005 database. Can I do it like this:
INSERT INTO SqlServer2005comp.Databasename..TableName
(
    Columns
)
SELECT
    Columns
FROM
    SqlServer2000comp.Databasename..TableNames
WHERE UserId = @UserId
Is this the right way to do it? And should I run this query on the SQL Server 2005 DB or the 2000 DB? Any help or ideas will be appreciated. Regards, Karen
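A minimal sketch of how this is usually done, assuming you run it on the 2005 machine and first register the 2000 machine as a linked server (the server, database, column, and variable names are placeholders). Note that a four-part name is server.database.schema.object, so the schema slot should normally be dbo rather than left empty:

-- one-time setup on the 2005 server: register the 2000 server under its network name
EXEC sp_addlinkedserver @server = N'SqlServer2000comp';

DECLARE @UserId int;
SET @UserId = 42;  -- placeholder value

INSERT INTO dbo.TableName (Col1, Col2)
SELECT Col1, Col2
FROM SqlServer2000comp.DatabaseName.dbo.TableName
WHERE UserId = @UserId;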
I have several DTS packages that connect to various Oracle databases. An upgrade was recently done to one of the databases, from 7.3 to 8i; the other databases were always 8i. Last week I could edit data transfer tasks normally; this week DTS hangs and I have to use Task Manager to kill the process. It worked fine last week. I can successfully run the packages, I just can't edit them. I have no trouble editing or running packages that connect to databases other than the one recently upgraded. I have tried both OLE DB and ODBC connections, with the same results. Does anyone have any ideas on how to fix this?
I have a delimited text file with 650+ columns. The sum of the column lengths of a single row, if fully populated, exceeds 30K bytes. The "killer" fields, lengthwise, are the "Description" fields; if they were removed from the input file, the remaining columns would occupy about 5,000 bytes, which is within the SQL max row length.
Can SSIS be used to create these two tables (one without the description fields, the other with those fields, but arranged vertically in the table rows)?
The fundamental issue is that I cannot import a single file row into a SQL table, because that row's length could exceed the max byte count for a row. A sketch of the two-table layout follows.
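A minimal sketch of what the two target tables might look like, assuming a surrogate RowID ties the vertical description rows back to the narrow table (all names here are hypothetical). In SSIS this maps naturally to one flat file source feeding a Multicast, with an Unpivot transformation turning the description columns into rows for the second destination:

-- narrow table: everything except the long description fields (about 5,000 bytes)
CREATE TABLE dbo.WideRow_Core (
    RowID int NOT NULL PRIMARY KEY,
    Col1  varchar(50),
    Col2  varchar(50)
    -- ...the remaining non-description columns
);

-- vertical table: one row per (source row, description field) pair
CREATE TABLE dbo.WideRow_Descriptions (
    RowID      int NOT NULL,
    FieldName  sysname NOT NULL,      -- which description column this row holds
    FieldValue varchar(8000),
    CONSTRAINT PK_WideRow_Descriptions PRIMARY KEY (RowID, FieldName)
);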
I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.
I need to loop through each file in SSIS with a Foreach Loop container, get the file name, say abc1, and load the data of abc1.xml into the abc1 table in the SQL Server DB.
The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.
On each iteration, the XML source should also point to the corresponding XSD file.
Tables are already created in the DB.
I solved my problem up to getting the file name from each iteration and assigning it to a variable; in the OLE DB destination's data access mode I select "Table name or view name variable", and the corresponding table then gets selected for data insertion.
I just wanted to know how I can read each XSD file for each XML data file while iterating.
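For comparison, a minimal pure T-SQL sketch of loading one such file without involving an XSD at all, assuming SQL Server 2005, a simple /root/row element layout, and hypothetical column names; the same statement could be built dynamically per file inside the loop:

INSERT INTO dbo.abc1 (Col1, Col2)
SELECT x.value('(Col1)[1]', 'varchar(50)'),
       x.value('(Col2)[1]', 'varchar(50)')
FROM (SELECT CAST(BulkColumn AS xml) AS doc
      FROM OPENROWSET(BULK 'C:\XMLData\abc1.xml', SINGLE_BLOB) AS b) AS src
CROSS APPLY src.doc.nodes('/root/row') AS t(x);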
We develop on both Oracle 9i and SQL Server 2000 back ends and would like to set up a new test/development server. Are there any issues with running both systems on one box?
This statement works in SQL Server, and it works if I run it in Oracle PL/SQL, but when I run it from VB.NET it hangs and gets no further; it just sits there thinking. Why?

UPDATE proc_azienda
SET cod_fase_sign = (SELECT MAX(pfa.COD_FASE)
                     FROM PROC_FASE_RIGA pfa
                     WHERE (COD_GRUPPO = 'x09')
                       AND ((TIP_DATI = 'I') OR (TIP_DATI = 'S'))
                       AND pfa.cod_processo = '02')
WHERE cod_gruppo = 'x09'
  AND cod_processo = '02'

There is another statement:

UPDATE proc_azienda
SET cod_fase_sign = Fase
FROM (SELECT ep.cod_processo, app.cod_azienda,
             MAX(CAST(ep.cod_fase AS int)) AS Fase
      FROM esp_proc_prospetti ep,
           APP_PROSP_AZ_8377 app,
           (SELECT pa.cod_processo, pa.cod_azienda, pa.cod_fase_sign
            FROM proc_azienda pa
            WHERE pa.cod_gruppo = 'x09'
              AND pa.cod_processo = '05'
              AND pa.cod_fase_sign IS NULL
            GROUP BY pa.cod_processo, pa.cod_azienda, pa.cod_fase_sign) pnull
      WHERE ep.cod_gruppo = 'x09'
        AND ep.cod_processo = app.cod_processo
        AND ep.cod_processo = pnull.cod_processo
        AND ep.cod_prospetto = app.cod_prospetto
      GROUP BY ep.COD_PROCESSO, app.COD_AZIENDA) FaseAzienda,
     proc_azienda pa
WHERE pa.cod_gruppo = 'x09'
  AND pa.cod_processo = FaseAzienda.cod_processo
  AND pa.cod_azienda = FaseAzienda.cod_azienda

This syntax seems to be valid for SQL Server but not for Oracle. What should it look like? Thanks a lot. S.
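A hedged, untested sketch of an Oracle-compatible rewrite of the second statement: Oracle has no UPDATE ... FROM, so the join has to move into a correlated subquery (table and column names as in the original; the VB.NET hang itself is often a lock held by another uncommitted session, which is a separate issue):

UPDATE proc_azienda pa
SET cod_fase_sign =
    (SELECT MAX(CAST(ep.cod_fase AS INTEGER))
     FROM esp_proc_prospetti ep, APP_PROSP_AZ_8377 app
     WHERE ep.cod_gruppo = 'x09'
       AND ep.cod_processo = app.cod_processo
       AND ep.cod_prospetto = app.cod_prospetto
       AND ep.cod_processo = pa.cod_processo
       AND app.cod_azienda = pa.cod_azienda)
WHERE pa.cod_gruppo = 'x09'
  AND pa.cod_processo = '05'
  AND pa.cod_fase_sign IS NULL;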
I'd like to transfer some records between the following two tables. Surely this should be a no-brainer; what am I missing that makes it so impenetrable?
I am currently stuck, and hoping someone can help me get there (this is my first time using SSIS, btw).
Here is the source table (MS SQL Server 2005, SP2):
I used the SSIS Import and Export Wizard to copy data between the two tables and attempted to execute it. I use the SQL Native Client provider on the source and the native OLE DB\Oracle Provider for OLE DB on the destination. However, I get an error:
[Destination - IMAGINE_DIVS [37]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "OraOLEDB" Hresult: 0x80004005 Description: "ORA-12571: TNS:packet writer failure".
I notice that the wizard has created a data flow task with 3 steps: Source - imagine_divs, Destination - IMAGINE_DIVS, and Data Conversion 1.
Data Conversion 1 seems to be taking my source nvarchar columns and converting them to DT_STR with twice the size (for example, div_mnemonic becomes DT_STR, size: 22).
If I change the mappings in the OLE DB Destination Editor, such that only the numeric and date-typed columns are included in the transfer, it works fine.
If I include any string-typed column in the destination editor mappings, I get the TNS packet writer error. If I remove the Data Conversion step and connect the source and destination tasks directly, I get validation errors saying:
Error 2 Validation error. Data Flow Task: OLE DB Destination [294]: Columns "div_mnemonic" and "DIV_MNEMONIC" cannot convert between unicode and non-unicode string data types. Package4.dtsx 0 0
This is despite the fact that everything is Unicode here (right?).
Hi! I have a general SQL CE v3.5 design question related to table/file layout. I have a system that has multiple tables that fall into categories of data access. The 3 categories of data access are:
1 is for configuration-related data. There is one application that will read/write the data, and a second application that will read the data on startup.
1 is for high-performance temporal storage of data. The data objects are all the same type, but they are our own custom objects and not just simple types.
1 is for logging, where the data will be permanent unless the configured size/recycling settings cause a resize or cleanup. There will be one application writing a lot of data (potentially, depending on log settings) and another application searching/reading sections of that data.

When working with data and designing the layout, I like to approach things from a data-centric mindset, because this seems to result in a better-performing system. That said, I am thinking about using 3 individual SDF files for the above data access scenarios, as opposed to a single SDF with multiple tables. I'm thinking this would provide better performance in SQL CE, because the query engine would not have a lot of different types of queries going against the same database file. For instance, the temporal storage is basically reading/writing/deleting varying amounts of data, and this is different from the logging, where the log can grow pretty large, definitely bigger than the default 128 MB. So it seems logical to manage them separately.
I would greatly appreciate any suggestions from the SQL CE experts regarding my approach. If there are any tips/tricks for the different data access scenarios, taking into account performance, type of data access, etc., I would love to take a look at them.