Looking for a faster method of moving data from SQL Server to Oracle.
I'm attempting to push a SQL Server table into an Oracle table (SQL Server 2000 to Oracle 7, 8, and 9). I have no problem doing this with either the 'Oracle Provider for OLE DB' or the 'Microsoft OLE DB Provider for Oracle'. None of my data is being transformed, so it's a straight import. With the hardware I'm using it takes nearly 3 seconds to import 1,000 rows. While this isn't too bad on its own, I need to import upwards of 4 million rows, and at that rate the total time is unacceptable.
I do have an Oracle script that imports the CSV files of the tables, but I'm looking for an all-inclusive SQL solution.
Does anyone know of another method in SQL that I can use to push the data faster?
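Not sure whether it would beat the OLE DB transform, but one thing worth benchmarking is bypassing DTS and doing the push as a set-based insert through a linked server. This is only a rough sketch: the linked server name ORA_LINK, the Oracle owner/table SCOTT.EMP_STAGE, and the column names are placeholders I've made up, and you would still want to batch it so each transaction stays a manageable size.

-- Assumes a linked server ORA_LINK already exists (sp_addlinkedserver with the
-- Oracle OLE DB provider) and that the target table exists on the Oracle side.
-- Four-part naming (no catalog for Oracle, hence the double dot) sends the rows
-- through the provider in one set-based statement per batch.
INSERT INTO ORA_LINK..SCOTT.EMP_STAGE (EMP_ID, EMP_NAME)
SELECT EmpId, EmpName
FROM dbo.Employees
WHERE EmpId BETWEEN 1 AND 100000   -- repeat (or loop) per key range to batch the 4 million rows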
I was wondering if there was a different approach I should take in appending data to a table...
My destination table has more than 94 million records in it, and I have been taking two approaches to getting new files into it:
1) I run a Data Pump task in a DTS package to import the file into a trans (temp) table, which is truncated every time, and then run an INSERT INTO statement from the temp table into my destination table.
The import into the trans table only takes a few minutes (about 1-2 million records per file, with short record lengths), but the INSERT INTO statement then takes upwards of 6 hours to append.
2) I have tried a Bulk Insert task going directly into the destination table (which defeats the purpose of my trans table for checking the data first, but I feel the data is clean at this point).
I am running the bulk insert right now, and it has been going for over 3 hours, so I'm going to assume it will take just as long as the INSERT INTO statement did.
My destination table has no indexes on it at all, and I don't need to do any transformations to the data when bringing it into SQL Server since the data is clean. I do have a default value constraint on one of the fields in the destination table.
There are also other people and applications hitting the server, which could impact overall processing, but nothing out of the ordinary is going on today. I know there are only so many ways to get a file into a table, but maybe someone knows a different approach I should try.
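In case it helps anyone comparing notes, here is a rough sketch of breaking that big INSERT INTO into smaller batches instead of one 1-2 million row statement. It assumes a flag column (Exported, defaulting to 0) can be added to the trans table; the table and column names are placeholders, and whether this actually beats the single insert would need testing against the real data.

SET ROWCOUNT 100000              -- SQL Server 2000 way to cap each batch at 100,000 rows

WHILE 1 = 1
BEGIN
    BEGIN TRAN

    -- mark the next slice of unexported rows
    UPDATE dbo.TransStage SET Exported = 1 WHERE Exported = 0

    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRAN
        BREAK                    -- nothing left to copy
    END

    -- copy exactly the rows just marked; TABLOCK takes one table lock on the
    -- destination instead of escalating thousands of row locks
    INSERT INTO dbo.DestTable WITH (TABLOCK) (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3
    FROM dbo.TransStage
    WHERE Exported = 1

    -- flag the slice as done so it is not copied again
    UPDATE dbo.TransStage SET Exported = 2 WHERE Exported = 1

    COMMIT TRAN
END

SET ROWCOUNT 0                   -- restore the default

The main win is that each transaction (and its log usage) stays small, so a failure partway through only loses one batch rather than the whole 6-hour insert.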
I'm trying to import from an Oracle Rdb database into SQL Server 2005 via ODBC, without much success.
I've installed the correct Oracle Rdb driver and verified it all works OK by importing into Access.
Using the import wizard within SQL Server Management Studio and the following connection string, I get a password error:
Dsn=XRS;Driver={Oracle RDB Driver};sid=system
TITLE: SQL Server Import and Export Wizard
The operation could not be completed.
ADDITIONAL INFORMATION:
ERROR [HY000] [Oracle][ODBC][Rdb]%SQLSRV-F-GETACCINF, Oracle SQL/Services authorization failed
ERROR [HY000] [Oracle][ODBC][Rdb]%SQLSRV-F-GETACCINF, Oracle SQL/Services authorization failed
I know the username/password combination to be correct.
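(Worth noting, though purely as an assumption on my part since I can't test against SQL/Services: the connection string above never actually passes those credentials, only sid=system. With most ODBC drivers the standard keywords are Uid and Pwd, so a string along the lines of

Dsn=XRS;Driver={Oracle RDB Driver};Uid=system;Pwd=********

may be what the authorization step is expecting, with the password placeholder replaced by the real one.)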
Is SSIS the way to go? I've had a quick look at it and set up the ODBC connection, but the column mappings box reports the following...
Error at Data Flow Task [DataReader Source [1]]: cannot acquire a managed connection from the run-time connection manager.
Can anyone point me towards any examples of reading in data via ODBC and storing it in a SQL Server database?
I should point out that the source is not the Oracle RDBMS (7, 8, or 9) but Oracle Rdb, which is a completely separate product.
I am using the Import and Export Wizard and seeing some very weird behaviour.
I am able to choose the 'Microsoft OLE DB Provider for Oracle' and provide the server name and username/password. The test connection succeeds. However, when I try to move to the next window/page by clicking Next, I get Oracle error ORA-01017: invalid username/password.
I recently encountered what appears to me to be a very strange problem with my packages involving data import from an Oracle 10g database. The packages are simple data imports that select and insert data from Oracle into SQL Server 2005 tables.
Until recently the packages ran without any problems, but a day ago they started to fail even though nothing was changed in them. They run for about 8 minutes, by which point approximately 400,000 records have been selected and processed, and then they fail. The failures are inconsistent, in the sense that some of these packages fail while others pass, depending on the 'mood' of SSIS.
These packages use Oracle's supplied OLE DB drivers and have been running well since day one. Should I be concerned about Oracle terminating the connection after 'x' minutes, or something like that?
I'm absolutely puzzled by this behavior and would appreciate it if someone could point me in the right direction. This is a 64-bit server and SQL Server is installed as 64-bit as well.
I have a column in an Oracle source system with data type NUMBER(38,2). The value "-0.01" is causing problems when trying to import it into an SSIS data flow.
The only way I can import this into my data flow is by using a DataReader source through a connection manager that uses the ODBC data provider; the DSN uses the Oracle ODBC driver.
If I try to use the "Native OLE DB\Microsoft OLE DB Provider for Oracle" I get an error: "The data value cannot be converted for reasons other than sign mismatch or data overflow".
Judging by this post: http://microsoftdw.blogspot.com/2005/11/final-storyhow-to-get-data-out-of.html there aren't really any other combinations to try that will bring my data in as I want it.
I don't want to use ODBC, though, because:
- It's an old technology.
- It's slow.
- I have to deploy an additional DSN.
Can anyone tell me why the other options don't work? Why does Microsoft OLE DB Provider for Oracle have a problem with "-0.01"?
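One general workaround for provider-level numeric conversion errors, offered only as something to try (I haven't verified it against this exact provider/value combination), is to cast the column inside the Oracle source query so that the metadata handed to SSIS has a smaller, explicit precision and scale. Column and table names here are placeholders:

-- Oracle-side SQL used as the data access query in the OLE DB source
SELECT CAST(AMOUNT AS NUMBER(18,2)) AS AMOUNT
FROM   MY_SCHEMA.MY_TABLE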
Does anyone know how to import data from an Oracle view (on a Unix machine) into tables in SQL Server 7.0 on NT? Or at least point me to the right documentation, if available. Thanks a lot.
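One route that works even on SQL Server 7.0, sketched here under the assumption that a linked server (ORA_SRV below, a made-up name) has been defined against the Oracle instance with sp_addlinkedserver, is to pull the view through OPENQUERY; the owner and view names are placeholders:

-- OPENQUERY ships the query to Oracle; SELECT ... INTO creates the SQL Server
-- table from whatever the view returns.
SELECT *
INTO   dbo.OracleViewCopy
FROM   OPENQUERY(ORA_SRV, 'SELECT * FROM SCOTT.MY_VIEW')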
We will need to routinely import only changed data from an Oracle database into a SQL Server database. So we need an agent that will 1) compare the data in both databases (from disparate tables) and 2) import only what has changed or is new.
I am new to SQL Server administration and am looking for a best-practice method that can be run on a weekly basis. I am open to using third-party software solutions, but would prefer a native MS SQL 2000 solution. Can someone point me in the right direction? Thanks.
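A rough native sketch of a weekly delta load, assuming a linked server (ORA_SRV, a made-up name) to the Oracle database and that a single key column identifies each row. Table, key, and column names are placeholders, NULL handling and deletes are ignored, and the whole thing would run as a scheduled SQL Server Agent job:

-- 1) Pull the current Oracle rows into a working snapshot.
SELECT *
INTO   #OraSnapshot
FROM   OPENQUERY(ORA_SRV, 'SELECT CUST_ID, CUST_NAME, LAST_UPDATED FROM APP.CUSTOMERS')

-- 2) Insert rows that do not yet exist in the SQL Server copy.
INSERT INTO dbo.Customers (CustId, CustName, LastUpdated)
SELECT s.CUST_ID, s.CUST_NAME, s.LAST_UPDATED
FROM   #OraSnapshot s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.CustId = s.CUST_ID)

-- 3) Update rows that exist but have changed.
UPDATE c
SET    c.CustName    = s.CUST_NAME,
       c.LastUpdated = s.LAST_UPDATED
FROM   dbo.Customers c
JOIN   #OraSnapshot s ON s.CUST_ID = c.CustId
WHERE  c.CustName <> s.CUST_NAME OR c.LastUpdated <> s.LAST_UPDATED

DROP TABLE #OraSnapshot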
I chose the Microsoft OLE DB Provider for Oracle from the dropdown, and I have the username and password. I entered Xtx20xxx as the server name, clicked "Properties", put in the username and password I was given, and clicked "Test Connection". No matter what I do, I get the infamous ORA-12154: TNS:could not resolve service name error message. I am sure I am missing something obvious here, but I have absolutely no clue, and all the googling in the world has not helped. I do have the Oracle 10g client tools installed on this workstation (running Windows XP Professional SP2 and SQL Server 2005 Standard).
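For what it's worth, ORA-12154 generally means the name typed into the server box could not be resolved by the Oracle client's naming layer (typically tnsnames.ora), so the alias entered there has to match an entry in that file. A minimal entry looks roughly like the following; the host, port, and service name are placeholders, not values from this post:

Xtx20xxx =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = orcl))
  )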
I'm using:
Destination - Oracle driver - OraOLEDB.Oracle.1 (Native OLE DB\Oracle Provider for OLE DB)
Source - SQL driver - Microsoft OLE DB Provider for SQL Server
I want to import data from SQL Server to Oracle. The challenge is that I have 1 million records in Oracle and 100 records in SQL Server (the count of 100 records will change daily). So I thought of using a Lookup task, taking each record from MS SQL and fetching the corresponding record from Oracle. But when I use the Lookup, all the records from Oracle load into the cache, which takes approximately 3 hours.
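One alternative to letting the Lookup cache the whole Oracle table, sketched only as an idea to benchmark: push the ~100 SQL Server keys into a small staging table on the Oracle side and let Oracle do the join, so only the matching rows come back. ORA_LINK, the APP schema, STG_KEYS, BIG_TABLE, and the column names are all placeholders, and it assumes a linked server and the Oracle staging table already exist.

-- clear and load the day's keys into the Oracle staging table
DELETE FROM ORA_LINK..APP.STG_KEYS

INSERT INTO ORA_LINK..APP.STG_KEYS (KEY_ID)
SELECT KeyId FROM dbo.DailyKeys

-- let Oracle do the join and return only the matching rows
SELECT *
FROM   OPENQUERY(ORA_LINK,
       'SELECT t.* FROM APP.BIG_TABLE t JOIN APP.STG_KEYS k ON k.KEY_ID = t.KEY_ID')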
I am running SQL Server 2005 and have built a simple SSIS package to import data from an Oracle database into my SQL Server 2005 database. When I run it in SSIS, it works and imports just fine. When I schedule it, it gives me problems. Help!
It might be a problem with the Oracle provider/client I installed. Is there a client version I can download and install? The one I downloaded from Oracle doesn't work. I bet I did something wrong, though.
Here is my version:
Microsoft SQL Server Management Studio 9.00.2047.00
Microsoft Analysis Services Client Tools 2005.090.2047.00
Microsoft Data Access Components (MDAC) 2000.086.1830.00 (srv03_sp1_rtm.050324-1447)
Microsoft MSXML 2.6 3.0 4.0 6.0
Microsoft Internet Explorer 6.0.3790.1830
Microsoft .NET Framework 2.0.50727.42
Operating System 5.2.3790
Microsoft Visual Studio 2005 Version 8.0.50727.42 (RTM.050727-4200)
Microsoft .NET Framework Version 2.0.50727
Installed Edition: IDE Standard
SQL Server Analysis Services Microsoft SQL Server Analysis Services Designer Version 9.00.2047.00
SQL Server Integration Services Microsoft SQL Server Integration Services Designer Version 9.00.2047.00
SQL Server Reporting Services Microsoft SQL Server Reporting Services Designers Version 9.00.2047.00
I am getting this error when connecting to an Oracle database. I tried using the Microsoft OLE DB Provider for Oracle; it gives me an error and tells me the error could not be retrieved from Oracle. When I try the native OLE DB provider for Oracle I get:
Warning at {0F67F2FA-E3F8-4F44-93EC-47D513A34FD4} [Orcale Database WPHP2 [1]]: Cannot retrieve the column code page info from the OLE DB provider. If the component supports the "DefaultCodePage" property, the code page from that property will be used. Change the value of the property if the current string code page values are incorrect. If the component does not support the property, the code page from the component's locale ID will be used.
Error at Copy DSS_SJV_Volume_History [DTS.Pipeline]: The "output column "COMP_UNIQUE_ID" (2007)" has a precision that is not valid. The precision must be between 1 and 38.
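In case it is the usual culprit: when an Oracle column is declared as plain NUMBER with no precision, the OLE DB provider can report a precision that SSIS considers invalid. A common workaround, offered here only as a guess since I can't see the table definition, is to give the column an explicit precision in the source query; the owner name and the other column names below are placeholders:

SELECT CAST(COMP_UNIQUE_ID AS NUMBER(38,0)) AS COMP_UNIQUE_ID,
       OTHER_COLUMN_1,
       OTHER_COLUMN_2
FROM   APP.DSS_SJV_VOLUME_HISTORY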
I'm getting the following error when I try to export from Oracle 9i and import into SQL Server 2005, using the Oracle and SQL OLE DB providers respectively.
Cannot retrieve the column code page info from the OLE DB provider. If the component supports the "DefaultCodePage" property, the code page from that property will be used. Change the value of the property if the current string code page values are incorrect. If the component does not support the property, the code page from the component's locale ID will be used.
I know there is a way to change the property settings for the OLE DB Provider in the Data flow section of the development studio BUT ...
1) Is there a way to change this outside of the development studio?
2) I can't create the package to set up in the studio, because even when I uncheck "Execute immediately" and check "Save package", it still fails and never creates anything.
Hi, I have a question about importing Oracle data where some of the fields are null. I get the error 'Conversion failed because the data value overflowed the specified type'. When I look at the query result via the OLE DB Source Editor > Preview, this field contains '<value too big to display>'.
I create a new Connection Manager by right-clicking in the Connection Managers section of the design area of the screen. I select New OLE DB Connection to bring up the Configure OLE DB Connection Manager dialog box, click New to open the Connection Manager, choose the Microsoft Jet 4.0 OLE DB Provider in the Provider drop-down list, and click OK. I browse to the Access database file and the connection is set up. All good!
In the Data Flow task I add an OLE DB Source component and double-click the icon to open the OLE DB Source Editor. I set the OLE DB Connection Manager property to the Connection Manager I created and select Table from the Data Access Mode drop-down list. I cannot see the tables that are set up as pass-through table types to an Oracle 9i database.
I have become frustrated and I am not finding the answers I expect.
Here's the gist: we support both Oracle and SQL Server for our product, and we would like to migrate the clients who are willing/requesting to go from Oracle to SQL Server. Seems easy enough.
So I create a database in SQL 2005, right-click and select "Import Data", choose the Microsoft OLE DB Provider for Oracle as the source, and set up my connection. So far so good.
I create my destination using the SQL Native Client, pointing to the database I plan on importing into. Still good.
Next, I select "Copy data from one or more tables or views". I move on to the next screen and select all of the objects from one schema. These are tables that relate only to our application; in other words, nothing Oracle system-wise.
When I get to the end it progresses to about 20% and then throws this error about 300 or so times:
Could not connect source component. Warning 0x80202066: Source - AM_ALERTS [1]: Cannot retrieve the column code page info from the OLE DB provider. If the component supports the "DefaultCodePage" property, the code page from that property will be used. Change the value of the property if the current string code page values are incorrect. If the component does not support the property, the code page from the component's locale ID will be used.
So I'm thinking, "Alright, we can search on this error and I'm sure there's an easy fix." I do some checking and indeed find out that there is a property setting called "AlwaysUseDefaultCodePage" in the OLE DB Data Source properties. Great! I go back and look at the connection in the Import wizard and... there's nothing with that property!
Back to the drawing board. I create a new SSIS package and quickly figure out that AlwaysUseDefaultCodePage is in there. I can transfer information from the Oracle source table to the SQL Server 2005 destination table, but it appears to be a one-to-one thing. Programming this, if I get it to work at all, will take me about 150 hours or so.
This makes perfect sense if all you are doing is copying a few columns or maybe one or two objects, but I am talking about 600+ objects with upwards of 2 million rows of data in each!
This raises two questions: 1. If the Import Data Wizard cannot handle this operation on the fly, why isn't the AlwaysUseDefaultCodePage property exposed as part of the connection? 2. How do I create an SSIS package that will copy all of the data from Oracle to SQL Server? The destination tables have already been created and have the same schema and object names as the source. I don't want to create a Data Flow task 600 times.
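For question 2, one metadata-driven alternative to hand-building 600 data flows, offered strictly as a sketch to benchmark (it trades the SSIS pipeline for plain linked-server inserts): loop over the already-created destination tables and generate one INSERT ... OPENQUERY per table. ORA_SRV and APPSCHEMA are made-up names, and it assumes the column order matches between source and destination and that there are no identity columns to complicate SELECT *.

DECLARE @tbl sysname, @sql nvarchar(4000)

DECLARE tbl_cur CURSOR FOR
    SELECT TABLE_NAME
    FROM   INFORMATION_SCHEMA.TABLES
    WHERE  TABLE_TYPE = 'BASE TABLE'

OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @tbl

WHILE @@FETCH_STATUS = 0
BEGIN
    -- build and run one copy statement per destination table
    SET @sql = N'INSERT INTO dbo.' + QUOTENAME(@tbl) +
               N' SELECT * FROM OPENQUERY(ORA_SRV, ''SELECT * FROM APPSCHEMA.' + @tbl + N''')'
    EXEC (@sql)
    FETCH NEXT FROM tbl_cur INTO @tbl
END

CLOSE tbl_cur
DEALLOCATE tbl_cur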
Dear all, I used some code that imports data from an Excel file into a DataSet. Now I want to insert the DataSet into a table in my database (a SQL Server database) using VB.NET code. Could you help me? Thanks in advance. Here is my code:

Imports System.Data.OleDb

Partial Class _Default
    Inherits System.Web.UI.Page

    Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
        Dim connString As String = ConfigurationManager.ConnectionStrings("xls").ConnectionString
        ' Create the connection object
        Dim oledbConn As OleDbConnection = New OleDbConnection(connString)
        Try
            ' Open connection
            oledbConn.Open()
            ' Create OleDbCommand object and select data from worksheet Sheet1
            Dim cmd As OleDbCommand = New OleDbCommand("SELECT * FROM [Sheet1$]", oledbConn)
            ' Create new OleDbDataAdapter
            Dim oleda As OleDbDataAdapter = New OleDbDataAdapter()
            oleda.SelectCommand = cmd
            ' Create a DataSet which will hold the data extracted from the worksheet.
            Dim ds As Data.DataSet = New Data.DataSet()
            ' Fill the DataSet from the data extracted from the worksheet.
            oleda.Fill(ds, "Sheet1")
            ' Bind the data to the GridView
            GridView1.DataSource = ds.Tables(0).DefaultView
            GridView1.DataBind()
        Catch
        Finally
            ' Close connection
            oledbConn.Close()
        End Try
    End Sub

End Class
I'm trying to replicate two very big databases with about 10 million rows of roughly 4,000 characters each. The publisher is SQL 2000; the subscriber is SQL 7.0.
The subscriber will also perform full text searches.
I'm trying to decide whether I should use PULL or PUSH. The publisher is operating on a very low-quality, low-speed internet connection, whereas the subscriber is enjoying a T1.
While trying to push a tracked table using RDA.push, I get the following error:
Error Code: 80004005
The message cannot be built. The make message failed.
Minor Err: 28581
Source: Microsoft SQL server 2005 Mobile Edition.
All other tables in the database are getting pulled and pushed correctly. This table is different only in the larger number of columns, around 150. It has a primary key, no other constraints.
Any help to find the reason for this error will be greatly appreciated.
I am developing an application in which I have to send data from a local SQL Server Compact Edition database (which lives on a Windows Mobile device) to a central server (SQL Server 2005). I am using the RDA method for communication.
Can I use the Push method to send data from the local DB to the central DB?
Is it necessary to use the Pull method before using the Push method?
Hello all. Please excuse my ignorance, as this is not my territory. I administer a website which is hosted remotely. The site runs on SQL 7, which holds the data used to dynamically build the site. Every Sunday our hosting service runs a DTS package to push the data they have down to us, so we can run reports and analyze it. We recently upgraded to SQL 2000, while our host has stayed with SQL 7. Now our DTS is failing. They say it is because 7 cannot push to 2000, but they think that we could pull from them. How do I go about setting that up? Will the DTS wizard walk me through most of it?
I need to copy a large amount of data from one table and insert it into another table.
The design of the destination table is exactly the same as the source table except for the fact that it has one extra field. Can I copy, in a single SQL statement, all rows in one table (that match given criteria) into another table, allowing for the extra field?
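Yes; a single INSERT ... SELECT can do it, supplying a literal (or any expression) for the extra column. A minimal sketch with made-up table, column, and filter names, and with 'migrated' standing in for whatever the extra field should hold (it could also be left out entirely if the column allows NULL or has a default):

INSERT INTO dbo.DestTable (Col1, Col2, Col3, ExtraCol)
SELECT Col1, Col2, Col3, 'migrated'
FROM   dbo.SourceTable
WHERE  Col1 > 1000      -- the "given criteria"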
I have a production database that I would like to have copied over to a backup database on a separate server every evening. I don't want to mirror, I just want the databases synced up every evening.
The servers are physically attached through a gigabit switch and the database is relatively small, so I don't think that speed will be an issue.
Could someone point me to an article about the best way to accomplish this?
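One simple pattern for this, sketched with made-up database names, paths, and logical file names (and assuming nobody needs the standby copy while the restore runs): a nightly SQL Server Agent job that backs up on the production server to a share, followed by a job on the other server that restores it.

-- On the production server (nightly job step):
BACKUP DATABASE ProdDB
TO DISK = N'\\standbyserver\sqlcopy\ProdDB.bak'
WITH INIT

-- On the standby server (job step scheduled after the backup finishes):
RESTORE DATABASE ProdDB
FROM DISK = N'\\standbyserver\sqlcopy\ProdDB.bak'
WITH REPLACE,
     MOVE 'ProdDB_Data' TO N'D:\SQLData\ProdDB.mdf',
     MOVE 'ProdDB_Log'  TO N'E:\SQLLogs\ProdDB_log.ldf'

Log shipping automates essentially this same backup/copy/restore pattern, if it is available in the edition being used.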
I did not see this one coming, and I am not sure if I did something wrong.
How do you push data from SQL 2005 to SQL 2000?
I set up a Data Flow task with one SQL 2005 connection manager and another SQL 2000 connection manager. Then when I tried to map them, I couldn't!
The message box said: "The connection manager uses an earlier version of a SQL Server provider. Bulk insert operations require a connection that uses a SQL Server 2005 provider."
I have been trying different sources, destinations, and transformations, but it seems like I'm missing something.