I have 53 tables I need to load from Oracle into SQL Server as part of a migration to a new system. All but three of the tables load perfectly. Three tables load approximately 95% of the rows. The problem tables have between 300,000 and 1,300,000 rows. Oddly, some of the tables that migrate successfully have more than 10,000,000 rows. What should I check to determine why these tables are failing to load?
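One way to narrow this down, sketched here with a hypothetical linked server ORA_SRC, a hypothetical table PROBLEM_TABLE, and a key column PK_COL (none of these names come from the post, so substitute your own): compare row counts and list the keys that exist in Oracle but not in SQL Server, then inspect those source rows for values the load might be rejecting (out-of-range dates, invalid characters, and so on).

SELECT COUNT(*) AS oracle_rows
FROM OPENQUERY(ORA_SRC, 'SELECT PK_COL FROM SCHEMA_NAME.PROBLEM_TABLE');

SELECT COUNT(*) AS sqlserver_rows
FROM dbo.PROBLEM_TABLE;

-- Keys present in Oracle but missing on the SQL Server side
SELECT PK_COL
FROM OPENQUERY(ORA_SRC, 'SELECT PK_COL FROM SCHEMA_NAME.PROBLEM_TABLE')
EXCEPT
SELECT PK_COL FROM dbo.PROBLEM_TABLE;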
I am using SSMA to migrate a database from Oracle 9i to SQL Server 2000. I successfully migrated the Oracle tables, views, procedures, functions, etc., but can you please help me with how to migrate packages and triggers? I can see headings like "Triggers", "Packages", and "Indexes" in SSMA, but none of my Oracle objects are displayed under these headings. What could be the problem?
I need a clue about how to migrate the data from Oracle Applications 11.03 and the underlying Oracle 8.05 database to Navision 4.0 running on SQL Server 2000.
I want to find out if .NET + Java + SQL Server etc. is a replacement for my present development environment.
However, I cannot migrate an Oracle database to SQL Server Express. SSMA does not work. Is there another way to migrate my Oracle database to SQL Server Express?
Recently my boss decided to migrate reports from Oracle Discoverer to SQL Server 2005 using Reporting Services, because SQL Server is less expensive than Oracle. I don't know exactly how to begin, because this is new for me. Does anyone have experience with this? Can you give me any recommendations for the migration?
I attempted to use SSMA to migrate a 2000 file format .mdb into SQL Server Express (and, as I've just purchased VS2005 Pro, into SQL Server Developer Edition).
The file is actually a backend, so nothing to migrate other than tables.
After several runs during which I received (and fixed up) some errors, the process runs smoothly with no errors or warnings until it comes to the last step: migrating the data itself.
Data migration into 'parent' tables works fine. However, wherever I have data in a table with a foreign key relationship to any of the aforementioned 'parent' tables, it refuses to migrate it. The text of the error message will be very similar to the following (from the log):
[Datamigrator: Error] [464/7] [2007-09-23 14:18:54]: Exception: The INSERT statement conflicted with the FOREIGN KEY constraint "Branch$CompanyBranch". The conflict occurred in database "DPMTest", table "dbo.Company", column 'CompanyID'.
There is nothing wrong, so far as I can determine, with the relationships involved.
I can insert data into the tables using any of the following methods: 1. Directly in Access, in the source backend. 2. Using the original Access frontend application, attached to the source backend. 3. Using VB.Net forms I am developing, in the 'upsized' database. 4. Directly in the Management Studio, in the 'upsized' database.
None of these four methods complains in the least about the relationships which SSMA balks at.
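A diagnostic sketch for this kind of failure, assuming the failing child table is dbo.Branch and its foreign-key column is CompanyID (both inferred from the constraint name and the log line above, so adjust to the real names): list the child rows whose parent key has no match, running the check against whichever copy of the data currently holds both tables.

SELECT b.*
FROM dbo.Branch AS b
LEFT JOIN dbo.Company AS c ON c.CompanyID = b.CompanyID
WHERE b.CompanyID IS NOT NULL      -- a NULL foreign key would not violate the constraint
  AND c.CompanyID IS NULL;         -- these are the rows the INSERT would reject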
I am trying to use a lookup in a package and check for some conditions. On the Advanced tab, I am trying to modify the condition from = to <=. This doesn't work when the target is Oracle, but the same thing works fine on SQL Server and DB2.
I'm getting the following error when I try to export from Oracle 9i and import into SQL Server 2005, using the Oracle and SQL OLE DB providers respectively.
Cannot retrieve the column code page info from the OLE DB Provider. If the component supports the "DefaultCodePage" property, the code page from that property value will be used. Change the value of the property if the current string code page values are incorrect. If the component does not support the property, the code page from the component's locale ID will be used.
I know there is a way to change the property settings for the OLE DB Provider in the Data flow section of the development studio BUT ...
1) Is there a way to change this outside of the development studio?
2) I can't create the package to set up in the studio, because even when I uncheck 'execute immediately' and check 'save package', it still fails and never creates anything.
I'll say up front that this is going to be a long post, because I plan on explaining my situation and what I've tried. It requires some explanation of how our application is secured, since that appears to be interfering with SSIS being able to connect.
First the background scenario.
My company writes software that can use MS SQL or Oracle as the back end. I've begun work on an initiative to extract data to a star schema prior to purge in an attempt to improve reporting and analysis available to our customers. Initial plan is to use MS SQL Integration/Analysis/Reporting services for this. Even for our Oracle customers.
The problem symptoms.
I'm attempting to get a connection to Oracle with either an OLE DB or .NET provider. I've tried all that are available and get an ORA-12638: Credential retrieval failed error on all of them. I validate that a development username and password work using both SQL*Plus and Toad, so troubleshooting begins. After some research I find that if I change the line for SQLNET.AUTHENTICATION_SERVICES in my sqlnet.ora file on the server from NTS to NONE, I can now get a connection with the SSIS Connection Manager.
The problem is, our application is secure and having this value set to NONE will never happen in production. I don't use it in dev or qa either. I also found that once I set this to none I could no longer connect with integrated security (EXTERNAL in Oracle speak) with Toad.
How our application is secured and why.
When setting up our application with Oracle as the back end, we create only two users. One is the schema and owns all objects; it can't even log into the server, though. The other is an external user. This user's only rights are to log on and membership in a role. The role can execute stored procedures and nothing more. The application server services for our software run under the username of the external user in Oracle. That way we can use integrated security, and no usernames or passwords are hard-coded anywhere. Using an external user allows us to have our Windows application server work correctly with Oracle regardless of the OS that is hosting Oracle. This has made it easy on my developers, since they just code for integrated security and we have nearly identical data abstraction layers for both MS SQL and Oracle.
I've tried using the Integrated Security = True option in the .NET provider and still get the same error. I've tried passing / as the username and get the same error. I have a regular Oracle username/password in the dev and qa environments that I provide to the developers. It has more rights than the external user the services run under, so they can investigate what is happening behind the scenes. It works with Toad and SQL*Plus. When I use it in the SSIS Connection Manager I get the same error message.
How do you use integrated security with Oracle from SSIS? I don't want to have to tell my boss that SSIS won't work for our Oracle customers.
I have created a DTS package that pulls data in from Oracle into SQL Server. When I run it directly on the server (from Enterprise Manager), it works fine. When I run it from Enterprise Manager on a client machine that does not have the Oracle client software, it does not run, giving me the error: "The Oracle client and networking components were not found...". I was hoping I wouldn't need to have the client software installed on a machine other than the server where SQL Server is running. The problem here is that there are a number of machines from which I would like to execute this DTS package, and I do not want to install/configure the Oracle client software on them. I don't quite understand why the configuration of the client is important here. In my mind, when I am using Enterprise Manager from a client machine, I am using it sort of like a terminal services client to connect to the server. I guess there is a lot more happening in the background. Thanks for any feedback, Marcus
scott/tiger is given as an example here; I used my real userid/password instead. BTW, I am able to connect to the Oracle database using that userid/password from the same host that is running the SQL Server database, so the Oracle client software is installed and is working correctly.
I then attempted to query a table on the Oracle database:
select count(*) from ORA10G..<schemaname>.<tablename>
I got the following error message(s):
OLE DB provider "MSDAORA" for linked server "ORA10G" returned message "ORA-12154: TNS:could not resolve the connect identifier specified ". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "MSDAORA" for linked server "ORA10G".
All the previous discussions and suggested solutions are for SQL Server 2000 and Oracle 8, but I did not see much discussion of SQL Server 2005 and Oracle 10g. Any help to resolve this problem will be appreciated.
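For what it's worth, ORA-12154 from a linked server usually means the connect identifier cannot be resolved by the Oracle client on the machine running SQL Server itself (not on the workstation). A minimal sketch of recreating and testing the linked server, assuming the MSDAORA provider and that ORA10G is, or should be, an alias in that machine's tnsnames.ora:

EXEC master.dbo.sp_dropserver @server = N'ORA10G', @droplogins = 'droplogins';

EXEC master.dbo.sp_addlinkedserver
     @server     = N'ORA10G',
     @srvproduct = N'Oracle',
     @provider   = N'MSDAORA',
     @datasrc    = N'ORA10G';   -- must match a tnsnames.ora entry on the SQL Server box

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORA10G',
     @useself     = 'false',
     @rmtuser     = N'scott',    -- placeholder, as in the post above
     @rmtpassword = N'tiger';

-- Test with OPENQUERY before trying four-part names
SELECT * FROM OPENQUERY(ORA10G, 'SELECT COUNT(*) FROM <schemaname>.<tablename>');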
I am extracting data from an Oracle database and have installed the latest x64 ODAC. When I run the 64-bit dtexec from cmd, the package runs fine with no errors, but when creating and executing an SSIS job in the Agent, it fails. I've tried creating the job using CmdExec as well, and I get the same errors:
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 11:12:43 AM
Error: 2007-08-15 11:12:45.63 Code: 0xC0202009 Source: Load Dimension Data Connection manager "ora" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x800703E6. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x800703E6 Description: "Invalid access to memory location.". End Error
Error: 2007-08-15 11:12:45.65 Code: 0xC020801C Source: CopyCustomers CustomerSource [1] Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "ora" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. End Error
Error: 2007-08-15 11:12:45.65
The CmdExec job step uses the _exact_ same command as the one I execute successfully in cmd: "c:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /FILE "C:\Packages\Load Dimension Data.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
I have managed to find two workarounds, but they are quite ugly:
1) Schedule the package execution using Windows Task Scheduler; basically, create a bat file which runs the package.
2) Schedule the package in SQL Server Agent, but as a T-SQL script, and use xp_cmdshell, i.e. EXEC xp_cmdshell 'dtexec /FILE "C:\Packages\Load Dimension Data.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW' (a sketch of registering this as a job step follows below).
Both work and use the 64-bit dtexec. Not that elegant, though.
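A sketch of registering workaround 2 as an Agent job step in the T-SQL subsystem, assuming a job named 'Load Dimension Data' already exists and that xp_cmdshell is enabled; the command is the one from the post:

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Load Dimension Data',
     @step_name = N'Run package via dtexec',
     @subsystem = N'TSQL',
     @command   = N'EXEC xp_cmdshell ''dtexec /FILE "C:\Packages\Load Dimension Data.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW''';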
CmdExec and the 32-bit version of dtexec work, but that is not good enough for me, since we bought new hardware and 64-bit software just to be able to address more memory. Has anyone managed to execute an SSIS package successfully using the Agent and the 64-bit OraOLEDB.Oracle.1?
Any help would be greatly appreciated. Thanks!
Btw, I am using Windows Server 2003 x64 and SQL Server with SP2.
Hi everyone, how are you all? I'm working with SQL DTS to transfer data from Oracle to MS SQL 2000. I did the transfer, but I need to delete the transferred data from Oracle in DTS. How could I do that?
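A sketch of one approach, run from the SQL Server side after the transfer, assuming a linked server to Oracle named ORA_SRC, a source table OWNER.SRC_TABLE, a destination table dbo.DEST_TABLE, and a shared key column ID (all hypothetical names); inside DTS itself, the equivalent Oracle DELETE could instead go in an Execute SQL task on the Oracle connection as the final step of the package.

-- Remove only the rows that actually arrived in SQL Server
DELETE o
FROM ORA_SRC..OWNER.SRC_TABLE AS o
WHERE EXISTS (SELECT 1 FROM dbo.DEST_TABLE AS d WHERE d.ID = o.ID);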
I seem to have a strange problem in that my code runs fine, but I am trying to INSERT rows into a database, and the rows affected by the INSERT command is always zero, even though no error is thrown. Hence no data gets inserted.
Here is the code:
void recordTick(string pair, DateTime time, decimal sell, decimal buy)
{
    // Verbatim string (@) so the backslashes are not treated as C# escape sequences.
    SqlConnection thisConnection = new SqlConnection(@"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\FOREX.mdf;Integrated Security=True;User Instance=True");
    // Note: building SQL by concatenating values is fragile (and open to injection); parameters are safer.
    string insert = "INSERT INTO " + pair + " VALUES(" +
        "'" + time.ToString("yyyy-MM-dd HH:mm:ss") + "'," +   // original format "HH:mms" looks like a typo
        "'" + sell.ToString() + "'," +
        "'" + buy.ToString() + "'" + ")";
    thisConnection.Open();
    SqlCommand command = new SqlCommand(insert, thisConnection);
    int rowsAffected = command.ExecuteNonQuery();   // the command must actually be executed for the INSERT to run
    thisConnection.Close();
}
The database is of type .MDF, which I created with the Project/Add Component/SQL Database menu. I can inspect it easily in Server Explorer and add data manually with no problem.
Can anyone suggest how to get the INSERT to actually add the data?
I was testing SSMA with SQL Server Express yesterday, and it converted the database over from Access 2000 to SQL Server without incident using the wizard; that is, until I attempted to execute the final step and connect to SQL Server Express. At that point, I was unable to move the database into SQL Server because the connection failed. It failed in the sense that I was able to enter the server name, but the boxes to move forward in the process were all grayed out.
Has anyone used SSMA (SQL Server Migration Assistant) with SQL Server Express?
I don't see all my MySQL tables inside SSMA 5.2 after connecting to the MySQL server and database. Some tables are shown but many are not. All tables are InnoDB in MySQL and in the same database. I can see the tables using many tools, such as MySQL Workbench. I also have a linked server in SQL Server 2012 that links to the same MySQL server and database and sees all the tables, but SSMA 5.2 does not. The linked server uses the same ODBC connection and the same credentials (username/password).
I have SSMA for MySQL Extension Pack installed, and I am using “Server Side Data Migration Engine” to migrate the tables.
I have done a "check table" from within MySQL on one of the missing tables, and there is no corruption.
I tried to use the linked server to connect from within SSMA, but it appears that SSMA does not allow me to use it?
I have an ODBC connection with both the 32-bit and the 64-bit driver. I'm running this from 64-bit Windows Server 2012.
I have an SSIS job that selects rows from a Sql Server table and inserts them into a table on an AS400 DB2 instance. This process has been running correctly for many months. Last week, the AS400 operating system was upgraded from V5r3 to V5r4 and since that upgrade, the SSIS job is failing on the insert step with the following error:
OLE DB provider "DB2OLEDB" for linked server "BKL400" returned message "SQLDA or descriptor area not valid. SQLSTATE: 07002, SQLCODE: -804".
Msg 7343, Level 16, State 2, Line 1
The OLE DB provider "DB2OLEDB" for linked server "BKL400" could not INSERT INTO table "[BKL400].[BKL400].[MM4R4LIB].[INV911WK]".
The insert is in an Execute SQL task and uses a linked server definition to access the AS400.
As I said, this process has been working well for many months until the OS upgrade. Any idea on what is causing this and how to correct it will be greatly appreciated!
I am new to databases and need some info that I'm having trouble researching. I just migrated an Access database to SQL Server 2005 using SQL Server Migration Assistant (SSMA). I wondered: how does SSMA handle NULL values? Also, how does it treat differences between Access and SQL Server? Any help would be appreciated.
I'm using SSMA to migrate Access databases into SQL Server. It's a pretty nifty tool; however, I am having some trouble fully migrating the data over.
When I select 'Convert Schema', the schemas for all the tables are created on the server and *all* the table names appear in the Server Metadata Explorer. There are the usual warnings about name changes and primary key additions and the like, but importantly, the output window says that there were no errors.

Scrolling through all the tables in the server explorer with the 'Table' tab selected confirms that the schemas for all the tables have been migrated, but for one or two of the newly created tables, the 'Data' tab shows an error message such as "Failed to retrieve data: Invalid object name 'NewDB.dbo.Pref'". Consequently, when I move on to migrating the data, the operation fails for the tables showing these errors, and the error message 'The table 'NewDB.dbo.[Pref]' does not exist in SQL Server' is shown when the table is displayed in the Metadata Explorer. When I view the server schema in Management Studio Express, it confirms that they have not been migrated. I have tried to look at the failing tables' schemas and identify any similarities which might be causing them to fail migration, but they are all very different, so I don't think it's a schema issue.
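One quick check, run in the NewDB database, is whether the 'missing' table was actually created under a different schema or a slightly different name ('Pref' here is just the example from the error message above):

SELECT s.name AS schema_name, t.name AS table_name, t.create_date
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name LIKE '%Pref%';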
I have just finished using the SSMA Access tool to migrate an Access database to SQL Server Express. I would just like some clarification on some of the warning messages please.
1. "New timestamp column 'SSMA_TimeStamp' created." How does SSMA determine whether or not to add a timestamp? 2. Why are indexes and keys represented by strings of numbers in Access? What are the numbers precisely? 3. "The foreign key '{32F3E8B2-03B3-46EE-BEBF-C202040E10F3}' is not enforced and therefore cannot be converted." Does this mean that the represented relationship will not be created on the server if referential integrity is not enforced in Access? If so, why doesn't SQL Server allow for this unenforced relationship?
Hi, using VS.NET 2008 Beta 2 and SQL Server 2005. I have a GridView bound to a LINQ data source, and when trying to update a row, I get an exception that no rows were modified. The query generated is:
UPDATE [dbo].[package] SET [owner_id] = @p5 WHERE ([package_id] = @p0) AND ([title] = @p1) AND ([directory] = @p2) AND ([owner_id] = @p3) AND ([creation_date] = @p4)
-- @p0: Input Int32 (Size = 0; Prec = 0; Scale = 0) [20006]
-- @p1: Input String (Size = 22; Prec = 0; Scale = 0) [Visual Studio.NET 2005]
-- @p2: Input String (Size = 26; Prec = 0; Scale = 0) [MSI_Visual_Studio.NET_2005]
-- @p3: Input Int32 (Size = 0; Prec = 0; Scale = 0) [10000]
-- @p4: Input DateTime (Size = 0; Prec = 0; Scale = 0) [11/07/2007 12:00:00 a.m.]
-- @p5: Input Int32 (Size = 0; Prec = 0; Scale = 0) [10001]
-- Context: SqlProvider(Sql2005) Model: AttributedMetaModel Build: 3.5.20706.1
If I run it manually on SQL Server, it fails until the directory column is removed. The type is varchar(50), with a uniqueness constraint. However, this is the same type as the title column, which doesn't have this problem. Thanks, Jessica
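A diagnostic sketch using the values visible in the generated UPDATE above: check whether the stored [directory] value for package_id 20006 really matches, byte for byte, what LINQ sends in the WHERE clause. Trailing spaces or hidden characters would show up as a difference between LEN and DATALENGTH.

SELECT package_id,
       directory,
       LEN(directory)        AS char_len,   -- LEN ignores trailing spaces
       DATALENGTH(directory) AS byte_len    -- DATALENGTH counts every byte stored
FROM dbo.package
WHERE package_id = 20006;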
I have SQL 2000 and installed SQL 2005 Express. I had done some test migrations of Access 2003 to SQL 2000 using the migration tool for that. I upgraded that migration assistant to SQL 2005 (my mistake), and now I have 2 instances of SQL 2005 running (according to Surface Area Configuration), and even the server manager for SQL 2000 can see those instances, but SSMA 2005 refuses to see them. My migration test project is dead in the water. Can I get a copy of SSMA for SQL 2000 back? Anybody? Thanks.
I have been reading some topics on "slow Oracle connections", but I have not found an answer for my question yet.
The problem is that I am extracting data from an Oracle database to raw files (to import into our DWH later on). For one table this takes more than 1 hour! I find this to be very slow, but maybe I am expecting too much speed?
Can anyone tell me how long it should take to download 10 million rows from Oracle to Raw file (no transformations)?
Some extra information:
- The table has 10 million rows and a little over 200 columns
- The datatypes in the Oracle database are varchar, number and date
- I tried to tune the buffer size in the data flow and found that I could fit about 1900 rows in a DefaultBufferSize of 100 MB
- I use the "Native OLE DB\Microsoft OLE DB Provider for Oracle" (Provider=MSDAORA.1)
Can I do anything to get a better performance or is 1 hour for a table like this the best performance we can get?
(Changing the Oracle database is unfortunately not an option...)
This is the error I get with a simple Data Reader Source (SELECT * FROM A) to SQL Server Destination copy between identical tables from a linked to a local server:
I want to migrate a DB from Sybase to SQL Server 2012, and I am using SSMA for it. Until yesterday it was working fine. Today when I try connecting to Sybase I get the error below.
Unable to find specified provider. Error occurred while establishing connection to Sybase server. If you have 64-bit connectivity components installed on the machine, you can run 64-bit version of SSMA application whose shortcut can be found under the Programs menu. Or you can install the 32-bit connectivity libraries from Sybase product media or download it from Sybase web site.
I am copying data (~600,000 rows) using SqlBulkCopy. The operation fails after copying 390,000 rows with the following exception (this happens every time I run it, and after the same number of rows copied). Is there anything else I need to do differently? The server has 35 GB of free space and 1 GB of RAM.
System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected)
   at System.Data.SqlClient.TdsParserStateObject.ReadBuffer()
   at System.Data.SqlClient.TdsParserStateObject.ReadByte()
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
   at PSMigrate.Program.MigrateWorkItesLatestData() in E:\dd_tfs_beta3\vset\SCM\workitemtracking\Tools\PSMigrate\Program.cs:line 522
   at PSMigrate.Program.MigrateData() in E:\dd_tfs_beta3\vset\SCM\workitemtracking\Tools\PSMigrate\Program.cs:line 106
   at PSMigrate.Program.Main(String[] args) in E:\dd_tfs_beta3\vset\SCM\workitemtracking\Tools\PSMigrate\Program.cs:line 86
Here is the part of the code
using (SqlCommand cmd = new SqlCommand())
{
SqlDataReader dataReader;
cmd.CommandText = sql;
cmd.CommandType = CommandType.Text;
cmd.Connection = conn;
conn.Open();
dataReader = cmd.ExecuteReader();
// write the data to the server
using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(m_destConnString))
{
// column mappings
// event settings
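// Note: BulkCopyTimeout is not set in the code shown here; SqlBulkCopy defaults to a
// 30-second timeout, which is consistent with the 'Timeout expired' exception above
// (assuming it is not set in the omitted part of the code).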
sqlBulkCopy.NotifyAfter = 5000;
sqlBulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(RowsCopiedEventHandler);