Importing Data From Oracle To SQL Losing Data After The Decimal Point
Jun 18, 2007
I have created a simple package that uses a SQL command to pull data from an Oracle database and inserts the data into a SQL 2005 table. Some of the data fields that I am pulling from contain two digits after the decimal point; however, this data is lost when it gets into SQL Server. I have even tried putting the data into a flat file, and still the data is lost.
In the package I have an OLE DB source connection, which is the Oracle database, and when I do the preview I see all the data I need. I am very confused and have tried a number of things to get the data into SQL Server, but none work. Any ideas would be very helpful.
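One thing that might be worth checking here is whether the source column comes across as an unbounded Oracle NUMBER, in which case SSIS can map it to an integer type. A hedged workaround is to cast to a fixed scale in the OLE DB source query so the pipeline sees an explicit decimal (the table and column names below are made up):
-- Oracle-side cast inside the source query; my_oracle_table / amount are placeholders
SELECT CAST(amount AS NUMBER(12,2)) AS amount
FROM my_oracle_table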
I am creating a table on SQL Server. One of the columns in this new table contains whole integers as well as decimal values (e.g. 4500, 0.9876). I currently have this column defined as decimal(12,4). This adds 4 digits after the decimal point to the whole integers. Is there a data type that will have the decimal point only for decimal values and no decimal point for the whole integers?
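As far as I know there is no numeric type that drops the trailing zeros only for whole numbers; it has to be handled at display time. A minimal sketch, assuming the whole values fit in an int:
DECLARE @v decimal(12,4) = 4500;
-- show whole numbers without a decimal point, everything else with the stored scale
SELECT CASE WHEN @v = FLOOR(@v) THEN CAST(CAST(@v AS int) AS varchar(20))
            ELSE CAST(@v AS varchar(20)) END AS display_value;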
Hi. I have a column in the database of type float, and I want to limit the number of digits after the decimal point to 2 when I display the value in ASP.NET, but I don't know how. The number that appears after the calculation looks like "93.333333". I tried decimal(2,2) as the data type, but I get this error: "Unable to modify table. Arithmetic overflow error converting float to data type numeric. The statement has been terminated." Can you help me? Thanks.
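For reference, decimal(2,2) can only hold values smaller than 1 (precision 2 with scale 2 leaves no digits before the point), which is what triggers the overflow. A minimal sketch of rounding a float to two places, assuming something like decimal(10,2) is wide enough for the data:
DECLARE @v float = 93.333333;
-- precision 10, scale 2 keeps up to 8 digits before the point and 2 after
SELECT CAST(@v AS decimal(10,2)) AS rounded_value;   -- returns 93.33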
I have a simple Integration Services project, and the problem is that decimal fields are importing as real (I'm losing the digits behind the decimal point).
The project contains a data flow task importing a flat file (.csv) to an SQL Server destination. My .csv file has two decimal type fields. Example:
Field 1: 12345.67
Field 2: .123456
My database table that I'm importing to has two fields. The only way that I can get this data to import is to define the fields as "float" in both the text file and database table. I want the database table fields to be defined as decimal or numeric, not float:
Field 1: decimal(7,2)
Field 2: decimal(6,6)
When all fields are defined as decimal (in both the flat file connection and the database table), I get the following results:
Field 1: 12345.00
Field 2: .000000
How does one import decimal data from a flat file (.csv)?
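One workaround that is sometimes suggested, sketched below under the assumption that a staging step is acceptable (all object names are made up): load the two fields from the .csv as strings first, then convert them in T-SQL so the scale is applied explicitly.
-- staging table with varchar columns loaded straight from the .csv
CREATE TABLE dbo.StagingAmounts (Field1 varchar(20), Field2 varchar(20));
-- typed target table matching the wanted definitions
CREATE TABLE dbo.Amounts (Field1 decimal(7,2), Field2 decimal(6,6));

-- after the flat-file load into the staging table, convert explicitly
INSERT INTO dbo.Amounts (Field1, Field2)
SELECT CAST(Field1 AS decimal(7,2)), CAST(Field2 AS decimal(6,6))
FROM dbo.StagingAmounts;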
Problem importing data from a flat file into a decimal(9,2) field. The data in the flat file is 000001453, and I am copying it to a decimal(10,2) field; instead of showing up as 0000014.53 it comes across as 0001453.00. I tried defining the input columns a few different ways but none seemed to work. How do I do this with SSIS, or do I need to write a stored procedure and use CONVERT? Thanks.
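Since the file stores an implied two-digit decimal with no point, one option (a sketch, not SSIS-specific) is to treat the raw value as a whole number and divide by 100 during the conversion:
DECLARE @raw char(9) = '000001453';
-- divide by 100.0 to place the implied decimal, then cast to the target precision
SELECT CAST(CAST(@raw AS decimal(12,0)) / 100.0 AS decimal(10,2)) AS converted;   -- returns 14.53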
Using the DTS Wizard, on the Choose a Data Source screen, I pick the following:
Data Source: Microsoft ODBC Driver for Oracle
Server:
Username:
Password:
I don't know what I should put into the other fields, given what I have above, i.e. what to put in for Server, Username, and Password! Or should I use a different procedure to import this Oracle data? I am almost positive I was not supplied the username/password. I was told I could connect given the above TNS information. What do you all recommend?
Hi, I created a SQL Server package and scheduled the job. SQL Server allows us to connect to different databases, e.g. to Oracle using the "Oracle Provider for OLE DB" to retrieve our data. A linked server is created between the databases to move the data to SQL Server. I'm sure there are no issues with the link, as I'm using it to retrieve several Oracle databases that contain both Arabic and English data.
But after the import it is showing junk values. Please advise what step I should take next.
OK, so there is some data in an Oracle DB that I have to summarize based on grouping info stored in a SQL Server DB. How can I import the Oracle data into a SQL Server temp table using SQL Server Express? Thanks.
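One way, assuming the Oracle client and a TNS alias are set up on the Express box (the alias, credentials and query below are placeholders), is an ad hoc OPENROWSET pull:
-- Ad Hoc Distributed Queries is off by default and has to be enabled once
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- pull the Oracle rows straight into a local temp table
SELECT *
INTO #OracleData
FROM OPENROWSET('MSDAORA', 'MyTnsAlias';'oracle_user';'oracle_password',
                'SELECT col1, col2 FROM source_table');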
I wanted to convert a dataset from VB.NET (2.0) to an .XLS file via MS Jet. My national standard uses decimal commas, not decimal points, to mark the beginning of decimal places, but the MS Jet engine uses the decimal point by default. Therefore, in the Excel file only string-formatted cells can accept this data, not number-formatted ones. How can I solve or work around this problem (with Jet, if possible)? iviczl
I'd like to convert a Decimal value into a string so that the entire original value and length remain intact but there is no decimal point. For example, the decimal value 6.250 is selected as 06250. Can this be done?
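Yes; since casting a decimal to varchar keeps the declared scale, one sketch is to cast, strip the point, and pad on the left (the width of 5 below just matches the 06250 example):
DECLARE @v decimal(6,3) = 6.250;
-- '6.250' -> '6250', then left-pad to the wanted fixed width
SELECT RIGHT('00000' + REPLACE(CAST(@v AS varchar(10)), '.', ''), 5) AS no_point;   -- returns 06250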
I am designing some reports for a German branch of my company and need to replace the decimal point with a comma and the thousands separator (comma) with a decimal point.
e.g. €1,500,123.00 to €1.500.123,00
Is there a property that I can change in the report designer to allow this to happen, or is this something I need to convert in a stored proc?
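If I remember correctly, the report's Language property together with a numeric format string usually handles locale-specific separators; if it has to be done in T-SQL instead, a rough sketch (swapping the separators through a placeholder character so they don't collide) would be:
DECLARE @v money = 1500123.00;
DECLARE @s varchar(30) = CONVERT(varchar(30), @v, 1);   -- style 1 gives '1,500,123.00'
-- swap , and . via a temporary | so the two replacements don't overwrite each other
SELECT REPLACE(REPLACE(REPLACE(@s, ',', '|'), '.', ','), '|', '.') AS german_format;   -- '1.500.123,00'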
For some reason my SQL Server has stopped and I am not able to restart it at all. Do I have to reinstall the complete SQL software, or is there any way I can start the MSSQL Server service?
I need to copy the table structures from my production database to the development database but not lose the data in development. Is there a way to achieve this by creating some scripts? Thanks.
I am using Windows 2003 Server and SQL Server 2005. Using a linked server I made a connection to Oracle 10g, and after that I am importing records from Oracle to SQL Server 2005. When I made tnsnames.ora on the SQL machine it worked fine, but when I use the tnsnames file from the Oracle server and fire the import procedure, it returns the error below:
OLE DB provider "MSDAORA" for linked server "BI_ORACLE_LS" returned message "Unspecified error".
OLE DB provider "MSDAORA" for linked server "BI_ORACLE_LS" returned message "Oracle error occurred, but error message could not be retrieved from Oracle.".
Msg 7311, Level 16, State 2, Line 1
Cannot obtain the schema rowset "DBSCHEMA_TABLES" for OLE DB provider "MSDAORA" for linked server "BI_ORACLE_LS". The provider supports the interface, but returns a failure code when it is used.
I'm using a "data conversion" object to convert a numeric field to a string just before I save the record set to the database.
The problem is that when this numeric field is > 0 it loses the precision of its decimal value.
For example, if the numeric value is 0.32,
after converting this to a string, the new value will be: .32
It has lost the 0 in front of it. I can't do this conversion at the query level because it's a derived field, so I need to convert it to a string before storing it.
When converting to a string I'm using code page 1252 (ANSI - Latin I). I also tried a Unicode string; both lose this 0 in front.
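For what it's worth, T-SQL's own cast keeps the leading zero, so one hedged option is to do the string conversion in a SQL statement (for example in a staging step) rather than in the Data Conversion component; the variable below just stands in for the derived value:
DECLARE @v numeric(10,2) = 0.32;
SELECT CAST(@v AS varchar(20)) AS string_value;   -- returns '0.32', leading zero kept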
I created a custom transform that has a custom interface and is a wizard that uses a web service. It creates custom properties and output columns on the fly. I set the dialog result to OK and close at the end of the steps. The transform then has the custom fields and output columns I created in the wizard; I've verified this by right-clicking on the transform and going to the advanced editor. If I then immediately run the package, the custom fields don't exist in the CustomPropertiesCollection. If I close the package and reopen it, the properties are now gone. If I then go through the wizard again, thus recreating the properties, they stay and don't disappear. The quickest way to get a working transform is to add it to my data flow, then save, close and reopen the package, and then go through the wizard. Just saving after I add the transform does not help.
Does anyone know what might be causing this very strange problem?
I am working with a legacy SQL Server database from SQL Server 2000. I noticed that in some places they use decimal data types where I would normally think they should be using integer data types. Does anyone know why this is?
Example: AutomobileTypeId (PK, decimal(10,0), not null)
I would like to cast (convert) data type decimal(24,4) to decimal(21,4). I could not do this using the standard casting functions CAST(@variable as decimal(21,4)) or CONVERT(decimal(21,4), @variable) because of the following error: "Arithmetic overflow error converting numeric to data type numeric." Is that because of a possible loss of the value?
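Yes, the error just means some value might not fit: decimal(21,4) leaves 17 digits before the point versus 20 for decimal(24,4). A sketch of a guarded cast that returns NULL for out-of-range values:
DECLARE @variable decimal(24,4) = 123456789012345678.9012;   -- 18 integer digits, too wide for decimal(21,4)
-- decimal(21,4) allows at most 17 digits before the decimal point
SELECT CASE WHEN ABS(@variable) < 100000000000000000.0   -- 10^17
            THEN CAST(@variable AS decimal(21,4))
            ELSE NULL
       END AS safe_cast;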
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is, if the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example in standard French regional settings the "." would not be recognized as decimal separator.
I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.
So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?
I have become frustrated and I am not finding the answers I expect.
Here's the gist: we support both Oracle and SQL Server for our product, and we would like to migrate clients who are willing/requesting to go from Oracle to SQL Server. Seems easy enough.
So, I create a database in SQL 2005, right-click and select "Import Data". The source is Microsoft OLE DB Provider for Oracle and I set up my connection. So far, so good.
I create my destination with SQL Native Client to the database that I plan on importing into. Still good.
Next, I select "Copy data from one or more tables or views". I move on to the next screen and select all of the objects from a schema. These are tables that relate only to our application; in other words, nothing Oracle system-wise.
When I get to the end it progresses to about 20% and then throws this error about 300 or so times:
Could not connect source component. Warning 0x80202066: Source - AM_ALERTS [1]: Cannot retrieve the column code page info from the OLE DB provider. If the component supports the "DefaultCodePage" property, the code page from that property will be used. Change the value of the property if the current string code page values are incorrect. If the component does not support the property, the code page from the component's locale ID will be used.
So, I'm thinking "Alright, we can search on this error and I'm sure there's an easy fix." I do some checking and indeed find out that there is a property setting called "AlwaysUseDefaultCodePage" in the OLEDB Data Source Properties. Great! I go back and look at the connection in the Import and .... there's nothing with that property!
Back to the drawing board. I create a new SSIS package and quickly figure out that AlwaysUseDefaultCodePage is in there. I can transfer information from the Oracle source table to the SQL Server 2005 destination table, but it appears to be a one-to-one thing. Programming this, if I get it to work at all, will take me about 150 hours or so.
This makes perfect sense if all you are doing is copying a few columns or maybe one or two objects, but I am talking about 600+ objects with upwards of 2 million rows of data in each!
This generates 2 questions: 1. If the Import Data Wizard cannot handle this operation on the fly, then why can't the AlwaysUseDefaultCodePage property be shown as part of the connection? 2. How do I create an SSIS package that will copy all of the data from Oracle to SQL Server? The destination tables have been created and have the same schema and object names as the source. I don't want to create a Data Flow Task 600 times.
I set up this package to import data from a SharePoint list to a SQL Server data table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
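One pattern that might help, assuming a staging table is acceptable (all object names below are made up): land the list rows in staging first, then insert only one row per Title and skip Titles that already exist in the target:
-- keep a single row per Title and ignore Titles already in the target table
INSERT INTO dbo.TargetTable (Title, SomeColumn)
SELECT s.Title, MAX(s.SomeColumn)
FROM dbo.StagingSharePoint AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Title = s.Title)
GROUP BY s.Title;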
The following SQL statement returns the correct totals except that the total value is shifted one decimal place to the right, i.e. a real total of 955.68 is displayed as 9,556.80. The total_ar field is a money type. Any help would be appreciated.
Mark
/* AR report: Total greater than 365 days, sorted by Dept */
select
    a.dept as [Department],
    [A/R 365+] = sum(case when datediff(day, c.bill_date, getdate()) > 365 then b.total_ar else 0 end)
-- hbl_dept had no join condition and is not referenced anywhere, so it cross-joined and multiplied the sum by its row count; dropped here
from hbm_persnl a, blt_billm b, blt_bill c, hbm_matter e
where e.matter_uno = b.matter_uno and a.empl_uno = e.bill_empl_uno and b.bill_tran_uno = c.tran_uno and b.ar_status = 'O' and e.status_code = 'OPEN'
group by a.dept
order by a.dept
I have got a txt file and I am trying to import it into the database. For that I use the SQL Server Import and Export Wizard, with locale Czech and code page 1250. In the text file the column is in this format: 18152.65, i.e. a number with a decimal point. When I use the BCP utility for importing the data, I use data type decimal(10,2) and everything is OK. But when I try to use the Import and Export Wizard and choose data type numeric (DT_NUMERIC, precision 10, scale 2) for that column, the import doesn't start and the following error occurs:
- Executing (Error)
Messages
Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "PRED_CEL " returned status value 2 and status text "The value could not be converted because of a potential loss of data.". (SQL Server Import and Export Wizard)
I have one column in SQL Server 2005 of data type VARCHAR(4000).
I have imported SQL Server 2005 database data into an mdb file. After importing the data into the mdb file, the above column's data type was converted into the Memo type in the Access database.
Now, when I am trying to import the data from this MS Access file (db1.mdb) into another SQL Server 2005 database, I get a Unicode conversion error for the Memo data type in the Import/Export Data Wizard.
Could you please let me know what the reason is?
I know that the Memo data type is not supported in SQL Server 2005.
I am with SQL Server 2005 Standard Edition with SP2.
Please help me to understand this issue correctly.
We have a daily process which copies millions of rows of data from one DB to another over a linked server. Just checking on best practice: are there more efficient ways than the linked server to copy millions of rows of data from one DB to another? I checked BULK INSERT, but that transfers data from a file to the DB, not DB to DB.
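For what it's worth, a common alternative to row-by-row pulls over four-part names is a set-based insert through OPENQUERY (or an SSIS data flow with fast load). A minimal sketch, with the linked server, table and column names as placeholders:
-- let the remote server run the SELECT and bulk the rows into the local table
INSERT INTO dbo.TargetTable WITH (TABLOCK) (Col1, Col2)
SELECT Col1, Col2
FROM OPENQUERY(LinkedServerName, 'SELECT Col1, Col2 FROM dbo.SourceTable');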
When I am importing data from Excel to SQL using DTS, the column which has text content is not imported the same as in the Excel sheet; a special character appears in between the lines. The text field contains multiple lines, but the content is imported as a single line.
I'm wondering if SSIS will be the solution to the problem I'm working on.
Some of our customers give us an Excel sheet with data they want to insert or update in the database.
I've created a package that will take an Excel sheet, do some data conversion so the data types match up, and after that I use a Slowly Changing Dimension component to create the insert/update commands.
This works great. If a customer adds a new row to the Excel sheet or updates an existing row changes are nicely reflected in the database.
But now I've got the following problem: the column names and the order of the columns in the Excel sheet are not standard, and in the future it could happen that a customer doesn't even use an Excel sheet but something totally different.
Can I use SSIS for this? Is it possible to let the user set the mappings through some sort of user interface? I've looked at programmatically creating the package, but I've got to say that's quite hard to do... It would be easier to write the whole thing myself than to create the package through code ;)
If not I thought about transforming the data in code before I pass it on to the SSIS package in something like XML. That way I can use standard column names and data types.
So how should I solve this problem? Use SSIS or not?