Text Qualifier In SSIS Designer For Flat File Source
May 19, 2008
Unlike SQL Server 2000 DTS, the SSIS Flat File Connection Manager Editor does not provide a list of available text qualifiers.
I tried
-- ""
-- double quote {"}
-- "
-- {"}
but none of them worked!
My sample file looks like this:
"Col1","Col2"
"1234","3456"
"3456","1234"
what qualifier should I use then?
Many Thanks,
View 7 Replies
Jan 3, 2008
I have made a package that reads a text file into a table.
The data in the file is roughly as follows.
1, "a", "b,c,ddddd", 4, "ee", 0 with the column sizes int, 1, 50, int, 2, int
On all the desktop BIDS environments and a SQL 2005 instance on Windows Server 2000, the package runs fine.
On the two Windows Server 2003/SQL 2005 machines, a truncation error occurs on column 5.
What is happening is that on these machines the text qualifier is being ignored, and column 5 reads as ddddd instead of ee!
I figured this out by changing all fields to text and seeing what came out of the other end.
It is very, very strange and I've hit a brick wall.
If anyone has any ideas, please throw them in the mixer.
Cheers,
Steve
View 5 Replies
View Related
Jun 7, 2007
I'm exporting using a query to a flat .txt file. The problem I'm encountering is that when I export the data and then open the .txt file in Excel, some columns cause line breaks to the next row. The columns that are breaking to a new row are varchar fields where the user has entered text into the field with double quotes (").
When I export, I'm using row delimiter {CR}{LF} column delimiter Comma and text qualifier Double Quote (")
Is there a way to prevent this from happening when I export and open the flat file into Excel?
I tried using replace, but I was getting a syntax error in my query. Here is the query without using replace:
SELECT e.session_date, l.lab_no, i.first_name + ' ' + i.last_name AS Teacher,
tt.name, d.district_name, s.school_name, t.title, a.q1 AS Question1, a.q2 AS Question2,
a.q3 AS Question3, a.q4 AS Question4, a.q5 AS Question5, a.q6 AS Question6, a.q7 AS Question7,
a.q8 AS Question8, a.q9 AS Question9, a.q10 AS Question10
FROM evaluation e
LEFT OUTER JOIN training t ON t.id = e.training
LEFT OUTER JOIN lab l ON l.id = e.lab_no
LEFT OUTER JOIN instructor i ON i.id = e.instructor
LEFT OUTER JOIN trainee tt ON tt.id = e.trainee
LEFT OUTER JOIN district d ON d.id = e.district
LEFT OUTER JOIN school s ON s.id = e.school
LEFT OUTER JOIN answers a ON a.id = e.answers
WHERE session_date >= '20070401' AND session_date < '20070501'
I would need to use REPLACE on columns a.q7, a.q8, a.q9, and a.q10.
I tried using another delimiter, pipes (|), and that didn't work. Maybe I was attempting it incorrectly?
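For illustration only, a hedged C# sketch of the underlying fix: any double quote embedded in a qualified value has to be doubled (the convention Excel understands), whether that is done in the query itself (for example wrapping a.q7 through a.q10 in REPLACE to double the quote character) or in whatever step writes the rows. The class name, sample values, and file path below are made up and not from the original export:

using System;
using System.IO;

// Hedged sketch: wrap every value with a helper that doubles any embedded double quote,
// so a value like: He said "ok"  becomes  "He said ""ok"""  and no longer breaks the row in Excel.
class CsvQuoteDemo
{
    static string Qualify(string value)
    {
        return "\"" + (value ?? string.Empty).Replace("\"", "\"\"") + "\"";
    }

    static void Main()
    {
        string[] row = { "Smith", "Teacher said \"great job\"", "Question7 answer" };  // made-up values
        string line = string.Join(",", Array.ConvertAll(row, Qualify));
        File.AppendAllText(@"C:\export\eval.txt", line + "\r\n");                      // hypothetical path
        Console.WriteLine(line);
    }
}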
Thanks in advance for any help.
View 3 Replies
View Related
Nov 13, 2007
I am unable to use the text qualifier in the SSIS package Flat File Connection Manager Editor; it says, "The flat file parser does not support embedding text qualifiers in data". Why is that?
It was supported nicely in DTS 2000. Also, I have no control over the source TXT file, so I cannot eliminate the text qualifier (") from the file.
Any advice?
View 1 Replies
View Related
Nov 8, 2007
We have a flat file import process which imports data from a series of Unicode flat files.
The files have text qualifiers and are being imported to a table with the following format:
CREATE TABLE [dsa].[OBS](
[Kundenummer] [nvarchar](10) NULL,
[Navn] [nvarchar](60) NULL,
[Adresse] [nvarchar](50) NULL,
[PostnrBynavn] [nvarchar](50) NULL,
[Kursusdato] [datetime] NULL,
[Varighed] [decimal](18, 2) NULL,
[Kursustype] [nvarchar](100) NULL,
[Risikokoder] [nvarchar](50) NULL
) ON [PRIMARY]
In one of our files we have two rows that look like this:
"19298529";"THIS IS ROW 1";"ADDRESS 9 -13";"4200 SLAGELSE";"02-05-2006";8.00;"Kombikursus Førstehjælp - Brand 8 lek.";"37"
"19448242";"THIS IS ROW 2";"ADDRESS 50";"4140 BORUP";"04-05-2006";4.00;""Fra vil selv - til kan selv". Om børn 1½ - 3 år";"22"
Both rows are OK according to the format, but the second row actually contains the text qualifier in one of the qualified fields (""Fra vil selv - til kan selv". Om børn 1½ - 3 år"). It's the title of a course with a comment.
The process fails on this file and won't even redirect the row, as it does for other erroneous rows in other files we import.
We believe this is valid text, but apparently SSIS doesn't agree.
Is this a bug, or is this record not allowed?
Is there a workaround, and why won't SSIS redirect the row?
We believe the reason is that the preceding field is not text qualified (while the text qualifier is of course specified in the connection manager).
Thanks in advance,
Lasse
View 4 Replies
View Related
Feb 9, 2012
I have a simple SSIS package -> It reads a local text file which has 10 rows of data ( id, name, telephone # ) and puts it into a table.
It uses the "SSIS Flat File source" to read and a "SQL Command" to insert into the table. I can see that it reads line by line and puts each line into one row in my table.
Now, my production data is over 5 GB of mainframe data, and it seems their data is arranged in some hierarchical form, so the position or arrangement of data in that file is important.
I pulled the data using my package and as far as I can see , my SSIS package pulled one line at a time ( from the flat file) and pushed it into my table. For each row, I also created an identity column in my table to be able to identify the positional arrangement of the hierarchical data and then use relational mappings to suit our business needs.
In all of this, my assumption is -
"SSIS reads one line at a time, inserts to my table and goes down to the next line .
It does NOT read a snapshot of rows from the flat file so as to write them into the table using internal ordering methods based on that particular snapshot "
My question is: is my assumption correct?
View 1 Replies
View Related
Jun 20, 2006
I have a flat file that is row delimited by x00 x0D x0A. Any ideas on how to specify the row delimiter in the Columns section of the Flat File Connection Manager?
View 2 Replies
View Related
Aug 14, 2007
My task is to write an SSIS package that picks up just one file from a directory and loads it into a database table. The filename is defined as being "ABC*.txt". So I must pick up only one file that matches that wildcard.
I can see two ways of doing this, but I can't get either to work:-
1. Use a Flat File Source connection and put the wildcard in the ConnectionString.
i.e. ConnectionString = "C:\mydir\ABC*.txt"
But SSIS doesn't seem to support that.
2. Use a Foreach Loop Container with a Foreach File Enumerator, and configure the enumerator as:-
Folder = c:\mydir
Files = ABC*.txt
This works well, but loops round for as many files as match the wildcard. Is there any way of forcing it to drop out after the first time round the loop?
Or am I missing a much easier solution?
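For what it's worth, a rough Script Task sketch of one possible approach; note that the SSIS 2005-era Script Task is VB.NET, so the C# form below is for illustration only, and the folder path and the variable name User::FirstFile are assumptions:

// Inside the Script Task's Main() method: keep only the first file matching the wildcard.
public void Main()
{
    string[] matches = System.IO.Directory.GetFiles(@"C:\mydir", "ABC*.txt");

    if (matches.Length > 0)
    {
        // hypothetical package variable that the flat file connection would read
        Dts.Variables["User::FirstFile"].Value = matches[0];
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    else
    {
        Dts.Events.FireError(0, "Find first file", "No file matched ABC*.txt", "", 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}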
Thanks.
View 10 Replies
View Related
May 21, 2008
Hello,
I'm new to SQL Server, but I've already got a lot of good advice from you.
I'm not able to run an SSIS package from SQL Server Agent.
The package is made as below:
Data source: flat file;
OLE DB destination: SQL Server dbo.table.
I have several kinds of data flows scheduled to run daily, and all of them run correctly.
The only one I'm actually not able to run as a job is this one (from Visual Studio it runs fine).
I'm getting back the following error:
Description: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Flat File Source 1 1" (32)
I've had a look on the web and seen that this error is raised when SSIS is not installed on the client machine, but this is not my case: SSIS is fully installed and running.
I've also tried to run the package directly from the server machine... it doesn't run even like that.
I've discovered that some SSIS features are not available with SQL Server Standard Edition (this is my license). Do you happen to know whether this is the case or not?
P.S.
I'm part of the sysadmin group... just in case.
View 2 Replies
View Related
Jan 29, 2008
Hi,
I am trying to implement an SSIS package where the data source is a flat file (.csv) and the destination is a SQL Server database.
The problem is that my data source is a flat file consisting of thousands of manually entered rows, so there is always a chance that some rows are missing a column value, which results in an error.
Example: My flat file has headers like Sln, Name, Age, Designation. While entering data they may miss Age and type the row as 1,aaa,Consultant,,
Using the SSIS package, I want to track the row numbers in the flat file where data is entered wrongly, so that I can correct only those rows instead of checking all rows each time my SSIS package throws an error. I want to get all the wrongly entered row numbers into a SQL Server database.
Any suggestions are sincerely appreciated.
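One commonly suggested pattern, sketched below with made-up names (a RowNumber output column; C# shown for illustration even though the 2005 Script Component is VB.NET): stamp each row with its position in the file inside the data flow, then redirect failing rows, together with that number, to an error table.

// Hedged Script Component (transformation) sketch; assumes an output column named RowNumber
// was added in the component editor.
public class ScriptMain : UserComponent
{
    private int rowNumber = 0;

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        rowNumber = rowNumber + 1;   // first data row gets 1
        Row.RowNumber = rowNumber;
    }
}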
Thanks in advance
Regards,
gcs.
View 11 Replies
View Related
May 13, 2008
Hello Experts,
I am creating one task (user control) in SSIS. I have a property grid in my GUI and two buttons (OK & Cancel).
The PropertyGrid has properties like SourceConnection, OutputConnection, etc. Right now I am able to populate connections in a list box next to the Source and Output properties.
Now my question is: depending on the source connection, the task should read the text file associated with that connection manager. After validation it should pick the header (the first line of the text file, based on record type) and write it into a new file when the task is executed. I have the following code for your reference. Please let me know whether I am going in the right direction or not.
What should go here ?
->Under Class A
public override DTSExecResult Execute(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log, object transaction)
{
    //Some code to read file and write it into new file
    return DTSExecResult.Success;
}

public const string Property_Task = "CustomErrorControl";
public const string Property_SourceConnection = "SourceConnection";

public void LoadFromXML(XmlElement node, IDTSInfoEvents infoEvents)
{
    if (node.Name != Property_Task)
    {
        throw new Exception(String.Format("Invalid task element '{0}' in LoadFromXML.", node.Name));
    }
    else
    {
        try
        {
            _sourceConnectionId = node.Attributes.GetNamedItem(Property_SourceConnection).Value;
        }
        catch (Exception ex)
        {
            infoEvents.FireError(0, "LoadFromXML", ex.Message, "", 0);
        }
    }
}

public void SaveToXML(XmlDocument doc, IDTSInfoEvents infoEvents)
{
    try
    {
        // Create task element
        XmlElement taskElement = doc.CreateElement("", Property_Task, "");
        doc.AppendChild(taskElement);
        // Save source FileConnection
        XmlAttribute sourcefileAttribute = doc.CreateAttribute(Property_SourceConnection);
        sourcefileAttribute.Value = _sourceConnectionId;
        taskElement.Attributes.Append(sourcefileAttribute);
    }
    catch (Exception ex)
    {
        infoEvents.FireError(0, "SaveXML", ex.Message, "", 0);
    }
}
In UI Class there is OK Click event.
private void btnOK_Click(object sender, EventArgs e)
{
    try
    {
        _taskHost.Properties[CustomErrorControl.Property_SourceConnection].SetValue(_taskHost, propertyGrid1.Text);
        btnOK.DialogResult = DialogResult.OK;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
    }
}
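Regarding the "What should go here?" placeholder in Execute, here is a hedged sketch of one possibility. It assumes the source connection is a file-type connection manager whose AcquireConnection returns the file path as a string, and the output path built with ChangeExtension is purely hypothetical:

// Rough idea only; the record-type check and most error handling are omitted.
public override DTSExecResult Execute(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log, object transaction)
{
    string sourcePath = connections[_sourceConnectionId].AcquireConnection(transaction) as string;
    if (string.IsNullOrEmpty(sourcePath) || !System.IO.File.Exists(sourcePath))
    {
        componentEvents.FireError(0, "CustomErrorControl", "Source file not found.", "", 0);
        return DTSExecResult.Failure;
    }

    string header;
    using (System.IO.StreamReader reader = new System.IO.StreamReader(sourcePath))
    {
        header = reader.ReadLine();   // first line of the source file
    }

    System.IO.File.WriteAllText(System.IO.Path.ChangeExtension(sourcePath, ".header.txt"), header ?? string.Empty);
    return DTSExecResult.Success;
}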
View 10 Replies
View Related
Feb 29, 2008
Each day I receive a file with a different name. For example, the name is filename_mmddyyyy.txt where filename_ stays constant and mmddyyyy is the date of the file. The file is always in the same format.
I want to build an SSIS where I pass it this file name. I can write a script to generate the correct file name. How do I build the SSIS so it can accept the input parameter and find the correct file to process?
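A minimal sketch of one way to do it (the variable name User::FileName and the folder are hypothetical): compute the expected name inside the package with a Script Task and let an expression on the flat file connection's ConnectionString use that variable; alternatively the name can be passed in from your external script via dtexec's /SET option.

// Hedged Script Task sketch (C# shown for illustration) inside the generated ScriptMain:
public void Main()
{
    string folder = @"C:\incoming";                                          // hypothetical folder
    string name = "filename_" + DateTime.Today.ToString("MMddyyyy") + ".txt";
    Dts.Variables["User::FileName"].Value = System.IO.Path.Combine(folder, name);
    Dts.TaskResult = (int)ScriptResults.Success;
}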
Thanks
View 3 Replies
View Related
May 10, 2006
Hi,
I've just started looking at SSIS and have encountered what should hopefully be a simple problem to solve. I have a pipe-separated source file that looks like this (I've added Line numbers for simplicity):
Ln 01: HDR|FEED_CODE|31-MAR-2006
Ln 02: Tom|100|Jones|ZZ1 1ZZ|USA
Ln 03: Tom|200|Singer|
Ln 04: Tom|305||Red|Porche ||Lanzarote |Apple|Carrot| | |
Ln 05: Dick|100|Van Dyke|ZZ1 1ZZ|USA
Ln 06: Dick|200|Actor|
Ln 07: Dick|305||Blue|Ford||California |Tomato | |||Beef
Ln 08: Harry|100|Houdini|ZZ1 1ZZ|GBR
Ln 09: Harry|200|Escapologist|
Ln 10: Harryk|305| |Green ||Triumph |Poland|Banana|Sprout| | |
Ln 11: TRL|9
In addition to header and footer records, this file contains three record types for each person.
Record types are identified by the second column.
Each record type has a different number of columns:
Type 100 has 5 columns
Type 200 has 4 columns
Type 305 has 12 columns
The Row delimiter for all records is the {CR}{LF} character
I've set up a flat file input source and specified {CR}{LF} as the row delimiter for both header and data rows and the "|" character as the field delimiter.
It appears that SSIS is assuming that because the first data row has 5 columns, everything else must fit that format too. So the {CR}{LF} that separates lines 02 and 03 is interpreted as text rather than as a separator, and all remaining | field separators after the 305 record are interpreted as text contained in the fifth column. SSIS is also complaining that the last row is incomplete.
A bit like this (I've used tildes to indicate column separation):
Tom~100~Jones~ZZ1 1ZZ~USA
Tom~200~Singer~{CR}{LF}Tom~305||Red|Porche ||Lanzarote |Apple|Carrot| | |
I've seen one other reference to this behaviour, but the response seemed to be that SSIS doesn't know which columns are missing. In this scenario we don't have missing columns; rather, we have different types of record in a single file. In DTS I would effectively parse the file once for each record type, thus:
if cStr(DTSSource("Col002")) = "100" then
DTSDestination("in_Name") = trim(DTSSource("Col001"))
...
Main = DTSTransformStat_OK
else
Main = DTSTransformStat_SkipInsert
end if
...not the most efficient solution I know but the load only runs once a month so this was an acceptable workaround.
DTS was never this fussy, but I'm sure this is user error rather than an SSIS limitation. Can someone please put me straight?
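One workaround often suggested, sketched below with made-up output and column names (and in C# for illustration; the 2005 Script Component is VB.NET): define the flat file with a single column holding the whole line, then split by record type in a Script Component with one output per record type.

// Hedged sketch; the input column Line and the outputs Type100/Type200/Type305 would be
// configured in the component editor, giving buffers named Type100Buffer and so on.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string[] parts = Row.Line.Split('|');
    string recordType = parts.Length > 1 ? parts[1].Trim() : "";

    if (recordType == "100")
    {
        Type100Buffer.AddRow();
        Type100Buffer.Name = parts[0].Trim();        // hypothetical column names
        Type100Buffer.Surname = parts[2].Trim();
    }
    else if (recordType == "200")
    {
        Type200Buffer.AddRow();
        Type200Buffer.Name = parts[0].Trim();
        Type200Buffer.Occupation = parts[2].Trim();
    }
    else if (recordType == "305")
    {
        Type305Buffer.AddRow();
        Type305Buffer.Name = parts[0].Trim();
        // remaining columns of the 305 record would be mapped here
    }
    // header (HDR) and trailer (TRL) rows fall through and are dropped
}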
Many thanks,
Greg
View 7 Replies
View Related
Mar 2, 2007
I have a CSV Flat File Source with a decimal column, but the DataPrecision property is grayed out. Why?
View 1 Replies
View Related
Apr 19, 2007
Hi all,
I am passing the flat file source as a variable to the dtexec utility (like package.variables[User::varFileName].Value;"D:\sourcedata.txt").
The destination table has one more column.
I want to add a custom value to that column at run time via a parameter to dtexec (User::varDate).
I don't know how to do it; please help me.
Madhukar
View 4 Replies
View Related
Sep 3, 2007
Hi,
I am migrating one of my DTS package to SSIS.
My task is to read the filename from a database table and transfer the flat file data into a table.
In SSIS, I am able to fetch the file name using a Data Reader Source, but how do I pass this fileName parameter to the Flat File Source?
In DTS I used an ActiveX script to pass the filename variable as flatfilecon.Source.
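For illustration, assuming the fetched file name has been stored in a package variable (User::FileName and the connection name below are hypothetical), a Script Task can push it into the flat file connection, or an expression on the connection manager's ConnectionString property can reference the variable directly:

// Hedged Script Task sketch (C# for illustration): repoint the flat file connection
// before the data flow runs.
public void Main()
{
    string fileName = (string)Dts.Variables["User::FileName"].Value;
    Dts.Connections["MyFlatFileConnection"].ConnectionString = fileName;
    Dts.TaskResult = (int)ScriptResults.Success;
}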
Any help ?
Thanks,
Ravi
View 4 Replies
View Related
Jan 12, 2007
I'm having a problem using the Flat File Source while using the underlying .NET classes to execute SSIS packages. The issue is that, for some reason, when I load a flat file it empties out columns randomly. It's happening in the Flat File Source task. By random I mean that most of the time all the data gets loaded, but sometimes it doesn't and it empties out column data. Interestingly enough, this is random, and even the emptying out of columns isn't complete; it's more like 90% emptied. Now you'll ask whether the file is different every time, and the answer is no, it's the same file every time. If I run the same file 10 times, it might empty out various columns in maybe one of those runs. This doesn't seem to be a problem while working with dtexec or the Package Executor utility. Need help!!
View 9 Replies
View Related
Mar 24, 2007
Hello there,
I have created a package which will copy rows from a CSV file to a SQL database. I have a field in the CSV file which contains numeric data, and I am keeping this in the database as numeric too. For example, a column in the CSV named "amount" needs to be transferred into the data table where the corresponding column name is "amount", its data type is numeric, and the field can contain null values. I am using the double quote (") text qualifier on the CSV file. Now my problem is, some rows in the CSV file contain null values for the amount column. For example, let's take a look at my CSV file content...
"Name", "Salary"
"Jhon Stuart", "35.66"
"Maria Gree", ""
Notice the second row of the CSV, where the Salary value has been left as an empty string. My intention is to import this data into the database and have the salary value for Maria remain null. But the package generates an error for this row. It says:
There was an error with input column "Salary" (61) on input "OleDB Destination Input (47)" . The column status returned was : The value could not be converted because of potential loss of data.
Can anybody help me with this? What would be the solution? If I modify the row in the CSV file as follows
"Maria Gree", "0.00"
then it works. But I don't want to fill the field with zero in the DB; I want it to be set to NULL, which makes sense.
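One common pattern, sketched below as a hedged Script Component example (the column names SalaryIn/SalaryOut are assumptions; a Derived Column expression that tests for an empty string achieves the same thing): map empty strings to NULL before the destination.

// Hedged sketch: Salary comes in as a string column and goes out as a numeric column,
// set to NULL whenever the CSV held "".
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    if (Row.SalaryIn_IsNull || Row.SalaryIn.Trim().Length == 0)
    {
        Row.SalaryOut_IsNull = true;                 // becomes NULL in the destination
    }
    else
    {
        Row.SalaryOut = decimal.Parse(Row.SalaryIn, System.Globalization.CultureInfo.InvariantCulture);
    }
}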
Any Idea?
Thanks in advance.
View 16 Replies
View Related
Apr 20, 2015
I am working to archive some old data from a data warehouse using SQL server and SSIS. The data will be read and denormalized, then shipped out to a delimited text file.
The rowcount of the incoming data is significant, call it 10M+ rows per unit of work (one text file).
There are development advantages of using a stored proc for the data source - mainly ease of changing the denormalization logic as required. Wondering if there are performance advantages of an embedded query for the data source instead?
It was mentioned by one developer that when using a stored procedure, the output stream from the proc and subsequent SSIS steps cannot start until the full procedure processing is complete; i.e. the proc churns out its result set in one big chunk.
He hinted that an embedded query does not have this same effect, but I am not sure that is accurate.
View 4 Replies
View Related
Jan 16, 2007
I have created an SSIS package, in my VS2005 solution, that Bulk Inserts a CSV file (see example below):
"100",2006-10-03 00:00:00,"HEX012",1
"101",2006-10-03 00:00:00,"DS00130",1
I have a Bulk Insert Task that uses a Flat File Connection Manager to import my CSV file into my SQL2005 database.
My source CSV file (see example above) has double quotation marks surrounding any text fields.
I have set the Flat File Connection Manager's 'Text Qualifier' to double quotation marks.
The Bulk Insert works OK, but ignores the text qualifier.
My database table is left with the original quotation marks in any text field.
Any help appreciated.
Regards,
Paul.
View 3 Replies
View Related
May 2, 2006
I am working on an SSIS project where I create two flat files for submission to a data contractor. This contractor requires a control record be the first line in the file. I create the control record based on the table information being exported.
What I would like to know is: is it possible to utilize the Header section of the Flat File Destination Editor to insert the control record? And, as it is dynamic, what kind of coding must I do in order to utilize this functionality?
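For illustration, one hedged approach (the variable names are made up, and whether the Header property is expressionable in your version should be verified): build the control record into a string variable with a Script Task, then drive the Flat File Destination's Header property from that variable with a property expression on the Data Flow task.

// Hedged Script Task sketch (C# for illustration): compose the control record from values
// already captured in package variables; User::RowCount and User::ControlRecord are hypothetical.
public void Main()
{
    int rowCount = (int)Dts.Variables["User::RowCount"].Value;
    Dts.Variables["User::ControlRecord"].Value = string.Format("CTL|{0}|{1:yyyyMMdd}", rowCount, DateTime.Today);
    Dts.TaskResult = (int)ScriptResults.Success;
}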
Thanks.
View 4 Replies
View Related
Jan 13, 2011
I've discovered an issue with the text qualifier field in the file connection manager when upgrading an SSIS 2005/2008 package from a 32-bit platform to a 64-bit platform running SQL Server 2008 R2 10.5.1600.
The package will convert <none> in this field to _x003C_none_x003E, and therefore any package using the file connection manager (i.e. import/export - common tasks in SSIS!) will cause problems with either output data or imported data.
Simply replacing _x003C_none_x003E with <none> fixes the issue, but of course there can be many packages affected as a result.
Is there any existing/impending cumulative update for SQL Server 2008 R2 Standard that will fix the problem? Double quote delimiters are converted to _x0022_, which I assume replacing with a double quote will fix.
View 9 Replies
View Related
Jan 2, 2007
Hi Guys,
I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.
The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc. present in the middle.
Previously I used SQL 2000 DTS to load the files in, and it was just a column transformation with Col001 from the text file loading straight to my table column. After the load, if I select len(col) it gives me 750 for all rows.
Once I started to migrate this to SSIS, I allocated the Control Flow task, specified the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the data flow task it copies only 181 or 231 characters out of the 750 required.
I feel it stops where it finds the SPACES and skips the rest.
I specified row delimiters of CR and LF. I checked the file under UltraEdit and there were no special characters in the file that would cause the problem.
Any suggestions on how I can get it to load the full data?
Thanks
View 26 Replies
View Related
Nov 10, 2006
Hi all,
I'm using SSIS and I am transferring data from a Flat File Source to an OLE DB destination. The source file contains some corrupt data which I am transferring to another flat file destination.
Debugging is successful, but I am not getting any error output in the flat file destination file.
I did exactly what is written in the MSDN tutorial for SSIS.
Please tell me why I am not getting the error output in the destination flat file.
thanx
View 1 Replies
View Related
Apr 16, 2014
I have a source file and I have to load it into the database, changing the data type of the columns in SSIS.
View 1 Replies
View Related
Jul 20, 2005
Hi all there!
I am quite new to MS SQL administration, so let me ask how this works on your instances of SQL Server.
We have several DTS packages on our server, all of them managed on some station which has serious hardware problems. So we would like to catch two problems at one time and decided to develop a systematic way of DTS manipulation.
One of several aspects of this operation would be migration from system ODBC data source definitions into file ODBC sources (.dsn files) in order to make them easier to manage (backup for example, and even reusability on other workstations). All .dsn files should be located on some network resource (\\server\directory...) which would be set as the default ODBC directory in the ODBC administrator on the management station.
When I begin to do so, it appears that the EM DTS Designer does not remember the path to the DSN files (for example, on the design panel I choose file DSN and with the browse button point at a certain .dsn file, and then after the DTS save the path disappears).
Do you use this facility (file .dsn) in the DTS EM Designer, or maybe MS has treated it as useless and nobody wants to use it?
Regards
K
View 3 Replies
View Related
Jul 14, 2006
I have text data files from a third party; they use commas as field delimiters and enclose the text for each column in double quotes. Not a problem for most of the data files, until they start sending files where there is a " within the column values. The SSIS package fails with the error:
The column delimiter for column "Column 1" was not found.
Any ideas on how to resolve this issue will be greatly appreciated.
Thanks,
pcp
View 15 Replies
View Related
Mar 12, 2015
In SQL 2005, if I tried to insert some data with a text qualifier inside a text qualified field, it would work. For example:
"Name","ID ","Location","","Comany",""House Name" Road",
In SQL 2012, this fails with the error message that it cannot find the text qualifier for the field.
To get around this, we are having to import the data into a Dirty Data column of a TEMP table (ID, Dirty Data, Clean Data), perform multiple updates and change the text qualifier, ensuring they are only changed in the right places so we can keep the ". In this example, we changed the text qualifier to pipes.
After these updates, we then export the data from Clean Data back out to CSV, then reimport it into the original destination table with a new text qualifier.
View 5 Replies
View Related
Feb 13, 2007
Hi,
I am trying to create a program that transfers tables to flat files.
At this point in time, I have succeeded in creating one that creates delimited files.
However, I am now trying to create fixed-width files as you can do with the SSIS designer, but programmatically.
Is there a way to programatically determine the width of a column from the source table? I can not seem to find any kind of function or member that stores this information or allows me to retrieve it.
I know what I need to change in order to set a width for a column, but I just don't know how to find the width without just asking the user to provide one.
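A hedged sketch of one way to get the widths (assuming the source is reachable over ADO.NET; the connection string and table name are placeholders): ask the provider for the schema of the source table and read the ColumnSize value for each column.

using System;
using System.Data;
using System.Data.SqlClient;

// Hedged sketch: lists column names and declared widths so they can be fed into
// programmatically built fixed-width flat file columns.
class ColumnWidths
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
            {
                DataTable schema = reader.GetSchemaTable();
                foreach (DataRow col in schema.Rows)
                {
                    // ColumnSize is the declared width for character columns
                    Console.WriteLine("{0}: {1}", col["ColumnName"], col["ColumnSize"]);
                }
            }
        }
    }
}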
View 5 Replies
View Related
Mar 28, 2006
Hi,
We have a requirement to run the package on a daily basis.
The package should run at a specific time on that day.
We are using the Windows scheduler for that.
We will have one new flat file every day.
Is there any way to attach this file to the flat file source dynamically?
The requirement is:
The flat file source should be able to read the new flat file every day.
We have no option to change it manually; the flat file source has to pick up the file automatically at that time.
So it can take that flat file and load it into the database table.
View 1 Replies
View Related
Feb 14, 2007
I am wondering how easy it is to check for file locks and have our SSIS package wait until the file has been released by the process which is using it.
Also, same question when we're writing to a Flat File (or Flat File Destination).
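A hedged sketch of the usual polling approach (the variable name, timings, and retry count are placeholders): try to open the file exclusively inside a Script Task and sleep/retry until the open succeeds.

// Hedged Script Task sketch (C# for illustration): while another process still holds the
// file, the exclusive open throws IOException, so we wait and try again.
public void Main()
{
    string path = (string)Dts.Variables["User::FileName"].Value;   // hypothetical variable
    bool available = false;

    for (int attempt = 0; attempt < 60 && !available; attempt++)   // up to roughly 5 minutes
    {
        try
        {
            using (System.IO.FileStream fs = System.IO.File.Open(path, System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.None))
            {
                available = true;                                  // nobody else has it open
            }
        }
        catch (System.IO.IOException)
        {
            System.Threading.Thread.Sleep(5000);                   // wait 5 seconds and retry
        }
    }

    Dts.TaskResult = available ? (int)ScriptResults.Success : (int)ScriptResults.Failure;
}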
Thanks,
View 3 Replies
View Related
Aug 15, 2007
Hello Everyone,
I have a flat file as my source. Before, when I tried to load the data into an Oracle destination through the SCD component, the error was with OLE DB.
Anyway, I tried to load the data into an Access DB, but I'm getting a different error in the same component (OLE DB) after the SCD component. Can anyone help me out with this?
thank you
View 1 Replies
View Related
Mar 6, 2006
Hi all
I have some problems with the "Flat File Source" ...
I am trying to load a text file, but IS always cuts the rows ...
When I look at the preview while designing, the row is complete,
so I am wondering what IS is doing ...
Thanks for any comments
Best regards
Frank Uray
Here is what I am trying to load (one row from the file):
WPBX1 1.2 19330065002695435000 001200526000 000020002002-11-13-11.17.55.2220262006-03-03-05.50.44.322629002000010001AG2006-03-03-05.50.44.322629WIS030EPF033200602173410567000101 271275 2006030220060303200603032006030320060303 200603032006030320060303 200603032006030320060303200603031 0.000 200.000A UWCE 1 24617 10844890000000000 0.000000 0.000 0.000 0 149.500 149.500 149.500 00100010 0.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+00 1.000000000000000E+00 1.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+0000 0000000001CV ÊÊ 00 1.150 200.000 0001-01-01-00.00.00.00000000120052600071200180K 712 71550 230.000 230.000 230.000 0.000 230.0000010 C 100.000000 2006-03-03-05.50.44.3226291567 230.000 230.000 230.000 0.000 230.0000010?C 100.000000 2006-03-03-05.50.44.3226291568 230.000 230.000 230.000 0.000 230.0000010?C 100.000000 2006-03-03-05.50.44.3226291585 230.000 230.000 230.000 0.000 230.0000010?C 100.000000 2006-03-03-05.50.44.3226291590 -80.500 -80.500 -80.500 0.000 -80.5000010?C 35.000000 2006-03-03-05.50.44.3226291640 -80.500 -80.500 -80.500 0.000 -80.5000010 C 35.000000 2006-03-03-05.50.44.3226291830 149.499 149.499 149.499 0.000 149.4990010?C 65.000000 2006-03-03-05.50.44.322629
On SQL Server I get only this:
WPBX1 1.2 19330065002695435000 001200526000 000020002002-11-13-11.17.55.2220262006-03-03-05.50.44.322629002000010001AG2006-03-03-05.50.44.322629WIS030EPF033200602173410567000101 271275 2006030220060303200603032006030320060303 200603032006030320060303 200603032006030320060303200603031 0.000 200.000A UWCE 1 24617 10844890000000000 0.000000 0.000 0.000 0 149.500 149.500 149.500 00100010 0.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+00 1.000000000000000E+00 1.000000000000000E+00 0.000000000000000E+00 0.000000000000000E+0000 0000000001CV
View 7 Replies
View Related