Complex File Parsing Issue

Nov 28, 2007

Hello,

I have a file that looks like this:

Summary
A ABCD
A Category MarketValue Margin
A category1 1.0000000 1.000000
A category2 2.0000000 2.000000

H Totals Total Cash Net
H 2.00000 200000 2000000

Another Summary
B BCDE
B Activity MarketValue Margin
B activity1 3.00000 3.000000
B activity2 4.00000 4.000000

The items in blue are headers, and I don't want to capture those. However, I want to capture all the data in black and put it into 3 separate tables (or maybe the same table, under the appropriate column names).

This situation differs from anything I've done before in that you can't identify what row contains what data by what's in the row itself. That is, what's in the data rows is random and subject to change. So you can't search the row itself to determine which table it goes to.

However, if there's a way to capture all the rows after a certain header before the header changes again, that might work.

That is, get all rows between A Category MarketValue Margin and H Totals Total Cash Net
and
get all rows between H Totals Total Cash Net and Another Summary
and
get all rows after B Activity MarketValue Margin

Any examples of how I might script this?
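One possible approach, sketched in T-SQL: bulk load the file into a staging table with an IDENTITY column to preserve the original line order, then tag every row with the most recent header above it. The table name dbo.RawImport and the columns LineNo/RawLine below are made up for illustration; the header text is taken from the sample above.

-- Sketch only: assumes the file was bulk loaded into dbo.RawImport
-- (LineNo int IDENTITY preserving file order, RawLine varchar(1000)).
;WITH Flagged AS
(
    SELECT  LineNo, RawLine,
            CASE WHEN RawLine LIKE 'A Category MarketValue Margin%' THEN 'A'
                 WHEN RawLine LIKE 'H Totals Total Cash Net%'       THEN 'H'
                 WHEN RawLine LIKE 'B Activity MarketValue Margin%' THEN 'B'
            END AS HeaderType
    FROM    dbo.RawImport
),
Sectioned AS
(
    SELECT  f.LineNo, f.RawLine, f.HeaderType,
            (SELECT TOP 1 f2.HeaderType          -- most recent header at or above this row
             FROM   Flagged f2
             WHERE  f2.LineNo <= f.LineNo AND f2.HeaderType IS NOT NULL
             ORDER BY f2.LineNo DESC) AS Section
    FROM    Flagged f
)
SELECT  LineNo, RawLine, Section
FROM    Sectioned
WHERE   Section IS NOT NULL                 -- rows before the first header are summary text
  AND   HeaderType IS NULL                  -- drop the header rows themselves
  AND   RawLine NOT LIKE '%Summary%'        -- drop the other header lines
  AND   LTRIM(RTRIM(RawLine)) <> '';

The Section value ('A', 'H' or 'B') can then decide which table each row is inserted into, or become an extra column if everything goes into one table. Stray header lines such as "B BCDE" would still need a filter of their own.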

Thanks


View 2 Replies



Complex XML Parsing/Shredding

Jun 20, 2006

Hi,

I have generated a sample XML document from the XSD file using XMLSpy and used these in XML Source stage.

While parsing/shredding the XML file to write the output to sequential files, I got the following error message.

The XML Source Adapter does not support mixed content model on Complex Types.

Additional Information:
Pipeline component has returned HRESULT error code 0xC02092A1 from a method call. [Microsoft.SqlServer.DTSPipelineWrap]

Can anybody give sample code for handling a complex XML type?


Find attached the XSD format. This is FYI.
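Not the XML Source component itself, but as a point of comparison, here is a rough T-SQL sketch that shreds a nested (complex-type) document with nodes()/value() on SQL Server 2005. The element names and structure below are invented, since the real layout is only in the attached XSD.

-- Sketch only: invented element names; replace with the real ones from the XSD.
DECLARE @x xml;
SET @x = N'<Apps>
  <App id="1">
    <Part>W0133-1621038</Part>
    <Product><PartNumber>P-1</PartNumber><Price>17.38</Price></Product>
    <Product><PartNumber>P-2</PartNumber><Price>68.78</Price></Product>
  </App>
</Apps>';

SELECT  a.n.value('@id', 'int')                     AS AppId,
        a.n.value('(Part)[1]', 'varchar(50)')       AS Part,
        p.n.value('(PartNumber)[1]', 'varchar(50)') AS PartNumber,
        p.n.value('(Price)[1]', 'decimal(10,2)')    AS Price
FROM    @x.nodes('/Apps/App')    AS a(n)
CROSS APPLY a.n.nodes('Product') AS p(n);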

View 4 Replies View Related

Reporting Services :: Parsing SSRS Config File And Dynamically Changing File Path Of Config File In Code

Sep 2, 2015

I currently have a single hard-coded file path to the SSRS config file; the query parses the file and provides the Reporting Services web service URL. My question is, how would I run this same query against hundreds of servers that may or may not share the same file path as the one hard-coded?

Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.

I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically, I need to inventory all of my Reporting Services URLs.
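One hedged option is the undocumented xp_regread procedure; the registry paths below are assumptions that should be verified on a test server first, and the call would have to be run per server (for example through registered servers / a central management server).

-- Sketch only: xp_regread is undocumented and the key names are assumptions.
DECLARE @rsInstanceId nvarchar(128);
DECLARE @setupKey nvarchar(512);
DECLARE @rsPath nvarchar(512);

-- map the RS instance name to its instance ID ('MSSQLSERVER' = default instance)
EXEC master.dbo.xp_regread
     N'HKEY_LOCAL_MACHINE',
     N'SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS',
     N'MSSQLSERVER',
     @rsInstanceId OUTPUT;

SET @setupKey = N'SOFTWARE\Microsoft\Microsoft SQL Server\' + @rsInstanceId + N'\Setup';

-- read the install path for that instance, then point at the config file
EXEC master.dbo.xp_regread
     N'HKEY_LOCAL_MACHINE',
     @setupKey,
     N'SQLPath',
     @rsPath OUTPUT;

SELECT @rsPath + N'\ReportServer\rsreportserver.config' AS ProbableConfigPath;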

View 2 Replies View Related

Flat File Connection Manager Not Parsing File Correctly

Apr 3, 2007

Hi,



I have a flat file, comma-delimited, with strings in double-quotes.

In the connection manager for the file, I have specified that the Text Qualifier = ""

However, in the preview tab, it still shows the strings as surrounded by the quotes, e.g. "mycol1" whereas it should show mycol1 without the quotes.

Next, when I examine the data in the database after the load, it's messed up there also:

"mycol1" ends up in the database as "mycol1
"mycol2" ends up as "mycol2

This is not right. I have format set to delimited, header row delimiter CRLF, etc.

Any ideas?

Thanks

View 3 Replies View Related

Parsing RTF File

Dec 3, 2007

Hi:

I need to parse a regularly outputted RTF file and was wondering if it is possible in SSIS. I am trying to use the Flat File connection manager to do this.

Now, I can't treat tab stops in an rtf like tab stops in a csv, since when you treat an rtf as a text file, you see the format code of the rtf. If I open the rtf in a text editor, the entire file is one line, with lines breaking with:

par}

Columns are tab delimited in the rtf, and they look like this when you treat the rtf as a text file.

plain abfs16f4cf0cb1

(or something like that, the word "tab" is the important part.)

So I use the "plain ab" part to delimit in SSIS, since that is consistent (planning to parse out all the garbage later on). The problem is, sometimes lines don't have a "city" and "state", so it "tabs" right over to the next field. So like this (looking in MS Word):

Phone <tab> City <tab> State <tab> Date <tab> Other fields.....
847-111-2222 <tab> Omaha <tab> NB <tab> 9/14/2007 <tab>
222-222-3333 <tab> 9/14/2007 <tab>
555-121-1212 <tab> Houston <tab> TX <tab> 9/14/2007 <tab>

Now, if you treat an RTF as a text file, it has only one "plain abfs16f4cf0cb1" after the phone number, so even for the missing line there is only one tab, not 3. This is because in the beginning of the row tabs for each row are defined like this:

tql x90 ql x840 ql....etc...

with "tql" and "tx" tags basically saying where all the tab stops are for that row. So for the row above with missing info, it lists fewer tab stops. So the "date" (and associated garbage) ends up under "City" for this row. All of the "Houston" row's data starts appearing in the sql server output table's 2nd last field, as you might expect.

Any suggestions on how to pull this in within SSIS during the transformation? I could deal with it after I pull it in, since I still have all the data, but I'm thinking the logic could get complicated: take the data out of the last two fields of the affected rows into some other table, use UPDATEs to shift the values two fields to the right, and then figure out a way to bring the data from the temp table back in. It all sounds a bit involved.
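If fixing it after the load is acceptable, the shift described above can be a single UPDATE in T-SQL. This is only a sketch, with a made-up staging table dbo.RtfStage whose columns are all varchar.

-- Sketch only: rows that lost City/State can be spotted because the date slid
-- left into the City column; move it back and blank the missing fields.
UPDATE dbo.RtfStage
SET    EventDate = City,      -- SET expressions see the pre-update values
       City      = NULL,
       State     = NULL
WHERE  ISDATE(City) = 1;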

Let me know if this makes sense--I've almost got it going, I just need to sort this last bit out.

Thanks,
Kayda

View 4 Replies View Related

Parsing A QFX File?

Jan 19, 2008

I am trying to read a QFX file from Quicken.
It looks like XML, but it's not, and I cannot figure out how to grab what I've got in order to parse the line.
I put this into a derived column, but it's not getting it:

SUBSTRING([Column 0],FINDSTRING("<STMTTRN>",[Column 0],1),FINDSTRING("</STMTTRN>",[Column 0],1))

because inside the data, it looks like that's what brackets a transaction; the data looks like this and varies by TRNTYPE, but the columns are tagged like so:


<STMTTRN>
<TRNTYPE>POS
<DTPOSTED>20070129160000
<TRNAMT>-0000000000026.50
<FITID>20070129011
<NAME>SUNOCO
<MEMO>01/24 ENGLWD CLIFF NJ 8015V200006
</STMTTRN>
<STMTTRN>
<TRNTYPE>POS
<DTPOSTED>20070129160000
<TRNAMT>-0000000000023.47
<FITID>20070129012
<NAME>KFC
<MEMO>01/26 NANUET NY 8015V215116
</STMTTRN>


I tried the XML transform and unpivot, but have not cracked it.
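A fallback sketch in T-SQL, assuming each line of the QFX file is first staged into a table such as dbo.QfxLines (LineNo int IDENTITY, RawLine varchar(500)); since every tag sits on its own line, plain string functions can peel the tag name and value apart.

-- Sketch only: table and column names are placeholders.
SELECT  LineNo,
        SUBSTRING(RawLine, 2, CHARINDEX('>', RawLine) - 2)   AS TagName,
        SUBSTRING(RawLine, CHARINDEX('>', RawLine) + 1, 500) AS TagValue
FROM    dbo.QfxLines
WHERE   RawLine LIKE '<%>%'
  AND   RawLine NOT LIKE '</%';    -- skip the closing </STMTTRN> lines

A running count of the '<STMTTRN>' lines (by LineNo) can then number the transactions, so the tag/value pairs can be pivoted back into one row per transaction.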
Thanks for any light you can shed.
Drew


View 1 Replies View Related

Parsing A Tab Delimited File

Dec 5, 2007

I have a tab-delimited file with 122 columns. Can anyone let me know if there is a better way of parsing/extracting a few of the columns (say about 15) from the file and loading them into a table using SSIS?
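In SSIS the Flat File source lets unneeded columns simply go unmapped, but if staging in T-SQL is acceptable, a rough sketch (file path, table and column names are placeholders) is to load everything and keep only what is needed:

-- Sketch only: dbo.Stage122 would need all 122 columns defined to match the file.
BULK INSERT dbo.Stage122
FROM 'C:\data\inventory.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);

INSERT INTO dbo.Target (Col1, Col5, Col17)   -- the ~15 columns actually wanted
SELECT Col1, Col5, Col17
FROM   dbo.Stage122;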

View 1 Replies View Related

Parsing Text File And Inserting Into DB

Mar 19, 2008

Hello all,
I have a question regarding importing text file data into SQL Server.  I'm hoping someone can point me in the right direction, as my searches haven't turned up anything specific enough.
I'm trying to parse a large (24MB) text file.  It's a fixed-width file, with multiple columns.  I need to parse this file, check if a record already exists, and then import the data into the database.  But I don't need to insert every column.  There's only a few columns from the file I need to insert.  This parsing also needs to occur at regular intervals (daily).
I looked at BULK INSERT, but I can't find an example that uses only some of the columns.  Every example uses all columns, and the file is delimited, not fixed-width.
Is there anything within SQL Server that can accomplish this?  I haven't turned up anything that will solve my problem.  The only other solution I can think of is an application that parses the file for me and inserts the data into the database.  But can I schedule that application to run every night at midnight (for example) through SQL Server?
I'm not too familiar with SQL Server, so I appreciate any help offered.
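One hedged route that stays inside SQL Server: stage each fixed-width line whole, carve out just the needed columns with SUBSTRING, skip records that already exist, and schedule the whole thing as a nightly SQL Server Agent job. Paths, positions, lengths and names below are placeholders.

-- Sketch only: dbo.StageLines has a single wide varchar column (RawLine),
-- so each line lands whole as long as the file contains no tab characters.
BULK INSERT dbo.StageLines
FROM 'C:\data\daily.txt'
WITH (ROWTERMINATOR = '\n');

INSERT INTO dbo.Customer (CustomerId, CustomerName)
SELECT  SUBSTRING(s.RawLine, 1, 10),
        RTRIM(SUBSTRING(s.RawLine, 11, 40))
FROM    dbo.StageLines s
WHERE   NOT EXISTS (SELECT 1
                    FROM   dbo.Customer c
                    WHERE  c.CustomerId = SUBSTRING(s.RawLine, 1, 10));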
Thanks, Jay

View 7 Replies View Related

Reading File As One String, Then Parsing - How To Do This?

May 7, 2007

Hi,



The suggestion to do this is buried deep in one of my posts, however I still do not have a clear idea of how to do this.

I have a flat file which has several "bad rows" in it. Because file error redirection is buggy, I need a manual approach to get rid of these incomplete rows in my data file.

Phil, you suggested I read the file as one long string, then parse out the bad rows (using a script?)... however I have no idea as to how to actually do this.

I was wondering if it's possible to clarify the steps involved in doing this, or perhaps point me to an example I can look at, as I cannot seem to get around this problem on my own.
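A rough T-SQL sketch of the read-as-one-string idea, in case it helps make the steps concrete; it assumes SQL Server 2005 or later (varchar(max)), and the file path, delimiter and expected column count are placeholders.

-- Sketch only: pull the whole file in as one string, walk it line by line,
-- and keep only rows with the expected number of delimiters.
DECLARE @doc varchar(max), @pos int, @next int, @line varchar(8000);

SELECT @doc = BulkColumn
FROM   OPENROWSET(BULK 'C:\data\input.txt', SINGLE_CLOB) AS f;

SET @pos = 1;
WHILE @pos <= LEN(@doc)
BEGIN
    SET @next = CHARINDEX(CHAR(13) + CHAR(10), @doc, @pos);
    IF @next = 0 SET @next = LEN(@doc) + 1;
    SET @line = SUBSTRING(@doc, @pos, @next - @pos);

    -- a "good" row here means exactly 5 commas (6 columns); adjust as needed
    IF LEN(@line) - LEN(REPLACE(@line, ',', '')) = 5
        INSERT INTO dbo.CleanRows (RawLine) VALUES (@line);

    SET @pos = @next + 2;   -- step past the CR/LF pair
END;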



Thanks much!!

View 24 Replies View Related

Another Flat File Parsing Problem

Dec 5, 2006

Hello All!

I know this has come up before and I have tried several of the solutions found within the forum but I just can't seem to import my file correctly and could use some input, please.

Sample file (less fields than actual file):

Name (str), Phone# (str), Description(str), Resolved(bool), Met(bool)

"Kay, Mary","123-4567","Used a "."not a"," in text", "1", "1"

The text is qualified with " and the columns are delimited with commas, but the description field has embedded quotes and commas. Normally it works, except when there are embedded quotes and commas.

I have tried unqualified data and undouble, but that does not work either because of the embedded commas in quotes.

Do I need to do something before the data flow? Do I need to do custom code similar to undouble (I tried modifying undouble but using unqualified fields caused the source file to not like the data and go red)? Should the row be read as one field and parsed?

Thanks in advance for any help you can give!

View 12 Replies View Related

Need Suggestions On Text File Parsing Into Database

Feb 28, 2007

I have a website, where people upload tab delimited text files of their product inventories, which the site parses and inserts into a database table.  Here's the catch: Instead of insisting that each user use a standardized format, each user can upload the file in whatever column order they want, they just have to let the site know through a GUI which column is in which order.   And, they may upload columns that if not mapped, will be ignored.  Right now, I am doing all of this in code and it runs slow, I was thinking of offloading this to either a stored procedure, ssis, or bulk upload.   But, with the varying format of the uploaded text file, I am not sure how I could do that.  Any suggestions? Thanks! 
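One way to offload it, sketched here with made-up table and column names: stage the upload generically as Col1..ColN, store the user's GUI mapping in a table, and build a single INSERT...SELECT with dynamic SQL (this assumes SQL Server 2005 or later for nvarchar(max)).

-- Sketch only: dbo.UserColumnMap holds, per user, which staging column feeds
-- which inventory column; unmapped staging columns are simply never selected.
DECLARE @UserId int, @sql nvarchar(max);
SET @UserId = 42;   -- example

SELECT @sql = N'INSERT INTO dbo.Inventory (Sku, Qty, Price) '
            + N'SELECT ' + QUOTENAME(SkuColumn) + N', '
            + QUOTENAME(QtyColumn) + N', '
            + QUOTENAME(PriceColumn)
            + N' FROM dbo.UploadStage;'
FROM   dbo.UserColumnMap
WHERE  UserId = @UserId;

EXEC sp_executesql @sql;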

View 1 Replies View Related

SQL Server 2008 :: Parsing Unstructured CSV File?

Oct 1, 2015

I have a CSV file with roughly 6 million rows. The file is unstructured; that is, some rows have 5 fields, others have 15, and there are as many as 50 fields in one row.

I am using bulk insert to read the entire file into a table in database, with each row being a database record. With that, I have one column that contains a row of comma delimited fields. All fields are character string and I want to find a quick way of parsing each row and placing each comma-delimited value in a column. For example:

CREATE TABLE MyTable
(
CSVString varchar(1000),
C1 varchar(20),
C2 varchar(20),
...
C50 varchar(20)
)

Column CSVString contains a CSV row. I don't know how many fields (number of commas + 1) are in the row, but if the row contains 10 fields, I need to populate columns C1-C10. If the row has 15 fields, I populate columns C1-C15.

How can I do this in a very efficient way? I tried CTE but performance was not very good.
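A hedged sketch for SQL Server 2008 (which has no built-in string splitter): split each CSVString with a tally table so every value keeps its ordinal position, then pivot that result into C1..C50. It assumes a key column (Id) exists on the table; everything else comes from the DDL above.

-- Sketch only: set-based split that preserves the position of each value.
;WITH Tally AS
(   -- numbers 1..1000, enough to cover varchar(1000)
    SELECT TOP (1000) n = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM sys.all_objects a CROSS JOIN sys.all_objects b
),
Split AS
(
    SELECT  t.Id,
            Ordinal = ROW_NUMBER() OVER (PARTITION BY t.Id ORDER BY tl.n),
            Value   = SUBSTRING(t.CSVString, tl.n,
                                CHARINDEX(',', t.CSVString + ',', tl.n) - tl.n)
    FROM    dbo.MyTable t
    JOIN    Tally tl
      ON    tl.n <= LEN(t.CSVString)
     AND    SUBSTRING(',' + t.CSVString, tl.n, 1) = ','   -- n marks the start of a value
)
SELECT Id, Ordinal, Value
FROM   Split;

From Split, a MAX(CASE WHEN Ordinal = 1 THEN Value END) per Id (or a PIVOT) can populate C1 through C50 in one pass, which tends to scale better than a row-by-row loop.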

View 8 Replies View Related

Help, Fairly Complicated File Parsing Issue

Jun 7, 2007

Hi,



I have a situation where I'm having to extract key data from a financial file. The problem is, the columns are not nice and tidy.

Basically the file looks like this:

row 1: "788","Company","OPENING BALANCE:", 2084587.76
row 2: "313947","04/01/07","3","CS","FF", 170.00,"AZT","XYC INC", 20.8, 351.00
row 3: "788","06/06/07 CLOSING BALANCE:", 206203893.03

So, I'm going to need to get the OPENING BALANCE and CLOSING BALANCE figures, as well as all the data in between, i.e., rows 2 through n.

Does anyone have an example of a script that can be used for extracting very specific values from a file?



I have a script that checks for incomplete rows, but it is not sophisticated enough for this situation.
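A rough T-SQL sketch of the extraction, assuming the raw lines are staged in a made-up table dbo.FinStage (LineNo int IDENTITY, RawLine varchar(4000)):

-- Sketch only: the balance is the last comma-separated value on its row.
SELECT  CASE WHEN RawLine LIKE '%OPENING BALANCE%' THEN 'Opening' ELSE 'Closing' END AS BalanceType,
        LTRIM(RIGHT(RawLine, CHARINDEX(',', REVERSE(RawLine)) - 1))                  AS Amount
FROM    dbo.FinStage
WHERE   RawLine LIKE '%OPENING BALANCE%'
   OR   RawLine LIKE '%CLOSING BALANCE%';

-- Everything between the two balance rows is detail.
SELECT  RawLine
FROM    dbo.FinStage
WHERE   LineNo > (SELECT MIN(LineNo) FROM dbo.FinStage WHERE RawLine LIKE '%OPENING BALANCE%')
  AND   LineNo < (SELECT MAX(LineNo) FROM dbo.FinStage WHERE RawLine LIKE '%CLOSING BALANCE%');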



Thanks much




View 9 Replies View Related

Flat File Source Column Parsing Error

May 12, 2006

Hello All,



I have come across this issue with the Flat File Source when the delimiter is set to a comma.

"""KAILUA KONA,HI""","CA",

In the data snippet above, with a comma as the column delimiter and a " as the text qualifier, the data is parsed in this fashion:

"""KAILUA as a column
HI""" as a column
CA as a column

when it should be

"KAILUA,HI" as a column
CA as a column.

Is there a way to tell the Flat File Source not to split on the commas inside the multiple quotes?



Thank you

Eric Flores

View 5 Replies View Related

Flat File Source - If An Error Occurs, Continue Parsing The Remaining Columns In The Row Before Failing

Jan 14, 2008

Hello everyone,


I have a package that extracts data from a Flat File. If any errors or truncation occur during the extraction of the input data, the package should fail. All fields that have erroneous values should be reported in the log file.


My Solution:
- I have created a Data Flow Task that contains a Flat File Source Adapter and a dummy destination.

- I have left the default "Error Output" configuration of the Flat File Source adapter, namely that if a truncation or an error occurs for a certain column, the reaction is "Fail Component".


Problem:
This configuration gives me only the first erroneous column in the row being processed.


Question:
Is it possible to make the Flat File Source adapter continue parsing the current row before it fails? This way, I would be able to get all the erroneous columns in the row in one shot.


Thanks in advance...
Samar

View 6 Replies View Related

Import Complex XML File Into Sql Server 2000

Aug 21, 2007

I need help importing a complex xml file using the XML Bulk Load component. I need there to be 2 tables as shown below. I just
cannot seem to figure out how to get this to work with such a complex XML structure. I have shown below my table structure, a
sample of one of the entries of the XML files and what I have so far for my XSD schema. Any help would be great!!!
My Tables:

CREATE TABLE [dbo].[WPXML] (
    [Part] [varchar] (100) PRIMARY KEY,
    [BaseVehicle] [int] NULL,
    [Qty] [int] NULL,
    [PartType] [int] NULL,
    [EngineBase] [int] NULL,
    [EngineDesignation] [int] NULL,
    [ImageURL] [varchar] (100) NULL,
    [ThumbURL] [varchar] (100) NULL
)
GO

CREATE TABLE [dbo].[WPPRODUCT] (
    [Part] [varchar] (100) PRIMARY KEY,
    [PartNumber] [varchar] (100) NULL,
    [BrandID] [varchar] (4) NULL,
    [BrandDescription] [varchar] (100) NULL,
    [Price] [varchar] (10) NULL,
    [ListPrice] [varchar] (10) NULL,
    [Weight] [varchar] (10) NULL,
    [Popularity] [varchar] (10) NULL,
    [OEFlag] [varchar] (10) NULL,
    [ProductRemark] [varchar] (1000) NULL,
    [Note] [varchar] (5000) NULL
)
GO

Sample of XML:

<App action="A" id="1484266">
  <BaseVehicle id="5899"/>
  <EngineBase id="555"/>
  <EngineDesignation id="138"/>
  <Qty>0</Qty>
  <PartType id="6192"/>
  <Part>W0133-1621038</Part>
  <Product>
    <PartNumber>W0133-1621038</PartNumber>
    <BrandID>FUL</BrandID>
    <BrandDescription><![CDATA[Full]]></BrandDescription>
    <Price>17.38</Price>
    <ListPrice>36.60</ListPrice>
    <Available>Y</Available>
    <Weight>1.05</Weight>
    <Popularity>B</Popularity>
  </Product>
  <Product>
    <PartNumber>W0133-1611982</PartNumber>
    <BrandID>KN</BrandID>
    <BrandDescription><![CDATA[K&N Filters]]></BrandDescription>
    <Price>68.78</Price>
    <ListPrice>105.81</ListPrice>
    <Available>Y</Available>
    <Weight>1.80</Weight>
    <Popularity>E</Popularity>
  </Product>
  <Product>
    <PartNumber>W0133-1626304</PartNumber>
    <BrandID>ND</BrandID>
    <BrandDescription><![CDATA[Denso]]></BrandDescription>
    <Price>22.34</Price>
    <ListPrice>36.60</ListPrice>
    <Available>Y</Available>
    <OEFlag>OEM</OEFlag>
    <Weight>1.05</Weight>
    <notes>Notes For This Part</notes>
    <Popularity>D</Popularity>
  </Product>
  <ImageURL><![CDATA[http://img.eautopartscatalog.com/live/W01331621038OES.JPG]]></ImageURL>
  <ThumbURL><![CDATA[http://img.eautopartscatalog.com/live/thumb/W01331621038OES.JPG]]></ThumbURL>
</App>

My XSD Schema Thus Far:

<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
  <xsd:annotation>
    <xsd:appinfo>
      <sql:relationship name="test"
          parent="WPXML"
          parent-key="Part"
          child="WPPRODUCT"
          child-key="Part" />
    </xsd:appinfo>
  </xsd:annotation>
  <xsd:element name="App" sql:relation="WPXML" sql:relationship="test">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="Qty" type="xsd:integer" />
        <xsd:element name="Part" type="xsd:string" />
        <xsd:element name="BaseVehicle">
          <xsd:complexType>
            <xsd:attribute name="BaseVehicle" type="xsd:integer" sql:field="BaseVehicle" />
          </xsd:complexType>
        </xsd:element>
        <xsd:element name="PartType">
          <xsd:complexType>
            <xsd:attribute name="id" type="xsd:integer" sql:field="PartType" />
          </xsd:complexType>
        </xsd:element>
        <xsd:element name="EngineBase">
          <xsd:complexType>
            <xsd:attribute name="id" type="xsd:integer" sql:field="EngineBase" />
          </xsd:complexType>
        </xsd:element>
        <xsd:element name="EngineDesignation">
          <xsd:complexType>
            <xsd:attribute name="id" type="xsd:integer" sql:field="EngineDesignation" />
          </xsd:complexType>
        </xsd:element>
        <xsd:element name="ImageURL" type="xsd:string" />
        <xsd:element name="ThumbURL" type="xsd:string" />
        <xsd:element name="Product" sql:relation="WPPRODUCT" sql:key-fields="Part" sql:relationship="test">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="Part" type="xsd:string" />
              <xsd:element name="PartNumber" type="xsd:string" />
              <xsd:element name="BrandID" type="xsd:string" />
              <xsd:element name="BrandDescription" type="xsd:string" />
              <xsd:element name="Price" type="xsd:string" />
              <xsd:element name="ListPrice" type="xsd:string" />
              <xsd:element name="Weight" type="xsd:string" />
              <xsd:element name="Popularity" type="xsd:string" />
              <xsd:element name="OEFlag" type="xsd:string" />
              <xsd:element name="ProductRemark" type="xsd:string" />
              <xsd:element name="Note" type="xsd:string" />
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
 

View 15 Replies View Related

Complex DB Search Forms (Store Proc Vs. Complex Where)

Nov 12, 2003

I have web forms with about 10-15 optional search parameters (fields) for a give table. Each item (textbox) in the form is treated as an AND condition.

Right now I build complex WHERE clauses based on whether data is present in a textbox, and AND each one into the clause. Also, if a particular field is "match any word", I get an ANDed set of ORs. As you can imagine, the WHERE clause gets quite large.

I build clauses like this (i.e., 4 fields shown):

SELECT * from tableName WHERE (aaa like '%data') AND (bbb = 'data') AND (ccc like 'data%') AND ( (xxx like '%data') OR (yyy like '%data%') )

My question is, are stored procedures better for building such dynamic SQL clauses? I may have one field or all fifteen. I've written generic code for building the clauses, but I don't know much about stored procedures and am wondering if I'm making this more difficult on myself.
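A stored procedure with optional parameters is one common way to tame this; here is a minimal sketch with placeholder names, where any parameter left NULL simply drops out of the filter.

-- Sketch only: each block of the WHERE clause is ignored when its parameter is NULL.
CREATE PROCEDURE dbo.SearchThings
    @aaa varchar(50) = NULL,
    @bbb varchar(50) = NULL,
    @ccc varchar(50) = NULL
AS
SELECT *
FROM   dbo.tableName
WHERE  (@aaa IS NULL OR aaa LIKE '%' + @aaa)
  AND  (@bbb IS NULL OR bbb = @bbb)
  AND  (@ccc IS NULL OR ccc LIKE @ccc + '%')
GO

The trade-off is that one cached plan serves every combination of parameters, so for very different search shapes the dynamic WHERE clause you already build can still perform better.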

View 7 Replies View Related

XML Parsing

Jul 24, 2002

I am trying to process an XML document that contains the attribute 'from_x'. However, an OPENXML query can't seem to find any column with a '_x' suffix. For example, if I were to execute the following fragment:

declare @hDoc int, @Message varchar(200)
select @Message = '<BACK_FM from_x="12"></BACK_FM>'
exec sp_xml_preparedocument @hDoc OUTPUT, @Message
select from_x from openxml(@hDoc, 'BACK_FM',1) with (from_x int)

I get back a null value from the openxml query. Attribute names 'fromx' and 'from_y' work ok but nothing I have tried with a trailing '_x' will work.
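One thing that may be worth testing (a sketch, not a confirmed fix): give the result column a safe name and map it to the attribute explicitly with a column pattern in the WITH clause, so the output column itself no longer ends in '_x'.

declare @hDoc int, @Message varchar(200)
select @Message = '<BACK_FM from_x="12"></BACK_FM>'
exec sp_xml_preparedocument @hDoc OUTPUT, @Message
-- 'fromx' is just a safe output name; '@from_x' is the real attribute
select fromx from openxml(@hDoc, '/BACK_FM', 1) with (fromx int '@from_x')
exec sp_xml_removedocument @hDoc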

Does anyone know if this is a known SQL Server bug? Is it a bug at all or something about XML that I don't know about?

Thanks,
Wayne King

View 1 Replies View Related

Parsing

Aug 13, 2001

I need to figure out how to parse a comma separated value.

Lets say I have a variable equal to a comma separated list.

SET @Variable = '045, 032, 025, 653'

I need to create a dynamic SQL string that looks like:

SET @Variable = ''' + '045' + ''' + ',' + ''' + '032' + ''' + ',' etc...

Can someone teach me an optimized query to do this?
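If the goal is a quoted list for dynamic SQL (for example an IN clause), a single REPLACE is usually enough; a small sketch:

DECLARE @Variable varchar(200), @Quoted varchar(400)
SET @Variable = '045, 032, 025, 653'
-- wrap the whole string in quotes and turn every ', ' into ','
SET @Quoted = '''' + REPLACE(@Variable, ', ', ''',''') + ''''
SELECT @Quoted   -- '045','032','025','653'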

Thanks
Greg

View 1 Replies View Related

Parsing

Mar 9, 2005

What is parsing? Can someone give me an example please? This is what I got from BOL:

Returns the specified part of an object name. Parts of an object that can be retrieved are the object name, owner name, database name, and server name.
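That BOL text describes the PARSENAME function, which splits a multi-part object name into its parts; a quick example:

SELECT PARSENAME('AdventureWorks.dbo.Employees', 1) AS ObjectName,    -- Employees
       PARSENAME('AdventureWorks.dbo.Employees', 2) AS OwnerName,     -- dbo
       PARSENAME('AdventureWorks.dbo.Employees', 3) AS DatabaseName   -- AdventureWorks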

View 11 Replies View Related

Help With Parsing

Feb 8, 2007

Hi -

I am new to SQL server and was wondering if someone can help me with this one. Thanks
My table holds 2 columns (SECTOR and TERM) with following example values

SECTOR TERM
Hybrid 6/18
Hybrid 9/19
Hybrid 10/17
Hybrid 3/13

I would like to find out the rows where my values from SECTOR before '/' does not equal TERM

i.e.
Row 1 where 6<>8
and row 3 where 10<>7

Thanks.

View 5 Replies View Related

Parsing A Sql Field

Jul 6, 2007

Can you parse a SQL field? Let's say the FULLNAME field has a TEXT datatype with the following data: <firstname>Norm</firstname><lastname>bercasio</lastname><Color>blue</color>. Then, using a SELECT statement, parse the field to find the last name and write it to another field called LASTNAME on the same table, same row ID. Can you send a SELECT statement showing how it can be done? I am using SQL 2003 or 2005. Thank you so much.
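A sketch of one way to do it with plain string functions (requires SQL Server 2005 for CROSS APPLY; the table name is a placeholder, and the TEXT column is cast to varchar first):

-- Sketch only: pulls the text between <lastname> and </lastname>.
UPDATE t
SET    LASTNAME = SUBSTRING(v.FullText,
                            CHARINDEX('<lastname>', v.FullText) + 10,   -- 10 = LEN('<lastname>')
                            CHARINDEX('</lastname>', v.FullText)
                              - CHARINDEX('<lastname>', v.FullText) - 10)
FROM   dbo.MyTable t
CROSS APPLY (SELECT CAST(t.FULLNAME AS varchar(8000)) AS FullText) v
WHERE  t.FULLNAME LIKE '%<lastname>%';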
 
 

View 2 Replies View Related

Need Help Parsing Out Info

Jul 22, 2004

Hi All,

I'm using a SQL selection to fill a DataGrid. One of the fields I have is called diagnosis. This field in the database can contain multiple diagnoses, but I use a set of characters to divide each diagnosis.
Example: Sick!@#$%Hurt!@#$%Ill!@#$%
My problem is that this is how it looks in my DataGrid. Can someone tell me how to parse out each diagnosis?
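If the immediate goal is just readable grid output, a REPLACE in the SELECT may be enough; a sketch with placeholder names (note the trailing divider leaves a trailing comma to trim):

SELECT PatientId,
       REPLACE(diagnosis, '!@#$%', ', ') AS DiagnosisList
FROM   dbo.PatientDiagnosis;

Splitting the value into separate rows or columns would need a proper split (a loop or a numbers table), since the divider is a multi-character string.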

Thanks

View 1 Replies View Related

String Parsing

Jun 7, 2002

How do I remove the same repeated string in a column, per row, from a table? I looked at the REPLACE and STUFF string functions, but neither seemed to take a column name as a parameter.
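For what it's worth, REPLACE does accept a column name as its first argument on varchar/nvarchar columns, so if the repeated text is known the cleanup can be a single statement; names below are placeholders.

UPDATE dbo.MyTable
SET    MyColumn = REPLACE(MyColumn, 'repeated phrase', '')
WHERE  MyColumn LIKE '%repeated phrase%';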

Help is appreciated.

Thanks,

View 1 Replies View Related

Parsing Data

Apr 21, 2000

Does anyone know of any functions I can use to parse the following data? E.g.
M 3480-7 should be 3480
M 3477-19 should be 3477
M 28-10 should be 28
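A small sketch, assuming every value has the "M <number>-<number>" shape shown above (table and column names are placeholders):

SELECT PartCode,
       SUBSTRING(PartCode, 3, CHARINDEX('-', PartCode) - 3) AS BaseNumber
FROM   dbo.Parts
WHERE  PartCode LIKE 'M %-%';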

Thanks in advance,
Vic

View 1 Replies View Related

Parsing A Name Field Using SQL

Aug 18, 2000

Anybody out there ever take a column containing names and parse it out to salutation, first name, middle initial/name, last name, suffix using Transact-SQL? I think I know how to do it using an array in a procedural language, but using SQL I'm drawing a blank.
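Full salutation/suffix handling takes a pile of cases, but a set-based starting point is to peel words off with CHARINDEX; a sketch with placeholder names:

SELECT  FullName,
        LEFT(FullName, CHARINDEX(' ', FullName + ' ') - 1)                    AS FirstWord,
        LTRIM(SUBSTRING(FullName, CHARINDEX(' ', FullName + ' ') + 1, 8000))  AS Remainder
FROM    dbo.People;

Repeating the same pattern on Remainder peels off the next word, and a lookup list of known salutations (MR, MRS, DR) and suffixes (JR, SR, III) decides which bucket each word falls into.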

Any ideas or help will be appreciated!

View 1 Replies View Related

Parsing Strings

Feb 27, 2001

I have a varchar field that contains answers to questions separated by commas. Say there are 4 questions for each user. Here is an example of what the table would look like:
User Answer
1 Good,Fair,Good,Bad
2 Bad,Good,Good,Good
3 Fair,Good,Bad,Fair

I need to write a stored procedure, for reporting, that separates the Answer field into 4 different columns. How can this be achieved? Any assistance would be greatly appreciated.
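With exactly four comma-separated answers and no embedded periods, the PARSENAME trick can split them in one pass; table and column names below are placeholders.

SELECT  [User],
        PARSENAME(REPLACE(Answer, ',', '.'), 4) AS Q1,
        PARSENAME(REPLACE(Answer, ',', '.'), 3) AS Q2,
        PARSENAME(REPLACE(Answer, ',', '.'), 2) AS Q3,
        PARSENAME(REPLACE(Answer, ',', '.'), 1) AS Q4
FROM    dbo.Answers;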

View 1 Replies View Related

Parsing XML Data

Nov 7, 2003

I have a SQL Server table that holds XML documents. Is there a known SQL Server XML parser? How can I export XML data into a readable format?

thanks

View 1 Replies View Related

Parsing Data

Sep 13, 2005

Does anyone know any good URLs for examples on parsing data using SQL?

As an example, I've got the First/Middle/Last name of a person inside a single field, and I want to turn that into 3 fields.

Thanks!
Caden

View 8 Replies View Related

SQL Parsing Challenge!

Nov 9, 2004

Hi guys!

Can anyone tell me how I can parse the WHERE clause of an SQL statement to check for special characters such as ' (single quotes) in fields of type varchar?
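If the real goal is to keep stray single quotes from breaking the statement, the usual guard is to double them before concatenating; a small sketch:

DECLARE @rawValue varchar(100), @safeValue varchar(200)
SET @rawValue = 'O''Brien'
SET @safeValue = REPLACE(@rawValue, '''', '''''')   -- doubles each single quote
SELECT @safeValue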

thanks
nelo

View 2 Replies View Related

Anyone Know About Parsing An Email?

Mar 20, 2006

Hello everyone

Heres what it looks like:

I have a large file of over 40k email records. The emails are all mixed up and come in various formats, but I noticed that most of them are in this format:

firstname.lastname@email.com
firstname.middlename.lastname@email.com

For all those emails with the period (.) in between, the (.) actually separates an individuals first and last name.

My task is to separate all the emails that are in this format into first and last name fields. I'm stumped, folks, and I'll really appreciate any pointers or ideas on how to go about solving this task.
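A sketch that handles only the firstname.lastname@domain shape (table and column names are placeholders); the three-part addresses would need one more split on the remaining period.

SELECT  Email,
        LEFT(Email, CHARINDEX('.', Email) - 1) AS FirstName,
        SUBSTRING(Email,
                  CHARINDEX('.', Email) + 1,
                  CHARINDEX('@', Email) - CHARINDEX('.', Email) - 1) AS LastName
FROM    dbo.EmailList
WHERE   Email LIKE '%.%@%'
  AND   CHARINDEX('.', Email) < CHARINDEX('@', Email);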

Thanks

View 1 Replies View Related

Parsing A Name Field

Aug 9, 2012

I have a name field that has many variations, and I have been fighting with it for a few days. The possible name variations are:

Smith, Mike
Smith, Mike K.
Smith, Mike K., JR.

I would like to parse each of the four pieces into a separate field.

View 3 Replies View Related

Parsing Address

Feb 13, 2004

Parsing Address
This is not really a reply, but I saw the problem and the replies look very promising.
I'm using SS2K, and I have a table with an address column.
Here are some examples of the records under ADDRESS:

WILLOW CREEK PL
RED BARN DR
RED BARN DR
CARRINGTON DR
RENNER RD
EDMONTON CT
SPRINGBRANCH DR
HILLROSE DR
CEDAR RIDGE DR
LARTAN TRL
PRESIDENT GEORGE BUSH HWY

What I want to do is write a script that runs daily and parses the street names (RED BARN) and street types (DR, PL, etc.) into 2 columns. As you can see, there is no fixed length or fixed number of words, etc.
Any help would be really appreciated.
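A sketch that treats the last word as the street type and everything before it as the street name (table and column names are placeholders); multi-word types would need a lookup table of valid street types instead.

SELECT  Address,
        LEFT(RTRIM(Address), LEN(Address) - CHARINDEX(' ', REVERSE(RTRIM(Address)))) AS StreetName,
        RIGHT(RTRIM(Address), CHARINDEX(' ', REVERSE(RTRIM(Address))) - 1)           AS StreetType
FROM    dbo.Addresses
WHERE   Address LIKE '% %';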
Thanks

View 3 Replies View Related






