Parsing A Tab Delimited File

Dec 5, 2007

I have a tab delimited file with 122 columns. Can anyone let me know if there is a better way of parsing/extracting a few columns (say about 15) from the file and loading them into a table using SSIS?

View 1 Replies



Parsing Delimited String

Jul 3, 2013

Parsing any delimited string (in this example ',' is used as the parsing delimiter). This query can be useful in many business scenarios where the input data is a long string containing delimited values.

declare
v_sql VARCHAR2(2000);
v_pos INTEGER;
v_differentiator VARCHAR2(10);

[code]...
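Since the rest of the original snippet is truncated, here is a minimal T-SQL sketch of the same idea (walking a comma-delimited string with CHARINDEX and SUBSTRING), using hypothetical names and a hard-coded sample value:

Code Snippet
-- Minimal sketch: return one value at a time from a comma-delimited string.
DECLARE @input varchar(2000), @pos int, @next int, @value varchar(100)
SET @input = 'red,green,blue'
SET @pos = 1

WHILE @pos <= LEN(@input) + 1
BEGIN
    SET @next  = CHARINDEX(',', @input + ',', @pos)   -- next delimiter (or end of string)
    SET @value = SUBSTRING(@input, @pos, @next - @pos)
    PRINT @value                                       -- or INSERT into a work table
    SET @pos   = @next + 1
END

The comma appended inside CHARINDEX lets the last value fall out of the loop without a special case.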

View 4 Replies View Related

Parsing Variable Length Delimited Records

Mar 25, 2006

I am running SQLServer 2000 to parse and store records in the EDIX12
format. This consists of variable length delimited records which
I am passing to the "transforms" tab to process with VBScript.



The problem is though each segment has a defined number of fields, N,
the standard states that if the final M fields are empty/blank they are
not to be sent. Thus, a segment defined to have 20 fields may
have 6 the first time I see it, 13 the next time, etc. To access
the columns in VBScript I use DTSSource("Col001"). This works as
long as the columns are there, but gives an error when they are
not. Is there a parameter telling me how many columns are
defined? Or is there something akin to IFEXISTS("Colxxx") or
exceptions?



How can I handle this situation? One suggestion has been to pass
the entire segment to the Transforms section and break it up there.



Finally, what resources can you point me to for reference? I'd
like to get good at using DTS since my client wants their project
written for it.



Thanks for your help,

--greg

View 1 Replies View Related

Reporting Services :: Parsing SSRS Config File And Dynamically Changing File Path Of Config File In Code

Sep 2, 2015

I currently have a single hard-coded file path to the SSRS config file; the query parses the file and provides the Reporting Services web service URL. My question is how would I run this same query against 100s of servers that may or may not share the same file path as the one hard coded?

Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.

I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically I need to inventory all of my Reporting Services URLs.
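One hedged possibility is the undocumented xp_regread extended procedure. The instance name, registry key paths, and config file location below are assumptions that vary by SSRS version and instance, so treat this strictly as a sketch to adapt:

Code Snippet
DECLARE @instanceId  nvarchar(200),
        @regKey      nvarchar(400),
        @installPath nvarchar(400)

-- Which instance ID (e.g. MSRS13.MSSQLSERVER) does the RS instance map to?
EXEC master.dbo.xp_regread
     @rootkey    = 'HKEY_LOCAL_MACHINE',
     @key        = 'SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS',
     @value_name = 'MSSQLSERVER',                      -- assumed instance name
     @value      = @instanceId OUTPUT

-- Where was that instance installed?
SET @regKey = 'SOFTWARE\Microsoft\Microsoft SQL Server\' + @instanceId + '\Setup'
EXEC master.dbo.xp_regread
     @rootkey    = 'HKEY_LOCAL_MACHINE',
     @key        = @regKey,
     @value_name = 'SQLPath',
     @value      = @installPath OUTPUT

-- Assumed relative location of the config file under the install path.
SELECT @installPath + '\ReportServer\rsreportserver.config' AS ConfigPath

Run across many servers, the same script could be pushed out through a central management server or a multi-server query, with the caveat that xp_regread may be locked down.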

View 2 Replies View Related

Flat File Connection Manager Not Parsing File Correctly

Apr 3, 2007

Hi,

I have a flat file, comma-delimited, with strings in double-quotes.

In the connection manager for the file, I have specified that the Text Qualifier = ""

However, in the preview tab, it still shows the strings as surrounded by the quotes, e.g. "mycol1" whereas it should show mycol1 without the quotes.

Next, when I examine the data in the database after the load, it's messed up there also.

"mycol1" ends up in the database as "mycol1

"mycol2" ends up as "mycol2

This is not right.

I have format set to delimited, header row delimiter crlf, etc.

Any ideas?

Thanks

View 3 Replies View Related

Parsing RTF File

Dec 3, 2007

Hi:

I need to parse a regularly generated RTF file and was wondering if it is possible in SSIS. I am trying to use the flat file connection manager to do this.

Now, I can't treat tab stops in an rtf like tab stops in a csv, since when you treat an rtf as a text file, you see the format code of the rtf. If I open the rtf in a text editor, the entire file is one line, with lines breaking with:

par}

Columns are tab delimited in the rtf, and they look like this when you treat the rtf as a text file.

plain abfs16f4cf0cb1

(or something like that, the word "tab" is the important part.)

So I use the "plain ab" part to delimit in SSIS, since that is consistent (planning to parse out all the garbage later on). The problem is, sometimes lines don't have a "city" and "state", so it "tabs" right over to the next field. So like this (looking in MS Word):

Phone <tab> City <tab> State <tab> Date <tab> Other fields.....
847-111-2222 <tab> Omaha <tab> NB <tab> 9/14/2007 <tab>
222-222-3333 <tab> 9/14/2007 <tab>
555-121-1212 <tab> Houston <tab> TX <tab> 9/14/2007 <tab>

Now, if you treat an RTF as a text file, it has only one "plain abfs16f4cf0cb1" after the phone number, so even for the missing line there is only one tab, not 3. This is because in the beginning of the row tabs for each row are defined like this:

tql x90 ql x840 ql....etc...

with "tql" and "tx" tags basically saying where all the tab stops are for that row. So for the row above with missing info, it lists fewer tab stops. So the "date" (and associated garbage) ends up under "City" for this row. All of the "Houston" row's data starts appearing in the sql server output table's 2nd last field, as you might expect.

Any suggestions on how to handle this in SSIS during the transformation? I could deal with it after I pull it in, since I still have all the data. I'm thinking the logic to do this could be complicated though: take the data out of the last two fields of the affected rows into some other table, use UPDATEs to shift the values 2 fields to the right, and then figure out a way to bring the data from that temp table back in, but it all sounds a bit complicated.
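For the post-load route, here is a rough T-SQL sketch of that two-column shift. All table and column names are hypothetical, and ISDATE is just a crude way to spot the short rows:

Code Snippet
-- Hypothetical table: dbo.RtfImport(Phone, City, State, CallDate, Extra1, Extra2).
-- In short rows the date landed under City and later values slid left with it,
-- so move everything two columns to the right and blank out City/State.
UPDATE dbo.RtfImport
SET    Extra2   = CallDate,
       Extra1   = State,
       CallDate = City,
       State    = NULL,
       City     = NULL
WHERE  ISDATE(City) = 1          -- crude test: City holds a date, so the row was short
  AND  Extra1 IS NULL

Every assignment in the SET list reads the pre-update values, so the whole shift happens in one pass.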

Let me know if this makes sense--I've almost got it going, I just need to sort this last bit out.

Thanks,
Kayda

View 4 Replies View Related

Parsing A QFX File?

Jan 19, 2008

I am trying to read a QFX file from Quicken.
It looks like XML, but it's not, and I cannot figure out how to grab what I've got to parse the line.
I put this into a derived column, but it's not getting it:

SUBSTRING([Column 0],FINDSTRING("<STMTTRN>",[Column 0],1),FINDSTRING("</STMTTRN>",[Column 0],1))

because inside the data, it looks like that's what brackets a transaction; the data looks like this and varies by TRNTYPE, but the columns are tagged like so:


<STMTTRN>
<TRNTYPE>POS
<DTPOSTED>20070129160000
<TRNAMT>-0000000000026.50
<FITID>20070129011
<NAME>SUNOCO
<MEMO>01/24 ENGLWD CLIFF NJ 8015V200006
</STMTTRN>
<STMTTRN>
<TRNTYPE>POS
<DTPOSTED>20070129160000
<TRNAMT>-0000000000023.47
<FITID>20070129012
<NAME>KFC
<MEMO>01/26 NANUET NY 8015V215116
</STMTTRN>


I tried the XML transform and unpivot, but have not cracked it.
Thanks for any light you can shed.
drew
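For what it's worth, once the raw lines are staged in a table (hypothetical names below), plain CHARINDEX/SUBSTRING can pull out the value that follows a given tag. This is only a sketch of the idea, not a full OFX/QFX parser:

Code Snippet
-- Hypothetical staging table: each line of the QFX file loaded as one row.
CREATE TABLE #QfxLines (LineText varchar(255))
INSERT INTO #QfxLines VALUES ('<TRNAMT>-0000000000026.50')
INSERT INTO #QfxLines VALUES ('<NAME>SUNOCO')

-- The value is everything after the tag's closing '>'.
SELECT SUBSTRING(LineText, CHARINDEX('>', LineText) + 1, LEN(LineText)) AS TagValue
FROM   #QfxLines
WHERE  LineText LIKE '<TRNAMT>%'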


View 1 Replies View Related

Parsing Text File And Inserting Into DB

Mar 19, 2008

Hello all,
I have a question regarding importing text file data into SQL Server.  I'm hoping someone can point me in the right direction, as my searches haven't turned up anything specific enough.
I'm trying to parse a large (24MB) text file.  It's a fixed-width file, with multiple columns.  I need to parse this file, check if a record already exists, and then import the data into the database.  But I don't need to insert every column.  There's only a few columns from the file I need to insert.  This parsing also needs to occur at regular intervals (daily).
I looked at BULK INSERT, but I can't find an example that uses only some of the columns.  Every example uses all columns, and the file is delimited, not fixed-width.
Is there anything within SQL Server that can accomplish this?  I haven't turned up anything that will solve my problem.  The only other solution I can think of is an application that parses the file for me and inserts the data into the database.  But can I schedule that application to run every night at midnight (for example) through SQL Server?
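One pure T-SQL route, sketched here with assumed names, offsets, and file path: bulk-load each whole line into a one-column staging table (a fixed-width file has no tab characters, so the default field terminator never fires), then cut out only the needed columns and skip rows that already exist. A SQL Server Agent job can run the same script nightly.

Code Snippet
-- All object names, the path, and the offsets below are assumptions.
CREATE TABLE dbo.StagingRaw (RawLine varchar(4000))

BULK INSERT dbo.StagingRaw
FROM 'C:\feeds\daily_extract.txt'
WITH (ROWTERMINATOR = '\n')                              -- whole line lands in RawLine

INSERT INTO dbo.Customer (CustomerCode, CustomerName)
SELECT SUBSTRING(RawLine, 1, 10),                        -- assumed offsets/widths
       SUBSTRING(RawLine, 11, 40)
FROM   dbo.StagingRaw s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customer c
                   WHERE  c.CustomerCode = SUBSTRING(s.RawLine, 1, 10))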
I'm not too familiar with SQL Server, so I appreciate any help offered.
Thanks, Jay

View 7 Replies View Related

Complex File Parsing Issue

Nov 28, 2007

Hello,

I have a file that looks like this:

Summary
A ABCD
A Category MarketValue Margin
A category1 1.0000000 1.000000
A category2 2.0000000 2.000000

H Totals Total Cash Net
H 2.00000 200000 2000000

Another Summary
B BCDE
B Activity MarketValue Margin
B activity1 3.00000 3.000000
B activity2 4.00000 4.000000

The items in blue are headers. I don't want to capture those. However, I want to capture all the data in black, and put it into 3 separate tables (or maybe the same table, under the appropriate column names)

This situation differs from anything I've done before in that you can't identify what row contains what data by what's in the row itself. That is, what's in the data rows is random and subject to change. So you can't search the row itself to determine which table it goes to.

However, if there's a way to capture all the rows after a certain header before the header changes again, that might work.

That is, get all rows between A Category MarketValue Margin and H Totals Total Cash Net
and
get all rows between H Totals Total Cash Net and Another Summary
and
get all rows after B Activity MarketValue Margin

Any examples of how I might script this?
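One possibility, sketched against an assumed staging table that records the load order with an IDENTITY column (and assuming the load preserves file order): land every line, then select the lines whose position falls between two header rows.

Code Snippet
-- Assumed staging table: every line of the file, in load order.
CREATE TABLE dbo.FileLines (LineNo int IDENTITY(1,1), LineText varchar(500))
-- ... bulk load the file into dbo.FileLines here ...

-- Rows between the 'A Category ...' header and the 'H Totals ...' header.
SELECT f.LineText
FROM   dbo.FileLines f
WHERE  f.LineNo > (SELECT MIN(LineNo) FROM dbo.FileLines WHERE LineText LIKE 'A Category%')
  AND  f.LineNo < (SELECT MIN(LineNo) FROM dbo.FileLines WHERE LineText LIKE 'H Totals%')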

Thanks


View 2 Replies View Related

Reading File As One String, Then Parsing - How To Do This?

May 7, 2007

Hi,

The suggestion to do this is buried deep in one of my posts, however I still do not have a clear idea of how to do this.

I have a flat file which has several "bad rows" in it. Because file error redirection is buggy, I need a manual approach to get rid of these incomplete rows in my data file.

Phil, you suggested I read the file as one long string, then parse out the bad rows (using a script?).... however I have no idea as to how to actually do this.

I was wondering if it's possible to clarify the steps involved in doing this, or perhaps point me to an example I can look at, as I cannot seem to get around this problem on my own.

Thanks much!!

View 24 Replies View Related

Another Flat File Parsing Problem

Dec 5, 2006

Hello All!

I know this has come up before and I have tried several of the solutions found within the forum but I just can't seem to import my file correctly and could use some input, please.

Sample file (less fields than actual file):

Name (str), Phone# (str), Description(str), Resolved(bool), Met(bool)

"Kay, Mary","123-4567","Used a "."not a"," in text", "1", "1"

The text is qualified with " and columns delimited with commas, but the description field has embedded quotes and commas. Normally it works, except when there are embedded quotes and commas.

I have tried unqualified data and undouble, but that does not work either because of the embedded commas in quotes.

Do I need to do something before the data flow? Do I need to do custom code similar to undouble (I tried modifying undouble but using unqualified fields caused the source file to not like the data and go red)? Should the row be read as one field and parsed?

Thanks in advance for any help you can give!

View 12 Replies View Related

Importing A Tab Delimited File

Jul 5, 2007

Hi all,

While importing a tab delimited file, it seems SSIS interprets the incoming column as 50 chars in length even though it is far smaller. Any ideas how this could be?

Any help would be appreciated.

View 2 Replies View Related

Tab-delimited Header File

Feb 19, 2008

I used the bcp utility to send data to an output file in tab-delimited format (-t), but the header is a separate entity in this query.

When I set FILEheader = firstname,lastname... what must I use to change the comma to a tab in the header string? I have tried various ways: {t}, [-t], and others. What am I missing?
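If the header string is being built in T-SQL before it is written out, one option (a small sketch, variable name assumed) is to concatenate CHAR(9), which is a literal tab character:

Code Snippet
-- CHAR(9) is the tab character; build the header with it instead of commas.
DECLARE @header varchar(200)
SET @header = 'firstname' + CHAR(9) + 'lastname' + CHAR(9) + 'phone'
SELECT @header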

View 1 Replies View Related

Character Delimited BCP File

Feb 6, 2008

Is there a faster way to create my pipe delimited BCP file, besides creating a format file? Actually, my problem is that I am having an issue with the file. It looks perfect, like:




Code Snippet
Marie|32|brown|single
Gay|33|black|married


But when I load it to DataStage it puts the entire row as one column. I already specified the | as the delimiter in DataStage. I think the issue is from the column collation. If my data is as simple as my example above, what column collation should I use for the format file? Currently, I have something like:




Code Snippet
8.0
4
1 SYBCHAR 0 4 "|" 1 emp_id ""
2 SYBCHAR 0 4 "|" 2 emp_cand_id ""
3 SYBCHAR 0 4 "|" 3 emp_statusid ""
4 SYBCHAR 0 4 "\r\n" 4 emp_type SQL_Latin1_General_CP1_CI_AS

I generated this through prompts given by BCP for each column. Then I changed the 2nd column values all to SYBCHAR and 0 prefix length..

View 2 Replies View Related

Need Suggestions On Text File Parsing Into Database

Feb 28, 2007

I have a website, where people upload tab delimited text files of their product inventories, which the site parses and inserts into a database table.  Here's the catch: Instead of insisting that each user use a standardized format, each user can upload the file in whatever column order they want, they just have to let the site know through a GUI which column is in which order.   And, they may upload columns that if not mapped, will be ignored.  Right now, I am doing all of this in code and it runs slow, I was thinking of offloading this to either a stored procedure, ssis, or bulk upload.   But, with the varying format of the uploaded text file, I am not sure how I could do that.  Any suggestions? Thanks! 
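One way to offload this, sketched with assumed object names: bulk-load the upload into a generic staging table (Col1..ColN in file order) and let the user's column map drive a dynamic INSERT, so columns the user never mapped are simply never selected.

Code Snippet
-- All names are assumptions; the three variables would come from the GUI mapping.
DECLARE @SkuCol sysname, @QtyCol sysname, @PriceCol sysname, @sql nvarchar(4000)

SELECT @SkuCol = 'Col3', @QtyCol = 'Col1', @PriceCol = 'Col7'

SET @sql = N'INSERT INTO dbo.ProductInventory (Sku, Quantity, Price) '
         + N'SELECT ' + QUOTENAME(@SkuCol) + N', '
                      + QUOTENAME(@QtyCol) + N', '
                      + QUOTENAME(@PriceCol)
         + N' FROM dbo.UploadStaging'

EXEC sp_executesql @sql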

View 1 Replies View Related

SQL Server 2008 :: Parsing Unstructured CSV File?

Oct 1, 2015

I have a CSV file with roughly 6 million rows. The file is unstructured; that is, some rows have 5 fields, others have 15, and there are as many as 50 fields in one row.

I am using bulk insert to read the entire file into a table in database, with each row being a database record. With that, I have one column that contains a row of comma delimited fields. All fields are character string and I want to find a quick way of parsing each row and placing each comma-delimited value in a column. For example:

CREATE TABLE MyTable
(
CSVString varchar(1000),
C1 varchar(20),
C2 varchar(20),
...
C50 varchar(20),
)

Column CSVString contains a CSV row. I don't know how many fields (no. of commas + 1) are in the row, but if the row contains 10 fields, I need to populate columns C1-C10; if the row has 15 fields, I populate columns C1-C15.

How can I do this in a very efficient way? I tried CTE but performance was not very good.
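One set-based option is a tally (numbers) table split, which tends to outperform a recursive CTE at this volume. The sketch below assumes a key column Id on MyTable and a dbo.Tally table of integers at least as long as the longest CSVString; it returns one row per field with its ordinal position, which can then be pivoted or joined back into C1..C50.

Code Snippet
-- Assumes dbo.Tally(N) holds 1,2,3,... and MyTable has a key column Id.
SELECT m.Id,
       ROW_NUMBER() OVER (PARTITION BY m.Id ORDER BY t.N) AS FieldNo,
       SUBSTRING(',' + m.CSVString + ',', t.N + 1,
                 CHARINDEX(',', ',' + m.CSVString + ',', t.N + 1) - t.N - 1) AS FieldValue
FROM   dbo.MyTable m
JOIN   dbo.Tally t
  ON   t.N <= LEN(',' + m.CSVString + ',') - 1
 AND   SUBSTRING(',' + m.CSVString + ',', t.N, 1) = ','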

View 8 Replies View Related

Help, Fairly Complicated File Parsing Issue

Jun 7, 2007

Hi,

I have a situation where I'm having to extract key data from a financial file. The problem is, the columns are not nice and tidy.

Basically the file looks like this:

row 1: "788","Company","OPENING BALANCE:", 2084587.76
row 2: "313947","04/01/07","3","CS","FF", 170.00,"AZT","XYC INC", 20.8, 351.00
row 3: "788","06/06/07 CLOSING BALANCE:", 206203893.03

So, I'm going to need to get the OPENING BALANCE and CLOSING BALANCE figures, as well as all the data in between, i.e. row 2 through n.

Does anyone have an example of a script that can be used for extracting very specific values from a file?

I have a script that checks for incomplete rows, but it is not sophisticated enough for this situation.

Thanks much




View 9 Replies View Related

Delimited Txt File Into SQL Server Table

May 16, 2007

 
Hi
I am new to VB, and am looking to write some code to import a delimited (by a ~) .txt file into a SQL Server table. It doesn't need to append, just to totally overwrite the table. Is this possible? I have been looking at the Bulk Insert, but this doesn't seem to be quite right. Can anyone help?
Cheers,
S
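As a sketch (table name and path assumed), BULK INSERT will handle the ~ delimiter, and a TRUNCATE beforehand gives the overwrite behaviour rather than an append:

Code Snippet
TRUNCATE TABLE dbo.MyImportTable                 -- assumed target table

BULK INSERT dbo.MyImportTable
FROM 'C:\data\import.txt'                        -- assumed path
WITH (FIELDTERMINATOR = '~', ROWTERMINATOR = '\n')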

View 2 Replies View Related

Comma Delimited Text File

May 18, 2004

Hi, I'm pretty new to using Microsoft Visual C# .NET and I want to upload a comma delimited text file from my local machine into a table in a SQL Server database through a web app. How would I go about programming this and what controls do I need? Any help would be much appreciated. Thanks in advance.

View 4 Replies View Related

Importing A Null Delimited File

Jul 15, 2004

I have a text file that is delimited by nulls. Any idea on the best way to get this into a SQL Server table?

View 9 Replies View Related

Export To Comma Delimited File

Aug 4, 2007

I'm trying to upload a small Web application with a one table database. The hosting company, GoDaddy requires that I upload the database as a comma delimited file.
I created the database in Visual Web Developer Express but also have Visual Studio and SQL Server Management Studio Express.
I can't figure out how to export the database into a comma delimited file using any of these tools.
This should be simple like it is in Access but that doesn't seem to be the case. This is holding up deploying my Web Application.

Can anyone help me?

Thanks

View 1 Replies View Related

Reading Tab Delimited Text File

Mar 28, 2008

Is there any way SQL Server can read a "Tab Delimited Text File" and compare each record with a column in a table?

My question is:

I have a Country_Code table which has the 3-letter country code, and the actual country names are listed in a tab delimited text file "Country Data" with country code and country name. How do I read each record and compare to get the actual country name for display?

any ideas/suggestions.
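One way, sketched with an assumed file path and assumed column names: bulk-load the tab-delimited file into a temporary table, then join it to Country_Code on the 3-letter code to pick up the display name.

Code Snippet
CREATE TABLE #CountryData (CountryCode char(3), CountryName varchar(100))

BULK INSERT #CountryData
FROM 'C:\data\Country Data.txt'                  -- assumed path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

SELECT cc.CountryCode, cd.CountryName            -- Country_Code column names assumed
FROM   dbo.Country_Code cc
JOIN   #CountryData cd ON cd.CountryCode = cc.CountryCode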

thanks

View 3 Replies View Related

COMMA Delimited TEXT FILE

Jul 20, 2005

Hi,

On SQLServer 2000, I have a table with the following structure:

MYTABLE
col1 char,
col2 date,
col3 number

My Objective:
Externally (from a command line), to select all columns and write the output into a file delimited by a comma.

My method:
1. Probably will use OSQL or BCP to do this.
2. Use the following syntax:

select RTRIM(col1) +','+ RTRIM(col2) +','+ RTRIM(col3)
from MYTABLE;

My 3 Problems:
1) If there is a NULL column, the result of concatenating any value with NULL is NULL. How can I work around this? I still want to record this column as null. From the example above, if col2 is null, the row should result in: APPLE,,5

2) The time format when querying the database is: 2003-06-24 15:10:20. However, on the file, the data becomes: 24 JUN 2003 3:10PM. How can I preserve the YYYY-MM-DD HH:MM:SS format? Notice that I also lost the SS.

3) Which utility is better? BCP or OSQL? OSQL has a "-s" flag which gives me the option of putting a column separator, but the result is "APPLE ,14 JUN 2003 , 5" and I don't need the extra space. While for BCP, there is no column separator flag.

You will notice from my inquiry above that my background in SQLServer is not very good.

Thanks in Advance!!

Regards
Ricky
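On problems 1 and 2, a small sketch: wrapping each column in ISNULL keeps a NULL from wiping out the whole concatenation, and CONVERT with style 120 preserves the yyyy-mm-dd hh:mi:ss format, including the seconds:

Code Snippet
SELECT RTRIM(ISNULL(col1, ''))                      + ',' +
       ISNULL(CONVERT(varchar(19), col2, 120), '')  + ',' +
       ISNULL(CONVERT(varchar(30), col3), '')
FROM   MYTABLE

On problem 3, either utility can write the result to a file; with a query shaped like this the delimiter is already part of the single output column, so no separator flag is needed.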

View 1 Replies View Related

Exporting Numerics In A Delimited File

May 29, 2008



I'm trying, through the SQL 2005 Mgt Studio, to export a simple table in a delimited format. I'm selecting a double quote as the text qualifier. My expectation was that only text type fields would be exported with the double quotes and numerical fields would not have any quotes around them. SQL 2000 does this just fine, but 2005 is exporting all my text type and numeric fields with double quotes. Is this a change in SQL 2005 or am I doing something wrong?

Thanks for the help,
Tim

View 1 Replies View Related

Importing A Comma Delimited File

Apr 12, 2006

Hello...

I have a problem... When I insert data from a comma delimited file using this method (a flat file connection, sorting, merge join, and inserting into the database) I get "" around all the data!! The quotes end up around the column names and everything! I had to go in and manually remove the quotes in the text file to get some of my data conversions to work. I know there is a better way. How do I get SSIS to load the data without the quotes? This is an example of the data in the file:

"1007","1","A","","Congratulations - No Health Violations Found","11/02/2005","1007"

When I remove the quotes I do not have any problems. How do I do this without modifying the underlying data? Any ideas would be greatly appreciated!!

Thank you for your help!

SD

View 3 Replies View Related

Export As Tab Delimited Text File

Dec 5, 2007



Hi,

I am trying to export as a tab delimited text file. For that I have changed my config file as follows:



<Extension Name="TXT" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering">
<OverrideNames>
<Name Language="en-US">TXT (Tab Delimited Text File)</Name>
</OverrideNames>
<Configuration>
<DeviceInfo>
<FieldDelimiter>&#9;</FieldDelimiter>
<Extension>TXT</Extension>
<Encoding>ASCII</Encoding>
<NoHeader>true</NoHeader>
</DeviceInfo>
</Configuration>
</Extension>


I got this code from another one of the MSDN forums. When I run the report and try to export using this format, it still gives me a CSV file instead of a tab delimited file.

Can someone please help me fix this code so I can get tab delimited text files?
Thanks a lot,
-Rohit

View 8 Replies View Related

Flat File Source Column Parsing Error

May 12, 2006

Hello All,



I have come across this issue with the Flat File Source when the delimiter is set to a comma.

"""KAILUA KONA,HI""","CA",

In the data snippet above, with a comma as the column delimiter and a " as the text qualifier, the data will be parsed in this fashion:

"""KAILUA as a column

HI""" as a column

CA as column

when it should be

"KAILUA,HI" as a column

CA as column.



Is there a way to tell the Flat File Source not to parse the data within multiple quotes?



Thank you

Eric Flores

View 5 Replies View Related

Export SQL Server To A Comma Delimited File

Aug 1, 2007

Hi,
I'm trying to deploy my Web site to GoDaddy. They told me I have to export the SQL Server Express database to a comma delimited file and then upload that file. The export procedure is simple in Access but I don't see any way to do it in SQL Server or from Visual Web Developer or Visual Studio.
 Also, I can ask them, but I assume I have to export each table separately and also export the ASPNETDB as well.
Thanks for the help 
 

View 2 Replies View Related

How To Export Records From Sql Database To Tab-delimited File

Jul 7, 2005

Hi,
I want to export data/records coming from the database and save it as a
.txt file but tab-delimited. The flow of my project is something this.

Web Form->SQL Database->Web Report->Tab-Delimited file.

I will explain more..What we want to do is an online application form.
We have a form and will save all the data to sql server database. We
also want to save all those information in a tab-delimited file. I
would like to save this first in the database(no problem in this part).
Then later on export this in tab-delimited file.

If you can give me a little tutorial on this I would really appreciate it. Even 3 records will do, as long as I can see how to do this. Oops, btw, I also want to name the .txt file as (userid+transactionid).

Thank you very much!

View 2 Replies View Related

Importing Tab-delimited File Into SQL Server 2005

Apr 20, 2006

I use SQL Server 2005. I have a tab delimited file which I want to import into my SQL Server database. My SQL Server table setup is:

CountryID int (autogenerated, identity specification)
CountryName nvarchar(40)
CountryAbbreviation nvarchar(3)

In my tab delimited file I have two columns: CountryName and CountryAbbreviation. How can I best solve this?
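A minimal sketch (file path and target table name are assumed): stage the two file columns with BULK INSERT, then insert into the real table and let the identity column generate itself.

Code Snippet
CREATE TABLE #CountryStage (CountryName nvarchar(40), CountryAbbreviation nvarchar(3))

BULK INSERT #CountryStage
FROM 'C:\data\countries.txt'                     -- assumed path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

INSERT INTO dbo.Country (CountryName, CountryAbbreviation)   -- assumed table name
SELECT CountryName, CountryAbbreviation
FROM   #CountryStage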

View 2 Replies View Related

Export Data Into Text File Using ç Delimited

Sep 8, 2005

Hi,
I need to export data from a SQL Server 2000 database into a text file using ç as the delimiter, because my destination database will be Teradata. Could you let me know if you have any method for this?
Thanks

View 1 Replies View Related

Can MSSQL Load A Tab Delimited Text File?

Jul 23, 2005

Hi all... I would like to know if SQL Server can load a tab delimited text file. If yes, how? A search on the web did not turn up a "load data" command like MySQL's or others'. Thank you all.

View 5 Replies View Related

Problem With DTS And Delimited File With Empty Last Column

Jan 22, 2006

Hello,

I'm not getting any response to this on the SQLDTS newsgroup, so I thought that I would try here. I just ran into this problem and I can't find any other mention of it through Google. I have a text file that is comma-delimited. It also uses double quotes as text identifiers. A new column has been added to the file, but currently has no values. I would like to finish my development so that when it does finally get some values, they will be imported as well. The problem is, the last column does not show up in DTS.

I can reproduce this problem easily enough... create a text file with the following two lines in it:

1,"test",
2,"test2",

Now, create a new DTS package and add a text file connection. Point it to the new file and go through the properties for the file. You will notice that on the second screen where it displays the preview of the data there are only two columns shown.

This does not happen if there is no text qualifier or if at least one row has the final column value filled. Is there any way around this problem?

Thanks!
-Tom.

View 2 Replies View Related






