Creating A New SQL Table By Importing In A Delimited File
Feb 17, 2008
How do I do this?
I cannot find any facility like there is in Access for getting external data.
I have a text file I am trying to import to a table. This text file is in a tab delimited format. I am using DTS to import the data to a new table I made. The fields are varchar and are set to allow nulls & allow 8,000 characters per field.
The error I am getting is that the data exceeds the allowed amount (or something like that) in col4.
Now I have checked everything in column 4 and nothing exceeds 5,000 spaces/characters combined. I have checked the entire sheet (in Excel) as well, and there is not one single column/row/cell that exceeds 5,000 spaces/characters combined.
What the heck could be causing SQL to tell me I am trying to import too much data in one column when there is nothing that even comes close to 8,000 characters & spaces combined?
I am using the BULK INSERT command to import a CSV delimited text file into a table, and I am having problems with the quote (") text qualifiers. The command below works, but it takes in all the quotes as well, and the comma field delimiter only works when commas appear purely as separators. If there is a comma within an address field, for example, the data gets imported into the wrong fields. What can I use to identify that the text qualifier is "? I don't see where the BULK INSERT command lets me specify this. Is there another command I can use, or am I using this command incorrectly? I thank you in advance for any response or suggestion you may have.
BULK INSERT AdventureWorks.dbo.MbAddress
FROM 'a:mbAddress.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR=',',
ROWTERMINATOR='',
CODEPAGE = '1252',
KEEPIDENTITY,
KEEPNULLS,
FIRSTROW=2)
Here is a sample ASCII file I am importing as well; you can see that record 6330 has an extra comma in the address line.
"AddressAutoID","Memkey","Type","BadAddress","Address1","Address2","Address3","City","State","Zip","Foreign","CarrierRoute","Dpbc","County","CountyNo","ErrorCode","ChangeDate","UserID"
6317,26517,1,0,"1403 W. Kline Ave","","","MILWAUKEE","WI","53221","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6318,26225,1,0,"501 Dunford Dr","","","BURLINGTON","WI","53105","","",0.00,"RACINE",101,"",1/25/2006 0:00:00,"admin"
6319,20101,1,0,"2115 Cappaert Rd #35","","","MANITOWOC","WI","54220","","",0.00,"MANITOWOC",71,"",1/25/2006 0:00:00,"admin"
6320,23597,1,0,"728 Woodland Park Dr","","","DELAFIELD","WI","53018","","",0.00,"WAUKESHA",133,"",1/25/2006 0:00:00,"admin"
6321,23392,1,0,"7700 S. 51st St","","","FRANKLIN","WI","53132","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6322,26537,1,0,"W188 S6473 GOLD DRIVE","","","MUSKEGO","WI","53150","","",0.00,"WAUKESHA",133,"",1/26/2006 0:00:00,"admin"
6323,25953,1,0,"3509 N. Downer Ave","","","MILWAUKEE","WI","53211","","",0.00,"MILWAUKEE",79,"",1/26/2006 0:00:00,"admin"
6324,19866,1,0,"10080 E. Mountain View Lake Rd. #145","","","SCOTTSDALE","AZ","85258","","",0.00,"MARICOPA",13,"",1/27/2006 0:00:00,"admin"
6325,25893,1,0,"W129 N6889 Northfield Dr. Apt 114","","","MENOMONEE FALLS","WI","53051-0517","","",0.00,"WAUKESHA",133,"",1/27/2006 0:00:00,"admin"
6326,26569,1,0,"8402 64th Street","","","KENOSHA","WI","53142-7577","","",0.00,"KENOSHA",59,"",1/27/2006 0:00:00,"admin"
6327,24446,4,0,"83 Sweetbriar Br","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",1/30/2006 0:00:00,"admin"
6328,19547,1,0,"4359 MERCHANT AVENUE","","","SPRING HILL","FL","34608","","",0.00,"HERNANDO",53,"",2/8/2006 0:00:00,"admin"
6329,26524,1,0,"264 Lakeridge Drive","","","OCONOMOWOC","WI","53066","","",0.00,"WAUKESHA",133,"",2/10/2006 0:00:00,"admin"
6330,23967,1,0,"3423 HICKORY ST","100 Tangerine Blvd., Brownsville, TX 78521-4368","Texas Phone Number: 956-546-4279","SHEBOYGAN","WI","53081","","",0.00,"SHEBOYGAN",117,"",2/15/2006 0:00:00,"admin"
6331,25318,1,0,"3960 S. Prairie Hill Lane Unit 107","","","Greenfield","WI","53228","","",0.00,"MILWAUKEE",79,"",2/20/2006 0:00:00,"admin"
6332,24446,1,0,"83 Sweetbriar BR","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",2/21/2006 0:00:00,"admin"
6333,26135,1,0,"P.O. Box 8 127 Main Street","","","CASCO","WI","54205","","",0.00,"KEWAUNEE",61,"",2/21/2006 0:00:00,"admin"
Hi all,
While importing a tab delimited file, it seems SSIS interprets the incoming column as 50 characters in length even though it is far smaller. Any ideas how this could be?
Any help would be appreciated.
I have a text file that is delimited by nulls. Any idea on the best way to get this into a SQL Server table?
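In case it helps, BULK INSERT (and bcp) accept '\0' as a terminator meaning the null character, so a hedged sketch for a null-delimited file might look like the following; the table name, path, and row terminator are assumptions.
BULK INSERT dbo.ImportTarget
FROM 'C:\data\nulldelimited.txt'
WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\n')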
Hello...
I have a problem... When I insert data from a comma delimited file using this method (a flat file connection, sorting, a merge join, and inserting into the database) I get "" around all the data!! The quotes end up around the column names and everything! I had to go in and manually remove the quotes in the text file to get some of my data conversions to work. I know there is a better way. How do I get SSIS to load the data without the quotes? This is an example of the data in the file:
"1007","1","A","","Congratulations - No Health Violations Found","11/02/2005","1007"
When I remove the quotes I do not have any problems. How do I do this without modifying the underlying data? Any ideas would be greatly appreciated!!
Thank you for your help!
SD
I use SQL Server 2005... I have a tab delimited file which I want to import into my SQL Server database. My SQL Server table setup is:
CountryID int (autogenerated, identity specification)
CountryName nvarchar(40)
CountryAbbreviation nvarchar(3)
In my tab delimited file I have two columns: CountryName and CountryAbbreviation. How can I best solve this?
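A minimal sketch of one way to do this, assuming the file is called countries.txt, has no header row, and lists CountryName then CountryAbbreviation; the target table name dbo.Country and the path are assumptions. The IDENTITY value for CountryID is generated automatically during the final INSERT.
CREATE TABLE #CountryStage
(
CountryName nvarchar(40),
CountryAbbreviation nvarchar(3)
)

BULK INSERT #CountryStage
FROM 'C:\data\countries.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

INSERT INTO dbo.Country (CountryName, CountryAbbreviation)
SELECT CountryName, CountryAbbreviation
FROM #CountryStage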
I am struggling with using bcp to import data. Here are my steps:
1. I created a Test database on my localhost
2. In the Test database, I created a Test table, the query is here for your convenience:
CREATE TABLE [dbo].[Test](
[id] [int] IDENTITY(1,1) NOT NULL,
[network_group_name] [varchar](128) NULL,
[IP] [varchar](15) NULL,
[OS] [varchar](128) NULL,
[Code] ....
I then create the format file used in bcp:
bcp Test.dbo.Test format nul -c -t, -f C:RXieSQLTest.fmt -T
Here is the format file:
9.0
8
1 SQLCHAR 0 12 "," 1 id ""
2 SQLCHAR 0 128 "," 2 network_group_name SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 15 "," 3 IP SQL_Latin1_General_CP1_CI_AS
[Code] ...
The data file is called 20150902FullTest.rpt and the first couple of lines (the first line is the header, followed by two data rows) are posted here:
network_group_name,IP,OS,App_Name,vuln_name,host_score,recordswritten
Domestic,10.216.56.88,Windows XP SP3,Adobe / Macromedia Flash Player,APSB14-17: Adobe Flash Player CVE-2014-0537 Vulnerability,4350,2015-09-01 09:55:07.720
Domestic,10.216.56.88,Windows XP SP3,Adobe / Macromedia Flash Player,APSB14-17: Adobe Flash Player CVE-2014-0539 Vulnerability,4350,2015-09-01 09:55:07.720
With the format file and the data file, I use the following bcp command:
bcp Test.dbo.Test in C:RxieSQL20150902FullTest.rpt -f C:RxieSQLTest.fmt -T
I got the following error messages:
Starting copy...
SQLState = 22005, NativeError = 0
Error = [Microsoft][SQL Native Client]Invalid character value for cast specification
SQLState = 22005, NativeError = 0
Error = [Microsoft][SQL Native Client]Invalid character value for cast specification
[Code] ...
I do want to mention that the .rpt file contains three BOM bytes (EF BB BF) at the beginning of the file.
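The mismatch is the likely culprit: the format file describes 8 fields, but the .rpt file has only 7 (there is no id value in the file), so the data shifts one column and the int id column receives character data, hence the cast error. A hedged fix is a 7-field format file that maps the file's columns to server columns 2 through 8 and lets the IDENTITY be generated; the widths below and the names of the later table columns are assumptions, since the CREATE TABLE above is truncated. Skipping the header with -F 2 should also keep the BOM bytes out of the data, since they sit in the discarded first row.
9.0
7
1 SQLCHAR 0 128 ","      2 network_group_name SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 15  ","      3 IP                 SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 128 ","      4 OS                 SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 128 ","      5 App_Name           SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 256 ","      6 vuln_name          SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 12  ","      7 host_score         ""
7 SQLCHAR 0 30  "\r\n"   8 recordswritten     ""

bcp Test.dbo.Test in 20150902FullTest.rpt -f Test.fmt -F 2 -T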
I am using SQLServer 8.
I have several files that were FTP'd from our legacy COBOL system. I am in the process of cleaning them up and saving them into CSV format.
I was told that I could use the BCP utility to import them into a SQL database, but the documentation I have is not real clear on that.
Does anyone have any suggestions for how to import these files? I have created the database but not the tables.
Thanks.
Hi
I am new to VB, and am looking to write some code to import a delimited (by a ~) .txt file into a SQL Server table. It doesn't need to append, just to totally overwrite the table. Is this possible? I have been looking at BULK INSERT, but this doesn't seem to be quite right. Can anyone help?
Cheers,
S
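For the T-SQL side, a hedged sketch of what the VB code could execute is below; the table name and path are placeholders, and TRUNCATE TABLE gives the "totally overwrite" behaviour before the load.
TRUNCATE TABLE dbo.ImportTarget

BULK INSERT dbo.ImportTarget
FROM 'C:\data\import.txt'
WITH (FIELDTERMINATOR = '~', ROWTERMINATOR = '\n')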
Hi,
Could you help me write a script to import a CSV delimited text file into a SQL Server table?
Thanks,
carlos
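A minimal sketch, assuming a comma-delimited file with a header row and no embedded commas inside the values; the table name and path are placeholders.
BULK INSERT dbo.MyImportTable
FROM 'C:\data\myfile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)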
Hi,
I am new to SSIS but I have an average working knowledge of SQL.
My problem is as follows. I have a pipe delimited text file in some folder, and the number of columns and the names of the columns are not consistent. It can have any number of columns and any column names. I need to load this text file data into a SQL table. All I want is to load this file into the SQL database under some temporary name. Once I have the table in the SQL database, I can match the column names of the target table and this temp table and push only those columns which match the target table. For this I can frame dynamic SQL; that part is clear to me.
Now the problem is that I developed an SSIS package to push the text file to a SQL table, and I am able to do this. But if I change the column names or add a new column, SSIS is not able to push the new columns. Is this functionality available in SSIS? Can it be made dynamic like this?
I hope I am clear about my problem... if you need any clarification please let me know.
thanks in advance
Mike
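For the dynamic-SQL step the poster describes, a hedged sketch is below; Staging_Import and TargetTable are hypothetical names, and the query simply pushes whichever staging columns also exist in the target table.
DECLARE @cols nvarchar(max), @sql nvarchar(max)

-- Columns present in both the staging table and the target table
SELECT @cols = STUFF((
    SELECT ',' + QUOTENAME(s.COLUMN_NAME)
    FROM INFORMATION_SCHEMA.COLUMNS s
    JOIN INFORMATION_SCHEMA.COLUMNS t
      ON t.COLUMN_NAME = s.COLUMN_NAME
     AND t.TABLE_NAME = 'TargetTable'
    WHERE s.TABLE_NAME = 'Staging_Import'
    FOR XML PATH('')), 1, 1, '')

SET @sql = 'INSERT INTO dbo.TargetTable (' + @cols + ') SELECT ' + @cols + ' FROM dbo.Staging_Import'
EXEC sp_executesql @sql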
Hi,
I have a tab delimited file I need to transfer to a table using SSIS. Columns can have NULL values and there might be extra tabs in each row as well. How can I do this? Maybe a fuzzy lookup?
Thanks
Hi all-
I am in need of some help importing a .CSV file into a SQL Server 2005 Enterprise Edition.
The problem is that I already implemented a Bulk Insert task in SSIS but it is not importing any data. My detailed layout is as follows:
In SSIS package1 -
In the Control Flow a Bulk Insert Task has been inserted
Properties of the Bulk Insert Task:
Connection: adtc009d.ganny
Destination Table: ganny.dbo.t4
Format: Specify
Row Delimiter: {CR}
Column Delimiter: Comma {,}
Source Connection File: r.csv
Options: Check Constraints
MaxErrors: 20
This Bulk Insert task is connected to a Data Flow task. If we edit the Data Flow task, the data flow section comes up; there a Flat File source and an OLE DB Destination are present, and the Flat File source is connected to the OLE DB Destination.
Properties of the Flat File source:
Connection Manager: a Flat File connection manager, created here by clicking New and linking the flat file properties to it
Preview: by clicking Preview, all the data is visible
Properties of the OLE DB Destination editor:
OLE DB connection manager: adtc009d.ganny
Data access mode: Table or view - fast load
Name of table or view: dbo.t4
After designing all this, when I start debugging I am able to get records imported into a table. Please suggest where I am going wrong.
Thanks in advance
Karna
Hi to all
I am working on an import module.
Is there any direct query to import a text (CSV) file into a database table?
Can anyone help me with this?
Hello all,
I am trying to customize the CSV rendering extension with different device information settings. I would like to change the CSV extension to a tab delimited output. However, whenever I add a new extension line to the RSReportServer.config file I run into problems.
1. What is the correct character to use in the delimiter to specify a Tab character? I seem to be unable to get this right.
2. Why, when I override the name in the rendering extension, does it still show up as the CSV (comma delimited) name in the export dropdown?
Here is my customized rendering extension.
<Extension Name="TXT" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering">
<OverrideNames>
<Name>TXT (Tab Delimited Text File)</Name>
</OverrideNames>
<Configuration>
<DeviceInfo>
<Encoding>ASCII</Encoding>
<FieldDelimiter><![CDATA[#x9]]></FieldDelimiter>
<NoHeader>true</NoHeader>
</DeviceInfo>
</Configuration>
</Extension>
Thanks
Justin
I hope this is the right forum for my question.
I'm developing a website for a prepaid calling cards distributor. Each of the cards they sell has a list of the countries the card is good for. I need to import this data into my countries_rates table. The file they are giving me is an Excel file that contains 3 columns (fields):
1- Country-Name
2- Rate
3- Card_$_Price
These files contain approximately 400 rows, so it will be a hassle to have to insert them manually every week.
In my web application I need to create a form where the user will select the card from a dropdownlist and then find the excel file to be imported for that card.
I would like to know how to do that with Visual Studio 2005, SQL 2005 and C#.
Please direct me to some links where I can learn how to do this, or send me some code snippets so I can see how it is done.
Tia
Charles
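One hedged server-side option (rather than parsing the workbook in C#) is OPENROWSET with the Jet provider, which works on 32-bit SQL Server 2005 once 'Ad Hoc Distributed Queries' is enabled; the sheet name, upload path, and the countries_rates column names below are assumptions. The C# page would then only need to save the uploaded file and run this statement with the saved path.
INSERT INTO dbo.countries_rates (CountryName, Rate, CardPrice)
SELECT x.[Country-Name], x.[Rate], x.[Card_$_Price]
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;HDR=YES;Database=C:\uploads\rates.xls',
                'SELECT * FROM [Sheet1$]') AS x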
Hello
I am writing a program in VC++ using SQL-DMO calls. My problem is that when I transfer (import) a comma separated text file into SQL Server through the SQL-DMO ImportData method of the BulkCopy object, it is not able to convert the date field in the text file to the corresponding datetime value in SQL Server, whereas the other data types work perfectly.
This is the record I need to convert:
90,MichaelB,Wintriss,Inspection,Paper,11,Job101,1, {ts '2000-12-10 15:54:56.000'},D:public233 and 247233.mcs,
and this is the date field
{ts '2000-12-10 15:54:56.000'}
Whereas if I export a table from SQL Server in binary mode and then import the file back it works, but when I do it as text it gives the above error.
Please help me with this; I would be very thankful to you.
Note: I am using SQL Server 7.0 version
Regards
Jitender Singh
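A hedged workaround (not a SQL-DMO fix as such) is to bulk copy the timestamp column into a varchar staging column and convert it afterwards, stripping the ODBC {ts '...'} escape in T-SQL; the table and column names here are placeholders.
-- RawTs holds strings such as {ts '2000-12-10 15:54:56.000'}
UPDATE dbo.JobStaging
SET JobDate = CONVERT(datetime,
        SUBSTRING(RawTs,
                  CHARINDEX('''', RawTs) + 1,
                  CHARINDEX('''', RawTs, CHARINDEX('''', RawTs) + 1) - CHARINDEX('''', RawTs) - 1),
        121)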
Hi There, I am looking for information on how to import a txt or csv file into multiple tables in SQL 2000. If you have any kind of information, please let me know whether we can do this and how. Below is the detailed information.
I receive a txt file every day which contains information for 3 different related tables in my SQL 2000 database. Right now we are keying the information from the web site (which is linked to the txt file) into our database, but I am wondering if we can import those records into the tables.
The header of the file goes to table1, and when we insert the record into table1 it should generate the auto identity record (PK), and that PK is linked to the other tables, table2 and table3, where the rest of the information from the txt file goes. For table2 and table3 there are multiple records per txt file.
In our txt file each row is separated with a row header, like HTC100 WITH ROW NO. 1,2,3..., which indicates which table this information goes to, and 1,2,... are the different row numbers.
Please let me know whether we can achieve this task or not. Thanks for all your help in advance.
Indra.
I have pasted my txt file below:
========
"FHS000",20041029,0900,,"10",1,"TRAILB10_20041029_1B",3,"2.20","Pason-DataHub",,"#Well 1098831406 Tour 2004/10/29 Trailblazer 10 148",1,"EDR_3-0-10_HF2ETS 2.2""CON000",1,0000,0759"CON000",2,0800,1559"CON000",3,1600,2359"HWI010","0312857","COMPTON BRANT 15-7-18-24","COMPTON PETROLEUMCORP.","TRAILBLAZER DRILLINGCORP.","15-07-018-24W4","100/15-07-018-24W4/00","HANKPARANYCH","CURTIS FIESEL",20041029,,,"10",20041027,0600,,,"148","DD04485","VERT.","NO",,"HCO030",1,"Daily Walk Around Inspection","HP","CF""HCO030",2,"Detailed Inspection - Weekly (using checklist)","HP","CF""HCO030",3,"H2S Signs Posted (if required)",,"HCO030",4,"Well License & Stick Diagram Posted","HP","CF""HCO030",5,"Flare Lines Staked","HP","CF""HCO030",6,"BOP Drills Performed","HP","CF""HCO030",7,"Visually Inspect BOP's - Flarelines and DegasserLines","HP","CF""HDC040",1,"Rig Site Health and Safety Meeting (one/crew/month)","CF""HDC040",2,"C.A.O.D.C. 
Rig Safety Inspection Checklist(one/rig/month)","CF""HDC040",3,"Mast Inspection Before Raising or Lowering","CF""HDC040",4,"Crown Saver Checked","CF""HDC040",5,"Motor Kills Checked","CF""HFU050",2300,2100,,"HWE060",-5,"Deg C","COOL","WEST","SLIPPERY",,"HCS070",1,177.8,,"mm",25.3,"STELCO","J-55",8,108.44,3.84,108.44,"HCS070",2,114.3,,"mm",14.14,"STELCO","J-55",72,979.50,3.84,979.0,"HDP080",1,127,79.4,"kg/m","E",57,127,"mm","3 1/2 IF",10,"DC","HDP080",2,89,19.7,"kg/m","E",68,120,"mm","3 1/2 IF",15,"DP","HPU090",1,"F-800","EMSCO",254,"mm",,,,"HPU090",2,"F-800","EMSCO",254,"mm",,,,"HTC100",1,"Rig up and tear down""HTC100",2,"Drill Actual""HTC100",3,"Reaming""HTC100",4,"Coring""HTC100",5,"Condition Mud & Circulate""HTC100",6,"Trips""HTC100",7,"Rig Service""HTC100",8,"Repair Rig""HTC100",9,"Cut off drilling line""HTC100",10,"Deviation Survey""HTC100",11,"Wire Line Logs""HTC100",12,"Run Case & Cement""HTC100",13,"Wait on Cement""HTC100",14,"Nipple up B.O.P.""HTC100",15,"Test B.O.P.""HTC100",16,"Drill Stem Test""HTC100",17,"Plug Back""HTC100",18,"Squeeze Cement""HTC100",19,"Fishing""HTC100",20,"Directional Work""HTC100",21,"Safety Meeting""HTC100",24,"WOD""HSS110",1,1,"SWACO","N","110",,"84",,"HPA130","COMPTON BRANT 15-7-18-24",20041029,"COMPTON PETROLEUMCORP.","TRAILBLAZER DRILLING CORP.","CURTISFIESEL","10","ALBERTA","N",253"TCP130",1,,,,"kPa",140,,,,"mm",,"TCP130",2,,,,"kPa",140,,,,"mm",,"TCP130",3,,,,"kPa",140,,,,"mm",,"TTL160",1,1,0.00,0.25,0.25,21,"SAFETY MEETING WITH TONG HAND""TTL160",1,2,0.25,1.75,1.50,12,"RIG TO AND RUN CASING""TTL160",1,3,1.75,2.00,0.25,7,"RIG SERVICE""TTL160",1,4,2.00,2.50,0.50,5,"CONDITION MUD & CIRC.""TTL160",1,5,2.50,2.75,0.25,21,"SAFETY MEETING WITH CEMENTERS""TTL160",1,6,2.75,3.50,0.75,12,"RIG TO AND CEMENT CASING""TTL160",1,7,3.50,6.00,2.50,1,"SET SLIPS, TEAR OUT RIG, CLEAN TANKS""TTL160",1,8,6.00,8.00,2.00,24,"WAIT ON DAYLIGHT/TRUCKS""TTL160",1,9,,,,,"CEMENT WITH BJ USING 13 TONNES OF BVF-1500 NP + .7%FL-5,GIVING 15.5 m3 OF GOOD""TTL160",1,10,,,,,"SLURRY @ 1718 kg/m3,PLUG BUMPED & HELD @ 03:30 HRSOCT 29/04.""TTL160",1,11,,,,,"RIG RELEASED @ 08:00 HRS OCT 29/04""TTL160",1,12,,,,,"MOVE TO 12-3-18-25W4""TDI170",1,"JEFF CASE",8,10,475,"Deg C",,,"RUNNING CASING",,,,,"TLN175",1,"VISUALLY INSPECT PINS, RAMS AND STOOLS PRIOR TO LAYINGOVER DERRICK""TPA180",1,1,"DRILLER",647172865,"JEFF CASE",8,,,"JC""TPA180",1,2,"DERRICK HAND",648519056,"BRYAN VANHAM",8,,,"BV""TPA180",1,3,"MOTOR HAND",651056533,"NEIL WILLIAMS",8,,,"NW""TPA180",1,4,"FLOOR HAND",640352662,"TARAS WOITAS",8,,,"TW""TPI190",1,"REG",25,,,,,,"TPI190",2,"REG",25,,,,,,"TPI190",3,"REG",25,,,,,,=====
Hi,
I have a complex query where each row in the final dataset is a product. However, each product has a number of authors associated with it. What I would like to do is have a query/subroutine join the authors to the product, as a string:
ProductID  Title                       Authors
1          The Sacred and the Profane  John Rieggle, George Alexi
2          Dancing in the Dark         Dan Brown, Peter Kay, Paul Dwebinski
Products Table
==============
ProductID
Title
Authors Table
=============
AuthorID
Name
Product Authors Table
=====================
AuthorID
ProductID
Is this at all possible?
Thanks
jr.
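On SQL Server 2005 and later, a hedged sketch of the concatenation uses FOR XML PATH; the table names follow the schemas above, with the space in "Product Authors" dropped for the example.
SELECT p.ProductID,
       p.Title,
       STUFF((SELECT ', ' + a.Name
              FROM ProductAuthors pa
              JOIN Authors a ON a.AuthorID = pa.AuthorID
              WHERE pa.ProductID = p.ProductID
              FOR XML PATH('')), 1, 2, '') AS Authors
FROM Products p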
Hi All
This is my first official post...very exciting:)
OK....
I need to split a string, based on comma delimiters, into columns of a table.
I am using SQL Server 2005.
The plan is to use webMethods (integration software) to receive a flat file, loop through the records in that flat file, and during each iteration of the loop call a SQL stored procedure that will split the record, based on comma delimiters, into columns of a table.
A typical record will look like this:
AP05-07,ACTUAL,ZA01,......
I have looked at some of the past solutions to this type of post, but am battling to understand....
So if its possible, is there a simple way to create a stored procedure that will do this?
Many thanks
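A hedged sketch of such a procedure is below. It splits on commas with CHARINDEX (SQL Server 2005 has no built-in split function) and then pivots the pieces into columns; the target table dbo.ImportedRecords and its column names are assumptions, and quoted values containing commas are not handled.
CREATE PROCEDURE dbo.usp_SplitRecord
    @Record varchar(8000)
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @pos int
    DECLARE @parts TABLE (Ordinal int IDENTITY(1,1), Value varchar(8000))

    -- Peel off one comma-delimited piece per loop iteration
    WHILE LEN(@Record) > 0
    BEGIN
        SET @pos = CHARINDEX(',', @Record)
        IF @pos = 0
        BEGIN
            INSERT @parts (Value) VALUES (@Record)
            SET @Record = ''
        END
        ELSE
        BEGIN
            INSERT @parts (Value) VALUES (LEFT(@Record, @pos - 1))
            SET @Record = SUBSTRING(@Record, @pos + 1, LEN(@Record))
        END
    END

    -- Pivot the pieces into the columns of the (hypothetical) target table
    INSERT INTO dbo.ImportedRecords (Period, RecordType, Entity)
    SELECT MAX(CASE Ordinal WHEN 1 THEN Value END),
           MAX(CASE Ordinal WHEN 2 THEN Value END),
           MAX(CASE Ordinal WHEN 3 THEN Value END)
    FROM @parts
END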
I'm getting the following error when trying to import an Excel file into SQL. I'm using SQL Server 2014 Express.
- Validating (Error)
Messages
Error 0xc00470b6: Data Flow Task 1: The LocaleID 0 is not installed on this system. (SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task 1: "Source - Sheet1$" failed validation and returned validation status "VS_ISBROKEN". (SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task 1: One or more component failed validation. (SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task 1: There were errors during task validation. (SQL Server Import and Export Wizard).
SQL7 SP3
Hi.
I have a table in which I want to create a delimited list of values from one field which I will be using for validation.
How can I do this without using a cursor to build the string? The SQL would be something like:
SELECT *
FROM myCrossRefTable
WHERE SourceTable = 'FieldValueList'
I'm looking to return one string like:
~value1~value2~value3~value4~value5
Thanks,
Craig
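A hedged sketch without a cursor is the classic variable-concatenation SELECT, which works on SQL Server 7.0; the column name FieldValue is an assumption, and the technique relies on undocumented ordering behaviour, so it suits small validation lists rather than anything order-sensitive.
DECLARE @list varchar(8000)
SET @list = ''

SELECT @list = @list + '~' + FieldValue
FROM myCrossRefTable
WHERE SourceTable = 'FieldValueList'

SELECT @list  -- ~value1~value2~value3~value4~value5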
I have a need to create a delimited string so that I can use this to create a data driven subscription on SSRS.
In the below code, I need to create a delimited string using the branch number, grouped by the email address
USE tempdb
GO
IF OBJECT_ID('tempdb..#emails') IS NOT NULL
BEGIN
DROP TABLE #emails
[code]....
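Since the rest of the script is truncated, here is only a hedged sketch of the grouped string, assuming #emails has EmailAddress and BranchNumber columns and that a semicolon-delimited list is wanted per address.
SELECT e.EmailAddress,
       STUFF((SELECT ';' + CAST(e2.BranchNumber AS varchar(10))
              FROM #emails e2
              WHERE e2.EmailAddress = e.EmailAddress
              FOR XML PATH('')), 1, 1, '') AS BranchList
FROM #emails e
GROUP BY e.EmailAddress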
How do I import a text file with a list of NI numbers into a new table with a column listing all the NI numbers? I think I use the SELECT INTO clause, but I am not sure how to do this.
I'm importing comma-delimited text files into a SQL table. The data imports in a seemingly random order: one time I import and the lines appear one way, and the next time I import they come in another way.
Is there a way to force the text files to import in the same order the data is found in the file?
Can we do that, and how, please?
The purpose is to avoid creating the SQL Server table structure manually when we already have an xsd file.
Is it possible to create a schema or table in SQL Server from a dbf file instead of manually creating it?
Regards
Karen
I used the bcp utility to send data to an output file in tab-delimited format (-t), but the header is a separate entity in this query.
When I set FILEheader = firstname,lastname... what must I use to change the comma to a tab in the header string? I have tried various ways: {t}, [-t], and others. What am I missing?
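If the header string is being built in T-SQL, the tab character is CHAR(9); a hedged sketch (variable name assumed) is:
DECLARE @FILEheader varchar(200)
SET @FILEheader = 'firstname' + CHAR(9) + 'lastname'
SELECT @FILEheader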
Is there a faster way to create my pipe delimited BCP file, besides creating a format file? Actually, my problem is that I am having an issue with the file. It looks perfect, like:
Code Snippet
Marie|32|brown|single
Gay|33|black|married
But when I load it into DataStage it puts the entire row into one column. I already specified | as the delimiter in DataStage. I think the issue is with the column collation. If my data is as simple as my example above, what column collation should I use for the format file? Currently, I have something like:
Code Snippet
8.0
4
1 SYBCHAR 0 4 "|" 1 emp_id ""
2 SYBCHAR 0 4 "|" 2 emp_cand_id ""
3 SYBCHAR 0 4 "|" 3 emp_statusid ""
4 SYBCHAR 0 4 "
" 4 emp_type SQL_Latin1_General_CP1_CI_AS
I generated this through prompts given by BCP for each column. Then I changed the 2nd column values all to SYBCHAR and 0 prefix length..
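As a hedged alternative to maintaining the format file at all, bcp can write the pipe-delimited file directly in character mode; the database, table, and output path below are placeholders.
bcp YourDb.dbo.Employees out C:\export\employees.txt -c -t"|" -r"\n" -T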
I have a tab delimited file with 122 columns. Can anyone let me know if there is a better way of parsing/extracting a few columns (say about 15) from the file and loading them into a table using SSIS?
Hi, I'm pretty new to using Microsoft Visual C# .NET and I want to upload a comma delimited text file from my local machine into a table in a SQL Server database through a web app. How would I go about programming this and what controls do I need? Any help would be much appreciated. Thanks in advance.
I'm trying to upload a small Web application with a one-table database. The hosting company, GoDaddy, requires that I upload the database as a comma delimited file.
I created the database in Visual Web Developer Express but also have Visual Studio and SQL Server Management Studio Express.
I can't figure out how to export the database into a comma delimited file using any of these tools.
This should be simple like it is in Access but that doesn't seem to be the case. This is holding up deploying my Web Application.
Can anyone help me?
Thanks
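One hedged route is the bcp utility, which exports a table to a flat file in one command; the database, table, output path, and the .\SQLEXPRESS instance name below are assumptions. Note that plain bcp output does not quote values, so any embedded commas in the data would need extra handling.
bcp MyWebDb.dbo.MyTable out C:\export\mytable.csv -c -t, -T -S .\SQLEXPRESS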