Importing a CSV File Into a Database Table
Aug 12, 2007
Hi to all,
I am working on an import module.
Is there any direct query to import a text (CSV) file into a database table?
Can anyone help me with this?
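A minimal T-SQL sketch of the usual answer, assuming an existing table named dbo.ImportTarget and a local file path (both placeholders):

BULK INSERT dbo.ImportTarget
FROM 'C:\import\data.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter
    ROWTERMINATOR = '\n',    -- row delimiter
    FIRSTROW = 2             -- skip the header row if the file has one
);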
Good morning,
I want to load data that I receive every day from my customers, in .xls (Excel) or .csv format, into the database I have created for this purpose. But when trying to do that with SSIS, I get an error message saying that I can't import redundant data into my database column.
Best regards.
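One hedged way around a duplicate-key error of that kind is to stage the daily file in a separate table first and copy across only the rows that are not already in the destination; all table and column names below are placeholders:

INSERT INTO dbo.Customers (CustomerCode, CustomerName)
SELECT s.CustomerCode, s.CustomerName
FROM dbo.Customers_Staging AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Customers AS c
                  WHERE c.CustomerCode = s.CustomerCode);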
Hi all-
I am in need of some help importing a .CSV file into SQL Server 2005 Enterprise Edition.
The problem is that I have already implemented a Bulk Insert Task in SSIS, but it is not importing any data. My detailed layout is as follows:
In SSIS Package1:
In the Control Flow, a Bulk Insert Task has been added.
Bulk Insert Task properties:
Connection: adtc009d.ganny
Destination Table: ganny.dbo.t4
Format: Specify
Row Delimiter: {CR}
Column Delimiter: Comma {,}
Source Connection File: r.csv
Options: Check Constraints
MaxErrors: 20
This Bulk Insert Task is connected to a Data Flow Task. If we edit the Data Flow Task, the data flow section opens; it contains a Flat File Source and an OLE DB Destination, with the Flat File Source connected to the OLE DB Destination.
Flat File Source properties:
Connection Manager: a new Flat File Connection Manager, created via the New link, points to the file.
Preview: clicking Preview shows all the data.
OLE DB Destination Editor properties:
OLE DB Connection Manager: adtc009d.ganny
Data access mode: Table or view - fast load
Name of table or view: dbo.t4
After designing all this, when I start debugging I still cannot get the records imported into the table.
Please suggest where I am going wrong.
Thanks in advance
Karna
Hello Everyone,
I would like to import a text file which contains one string (a large integer) per line not separated by commas or anything else except a carriage return. Does anyone know of an easy way to store this in a database file? I'm open to suggestions if there is more than one way to save this kind of information within a database. I have SQL server 2005 developer edition if that helps in any way. I'm also starting to learn about Linq so if there is some other way you would store this information for that purpose I would love to hear about that as well. C# code is preferable, but I can use the automatic translators if that's all you have. By the way, I'm a newbie to this subject (if you couldn't tell). Thanks in advance.
Robert
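A minimal sketch of one option, assuming the values fit in BIGINT (a VARCHAR column would work instead if they can be larger) and a placeholder file path:

CREATE TABLE dbo.BigNumbers (Value BIGINT NOT NULL);

BULK INSERT dbo.BigNumbers
FROM 'C:\import\numbers.txt'
WITH (ROWTERMINATOR = '\n');   -- newline-delimited; each line holds one value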
Hi, I am very new to SSIS. I am trying to set up a package that I can run every hour or so; it will look for all Excel files in a certain folder and import them into a table on a SQL Server.
I managed to get it working, but my problem is that my data is survey answers and some columns contain a comment.
I get these files on a weekly basis, and some weeks the length of the longest comment makes SSIS want to use a different data type for the comment column (sometimes it wants DT_NTEXT, other times it wants DT_WSTR).
As this column is filled out by a human, I guess apostrophes and other characters may affect this as well.
When I made the data flow task, I used the wizard on a file which contained about 8 weeks' worth of data. When I use one week's worth of data where the comment length is very low, the task gives a validation error saying the metadata is out of sync. If I go back and set the data type for that column to DT_WSTR and rerun the task, it works, but when it tries to process a different week's worth of data it fails again.
Here is an example of an error I get when it thinks the data type is wrong:
[Source - 'raw data$' [1]] Error: The output column "Question ID50# (Verbatim)" (439) on the error output has properties that do not match the properties of its corresponding data source column.
I played around with the data types for a while and managed to get it to process the first file and then try the second file; the second file got past validation but then gave this error:
[Source - 'raw data$' [1]] Error: Failed to retrieve long data for column "Question ID3# (Verbatim)".
Is there a way to make it recalculate the data types by itself for each Excel file?
I am stuck trying to figure this one out.
Sorry if I haven't provided enough information; I am not sure which direction to head with this.
I hope this is the right forum for my question.
I'm developing a website for a prepaid calling cards distributor. Each of the cards they sell has a list of the countries the card is good for. I need to import this data into my countries_rates table. The file they are giving me is an Excel file that contains 3 columns (fields):
1- Country-Name
2- Rate
3- Card_$_Price
These files contain approximately 400 rows, so it would be a hassle to insert them manually every week.
In my web application I need to create a form where the user will select the card from a dropdownlist and then find the excel file to be imported for that card.
I would like to know how to do that with Visual Studio 2005, SQL Server 2005 and C#.
Please direct me to some links where I can learn how to do this, or send me some code snippets so I can see how it is done.
Tia
Charles
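Not a full walkthrough, but one hedged T-SQL route the page's C# code could call from a stored procedure (the provider string, sheet name, file path, and @CardID parameter are all assumptions, and 'Ad Hoc Distributed Queries' has to be enabled on the server):

INSERT INTO dbo.countries_rates (CardID, CountryName, Rate, CardPrice)
SELECT @CardID, x.[Country-Name], x.[Rate], x.[Card_$_Price]
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;HDR=YES;Database=C:\uploads\rates.xls',
                'SELECT * FROM [Sheet1$]') AS x;

The C# form would save the uploaded file to a known path, then pass that path and the selected card's id into a procedure built around a statement like this.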
Hello
I am writing a program in VC++ using SQL-DMO calls. My problem is that when I transfer (import) a comma-separated text file into SQL Server through the SQL-DMO ImportData method of the BulkCopy object, it is not able to convert the date field in the text file to the corresponding datetime value in SQL Server, whereas the other data types work perfectly.
This is the record I need to convert:
90,MichaelB,Wintriss,Inspection,Paper,11,Job101,1, {ts '2000-12-10 15:54:56.000'},D:public233 and 247233.mcs,
and this is the date field
{ts '2000-12-10 15:54:56.000'}
If I export a table from SQL Server in binary mode and then import the file back, it works, but when I do it as text it gives the above error.
Please help me with this; I would be very thankful.
Note: I am using SQL Server 7.0 version
Regards
Jitender Singh
Hi there,
I am looking for information on how to import a txt or csv file into multiple tables in SQL 2000. If you have any kind of information, please let me know whether we can do this and how. Below is the detailed information.
I receive a txt file every day which contains information for 3 different related tables in my SQL 2000 database. Right now we are keying in the information from the web site (which is linked to the txt file) into our database, but I am wondering if we can import those records into the tables.
The header of the file goes to table1, and when we insert the record into table1 it should generate the auto-identity record (PK); that PK is linked to table2 and table3, where the rest of the information from the txt file goes. For table2 and table3 there are multiple records per txt file.
In our txt file each row is marked with a row header, like HTC100 with row numbers 1, 2, 3..., which indicates which table the information goes to, and 1, 2... are the different row numbers.
Please let me know whether we can achieve this task or not. Thanks for all your help in advance.
Indra
I have pasted my txt file below:
========
Rig Safety Inspection Checklist(one/rig/month)","CF""HDC040",3,"Mast Inspection Before Raising or Lowering","CF""HDC040",4,"Crown Saver Checked","CF""HDC040",5,"Motor Kills Checked","CF""HFU050",2300,2100,,"HWE060",-5,"Deg C","COOL","WEST","SLIPPERY",,"HCS070",1,177.8,,"mm",25.3,"STELCO","J-55",8,108.44,3.84,108.44,"HCS070",2,114.3,,"mm",14.14,"STELCO","J-55",72,979.50,3.84,979.0,"HDP080",1,127,79.4,"kg/m","E",57,127,"mm","3 1/2 IF",10,"DC","HDP080",2,89,19.7,"kg/m","E",68,120,"mm","3 1/2 IF",15,"DP","HPU090",1,"F-800","EMSCO",254,"mm",,,,"HPU090",2,"F-800","EMSCO",254,"mm",,,,"HTC100",1,"Rig up and tear down""HTC100",2,"Drill Actual""HTC100",3,"Reaming""HTC100",4,"Coring""HTC100",5,"Condition Mud & Circulate""HTC100",6,"Trips""HTC100",7,"Rig Service""HTC100",8,"Repair Rig""HTC100",9,"Cut off drilling line""HTC100",10,"Deviation Survey""HTC100",11,"Wire Line Logs""HTC100",12,"Run Case & Cement""HTC100",13,"Wait on Cement""HTC100",14,"Nipple up B.O.P.""HTC100",15,"Test B.O.P.""HTC100",16,"Drill Stem Test""HTC100",17,"Plug Back""HTC100",18,"Squeeze Cement""HTC100",19,"Fishing""HTC100",20,"Directional Work""HTC100",21,"Safety Meeting""HTC100",24,"WOD""HSS110",1,1,"SWACO","N","110",,"84",,"HPA130","COMPTON BRANT 15-7-18-24",20041029,"COMPTON PETROLEUMCORP.","TRAILBLAZER DRILLING CORP.","CURTISFIESEL","10","ALBERTA","N",253"TCP130",1,,,,"kPa",140,,,,"mm",,"TCP130",2,,,,"kPa",140,,,,"mm",,"TCP130",3,,,,"kPa",140,,,,"mm",,"TTL160",1,1,0.00,0.25,0.25,21,"SAFETY MEETING WITH TONG HAND""TTL160",1,2,0.25,1.75,1.50,12,"RIG TO AND RUN CASING""TTL160",1,3,1.75,2.00,0.25,7,"RIG SERVICE""TTL160",1,4,2.00,2.50,0.50,5,"CONDITION MUD & CIRC.""TTL160",1,5,2.50,2.75,0.25,21,"SAFETY MEETING WITH CEMENTERS""TTL160",1,6,2.75,3.50,0.75,12,"RIG TO AND CEMENT CASING""TTL160",1,7,3.50,6.00,2.50,1,"SET SLIPS, TEAR OUT RIG, CLEAN TANKS""TTL160",1,8,6.00,8.00,2.00,24,"WAIT ON DAYLIGHT/TRUCKS""TTL160",1,9,,,,,"CEMENT WITH BJ USING 13 TONNES OF BVF-1500 NP + .7%FL-5,GIVING 15.5 m3 OF GOOD""TTL160",1,10,,,,,"SLURRY @ 1718 kg/m3,PLUG BUMPED & HELD @ 03:30 HRSOCT 29/04.""TTL160",1,11,,,,,"RIG RELEASED @ 08:00 HRS OCT 29/04""TTL160",1,12,,,,,"MOVE TO 12-3-18-25W4""TDI170",1,"JEFF CASE",8,10,475,"Deg C",,,"RUNNING CASING",,,,,"TLN175",1,"VISUALLY INSPECT PINS, RAMS AND STOOLS PRIOR TO LAYINGOVER DERRICK""TPA180",1,1,"DRILLER",647172865,"JEFF CASE",8,,,"JC""TPA180",1,2,"DERRICK HAND",648519056,"BRYAN VANHAM",8,,,"BV""TPA180",1,3,"MOTOR HAND",651056533,"NEIL WILLIAMS",8,,,"NW""TPA180",1,4,"FLOOR HAND",640352662,"TARAS WOITAS",8,,,"TW""TPI190",1,"REG",25,,,,,,"TPI190",2,"REG",25,,,,,,"TPI190",3,"REG",25,,,,,,=====
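Yes, this is achievable. A rough T-SQL sketch of the usual staging pattern (every table and column name here is a placeholder, and the raw file would first be bulk-loaded into a staging table such as dbo.RawLines with the record type split out into its own column):

DECLARE @NewId INT;

-- 1) insert the header portion (e.g. the FHS000 row) into table1 and capture the new identity
INSERT INTO dbo.Table1 (FileName, ReportDate)
SELECT FileName, ReportDate
FROM dbo.RawLines
WHERE RecordType = 'FHS000';

SET @NewId = SCOPE_IDENTITY();

-- 2) insert the detail rows (e.g. the numbered HTC100 rows) into table2, linked by the captured PK
INSERT INTO dbo.Table2 (Table1Id, RowNo, Description)
SELECT @NewId, RowNo, Description
FROM dbo.RawLines
WHERE RecordType = 'HTC100';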
How do I do this?
I cannot find any facility like the one Access has for getting external data.
I have a text file I am trying to import into a table. The text file is tab-delimited. I am using DTS to import the data into a new table I made. The fields are varchar and are set to allow nulls and allow 8,000 characters per field.
The error I am getting is that the data exceeds the allowed amount (or something like that) in col4.
Now I have checked everything in column 4 and nothing exceeds 5,000 spaces/characters combined. I have checked the entire sheet (in excel) for that fact, and there is not one single column/row/cell that exceeds 5,000 spaces/characters combined.
What the heck could be causing SQL to tell me I am trying to import too much data in one column when there is nothing that even comes close to 8,000 characters & spaces combined?
I'm getting the following error when trying to import an Excel file into SQL. I'm using SQL Server 2014 Express.
- Validating (Error)
Messages
Error 0xc00470b6: Data Flow Task 1: The LocaleID 0 is not installed on this system. (SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task 1: "Source - Sheet1$" failed validation and returned validation status "VS_ISBROKEN". (SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task 1: One or more component failed validation. (SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task 1: There were errors during task validation. (SQL Server Import and Export Wizard).
Hi everybody.
As you can see in the title, I would like to import my tables from the database on the SQL Server into an .mdf file which can be attached to the project. Can someone give me a hint please?
Hi
I am trying to import a flat file into a table in my database. I get all the values right except for the date; it keeps inserting NULL values into the date fields.
The date format in the flat file is '20070708' etc.
Does anyone know what i can do to fix this?
I've tried changing the data types that it imports, but it still ignores them and inserts NULL values.
Any help will be greatly appreciated
Kind Regards
Carel Greaves
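One hedged fix is to land the date as text first and convert it explicitly; style 112 is the built-in yyyymmdd format (staging and target names below are placeholders):

INSERT INTO dbo.Target (SomeDate)
SELECT CONVERT(DATETIME, DateText, 112)
FROM dbo.Staging
WHERE ISDATE(DateText) = 1;   -- skip rows whose text is not a valid date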
How do I import a text file with a list of NI numbers into a new table with a column listing all the NI numbers? I think I use the SELECT INTO clause, but I'm not sure how to do this.
Hi everyone, sorry if this message is not supposed to be posted here.
I'm learning ASP.NET and would like to know how I can insert data from an Excel file into a database on my web account; pretty much inserting/updating information in the database using an Excel file or an Access file.
Thanks a lot in advance
I am using the BULK INSERT command to import a CSV-delimited text file into a table, and I am having problems with the quote text qualifiers ("). The command below works, but it takes in all the quotes as well, and the comma field delimiter works only when commas are purely separators. If I have a comma within an address field, for example, the data gets imported into the wrong fields. What can I use to indicate that the text qualifier is "? I don't see where the BULK INSERT command lets me specify this. Is there another command I can use, or am I using this command incorrectly? Thank you in advance for any response or suggestion you may have.
BULK INSERT AdventureWorks.dbo.MbAddress
FROM 'a:\mbAddress.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',   -- one record per line
    CODEPAGE = '1252',
    KEEPIDENTITY,
    KEEPNULLS,
    FIRSTROW = 2)
Here is a sample ASCII file I am importing; you can see that record 6330 has an extra comma in the address line.
"AddressAutoID","Memkey","Type","BadAddress","Address1","Address2","Address3","City","State","Zip","Foreign","CarrierRoute","Dpbc","County","CountyNo","ErrorCode","ChangeDate","UserID"
6317,26517,1,0,"1403 W. Kline Ave","","","MILWAUKEE","WI","53221","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6318,26225,1,0,"501 Dunford Dr","","","BURLINGTON","WI","53105","","",0.00,"RACINE",101,"",1/25/2006 0:00:00,"admin"
6319,20101,1,0,"2115 Cappaert Rd #35","","","MANITOWOC","WI","54220","","",0.00,"MANITOWOC",71,"",1/25/2006 0:00:00,"admin"
6320,23597,1,0,"728 Woodland Park Dr","","","DELAFIELD","WI","53018","","",0.00,"WAUKESHA",133,"",1/25/2006 0:00:00,"admin"
6321,23392,1,0,"7700 S. 51st St","","","FRANKLIN","WI","53132","","",0.00,"MILWAUKEE",79,"",1/25/2006 0:00:00,"admin"
6322,26537,1,0,"W188 S6473 GOLD DRIVE","","","MUSKEGO","WI","53150","","",0.00,"WAUKESHA",133,"",1/26/2006 0:00:00,"admin"
6323,25953,1,0,"3509 N. Downer Ave","","","MILWAUKEE","WI","53211","","",0.00,"MILWAUKEE",79,"",1/26/2006 0:00:00,"admin"
6324,19866,1,0,"10080 E. Mountain View Lake Rd. #145","","","SCOTTSDALE","AZ","85258","","",0.00,"MARICOPA",13,"",1/27/2006 0:00:00,"admin"
6325,25893,1,0,"W129 N6889 Northfield Dr. Apt 114","","","MENOMONEE FALLS","WI","53051-0517","","",0.00,"WAUKESHA",133,"",1/27/2006 0:00:00,"admin"
6326,26569,1,0,"8402 64th Street","","","KENOSHA","WI","53142-7577","","",0.00,"KENOSHA",59,"",1/27/2006 0:00:00,"admin"
6327,24446,4,0,"83 Sweetbriar Br","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",1/30/2006 0:00:00,"admin"
6328,19547,1,0,"4359 MERCHANT AVENUE","","","SPRING HILL","FL","34608","","",0.00,"HERNANDO",53,"",2/8/2006 0:00:00,"admin"
6329,26524,1,0,"264 Lakeridge Drive","","","OCONOMOWOC","WI","53066","","",0.00,"WAUKESHA",133,"",2/10/2006 0:00:00,"admin"
6330,23967,1,0,"3423 HICKORY ST","100 Tangerine Blvd., Brownsville, TX 78521-4368","Texas Phone Number: 956-546-4279","SHEBOYGAN","WI","53081","","",0.00,"SHEBOYGAN",117,"",2/15/2006 0:00:00,"admin"
6331,25318,1,0,"3960 S. Prairie Hill Lane Unit 107","","","Greenfield","WI","53228","","",0.00,"MILWAUKEE",79,"",2/20/2006 0:00:00,"admin"
6332,24446,1,0,"83 Sweetbriar BR","","","LONGWOOD","FL","32750","","",0.00,"SEMINOLE",117,"",2/21/2006 0:00:00,"admin"
6333,26135,1,0,"P.O. Box 8 127 Main Street","","","CASCO","WI","54205","","",0.00,"KEWAUNEE",61,"",2/21/2006 0:00:00,"admin"
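For what it's worth, BULK INSERT on SQL Server 2005/2008 has no text-qualifier option, so the usual workarounds are a format file or stripping the quotes in a staging table. On SQL Server 2017 and later the command does understand quoted CSV; a hedged sketch of that newer syntax, keeping the table and file from the question:

BULK INSERT AdventureWorks.dbo.MbAddress
FROM 'a:\mbAddress.txt'
WITH (
    FORMAT = 'CSV',      -- treat the file as quoted CSV (SQL Server 2017+ only)
    FIELDQUOTE = '"',    -- text qualifier, so commas inside quotes stay in one field
    FIRSTROW = 2,
    KEEPIDENTITY,
    KEEPNULLS
);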
I'm importing comma-delimited text files into a SQL table. The data imports in a seemingly random order: one time I import and the lines appear one way, and the next time I import they come in another way.
Is there a way to force the text files to import in the same order the data is found in the file?
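Strictly speaking a table has no inherent row order, so the usual hedged answer is to add an IDENTITY column before loading and sort on it when querying; for a single-stream load the assigned values generally follow file order (table and column names are placeholders):

ALTER TABLE dbo.ImportTarget ADD LoadSeq INT IDENTITY(1,1);

SELECT *
FROM dbo.ImportTarget
ORDER BY LoadSeq;   -- read rows back in load sequence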
I have a database dump using mysqldump. Now I want to import a table from the dump file. How would one do this?
I've used mysqlimport < table.sql, but what if I have databasedump.sql which includes all the tables, but I only want to import ONE table?
-D
Hi
I am facing an issue while loading data from Excel into a SQL database table.
The format of spreadsheet is like
Date -
Name - A B C
Section Marks
X1 10 10 10
X2 20 30 10
X3
I need to load this data into table with columns
Date
Name
Section
Marks
Please share your thoughts on how to load this kind of data into table.
Thanks
Anshu
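One hedged way to reshape it: land the sheet in a staging table laid out like the spreadsheet (Section plus one column per name) and UNPIVOT it into the four target columns. @ReportDate and all object names below are assumptions; note that UNPIVOT drops rows whose mark is NULL (such as the X3 row above):

INSERT INTO dbo.Results ([Date], [Name], Section, Marks)
SELECT @ReportDate, u.[Name], u.Section, u.Marks
FROM dbo.MarksStaging
UNPIVOT (Marks FOR [Name] IN ([A], [B], [C])) AS u;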
Greetings, I have just arrived back into the country (NZ) and back into ASP.NET.
I am having trouble with the following: 'An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.'
It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page.
Does anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer.
Cheers ~ J
Hello,
I am using the SSIS import/export wizard to import an Access table into a sql server 2005 database. The table has fields in Arabic (name, last name, etc.) and non-Arabic fields (gender, phone number, category, etc.).
The destination table has nvarchar columns.
After the import, I can see the Arabic characters in the destination table, but they appear in inverse order (from left to right). In Access (or Excel), Arabic fields appear as they should (from right to left) and non-Arabic fields are OK as well (from left to right).
If I do a simple copy-and-paste of a "correct" Arabic text into the table, the result is still wrong (reversed letters)...
Please help, I can't see what else to do.
Thank you.
I am looking for a SQL query to verify the inserted values from one table (in a CSV file) against another table (in the SQL database).
For example: I have the Values column below, which is present in one CSV file; after my data migration the values get stored in the Results table under the Message column.
I need to verify whether the values (1X, 1Y) were inserted in Message records such as "successfully inserted value 1X".
Values (CSV)
1X
1Y
Results Table(SQL)
CreatedDate                  Message
2015-08-04 08:45:29.203  successfully inserted value 1X
2015-08-04 08:44:29.103  TEst pass
2015-08-04 08:43:29.103  successfully inserted value 1X
2015-08-04 08:42:29.203  test point
2015-08-04 08:35:29.203  successfully inserted value 1Y
2015-08-04 08:30:29.203  Test Pass
2015-08-04 08:28:29.203  successfully inserted value 1Y
If all values are inserted:
Output:
All values from the Values table were inserted successfully
Total count of values inserted: 2
If only a few values are inserted, for example only 1X from the Values table is inserted in Message:
Example:
Results Table (SQL)
CreatedDate                  Message
2015-08-04 08:45:29.203  successfully inserted value 1X
2015-08-04 08:44:29.103  TEst pass
2015-08-04 08:43:29.103  successfully inserted value 1X
2015-08-04 08:42:29.203  test point
Output:
Not all values from the Values table were inserted successfully into the Results table.
Total count of values inserted: 1
Values missing from the Results table: 1Y
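A hedged sketch of a verification query, assuming the CSV values have already been loaded into a one-column table (dbo.CsvValues below is a placeholder; Results, Message, and the message text come from the question):

SELECT v.Value,
       CASE WHEN EXISTS (SELECT 1
                         FROM dbo.Results AS r
                         WHERE r.Message = 'successfully inserted value ' + v.Value)
            THEN 'inserted' ELSE 'missing' END AS Status
FROM dbo.CsvValues AS v;

-- total count of values from the CSV that were inserted
SELECT COUNT(*) AS InsertedCount
FROM dbo.CsvValues AS v
WHERE EXISTS (SELECT 1
              FROM dbo.Results AS r
              WHERE r.Message = 'successfully inserted value ' + v.Value);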
Is there any easy way for a batch file or automated process to read from one database and table, and write to another database and table whatever entries are missing from it?
This is a simple table in one db that is filled with usernames; I want to see if there are usernames missing from another db and table and write those entries.
db1.usr_table.usr_name = jdoenew
If jdoenew is missing in the 2nd db I will need to write entries like:
db1.usr_table.usr_name = jdoenew
db1.usr_table.password = tmppassword
db1.usr_table.active = 1
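A hedged sketch of the copy itself, assuming db1 holds the full list of usernames and db2 (a placeholder name) is the database missing them; swap the two if it is the other way round. The schema, placeholder password, and active flag mirror the question, and a SQL Agent job could run this instead of a batch file:

INSERT INTO db2.dbo.usr_table (usr_name, password, active)
SELECT s.usr_name, 'tmppassword', 1
FROM db1.dbo.usr_table AS s
WHERE NOT EXISTS (SELECT 1
                  FROM db2.dbo.usr_table AS d
                  WHERE d.usr_name = s.usr_name);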
Hi, I've installed the file SQLServer2005_DTS.msi because I've heard that I need it to import a CSV file into an MS SQL database. Now I have no idea how to work it, or even how to open it. Where would I open it from? I reckon I've read enough literature about using it to be able to have a bash... thanks in advance...
Hi, this is my first time and I was wondering what's the best way to do the following:
I've got SQL Server 6.5. I also have a SQL database (.DAT file) in a folder.
I need to import this database to my server. Should I just LOAD/import it, or do I need to first create a database device on my server (and if so, how much size should I allocate, etc.) and then load it?
thanks a lot
rohit
Hi guys,
I have to write a stored procedure which will pick up a txt file from a location, read it, and update some tables.
How do I pick up and read a txt file in a stored procedure?
Thanks in advance
Vineet
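A hedged sketch of one approach: take the file path as a parameter and wrap BULK INSERT in dynamic SQL (the path cannot be a variable directly), then update the target from the staged rows; every object name here is a placeholder:

CREATE PROCEDURE dbo.usp_LoadTextFile
    @FilePath NVARCHAR(260)
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'BULK INSERT dbo.Staging FROM ''' + REPLACE(@FilePath, '''', '''''') +
               N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
    EXEC (@sql);

    -- update the target table from the freshly staged rows
    UPDATE t
    SET    t.SomeColumn = s.SomeColumn
    FROM   dbo.TargetTable AS t
    JOIN   dbo.Staging     AS s ON s.KeyColumn = t.KeyColumn;
END;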
Hi,
I am having problems while importing a .csv file.
When I import a .csv file into SQL Server, I get the following error message:
"Cannot create an OLE DB accessor. Verify that the column metadata is valid."
Any help would be appreciated
I have a .txt file that I need to add to an existing table, which is made up of varchar, char, numeric and int fields. What is the best way to do it?
The first thing I tried was importing the .txt file and then going into the design and changing the field types, but I hit problems on the numeric fields. Then I tried changing the field types from varchar to different types at the transform stage of importing, but that failed too.
Regards,
Ciarán
View 2 Replies View RelatedI have a CSV file that i need to import into a SQL table. The problemis the values in the first column are not brackited in "". There areover 700K rows. Is there an easy way to fix the data or have SQLcorrectly import the the data?The data looks like this1, "xxx","zzz"2, "aaa","bbb"an so on...Thanks
Hi all-
I am in need of some help importing a .CSV file into a SQL Server 2005 Express database. I can't use SSIS because it's SQL Server Express. The operation needs to run as a parameterized Stored Proc which I will call from ASP.NET.
The problem is that I need to get the only 2 columns from the text file, then "tack on" 3 more columns that will have data that changes, which will come in as parameters to the T-SQL stored proc. This prevents me from using a straight BULK INSERT.
The data in each of these columns will be the same for each record in that column - that is, every record in every field in column 3, for example, will be the same.
Some of the files will have 4 million records and up, so speed is of the essence here. I tried using BULK INSERT to dump the data into a #temporary table - which took 38 seconds and was acceptably fast - but then, my next step was to dump the additional data into the other columns using UPDATE... SET. I gave up on this after the query ran for THIRTY MINUTES! My next step was going to be to move the data from the temporary table into the target permanent table somehow, but I never got that far when I saw how long the previous step took...
It's a little odd, because I can do the same thing in MS Access in under 5 minutes by using a SQL statement like this in my ASP.NET code:
"INSERT INTO [Target_Table] (field1, field2, field3, field4)" & _
"SELECT F1, F2, '" & strSource & "', Now() AS DateTimeStamp " & _
"FROM [Text;HDR=NO;DATABASE=" & strPath & ";].[" & strFilename & "]"
strSource is a string that the User enters (properly vetted for security); strPath and strFilename are strings holding the path and filename to the .CSV file. I create a unique filename from the file the user uploads. This works in under 5 minutes for several millions of records in MS Access, as I said above. I've had no luck getting anything similar to work in SQL Server, though.
Anyway, does anyone have any ideas? This is somewhat urgent, as the project was about to go "live" when it was discovered that the actual, live data had grown to the point where Access couldn't hold it, and a move to SQL Server Express was necessary.
Thanks in advance,
-Andrew
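For what it's worth, one hedged alternative is to read the file set-based with OPENROWSET(BULK ...) and a format file, supplying the constant columns in the SELECT itself so there is no second UPDATE pass. The file and format-file paths are placeholders, the F1/F2 column names would come from the format file, and @strSource would be the stored procedure's parameter:

INSERT INTO dbo.Target_Table (field1, field2, field3, field4)
SELECT src.F1, src.F2, @strSource, GETDATE()
FROM OPENROWSET(BULK 'C:\uploads\datafile.csv',
                FORMATFILE = 'C:\uploads\datafile.fmt') AS src;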
I need to be able to create a DTS package that imports a CSV file which is located at a URL, i.e. HTTP://www.url.com/csv/thefile.xls. I tried copying the URL and pasting it into the file location in the SQL wizard, but I got an error message.
I'm a newbie to SSIS, so there is probably an easy solution:
I'm importing from a csv file into a db table, but would like to remove the quotation marks from around the data before inserting into the table.
Does anyone have suggestions for how to remove the quotation marks around the data? Would a Foreach Loop container work for this?
Hi all,
Hopefully I am posting this question in the correct forum. I am still learning about SQL 2005. Here is my issue. I have an Access db that I archive weekly into a SQL Server table. I used the DTS wizard to create an import job, and initially that worked fine. The field I have as the primary key in the Access db cannot be the primary key in the SQL table, since I archive weekly and that primary key field will be imported several times over. I overcame this initially by not having a primary key in the SQL table; the table is strictly for reference. However, now I need to set up a unique field for each of the records in the SQL table. What I have done so far is create a recordID field in the SQL table that is an int, with Identity (autonumber) set to Yes. That worked great and created a unique id for all existing records. The problem now is on the import. When I try to import the Access table I get an error because of the extra field in the SQL table; the error says it cannot import a null value into this field. So... my final question is: how can I import the Access table into the SQL table with one extra field, the autonumber unique field? Thanks a bunch for any assistance.
Bill
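Not the wizard itself, but a hedged T-SQL alternative that sidesteps the identity column entirely: list only the real data columns in the INSERT so SQL Server fills recordID on its own (in the wizard, setting the destination for recordID to ignore in Column Mappings should have the same effect). The Jet provider string, .mdb path, and column names below are assumptions:

INSERT INTO dbo.ArchiveTable (Field1, Field2, Field3)
SELECT a.Field1, a.Field2, a.Field3
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\data\archive.mdb';'Admin';'',
                AccessTable) AS a;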