BULK Insert From A Text File Name Stored Into A Variable
Mar 25, 2004
Hello, I need to write a proc to load data from the txt files I receive into a table. It works fine when I specify
bulk insert .... from 'myfilename.txt'
BUT my filename will change every time, so I store it in a variable @filename.
When I try to run the bulk insert instruction as ... from @filename, it doesn't work.
Do you know why?
Thank you in advance
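For reference, BULK INSERT does not accept a variable as the file name; the statement has to be built as a string and run dynamically. A minimal sketch, assuming a hypothetical target table dbo.MyTable, comma-delimited data and a hypothetical path:
DECLARE @filename nvarchar(260);
DECLARE @sql nvarchar(4000);
SET @filename = N'C:\imports\myfile.txt';   -- hypothetical path
SET @sql = N'BULK INSERT dbo.MyTable FROM ''' + @filename + N''' '
         + N'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
EXEC sp_executesql @sql;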
View 1 Replies
Jan 2, 2008
Hey All,
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL table.
I am trying to program a small application that will import product data obtained through suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98
My Table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML format file? I am stumbling over how to tell it where the data for each column starts.
Is there any way that I could trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
// open a connection using the application's saved connection string and run a plain INSERT
SqlConnection SqlConnection = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
SqlCommand cmd = new SqlCommand();
cmd.CommandType = CommandType.Text;
// note: the textbox values are concatenated straight into the SQL text (parameters would be safer)
cmd.CommandText = "INSERT INTO PhonebookTable(Name, PhoneNumber) VALUES('" + txtName.Text + "', '" + txtPhoneNumber.Text + "')";
cmd.Connection = SqlConnection;
SqlConnection.Open();
cmd.ExecuteNonQuery();
SqlConnection.Close();
RefreshData();
I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
Thanks for your time,
Hayden.
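One way to describe where each column starts is an XML format file with fixed-length fields. A minimal sketch follows; the field lengths (7 for the ID, 20 for the padded name, the rest of the line for the price) are only guesses from the sample and would need to match the real layout:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="7"/>                       <!-- ProductID -->
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="20"/>                      <!-- Name, space-padded -->
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="10"/> <!-- Cost, up to the line break -->
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ProductID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="3" NAME="Cost" xsi:type="SQLMONEY"/>
  </ROW>
</BCPFORMAT>
Fixed-width character data keeps its trailing spaces, so the simplest way to get "Mango Carton" rather than "Mango Carton        " is to load into a staging table and apply RTRIM(Name) when copying to the final table.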
View 1 Replies
View Related
Jun 27, 2002
Hi ,
I have imported an event log file from an NT server into a test database. The table was created automatically with some 10 columns named Col001, Col002, and so on up to Col010.
Now I want to copy the event log data into specific columns, so I created a table with column names such as Evnt, Date, Time, Server and EvntDescription, so that I can execute a simple query like
Select * from tablename where type = 'app' and Server = 'test1'
and get all the results for server 'test1' with type 'application'.
The problem is how to insert that event log data into the table I created with the different column names, so that the data with 'App' goes to the Type column, the data with the server name goes to the Server column, and so on.
I tried every way I could think of but could not succeed. Can I get some help in this regard, please, through stored procedures or through DTS, if that is possible?
The server is SQL Server 2000 with SP 1.
Many thanks
Anita .
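Since both the auto-created table and the new table already exist, a plain INSERT ... SELECT can do the mapping. A minimal sketch, assuming hypothetical table names and that Col001 through Col005 line up with the target columns in this order:
INSERT INTO dbo.EventLogTarget (Evnt, [Date], [Time], [Server], EvntDescription)
SELECT Col001, Col002, Col003, Col004, Col005
FROM dbo.ImportedEventLog;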
View 1 Replies
View Related
Feb 15, 2007
I import a group of INSERT statements from a text file (test), for example:
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
The file contains about 1000 inserts; I read the file line by line and execute each insert.
In VS.NET 2003 it works correctly and the process consumes little memory, but in VS.NET 2005 the Pocket PC runs out of space.
How can I specify the growth factor of the SQL Mobile database?
Or could something else be happening?
Sorry for my English ... I speak Spanish.
Thanks.
View 4 Replies
View Related
Feb 5, 2007
OK, I've read many posts on this problem but have seen no resolution.
Here's the deal. When on the actual SQL box (there are 2 in the cluster) the bulk insert works fine. No errors, and the event log on the machine hosting the text file shows that the account SQL Server runs under has accessed the file. This account is a domain account and is in the local Administrators group on both the SQL server and the remote box hosting the text file.
That's great if you want your developers testing and accessing your SQL box as administrators. We don't work that way. Our developers connect via SQL Management Studio and test their stuff that way. That's where the problem rears its ugly head.
When anyone runs a BULK INSERT command that accesses a text file on a remote server, they get an "Access Denied". Now, I did a lot of testing and found that when a user executes the BULK INSERT command from SQL Management Studio on their desktop, they connect to the SQL box with their credentials (OK, that's correct, right?). SQL then runs the BULK INSERT command, which reaches out to the remote file server but gets the "Access Denied". I checked the logs and they show that "Anonymous" was trying to access the file.
Why is Anonymous coming over as the credentials for SQL on a Bulk Insert? I'm no idiot, but this sounds like a big crappy bug that M$ will not fess up to. I followed many suggestions: made sure NTFS and share-level permissions were correct (that was the first thing...), made sure the account running SQL Server on both nodes of the cluster was the same (that wasn't it), and I even created an SPN for SQL to run and automatically register in AD with the credentials that SQL runs as. NOTHING!!!
Has anyone gotten their bulk insert to work when inserting from a file that is NOT local to the SQL box? Any help is appreciated, but putting the text files on the local SQL box IS NOT AN OPTION.
Thanks
View 28 Replies
View Related
Sep 15, 2015
How do I do a bulk insert into a temp table from a text file? The text file looks like this:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
I keep receiving errors.
The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
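The header line ("ver_id TYPE") is being read as data, which is what makes row 1, column 2 fail to convert to smallint. A minimal sketch that skips the header and names the delimiters explicitly (the tab delimiter is an assumption based on the sample):
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH (
    FIRSTROW = 2,             -- skip the "ver_id TYPE" header line
    FIELDTERMINATOR = '\t',   -- assumed tab between ver_id and TYPE
    ROWTERMINATOR = '\n'
);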
View 2 Replies
View Related
Apr 23, 2008
I am using Microsoft SQL 2005 and I need to do a BULK INSERT from a .csv I just downloaded from PayPal. I can't edit some of the columns that are given in the report, and I am trying to load only specific columns from the file.
bulk insert Orders
FROM 'C:Users*******DesktopDownloadURL123.csv'
WITH
(
FIELDTERMINATOR = ',',
FIRSTROW = 2,
ROWTERMINATOR = ''
)
So where would I state which column names (from row #1 of the .csv file) map to which specific columns in the table?
I saw this on one of the sites, which seemed to guide me towards the answer, but I failed. Here you go, it might help you:
FORMATFILE [ = 'format_file_path' ]
Specifies the full path of a format file. A format file describes the data file that contains stored responses created using the bcp utility on the same table or view. The format file should be used in cases in which:
The data file contains greater or fewer columns than the table or view.
The columns are in a different order.
The column delimiters vary.
There are other changes in the data format. Format files are usually created by using the bcp utility and modified with a text editor as needed. For more information, see bcp Utility.
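A format file is indeed the usual way to pick out specific fields: any field whose server column order is set to 0 is read from the file but not loaded. A minimal non-XML sketch for a four-field CSV where only fields 1, 3 and 4 are wanted; the widths, names and target columns are all hypothetical:
9.0
4
1   SQLCHAR   0   50   ","      1   OrderDate     SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   50   ","      0   NotWanted     ""
3   SQLCHAR   0   50   ","      2   GrossAmount   SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   50   "\r\n"   3   BuyerEmail    SQL_Latin1_General_CP1_CI_AS
The BULK INSERT statement then points at it with FORMATFILE = 'C:\path\paypal.fmt' (hypothetical path) alongside FIRSTROW = 2 to skip the header row.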
View 12 Replies
View Related
Feb 3, 2010
We can easily load a file into db tables. However, my main concern here is the number of columns in the file: a text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using the BCP or BULK INSERT commands, as a maximum of 1024 columns is allowed per table in SQL Server 2008.
We can still go ahead and create a 'wide table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT is unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of file into the db table?
View 6 Replies
View Related
Nov 14, 2007
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() clause in my UPDATE statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL 2005 without resorting to BULK INSERT? It bloats the transaction log, and turning the logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
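For reference, the .WRITE clause can append to the existing value when the offset argument is NULL, so very long input can be sent to the server in pieces rather than in one statement. A minimal sketch with hypothetical table and column names:
DECLARE @id int;
DECLARE @chunk nvarchar(max);
SET @id = 1;                             -- hypothetical key value
SET @chunk = N'...next piece of text...';
UPDATE dbo.Documents
SET Body.WRITE(@chunk, NULL, NULL)       -- NULL offset appends @chunk to the end of Body
WHERE DocumentID = @id;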
View 6 Replies
View Related
Aug 7, 2006
hi, good day,
if I would like to pass a parameter into a stored procedure for the bulk insert command, as below:
BULK INSERT [mytable] FROM
@FILE_PATH
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '')
how do I put @FILE_PATH in the above query inside the stored procedure?
thanks for the guidance :)
View 2 Replies
View Related
Dec 6, 2007
Hi folks....
I am able to import a CSV file into a temporary table as long as I know the number of fields in the CSV file. Here is what I would like to do:
I would like to have a CSV file with UP TO 6 entries per row and insert each row into a table; if there are three fields, I want to insert them into the first three columns of the temporary table. If there are four, then insert into the first four columns. Is this possible?
Does anyone have any suggestions?
Thanks!
Forch
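One workaround when the field count varies is to bulk insert each whole line into a single wide column and split it afterwards in T-SQL. A minimal sketch with a hypothetical path:
CREATE TABLE #RawLines (Line nvarchar(4000));
BULK INSERT #RawLines
FROM 'C:\import\variable.csv'    -- hypothetical path
WITH (ROWTERMINATOR = '\n');     -- no FIELDTERMINATOR, so each full line lands in the one column
-- then parse Line with CHARINDEX/SUBSTRING (or a split function) into up to six columns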
View 4 Replies
View Related
May 30, 2004
Hi,
I am trying to do a bulk insert using a text file that is comma delimited.
In the text file there are fields which are surrounded by quotes and contain commas; these commas should not be considered delimiters.
Is there a way to tell the BULK INSERT statement not to treat them as delimiters?
thanks,
sivilian
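For what it's worth, BULK INSERT of that era has no text-qualifier option; the usual workarounds are a format file whose field terminators include the quote characters, or preprocessing the file. On SQL Server 2017 and later the qualifier can be declared directly; a sketch with hypothetical names:
BULK INSERT dbo.MyTable
FROM 'C:\import\quoted.csv'    -- hypothetical path
WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');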
View 2 Replies
View Related
Jun 29, 2015
I'm trying to use BULK INSERT for the first time and I'm getting the following error. I think it might have something to do with my format file; from the error message there is a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, so the row terminator on line 7 of the format file should be correct, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:DataASXImportTest.txt'
WITH (FORMATFILE = 'M:DataASXSQLFormatImport.Fmt')
[code]...
View 5 Replies
View Related
Oct 12, 2007
Hi,
I have a file which contains data as below:
3
123||
456||
789||
I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below:
BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
But I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it doesn't insert.
Can anyone help me with how to do this?
Thanks,
-Badri
View 5 Replies
View Related
Aug 19, 2014
The files have pipe delimiters and double quotes as text qualifiers. I can get the file to import with a BULK INSERT statement, but it brings the double quotes in as well. What setting can be used to indicate what the text qualifiers are?
Here are a few sample lines of data:
"id"|"system_id"|"system"|"last_modified_on"|"status"
"1"|"30101"|"H1"|"2013-05-16 09:33:19"|"1"
"2"|"30100"|"H1"|"2013-05-16 16:22:32"|"1"
"3"|"30103"|"H4"|"2013-05-16 16:22:32"|"1"
"4"|"30104"|"H3"|"2013-05-05 01:26:20"|"1"
View 0 Replies
View Related
Jan 16, 2007
I have created an SSIS package, in my VS2005 solution, that bulk inserts a CSV file (see example below):
"100",2006-10-03 00:00:00,"HEX012",1
"101",2006-10-03 00:00:00,"DS00130",1
I have a Bulk Insert Task that uses a Flat File Connection Manager to import my CSV file into my SQL2005 database.
My source CSV file (see example above) has double quotation marks surrounding any text fields.
I have set the Flat File Connection Manager's 'Text Qualifier' to double quotation marks.
The Bulk Insert works OK, but ignores the Text Qualifier.
My database table is left with the original quotation marks in any text field.
Any help appreciated.
Regards,
Paul.
View 3 Replies
View Related
Aug 12, 1999
Is it possible to export a SQL pass-through query to a text file?
Something like:
@A1 int
bcp "select * from bank where id = @a1" queryout .......
Or do I have to use something like SQL-DMO? But I don't know how to do it.
Please Helpppppp
Edwin
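Since bcp runs outside the query, the variable has to be spliced into the command string before it is shelled out. A minimal sketch using xp_cmdshell, assuming it is enabled and using hypothetical database, server and output path names:
DECLARE @A1 int
DECLARE @cmd varchar(1000)
SET @A1 = 42
SET @cmd = 'bcp "select * from MyDb.dbo.bank where id = ' + CAST(@A1 AS varchar(12))
         + '" queryout "C:\export\bank.txt" -c -T -SMyServer'
EXEC master..xp_cmdshell @cmd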
View 2 Replies
View Related
Nov 28, 2005
I would appreciate some help on a procedure that I have. Using BULK INSERT, I would like to import records from a text file. The issue I have is that the file contains a header ('1AMC_TO_Axiz') and a footer ('1AMC_TO_Axiz2'). Using a format file, I can get the import to work by editing the file and removing these two entries. Is there a way to set up the format file to skip these two entries? My format file currently looks like this:
7.0
16
1 SQLCHAR 0 50 "|" 1 keyMemberNo
2 SQLCHAR 0 10 "|" 2 fldEffdate
.................
16 SQLCHAR 0 10 "\r\n" 16 fldNewRecord
Thanks
Charles
View 3 Replies
View Related
Mar 16, 2015
I am trying my first bulk update to an existing SQL table from a CSV text file. The text file's naming is exactly the same as the SQL table's, with the same attributes.
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:Baanjedox_dailyjdcom4401.txt'
WITH
[code]....
The error message is:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 3 (BP_Country).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I have checked and re-checked the BP_Country field (the first field after the key) and I am not seeing any mismatches.
View 5 Replies
View Related
Dec 13, 2007
Hi,
I am new to SSIS but I have average working knowledge of SQL.
My problem is as follows. I have a pipe-delimited text file in some folder, and the number of columns and the column names are not consistent: it can have any number of columns and any column names. I need to load this text file's data into a SQL table. All I want is to load this file into the SQL database under some temporary name. Once I have the table in the SQL database, I can match the column names of the target table and this temp table and only push the columns that match the target table. For this I can frame dynamic SQL. This part is clear to me.
Now the problem is this: I developed an SSIS package to push the text file to a SQL table, and I am able to do that. But if I change the column names or add a new column, SSIS is not able to push the new columns. Is this functionality available in SSIS? Can it be dynamic like this?
I hope I am clear about my problem; if you need any clarification, please let me know.
thanks in advance
Mike
View 3 Replies
View Related
May 5, 2006
Hi folks,
I have a small problem - I am unable to load data from a .csv file into a table in SQL Server. Here is the command I am running:
BULK INSERT CCSProgramParticipation FROM 'c:\test.csv'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = ''
)
Data in test.csv is in the following format (date fields can be blank):
NY580232,0,6/30/2006,3567,396,7/1/2005,9/9/2005
NY580232,0,6/30/2006,14850,462,12/12/2005,
....
....
What I see is that the data does get loaded; however, data from the following row is getting inserted into the last field of the previous row - it seems like the row terminator is being ignored.
Has anyone encountered this issue? Please let me know your thoughts on this.
Thanks so much!
-Parul
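A frequent cause of rows running together like this is a row terminator that doesn't match the file's actual line endings (for example, bare line feeds instead of CR+LF). A sketch that spells the terminator out; the hex form needs a reasonably recent SQL Server version, and the LF-only guess would have to be checked against the file:
BULK INSERT CCSProgramParticipation
FROM 'c:\test.csv'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'   -- bare line feed; use '\r\n' if the file really ends lines with CR+LF
);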
View 6 Replies
View Related
Oct 9, 2013
I am bulk inserting from a csv file using
BULK INSERT testTable
FROM 'C:UsersRobsDocumentsSoccer2011-2012SC1.csv'
WITH
(
FIRSTROW=2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Works fine. Most of my csv files have the same number of columns, but some have 4 fewer. The files contain the same column headings as the full-size ones (only 4 fewer). Is there a way when bulk inserting for SQL to either skip these or ignore the error?
View 2 Replies
View Related
Mar 28, 2008
Well hi here is my text file
"Kelly","Reynold","kelly@reynold.com"
"John","Smith","bill@smith.com"
"Sara","Parker","sara@parker.com"
and a table with Id, Name, Surname, Email.
The Id is an autoincrement (identity) column, which is why it gives me an error. Is there any way to skip this Id column so that SQL creates it automatically for every row?
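One common workaround is to bulk insert through a view that leaves the identity column out, so SQL Server assigns Id by itself. A minimal sketch with hypothetical object names (the quotes around the values in the file would still need handling, as discussed in the other threads above):
CREATE VIEW dbo.vPeopleImport
AS
SELECT Name, Surname, Email FROM dbo.People;
GO
BULK INSERT dbo.vPeopleImport
FROM 'C:\import\people.txt'      -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');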
View 3 Replies
View Related
Jul 20, 2005
Hi there,
I have some text files saved using a UTF-8 encoding. The "BULK INSERT" statement in SQL Server 2000 is failing with a column length error. Saving the file as ASCII removes the problem. However, I would like to import files in UTF-8 format. I understand that the BCP tool, when used from the command line, can take an "-F UTF8" argument, which allows it to work. Can this be done from the SQL> prompt with the BULK INSERT statement?
Cheers,
Tobin
View 2 Replies
View Related
Dec 28, 2007
hi friends,
I am using the bulk insert command for txt files, but now I want to use the bulk insert command for dbf files.
So can anyone tell me how to use this command for dbf files?
thanx in advance.........
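BULK INSERT itself only reads flat character or native-format data files, so a dBASE file usually has to come in through a distributed query instead. A rough sketch of one commonly reported approach; the provider name, the 'dBASE IV' connection string, the folder/file names and the need to allow ad hoc distributed queries are all assumptions to verify for your environment:
SELECT *
INTO dbo.DbfImport                                   -- hypothetical target table
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'dBASE IV;Database=C:\data\',        -- folder that holds the .dbf file
                'SELECT * FROM myfile.dbf');         -- hypothetical file name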
View 1 Replies
View Related
Feb 15, 2006
Hi everyone, I am trying to bulk insert some data from a csv file into a table. I can do it as part of a button's on-click event, but I don't know how to do it using a stored procedure. This is what I have:
ALTER PROCEDURE dbo.TestImportData (
@filename varchar(50) )
AS
BULK INSERT dbo.[TestTable]
FROM @filename
WITH
(
FIELDTERMINATOR = ','
)
/* SET NOCOUNT ON */
RETURN
I get the error message "Incorrect syntax near '@filename'", "Incorrect syntax near 'with'". What am I doing wrong? What should I do? Please help!
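For reference, BULK INSERT will not take the file name from a parameter directly, which is what the syntax error is pointing at; the statement has to be assembled as a string inside the procedure. A minimal sketch of the same procedure rewritten that way (the terminator is an assumption):
ALTER PROCEDURE dbo.TestImportData (
    @filename varchar(260) )
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql nvarchar(max);
    SET @sql = N'BULK INSERT dbo.[TestTable] FROM ''' + @filename
             + N''' WITH (FIELDTERMINATOR = '','')';
    EXEC sp_executesql @sql;
END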
View 1 Replies
View Related
Apr 8, 2001
I want to use the bulk insert statement to insert data from a text file that contains more columns than the target sql table does. I am using SQL 7.0.
I am using a format file, but I can't work out how to achieve the above. SQL books online (and the msdn website) do not describe how to do this, but it is intimated that it can be done.
Any suggestions ?
Regards,
Stuart.
View 1 Replies
View Related
Jul 16, 2004
Hi All,
I have a file that has a fixed row size of 148 and fixed column sizes, but the file has no end-of-line character. I know it is weird, but a client has made the file and refuses to change the format, so I am stuck with reading it the way it is.
In Enterprise Manager, I used the Import/Export wizard, specified fixed length, and it let me specify 148 as the length of each line. Then it recognized the file and I was able to read it in.
I saved the DTS package and I can run it over and over again using dtsrun. However, I want to do the same thing using BULK INSERT. How do you specify a fixed row length for BULK INSERT, and how do you give it individual column lengths?
Thanks,
Shab
View 1 Replies
View Related
Oct 20, 2005
SQL Server 2K
OK, I'm probably being a bone-head here and am clearly in over my head but how do you (or can you?) set up a Bulk Insert to take a dynamic path/file name?
What I want to do is pass in the path and file name from an external process to a stored procedure that bulk inserts the content of the file and then does some other routines on it. I haven't had any luck getting Bulk Insert to run if the path/file name is not hard-coded in the sproc as a string.
The point is to have a master routine that can exercise the process for several different customers and use meta data in a table to inform what file to bulk insert.
Any suggestions?
Thanks!
View 9 Replies
View Related
Mar 11, 2008
I have a log file, and I am trying to import its data into SQL in order to analyze it (be able to query the data); however, that task seems impossible.
In fact the log file contains a varying number of column fields (errors logged and the various types of data logged demand varying numbers of columns). More than that, the fields themselves are hard to extract.
An example of the data in my log (xxxxxxxx is some alphanumeric chars):
2008-01-09 20:16:05,4784E36F.req,10.1.1.26,xxxxxxxxx,OK -- SMPP - xxxxxxx:xxxxx,Sender=xxxx;SMSCMsgId=2028eecc;Binary=1;DCS=8;Data=xxxxxxxxxxxxxx...
2008-01-09 20:16:05,4784E338.req,10.1.1.26,xxxxxxxx,Retry Pending - ERROR: Timeout waiting for response from server or lost connection -- SMPP - xxxxxxxxxxx:xxxxx,Sender=xxxxx........
I may use regular expressions to extract the data, and maybe use a regular INSERT to put it in the right table. That amounts to doing the bulk insert manually (and it may take much more time), which seems strange. Can I use some additional tool (in the SQL package or external) to assist somehow?
Thanks, and sorry if this is double posted!
View 1 Replies
View Related
Apr 4, 2008
Hello, I am doing a bulk insert using an XML format file from a csv file. Most everything works just fine, but my delimiter is a comma; the problem is that one of the column sets sometimes contains a comma within double quotes, like this:
value1,"value,2",value3
So when I do my insert it is distorting the column values because, unlike Excel, it is not ignoring the comma within the quotes. Is there any way to set an attribute within the format file to prevent this from happening?
View 2 Replies
View Related
Jan 17, 2008
I have a fixed-width file where the format looks like below:
f1 - 16
f2 - 64
f3 - 64..
....etc
and the format file that I created looks like:
8.0
20
1 SQLCHAR 0 16 "" 0 ExtraField
2 SQLCHAR 0 64 "" 0 ExtraField
3 SQLCHAR 0 64 "" 0 ExtraField
4 SQLCHAR 0 16 "" 1 DatabaseName
5 SQLCHAR 0 128 "" 0 ExtraField
6 SQLCHAR 0 24 "" 2 DelivaryDay
7 SQLCHAR 0 4 "" 0 ExtraField
8 SQLCHAR 0 3 "" 0 ExtraField
9 SQLCHAR 0 24 "" 0 ExtraField
10 SQLCHAR 0 3 "" 0 ExtraField
11 SQLCHAR 0 24 "" 0 ExtraField
12 SQLCHAR 0 64 "" 0 ExtraField
13 SQLCHAR 0 24 "" 0 ExtraField
14 SQLCHAR 0 24 "" 0 ExtraField
15 SQLCHAR 0 24 "" 3 CompleteDate
16 SQLCHAR 0 24 "" 0 ExtraField
17 SQLCHAR 0 24 "" 0 ExtraField
18 SQLCHAR 0 24 "" 0 ExtraField
19 SQLCHAR 0 24 "" 0 ExtraField
20 SQLCHAR 0 256 "" 0 ExtraField
I need to take only three columns from the file, and the text file won't contain any delimiters between the fields.
I tried to execute this using BULK INSERT and I am getting the error:
Cannot bulk load. Invalid column number in the format file "C:sampleXXXX.fmt"
Can anyone help me with what the problem is here?
I am using SQL Server 2000.
View 1 Replies
View Related
Aug 24, 2007
Hi ,
I was wondering if there is a way, in a format file, to load a single host-file data field into more than one column in a table?
Thanks
View 1 Replies
View Related