T-SQL (SS2K8) :: Ignore First Row In Data File In Bulk Insert?
Feb 17, 2015
I use bulk insert to fill a table
I use the code below:
bulk insert dbo.test
from 'c:\test.txt'
with (FIRSTROW=2, FORMATFILE='c:\test.xml')
go
but the data inserted into the table starts from the 3rd row.
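A hedged guess at what is happening: FIRSTROW does not count physical lines, it counts logical rows as defined by the format file's terminators, and the documentation notes it is not intended for skipping header rows. If the first line of the file doesn't match the field layout in the format file, the engine can swallow it together with the next line as one malformed "row". A minimal sketch of the workaround, assuming the header line can be removed from the file beforehand:
bulk insert dbo.test
from 'c:\test.txt' -- header line stripped in advance
with (FIRSTROW=1, FORMATFILE='c:\test.xml')
go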
View 2 Replies
Oct 12, 2007
Hi,
I have a file which contains data as below:
3
123||
456||
789||
I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below.
BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
But I want to insert the data into a table having two columns; when I try that, nothing gets inserted.
Can anyone tell me how to do this?
Thanks,
-Badri
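One hedged approach, since every "||"-terminated value maps to a single field: keep the one-column load that already works, but aim it at a staging table, then INSERT...SELECT into the real two-column table, supplying the second column yourself (table and column names here are hypothetical):
CREATE TABLE #staging (phone VARCHAR(20));
BULK INSERT #staging FROM 'FILE_PATH'
WITH (KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '||');
INSERT INTO dbo.TwoColumnTable (phone, other_col)
SELECT phone, NULL -- or any derived value for the second column
FROM #staging;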
View 5 Replies
View Related
Jun 29, 2015
I'm trying to use BULK INSERT for the first time and am getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the "\r\n" is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE='M:\Data\ASX\SQLFormatImport.Fmt')
[code]...
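A hedged observation: in a non-XML format file, the host-file datatype column describes the data file, not the table, so for a plain text file SQLCHAR is correct even when the target column is nvarchar(6); SQLNCHAR only applies to Unicode (widechar) files. Also, a truncation error on the very first column often means the first field terminator never matches, so the whole line is read into field 1. A minimal sketch with assumed field widths and delimiter, ending each row on "\r\n":
9.0
3
1 SQLCHAR 0 6  ","     1 ASXCode   ""
2 SQLCHAR 0 50 ","     2 SomeField ""
3 SQLCHAR 0 50 "\r\n"  3 LastField ""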
View 5 Replies
View Related
May 23, 2007
Hi, I'm trying to bulk insert files that look like this:
aaaa,bbb,dddd,
ccc,dfd,tghj,
Each file can have up to 10 data fields per line, and every line in a particular file has the same number of data fields, say 3 as above. A second file could have, say, 10, and that is the maximum.
I read the file and insert the data with FIELDTERMINATOR into a temp table, from which I insert the data into other tables depending on some parameters inside.
Now problem is:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
That is because I'm trying to insert 3 fields of data into a temporary table which is made of 10 columns (it has to be 10 because the next file could have 10 fields of data). If the temp table has the same number of columns as the text file has data fields, then it works.
What is solution for this problem?
Can I bulk insert NULL in columns for which I don't have data?
I can also import each line of the text file into one column (with the delimiters inside), but then I don't know how to insert that data into the correct tables, or even into one table while separating the data fields into columns with the field terminator, which is ',' in this case.
I'm new to SQL and I would appreciate any help.
Thank you
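On the NULL question: yes, table columns that no file field maps to are simply left NULL, provided they are nullable. The format file has to describe the file, not the table, so a hedged sketch for a 3-field file going into the 10-column temp table is a format file with exactly 3 fields mapped to the first 3 columns (names and widths are hypothetical; one such file per field count):
9.0
3
1 SQLCHAR 0 50 ","    1 Col1 ""
2 SQLCHAR 0 50 ","    2 Col2 ""
3 SQLCHAR 0 50 "\r\n" 3 Col3 ""

BULK INSERT dbo.TempTable10 FROM 'c:\data\file3.txt'
WITH (FORMATFILE = 'c:\data\file3.fmt')
Columns 4 through 10 of the temp table then come through as NULL.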
View 3 Replies
View Related
Jun 17, 2007
Hey all
I have a bulk insert situation that would be nice to pull off. I have a flat file with 46 columns that are to go into a table. I want the table to have a 47th column, to be updated later on by means of a stored proc saying whether the import into the system was successful or not. I have the row terminator set as '"', thinking that would tell SQL to begin on the next row, leaving the importstatus column NULL, but I still receive an error.
First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data from the flat file.
If this is not possible, is it possible to bulk insert into a temp table?
Thanks
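Since the flat file's 46 fields line up with the table's first 46 columns, one hedged approach is to bulk insert through a view that leaves the 47th column out; a column no field maps to simply arrives NULL, ready for the stored proc to update later. And yes, bulk inserting into a temp table also works, as long as it exists in the same session. A sketch with hypothetical names and terminators:
CREATE VIEW dbo.vImportTarget AS
SELECT col01, col02, /* ... the 46 file-backed columns ... */ col46
FROM dbo.ImportTable;
GO
BULK INSERT dbo.vImportTarget FROM 'c:\data\import.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')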
View 1 Replies
View Related
Mar 12, 2015
I am running a set of SQL statements on a SQL server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL server. I seem to be getting an error which possibly points to a permissions issue.
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\jedox_daily\jdcom4401.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO
The error is :
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "c:\jedox_daily\jdcom4401.txt" could not be opened. Operating system error code 3 (failed to retrieve text for this error. Reason: 1815)
If it is permissions issue, how do I overcome this?
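A hedged note rather than a definitive diagnosis: operating system error code 3 is "the system cannot find the path specified", while a true permissions failure would surface as error code 5 (access denied). The path in a BULK INSERT is resolved on the SQL Server machine itself, so 'c:\jedox_daily\jdcom4401.txt' must exist on the server's own C: drive; if the file actually sits on another machine, a UNC path is the usual fix. A sketch with a hypothetical share:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM '\\fileserver\exports\jdcom4401.txt'
WITH (FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
GO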
View 1 Replies
View Related
Sep 29, 2007
Hi,
I have a data file in the following format:
SubjectId1|class1
SubjectId2|class2
SubjectId3|class3
I just want to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes.
The row delimiter is "\n" and the column delimiter is '|'.
Table Subjects
{
ID (Autoincrement)
SubjectId varchar(20)
}
Can anyone provide the format file for doing this, or suggest another way to do it?
Please note that the file may contain millions of records.
Thank u
~mohan
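A hedged sketch of such a format file: map the first field to SubjectId and give the second field a server column of 0, which tells the engine to discard it; the identity ID column is never mentioned and generates itself. Widths are assumptions:
9.0
2
1 SQLCHAR 0 20  "|"  1 SubjectId ""
2 SQLCHAR 0 100 "\n" 0 class     ""

BULK INSERT dbo.Subjects FROM 'c:\data\subjects.txt'
WITH (FORMATFILE = 'c:\data\subjects.fmt', TABLOCK)
With millions of records, TABLOCK (and a sensible BATCHSIZE) helps the load go faster.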
View 5 Replies
View Related
Aug 5, 2015
USE TEST
GO
/****** BULK INSERT ******/
BULK INSERT [Table01]
FROM 'C:\temp\data.csv'
[code]....
I am using the above code to insert CSV file data, which includes Arabic data as well. The upload is successful; however, the Arabic field data is uploaded with invalid characters, and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage)
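A hedged suggestion: invalid characters on load usually mean the file's encoding and the requested code page disagree. If the CSV is saved as Unicode (UTF-16), DATAFILETYPE = 'widechar' reads it directly; if it is UTF-8, CODEPAGE = '65001' is only honored on newer builds (SQL Server 2016, and 2014 SP2), so on older versions re-saving the file as Unicode is the simpler route:
BULK INSERT [Table01]
FROM 'C:\temp\data.csv'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')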
View 15 Replies
View Related
Dec 5, 2006
With "bcp MyDatabase.dbo.MyTable out C:MyFile.Dat -n -T" command line, I could get an exported data file. And I can also import this file into MyTable using 'BULK INSERT MyDatabase.dbo.MyTable FROM 'C:MyFile.dat' WITH (DATAFILETYPE='native');' query statement.
Now, I want to make my own data file just like made by bcp above. Although I could make file of 'char' type, 'native' type file is needed for performance and other reasons. And the format file should not be used.
Any one help?
View 5 Replies
View Related
Oct 1, 2007
Hi,
I have a data file and the contents of it are as follows
2 -- This is the header indicating the no of records in my files
1001|s1
1006|s2
The content of the format file is as follows. This is to skip the first column of all the rows and get only the Subs (i.e. s1 and s2):
9.0
2
1 SQLCHAR 0 100 "|" 0 ID ""
2 SQLCHAR 0 100 "\n" 1 Subs ""
Here is my query to get all the Subs from my data file
SELECT * FROM OPENROWSET( BULK 'datafile.txt',
FORMATFILE = 'FormatFile.fmt',
FIRSTROW = 2 ) AS a
But this query returns only s2, where I was expecting s1 and s2. The reason is that the first row, i.e. the header, doesn't follow the format.
Can anyone please let me know how to skip the first line in the data file and get the required result?
~Mohan
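What seems to happen here (hedged): the header line "2" contains no "|" terminator, so the parser keeps reading until it finds one and merges the header with the "1001|s1" line into a single malformed row; FIRSTROW = 2 then skips that merged row, leaving only s2. One workaround that avoids format-file row counting entirely, with hypothetical staging names:
CREATE TABLE #raw (line VARCHAR(200));

BULK INSERT #raw FROM 'datafile.txt' WITH (ROWTERMINATOR = '\n');

SELECT SUBSTRING(line, CHARINDEX('|', line) + 1, 200) AS Subs
FROM #raw
WHERE CHARINDEX('|', line) > 0;  -- the header "2" has no '|' and drops out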
View 6 Replies
View Related
Mar 2, 2004
Hello dbforums,
I am using a BULK INSERT to insert the data from an ASCII file into a SQL table. The table has a ProductInstanceId column that exists in the table but does not exist in the ASCII DICast data. I am setting the ProductInstanceId to a GUID that will be used for metrics. I would like to create the GUID in C++ and then set it somehow during the BULK INSERT into DICastRaw1hr and DICastRaw6hr. I am calling the BULK INSERT from C++/ADO. I do not see how you can set static data in the BULK INSERT for a column that exists in the table but not in the source data... it seems there should be a way to do this with the format file?
The other way to do this is with a TRIGGER. I have the TRIGGER below. Prior to the calling the BULK INSERT using ADO I will use ADO to ALTER the TRIGGER with the new Guid. When the BULK INSERT runs the ProductInstanceId will be populated with the new Guid.
ALTER TRIGGER DICastRaw1hrInsertGuid
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT AS UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE modelrundatetime = (select max(modelrundatetime) from Alphanumericdata.dbo.DICastraw1hr(NOLOCK))
More Questions:
- The trigger is slow. The bulk insert without the trigger runs in about 10 sec; with the trigger, in about 40 sec. I tried to use the SQL code below in the trigger, but it was only doing the UPDATE on the last row. The trigger must run after the BULK INSERT is complete. Now I am using the SELECT (bad). Any comments?
ALTER TRIGGER DICastRaw1hrInsertDate
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT
AS
DECLARE @ID as integer
SELECT @ID = i.recordid from inserted i
UPDATE dbo.DICastRaw1hr SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
WHERE recordid = @ID
- I understand that I could set the Guid in the Default Value part of the table definition using the NEWID() function. I need the Guid to be the same for all the rows that are inserted during the BULK INSERT (all have the same modelrundatetime) ... how would I do this?
Thanks,
Chris
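Two hedged notes. First, an INSERT trigger fires once per statement, and the inserted pseudo-table holds every row of the batch, so SELECT @ID = i.recordid FROM inserted keeps only one arbitrary row; joining against inserted updates them all, which should also remove the slow MAX(modelrundatetime) scan. Second, BULK INSERT fires no triggers at all unless the statement specifies FIRE_TRIGGERS, so make sure that hint is present. A set-based sketch:
ALTER TRIGGER DICastRaw1hrInsertGuid
ON Alphanumericdata.dbo.DICastRaw1hr
FOR INSERT
AS
UPDATE d
SET ProductInstanceId = '4f9a44eb-092b-445b-a224-cc7cdd207092'
FROM dbo.DICastRaw1hr AS d
INNER JOIN inserted AS i ON d.recordid = i.recordid;
Because the trigger stamps one literal GUID that is ALTERed in before each load, every row of that bulk insert gets the same value, which answers the NEWID() concern (a column default would generate a different GUID per row).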
View 6 Replies
View Related
Jun 27, 2006
Hi,
for testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT with the following code:
BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
WITH (
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n',
CODEPAGE = 'RAW',
DATAFILETYPE = 'char',
BATCHSIZE = 100000,
ROWS_PER_BATCH = 1925604,
TABLOCK
)
I have two versions of "rsk.txt", one with 1.9 million rows and one with the first 2000 rows only. The file has one column only, 115 characters wide, which I'll split into several columns later using SUBSTRING. The one with 2000 rows fires into the database with no problems whatsoever using this exact code; the other one throws the following error:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
How can I resolve this problem?
EDIT: I tried several different row and field terminators, but this exact one works for the small data file, so I assume it should also work for the large one... the large one, however, was copied directly using binary FTP from a Unix filesystem, and the small one was manually copied into a new txt file using UltraEdit.
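The binary-FTP detail makes this look like a line-ending mismatch (a hedged guess): a Unix file ends lines with LF only, the UltraEdit copy with CRLF, and for char-mode files BULK INSERT interprets ROWTERMINATOR = '\n' as CRLF. The big file is then read as one enormous "row", which is exactly the "column too long... row 1, column 1" symptom. On SQL Server 2008 and later the bare line feed can be given in hex; on older versions, build the statement dynamically around CHAR(10):
BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
WITH (ROWTERMINATOR = '0x0a', CODEPAGE = 'RAW', DATAFILETYPE = 'char',
BATCHSIZE = 100000, TABLOCK)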
View 1 Replies
View Related
Dec 3, 2015
I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple Windows domain user (nothing special, no admin rights, etc.). I am having some issues with using Bulk Insert when the data file is on a network share and I am using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:
Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')
Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:
Msg 4861, Level 16, State 1, Line 2 Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5(Access is denied.).
I found this snip in BULK INSERT (Transact-SQL), under Security Account Delegation (Impersonation), which says, in part (emphasis mine):
To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.
We tried unconstrained delegation and I rebooted the SQL server, but it still does not work. Later we tried constrained delegation, and it still does not work.
I have verified the SPNs:
C:\>setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
MSSQLSvc/SQLQA.ad.local:1433
MSSQLSvc/SQLDev.ad.local:1433
MSSQLSvc/SQLQA.ad.local
MSSQLSvc/SQLDev.ad.local
I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.
C:\>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go
net_transport auth_scheme
------------- -----------
TCP           KERBEROS
(1 rows affected)
1>
If I move the source file to a local drive (on the SQL server), everything works fine, but I need to be able to read from a file share.
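A hedged pointer for this double-hop setup: it is not enough for the service account to have its own SPNs; with constrained delegation, the SQL Server service account must be explicitly allowed to delegate to the cifs service of the machine hosting the share (on the account's Delegation tab, add the file server and pick the cifs service). Verifying the file server side, with a hypothetical host name:
C:\>setspn -l fileserver01
(the list should include cifs/fileserver01.ad.local, or a HOST/ entry that covers it)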
View 8 Replies
View Related
Jan 16, 2015
I have a script to be used to back up a specific table on a weekly basis; here is the approach I take:
1. script the source table's schema to a create
2. the new script:
If not exists (select * from sysobjects where name='EventlogHistory' and xtype='U')
CREATE TABLE [dbo].[EventlogHistory](
[LogID] [int] IDENTITY(1,1) NOT NULL,
[ProjectID] [int] NOT NULL,
[Description] [nvarchar](max) NOT NULL,
[EventType] [varchar](10) NOT NULL,
[IP] [varchar](50) NOT NULL,
[UserLogon] [varchar](30) NOT NULL,
[CreatedOn] [datetime] NOT NULL,
ArchivedOn datetime default getdate())
insert into EventlogHistory
It throws me the error message saying "String or binary data would be truncated", which is nonsense to me. I need to let SQL ignore the error; it is ridiculous to have to do a max(len()) to find the culprit, as this should never happen, right?
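A hedged guess at the cause: an INSERT ... SELECT * maps columns by position, so once the history table differs from the source (here it gains ArchivedOn), values can shift into shorter columns and trip the truncation check even when every individual value "should" fit. Listing the columns explicitly avoids that; the source table name below is hypothetical:
INSERT INTO EventlogHistory (ProjectID, Description, EventType, IP, UserLogon, CreatedOn)
SELECT ProjectID, Description, EventType, IP, UserLogon, CreatedOn
FROM dbo.Eventlog;
LogID is an identity and is deliberately left out, and ArchivedOn picks up its getdate() default.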
View 9 Replies
View Related
Sep 27, 2004
Hi,
I am trying to import data from a csv file into my table in SQL Server 2000. My table is called temp_table and consists of 3 fields.
column datatype
-------- -----------
program nvarchar(20)
description nvarchar(50)
pId int
pId has been set as the primary key with auto-increment.
My csv file has 2 columns of data and it looks like follows:
program, description
"prog1", "this is program1"
"prog2", "this is program2"
"prog3", "this is program3"
Now I use BULK INSERT like this:
"BULK INSERT ord_programs FROM 'C:\datafile.csv' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)"
to import data into my table in SQL server and it gives me this error
"Bulk insert data conversion error (type mismatch) for row 2, column 3 (pId)"
I guess I have to use a format file or something, since I don't have anything for the pId field in the csv file to make it work...
Please help me out guys and please post a snippet of code if you have.
Thank You.
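A hedged sketch: without a format file, BULK INSERT expects one field in the file per table column, so the two-field rows fail on pId. A format file describing just the two CSV fields lets the identity generate itself (widths assumed; the quotes in the data will come through as literal characters unless handled separately):
8.0
2
1 SQLCHAR 0 20 ","    1 program     ""
2 SQLCHAR 0 50 "\r\n" 2 description ""

BULK INSERT ord_programs FROM 'C:\datafile.csv'
WITH (FORMATFILE = 'C:\programs.fmt', FIRSTROW = 2)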
View 2 Replies
View Related
Nov 2, 2006
I'm importing a large csv file two different ways - one with Bulk Import Task and the other way with the Data Flow Task (flat file source -> OLE DB destination).
With the Bulk Import Task I'm putting all the csv rows in one column. With the Data Flow Task I'm mapping each csv value to its own column in the SQL table.
I used two different flat file sources and got the following:
Flat file 1: Bulk Import Task = 12,649,499 rows; Data Flow Task = 4,215,817 rows
Flat file 2: Bulk Import Task = 3,403,254 rows; Data Flow Task = 1,134,359 rows
Anyone have any guess as to why this is happening?
View 9 Replies
View Related
Apr 3, 2015
I am unable to load data from a flat file into a SQL table using a BULK INSERT SQL statement.
My code:-
DECLARE @filePath VARCHAR(200)
DECLARE @sql VARCHAR(8000)
Declare @filename varchar(100)
set @filename='CCNVZ_150401054418'
SET @filePath = 'I:\IncomingFiles\' + @FileName + '.txt'
[Code] .....
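Two hedged things to check in the elided part. BULK INSERT does not accept a variable after FROM, so the statement has to be assembled and run as dynamic SQL; and when the path is concatenated, a missing backslash between folder and file name silently produces a bad path. A sketch with a hypothetical target table:
DECLARE @filename VARCHAR(100) = 'CCNVZ_150401054418';
DECLARE @filePath VARCHAR(200) = 'I:\IncomingFiles\' + @filename + '.txt';
DECLARE @sql VARCHAR(8000) =
'BULK INSERT dbo.StagingTable FROM ''' + @filePath +
''' WITH (FIELDTERMINATOR = ''|'', ROWTERMINATOR = ''\n'')';
EXEC (@sql);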
View 1 Replies
View Related
Aug 19, 2014
The files have pipe delimiters and double quotes as text qualifiers. I can get the file to import with a bulk insert statement, but it brings the double quotes in as well. What setting can be set to indicate what the text qualifiers are?
Here are a few sample lines of data:
"id"|"system_id"|"system"|"last_modified_on"|"status"
"1"|"30101"|"H1"|"2013-05-16 09:33:19"|"1"
"2"|"30100"|"H1"|"2013-05-16 16:22:32"|"1"
"3"|"30103"|"H4"|"2013-05-16 16:22:32"|"1"
"4"|"30104"|"H3"|"2013-05-05 01:26:20"|"1"
View 0 Replies
View Related
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do, I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong ?
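A hedged diagnosis, since "unicode" covers several encodings: DATAFILETYPE = 'widechar' expects UTF-16, so a file actually saved as UTF-8 or ANSI will misparse and can throw exactly this conversion error; a quick check is whether the file begins with the FF FE byte-order mark. Note too that the double quotes are not treated as text qualifiers, so they will land in GA_desc as literal characters either way. If the file turns out not to be UTF-16, one sketch is:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='ACP', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='char')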
View 13 Replies
View Related
Jan 2, 2008
Hey All,
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained from suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98
My Table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML Format file? I am stumbling over how to tell it where to start picking up data for a specific column.
Is there any way that I could trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
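From the sample, a hedged reading of the layout: 7 characters of product ID, a fixed-width name, and the price running to the end of the line. An XML format file handles fixed-width data with CharFixed fields; the widths below are guesses to adjust against the real file, and trailing spaces in Name can be cleaned afterwards with UPDATE ... SET Name = RTRIM(Name):
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="7"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="20"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="10"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ProductID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="Cost" xsi:type="SQLMONEY"/>
  </ROW>
</BCPFORMAT>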
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
SqlConnection SqlConnection = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
SqlCommand cmd = new SqlCommand();
cmd.CommandType = CommandType.Text;
cmd.CommandText = "INSERT INTO PhonebookTable(Name, PhoneNumber) VALUES('" + txtName.Text.ToString() + "', '" + txtPhoneNumber.Text.ToString() + "')";
cmd.Connection = SqlConnection;
SqlConnection.Open();
cmd.ExecuteNonQuery();
SqlConnection.Close();
RefreshData();
I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
Thanks for your time,
Hayden.
View 1 Replies
View Related
May 5, 2006
Hi folks,
I have a small problem: I am unable to load data from a .csv file into a table in SQL Server. Here is the command I am running:
BULK INSERT CCSProgramParticipation FROM 'c:\test.csv'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Data in test.csv is the following format: (date fields can be blank)
NY580232,0,6/30/2006,3567,396,7/1/2005,9/9/2005
NY580232,0,6/30/2006,14850,462,12/12/2005,
....
....
What I see is that the data does get loaded; however, data from the following row is getting inserted into the last field of the previous row. It seems like the row terminator is being ignored.
Has anyone encountered this issue? Please let me know your thoughts on this.
Thanks so much!
-Parul
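A hedged guess, given that values from the next row bleed into the last (nullable date) field: for char-mode files, ROWTERMINATOR = '\n' is interpreted as CRLF, so if test.csv has bare LF line endings the terminator never matches and rows run together, which looks exactly like the terminator being ignored. On SQL Server 2008+ the bare line feed can be named in hex:
BULK INSERT CCSProgramParticipation FROM 'c:\test.csv'
WITH (DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a')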
View 6 Replies
View Related
Oct 9, 2013
I am bulk inserting from a csv file using
BULK INSERT testTable
FROM 'C:\Users\Robs\Documents\Soccer\2011-2012\SC1.csv'
WITH
(
FIRSTROW=2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
Works fine. Most of my csv files have the same number of columns, but some have 4 fewer. Those files contain the same column headings as the full-size ones (only 4 fewer). Is there a way, when bulk inserting, for SQL to either skip these or ignore the error?
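A hedged sketch for the short files, assuming the four missing columns are the trailing ones and are nullable: load those files through a view exposing only the columns the file actually has; unreferenced columns arrive NULL. Names are hypothetical:
CREATE VIEW dbo.vTestTableShort AS
SELECT col1, col2, col3, col4, col5 -- every column except the last 4
FROM dbo.testTable;
GO
BULK INSERT dbo.vTestTableShort
FROM 'C:\Users\Robs\Documents\Soccer\2011-2012\SC1_short.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')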
View 2 Replies
View Related
Mar 28, 2008
Well hi here is my text file
"Kelly","Reynold","kelly@reynold.com"
"John","Smith","bill@smith.com"
"Sara","Parker","sara@parker.com"
and a table with Id, name, surname, email.
The ID is an autoincrement, which is why it gives me an error. Is there any way to skip this ID so SQL creates it automatically for every row?
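One hedged way: bulk insert through a view (or a format file) that leaves the identity column out, so SQL generates the Id itself. Names are hypothetical, and note the surrounding quotes in the file will come through as literal characters unless they are also dealt with:
CREATE VIEW dbo.vPeopleLoad AS
SELECT name, surname, email FROM dbo.People;
GO
BULK INSERT dbo.vPeopleLoad FROM 'c:\people.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')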
View 3 Replies
View Related
Jul 20, 2005
Hi there,
I have some text files saved using a UTF-8 encoding. The "BULK INSERT" statement in SQL Server 2000 is failing with a column length error. Saving the file as ASCII removes the problem. However, I would like to import files in UTF-8 format. I understand that the BCP tool, when used from the command line, can take an "-F UTF8" argument, which allows it to work. Can this be done from the SQL> prompt with the BULK INSERT statement?
Cheers,
Tobin
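A hedged note from later versions: SQL Server 2000 rejects the UTF-8 code page outright, so re-encoding the files to UCS-2 or ASCII is the realistic workaround there; from SQL Server 2016 (and 2014 SP2) BULK INSERT accepts UTF-8 directly:
BULK INSERT dbo.MyTable FROM 'C:\data\utf8file.txt'
WITH (CODEPAGE = '65001', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')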
View 2 Replies
View Related
Dec 28, 2007
hi friends
I am using the BULK INSERT command for txt files, but now I want to use BULK INSERT for dbf files, so please can anyone tell me how to use this command for dbf files?
Thanks in advance...
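A hedged sketch, since BULK INSERT only reads flat text files and a .dbf is binary: go through an OLE DB provider instead, for example the Jet dBase driver via OPENROWSET (provider availability, the dBase level, and the names are assumptions; the connection string points at the folder and the query names the file). Ad hoc distributed queries must be enabled first:
SELECT *
INTO dbo.FromDbf
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'dBase IV;DATABASE=C:\data\',
'SELECT * FROM myfile.dbf');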
View 1 Replies
View Related
Feb 15, 2006
Hi everyone,
I am trying to bulk insert some data from a csv file into a table. I can do it as part of a button on-click event, but I don't know how to do it using a stored procedure. This is what I have:
ALTER PROCEDURE dbo.TestImportData (
@filename varchar(50) )
AS
BULK INSERT dbo.[TestTable]
FROM @filename
WITH
(
FIELDTERMINATOR = ','
)
/* SET NOCOUNT ON */
RETURN
I get the error message "Incorrect syntax near '@filename'", "Incorrect syntax near 'with'". What am I doing wrong? What should I do? Please help!
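The syntax errors come from BULK INSERT not accepting a variable as its FROM clause; the statement has to be built as a string and executed dynamically. A hedged rewrite of the procedure:
ALTER PROCEDURE dbo.TestImportData (
@filename varchar(50) )
AS
BEGIN
DECLARE @sql varchar(1000)
SET @sql = 'BULK INSERT dbo.[TestTable] FROM ''' + @filename +
''' WITH (FIELDTERMINATOR = '','')'
EXEC (@sql)
END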
View 1 Replies
View Related
Jun 27, 2002
Hi ,
I have imported an event log file from an NT server into a test database. The table has been created automatically with some 10 columns named Col001, Col002, ... and so on up to Col010.
Now I want to copy the data of the event logs into specific columns. So I created a table with columns such as Evnt, date, time, server and evntdescription, so that I can execute a simple query like
Select * from tablename where type = 'app' and Server = 'test1'
and get all the results for server 'test1' with type 'application'.
The problem is how to insert that event log file into the table I have created with different column names, so that the data with 'App' goes into the column 'type', the data with the server name goes into the column 'server', and so on.
I tried all the ways but could not succeed. Can I get some help in this regard please, through stored procedures or through DTS, if it's possible?
The server is SQL Server 2000 with SP 1.
Many thanks
Anita .
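Since the event log already sits in the auto-created Col001...Col010 table, a hedged sketch of the copy is a plain INSERT ... SELECT that maps each generic column onto a named one; which Col holds which value is an assumption to verify against the imported data:
INSERT INTO dbo.Evnt ([type], [date], [time], [server], evntdescription)
SELECT Col004, Col001, Col002, Col003, Col010
FROM dbo.ImportedEventLog;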
View 1 Replies
View Related
Apr 8, 2001
I want to use the bulk insert statement to insert data from a text file that contains more columns than the target sql table does. I am using SQL 7.0.
I am using a format file, but I can't work out how to achieve the above. SQL Books Online (and the MSDN website) do not describe how to do this, but it is intimated that it can be done.
Any suggestions ?
Regards,
Stuart.
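The trick (hedged, but this is what format files are for) is that a field's server column number can be 0, which discards it: describe every field the file has and map only the ones the table wants. A sketch for a four-field file loading a two-column table, with assumed widths and the 7.0 format version to match the server:
7.0
4
1 SQLCHAR 0 50 ","    1 ColA
2 SQLCHAR 0 50 ","    2 ColB
3 SQLCHAR 0 50 ","    0 skip1
4 SQLCHAR 0 50 "\r\n" 0 skip2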
View 1 Replies
View Related
Jul 16, 2004
Hi All,
I have a file that has a fixed row size of 148 and fixed column sizes, but the file has no end-of-line character. I know it is weird, but a client has made the file and refuses to change the format, so I am stuck with reading it the way it is.
In Enterprise Manager, I used the Import/Export wizard, specified fixed length, and it let me specify 148 as the length of each line. Then it recognized the file and I was able to read it in.
I saved the DTS package and I can run it over and over again using dtsrun. However, I want to do the same thing using BULK INSERT. How do you specify a fixed row length for BULK INSERT, and how do you give it individual column lengths?
Thanks,
Shab
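A hedged sketch: with no line terminator at all, a non-XML format file can declare fixed-length fields (prefix 0, empty terminator "") whose lengths sum to 148, and the engine reads the file as back-to-back records. Shown here with the whole record as one field, to be split with SUBSTRING afterwards; individual column lengths could replace it once known:
8.0
1
1 SQLCHAR 0 148 "" 1 WholeRecord ""

BULK INSERT dbo.FixedStage FROM 'c:\data\fixed148.dat'
WITH (FORMATFILE = 'c:\data\fixed148.fmt')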
View 1 Replies
View Related
Oct 20, 2005
SQL Server 2K
OK, I'm probably being a bone-head here and am clearly in over my head but how do you (or can you?) set up a Bulk Insert to take a dynamic path/file name?
What I want to do is pass in the path and file name from an external process to a stored procedure that bulk inserts the content of the file and then does some other routines on it. I haven't had any luck getting Bulk Insert to run if the path/file name is not hard-coded in the sproc as a string.
The point is to have a master routine that can exercise the process for several different customers and use meta data in a table to inform what file to bulk insert.
Any suggestions?
Thanks!
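Same hedged answer as the parameterized-procedure cases elsewhere in this thread: BULK INSERT's FROM clause must be a literal, so the sproc has to build the statement dynamically from the path it is handed or looks up. Names here are hypothetical:
DECLARE @custId int
SET @custId = 42 -- hypothetical customer
DECLARE @path varchar(260)
SELECT @path = ImportPath FROM dbo.CustomerImportMeta WHERE CustomerId = @custId -- hypothetical metadata table
DECLARE @sql varchar(8000)
SET @sql = 'BULK INSERT dbo.CustomerStage FROM ''' + @path + ''''
EXEC (@sql)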
View 9 Replies
View Related
Feb 15, 2007
I import a group of INSERT statements from a text file... for example:
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
The file contains 1000 inserts (approx.); I read the file line by line and execute each insert.
In VS.NET 2003 it works correctly and the process consumes little memory, but in VS.NET 2005 the Pocket PC runs out of space.
How can I specify the growth factor of the SQL Mobile database?
Could something else be happening?
Sorry for my English... I speak Spanish.
Thanks.
View 4 Replies
View Related
Mar 11, 2008
I have a log file, and I am trying to import data from it into SQL in order to analyze the data (to be able to query it); however, that task seems impossible.
In fact, the log file contains a varying number of column fields (errors logged and the various types of data logged demand varying numbers of columns). More than that, the fields themselves are hard to extract.
An example of data in my log (xxxxxxxx is some alphanumeric chars):
2008-01-09 20:16:05,4784E36F.req,10.1.1.26,xxxxxxxxx,OK -- SMPP - xxxxxxx:xxxxx,Sender=xxxx;SMSCMsgId=2028eecc;Binary=1;DCS=8;Data=xxxxxxxxxxxxxx...
2008-01-09 20:16:05,4784E338.req,10.1.1.26,xxxxxxxx,Retry Pending - ERROR: Timeout waiting for response from server or lost connection -- SMPP - xxxxxxxxxxx:xxxxx,Sender=xxxxx........
I could use regular expressions to extract the data and maybe use regular INSERTs to put it in the right table. That amounts to a manual bulk insert (and it may take much more time), which seems strange; can I use some additional tool (in the SQL package or external) to assist somehow?
Thanks and sorry if this is double posted !
View 1 Replies
View Related
Apr 4, 2008
Hello, I am doing a bulk insert from a csv file using an XML format file. Most everything works just fine, but my delimiter is a comma; the problem is that one of the column sets sometimes contains a comma within double quotes, like this:
value1,"value,2",value3
So when I do my insert, it distorts the column values because, unlike Excel, it is not ignoring the comma within the quotes. Is there any way to set an attribute within the format file to prevent this from happening?
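A hedged answer: the XML format file schema has no text-qualifier attribute, and because the quotes here only appear on some fields, quote-consuming terminator tricks will not work either. On SQL Server 2017 and later, the built-in CSV parser handles optional RFC 4180 quoting without a format file; before that, pre-processing the file (or importing through SSIS, which does support a text qualifier) is the practical route:
BULK INSERT dbo.MyTable FROM 'C:\data\file.csv'
WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIELDTERMINATOR = ',')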
View 2 Replies
View Related