Bulk Insert, Skip Rows With Duplicate Key Error?
May 21, 2007
Does SQL Server have a way to handle errors in a sproc which would allow
one to insert rows, ignoring rows which would create a duplicate key
violation? I know if one loops one can handle the error on a row-by-row
basis. But is there a way to skip the loop and do it as a bulk insert?
It's easy to do in Access, but I'm curious to know whether SQL Server proper
can handle it like this. I am guessing that a looping operation would be
slower to execute?
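One set-based approach, sketched here with hypothetical tables Target and Source keyed on KeyCol, is to filter out rows whose key already exists; on SQL Server 2005 and later a unique index created with IGNORE_DUP_KEY can instead discard the violating rows with only a warning:

INSERT INTO Target (KeyCol, Val)
SELECT s.KeyCol, s.Val
FROM Source AS s
WHERE NOT EXISTS (SELECT 1 FROM Target AS t WHERE t.KeyCol = s.KeyCol);

-- alternatively, let the unique index swallow duplicates instead of raising an error
CREATE UNIQUE INDEX IX_Target_Key ON Target (KeyCol) WITH (IGNORE_DUP_KEY = ON);

Either way the insert stays a single set operation, which is generally much faster than a row-by-row loop.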
Oct 31, 2014
I used the code below to do a bulk insert. Since the csv file's first row holds the column names, how do I skip the first row?
BULK
INSERT TEST
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
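The usual fix is FIRSTROW = 2, which tells BULK INSERT to start at the second row; a sketch against the same table and path (the '\n' terminator is an assumption about the file's line endings):

BULK INSERT TEST
FROM 'c:\test.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO

One caveat: FIRSTROW skips rows as parsed by the terminators, not physical lines, so the header must have the same shape as the data rows for this to land correctly.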
Oct 1, 2007
Hi,
I have a data file and the contents of it are as follows
2 -- This is the header indicating the no of records in my files
1001|s1
1006|s2
The content of the format file is as follows. This is to skip the first column of all the rows and get only the Subs (i.e. s1 and s2):
9.0
2
1 SQLCHAR 0 100 "|" 0 ID ""
2 SQLCHAR 0 100 "
" 1 Subs ""
Here is my query to get all the Subs from my data file
SELECT * FROM OPENROWSET( BULK 'datafile.txt',
FORMATFILE = 'FormatFile.fmt',
FIRSTROW = 2 ) AS a
But this query returns only s2 where I was expecting s1 and s2. The reason is that the first row, i.e. the header, doesn't follow the format.
Can anyone please let me know how to skip the first line in the data file and get the result as required?
~Mohan
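Because the header row doesn't match the format, one workaround (a sketch, with a hypothetical staging table) is to load whole lines into a single-column table and split them in T-SQL, letting a pipe test drop the header:

CREATE TABLE #RawLines (Line varchar(4000));

BULK INSERT #RawLines FROM 'datafile.txt' WITH (ROWTERMINATOR = '\n');

SELECT SUBSTRING(Line, CHARINDEX('|', Line) + 1, 100) AS Subs
FROM #RawLines
WHERE CHARINDEX('|', Line) > 0;  -- the header line contains no pipe, so it falls out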
Oct 10, 2007
Hi,
I have a data file which consists of data as below,
4
PPU_FFA7485E0D||
T_GLR_DET_11||
While I am inserting into the table using bulk insert, this pipe (||) is also getting inserted into the table.
Here is the query I am using to insert the data with bulk insert:
BULK INSERT TABLE_NAME FROM FILE_PATH
WITH (FIELDTERMINATOR = '||', KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '\n')
Can anyone help with this?
Thanks,
-Badri
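Since every data line ends with the double pipe, the trailing pair is arguably part of the row terminator; a sketch that folds it in (table name and path are placeholders from the post):

BULK INSERT TABLE_NAME FROM 'C:\data\datafile.txt'
WITH (FIELDTERMINATOR = '||', KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '||\n')

If the pipes still come through, it usually means the terminators never reached BULK INSERT in this form, so printing the assembled statement before running it is worth a try.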
Apr 8, 2008
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:
BULK INSERT TableName
FROM 'C:\Data\Db\TableName.bcp'
WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following:
DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help.
Paul
Jun 29, 2015
I'm trying to use Bulk insert for the first time and I'm getting the following error. I think it might have something to do with my format file; from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore "\r\n" is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK
INSERT tbl_ASX_Data_temp
FROM
'M:\Data\ASX\ImportTest.txt'
WITH
(FORMATFILE='M:\Data\ASX\SQLFormatImport.Fmt')
[code]...
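For reference, a minimal format file for a widechar file whose first column feeds an nvarchar(6) field might look like the sketch below. SQLNCHAR lengths count bytes, so 6 UTF-16 characters need 12, and in a widechar file the terminators are two-byte characters too (hence the trailing \0s); the second column (Price) is purely hypothetical:

9.0
2
1 SQLNCHAR 0 12 "\t\0" 1 ASXCode ""
2 SQLNCHAR 0 100 "\r\0\n\0" 2 Price ""

Single-byte terminators against a widechar file are a frequent source of exactly this truncation error.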
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do I get this;
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong?
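If the load succeeds apart from the quotes, one pragmatic cleanup (a sketch) is to bulk insert as-is and strip the qualifiers afterwards, since BULK INSERT has no text-qualifier option of its own:

UPDATE tmp_GA_status
SET GA_desc = REPLACE(GA_desc, '"', '')
WHERE GA_desc LIKE '%"%';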
Feb 20, 2004
Hi, I need to insert rows into table1 from table2 and table3, but I don't want to insert repeated combinations of col2, col3. So table1 has the primary key (col2, col3).
This the table1:
create table table1(
col1 int not null,
col2 int not null,
col3 int not null,
constraint PK_table1 primary key (col2, col3)
)
This is my "insert" code:
INSERT INTO table1
SELECT table2.col1,table2.col2, table3.col3
FROM table2, table3
WHERE table2.col1 = table3.col1
Which conditions should I add to this code?
Thanks.
fmilano.
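One sketch: collapse the duplicate combinations in the source with GROUP BY, and guard against combinations already in table1 with NOT EXISTS. Taking MIN() as the pick of col1 per combination is an assumption; any aggregate rule would do:

INSERT INTO table1 (col1, col2, col3)
SELECT MIN(table2.col1), table2.col2, table3.col3
FROM table2
JOIN table3 ON table2.col1 = table3.col1
WHERE NOT EXISTS (SELECT 1 FROM table1
                  WHERE table1.col2 = table2.col2
                    AND table1.col3 = table3.col3)
GROUP BY table2.col2, table3.col3;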
Jul 23, 2005
I have a table using an identity column as its primary key and two columns (table reduced for simplicity), EmployeeNumber and ArrivalTime.

CREATE TABLE [tblRecords] (
[ID] [bigint] IDENTITY (1, 1) NOT NULL,
[EmployeeNumber] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ArrivalTime] [datetime] NOT NULL,
CONSTRAINT [PK_tblRecords] PRIMARY KEY CLUSTERED ([ID]) ON [PRIMARY]
) ON [PRIMARY]
GO

I have an insert procedure that checks for duplicates before inserting a new record:

IF (SELECT TOP 1 [ID] FROM tblRecords WHERE EmployeeNumber = @SocialSecurity) IS NULL
BEGIN
INSERT INTO tblRecords (EmployeeNumber, ArrivalTime)
VALUES (@EmployeeNumber, @ArrivalTime)
SELECT SCOPE_IDENTITY()
END
ELSE
SELECT 0 AS DuplicateRecord

In 99.9% of the cases, this works well. However, in the event that the insert attempts are literally "ticks" apart, the "SELECT TOP 1..." command completes on both attempts before the first attempt completes. So I end up with duplicate entries if the procedure is called multiple times very quickly. The system needs to prevent duplicate EmployeeNumbers within the past 45 days, so setting EmployeeNumber to UNIQUE would not work. I can check for older entries (45 days or newer) very easily, but I do not know how to handle the times when the procedure is called multiple times within milliseconds. Would a TRANSACTION with a duplicate check after the INSERT with a ROLLBACK work in this case? Any help is greatly appreciated!
-E
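A common pattern for this race (a sketch, with the 45-day window folded into the check) is to serialize the test and the insert with UPDLOCK and HOLDLOCK hints inside one transaction, so a second caller blocks until the first commits:

BEGIN TRANSACTION;
IF NOT EXISTS (SELECT 1 FROM tblRecords WITH (UPDLOCK, HOLDLOCK)
               WHERE EmployeeNumber = @EmployeeNumber
                 AND ArrivalTime > DATEADD(day, -45, GETDATE()))
BEGIN
    INSERT INTO tblRecords (EmployeeNumber, ArrivalTime)
    VALUES (@EmployeeNumber, @ArrivalTime);
    SELECT SCOPE_IDENTITY();
END
ELSE
    SELECT 0 AS DuplicateRecord;
COMMIT TRANSACTION;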
Dec 6, 2007
Hi Guys,
My little bulk insert is only bringing in every second row of a CSV file. This is not good, as I need every row.
My SQL command is as follows:
InsertCommand="BULK INSERT TBL_Unitel_services FROM 'C:/webroot/servicedesk/csvs_Services/csv.csv' WITH (FIRSTROW = 1, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', MAXERRORS = 0)"
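Importing every second row is the classic symptom of a row terminator that doesn't match the file's real line endings, so each "row" spans two lines; worth trying the explicit terminators in turn (a sketch):

BULK INSERT TBL_Unitel_services
FROM 'C:/webroot/servicedesk/csvs_Services/csv.csv'
WITH (FIRSTROW = 1, FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n', MAXERRORS = 0)
-- if the file uses bare LF line endings, try ROWTERMINATOR = '\n' instead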
May 15, 2008
Greetings
Got the bulk insert thing going real nice thanks to your feedback! But now I notice that the BULK INSERT creates 2 identical rows while the flat file has the column headers and then the corresponding row, just one row.
Why would it do that? Interestingly, if the data file and format file are residing on a shared folder on SQL 2005 and I run the BULK INSERT from Studio, it only inserts one row. But when I do the BULK INSERT via the user interface it creates 2 rows. What is going on here?
Aug 7, 2006
Hello
I am using:
Microsoft SQL Server 2000 - 8.00.2039 (Intel X86) May 3 2005 23:18:38
Copyright (c) 1988-2003 Microsoft Corporation
Desktop Engine on Windows NT 5.1 (Build 2600: Service Pack 2)
I am using BULK INSERT to import some pipe-delimited flat files into a database.
I am firstly converting the file using VB.NET, to ensure each line of the file has a carriage return (by using streamwriter.writeline), and I am also ensuring there is no blank line at the end of the file (by using streamwriter.write).
Once I have done this, my BULK INSERT command appears to work OK. This is how I am using the statement:
BULK INSERT
tempHISTORY
FROM 'C:\TEMP\HISTORY.TXT'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
NB: The first row in the file is a header row.
This appears to work OK; however, I have found that certain files seem to miss the final line of the file! I have analysed these files in case they have an inconsistent number of columns, but they don't.
I have also found that if I knock off the last column of the tempHISTORY table, the correct number of rows are imported. However, of course, I can't just discard one of the columns from the file, I need to import the entire file.
I cannot understand why BULK INSERT is choosing to miss the final line in the file, when the schema of the destination table matches the structure of the file.
May 15, 2015
I have a file which has some wind data that I am trying to import into a SQL database through bulk insert. If the script works as it is supposed to, I should see 144 rows impacted, but I see 0 rows affected.
BULK INSERT TOWER.RAWINTERFACE_1058 FROM 'C:\Temp\900020150427583.txt'
WITH(CHECK_CONSTRAINTS, CODEPAGE='RAW', DATAFILETYPE='char', FIELDTERMINATOR=' ', ROWTERMINATOR='\n', FIRSTROW=172)
The code works if the file is large, but if it's small, 0 rows are affected; also, if I remove the header rows then the file works again. I want to understand what is going on here. I am including a screen shot of the file in Notepad++. I have tried changing the row terminator to '\n' and '\r\n', and I have also tried to change the codepage, but nothing seems to work. No error file is being generated either, if I give the error file option.
Jul 13, 2006
I have a stored procedure which will run automatically. I've got try...catch code in the procedure, but I found a bug with the code where, if there are any import errors, it doesn't recognize that there was an error and it runs through the try code as though there were no problems. (I reported the bug.)
I added some code using @@rowcount to check if there were rows imported, and if not, it moves the data file to an error folder so I know there was a problem with the import. But this only checks if at least one row was imported, not if all the rows in the data file have been imported. (I.e. if the first row imported correctly and the second did not, it still sees it as successful.)
The problem is some of the data files have only one row to import and some have multiple rows. Is there a way to count the number of rows in the data file, then count the number of rows imported, to verify they are the same number?
Thanks,
Laura
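One sketch, assuming xp_cmdshell is enabled and using a hypothetical path and table: capture @@ROWCOUNT immediately after the BULK INSERT, count the file's physical lines with find /c /v "", and compare the two:

DECLARE @imported int, @expected int;

BULK INSERT dbo.MyTable FROM 'C:\data\import.txt' WITH (FIELDTERMINATOR = ',');
SET @imported = @@ROWCOUNT;

CREATE TABLE #cnt (line varchar(255));
INSERT #cnt EXEC master..xp_cmdshell 'find /c /v "" "C:\data\import.txt"';

-- find prints "---------- C:\DATA\IMPORT.TXT: n"; take the number after the last colon
SELECT @expected = CONVERT(int, LTRIM(RIGHT(line, CHARINDEX(':', REVERSE(line)) - 1)))
FROM #cnt WHERE line LIKE '----------%';

IF @imported <> @expected
    PRINT 'Row count mismatch - move the file to the error folder';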
Aug 7, 2015
I am trying to BULK INSERT csv files using a stored procedure in SQL Server 2008 R2 SP3. Although the files contain several thousand lines and BULK INSERT returns no errors, no data is actually imported into the table. Every field in the table is an NVARCHAR(50) datatype.
Here is the code for the operation (only the parameters for the insert itself):
set @open = 'bulk insert [DWHStaging].[dbo].[Abverkaufsquote] from '''
set @path = 'G:\DataStaging\DWHStaging\Source\Abverkaufsquote'
set @params = ''' with (firstrow = 2
, datafiletype = ''widechar''
, fieldterminator = '';''
, rowterminator = ''\n''
, codepage = ''1252''
, keepnulls);'
The csv file originates from a DB2 database. Using exactly the same code base I can import several other types of CSV files without problems.
The files are stored on the local server as UCS-2 Little Endian, and one difference is that the files that do not import do not include a BOM. The other difference is that the failed files are non-UNICODE files.
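A debugging sketch: print the assembled command before executing it, and match datafiletype to the file's actual encoding, since widechar expects UTF-16 (normally carrying a BOM) while a plain ANSI extract wants char:

DECLARE @sql nvarchar(max);
SET @sql = @open + @path + @params;
PRINT @sql;   -- eyeball the finished statement first
EXEC (@sql);
-- for the non-unicode files, switch datafiletype to 'char'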
Feb 3, 2010
We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'Wide Table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on a wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of files into the db table?
Apr 18, 2008
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab delimited text file to an empty sql table that has an Identity column defined. How do I tell the bulk insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to create the identity column in the table too.
Thanks.
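Besides the SSIS task, one T-SQL workaround (a sketch with hypothetical table and column names) is to bulk insert through a view that omits the identity column, letting the engine generate the values:

CREATE VIEW dbo.vMyTable_NoId
AS
SELECT Col1, Col2 FROM dbo.MyTable;  -- every column except the identity
GO

BULK INSERT dbo.vMyTable_NoId
FROM 'C:\data\import.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');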
Jul 23, 2005
Hello All,
Does the BCP utility enable you to selectively import rows from a flat file to a table?
For example: the first column in my flat file contains a record type - 1, 2..7. I only need to import types 1, 2, & 3. Can this be specified in the .fmt file?
Thanks in advance
hharry
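A format file can reorder or skip columns, but it cannot filter rows by value, so the usual approach is a staging table plus a filtered insert (a sketch, with a hypothetical two-field layout and path):

CREATE TABLE #staging (RecType char(1), Payload varchar(500));

BULK INSERT #staging FROM 'C:\data\flat.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

INSERT INTO dbo.Target (RecType, Payload)
SELECT RecType, Payload
FROM #staging
WHERE RecType IN ('1', '2', '3');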
May 15, 2007
Hi,
How do I skip the 12 header rows of my XLS input source?
(This needs to happen before the Excel driver reads the rows, by default 8, that it samples in the specified source to guess at the data type of each column.)
thx,
f.sor
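From the T-SQL side one can dodge the driver's row sampling altogether by querying an explicit range that starts below the header block; a sketch using the Jet provider (file, sheet, and range are hypothetical, and ad hoc distributed queries must be enabled):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
     'Excel 8.0;Database=C:\data\book.xls;HDR=NO',
     'SELECT * FROM [Sheet1$A13:K65536]');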
Feb 7, 2006
OK. We know there is the 'Header rows to skip' option and it works great.
I've got the file that has a "footer". Here is an example:
.
PSC
filename=table1
records=0000000000525
ldbname=db1
timestamp=2006/02/07-16:25:00
numformat=44,46
dateformat=mdy-1910
map=NO-MAP
cpstream=ISO8859-1
.
0000260611
It's ALWAYS the last 12 rows.
Is there a way to split at this point and put the 12 rows in a different location? The task is twofold - I don't need these control rows in my data, and I need the value of "records" to verify the loaded number of rows.
UPDATED: After some testing I found out that the Flat File source does not see that footer at all. This is good and bad - I do want to load this metadata into some other tables.
Dima.
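If the footer does reach a staging table, a sketch of the twofold task (assuming the lines were bulk-loaded into a hypothetical single-column #raw): pull the records= value out, then discard the control rows:

DECLARE @expected int;

SELECT @expected = CONVERT(int, LTRIM(SUBSTRING(Line, 9, 30)))
FROM #raw
WHERE Line LIKE 'records=%';

DELETE FROM #raw
WHERE Line LIKE '%=%' OR Line IN ('.', 'PSC');
-- the trailing control number (e.g. 0000260611) needs a file-specific rule of its own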
Oct 8, 1999
I'm doing a bulk insert from a text file to sql server 7
I'm getting an error:
Server: Msg 4867, Level 16, State 1, Line 1
Bulk insert data conversion error (overflow) for row 1, column 169 (LOT_WIDTH).
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
The statement has been terminated.
Now my lot-width field coming in is defined as a numeric 9(5).
My table is defined as an INT.
Any suggestion? I'm new to SQL7
Thanks
Jason
Jan 9, 2007
Hello SQLTEAM
I have a flat fixed-length file...
H315620060417
H315620060417
I have a format file
8.0
2
1 SQLCHAR 0 5 "" 4 MCO_Number SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 8 "
" 5 Run_Date SQL_Latin1_General_CP1_CI_AS
I get the following BULK INSERT error:
"BULK INSERT data conversion error (truncation) for row 1, column 1 (MCO_Number)."
Columns in the destination table are nvarchar(5) and nvarchar(8). I have tried using "\r\n" and "\n" as row terminators.
Any help appreciated.
Sep 6, 2007
I am attempting to bulk insert a comma-delimited text file with double quotes as the text qualifier, but I keep getting an error message (EOF) on the bulk insert.
I think the problem lies in my format file (see below)
Please take a look and let me know what I am missing?
Thanks,
Matt
Error message:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Format File:
8.0
19
1 SQLCHAR 0 0 """ 0 first_quote SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 0 "","" 1 nt_id SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 0 "","" 2 first_name SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 0 "","" 3 last_name SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 0 "","" 4 department SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 0 "","" 5 phone SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 0 0 "","" 6 mgmt_level SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 0 "","" 7 emp_id SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 0 "","" 8 rc SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 0 "","" 9 subrc SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 0 "","" 10 location SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 0 0 "","" 11 floor SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 0 0 "","" 12 supervisor_id SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 0 0 "","" 13 status SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 0 "","" 14 hiredate SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 0 "","" 15 jobtitle SQL_Latin1_General_CP1_CI_AS
17 SQLCHAR 0 0 "","" 16 paygrade SQL_Latin1_General_CP1_CI_AS
18 SQLCHAR 0 0 "","" 17 id SQL_Latin1_General_CP1_CI_AS
19 SQLCHAR 0 0 ""
" 18 email SQL_Latin1_General_CP1_CI_AS
Feb 28, 2008
My server was updated from SQL 2000 to SQL 2005 SP2.
I found an error in bulk insert: -
"Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly."
I did read a few articles saying that after applying SP2 and hotfixes this error should be fixed, but unfortunately it is not in my case. What should I do to fix it?
This is my script: -
BULK INSERT wng01_work..nw_business_person FROM 'g:\SQLFTP\CDIS_Extract\new_worker.dat'
WITH
(
MAXERRORS = 1,
FORMATFILE ='g:\sqlftp\cdis_extract\new_work.fmt'
)
Please advise. Thank you.
Jun 5, 2006
I've got the following SP to automatically insert all files in a directory into the database:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE Imp_Header_PO_sp
@FilePath varchar(1000) = 'D:\EBT\Outbound\',
@WIPPath varchar(1000) = 'D:\EBT\Outbound\WIP\',
@ArchivePath varchar(1000) = 'D:\EBT\Outbound\Archive\',
@FileNameMask varchar(1000) = '*Header.txt'
AS
BEGIN
SET NOCOUNT ON;
declare @Filename varchar(1000),
@File varchar(1000)
declare @cmd varchar(2000)
create table #Dir (s varchar(8000))
-- Move Header files to WIP
select @cmd = 'move ' + @FilePath + @FileNameMask + ' ' + @WIPPath
select @cmd = 'dir /B ' + @WIPPath + @FileNameMask
delete #Dir
insert #Dir exec master..xp_cmdshell @cmd
delete #Dir where s is null or s like '%not found%'
-- Import file
while exists (select * from #Dir)
begin
select @FileName = min(s) from #Dir
select @File = @WIPPath + @FileName
select @cmd = 'bulk insert'
select @cmd = @cmd + ' POWebOutHeader'
select @cmd = @cmd + ' from'
select @cmd = @cmd + ' ''' + replace(@File,'"','') + ''''
select @cmd = @cmd + ' with (Fieldterminator = ',')'
-- Import the data
exec (@cmd)
-- remove filename just imported
delete #Dir where s = @FileName
-- Archive the file
select @cmd = 'move ' + @WIPPath + @FileName + ' ' + @ArchivePath + @FileName
exec master..xp_cmdshell @cmd
end
drop table #Dir
END
GO
When I try to execute the code, I get the following error, on this line: select @cmd = @cmd + ' with (Fieldterminator = ',')'
Msg 141, Level 15, State 1, Procedure Imp_Header_PO_sp, Line 46
A SELECT statement that assigns a value to a variable must not be combined with data-retrieval operations.
I've tried to find a fix for this error, but it seems to relate only to a SELECT statement and not a BULK INSERT. Can someone please help me figure out how to fix this error?
Thanks,
Laura
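The parser treats the single quote inside ',' as the end of the string literal. Embedded quotes have to be doubled inside T-SQL string literals, so the offending line would presumably become:

select @cmd = @cmd + ' with (Fieldterminator = '','')'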
Apr 30, 2007
Simple test project. Created Flat File connection, database connection (both local), and Bulk Insert Task. When running the package I get the following error:
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. Bulk load: An unexpected end of file was encountered in the data file.".
I've tried different settings for the Flat File config, and the database connection, but still get the error. Any suggestions would be helpful.
Tks.
Jun 26, 2006
Hi,
I am using bulk insert to insert a lot of information from a file. In many cases it does the work, but in one place it gives me an exception.
My code:
BULK INSERT tblCompVSNet1 FROM 'E:\EasySeries\WindowsApplication1\bin\Debug\tblCompVSNet1.tbl' WITH ( FIELDTERMINATOR = '|', ROWTERMINATOR = '|', LASTROW = 0, ROWS_PER_BATCH = 10000, CODEPAGE = 'RAW', TABLOCK)
The error:
Invalid object name 'tblCompVSNet1'
The table exists, and the query works fine in Query Analyzer, but in code through OLEDB it sometimes doesn't work.
How can I solve my problem?
Thanks
Alexei
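Since the statement runs fine in Query Analyzer, the intermittent failure is often just the OLEDB connection landing in a different default database; fully qualifying the table name sidesteps that (a sketch, the database name is hypothetical):

BULK INSERT [MyDatabase].[dbo].[tblCompVSNet1]
FROM 'E:\EasySeries\WindowsApplication1\bin\Debug\tblCompVSNet1.tbl'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '|', LASTROW = 0, ROWS_PER_BATCH = 10000, CODEPAGE = 'RAW', TABLOCK)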
Aug 9, 2006
Hi All,
I have this data file with fixed length (see below). I am able to insert it into the database using bcp, but now I want to skip (not insert) the rows which start with the letter 'S'. Is there a way to do it? By the way, I am using the -F2 option to skip the first record.
Here is my data:
Record 1 04XXX
2 13106900240120042003040045061 Testing N POLYDOROS TRUSTEEE
2 12621241640280041004040045633 What are they MARTIN &XXXXX
S C1000003200400409850000059611000000500001000000001 9613000000576497500
S X1000003200000209850000059613000000000000000000001 9613000000573497000
Thanks for your help.
Ted Lee
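bcp format files cannot filter rows by content, so the usual workaround is to stage the raw lines and filter on insert (a sketch with hypothetical names; the -F2 option maps to FIRSTROW = 2 here):

CREATE TABLE #staging (RecLine varchar(200));

BULK INSERT #staging FROM 'C:\data\fixed.txt'
WITH (ROWTERMINATOR = '\n', FIRSTROW = 2);

INSERT INTO dbo.Target (RecLine)
SELECT RecLine FROM #staging
WHERE RecLine NOT LIKE 'S%';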
Nov 28, 2007
Dear all,
In the Flat File Source properties window there's a Preview node; when we check that node there's an option to skip a number of rows of data. Does it affect the result?
Best regards,
Hery
Jun 12, 2007
hI,
I have this particular problem with the unpivot. Below is my flat file source. The dates can go up to 130 columns, and this count can also vary. SM, SR, SB are values repeating for different instruments - they are the values of the instrument on the particular dates. This is a snapshot of one feed; other feeds may have differing dates. How do I read this file?
Problem 1: If I skip the first row and unpivot the 2nd row, then with a new feed with new dates my SSIS package will bomb, as it will not find the column names.
Problem 2: If I uncheck "Use first row as column headers" then problem 1 is solved, but the output will be
20080101
20061102
20061103
1.2
1.3
1.2.
1.5
.....and so on..
Is there any other way to fix this? These are feeds with the spread values of instruments on particular dates. Please help.
RUN 2.01E+11 132238 0 45
INSTRID DATATYPES 20081101 20061102 20061103
Z03369 SM 1.1 1.2 1.3
Z03369 SB 1.3 1.3 1.7
Z03369 SR 2 3 4
Z81910 SM 1.1 1.2 1.3
Z81910 SB 1.3 1.3 1.7
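On the T-SQL side, one sketch for a variable set of date columns: stage the feed into a temp table whose date columns keep their yyyymmdd names, then build the UNPIVOT list dynamically from the staged metadata (the #feed table and its fixed INSTRID/DATATYPES columns are assumptions from the sample):

DECLARE @cols nvarchar(max), @sql nvarchar(max);

SELECT @cols = STUFF((SELECT ',' + QUOTENAME(name)
                      FROM tempdb.sys.columns
                      WHERE object_id = OBJECT_ID('tempdb..#feed')
                        AND name NOT IN ('INSTRID', 'DATATYPES')
                      FOR XML PATH('')), 1, 1, '');

SET @sql = N'SELECT INSTRID, DATATYPES, ValueDate, SpreadValue
FROM #feed
UNPIVOT (SpreadValue FOR ValueDate IN (' + @cols + ')) AS u;';

EXEC (@sql);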
Apr 13, 2000
Help! I am importing a large comma-delimited text file into an existing table using the BULK INSERT command. The table is 4 columns (char16, char16, varchar50, char1). The first 100 or so lines go in without an error. Then I receive an error stating that an entry is too long for the field in the database, and it kicks me out. The entry is 50 characters, which is allowed. Any ideas why this would happen?
Oct 23, 2000
I am using the following bulk insert statement:
bulk insert DB_Kash.dbo.tb_category
from 'C:\cpdata\SUPPLIER_5305\OUTPUT\Tb_Category.txt'
with (formatfile = 'C:\tb_category.txt')
This works on one SQL server and the same code does not work on another server. I have taken care to see that the path is appropriate.
Are there any server settings involved?
The table structure is the same in both cases and the 'select into/bulk copy' option has been selected.
The table has full-text indexing set on it, but I don't think that this would make any difference.
The only error it gives is "OLE DB Stream reported an error. The stream does not provide any explanation regarding this error." Something like this.
In fact, the same format file and data file work fine when I am doing BCP.
I have tried with the 'select into/bulk copy' option on and off too.
Any info on this is greatly appreciated.
thanks
Sush.