Bulk Import Error
Feb 26, 2013
This is my source data in CSV format:
4,23,2AY5623,7235623
4,23,2GP1207,1451207
4,23,2GQ6689,4186689
Table:
CREATE TABLE [dbo].[Table1](
[idCodeLevel] [int] NOT NULL,
[idFirm] [int] NOT NULL,
[valCodeFrom] [varchar](15) NOT NULL,
[valCodeTo] [varchar](15) NOT NULL
) ON [PRIMARY]
[code]....
I googled and found out that I might have to use a format (.fmt) file. But how can I create a .fmt file for a CSV file? I have only seen code that creates a .fmt file from a SQL table.
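For reference, when every column in the CSV lines up with a column in the table in order (as it appears to here), BULK INSERT can often be run without a format file at all; a minimal sketch, assuming a hypothetical file path:
BULK INSERT dbo.Table1
FROM 'C:\data\codes.csv'          -- hypothetical path to the CSV shown above
WITH (
    FIELDTERMINATOR = ',',        -- comma-separated fields
    ROWTERMINATOR   = '\n'        -- one record per line
);
A format file only becomes necessary when the file and table columns differ in number or order.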
View 5 Replies
Sep 27, 2004
Hi,
I am trying to import data from a CSV file into my table in SQL Server 2000. My table is called temp_table and consists of 3 fields.
column datatype
-------- -----------
program nvarchar(20)
description nvarchar(50)
pId int
pId has been set to primary key with auto_increment.
My CSV file has 2 columns of data and looks as follows:
program, description
"prog1", "this is program1"
"prog2", "this is program2"
"prog3", "this is program3"
Now I use BULK INSERT like this:
"BULK INSERT ord_programs FROM 'C:\datafile.csv' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)"
to import data into my table in SQL server and it gives me this error
"Bulk insert data conversion error (type mismatch) for row 2, column 3 (pId)"
I guess I have to use a format file or something, since I don't have anything for the pId field in the CSV file to make it work...
Please help me out, guys, and please post a snippet of code if you have one.
Thank You.
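For reference, one commonly suggested way around the identity column is to bulk insert through a view that exposes only the two columns present in the file, letting pId take its auto-increment value; a minimal sketch (the view name and file path are hypothetical):
-- view exposing only the columns that exist in the CSV
CREATE VIEW dbo.v_temp_table_import AS
SELECT program, description FROM dbo.temp_table;
GO
BULK INSERT dbo.v_temp_table_import
FROM 'C:\datafile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
Note that the surrounding double quotes in the data will be loaded as part of the values unless a format file strips them.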
View 2 Replies
View Related
Jun 29, 2015
I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file; from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, therefore "\r\n" is correct as the terminator on line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
[code]...
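For reference, a minimal sketch of what a non-XML format file for a Unicode (widechar) file with an nvarchar(6) first column might look like; the version number, second column name, lengths and terminators below are assumptions, not taken from the actual file:
9.0
2
1 SQLNCHAR 0 12 "\t" 1 ASXCode SQL_Latin1_General_CP1_CI_AS
2 SQLNCHAR 0 510 "\r\n" 2 CompanyName SQL_Latin1_General_CP1_CI_AS
SQLNCHAR is only correct if the data file itself is Unicode; for a plain ANSI text file the host type stays SQLCHAR even when the table column is nvarchar.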
View 5 Replies
View Related
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong?
View 13 Replies
View Related
Apr 8, 2008
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:
BULK INSERT TableName
FROM 'C:DataDbTableName.bcp'
WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following:
DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help.
Paul
View 1 Replies
View Related
May 25, 2004
Hi, I don't know how to do this:
I've got 100+ .CSV text files (50 MB in total, not each) that I need to import into a SQL Server database, but I don't know how.
I've tried hunting the web for a file concatenation program, but only came up against pay-for software, which didn't help.
Any ideas? Would the BCP command do it?
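For reference, a minimal sketch of looping BULK INSERT over a folder of files with xp_cmdshell (the same idea as the stored procedure further down this page); the folder, table name and delimiters here are hypothetical:
CREATE TABLE #files (fname varchar(512))
INSERT #files EXEC master..xp_cmdshell 'dir /B C:\import\*.csv'
DELETE #files WHERE fname IS NULL OR fname LIKE '%not found%'
DECLARE @f varchar(512), @sql varchar(1000)
WHILE EXISTS (SELECT * FROM #files)
BEGIN
    SELECT @f = MIN(fname) FROM #files
    -- build and run one BULK INSERT per file
    SET @sql = 'BULK INSERT dbo.TargetTable FROM ''C:\import\' + @f
             + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)'
    EXEC (@sql)
    DELETE #files WHERE fname = @f
END
DROP TABLE #files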
View 14 Replies
View Related
Jul 26, 2006
I am trying to import a data file, which is tab delimited, using BULK INSERT. I have used BCP to create a format file, since the destination table has around 20 columns but the data file has only three.
Here's the problem: the columns I am trying to import comprise ID (an int identity column), Name (a varchar(255) column) and Status (a smallint column). The data file contains identity values for the first column, so I am using the KEEPIDENTITY modifier. The Status column is mandatory, so I have set all rows in the data file to zero for that column. All of the other columns in the destination table either allow NULL or have default values. When I BULK INSERT the file using the format file the identity columns are NOT imported and the Status column gets the value 3376. The Name column is the only one that gets imported correctly. Here's the format file (sorry it's a bit messy):
8.0
3
1 SQLINT 0 4 "\t" 1 ID ""
2 SQLCHAR 0 0 "\t" 2 Name SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 "\r\n" 4 Status ""
Where is 3376 coming from, and why are my identity values for column ID not being imported?
View 3 Replies
View Related
Jul 11, 2001
Hi folks,
We are trying to import a flat file from a remote machine using the BULK INSERT command.
We have mapped the remote directory on the server.
we have used the following command.
Bulk insert table_name from '\\my_server\my_share\filename.txt'
The error it gives is operating system Error number 5.( Access denied)
We are able to access the same file from Windows Explorer. We also referred to Books Online and, as suggested,
have set the system path to the same share name.
But it does not work.
Could you help us in this regard?
Thanks.
-Rajesh
View 2 Replies
View Related
Jun 25, 2014
I have imported data in my table using the bulk insert command. I was supposed to fill specific columns of my table with that data so I used a view to put them in the column I wanted.
The table looks like this now:
id | id_param | val_param
+-----------+--------------+
1 | no_tel | 742062141
2 | sex | 1
3 | age | 23
4 | no_tel | 765234157
5 | sex | 1
6 | age | 34
When I want to select only the rows where val_param = 1 for id_param = 'sex', using this query:
select * from bd_rox where id_param='sex' and val_param='1'
it returns no rows and I don't know why. The wanted result should look like this:
id | id_param | val_param
+-----------+--------------+-
2 | sex | 1
5 | sex | 1
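For reference, a frequent cause of this is a stray carriage return or space picked up during the bulk load (for example when the row terminator was '\n' but the file actually used CR LF); a quick way to check and, if needed, clean it up:
-- compare the visible text with its actual byte length
SELECT id, id_param, val_param, DATALENGTH(id_param) AS id_bytes, DATALENGTH(val_param) AS val_bytes
FROM bd_rox
WHERE id_param LIKE '%sex%'
-- hypothetical cleanup if a trailing CR or space turns out to be the culprit
UPDATE bd_rox
SET id_param  = LTRIM(RTRIM(REPLACE(id_param,  CHAR(13), ''))),
    val_param = LTRIM(RTRIM(REPLACE(val_param, CHAR(13), '')))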
View 9 Replies
View Related
Mar 11, 2008
I have a log file, and I am trying to import data from it into SQL in order to analyze the data (to be able to query on it); however, that task seems impossible.
In fact, the log file contains a varying number of column fields (errors logged and the various types of data logged demand a varying number of columns). More than that, the fields themselves are hard to extract.
An example of data in my log (xxxxxxxx is some alphanumeric chars):
2008-01-09 20:16:05,4784E36F.req,10.1.1.26,xxxxxxxxx,OK -- SMPP - xxxxxxx:xxxxx,Sender=xxxx;SMSCMsgId=2028eecc;Binary=1;DCS=8;Data=xxxxxxxxxxxxxx...
2008-01-09 20:16:05,4784E338.req,10.1.1.26,xxxxxxxx,Retry Pending - ERROR: Timeout waiting for response from server or lost connection -- SMPP - xxxxxxxxxxx:xxxxx,Sender=xxxxx........
I may use regular expressions to extract the data, and maybe use a regular INSERT to put it in the right table. That amounts to doing a bulk insert by hand (and it may take much more time), which seems strange; can I use some additional tool (in the SQL package or external) to assist somehow?
Thanks, and sorry if this is double-posted!
View 1 Replies
View Related
Feb 27, 2008
Let me preface this request with the info that I am relatively new to SQL Server, so I may be asking something that is really basic and/or is not a best practice, but here goes.... I need to import data from an Excel spreadsheet; one of the columns may be null or may have an integer value. I'd like to replace any null values with a 1 during import so that calculations can be done with the field once the data are imported. Can someone give me an example of how to do this? I had planned to use the bulk insert option, but if there's a better way please let me know. Thanks in advance for any advice.
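For reference, a minimal sketch of the usual staging-table approach: bulk load the spreadsheet rows as-is into a staging table, then copy them across with ISNULL supplying the default. The table and column names here are hypothetical:
INSERT INTO dbo.TargetTable (SomeKey, SomeValue)
SELECT SomeKey, ISNULL(SomeValue, 1)       -- replace NULLs with 1 during the copy
FROM dbo.Staging_FromExcel;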
View 3 Replies
View Related
Jul 26, 2006
Hi!.
Is there something like BCP utility in SQL CE?
I need to import (periodically) a large amount of data into my CE database. When I tested the import over the network it took a lot of time. That's why I decided to send the raw data in ASCII files (because of their small size) and to import the files into the CE database.
Certainly, it's not a problem to write such a CLI utility myself, but I'm interested to know if someone has already done this...
Thanks, Sandr
View 3 Replies
View Related
Jun 19, 2014
I found loads of things but nothing seems to work...
I'm trying to get a link with XML data inside the page into a table but I can't find anything
View 9 Replies
View Related
Mar 18, 2008
Hi Guys, I'm trying to do a Bulk Insert but I am receiving the following error:
conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 8 (Phone). Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 8 (Phone).".
Task failed: dbo.tblNoName
The data is comma delimited, which I stipulated in my Import Connection section. But the data also has quotes around it (i.e. the field Lname nvarchar(17) = "Brown" and the field phone nvarchar(10) = "12345678921"). Is there a way to ignore the quotes or do I have to remove them before I import?
Or is my problem something else all together?
The connection is solid;
Format = "Specify"
RowDelimiter = {CR}{LF}
columnDelimiter = Comma {,}
No other options are set.
The data looks like:
"tstLName","tstFname","000 N Tst DR","IDAHO sp","ID","00000000",
Thank you,
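For reference, if the load is switched to BULK INSERT with a format file, the quotes can be folded into the terminators, in the same way as the larger format file example further down this page; a minimal three-column sketch with hypothetical column names:
8.0
3
1 SQLCHAR 0 0 "\"" 0 first_quote ""
2 SQLCHAR 0 0 "\",\"" 1 Lname SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 0 "\"\r\n" 2 Fname SQL_Latin1_General_CP1_CI_AS
The final terminator has to match exactly how each line ends; the sample row above appears to end with a closing quote followed by a trailing comma, which would make it "\",\r\n" instead.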
View 2 Replies
View Related
Nov 28, 2005
I would appreciate some help on a procedure that I have. Using Bulk Insert, I would like to import records from a text file. The issue I have is that the file contains a header, '1AMC_TO_Axiz', and a footer, '1AMC_TO_Axiz2'. Using a format file, I can get the import to work by editing the file and removing these two entries. Is there a way to set up the format file to skip these two entries? My file currently looks like this:
7.0
16
1 SQLCHAR 0 50 "|" 1 keyMemberNo
2 SQLCHAR 0 10 "|" 2 fldEffdate
.................
16 SQLCHAR 0 10 "\r\n" 16 fldNewRecord
Thanks
Charles
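For reference, a minimal sketch of skipping the header and footer without editing the file; the table name, path and row count below are hypothetical:
BULK INSERT dbo.tblAxizImport
FROM 'C:\import\amc_to_axiz.txt'
WITH (
    FORMATFILE = 'C:\import\amc_to_axiz.fmt',
    FIRSTROW   = 2,        -- skip the '1AMC_TO_Axiz' header row
    LASTROW    = 1000      -- set to the last real data row to drop the footer
);
If the row count varies from file to file, an alternative is to leave LASTROW out and set MAXERRORS high enough that the lone footer row is simply allowed to fail.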
View 3 Replies
View Related
Apr 3, 2006
I have a web application that I am rebuilding. I have many picture files that I want to take off the file system and move into SQL as blobs. I will create an index of uids against the file names, but need a good way to bulk-add the files to the database... any hints on code or tools would be a great help.
Thanks
Bill
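For reference, a minimal sketch of pulling one file in as a blob with OPENROWSET (the same SINGLE_BLOB mechanism used in the XML post further down this page); the table, columns and path are hypothetical, and in practice the statement would be generated per file from the filename index:
INSERT INTO dbo.Pictures (uid, FileName, ImageData)
SELECT NEWID(), 'photo001.jpg', img.BulkColumn
FROM OPENROWSET(BULK 'C:\images\photo001.jpg', SINGLE_BLOB) AS img;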
View 1 Replies
View Related
Nov 12, 2015
I have more than 500 CSV files with a similar structure [Same column name and same data format]. I would like to load these files in a database table on the SQL Server 2014 database.
View 21 Replies
View Related
Dec 21, 2006
We have a SQL 2005 x64 database (data warehouse related), essentially a work area for us, that we truncate and re-populate via BCP weekly. (We don't back up the database at all.) From the perspective of data-import speed, what is the best recovery model to use: Bulk-Logged or Simple? (I have read the SQL 2005 BOL and don't find it particularly clear on this point.)
Barkingdog
P.S. Anyone know of an article listing "best practices" for high-speed data import?
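For reference, both SIMPLE and BULK_LOGGED permit minimally logged bulk loads; since the database is never backed up, SIMPLE is usually the lower-maintenance choice. A minimal sketch with a hypothetical database, table and path, using TABLOCK to qualify for minimal logging:
ALTER DATABASE WorkDW SET RECOVERY SIMPLE;
BULK INSERT dbo.StageTable
FROM '\\fileserver\feeds\weekly.dat'
WITH (DATAFILETYPE = 'widenative', TABLOCK);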
View 1 Replies
View Related
Feb 26, 2008
I'm experiencing issues importing XML data using a distributed query with the following statement, which is run from an XP client named WorkstationA connecting to SQL2005 SP2 ServerB; the XML data is located on ServerC.
Ad hoc queries using OPENROWSET have been enabled and verified.
The SQL Server service is running using a domain user account with permissions to read the remote files. I have logged in locally to the SQL server and verified this. It still fails even if the SQL services are running using LocalSystem.
User on Workstation A is authenticated with Integrated security (SQL Admin) and has rights to read the XML files on ServerC.
WorkStationA = SQL2005 Mgt Studio running the query
ServerB = SQL2005 SP2
ServerC = XML data files
DECLARE @xml XML
SELECT @xml =CONVERT(XML, bulkcolumn, 2)
FROM OPENROWSET(BULK '\\SERVERC\SHAREPATH\DATAFILE.XML', SINGLE_BLOB) AS x
SELECT @xml
Results: Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\SERVERC\SHAREPATH\DATAFILE.XML" could not be opened. Operating system error code 5 (Access Denied).
The query fails when it is run from Workstation A connected to SQL ServerB querying data on ServerC via a UNC.
The query is successful when it is run from the local SQL ServerB. The problem is with distributed queries.
The query is successful when the XML files are local to the SQL server, including referencing them via a local UNC.
Thank you for any responses.
Hamish
View 4 Replies
View Related
Sep 21, 2006
It seems to me that files created on Unix machines with line terminator \n, or chr(10), cannot be imported using the Bulk Insert statement. Is this a bug, or an oversight by Microsoft? Does this mean that unless one replaces all \n with \r\n, there is no way to use Bulk Insert to import Unix files? This is a very strange behavior by MSSQL. Even lesser programs such as Excel and Word automatically recognize chr(10) as a line termination character. Am I missing something, or is this just the way MSSQL is?
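For reference, BULK INSERT can be pointed at the bare line feed by building the statement dynamically with CHAR(10) (on newer versions the hex form ROWTERMINATOR = '0x0a' is also commonly used); a minimal sketch with a hypothetical table, path and field terminator:
DECLARE @sql nvarchar(4000)
SET @sql = N'BULK INSERT dbo.UnixFeed FROM ''C:\import\unixfile.txt'' '
         + N'WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''' + CHAR(10) + N''')'
EXEC (@sql)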
View 7 Replies
View Related
Nov 29, 2006
I am trying to simplify a query given to me by one of my colleagues, written using the query designer of Access. Looking at the query there seem to be some syntax differences, so to see if this was the case I thought I would import the database to my SQL Server Developer edition.
I tried to start the wizard from within SQL Server Management Studio Express as shown in one of the articles on MSDN which did not work, but the manual method also suggested did work.
Trouble is that it gets most of the way through the import until it spews forth the following error messages:
- Prepare for Execute (Error)
Messages
Error 0xc0202009: {332B4EB1-AF51-4FFF-A3C9-3AEE594FCB11}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not start session. Too many sessions already active.".
(SQL Server Import and Export Wizard)
Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Source 33 - ATable" (2065) failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard).
There does not seem to be any method of specifying a number of sessions, so I don't see how to get round the problem.
Does anyone know how I can get the import to work?
View 2 Replies
View Related
Sep 11, 2015
I have records in Excel format (Excel 2010) and I would like to bulk import them into SQL Server 2008; while importing, I would like SQL Server to automatically create a new table based on the header row of the source file.
I am not sure if SQL Server 2008 has this capability.
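For reference, one way to get the table created automatically from the header row is SELECT ... INTO over an OPENROWSET query against the workbook; a minimal sketch, assuming the ACE OLE DB provider is installed and ad hoc distributed queries are enabled, with a hypothetical path and sheet name:
SELECT *
INTO dbo.NewTableFromExcel
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0 Xml;HDR=YES;Database=C:\import\source.xlsx',
                'SELECT * FROM [Sheet1$]');
With HDR=YES the first spreadsheet row supplies the column names.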
View 0 Replies
View Related
Jan 12, 2006
Hi all,
when trying to import files to our database server from a client, I keep getting an error:
- Validating (Error)
Messages
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Source_txt" (1).
(SQL Server Import and Export Wizard)
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Data Conversion 1" (175).
(SQL Server Import and Export Wizard)
... Doing the same import when logged on to the server hasn't been giving me any errors; how come? From my client I can import tables from other DB servers without trouble, but whenever it is files it won't do it.
I tried, as mentioned in other threads, rerunning setup to re-install SSIS, but as it was already installed it wouldn't re-install. My next move would be to make a clean install, but I'm not sure it would help, as I think this is a bug.
best regards
Musa Rusid
View 1 Replies
View Related
Oct 8, 1999
I'm doing a bulk insert from a text file to SQL Server 7.
I'm getting an error:
Server: Msg 4867, Level 16, State 1, Line 1
Bulk insert data conversion error (overflow) for row 1, column 169 (LOT_WIDTH).
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
The statement has been terminated.
Now my lot-width field coming in is defined as a numeric 9(5).
My table is defined as an INT.
Any suggestion? I'm new to SQL7
Thanks
Jason
View 1 Replies
View Related
Jan 9, 2007
Hello SQLTEAM
I have a flat fixed-length file...
H315620060417
H315620060417
I have a format file
8.0
2
1 SQLCHAR 0 5 "" 4 MCO_Number SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 8 "\r\n" 5 Run_Date SQL_Latin1_General_CP1_CI_AS
When I run the BULK INSERT I get the following error:
"Bulk insert data conversion error (truncation) for row 1, column 1 (MCO_Number)."
Columns in the destination table are nvarchar(5) and nvarchar(8). I have tried using "\r\n" and "\n" as row terminators.
Any help appreciated.
View 1 Replies
View Related
Sep 6, 2007
I am attempting to bulk insert a comma delimited text file with double quotes as the text qualifier, but I keep getting an error message (EOF) on the bulk insert.
I think the problem lies in my format file (see below).
Please take a look and let me know what I am missing.
Thanks,
Matt
Error message:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Format File:
8.0
19
1 SQLCHAR 0 0 "\"" 0 first_quote SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 0 "\",\"" 1 nt_id SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 0 "\",\"" 2 first_name SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 0 "\",\"" 3 last_name SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 0 "\",\"" 4 department SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 0 "\",\"" 5 phone SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 0 0 "\",\"" 6 mgmt_level SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 0 "\",\"" 7 emp_id SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 0 "\",\"" 8 rc SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 0 "\",\"" 9 subrc SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 0 "\",\"" 10 location SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 0 0 "\",\"" 11 floor SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 0 0 "\",\"" 12 supervisor_id SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 0 0 "\",\"" 13 status SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 0 "\",\"" 14 hiredate SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 0 "\",\"" 15 jobtitle SQL_Latin1_General_CP1_CI_AS
17 SQLCHAR 0 0 "\",\"" 16 paygrade SQL_Latin1_General_CP1_CI_AS
18 SQLCHAR 0 0 "\",\"" 17 id SQL_Latin1_General_CP1_CI_AS
19 SQLCHAR 0 0 "\"\r\n" 18 email SQL_Latin1_General_CP1_CI_AS
View 11 Replies
View Related
Feb 28, 2008
My server updated from SQL2000 to SQL2005, SP2.
I found an error in bulk insert:
"Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.".
I did read a few articles saying that after applying SP2 and hotfixes this error should be fixed, but unfortunately it is not in my case. What should I do to fix it?
This is my script: -
BULK INSERT wng01_work..nw_business_person FROM 'g:\SQLFTP\CDIS_Extract\new_worker.dat'
WITH
(
MAXERRORS = 1,
FORMATFILE ='g:\sqlftp\cdis_extract\new_work.fmt'
)
Please advise. Thank you.
View 4 Replies
View Related
Jun 5, 2006
I've got the following SP to automatically insert all files in a directory into the database:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE Imp_Header_PO_sp
@FilePath varchar(1000) = 'D:EBTOutbound',
@WIPPath varchar(1000) = 'D:EBTOutboundWIP',
@ArchivePath varchar(1000) = 'D:EBTOutboundArchive',
@FileNameMask varchar(1000) = '*Header.txt'
AS
BEGIN
SET NOCOUNT ON;
declare @Filename varchar(1000),
@File varchar(1000)
declare @cmd varchar(2000)
create table #Dir (s varchar(8000))
-- Move Header files to WIP
select @cmd = 'move ' + @FilePath + @FileNameMask + ' ' + @WIPPath
select @cmd = 'dir /B ' + @WIPPath + @FileNameMask
delete #Dir
insert #Dir exec master..xp_cmdshell @cmd
delete #Dir where s is null or s like '%not found%'
-- Import file
while exists (select * from #Dir)
begin
select @FileName = min(s) from #Dir
select @File = @WIPPath + @FileName
select @cmd = 'bulk insert'
select @cmd = @cmd + ' POWebOutHeader'
select @cmd = @cmd + ' from'
select @cmd = @cmd + ' ''' + replace(@File,'"','') + ''''
select @cmd = @cmd + ' with (Fieldterminator = ',')'
-- Import the data
exec (@cmd)
-- remove filename just imported
delete #Dir where s = @FileName
-- Archive the file
select @cmd = 'move ' + @WIPPath + @FileName + ' ' + @ArchivePath + @FileName
exec master..xp_cmdshell @cmd
end
drop table #Dir
END
GO
When I try to execute the code, I get the following error, on this line: select @cmd = @cmd + ' with (Fieldterminator = ',')'
Msg 141, Level 15, State 1, Procedure Imp_Header_PO_sp, Line 46
A SELECT statement that assigns a value to a variable must not be combined with data-retrieval operations.
I've tried to find a fix for this error, but it seems to only relate to a SELECT statement and not a Bulk Insert. Can someone please help me figure out how to fix this error?
Thanks,
Laura
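For reference, the error on that line comes from the unescaped single quotes inside the string literal; doubling them up is the usual fix for that one line:
select @cmd = @cmd + ' with (Fieldterminator = '','')'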
View 5 Replies
View Related
Apr 30, 2007
Simple test project. Created Flat File connection, database connection (both local), and Bulk Insert Task. When running the package I get the following error:
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. Bulk load: An unexpected end of file was encountered in the data file.".
I've tried different settings for the Flat File config, and the database connection, but still get the error. Any suggestions would be helpful.
Tks.
View 4 Replies
View Related
Jun 26, 2006
Hi,
I am using bulk insert to load a lot of information from a file into my database. In many cases it does the work, but in one place it gives me an exception.
My code:
BULK INSERT tblCompVSNet1 FROM 'E:\EasySeries\WindowsApplication1\bin\Debug\tblCompVSNet1.tbl' WITH ( FIELDTERMINATOR = '|', ROWTERMINATOR = '|', LASTROW = 0, ROWS_PER_BATCH = 10000, CODEPAGE = 'RAW', TABLOCK)
The error:
Invalid object name 'tblCompVSNet1'
The table exists and the query works fine in Query Analyzer, but in code through OLE DB it sometimes doesn't work.
How can I solve my problem?
Thank's
Alexei
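For reference, "Invalid object name" through OLE DB is often a default-database mismatch: the connection lands in a different database than Query Analyzer does. One commonly suggested sketch is to fully qualify the table (the database name below is hypothetical):
BULK INSERT EasyDB.dbo.tblCompVSNet1
FROM 'E:\EasySeries\WindowsApplication1\bin\Debug\tblCompVSNet1.tbl'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '|', LASTROW = 0, ROWS_PER_BATCH = 10000, CODEPAGE = 'RAW', TABLOCK)
Alternatively, set the Initial Catalog in the OLE DB connection string to the database that owns the table.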
View 6 Replies
View Related
Apr 13, 2000
Help! I am importing a large comma delimited text file into an existing table using the BULK INSERT command. The table has 4 columns (char16, char16, varchar50, char1). The first 100 or so lines go in without an error. Then I receive an error stating that an entry is too long for the field in the database, and it kicks me out. The entry is 50 characters, which is allowed. Any ideas why this would happen?
View 1 Replies
View Related
Oct 23, 2000
I am using the following bulk insert statement:
bulk insert DB_Kash.dbo.tb_category
from 'C:cpdataSUPPLIER_5305OUTPUTTb_Category.txt'
with (formatfile = 'C:b_category.txt')
This works on one SQL Server, but the same code does not work on another server. I have taken care to see that the path is appropriate.
Are there any server settings involved?
The table structure is the same in both cases and the select into/bulk copy option has been selected.
The table has full text indexing set on it but I don't think that this would make any difference.
The only error it gives is "OLE DB Stream reported an error. The stream does not provide any explanation regarding this error." Something like this.
In fact, the same format file and data file work fine when I am doing BCP.
I have tried with the select into/bulk copy option on and off too.
Any info on this is greatly appreciated.
thanks
Sush.
View 1 Replies
View Related
May 26, 2004
Hi,
Can someone help me out with capturing the bulk insert error? I have a job which calls a procedure in which I use the bulk insert command. If the bulk insert fails for some reason, such as a wrong delimiter or wrong path, then the job fails. I need to trap that error so that the job doesn't stop and goes on to the next cursor record.
Thanks,
Nodbek
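For reference, on SQL Server 2005 and later the bulk insert can be wrapped in TRY/CATCH so the loop keeps going (on SQL 2000 the usual workaround is to run the statement through EXEC() and check @@ERROR); a minimal sketch with a hypothetical table, path and logging table:
BEGIN TRY
    BULK INSERT dbo.StageTable
    FROM 'C:\import\feed.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
END TRY
BEGIN CATCH
    -- log the failure and fall through so the cursor moves on to the next record
    INSERT INTO dbo.ImportErrors (ErrorMessage, LoggedAt)
    VALUES (ERROR_MESSAGE(), GETDATE())
END CATCH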
View 8 Replies
View Related