SQL Server 2008 :: BULK INSERT Inserting No Rows
Aug 7, 2015
I am trying to BULK INSERT csv files using a stored procedure in SQL Server 2008 R2 SP3. Although the files contain several thousand lines and BULK INSERT returns no errors, no data is actually imported into the table. Every field in the table has the NVARCHAR(50) datatype.
Here is the code for the operation (only the parameters for the insert itself):
set @open = 'bulk insert [DWHStaging].[dbo].[Abverkaufsquote] from '''
set @path = 'G:\DataStaging\DWHStaging\Source\Abverkaufsquote'
set @params = ''' with (firstrow = 2
, datafiletype = ''widechar''
, fieldterminator = '';''
, rowterminator = ''\n''
, codepage = ''1252''
, keepnulls);'
The csv file originates from a DB2 database. Using exactly the same code base I can import several other types of CSV files without problem.
The files are stored on the local server as UCS-2 Little Endian. One difference is that the files that do not import do not include a BOM; the other difference is that the failed files are non-Unicode files.
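For reference, here is roughly what the assembled statement should look like when it runs. The file extension and the backslash positions in the path are assumptions, and swapping DATAFILETYPE to 'char' is just one thing worth trying for the files without a BOM, since widechar expects UTF-16 input:

BULK INSERT [DWHStaging].[dbo].[Abverkaufsquote]
FROM 'G:\DataStaging\DWHStaging\Source\Abverkaufsquote.csv'
WITH (
FIRSTROW = 2,
DATAFILETYPE = 'char', -- assumption: try 'char' for the non-Unicode files lacking a BOM
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
CODEPAGE = '1252',
KEEPNULLS
);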
View 4 Replies
May 22, 2015
I've got a piece of code that returns 53 records when using just the SELECT section. When I change it to INSERT INTO ... SELECT, it only inserts 39 records into the receiving table. There are no keys/constraints/indices or anything else on the receiving table (it's just a dumping ground for some data that will be processed later).
The code for creating the table is here:-
USE [CDSExtractInpatients6.2]
GO
/****** Object: Table [dbo].[CDS_Inpatients_CDS_Feeds_Import] Script Date: 22/05/2015 15:54:15 ******/
SET ANSI_NULLS ON
GO
[code]...
I know most of the date fields are being created as varchar on here, but this is something I inherited and the SELECT is outputting the dates as text. Don't know if it makes any difference, but the server is running SQL 2008.
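One way to see exactly which rows go missing is to diff the source SELECT against the target with EXCEPT; a minimal sketch, with hypothetical column names standing in for the real list:

SELECT PatientID, AdmissionDate -- same column list as the INSERT
FROM dbo.SourceTable -- stands in for the original SELECT
EXCEPT
SELECT PatientID, AdmissionDate
FROM dbo.CDS_Inpatients_CDS_Feeds_Import;

If the EXCEPT comes back empty, bear in mind it compares distinct rows: a source query containing duplicate rows, or one that is non-deterministic between runs, can also explain a 53-versus-39 difference.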
View 9 Replies
View Related
Dec 28, 2013
I'm able to successfully import data from a tab-delimited .txt file using the following statement.
BULK INSERT ImportProjectDates FROM "C:\tmp\ImportProjectDates.txt"
WITH (FIRSTROW=2,FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
However, in order to import the text file, I had to add columns to the text file to match the columns that exist in the table. The original file is an export out of another database and contains all but 5 columns from my db.
How would I control which columns BULK INSERT actually imports when working with a .txt file? I've tried using a FORMAT FILE; however, I kept getting errors, which I tracked down to being a case of not using it with a .txt file.
Yes, I could have the DBA add the missing columns to the query from the other DB to create the columns; however, I'd like to know a little bit more about this overall.
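A non-XML format file can handle this: it lists one line per field in the data file and points each field at the target table column's ordinal, and table columns that never appear simply receive NULL or their default. A minimal sketch with hypothetical column names (10.0 is the SQL Server 2008 format version; here the file's third field lands in the table's fourth column):

10.0
3
1 SQLCHAR 0 50 "\t" 1 ProjectName ""
2 SQLCHAR 0 50 "\t" 2 StartDate ""
3 SQLCHAR 0 50 "\r\n" 4 EndDate ""

It would then be referenced with something like BULK INSERT ImportProjectDates FROM 'C:\tmp\ImportProjectDates.txt' WITH (FIRSTROW = 2, FORMATFILE = 'C:\tmp\ImportProjectDates.fmt'). When a format file is supplied, leave FIELDTERMINATOR and ROWTERMINATOR out; the format file carries them.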
View 9 Replies
View Related
Mar 5, 2015
I have to perform a bulk import on a regular basis and have created a script to do this. The problem is that the .csv file has 12 columns and the table to import into has 14. To work around this discrepancy I have decided to use a format file. The problem is how to create one.
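One low-effort starting point is to have bcp generate a format file from the target table and then trim it down; the server, database, and table names below are placeholders:

bcp MyDb.dbo.ImportTarget format nul -c -t, -f ImportTarget.fmt -S MyServer -T

In the generated file, change the field count from 14 to 12, delete the two lines for the table columns that are not in the .csv, and renumber the remaining host fields 1 through 12, leaving each line's table-column ordinal pointing at the correct column.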
View 3 Replies
View Related
Mar 23, 2015
I want to bulk insert data into a table named scd_event_tab inside a database named rdb.
When I do select * from rdb.dbo.scd_event_tab, I get these columns:
JOB_ID, RUN_ON, PRIORITY, PAYLOAD, TIMEOUT_INTERVAL, STATUS, PICKUP_TIME, SCD_TYPE, SCHEDULE_ID, DB_ADMIN_LOGIN_REQUIRED_YN
I saved the result into a csv file and then truncated the table. Now, I am trying to bulk insert the data into the table. So I used:
bulk insert
rdb.dbo.scd_event_tab from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
codepage = 'RAW',
datafiletype = 'native',
fieldterminator = '\t',
keepidentity,
keepnulls
);
go
However, I get this error:
Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
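These errors are consistent with pointing DATAFILETYPE = 'native' at a plain text file: native mode expects the binary output of a bcp native export, not a CSV saved from query results. A sketch of the character-mode equivalent, assuming the saved file is comma-delimited:

BULK INSERT rdb.dbo.scd_event_tab
FROM 'C:\users\sluintel.ctr\desktop\eventtab.csv'
WITH (
DATAFILETYPE = 'char', -- the file is text, not native bcp output
FIELDTERMINATOR = ',', -- assumption: comma-delimited CSV
ROWTERMINATOR = '\n',
KEEPIDENTITY,
KEEPNULLS
);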
View 9 Replies
View Related
Mar 18, 2015
Can we bulk insert only the desired columns from a flat file to a table?
I am using SSIS to bulk insert from a file with more than 200 columns. I am trying to find a way to bulk insert them into multiple tables through SSIS.
The one way I can think of is to pre-map the columns from the file to the destination tables and build numerous Bulk Insert tasks to achieve that. But I am not sure if SSIS will let me do that.
View 4 Replies
View Related
Sep 12, 2006
Hi Andrea,
I have made a table which contains data inserted manually and also data that was inserted by using bulk insert. I have no problems using the table with GridView. But when I try to use a query like the following:
SELECT *
FROM dbo.last
WHERE VMake = 'Honda'
AND VType = 'sedan'
AND VColor = 'Red';
I would only get the data that were inserted manually.
When I use the same query to filter data from a table where the data was inserted by using bulk insert, I get column names but no data.
Any help please.
Juvan
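A common cause here is that the bulk-loaded values carry invisible trailing characters (for example a \r left behind by the row terminator on the last column), so equality predicates like VColor = 'Red' never match. A quick diagnostic sketch against the table from the post:

SELECT TOP (10)
'[' + VMake + ']' AS VMakeBracketed, -- brackets expose stray spaces
LEN(VMake) AS CharCount,
DATALENGTH(VMake) AS ByteCount
FROM dbo.last;

If trailing characters show up, re-running the bulk insert with an explicit ROWTERMINATOR = '\r\n', or cleaning the columns with RTRIM and REPLACE, are the usual fixes.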
View 1 Replies
View Related
Sep 29, 2007
Hi,
I have a data file in the following format:
SubjectId1|class1
SubjectId2|class2
SubjectId3|class3
I just wanted to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes.
The row delimiter is "\n" and the column delimiter is '|'.
Table Subjects
{
ID (Autoincrement)
SubjectId varchar(20)
}
Can anyone provide the format file for doing this or suggest another way to do it?
Please do note that the file may contain millions of records
Thank u
~mohan
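A non-XML format file along these lines should do it; mapping a file field to table column 0 tells BULK INSERT to discard it, and the identity column ID is simply never listed (9.0 is the SQL Server 2005 format version; the second field's length is an assumption):

9.0
2
1 SQLCHAR 0 20 "|" 2 SubjectId ""
2 SQLCHAR 0 100 "\n" 0 Class ""

The load itself would then be something like BULK INSERT dbo.Subjects FROM 'C:\data\subjects.txt' WITH (FORMATFILE = 'C:\data\subjects.fmt'), with both paths assumed for illustration. BULK INSERT can be minimally logged under the right recovery model, so millions of rows are not a problem in themselves.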
View 5 Replies
View Related
Feb 26, 2015
I am writing a query to return some production data. Basically I need to insert either 1 or 2 rows into a table variable, based on a decision as to whether the production part makes 1 or 2 items (the raw data does not allow for this; it comes from a lookup in my database).
I can retrieve all the source data I need easily, but when I come to insert it into the table variable I need to insert 1 record if it's a single part or 2 records if it's a twin part. I know I could use a cursor, but I'm sure there has to be an easier way!
Below is the code I have at the moment.
declare @startdate as datetime
declare @enddate as datetime
declare @Line as Integer
DECLARE @count INT
set @startdate = '2015-01-01'
set @enddate = '2015-01-31'
[Code] .....
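A cursor-free sketch of the idea: join the source rows to an inline two-row table and keep one or both rows depending on the part type. Every table and column name below is an assumption, since the original source query is elided:

INSERT INTO @Results (PartID, ItemNo, ProducedOn)
SELECT p.PartID, n.ItemNo, p.ProducedOn
FROM dbo.ProductionData AS p
JOIN (VALUES (1), (2)) AS n(ItemNo)
ON n.ItemNo <= CASE WHEN p.IsTwinPart = 1 THEN 2 ELSE 1 END
WHERE p.ProducedOn BETWEEN @startdate AND @enddate;

Single parts match only ItemNo = 1, twin parts match both rows, so the decision happens in the join rather than in a loop.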
View 1 Replies
View Related
Oct 10, 2007
Hi,
I have a data file which consists of data as below,
4
PPU_FFA7485E0D||
T_GLR_DET_11||
While I am inserting into the table using bulk insert, this pipe (||) is also getting inserted into the table.
Here is the query I am using to insert the data using bulk insert.
BULK INSERT TABLE_NAME FROM FILE_PATH
WITH (FIELDTERMINATOR = ''||'''+',KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = ''\n'')
Can any one help on this.
Thanks,
-Badri
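The stray string concatenation in that FIELDTERMINATOR expression is what leaves a pipe behind; when the statement is not being built dynamically, a plain two-character literal is all that is needed. A sketch with an assumed path:

BULK INSERT TABLE_NAME
FROM 'C:\data\datafile.txt' -- assumed path
WITH (
FIELDTERMINATOR = '||',
ROWTERMINATOR = '\n',
FIRSTROW = 2,
KEEPNULLS
);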
View 7 Replies
View Related
Dec 6, 2007
Hi Guys,
My little bulk insert is only bringing in every second row of a CSV file. This is not good, as I need every row.
My SQL command is thus:
InsertCommand="BULK INSERT TBL_Unitel_services FROM 'C:/webroot/servicedesk/csvs_Services/csv.csv' WITH (FIRSTROW = 1, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', MAXERRORS = 0) "
View 2 Replies
View Related
May 15, 2008
Greetings
Got the bulk insert thing going real nice thanks to your feedback! But now I notice that the BULK INSERT creates 2 identical rows while the flat file has the column headers and then the corresponding row, just one row.
Why would it do that? Interestingly, if the data file and format file are residing on a shared folder on SQL 2005 and I run the BULK INSERT from Studio, it only inserts one row. But when I do the BULK INSERT via the user interface it creates 2 rows. What is going on here?
View 10 Replies
View Related
Aug 7, 2006
Hello
I am using:
Microsoft SQL Server 2000 - 8.00.2039 (Intel X86) May 3 2005 23:18:38
Copyright (c) 1988-2003 Microsoft Corporation
Desktop Engine on Windows NT 5.1 (Build 2600: Service Pack 2)
I am using BULK INSERT to import some pipe-delimited flat files into a database.
I am firstly converting the file using VB.NET, to ensure each line of the file has a carriage return (by using streamwriter.writeline), and I am also ensuring there is no blank line at the end of the file (by using streamwriter.write).
Once I have done this, my BULK INSERT command appears to work OK. This is how I am using the statement:
BULK INSERT
tempHISTORY
FROM 'C:\TEMP\HISTORY.TXT'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
NB: The first row in the file is a header row.
This appears to work OK; however, I have found that certain files seem to miss the final line of the file! I have analysed these files in case they have an inconsistent number of columns, but they don't.
I have also found that if I knock off the last column of the tempHISTORY table, the correct number of rows are imported. However, of course, I can't just discard one of the columns from the file, I need to import the entire file.
I cannot understand why BULK INSERT is choosing to miss the final line in the file, when the schema of the destination table matches the structure of the file.
View 2 Replies
View Related
May 15, 2015
I have a file which has some wind data that I am trying to import into a SQL database through bulk insert. If the script works as it is supposed to, I should see 144 rows impacted, but I see 0 rows affected.
BULK INSERT TOWER.RAWINTERFACE_1058 FROM 'C:\Temp\900020150427583.txt'
WITH(CHECK_CONSTRAINTS,CODEPAGE='RAW',DATAFILETYPE='char',FIELDTERMINATOR='\t',ROWTERMINATOR='\n',FIRSTROW=172)
The code works if the file is large, but if it's small, 0 rows are affected; also, if I remove the header rows then the file works again. I want to understand what is going on here. I am including a screenshot of the file in Notepad++. I have tried changing the row terminator to '\n' and '\r\n' and also tried to change the codepage, but nothing seems to work. No error file is being generated either, if I give an error file option.
View 7 Replies
View Related
Jul 13, 2006
I have a stored procedure which will run automatically. I've got try...catch code in the procedure, but I found a bug with the code where, if there are any import errors, it doesn't recognize that there was an error and it runs through the try code as though there were no problems. (I reported the bug.)
I added some code using @@rowcount to check if there were rows imported, and if not, it moves the data file to an error folder so I know there was a problem with the import. But this only checks whether at least one row was imported, not whether all the rows in the datafile have been imported (i.e. if the first row imported correctly and the second did not, it still sees it as successful).
The problem is some of the data files have only one row to import and some have multiple rows. Is there a way to count the number of rows in the datafile, then count the number of rows imported, to verify that the two numbers match?
Thanks,
Laura
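One sketch of the comparison: bulk load the raw lines into a one-column staging table first, using a field terminator that cannot occur in the data so each whole line lands in the single column, then compare that count with what the real import reports. All names and paths are assumptions:

DECLARE @FileRows int, @ImportedRows int;

CREATE TABLE #RawLines (Line nvarchar(max));
BULK INSERT #RawLines
FROM '\\server\share\datafile.txt'
WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\n');
SELECT @FileRows = COUNT(*) FROM #RawLines;

BULK INSERT dbo.TargetTable
FROM '\\server\share\datafile.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
SET @ImportedRows = @@ROWCOUNT;

IF @ImportedRows <> @FileRows
PRINT 'Row count mismatch - move the file to the error folder.';

If the real import skips a header with FIRSTROW, the same offset needs to be applied to @FileRows.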
View 2 Replies
View Related
May 21, 2007
Does SQL Server have a way to handle errors in a sproc which would allow one to insert rows, ignoring rows which would create a duplicate key violation? I know if one loops one can handle the error on a row-by-row basis. But is there a way to skip the loop and do it as a bulk insert? It's easy to do in Access, but I'm curious to know if SQL Server proper can handle it like this. I am guessing that a looping operation would be slower to execute?
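Two set-based routes are common here. A unique index created WITH IGNORE_DUP_KEY = ON makes the engine itself discard duplicate-key rows with a warning instead of an error; alternatively, the insert can filter them out up front. A sketch of the latter, with assumed names:

INSERT INTO dbo.Target (KeyCol, Payload)
SELECT s.KeyCol, s.Payload
FROM dbo.Source AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.KeyCol = s.KeyCol);

Either way the whole operation stays a single statement, which is generally much faster than a row-by-row loop.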
View 2 Replies
View Related
Jul 23, 2005
Hi, I have an application running on a wireless device, and being wireless I want it to use bandwidth as efficiently as possible. Therefore, I want the SQL statement that it uploads to the SQL Server to be as efficient as possible. In one instance, I give it four records to upload, which currently I have as four separate SQL statements separated by a ";". However, all the INSERT INTO... information is the same each time; the only thing that changes is the VALUES portion of each command. Also, I have to have the name of each column to receive the data (believe it or not, these columns are only a small subset of the columns in the table). Here is my current SQL statement:
INSERT INTO tblInvTransLog (intType, strScreen, strMachine, strUser, dteDate, intSteelRecID, intReleaseReceiptID, strReleaseNo, intQty, dblDiameter, strGrade, HeatID, strHeatNum, strHeatCode, lngfkCompanyID)
VALUES (1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 5, 0.016, '1018', 18, '610T142', 'K8', 520);
INSERT INTO tblInvTransLog (intType, strScreen, strMachine, strUser, dteDate, intSteelRecID, intReleaseReceiptID, strReleaseNo, intQty, dblDiameter, strGrade, HeatID, strHeatNum, strHeatCode, lngfkCompanyID)
VALUES (1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 9, 0.016, '1018', 30, '14841', 'B9', 344);
Since the SQL statement INSERT INTO portion remains the same every time, it would be good if I could have the INSERT INTO portion only once and then any number of VALUES sections, something like this:
INSERT INTO tblInvTransLog (intType, strScreen, strMachine, strUser, dteDate, intSteelRecID, intReleaseReceiptID, strReleaseNo, intQty, dblDiameter, strGrade, HeatID, strHeatNum, strHeatCode, lngfkCompanyID)
VALUES (1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 5, 0.016, '1018', 18, '610T142', 'K8', 520)
VALUES (1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 9, 0.016, '1018', 30, '14841', 'B9', 344);
But this is not a valid SQL statement. Perhaps someone with a more comprehensive knowledge of SQL knows of a way. Maybe there is a way to store a string at the header of the command and then use the string name in each separate command(??)
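For what it's worth, SQL Server 2008 and later support exactly this with table value constructors: one INSERT, one column list, and a comma-separated list of row tuples (the post predates the feature, so this won't run on the 2000-era engine described above):

INSERT INTO tblInvTransLog (intType, strScreen, strMachine, strUser, dteDate, intSteelRecID, intReleaseReceiptID, strReleaseNo, intQty, dblDiameter, strGrade, HeatID, strHeatNum, strHeatCode, lngfkCompanyID)
VALUES
(1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 5, 0.016, '1018', 18, '610T142', 'K8', 520),
(1, 'Raw Material Receiving', '[MachineNo]', '[CurrentUser]', '3/21/2005', 888, 779, '2', 9, 0.016, '1018', 30, '14841', 'B9', 344);

This sends the column list once per batch, which is about as compact as plain INSERT statements get over the wire.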
View 2 Replies
View Related
Apr 26, 2004
Hello Everyone,
I am currently struggling with a problem bulk inserting data into sql server.
The application I am writing is multi-threaded and downloads about 2000 records every 2 seconds on x amount of threads (depending on bandwidth).
What I would like to do is find a 'friendly' way to insert this data into SQL Server without hammering the CPU. I have tried the following, with little success.
1. Using a single insert and thread.sleep(x * 20) to allow for massive data input; although this made the application more stable and lowered CPU usage to very little, the data takes about 60 times longer to download and process into the database.
2. Using a SqlDataAdapter and DataSet and updating the database via the .Update method of the data adapter. Simply, this was awful and took forever to process the data into the database (it took about 30 seconds to process 2000 records, and the Command Timeout had to be high).
3. Using OpenXML in SQL Server and parsing the data as an XML string (nText); although this method is fast, it is still very CPU intensive. I have to set the command timeouts very high to allow for this approach (because of the multi-threaded nature of the app).
Does anyone have any ideas on a CPU-friendly approach to this problem?
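On SQL Server 2008 and later (the version this section covers; the original post predates it), table-valued parameters are a common low-CPU answer: each thread hands the server a whole batch in one round trip, and the server does a single set-based insert. A sketch with assumed names:

CREATE TYPE dbo.RecordBatch AS TABLE (
RecordID int,
Payload varchar(200)
);
GO
CREATE PROCEDURE dbo.InsertRecordBatch
@Batch dbo.RecordBatch READONLY
AS
INSERT INTO dbo.Records (RecordID, Payload) -- target table is an assumption
SELECT RecordID, Payload FROM @Batch;
GO

From .NET the batch is passed as a DataTable on a SqlCommand parameter with SqlDbType.Structured, which avoids both per-row statements and the XML parsing overhead.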
Thanks in advance..
Gary.
View 4 Replies
View Related
Feb 3, 2010
We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data to my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'wide table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of file into the db table?
View 6 Replies
View Related
Apr 18, 2008
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
Thanks.
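One workaround that keeps the identity column: bulk insert into a view that simply leaves it out, since both BULK INSERT and the SSIS Bulk Insert Task can target an updatable view. A sketch with assumed names:

CREATE VIEW dbo.vImportTarget
AS
SELECT Col1, Col2, Col3 -- every column except the identity column
FROM dbo.ImportTarget;
GO
BULK INSERT dbo.vImportTarget
FROM 'C:\data\import.txt' -- assumed path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);

The identity column then populates itself as rows arrive. A format file that maps the file's fields around the identity column is the other standard route.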
View 8 Replies
View Related
Nov 2, 2007
Does anyone know how to do a bulk insert using just the Script Task? I've been searching everywhere but can't seem to find a sample.
View 6 Replies
View Related
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do, I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong?
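Since every GA_desc value is wrapped in quotes, one old trick is to make the quotes part of the terminators so they never reach the column; a sketch (the path's backslashes are reconstructed, so adjust as needed):

BULK INSERT tmp_GA_status
FROM 'C:\temp\TextDump\GA_status.dta'
WITH (
DATAFILETYPE = 'widechar',
FIELDTERMINATOR = '|"', -- consumes the quote that opens GA_desc
ROWTERMINATOR = '"\n' -- consumes the quote that closes it
);

This only works when the quotes appear on every row; otherwise a format file whose per-field terminators account for the quotes is the safer route. CODEPAGE does not apply to widechar data, so it is dropped here.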
View 13 Replies
View Related
Sep 27, 2007
I have to update a field within a table of 60 records or so. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update, like bulk insert, but I don't recall that being possible.
Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
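The staging-table route is short enough in practice; a sketch, assuming the Excel sheet is saved as CSV and all names are placeholders:

CREATE TABLE #Staging (KeyCol varchar(50), NewValue varchar(100));

BULK INSERT #Staging
FROM 'C:\data\values.csv' -- the Excel file saved as CSV
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

UPDATE t
SET t.TargetField = s.NewValue
FROM dbo.TargetTable AS t
JOIN #Staging AS s ON s.KeyCol = t.KeyCol;

For only 60 rows, generating 60 UPDATE statements in Excel with a formula is the other common shortcut.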
View 1 Replies
View Related
Jan 31, 2008
I have created a table named Table with name as varchar and id as int. I started inserting rows like: insert into Table values ('arun', 20). Yes, I have inserted a row in the table. Now I have got the values "arun's", 50. With insert into Table values('arun's', 20), SQL Server gives me an error instead of inserting the row. How would you solve this problem?
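The embedded quote ends the string literal early; doubling it is the T-SQL escape (Table is a reserved word, hence the brackets):

INSERT INTO [Table] (name, id) VALUES ('arun''s', 50);

The more robust fix is a parameterized command, e.g. VALUES (@name, @id) with the value bound as a parameter, which sidesteps escaping entirely and protects against SQL injection.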
View 3 Replies
View Related
Mar 7, 2008
Hello,
I need help regarding bulk insert into sql server. basically i m making a customized webapplication to save all the file and directories saving it in an arraylist and then want to save the contents of that array list into sql server table. now my questions are:
1) should i use arraylist for this purpose(i mean is it fast as my data is too big)
2) how to save all the data from arraylist into sql server table with bulkinsert or smthing like that. i dont want to use insert query with looping through the arraylist, as it will take lot of time and resources.
hope you get my query.
View 4 Replies
View Related
Apr 2, 2008
I am having trouble trying to copy some rows from a table on my local computer to a table on a remote SQL Server 2005 that is being hosted by one of those web hosting companies. The problem is that the table has an identity column. I first tried using the following command:
SET IDENTITY_INSERT [remoteservername].Library2005.dbo.tblLanguages ON
but that results in the error:
Msg 8103, Level 16, State 1, Line 1
Table 'remoteservername.Library2005.dbo.tblLanguages' does not exist or cannot be opened for SET operation.
I read in another topic that I should change this into the following:
EXECUTE [remoteservername].Library2005.dbo.sp_executesql N'SET IDENTITY_INSERT dbo.tblLanguages ON'
That command executes without error, but the problem is that I cannot perform the actual insert, because it is not within the execute statement. In other words, the following doesn't work:
EXECUTE [remoteservername].Library2005.dbo.sp_executesql N'SET IDENTITY_INSERT dbo.tblLanguages ON'
INSERT INTO [remoteservername].Library2005.dbo.tblLanguages
(colLangID, colEnglish, colGerman, colSpanish)
SELECT colLangID, colEnglish, colGerman, colSpanish FROM tblLanguages
This results in the error:
Msg 7344, Level 16, State 1, Line 2
OLE DB provider 'SQLOLEDB' could not INSERT INTO table '[remoteservername].[Library2005].[dbo].[tblLanguages]' because of column 'colLangID'. The user did not have permission to write to the column.
The remote server is linked correctly on my end via sp_addlinkedserver and sp_addlinkedsrvlogin. Is there any way to force the remote server to turn IDENTITY_INSERT ON permanently and then let me execute as many INSERTs as I want and then turn it back OFF?
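SET IDENTITY_INSERT is session-scoped, which is why the separate EXECUTE has no effect on the later INSERT: they run in different sessions on the remote side. One sketch is to push the whole batch through a single sp_executesql call so the SET and the INSERT share a session; this assumes the remote server can reach the source data, e.g. via a linked server defined back to the local machine (LOCALSERVER and LocalDB below are placeholders):

EXECUTE [remoteservername].Library2005.dbo.sp_executesql N'
SET IDENTITY_INSERT dbo.tblLanguages ON;
INSERT INTO dbo.tblLanguages (colLangID, colEnglish, colGerman, colSpanish)
SELECT colLangID, colEnglish, colGerman, colSpanish
FROM [LOCALSERVER].LocalDB.dbo.tblLanguages; -- placeholder reverse link
SET IDENTITY_INSERT dbo.tblLanguages OFF;
';

If no reverse link is possible, the same trick works with the rows embedded in the batch as literal INSERT ... VALUES statements.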
View 2 Replies
View Related
Oct 11, 2000
I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (yields approx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way.
SELECT the 6 million records that don't need to be deleted into a #TempTable
Use the statements above to select into the same #TempTable
DROP and recreate the original table
SELECT the 6 + 2 million records INTO the original table
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
Any comments are appreciated,
-Marc
View 3 Replies
View Related
Oct 25, 2006
Does anyone know how to do a bulk insert in SQL Server Express? Thanks, Marcel
View 2 Replies
View Related
Dec 27, 2007
Hi!
I'm building a web application. I need to read data from a text or Excel file, process the data, and then store the resulting records in the database. The record count is big. I can store the data records in the database (SQL Server 2005) one at a time, but I think that's slow. Is there any way to insert the data in bulk?
Thanks!
ccy
View 4 Replies
View Related
Nov 7, 2001
I am running the following:
BULK INSERT DB.dbo.[stblCLIENT]
FROM '\\SERVER1\download\Client.txt'
WITH
(
FIELDTERMINATOR = 'Ø',
ROWTERMINATOR = '\n'
)
DB.dbo.[stblCLIENT] is on SERVER2. I receive the following error:
"Could not bulk insert because file 'SERVER1downloadClient.txt' could not be opened. Operating system error code 53(The network path was not found.)."
I am able to run a DTS package that imports the same text file from SERVER1 with no error.
Is BULK INSERT limited to importing text files from the server on which SQL Server is running, or should I be able to BULK INSERT from another server on my LAN?
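BULK INSERT can read from another machine, but the file is opened by the SQL Server service account on the server running the statement (SERVER2 here), so the path must be a full UNC path that that account can read; a sketch, with the share name reconstructed as an assumption:

BULK INSERT DB.dbo.[stblCLIENT]
FROM '\\SERVER1\download\Client.txt'
WITH (FIELDTERMINATOR = 'Ø', ROWTERMINATOR = '\n');

If the path is already UNC, error 53 usually points at the service account (for example LocalSystem, which has no network identity) or name resolution. DTS runs under the interactive user's credentials, which is why the same file opens for DTS but not for BULK INSERT.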
View 1 Replies
View Related
Sep 4, 2006
I'm trying to import data from a flat file into a table and have a few problems.
1. The field delimiter is ',' (comma). If ',' occurs in a quoted string, it is still treated as a field delimiter. Is this a BUG or what?
2. In the table I have a datetime field that can be null, but bulk insert reports an error if the flat file contains null or ''. It's OK only when a real date is specified.
Table:
create table AttachmentList (
Code integer not null,
ClassID integer null,
Description varchar(200) null,
ValidUntil datetime null,
constraint PK_ATTACHMENTLIST primary key (Code)
)
Flat file:
1,13,'Naputak, CU 261098', ''
Thanks in advance
Davor
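On both points: BULK INSERT itself has no notion of quoting (it is purely terminator-based, so this is by design rather than a bug), and empty strings cannot be converted to datetime during the load. A common workaround sketch for the datetime problem is to stage everything as text and convert afterwards; the staging table name and path are assumptions:

CREATE TABLE AttachmentList_Staging (
Code varchar(20),
ClassID varchar(20),
Description varchar(200),
ValidUntil varchar(30)
);

BULK INSERT AttachmentList_Staging
FROM 'C:\data\attachments.txt' -- assumed path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

INSERT INTO AttachmentList (Code, ClassID, Description, ValidUntil)
SELECT CAST(Code AS integer),
CAST(NULLIF(ClassID, '') AS integer),
Description, -- surrounding quotes, if present, still need stripping
CAST(NULLIF(LTRIM(RTRIM(ValidUntil)), '''''') AS datetime) -- a literal '' becomes NULL
FROM AttachmentList_Staging;

The quoted-comma problem, though, has to be fixed at the source (a different delimiter, or a format file whose per-field terminators account for the quotes), since the comma inside 'Naputak, CU 261098' will already have split the field during the staging load.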
View 3 Replies
View Related