I am trying to import some data from CSV files. When I try it using BULK INSERT I get a conversion error. When I use the exact same format file and data file with OPENROWSET it works fine. I would prefer to use BULK INSERT, as I can then write some generic stored procedures to handle all my imports and not have to hard-code the column names in the SQL. Any suggestions?
BULK Insert stuff
From 'c:\projects\testdata\list.txt'
with
(FORMATFILE='c:\projects\testdata\myformat.xml')
insert into stuff (ExternalId, Description, ScheduledDate, SentDate, Name)
select *
from OPENROWSET (BULK 'c:\projects\testdata\list.txt',
FORMATFILE='c:\projects\testdata\myformat.xml')
as t1
The destination table has more columns than the data file. The field IDs in the format file represent the ordinal positions of the columns in the destination table. Column 1 in the destination table is an int identity. The conversion failure comes from trying to convert column 5 to int, which makes me think BULK INSERT is ignoring the name attributes in the XML and just inserting the file's columns into the table in order, without skipping any.
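One workaround I'm considering (a sketch only; the view name is made up, and I haven't confirmed it fixes the ordinal mapping): BULK INSERT can target a view, so a view that exposes only the non-identity columns should make the positional mapping line up with the data file.

-- Hypothetical workaround: load through a view that omits the identity column.
CREATE VIEW dbo.stuff_load AS
SELECT ExternalId, Description, ScheduledDate, SentDate, Name
FROM dbo.stuff;

BULK INSERT dbo.stuff_load
FROM 'c:\projects\testdata\list.txt'
WITH (FORMATFILE = 'c:\projects\testdata\myformat.xml');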
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to have the identity column in the table too.
Hello, to accelerate loading XML data from many files into a table using OPENROWSET (BULK ...), I want to use a variable in the file specification and increment it within a loop, similar to this:
declare @datnam varchar(100);
DECLARE @MyCounter int;
SET @MyCounter = 1;
set @datnam = 'c:\XML_Daten\POS_LOG_200608_' + ltrim(str(@MyCounter)) + '.xml';
INSERT INTO GK_TO_KFH_ADAPTER_XML_NS (LOC_ID, MSG_CONTENT)
SELECT @MyCounter, MSG_CONTENT
FROM (SELECT * FROM OPENROWSET (BULK @datnam, SINGLE_CLOB) AS MSG_CONTENT) AS R(MSG_CONTENT)
But I got the following error:
Msg 102, Level 15, State 1, Line 8 Incorrect syntax near '@datnam'.
Is there a way to do this in that manner? Or is the bcp utility an alternative?
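The only workaround I've found so far is to build the whole statement as dynamic SQL, since OPENROWSET(BULK ...) seems to require a string literal for the file name. A sketch (same table and path as above; the loop bound is illustrative):

-- Sketch: build the OPENROWSET statement dynamically inside the loop.
DECLARE @MyCounter int, @datnam varchar(100), @sql nvarchar(max);
SET @MyCounter = 1;
WHILE @MyCounter <= 10  -- illustrative number of files
BEGIN
    SET @datnam = 'c:\XML_Daten\POS_LOG_200608_' + LTRIM(STR(@MyCounter)) + '.xml';
    SET @sql = N'INSERT INTO GK_TO_KFH_ADAPTER_XML_NS (LOC_ID, MSG_CONTENT) '
             + N'SELECT ' + LTRIM(STR(@MyCounter)) + N', MSG_CONTENT '
             + N'FROM (SELECT * FROM OPENROWSET(BULK ''' + @datnam
             + N''', SINGLE_CLOB) AS x) AS R(MSG_CONTENT);';
    EXEC sp_executesql @sql;
    SET @MyCounter = @MyCounter + 1;
END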
select * from openrowset(bulk '\\server1\c$\file.txt', SINGLE_BLOB) as t works from SQL Server itself, but doesn't work from any other machine. I get "Operating system error code 5 (Access is denied.)". I am running as the domain admin, and file.txt has Full Control granted to Everyone. In server1's event log, I see an Anonymous Logon.
I'm experiencing issues importing XML data using a distributed query. The following statement is run from an XP client named WorkstationA, connecting to SQL 2005 SP2 on ServerB; the XML data is located on ServerC.
Ad hoc queries using OPENROWSET have been enabled and verified.
The SQL Server service is running under a domain user account with permissions to read the remote files. I have logged in locally to the SQL server and verified this. It still fails even when the SQL services are running as LocalSystem.
The user on WorkstationA is authenticated with integrated security (SQL admin) and has rights to read the XML files on ServerC.
WorkstationA = SQL2005 Mgt Studio running the query
ServerB = SQL2005 SP2
ServerC = XML data files
DECLARE @xml XML
SELECT @xml = CONVERT(XML, bulkcolumn, 2)
FROM OPENROWSET(BULK '\\SERVERC\SHAREPATH\DATAFILE.XML', SINGLE_BLOB) AS x
SELECT @xml
Results:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\SERVERC\SHAREPATH\DATAFILE.XML" could not be opened. Operating system error code 5 (Access denied.).
The query fails when it is run from WorkstationA connected to SQL ServerB, querying data on ServerC via a UNC path. The query is successful when it is run locally on SQL ServerB. The problem is with distributed queries: the query is also successful when the XML files are local to the SQL server, including when referencing them via a local UNC path.
Is there a way to avoid entering column names in the Excel template when creating an Excel file from a dynamic query using OPENROWSET? I have the following code, but it only works when the column names are given ahead of time in the template. If I remove the column names from the template and just do SELECT * FROM the table and SELECT * FROM Sheet1, it tells me that the column names do not match:

Server: Msg 213, Level 16, State 5, Line 1
Insert Error: Column name or number of supplied values does not match table definition.

Here is my code:

SET @sql1 = 'select * from table1'
SET @sql2 = 'select * from table2'

IF @File_Name = ''
    Select @fn = 'C:\Test1.xls'
ELSE
    Select @fn = 'C:\' + @File_Name + '.xls'

-- FileCopy command string formation
SELECT @Cmd = 'Copy C:\TestTemplate1.xls ' + @fn

-- FileCopy command execution through shell command
EXEC MASTER..XP_CMDSHELL @Cmd, NO_OUTPUT

-- Specifying the OLE DB provider and Excel destination filename
set @provider = 'Microsoft.Jet.OLEDB.4.0'
set @ExcelString = 'Excel 8.0;HDR=yes;Database=' + @fn

exec('insert into OPENrowset(''' + @provider + ''',''' + @ExcelString + ''',''SELECT * FROM [Sheet1$]'') ' + @sql1)
exec('insert into OPENrowset(''' + @provider + ''',''' + @ExcelString + ''',''SELECT * FROM [Sheet2$]'') ' + @sql2)
I have to update a field within a table of 60 or so records. Each record has a different field value; the type is varchar. I was given an Excel file with the field values and was thinking of a bulk update along the lines of BULK INSERT, but I don't recall that being possible.
Is the only way to create a table, bulk insert into it, and then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
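For reference, the staging-table route I'd fall back on looks roughly like this (a sketch only; the table, file, and column names are made up, and it assumes the Excel data gets saved out as CSV first):

-- Sketch: bulk insert into a staging table, then update by join.
CREATE TABLE #staging (KeyCol varchar(50), NewValue varchar(100));

BULK INSERT #staging
FROM 'C:\values.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

UPDATE t
SET t.TargetField = s.NewValue
FROM dbo.MyTable AS t
JOIN #staging AS s ON s.KeyCol = t.KeyCol;

DROP TABLE #staging;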
I have a table containing 8 million records. I need to replace 2 million of these records with a scaled-down query that goes something like:

SELECT 1, ShareholderID, Assets1 FROM MyTable (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2 FROM MyTable (yields approx. 200,000 records)
...
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9 FROM MyTable (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way:
1. SELECT the 6 million records that don't need to be deleted into a #TempTable.
2. Use the statements above to SELECT INTO the same #TempTable.
3. DROP and recreate the original table.
4. SELECT the 6 + 2 million records INTO the original table.
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
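For comparison, the variant I've been sketching out (the names and the keep-filter are illustrative) builds the replacement table directly and swaps it in, which avoids one round trip through tempdb:

-- Sketch: build the replacement table directly, then swap it in.
SELECT *
INTO dbo.MyTable_New
FROM dbo.MyTable
WHERE KeepFlag = 1;                 -- illustrative filter for the 6 million rows to keep

INSERT INTO dbo.MyTable_New         -- the scaled-down 2 million rows
SELECT 1, ShareholderID, Assets1 FROM dbo.MyTable;
-- ...repeat for the other nine SELECTs...

EXEC sp_rename 'dbo.MyTable', 'MyTable_Old';
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';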
I'm having problems with the OPENROWSET command in SQL Server 2005, as stated below...
1. Create a new Excel file at C:\ExcelFile.xls and enter the values ColumnName1, ColumnName2, ColumnName3 respectively in the first row.
2. Open up SQL Server 2005 Management Studio and issue the following query:
insert into OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;HDR=YES;IMEX=1;Database=C:\ExcelFile.xls', 'SELECT * FROM [Sheet1$]')
select 1, 2, 3
I get the following error:...
OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Cannot update. Database or object is read-only.".
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" reported an error. The provider indicates that the user did not have the permission to perform the operation.
Msg 7343, Level 16, State 2, Line 1
The OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" could not INSERT INTO table "[Microsoft.Jet.OLEDB.4.0]".
I can select data from OPENROWSET as expected, but I cannot insert, update, or delete through it. According to MSDN's documentation, you should be able to.
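One thing I'm going to try (an assumption on my part, not something I've verified): IMEX=1 tells the Jet provider to open the workbook in import mode, which is read-only, so dropping it from the connection string may be what allows the INSERT:

insert into OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;HDR=YES;Database=C:\ExcelFile.xls', 'SELECT * FROM [Sheet1$]')
select 1, 2, 3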
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command, and the data loads into the TableName table with no error messages:

BULK INSERT TableName FROM 'C:\DataDb\TableName.bcp' WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following:

DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help. Paul
I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, so the row terminator on line 7 of the format file is correct, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE='M:\Data\ASX\SQLFormatImport.Fmt')
Before implementing memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations:
- Performance: how does it compare with T-SQL's BULK INSERT and the bcp utility?
- SQL Server resource usage: when running a memory-based bulk copy, how is the server's resource consumption affected?
- Server-side behavior: when the server is busy, does the delayed update mean IRowsetFastLoad::Commit(true) can still insert right away?
- Row count: is there a limit on how many rows can be inserted via IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit()?
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() method in my UPDATE statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL 2005 without resorting to BULK INSERT? It bloats the transaction log, and turning the logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
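For reference, this is roughly how I'm calling it (a sketch; the table, column, and variable names are made up). As I understand it, passing NULL for the offset appends the chunk at the end, and .WRITE errors out if the current column value is NULL, so the column has to be initialized to N'' first:

-- Sketch: append one chunk of user text to an nvarchar(max) column.
DECLARE @Id int, @Chunk nvarchar(4000);
SET @Id = 1;                                   -- illustrative key
SET @Chunk = N'...next block of user text...';

UPDATE dbo.Documents
SET Body.WRITE(@Chunk, NULL, NULL)             -- NULL offset = append at the end
WHERE DocumentId = @Id;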
Hi, I need to pass data from a SQL Server database to an Access database. To do this I use OPENROWSET as follows:
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'C:\Aux.mdb'; 'Admin'; '', Test) (Id, Name, TypeId)
SELECT Id, Name, TypeId FROM Test ORDER BY TypeId
On SQL Server 2000 or MSDE the data is transferred as expected, respecting the specified order. But when I run the same statement on SQL 2005 EE the data is transferred, but the order is not respected. So my question is whether I have to activate an option for the order to be respected, or whether this is a bug.
Hello, I am wondering: is the transaction log written differently for BULK INSERT vs. INSERT? Performance-wise, which operation is generally faster given the same amount of data inserted?
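To make the question concrete, here is the kind of load I mean (a sketch; the table and path are illustrative). My understanding is that BULK INSERT can be minimally logged only when the database is in the SIMPLE or BULK_LOGGED recovery model and the target is table-locked:

-- Sketch: a bulk load set up for minimal logging (assumptions noted above).
BULK INSERT dbo.TargetTable
FROM 'C:\data\load.txt'
WITH (TABLOCK, ROWS_PER_BATCH = 100000);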
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., what is the max value of nCount)?

for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}
2. In the above code sample, isn't there a method for inserting a prepared array all at once directly (the BulkData array, without the for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?

BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?

// CoCreateInstance(...);
// Data source
// Create session
I have two SSIS packages that import from the same flat file into the same SQL 2005 table. I have one flat file connection (to a comma delimited file) and one OLE DB connection (to a SQL 2005 Database). Both packages use these same two Connection Managers. The SQL table allows NULL values for all fields. The flat file has "empty values" (i.e., ,"", ) for certain columns.
The first package uses the Data Flow Task with the "Keep nulls" property of the OLE DB Destination Editor unchecked. The columns in the source and destination are identically named, so the mapping is automatically assigned and is mapped based on ordinal position (which is equivalent to the mapping used by Bulk Insert). When this task is executed, no NULL values are inserted into the SQL table for the "empty values" from the flat file; empty string values are inserted instead of NULL.
The second package uses the Bulk Insert Task with the "KeepNulls" property for the task (shown in the Properties pane when the task is selected in the Control Flow window) set to "False". When the task is executed, NULL values are inserted into the SQL table for the "empty values" from the flat file.
So using the Data Flow Task, " " (i.e., blank) is inserted. Using the Bulk Insert Task, NULL is inserted (i.e., nothing is inserted; the field is skipped; the value for the record is omitted).
I want to have the exact same behavior on my data in the Bulk Insert Task as I do with the Data Flow Task.
Using the Bulk Insert Task, what must I do to have the empty string values inserted into the SQL table where there is an "empty value" in the flat file? Why and how does this occur automatically in the Data Flow Task?
From a SQL Profiler trace comparison of the two methods, I do not see where the syntax of the insert command or the statements in the preceding captured steps dictates this change in the behavior of the inserted "" value for the recordset. Please help me understand what is going on here and how to accomplish this using the Bulk Insert Task.
Hi friends, I am trying a bulk insert using SQL Server 2000 with this code:

bulk insert xyz
from 'D:\authors.txt'
WITH (FIELDTERMINATOR = ',')

But it gives me an error saying:

Could not bulk insert because file 'D:\authors.txt' could not be opened. Operating system error code 21 (error not found).

I checked the file security; it grants Full Control on the file. Can anyone give me an idea about operating system error code 21? Thanks.
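One thing I'm now wondering (an assumption, not verified): BULK INSERT opens the file from the SQL Server machine's file system, not from the client's, so if D: only exists on my workstation the server cannot see it. A UNC path to a share the server can reach would look like this (the server and share names are made up):

BULK INSERT xyz
FROM '\\MyWorkstation\Share\authors.txt'
WITH (FIELDTERMINATOR = ',')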
Hi, I have an SP that inserts records into one table and then calls another insert SP on a second table. The first table is like a master table and the second is like a child table. After inserting the right record in the master table, I have to insert several records in the child table. These records differ from each other in only two of about ten fields, so what I want is to not call the second SP X times, but only once. Is it possible?

Example:
Table1: Id (identity), Desc;
Table2: Id (identity), Id_table1, Id_TableX, Num, Field1, Field2, ... Field10.

In Table2 only Id_TableX and Num change every time; the others are all the same (for one record in Table1). How can I do this? Probably with a bulk insert and a bulk update? But can I make a bulk xxx without a file?
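What I'm imagining is a single set-based INSERT instead of X calls to the SP, with the varying pairs coming from an inline rowset; a sketch (the literal pairs and shared values are illustrative):

-- Sketch: insert all child rows in one statement; only Id_TableX and Num vary.
DECLARE @Id_table1 int, @Field1 varchar(10), @Field2 varchar(10);
-- master-row insert omitted; capture its identity:
SET @Id_table1 = SCOPE_IDENTITY();
SET @Field1 = 'A'; SET @Field2 = 'B';          -- the values shared by every row

INSERT INTO Table2 (Id_table1, Id_TableX, Num, Field1, Field2)
SELECT @Id_table1, v.Id_TableX, v.Num, @Field1, @Field2
FROM (SELECT 101 AS Id_TableX, 1 AS Num        -- illustrative varying pairs
      UNION ALL SELECT 102, 2
      UNION ALL SELECT 103, 3) AS v;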
Hi friends, I am using the BULK INSERT command with my table name, but I am facing an error. Is it possible to use BULK INSERT with a temporary table or a table variable? Please help me.
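From what I understand (I have not confirmed this in BOL), BULK INSERT can target a local # temporary table but not a table variable. A sketch with made-up columns:

-- Sketch: BULK INSERT into a local temp table; a table variable
-- cannot be the target of BULK INSERT, as far as I know.
CREATE TABLE #xyz (au_id varchar(11), au_lname varchar(40));

BULK INSERT #xyz
FROM 'D:\authors.txt'
WITH (FIELDTERMINATOR = ',');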
I want to move data from a text file to a SQL table. After DTS creates the table, does it use Bulk Insert to copy the data from the file to the table, or BCP?
I am trying to do a bulk insert from a data file into a linked Access database. When I run the query I get the error message:

Server: Msg 4801, Level 16, State 81, Line 1
Bulk_main: The opentable system function on BULK INSERT table failed.

Not sure what the problem is, because BOL just says to check Microsoft.com for updated error message information. However, when I went to the site there was no updated information. Has anyone else seen this error? If so, have you figured out the problem? Any help would be greatly appreciated. Thanks.
We've been using Bulk Insert to load our tables. But, recently we encountered this error message "There is insufficient system memory to run this query. [SQLSTATE 42000] (Error 701). The step failed."
Then our DBA suggested that we use BCP. It seems to work fine until the file size exceeds 30 MB. This is the message I get: "Starting copy... 0 rows copied. Network packet size (bytes): 4096. Clock Time (ms.): total 14984. Process Exit Code 0. The step succeeded." Is this a known problem?
Then we decided to use DTS. DTS seems to be able to handle any file size, but it's a slower process than the other two. Any suggestions?
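One option we haven't tried yet (a sketch; the path and table are illustrative): committing the BULK INSERT in batches, so no single transaction has to hold the whole file:

-- Sketch: load in 50,000-row batches instead of one huge transaction.
BULK INSERT dbo.OurTable
FROM '\\fileserver\loads\bigfile.dat'
WITH (BATCHSIZE = 50000, TABLOCK);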