I have just one point of confusion that I want to clear up.
With log shipping, if I insert records into the primary database through bulk commands, will those records be reflected on the secondary database or not? As far as I know, records inserted through bulk commands update the database's .mdf file directly.
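As a side note, whether bulk-loaded rows travel to the secondary depends on the transaction log: log shipping requires the FULL or BULK_LOGGED recovery model on the primary, and as far as I understand even minimally logged bulk operations are captured by log backups, so they do reach the secondary. A minimal way to check, assuming SQL Server 2005 or later and a hypothetical database name:

SELECT name, recovery_model_desc   -- FULL or BULK_LOGGED for log shipping
FROM sys.databases
WHERE name = 'YourPrimaryDb';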
I have to update a field in a table of 60 records or so. Each record has a different field value; the type is varchar. I was given an Excel file with the field values and was thinking of a bulk update along the lines of BULK INSERT, but I don't recall that being possible.
Is the only way to create a table, bulk insert into it, and then merge the two tables together with an UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
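For reference, a minimal sketch of that staging-table route, with hypothetical table and column names (save the Excel sheet as CSV first):

-- staging table shaped like the spreadsheet
CREATE TABLE #Staging (RecordID int, NewValue varchar(100));

BULK INSERT #Staging
FROM 'C:\Temp\values.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- one set-based UPDATE instead of 60 single-row statements
UPDATE t
SET t.MyField = s.NewValue
FROM dbo.MyTable AS t
JOIN #Staging AS s ON s.RecordID = t.RecordID;

DROP TABLE #Staging;

For only 60 rows, though, generating 60 UPDATE statements with an Excel formula column and pasting them into a query window is arguably less work.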
I have a table containing 8 million records. I need to replace 2 million of these records with the output of scaled-down queries that go something like:

SELECT 1, ShareholderID, Assets1 FROM MyTable   (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2 FROM MyTable   (yields approx. 200,000 records)
. . .
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9 FROM MyTable   (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way:
1. SELECT the 6 million records that don't need to be deleted into a #TempTable.
2. Use the statements above to SELECT the replacement rows into the same #TempTable.
3. DROP and recreate the original table.
4. SELECT the 6 + 2 million records back INTO the original table.
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
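For what it's worth, a minimal sketch of that rebuild expressed as SELECT ... INTO plus a rename (table, filter and column names are illustrative); SELECT INTO is minimally logged under the simple and bulk-logged recovery models, which usually beats a mass UPDATE without a round trip through bcp files:

-- keep the ~6 million untouched rows (NeedsReplacing is hypothetical)
SELECT *
INTO dbo.MyTable_New
FROM dbo.MyTable
WHERE NeedsReplacing = 0;

-- append the replacement rows; repeat for buckets 2 through 10
INSERT INTO dbo.MyTable_New (Bucket, ShareholderID, Assets)
SELECT 1, ShareholderID, Assets1
FROM dbo.MyTable
WHERE NeedsReplacing = 1;

-- swap instead of DROP/recreate and reload
DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';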
I'm trying to use BULK INSERT for the first time and am getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the \r\n row terminator is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
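For comparison, a minimal non-XML format file sketch for a CR/LF character file; note that the data type in a format file describes the data file rather than the table, so a plain text file is normally described with SQLCHAR even when the target column is nvarchar (the columns after ASXCode are hypothetical, and the 9.0 version line assumes SQL Server 2005):

9.0
3
1   SQLCHAR   0   6     "\t"      1   ASXCode    SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   "\t"      2   SomeCol    SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   100   "\r\n"    3   LastCol    SQL_Latin1_General_CP1_CI_AS

A truncation error on the first column often just means the field or row terminator in the format file doesn't match the file, so the whole line gets read into column 1.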
Before implementing memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations:
- Performance: how does it compare with T-SQL's BULK INSERT and the bcp utility?
- SQL Server resource usage: what influence does a memory-based bulk copy have on server resources while it runs?
- Server-side behavior: when the server is busy, does the delayed update mean IRowsetFastLoad::Commit(true) can still insert right away?
- Row count: is there a limit on the number of rows that can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want the table to have the identity column too.
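Since the Bulk Insert Task wraps T-SQL BULK INSERT, one approach is a format file that maps each data-file field to its table column by position and simply leaves the identity column unmapped. A hedged sketch for a three-column table where column 1 is the identity (column names are hypothetical):

9.0
2
1   SQLCHAR   0   50   "\t"      2   FirstDataCol    SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   50   "\r\n"    3   SecondDataCol   SQL_Latin1_General_CP1_CI_AS

The sixth value on each line is the table column number, so the two file fields land in columns 2 and 3 and the server generates the identity value for column 1.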
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., what is the maximum value of nCount)?

for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}
2. In the code sample above, is there a method for inserting a prepared array all at once (a BulkData array, rather than a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?

BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:

BULK INSERT TableName
FROM 'C:\Data\Db\TableName.bcp'
WITH (DATAFILETYPE = 'widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file written with the bcp widenative data type. The properties of the Bulk Insert Task are the following: DataFileType: DTSBulkInsert_DataFileType_WideNative; RowTerminator: {CR}{LF}.
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information. Thanks for all your help. Paul
Hi friends, I am trying a bulk insert using SQL Server 2000 with this code:

BULK INSERT xyz
FROM 'D:\authors.txt'
WITH (FIELDTERMINATOR = ',')

but it gives me an error saying:

Could not bulk insert because file 'D:\authors.txt' could not be opened. Operating system error code 21 (error not found).

I checked the file security; it has been given full control on the file. Can anyone give me an idea about operating system error code 21? Thanks.
Hi, I have an SP that inserts records into one table and then calls another insert SP on a second table. The first table is like a master table and the second is like a child table. After inserting the right record into the master table, I have to insert some records into the child table. These records differ from each other in only two of about ten fields, so what I want is to not call the second SP X times, but only once. Is that possible?

Example:
Table1: Id (identity), Desc;
Table2: Id (identity), Id_table1, Id_TableX, Num, Field1, Field2, ... Field10.

In Table2 only Id_TableX and Num change every time; the others are all the same (for one record in Table1). How can I do this? Probably with a bulk insert and a bulk update? But can I do a bulk operation without a file?
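If I understand right, this can be done set-based with one INSERT ... SELECT rather than X calls to the child SP, and no file is needed. A minimal sketch with hypothetical values (on SQL Server 2000, UNION ALL plays the role of a multi-row VALUES list):

DECLARE @NewId int;

INSERT INTO Table1 ([Desc]) VALUES ('New master row');
SET @NewId = SCOPE_IDENTITY();   -- identity of the master row just inserted

-- all child rows at once; only Id_TableX and Num vary
INSERT INTO Table2 (Id_table1, Id_TableX, Num, Field1, Field2)
SELECT @NewId, v.Id_TableX, v.Num, 'same value', 'same value'
FROM (SELECT 10 AS Id_TableX, 1 AS Num
      UNION ALL SELECT 11, 2
      UNION ALL SELECT 12, 3) AS v;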
Hi, I am working on an application that is to read a large number of XML files, take specific values out of each file, and store these in a SQL Server database so that reports can be generated from them. There are some 15,000-20,000 files for each month of the year. I am OK with parsing the files and getting the fields that I need, but I don't want to insert one record at a time as I parse the files. I was told that I can create an .exe that parses the XML files and stores the required values in a CSV file, and then use these CSV files to initiate a bulk insert using Business Intelligence Studio. I have not been able to find any info or article on how to do this. Any help on how I can accomplish this, or alternate solutions, is greatly appreciated.
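For the load step itself, the T-SQL is short. A minimal sketch, assuming a comma-delimited CSV with one record per line (the table name, file path and terminators below are placeholders, not from the original post):

BULK INSERT dbo.XmlExtractValues
FROM 'C:\Extracts\2007-01.csv'
WITH (FIELDTERMINATOR = ',',   -- comma-delimited fields
      ROWTERMINATOR = '\n',    -- one record per line
      TABLOCK);                -- coarse lock; helps throughput on a single-writer load

You could then loop over the month's CSV files and run one BULK INSERT per file.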
Hi friends, I am trying a bulk insert using my table name but I am facing an error. So is it possible to use BULK INSERT with a temporary table variable? Please help me.
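As far as I know, BULK INSERT cannot target a table variable, but it works fine against a #temp table created in the same session. A minimal sketch with hypothetical names:

CREATE TABLE #staging (col1 varchar(50), col2 varchar(50));

BULK INSERT #staging               -- a #temp table is a legal target
FROM 'C:\Data\file.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');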
I want to move data from a text file to a SQL table. After DTS creates the table, does it use Bulk Insert to copy the data from the file to the table, or BCP?
I am trying to do a bulk insert from a data file into a linked Access database. When I run the query I get the error message:

Server: Msg 4801, Level 16, State 81, Line 1
Bulk_main: The opentable system function on BULK INSERT table failed.

I'm not sure what the problem is, because BOL just says to check Microsoft.com for updated error message information; however, when I went to the site there was no updated information. Has anyone else seen this error? If so, have you figured out the problem? Any help would be greatly appreciated. Thanks.
I have a question. As per my knowledge, bulk copying is not possible during a backup operation: if the backup starts first, the backup runs and the bulk copy fails; if the bulk copy starts first, the backup fails and the bulk copy continues. Today I was testing bcp run from DTS using the EXECUTE PROCESS TASK (with this task we can run any Win32 executable or batch file). I am trying to bcp out from one database (source) and bcp in to another database (destination). While this package was running, a backup was also running: I had started the backup job for the destination database, and while it was running I started another job to run the DTS package (I even ran the DTS manually). Both jobs succeeded and the data was inserted into the table. Can anyone shed some light on this?
We've been using BULK INSERT to load our tables, but recently we encountered this error message: "There is insufficient system memory to run this query. [SQLSTATE 42000] (Error 701). The step failed."
Then our DBA suggested that we use BCP. It seemed to work fine until the file size exceeded 30 MB. This is the message I get: "Starting copy... 0 rows copied. Network packet size (bytes): 4096. Clock Time (ms.): total 14984. Process Exit Code 0. The step succeeded." Is this a known problem?
Then we decided to use DTS. DTS seems to be able to handle any file size, but it's a slower process than the other two. Any suggestions?
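One thing worth trying before switching tools: by default both BULK INSERT and bcp load the whole file as a single batch, which is what tends to exhaust memory on big files. A hedged sketch that commits in smaller batches (path, table and batch size are placeholders):

BULK INSERT dbo.MyTable
FROM 'C:\Data\bigfile.txt'
WITH (BATCHSIZE = 50000,        -- commit every 50,000 rows
      FIELDTERMINATOR = '\t',
      ROWTERMINATOR = '\n');

The bcp equivalent is the -b batch_size switch.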
I often need to insert multiple records into a certain table, such as:

DESCRIPTION       AMOUNT
SALARY            8000
ALLOWANCE         100
CASH GIFT         400
FOOD ALLOWANCE    460
SCENARIO: I want to insert the example written above, all at one time, into a TBLcompensation table with the following fields: ID (identity, 1), DESCRIPTION, AMOUNT.
QUESTION: Are there other ways to INSERT multiple records at one time?
MY SOLUTION: I collected all the records and concatenated them into one string, with special characters separating fields and rows. Then I do the looping in the stored procedure. I believe this is not an efficient way.
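Rather than concatenating a delimited string and looping in the stored procedure, the rows can be sent set-based. A minimal sketch; the UNION ALL form works on older versions, and SQL Server 2008+ also accepts a multi-row VALUES list:

INSERT INTO TBLcompensation ([DESCRIPTION], AMOUNT)
SELECT 'SALARY', 8000
UNION ALL SELECT 'ALLOWANCE', 100
UNION ALL SELECT 'CASH GIFT', 400
UNION ALL SELECT 'FOOD ALLOWANCE', 460;

-- SQL Server 2008 and later
INSERT INTO TBLcompensation ([DESCRIPTION], AMOUNT)
VALUES ('SALARY', 8000), ('ALLOWANCE', 100),
       ('CASH GIFT', 400), ('FOOD ALLOWANCE', 460);

The ID identity column is generated automatically in both cases.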
I am loading XML files from a folder into SQL Server. I am doing this by means of an ActiveX script (VB), passing each XML file and its schema definition file as parameters. The script then automatically creates a table and loads the data into the database; for every XML file in that folder there is only one XSD. That was my intention, but when I run the DTS package an error comes up: "Error source: schema mapping, unable to upload xsd file".
Is it always necessary to add xmlns:sql="urn:schemas-microsoft-com:mapping-schema" to the schema definition file?
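As far as I know, yes: SQLXML bulk load relies on those mapping-schema annotations to know which table and columns each element maps to. A minimal sketch of the XSD header, with hypothetical element and table names:

<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
  <xsd:element name="Customer" sql:relation="Customers">
    <!-- sql:relation / sql:field annotations map elements to tables and columns -->
  </xsd:element>
</xsd:schema>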
I want to bulk copy a table using SELECT INTO from a database on server1 to a database on server2. The FROM part of the SELECT INTO only allows three name parts (database, user and table), all within the one server.
I remember seeing some option that allows one to bulk copy across servers but can't find it.
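A minimal sketch of the linked-server route, assuming SERVER1 has been registered on server2 with sp_addlinkedserver (all names here are hypothetical). SELECT INTO has to create its table locally, so run it on the destination side and pull with a four-part name:

-- run on server2: creates the local copy and pulls the rows across
SELECT *
INTO DatabaseB.dbo.MyTable
FROM SERVER1.DatabaseA.dbo.MyTable;

OPENDATASOURCE / OPENROWSET are the ad hoc alternatives when a permanent linked server isn't wanted.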
Hi, I am using the command line for a bulk copy operation. I have a couple of tables with triggers that move data from one table to another on insert. I was just wondering: are those triggers going to fire when I import data into the tables using the bcp command line?
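As far as I know, bulk loads do not fire insert triggers by default; you have to opt in. With T-SQL BULK INSERT that is the FIRE_TRIGGERS option (sketch below, hypothetical names), and the bcp command line takes the same thing as a hint: -h "FIRE_TRIGGERS".

BULK INSERT dbo.MyTable
FROM 'C:\Data\file.txt'
WITH (FIELDTERMINATOR = ',', FIRE_TRIGGERS);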
Does anyone know whether SQL BULK INSERT or BCP will allow a variable number of columns in the input CSV file? Or is it a requirement for these two commands that the input file have a fixed number of columns?
I'm using BULK INSERT with FIELDTERMINATOR = '","' and ROWTERMINATOR = '\n' for the delimited CSV file, but my input file does not have a fixed number of columns and is not padded out with the appropriate number of empty "," fields.
I am trying to do a data transfer using BULK INSERT from a .dat file, and the data is only one row. The bulk insert is giving me the error "Bulk insert data conversion error (truncation) for row 1, column 11 (extension1)". The line given below is the data as it appears in the .dat file.