I'm using ASP.NET and SqlBulkCopy to insert a few rows of data into a table. I have SqlBulkCopyOptions.FireTriggers set, and a simple trigger on the table that takes the inserted data and updates a couple of other tables. It works fine if I only insert one record, but as soon as there are multiple records it fails.
So I presume the trigger isn't firing for every row, but only once when the bulk insert is done?
Am I missing a trick? I can post some code if it will help - I'll have to get a screengrab of the error too.
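For what it's worth, an AFTER INSERT trigger fires once per statement, and with a multi-row bulk copy the inserted pseudo-table then holds every row at once. A trigger written with single-row assumptions (for example, assigning values from inserted into scalar variables) breaks in exactly this way. A minimal set-based sketch, with hypothetical Orders/OrderSummary tables standing in for the real ones:

CREATE TRIGGER trg_Orders_Insert ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Join against 'inserted' so every bulk-copied row is handled,
    -- not just the single row a scalar assignment would see.
    UPDATE s
    SET s.LastOrderDate = i.OrderDate
    FROM dbo.OrderSummary AS s
    JOIN inserted AS i ON i.CustomerID = s.CustomerID;
END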
This is inserting thousands of rows into a table that has a trigger on it. The trigger updates a separate table.
Now, my question: will the trigger fire for every record inserted? I need it to fire only once, because the table the trigger updates becomes locked while the insert statement executes. I am assuming the table is locked because of the number of times the trigger is fired. Any help would be greatly appreciated.
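In fact an AFTER trigger fires once per statement, not once per row, so a single bulk insert invokes it one time with all the rows visible in the inserted pseudo-table. A quick way to confirm this, sketched with hypothetical table names:

CREATE TRIGGER trg_CountInserts ON dbo.TargetTable
AFTER INSERT
AS
    -- One log row is written per INSERT statement; RowsInserted shows
    -- how many rows that single firing saw.
    INSERT INTO dbo.AuditLog (RowsInserted, LoggedAt)
    SELECT COUNT(*), GETDATE()
    FROM inserted;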
When you import data using DTS into a table that has triggers, do the triggers fire if they are defined FOR INSERT or AFTER INSERT? Thanks, Micah
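By default they do not, as far as I know: DTS's fast-load path behaves like other bulk-load operations, which skip triggers unless explicitly enabled (with fast load turned off, DTS falls back to row-by-row inserts, which do fire triggers). The T-SQL equivalent of that switch, shown with placeholder names:

BULK INSERT dbo.TargetTable
FROM 'C:\data\import.dat'            -- placeholder path
WITH (FIELDTERMINATOR = ',', FIRE_TRIGGERS);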
I am currently struggling with a problem bulk inserting data into SQL Server.
The application I am writing is multi-threaded and downloads about 2000 records every 2 seconds on x number of threads (depending on bandwidth).
What I would like to do is find a 'friendly' way to insert this data into SQL Server without hammering the CPU. I have tried the following, with little success:
1. Using single inserts and Thread.Sleep(x * 20) to pace the massive data input. Although this made the application more stable and lowered CPU usage to very little, the data takes about 60 times longer to download and process into the database.
2. Using a SqlDataAdapter and DataSet and updating the database via the data adapter's .Update method. Simply put, this was awful and took forever to process the data into the database (about 30 seconds to process 2000 records, even with the command timeout set high).
3. Using OpenXML in SQL Server and passing the data as an XML string (ntext). Although this method is fast, it is still very CPU intensive, and I have to set the command timeouts very high to allow for this approach (because of the multi-threaded nature of the app).
Does anyone have any ideas on a CPU-friendly approach to this problem?
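One approach to consider, assuming an upgrade to SQL Server 2008 or later is possible: accumulate each downloaded batch client-side and pass it to a stored procedure as a table-valued parameter (sent from ADO.NET as a SqlDbType.Structured parameter), so the server performs one set-based insert per 2000-row batch instead of parsing XML or thousands of individual statements. A minimal sketch; the type, table, and column names are hypothetical:

CREATE TYPE dbo.RecordBatch AS TABLE
(
    RecordID int NOT NULL,
    Payload  nvarchar(400) NULL
);
GO

CREATE PROCEDURE dbo.InsertRecordBatch
    @Batch dbo.RecordBatch READONLY    -- filled client-side with one downloaded batch
AS
BEGIN
    SET NOCOUNT ON;
    -- One set-based insert per batch; no per-row parsing on the server
    INSERT INTO dbo.Records (RecordID, Payload)
    SELECT RecordID, Payload
    FROM @Batch;
END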
The following produces no errors, but does not place any rows in the table:
USE MoneyManager
GO

BULK INSERT DailyPrices
FROM 'C:\Downloads\Securities\s20070105.dat'
WITH
(
    firstrow = 2,
    formatfile = 'C:\Downloads\DailyPrices.fmt',
    errorfile = 'c:\downloads\price_error_file.txt'
)
Hi all, I'm trying to bulk insert a uniqueidentifier column from a Unicode file. In my file I have GUIDs generated from a C# application, formatted this way (separated by "|"):

guid|field1|field2
fc0c0c42-438e-4897-96db-8b0489e873ef|field1|field2

In my destination table I have three columns:

id (uniqueidentifier)
field1 (nvarchar)
field2 (nvarchar)

In the bulk insert I use a format file like this:

9.0
3
1 SQLNCHAR 0 0 "|" 1 ID Latin1_General_CI_AS
2 SQLNCHAR 0 0 "|" 2 Field1 Latin1_General_CI_AS
3 SQLNCHAR 0 0 "|" 3 Field2 Latin1_General_CI_AS

and I use this script:

BULK INSERT [dbo].[KWTA2] FROM 'd:\WTA2.txt'
WITH (FORMATFILE = 'd:\wta2Format.FMT')

It doesn't work; it prints out:

Msg 8152, Level 16, State 13, Line 2
String or binary data would be truncated.

I've also tried specifying SQLUNIQUEID instead of SQLNCHAR in the FMT file, and that runs, but it imports different data. For example, the GUID fc0c0c42-438e-4897-96db-8b0489e873ef became 00350031-0039-0033-3100-300030003000.

Please can you help me? Why does SQL convert the alphanumeric GUID into an ID of only numbers? How can I bulk insert GUIDs? (I didn't find anything googling around.) Thanks! Bob
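One workaround, sketched under the assumption that the rest of the format file is correct: bulk insert into a staging table whose id column is plain nvarchar, then convert during the copy into the real table. SQLUNIQUEID tells the bulk loader that the field holds a GUID's raw 16-byte binary form, so applying it to readable text reinterprets the character bytes as GUID bytes, which matches the kind of scrambled value you're seeing. The staging table name and lengths are hypothetical:

CREATE TABLE dbo.KWTA2_Staging
(
    id     nvarchar(50),
    field1 nvarchar(50),
    field2 nvarchar(50)
);

BULK INSERT dbo.KWTA2_Staging FROM 'd:\WTA2.txt'
WITH (FORMATFILE = 'd:\wta2Format.FMT');

-- Convert the textual GUID while copying into the real table
INSERT INTO dbo.KWTA2 (id, field1, field2)
SELECT CONVERT(uniqueidentifier, LTRIM(RTRIM(id))), field1, field2
FROM dbo.KWTA2_Staging;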
Hi Andrea, I have made a table which contains data inserted manually and also data that was inserted using bulk insertion. I have no problems using the table with a GridView. But when I try to use a query like the following:
SELECT * FROM dbo.last WHERE VMake = 'Honda' AND VType = 'sedan' AND VColor = 'Red';
I only get the data that was inserted manually. When I use the same query to filter data from a table whose data was loaded with bulk insert, I get the column names but no data.
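A common cause, offered here as a guess: bulk-loaded text columns often carry invisible trailing characters (spaces, or a carriage return on the last field of each row), so the equality comparisons never match. A quick diagnostic, assuming the table and columns from the query above:

SELECT VColor,
       LEN(VColor)        AS CharCount,   -- LEN ignores trailing spaces
       DATALENGTH(VColor) AS ByteCount    -- DATALENGTH does not
FROM dbo.last
WHERE VMake LIKE '%Honda%';

If ByteCount is larger than the visible text warrants, the bulk-loaded values contain trailing whitespace or control characters that need to be trimmed (or the row terminator in the load needs fixing).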
And I have data in a file as follows (the column delimiter is | and the row delimiter is the newline character (Windows)):
4000000|10/25/2007 6:07:32 AM
4000001|10/25/2007 6:07:32 AM
4000002|10/25/2007 6:07:32 AM
4000003|10/25/2007 6:07:32 AM
4000004|10/25/2007 6:07:32 AM
BULK INSERT dbo.BulkTest
FROM 'c:\insert.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

When I execute this statement it fails with the following message:

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (SubDate).

Can anyone please let me know what's wrong with this and what should be done to make it work? A quick response will be appreciated. Thanks much, ~Mohan
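Since the file uses Windows line endings, a plausible fix is to spell the terminator out as carriage return plus line feed. With ROWTERMINATOR = '\n' alone, the \r stays glued to the end of the datetime field and breaks the conversion:

BULK INSERT dbo.BulkTest
FROM 'c:\insert.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\r\n');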
Using SS2K, I'm getting the following error while bulk inserting:

Column 'warranty_expiration_date' cannot be modified because it is a computed column.

Here is my bulk insert statement:

BULK INSERT dbo.TestData
FROM 'TestData.dat'
WITH (CHECK_CONSTRAINTS, FIELDTERMINATOR = '|', MAXERRORS = 1, FORMATFILE = 'TestData.fmt')

The computed column is not referenced in the format file and the data file does not contain the computed data. Thanks
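One workaround that is sometimes suggested, sketched here with hypothetical column names since the real ones aren't shown: create a view that exposes every column except the computed one and bulk insert into the view, so the engine never tries to touch warranty_expiration_date. The existing format file should still line up, since it never referenced the computed column:

CREATE VIEW dbo.TestData_Load
AS
SELECT purchase_date, warranty_months, serial_no   -- hypothetical: all columns except the computed one
FROM dbo.TestData;
GO

BULK INSERT dbo.TestData_Load
FROM 'TestData.dat'
WITH (CHECK_CONSTRAINTS, FIELDTERMINATOR = '|', MAXERRORS = 1, FORMATFILE = 'TestData.fmt')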
I am trying to BULK INSERT csv files using a stored procedure in SQL Server 2008 R2 SP3. Although the files contain several thousand lines and BULK INSERT returns no errors, no data is actually imported into the table. Every field in the table is of the NVARCHAR(50) datatype.
Here is the code for the operation (only the parameters for the insert itself):
set @open = 'bulk insert [DWHStaging].[dbo].[Abverkaufsquote] from '''
set @path = 'G:\DataStaging\DWHStaging\Source\Abverkaufsquote'
set @params = ''' with (firstrow = 2, datafiletype = ''widechar'', fieldterminator = '';'', rowterminator = ''\n'', codepage = ''1252'', keepnulls);'
The csv file originates from a DB2 database. Using exactly the same code base I can import several other types of CSV files without problem.
The files are stored on the local server as UCS-2 Little Endian. One difference is that the files that do not import lack a BOM; the other difference is that the failed files are not actually Unicode files.
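If the failing files really are single-byte ANSI rather than UCS-2, then datafiletype = ''widechar'' would be the wrong setting for them: widechar expects two-byte characters, and nothing re-interprets the file when the BOM is missing. A hedged variant to try for those files (the file name below is a placeholder):

BULK INSERT [DWHStaging].[dbo].[Abverkaufsquote]
FROM 'G:\DataStaging\DWHStaging\Source\Abverkaufsquote\example.csv'   -- placeholder file name
WITH (FIRSTROW = 2,
      DATAFILETYPE = 'char',          -- single-byte file instead of widechar
      CODEPAGE = '1252',
      FIELDTERMINATOR = ';',
      ROWTERMINATOR = '\n',
      KEEPNULLS);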
I just want to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes. The row delimiter is "\n" and the column delimiter is '|'.
Table Subjects {
    ID (autoincrement)
    SubjectId varchar(20)
}
Can anyone provide the format file for doing this, or suggest another way to do it? Please note that the file may contain millions of records.
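A non-XML format file can do this: map the first field in the file to server column 2 (SubjectId), and map the classes field to server column 0, which tells BULK INSERT to discard it. A sketch, assuming two fields per line and SQL Server 2005 (version header 9.0); the field lengths are guesses:

9.0
2
1  SQLCHAR  0  20    "|"     2  SubjectId  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  8000  "\r\n"  0  Classes    ""

BULK INSERT dbo.Subjects
FROM 'c:\subjects.dat'                 -- placeholder path
WITH (FORMATFILE = 'c:\subjects.fmt');

Because the identity column ID is never mentioned on the server side, it is generated automatically for each of the millions of rows.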
Hi, I have a data file which consists of data as below:

4
PPU_FFA7485E0D||
T_GLR_DET_11||
While I am inserting into the table using bulk insert, this pipe (||) is also getting inserted into the table. Here is the query I am using to insert the data with bulk insert:
BULK INSERT TABLE_NAME FROM FILE_PATH
WITH (FIELDTERMINATOR = '||', KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '\n')
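If the table has a single data column, the field terminator never matches and the trailing pipes simply ride along inside the value. One simple option, assuming '||' only ever appears at the end of each value (TABLE_NAME and COLUMN_NAME are the same placeholders as above): strip it after the load.

UPDATE TABLE_NAME
SET COLUMN_NAME = LEFT(COLUMN_NAME, LEN(COLUMN_NAME) - 2)
WHERE COLUMN_NAME LIKE '%||';

Alternatively, making the trailing pipes part of the row terminator (ROWTERMINATOR = '||\r\n') should keep them out of the table in the first place.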
This isn't a problem as such; it's more of a debate.
If a table needs a number of update triggers which do differing tasks, should these triggers be separated out or encapsulated into one all-encompassing trigger? Speaking in terms of performance, it doesn't make much difference either way, depending on the tasks performed; I was wondering in terms of maintenance, best practice, etc. My view is that if the triggers do totally differing tasks, each should be a trigger of its own.
I have to update a field within a table of 60 records or so. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update, like bulk insert, but I don't recall that being possible.
Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
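The staging-table route is straightforward, for what it's worth. A sketch with hypothetical table and column names, assuming the Excel sheet is saved out as a CSV first:

CREATE TABLE #NewValues
(
    RecordID int          NOT NULL,   -- key identifying each of the ~60 rows
    NewValue varchar(100) NOT NULL
);

BULK INSERT #NewValues
FROM 'c:\newvalues.csv'               -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n');

-- Merge the staged values into the real table
UPDATE t
SET t.TargetField = n.NewValue
FROM dbo.TargetTable AS t
JOIN #NewValues AS n ON n.RecordID = t.RecordID;

For only 60 rows, pasting 60 plain INSERT statements built in Excel into the staging table works just as well as the bulk insert.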
I have a table containing 8 million records. I need to replace 2 million of these records with the results of scaled-down queries that go something like:

SELECT 1, ShareholderID, Assets1 FROM MyTable (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2 FROM MyTable (yields approx. 200,000 records)
. . .
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9 FROM MyTable (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way:

SELECT the 6 million records that don't need to be deleted into a #TempTable
Use the statements above to SELECT into the same #TempTable
DROP and recreate the original table
SELECT the 6 + 2 million records INTO the original table
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
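A variant that skips the temp table, sketched under the assumption that the table can be swapped during a maintenance window: build the replacement table directly, then rename. SELECT ... INTO and INSERT ... WITH (TABLOCK) can both qualify for minimal logging (depending on the recovery settings), which is the main win over row-by-row updates. The predicate and column list are illustrative:

-- 1. Keep the 6 million surviving rows
SELECT Bucket, ShareholderID, Assets
INTO dbo.MyTable_New
FROM dbo.MyTable
WHERE KeepRow = 1;                     -- hypothetical predicate identifying rows to keep

-- 2. Append the rebuilt 2 million rows, one scaled-down query at a time
INSERT INTO dbo.MyTable_New WITH (TABLOCK)
SELECT 1, ShareholderID, Assets1 FROM dbo.MyTable;
-- ... repeat for buckets 2 through 10 ...

-- 3. Swap the tables
EXEC sp_rename 'dbo.MyTable', 'MyTable_Old';
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';

Dumping to a file and reloading with bcp / BULK INSERT buys little here, since the data never needs to leave the server.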
I'm trying to use BULK INSERT for the first time and am getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, therefore the "\r\n" terminator on line 7 is correct, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
Before implementing memory-based bulk copy with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to check a few considerations.
- performance: how does it compare with T-SQL's "BULK INSERT ..." and the bcp utility?
- SQL Server resource usage: how does a memory-based bulk copy affect server resources while it runs?
- server-side behavior: when the server is busy, does delayed updating mean IRowsetFastLoad::Commit(true) can return before the rows are actually inserted?
- row count: is there a limit on the number of rows that can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to have the identity column in the table too.
Hi~, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (the maximum value of nCount)?

for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}

2. In the code sample above, is there a method for inserting a prepared array all at once directly (a BulkData array, not a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?

BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------

My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
I am trying to insert data into two different tables. I will insert into Table2 based on an ID I get from a SELECT on Table1.

Insert Table1 (Title, Description, Link, Whatever) Values (@title, @description, @link, @Whatever)
Select WhateverID from Table1 Where Description = @Description
Insert into Table2 (CategoryID, WhateverID) Values (@CategoryID, @WhateverID)

This statement is not working. What should I do? Should I use a stored procedure? I am writing in C#. Can someone please help!
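A stored procedure is a good fit here. Assuming WhateverID is an identity column on Table1 (the names below mirror the post; the parameter types are assumptions), SCOPE_IDENTITY() hands back the value just generated, so the lookup by Description isn't needed and can't pick the wrong row when descriptions repeat:

CREATE PROCEDURE dbo.InsertWhatever
    @title       varchar(200),
    @description varchar(500),
    @link        varchar(500),
    @Whatever    varchar(100),
    @CategoryID  int
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO Table1 (Title, Description, Link, Whatever)
    VALUES (@title, @description, @link, @Whatever);

    -- Identity value generated by the INSERT above, scoped to this batch
    DECLARE @WhateverID int;
    SET @WhateverID = SCOPE_IDENTITY();

    INSERT INTO Table2 (CategoryID, WhateverID)
    VALUES (@CategoryID, @WhateverID);
END

From C#, call the procedure with a SqlCommand whose CommandType is StoredProcedure and pass the five values as parameters.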
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message:
"Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly.
Bulk load data conversion error (overflow) for row 1, column 1 (rowno)."
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into the TableName table with no error messages:

BULK INSERT TableName
FROM 'C:\Data\Db\TableName.bcp'
WITH (DATAFILETYPE = 'widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file written with the bcp widenative data type. The properties of the Bulk Insert Task are the following:

DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help. Paul
Is it possible to achieve this using triggers? When someone tries to delete a row in table A, the trigger should first delete the corresponding row in table B and then delete the row in table A. The reason: there is a foreign key on table B that references table A, so any attempt to delete a row in table A without first deleting the corresponding row from B throws an error.
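Yes; an INSTEAD OF DELETE trigger replaces the original delete entirely, so it can clear the child rows before touching the parent. A sketch with hypothetical key names (note that declaring the foreign key with ON DELETE CASCADE would achieve the same thing without a trigger):

CREATE TRIGGER trg_TableA_Delete ON dbo.TableA
INSTEAD OF DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Remove children first so the foreign key is never violated
    DELETE b
    FROM dbo.TableB AS b
    JOIN deleted AS d ON d.A_ID = b.A_ID;

    -- Then perform the delete the statement originally asked for
    DELETE a
    FROM dbo.TableA AS a
    JOIN deleted AS d ON d.A_ID = a.A_ID;
END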
Hello all, I have to write a trigger for UPDATE. I have two tables: one stores the current values, and one stores the history of values. How do I write a trigger on the main table? We have the inserted and deleted tables through which we can find values, but we don't have any table of UPDATED values. Help me.
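For an UPDATE there is no separate "updated" pseudo-table: deleted holds the before values and inserted holds the after values of the same rows. So a history trigger just copies from deleted (or from inserted, if the new values are what should be archived). A sketch with hypothetical table and column names:

CREATE TRIGGER trg_Main_Update ON dbo.MainTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Archive the values the rows had before this UPDATE
    INSERT INTO dbo.HistoryTable (ID, Value, ArchivedAt)
    SELECT d.ID, d.Value, GETDATE()
    FROM deleted AS d;
END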
I need to create a set of rows every time a new row is inserted into a table. Example (I think this would work)...

select @insertedId = column1 from inserted
select @id = column1 from table1 where column2 in (select column1 from table2 where column2 = @insertedId)
insert into table3 values (x, y, @id)

Is it possible to do the same kind of thing in a situation where the select statement returns multiple values, executing the insert statement for each of those values? Also, if table3 was in fact the table on which the trigger acts, would it then be executed for every row created by the trigger? Sorry if I sound confused. I am.
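Yes, and without a loop: rewriting the scalar assignments as one INSERT ... SELECT makes every matching value (and every row in inserted) produce a row. A sketch following the pseudo-code above, with x and y standing in for the literal values as in the post:

INSERT INTO table3
SELECT x, y, t1.column1
FROM inserted AS i
JOIN table2  AS t2 ON t2.column2 = i.column1
JOIN table1  AS t1 ON t1.column2 = t2.column1;

On the second question: a trigger on table3 would fire once for this whole INSERT statement, not once per inserted row (and note that by default, a trigger on its own target table does not fire recursively unless RECURSIVE_TRIGGERS is enabled).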
Hi everybody, I tried to put this logic in SQL Server 2000, but it is not taking the @ActionTaken value set inside; it only keeps the default value of @ActionTaken:

SET @ActionTaken = 'A'
IF (@AType = 'A')
    IF @Status = 'O'
        IF (@KAppInd = 'Y' AND @DAppInd = null)
        BEGIN
            SET @ActionTaken = 'O'
        END

Please tell me another option in SQL Server 2000 for setting a variable value based on conditions.
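Two things worth noting. First, @DAppInd = null is never true under the default ANSI_NULLS setting; NULL has to be tested with IS NULL, which by itself would explain why the inner SET never runs. Second, a single CASE expression (fully supported in SQL Server 2000) replaces the nested IFs:

SET @ActionTaken = CASE
                       WHEN @AType = 'A'
                            AND @Status = 'O'
                            AND @KAppInd = 'Y'
                            AND @DAppInd IS NULL   -- IS NULL, not '= null'
                       THEN 'O'
                       ELSE 'A'
                   END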
Hi, using triggers I try to insert some values into my 2 tables, but it shows the error: "The request for procedure 'Triginsert123s' failed because 'Triginsert123s' is a trigger object." This is my code in the back end:

sqlcon.Open()
Dim cmd As New SqlCommand("Triginsert123s '" & txtID.Text & "','" & txtName.Text & "','" & txtRole.Text & "','" & txtDep.Text & "'", sqlcon)
cmd.ExecuteNonQuery()
sqlcon.Close()

My trigger is:

CREATE TRIGGER Triginsert123s ON [dbo].[EmpRole]
FOR INSERT
AS
declare @Eid as tinyint, @Ename as varchar(50), @Role as char(10)
Insert into Emprole (Eid, Ename, Role) values (@Eid, @Ename, @Role)
insert into empdep (eid, dep) values (@eid, @Role)

What's the problem? Please help, I am new to triggers.
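A trigger can't be executed like a stored procedure; it fires automatically when the application runs a plain INSERT against EmpRole. The trigger itself then shouldn't insert into EmpRole again (that's the row that fired it); it only needs to copy the new rows into empdep, reading them from the inserted pseudo-table rather than from never-assigned variables. A sketch, keeping the post's column mapping of Role into dep:

-- Application side: run a parameterized INSERT, not the trigger name, e.g.
-- INSERT INTO EmpRole (Eid, Ename, Role) VALUES (@Eid, @Ename, @Role)

CREATE TRIGGER Triginsert123s ON [dbo].[EmpRole]
FOR INSERT
AS
BEGIN
    -- 'inserted' holds the row(s) just added to EmpRole
    INSERT INTO empdep (eid, dep)
    SELECT Eid, Role
    FROM inserted;
END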
Hi all, I'm using triggers to maintain my transaction log, capturing inserts and updates. It works fine, except that if the user clicks the Save button more than once, the trigger fires and the record is written to the log even if the record wasn't changed. Does anyone know how to check whether the record was actually changed, so that it isn't written to the log table if it wasn't?
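One set-based way, assuming SQL Server 2005 or later and hypothetical table/column names: compare each row's new values against its old ones and log only the rows that differ. The EXISTS ... EXCEPT idiom treats NULLs sensibly, which plain <> comparisons do not:

CREATE TRIGGER trg_MyTable_Log ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.ChangeLog (ID, Col1, Col2, ChangedAt)
    SELECT i.ID, i.Col1, i.Col2, GETDATE()
    FROM inserted AS i
    JOIN deleted  AS d ON d.ID = i.ID
    -- A row qualifies only if at least one column value actually changed
    WHERE EXISTS (SELECT i.Col1, i.Col2
                  EXCEPT
                  SELECT d.Col1, d.Col2);
END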
When I execute a stored proc from my asp.net page, will the results of a trigger be returned to my program?
For instance say my stored proc is:
Update Employees set Lastname = @Lastname where ID = @ID
And my trigger is:
CREATE TRIGGER tr_Employees_U on Employees FOR UPDATE AS
IF UPDATE(lastname)
BEGIN
    RAISERROR ('cannot change lastname', 16, 1)
    ROLLBACK TRAN
    RETURN
END
GO
It seems like, since this is an AFTER trigger, my webpage would actually get a valid return code from my stored procedure, but the trigger would roll back those changes, correct? Or would the trigger fire and send its return code to my webpage?
I'm trying to write an INSTEAD OF trigger for a view in SQL Express. The table and views are defined as such:

CREATE TABLE [dbo].[Work](
    [WorkID] [int] IDENTITY(1,1) Primary Key,
    [ResourceID] [int] NOT NULL,
    [TaskID] [int] NOT NULL,
    [WorkDate] [datetime] NOT NULL,
    [WorkQuantity] [float] NOT NULL,
    [IsEstimate] [bit] NOT NULL DEFAULT ((0)),
    [Project] [int] NOT NULL
);

CREATE VIEW [dbo].[ActualWork]
AS
SELECT WorkID, ResourceID, TaskID, WorkDate, WorkQuantity, Project
FROM dbo.[Work]
WHERE (IsEstimate = 0);

CREATE VIEW [dbo].[EstimatedWork]
AS
SELECT WorkID, ResourceID, TaskID, WorkDate, WorkQuantity, Project
FROM dbo.[Work]
WHERE (IsEstimate = 1);

Given that, what is wrong with the following CREATE TRIGGER statement?

Create Trigger trg_InsertActualWork on dbo.ActualWork
Instead of Insert
as
BEGIN
    Insert into dbo.Work
        ( ResourceID, TaskID, Project, WorkDate, WorkQuantity, IsEstimate )
    values
        ( inserted.ResourceID, inserted.TaskID, inserted.Project, inserted.WorkDate, inserted.WorkQuantity, 0 );
END
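The problem is the VALUES clause: inserted is a pseudo-table, potentially holding many rows, so it can't be referenced column-by-column as if it were a single scalar row. Selecting from it instead is the fix:

Create Trigger trg_InsertActualWork on dbo.ActualWork
Instead of Insert
as
BEGIN
    -- Read from 'inserted' as a set; handles single- and multi-row inserts
    Insert into dbo.Work
        ( ResourceID, TaskID, Project, WorkDate, WorkQuantity, IsEstimate )
    Select ResourceID, TaskID, Project, WorkDate, WorkQuantity, 0
    From inserted;
END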