I have programmatically created a SqlConnection that begins a SqlTransaction. During the first part of this SqlTransaction, the contents of a table are deleted. The next part uses the SqlBulkCopy object to copy data from another database (in the form of a DataTable). The delete goes through fine, but the SqlBulkCopy always generates a SqlException with the message "Unexpected existing transaction." I cannot think of anything I am doing wrong.
The code reads an XML file for instructions on each transaction. Each transaction is composed of tasks, and each task pulls data from a different type of database (MVR.Command is a database factory object). Please look at the code below and tell me if you can spot what I am doing wrong:
using (SqlConnection destinationConnection = new SqlConnection(MVR.ConnectionSource.GetConnectionString(destinationServiceName)))
{
    destinationConnection.Open();
    using (SqlTransaction transaction = destinationConnection.BeginTransaction(IsolationLevel.Snapshot, "Transport"))
    {
        transaction.Save("Beginning");
        int totalTasks = 0;
        int successfulTasks = 0;
        foreach (XmlNode taskNode in transactionNode.SelectNodes("Tasks/Task"))
        {
            totalTasks += 1;
            try
            {
                // Prepare the destination table (delete everything)
                int rowsDeleted = new SqlCommand("delete from " + destinationTablename, destinationConnection, transaction).ExecuteNonQuery();
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    bulkCopy.DestinationTableName = destinationTablename;
                    // Based on the success of all tasks, either commit or rollback
                    if (successfulTasks == totalTasks)
                    {
                        transaction.Commit();
                    }
                    else
                    {
                        transaction.Rollback();
                    }
                }
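For what it's worth, the usual cause of "Unexpected existing transaction" is constructing SqlBulkCopy with only the connection while a transaction is already open on that connection; that constructor tries to manage its own internal transaction and finds yours in the way. A minimal sketch of the alternative, using the overload that accepts an external transaction (sourceTable here is a hypothetical DataTable standing in for whatever the task pulled from the source database):

// Hand the already-open transaction to SqlBulkCopy instead of letting it start its own.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection, SqlBulkCopyOptions.Default, transaction))
{
    bulkCopy.DestinationTableName = destinationTablename;
    bulkCopy.WriteToServer(sourceTable); // now participates in the "Transport" transaction
}

With this overload the delete and the bulk copy commit or roll back together.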
Hi, my code worked fine before I placed the SqlTransaction in it. Now it is showing "Unexpected existing transaction." Please tell me where I am going wrong. My code:

protected void Page_Load(object sender, EventArgs e)
{
    using (SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["Mfund_String"].ConnectionString))
    {
        con.Open();
        using (SqlConnection destinationConnection = new SqlConnection(ConfigurationManager.ConnectionStrings["Mfund_String"].ConnectionString))
        {
            destinationConnection.Open();
            using (SqlTransaction transaction = destinationConnection.BeginTransaction())
            {
                SqlCommand DelCmd = new SqlCommand("delete from mfund_data", con);
                DelCmd.ExecuteNonQuery();
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    bulkCopy.DestinationTableName = "mfund_data";
                    try
                    {
                        bulkCopy.WriteToServer(CreateDataTableFromFile());
                        transaction.Commit();
                    }
                    catch (Exception ex)
                    {
                        transaction.Rollback();
                        Response.Write(ex.Message);
                    }
                }
            }
            destinationConnection.Close();
        }
        con.Close();
    }
}

Regards, Jagadeesh
How do I get a new table added to a transactional publication with read-only subscribers? Is it not possible to add new tables without completely dropping and recreating the publication?
I went to look at the connection string previously entered for a dataset created in a new report, and I'm not seeing anything intuitive for bringing up the associated data source dialog box that was used to enter the name, type, and connection string. I'm also not seeing anything intuitive for deleting an existing dataset. How do you do these two very simple things in an existing project? I don't see the dataset in Solution Explorer; I see it only in the text box on the Data tab, and in a limited way in the dataset view, where the columns show and maintenance is allowed mostly on the columns only. I tried highlighting the dataset there and hitting the Delete key, to no avail.
I would like to restore a database using the RESTORE DATABASE ... REPLACE command. If the database already exists and has any open connections, this command will fail. I would like to close all existing connections to the specific database before running RESTORE DATABASE ... REPLACE. I can do this from Management Studio using the "Close existing connections" checkbox when deleting a database; I need to do the same thing, but from a script.
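One commonly used approach (a sketch, not a script tested against your environment) is to flip the database to SINGLE_USER with ROLLBACK IMMEDIATE before the restore, which rolls back and disconnects every other session, then back to MULTI_USER afterwards. The server name, database name, and backup path below are placeholders; the same ALTER DATABASE statements work in a plain T-SQL script run against master:

using (SqlConnection master = new SqlConnection("Data Source=MyServer;Initial Catalog=master;Integrated Security=SSPI"))
{
    master.Open();
    // Kick out every other session, rolling back their open transactions immediately.
    new SqlCommand("ALTER DATABASE [MyDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE", master).ExecuteNonQuery();

    SqlCommand restore = new SqlCommand(@"RESTORE DATABASE [MyDb] FROM DISK = N'C:\Backups\MyDb.bak' WITH REPLACE", master);
    restore.CommandTimeout = 0; // a large restore can easily exceed the default 30 seconds
    restore.ExecuteNonQuery();

    new SqlCommand("ALTER DATABASE [MyDb] SET MULTI_USER", master).ExecuteNonQuery();
}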
I am using SqlBulkCopy to copy an Excel spreadsheet into a SQL Express database table. The copy went okay, but the only problem I have is that a few records in one of the columns are NULL. The column contains both numeric and alphanumeric fields in the spreadsheet. After copying, the numeric fields came across with no complication, but the alphanumeric fields all became NULL. Can someone help me with this problem?
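This is usually the Jet/ACE type-guessing rather than SqlBulkCopy itself: the provider samples the first rows of each column, decides the column is numeric, and then returns NULL for the cells it cannot convert. A hedged sketch of the usual workaround, adding IMEX=1 so mixed columns are read as text (file path, sheet name, and the destination connection string are placeholders; assumes System.Data.OleDb and System.Data.SqlClient):

string excelConnStr =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MySheet.xls;" +
    @"Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"""; // IMEX=1: treat mixed-type columns as text

using (OleDbConnection excelConn = new OleDbConnection(excelConnStr))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", excelConn))
{
    excelConn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnectionString)) // placeholder connection string
    {
        bulkCopy.DestinationTableName = "MyTable";
        bulkCopy.WriteToServer(reader);
    }
}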
Hi, I have a question: is it possible to transfer data from one SQL Server to another SQL Server over the network? (The two SQL Server databases are in different cities.)
Hello, I was curious if anyone knew if using SqlBulkCopy in code required any special permissions on the database side. I wasn't sure if more permissions than writing capabilities were needed. Thanks.
I need to copy large amounts of data between SQL databases, and between SQL and Access. I've been reading a lot of good explanations of SQLBulkCopy, but the only good examples I've found are written in C# and I work in VB. I'm very new to ASP.NET 2.0 (about a month) and came from Classic ASP, not .NET 1.1. I'm still getting my feet wet and I'm afraid it doesn't take much to confuse me. Can someone point me to a good article or tutorial that includes a clear example written in VB? Diane
Hi, I am trying to use SqlBulkCopy to insert the contents of a DataTable into a temporary table in the database. The content of the DataTable comes from a CSV file. I can add the rows to the DataTable, but it seems they are added as strings, and when I try to WriteToServer(dataTable) it chokes because my table has some int fields. Not sure how to do this.

private static TimeSpan DoBulkCopy(string filePath)
{
    Stopwatch stopWatch = new Stopwatch();
    stopWatch.Start();
    StreamReader sr = new StreamReader(filePath);
    prepareTable();
    string fullFileStr = sr.ReadToEnd();
    sr.Close();
    sr.Dispose();
    string[] lines = fullFileStr.Split('\n');
    DataTable dt = new DataTable();
    string[] sArr = lines[0].Split(',');
    foreach (string s in sArr)
    {
        dt.Columns.Add(new DataColumn());
    }
    DataRow row;
    string finalLine = "";
    foreach (string line in lines)
    {
        row = dt.NewRow();
        finalLine = line.Replace(Convert.ToString('\r'), "");
        row.ItemArray = finalLine.Split(',');
        dt.Rows.Add(row);
    }
    SqlConnection cn = new SqlConnection(System.Configuration.ConfigurationManager.AppSettings["connectionString"].ToString());
    System.Data.SqlClient.SqlBulkCopy bc = new System.Data.SqlClient.SqlBulkCopy(cn, SqlBulkCopyOptions.TableLock, null);
    bc.BatchSize = dt.Rows.Count;
    cn.Open();
    bc.DestinationTableName = "tmpTable";
    bc.WriteToServer(dt); // <--------------- it dies here
    cn.Close();
    bc.Close();
    TimeSpan ts = stopWatch.Elapsed;
    stopWatch.Stop();
    return ts;
}

private static void prepareTable()
{
    SqlConnection cn = new SqlConnection(System.Configuration.ConfigurationManager.AppSettings["connectionString"].ToString());
    string sql = @"if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[tmpTable]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
        drop table [dbo].[tmpTable];
        CREATE TABLE [dbo].[tmpTable] ([Remote] [int], [KFP] [int]) ON [PRIMARY]";
    SqlCommand cmd = new SqlCommand(sql, cn);
    cn.Open();
    cmd.ExecuteNonQuery();
    cn.Close();
    cmd.Dispose();
}
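A DataColumn added without a type defaults to string, so everything in the table really is a string by the time WriteToServer runs. A sketch of one fix, assuming the CSV genuinely holds two integer columns matching tmpTable: declare the columns with the destination types and parse while loading, so SqlBulkCopy never has to guess.

DataTable dt = new DataTable();
dt.Columns.Add("Remote", typeof(int));
dt.Columns.Add("KFP", typeof(int));

foreach (string line in lines)
{
    if (line.Trim().Length == 0) continue;   // skip blank trailing lines
    string[] fields = line.Split(',');
    DataRow row = dt.NewRow();
    row["Remote"] = int.Parse(fields[0]);    // explicit parse makes a bad row fail early with a clear message
    row["KFP"] = int.Parse(fields[1]);
    dt.Rows.Add(row);
}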
I have a problem using SqlBulkCopy for updating tables; in fact, I can't update any table. I use this class to insert big groups of records, but I would like to know how I can configure SqlBulkCopy to update rows: if a record already exists (the primary key exists), I would like to update it.
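SqlBulkCopy itself only inserts, so updates are normally done by bulk copying into a staging table and then running a set-based UPDATE/INSERT against the real table. A sketch of that pattern (table and column names are made up; sourceTable is the DataTable of new or changed rows):

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    // Empty staging copy of the target (same schema, no data).
    // If the target's key is an IDENTITY column, either strip that property here or
    // use SqlBulkCopyOptions.KeepIdentity so the staged key values are preserved.
    new SqlCommand("SELECT TOP 0 * INTO #Staging FROM dbo.Target", conn).ExecuteNonQuery();

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
    {
        bulkCopy.DestinationTableName = "#Staging";
        bulkCopy.WriteToServer(sourceTable);
    }

    string merge =
        "UPDATE t SET t.Value = s.Value FROM dbo.Target t JOIN #Staging s ON s.Id = t.Id; " +
        "INSERT INTO dbo.Target (Id, Value) SELECT s.Id, s.Value FROM #Staging s " +
        "WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.Id = s.Id);";
    new SqlCommand(merge, conn).ExecuteNonQuery();
}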
Does SqlBulkCopy have any adverse effects on its target table's indexes? I like the performance gain, but am worried about creating unnecessary table scans if the indexes/statistics are not updated properly after it completes.
I'm using SqlBulkCopy. Does anyone know how I can output which row (with its column values) is throwing a duplicate primary key error when I call bulkCopy.WriteToServer(dataTable1)? Thanks
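SqlBulkCopy does not report which row violated the key, so one low-tech workaround is to look for the duplicates yourself, either before the call or after it fails. A sketch that only checks for duplicates inside the DataTable ("Id" is a hypothetical key column); if the conflict is against rows already in the destination table, the same idea works after first loading the existing key values into the dictionary:

Dictionary<object, int> seen = new Dictionary<object, int>();
for (int i = 0; i < dataTable1.Rows.Count; i++)
{
    object key = dataTable1.Rows[i]["Id"];
    if (seen.ContainsKey(key))
        Console.WriteLine("Duplicate key {0}: rows {1} and {2}", key, seen[key], i);
    else
        seen.Add(key, i);
}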
I have a collection of around 16,000 records and have been trying to find the best way to update the information in the DB. I have done a lot of reading about both BulkCopy and batch update, but haven't come to any clear conclusion as to which performs better. I am not doing any inserting, just getting a DataSet from the DB, changing the values, then wanting to update the DB again. Thanks for any help. Mick
I'm in the final throes of redesigning a web application, and one of the major improvements is adding an admin page so the users can maintain the data and system structure without the need for us to get involved. One of the areas I'd held back on was promotion of data from SQL Server to SQL Server; once the .NET 2.0 Framework came in and we moved over to VS2005, I was in a position to do this work. I have tested it and it works perfectly well for small tables, but as soon as I try it on a reasonably large table (860,000 rows) I get the following error in VS's output window: "A first chance exception of type 'System.Data.SqlClient.SqlException' occurred in System.Data.dll. An exception of type 'System.Data.SqlClient.SqlException' occurred in System.Data.dll but was not handled in user code. Additional information: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." The application data is stored on SQL 2005 servers, and I am running the application locally. I have tried running the copy from table to table on the same server (different databases) and I get the same result, which leads me to believe that it may be memory related. Running the SQL in SQL Server Management Studio takes just 15 to 20 seconds, but using VS it falls over in just under a minute. My connection string has a timeout of 3600. Here's my C# code:

// Execute reader...
using (IDataReader vReader = vCmd.ExecuteReader())
{
    // Create SqlBulkCopy...
    SqlBulkCopy vBulkData = new SqlBulkCopy(aTargetConn);
    // Set destination table name...
    vBulkData.DestinationTableName = aTableName;
    // Write data...
    vBulkData.WriteToServer(vReader);
}
aSourceConn and aTargetConn are the appropriate SqlConnections, and aTableName is the table to be populated with data (previously backed up and emptied of contents). Any help/advice/suggestions gratefully received; if any more info is needed, please ask. Thanks, Martin
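One thing worth checking: the connection string timeout does not govern the copy itself. SqlBulkCopy has its own BulkCopyTimeout property, which defaults to 30 seconds and is probably what is expiring on the 860,000-row table. A hedged tweak of the same code:

// Execute reader...
using (IDataReader vReader = vCmd.ExecuteReader())
{
    SqlBulkCopy vBulkData = new SqlBulkCopy(aTargetConn);
    vBulkData.DestinationTableName = aTableName;
    vBulkData.BulkCopyTimeout = 0;   // 0 = no time limit for the copy itself
    vBulkData.BatchSize = 10000;     // send and commit in chunks rather than one 860,000-row batch
    vBulkData.WriteToServer(vReader);
}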
I am trying to import a CSV file into a SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:

using (OleDbConnection dconn = new OleDbConnection(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\mystuff\;Extended Properties=""text;HDR=No;FMT=Delimited"""))
{
    using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
    {
        try
        {
            dconn.Open();
            using (OleDbDataReader dreader = dcmd.ExecuteReader())
            {
                try
                {
                    using (SqlConnection dconn2 = new SqlConnection(@"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
                    {
                        using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
                        {
                            try
                            {
                                dconn2.Open();
                                bc.DestinationTableName = "dbo.mytable";
                                bc.WriteToServer(dreader);
                            }
                            finally
                            {
                                dconn2.Close();
                            }
                        }
                    }
                }
                finally
                {
                    dreader.Close();
                }
            }
        }
        finally
        {
            dconn.Close();
        }
    }
}

A couple of the columns in the destination table use a bit datatype. The CSV file uses the strings "1" and "0" to represent these. When I run this code, it throws this exception:

Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. ---> System.FormatException: Failed to convert parameter value from a String to a Boolean. ---> System.FormatException: String was not recognized as a valid Boolean.
   at System.Boolean.Parse(String value)
   at System.String.System.IConvertible.ToBoolean(IFormatProvider provider)
   at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
   at MyClass.Main()

It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way. Is there any way to change this behavior? I discovered that if you change the "1" to "true" and "0" to "false" in the CSV file, it will accept them.
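One workaround (a sketch, not the only option) is to stop handing the raw text reader to SqlBulkCopy and instead stage the rows in a DataTable, converting the "1"/"0" strings into booleans in typed columns first. "Flag1" and "Flag2" are hypothetical names for the two bit columns:

DataTable staging = new DataTable();
staging.Load(dreader);                         // raw text columns from the Jet reader

DataTable typed = staging.Clone();             // same columns, no rows yet
typed.Columns["Flag1"].DataType = typeof(bool);
typed.Columns["Flag2"].DataType = typeof(bool);

foreach (DataRow src in staging.Rows)
{
    DataRow dst = typed.NewRow();
    foreach (DataColumn col in staging.Columns)
    {
        if (col.ColumnName == "Flag1" || col.ColumnName == "Flag2")
            dst[col.ColumnName] = src[col.ColumnName].ToString().Trim() == "1";
        else
            dst[col.ColumnName] = src[col.ColumnName];
    }
    typed.Rows.Add(dst);
}

bc.WriteToServer(typed);                       // typed booleans now convert cleanly to bit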
I am trying to use SqlBulkCopy to copy from an Excel spreadsheet to a table in a SQL database, and it is all working fine, but I now have to change the connection to the SQL database to an ODBC connection, and it is now erroring. This is the error I get: keyword not supported: 'dsn'. Is it possible to use SqlBulkCopy when using an ODBC connection to the SQL database?
I have a scenario whereby I'd like to insert multiple rows into a table on a SQL Server database as efficiently and easily as possible. After some research, it looked like .NET 2.0's SqlBulkCopy class would do what I want. I've tried to set something up, but it's not working. It's not even throwing an error; the code executes, but it simply hasn't done the insert by the end of it. My table structure is simple. The name of the table is LPSTUnavailableDate. It has just two columns, one of them an auto-populated ID field.
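It is hard to diagnose without the code, but a silent no-op very often means the rows went to a different database than the one being checked (a stale connection string, an attached .mdf copied into the bin folder, and so on). For comparison, here is a minimal sketch against a two-column table like LPSTUnavailableDate, assuming the non-identity column is a datetime called UnavailableDate (a made-up name); mapping only the data column lets SQL Server generate the ID values:

DataTable dt = new DataTable();
dt.Columns.Add("UnavailableDate", typeof(DateTime));
dt.Rows.Add(DateTime.Today);
dt.Rows.Add(DateTime.Today.AddDays(1));

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
    {
        bulkCopy.DestinationTableName = "dbo.LPSTUnavailableDate";
        bulkCopy.ColumnMappings.Add("UnavailableDate", "UnavailableDate");
        bulkCopy.WriteToServer(dt);   // throws immediately if the table or column name is wrong
    }
}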
I have encountered a very frustrating situation when trying to use SqlBulkCopy. I have two Excel files that I am trying to import into two tables in a SQL Server 2005 Express DB. One Excel file has 5,000 rows, while the other file has 500,000 rows. I was able to import the smaller file successfully using this VB.NET code:

Protected Sub L26ExcelToSQL()
    'Declare variables
    Dim sSQLTable As String = "Local26Members"
    Dim sExcelFileName As String = "Full Local 26 List Formatted.xls"
    Dim sWorkbook As String = "[Sheet1$]"
    Dim sSqlConnectionString As String = ConfigurationManager.ConnectionStrings("SiteSqlServer").ConnectionString.ToString

    'Execute a query to erase any previous data from our destination table
    Dim sClearSQL = "DELETE FROM " & sSQLTable
    Dim SqlConn As SqlConnection = New SqlConnection(sSqlConnectionString)
    Dim SqlCmd As SqlCommand = New SqlCommand(sClearSQL, SqlConn)
    SqlConn.Open()
    SqlCmd.ExecuteNonQuery()
    SqlConn.Close()

    'Series of commands to bulk copy data from the Excel file into our SQL table
    Dim OleDbConn As OleDbConnection = New OleDbConnection(sExcelConnectionString)
    Dim OleDbCmd As OleDbCommand = New OleDbCommand(("SELECT * FROM " & sWorkbook), OleDbConn)
    OleDbConn.Open()
    Dim dr As OleDbDataReader = OleDbCmd.ExecuteReader()
    Dim bulkCopy As SqlBulkCopy = New SqlBulkCopy(sSqlConnectionString)
    bulkCopy.DestinationTableName = sSQLTable
    bulkCopy.WriteToServer(dr)
    OleDbConn.Close()
End Sub

However, when I tried to import the 500,000-row Excel file, I got the following error:

Server Error in '/L26' Application.
A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
[SqlException (0x80131904): A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)]
   System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +925466
   System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +800118
   System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +186
   System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) +556
   System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj) +164
   System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected) +34
   System.Data.SqlClient.TdsParserStateObject.ReadBuffer() +44
   System.Data.SqlClient.TdsParserStateObject.ReadByte() +17
   System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +79
   System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() +1336
   System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) +916
   System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader) +151
   _Default.CSVToSQL() in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:440
   _Default.ButtonTest3_Click(Object sender, EventArgs e) in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:905
   System.Web.UI.WebControls.Button.OnClick(EventArgs e) +105
   System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +107
   System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +7
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +11
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
   System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1746
Version Information: Microsoft .NET Framework Version: 2.0.50727.1433; ASP.NET Version: 2.0.50727.1433

After I received this error message, I tried viewing my database through the MSSQL control panel provided by my hosting provider (WebHost4Life). However, I was unable to connect to the database and received this error:

Microsoft OLE DB Provider for SQL Server error '80040e14'
Database 1496 cannot be autostarted during server shutdown or startup.
/getDBinfo.asp, line 29

Now here is the most frustrating/mysterious part. I figured that maybe the error messages were a result of the large size of the second Excel file, so just for testing purposes I created a new table in my MSSQL database. The table has just two fields, both set to varchar(50). I then created a test Excel file that had one row with the word "test" in the first and second columns. When I tried using the code above to import the test Excel data into the test table, I got the same exact error as I did with the 500,000-row file! Please help, I'm really stumped, and I am not sure why I am having so much trouble replicating the success I had with the 5,000-row file. Any suggestions are much appreciated. -Bryan
I have created an assembly which I load into SQL 2005. However, if I set my connection string to "context connection = true", I get an error saying something like "this feature could not be used in this context." So I changed my function to insert each row, and now the transfer takes 4x as long. Before I made the change I was using bulk copy by specifying the actual connection string, but I also had to specify the password in the string, and since I wanted to get away from that I attempted the context route. So, is there any other way of using the bulk copy feature (or something like it) with the context connection?
Private Shared Function BulkDataTransfer2(ByVal _tblName As String, ByRef _dt As DataTable, ByRef emailLog As String) As Boolean
Is there a JDBC equivalent of the SqlBulkCopy command?
Simply using batched INSERTs, it takes days to insert 1M rows into SQL Server. However, using a C# client that uses SqlBulkCopy, I can load it in about 1 hour.
I have developed a replacement for some old bcp-based applications in the .NET Framework that uses the SqlBulkCopy class.
I have run into some difficulties with code page translation:
The original BCP client runs with OEM codepage 437, thus the data "ëÄÆòÖ" gets loaded as "d-¦=+". The DB is SQL_Latin1_General_CP1_CI_AS, and the column is varchar.
I have been unable to perform any code page Encoding in .Net that yields the same result.
I want to emulate this behaviour in my database loader, but as yet I have not been able to find a way.
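Purely as something to experiment with (the direction of the conversion is an assumption, not a verified reproduction of the bcp result): the effect described looks like bytes produced under one code page being read back under another, which can be imitated by round-tripping the string through Encoding.GetEncoding(437) and Encoding.GetEncoding(1252):

string original = "ëÄÆòÖ";
byte[] oemBytes = System.Text.Encoding.GetEncoding(437).GetBytes(original);    // OEM code page 437 bytes
string asAnsi = System.Text.Encoding.GetEncoding(1252).GetString(oemBytes);    // reinterpret as Windows-1252

// asAnsi is what lands in the varchar column if the 437 bytes are stored untranslated;
// swapping 437 and 1252 gives the opposite mapping if this is not the direction bcp used.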
I have the following table: create table tTest ( x varchar(20) COLLATE SQL_Latin1_General_CP1_CI_AS )
I need to populate this table using SqlBulkCopy; however, some symbols are inserted incorrectly. Here is an example of the code that I'm using:

using (SqlBulkCopy copier = new SqlBulkCopy(ci.ConnectionString))
{
    copier.DestinationTableName = "tTest";
    DataTable tbl = new DataTable();
    tbl.Columns.Add(new DataColumn("x"));
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thanks
SqlBulkCopy does not seem to have much flexibility. If your table has columns that don't allow duplicates, you are apparently screwed. It's a shame there is no switch or parameter setting you can give the SqlBulkCopy class (if there is, please let me know!). Example: say I have a table "Cars" with fields "CarId" (identity) and "CarName" (varchar), and the CarName field has a unique constraint. Now, I have a DataTable that contains a bunch of CarNames to insert. If there are duplicates on CarName, the entire insert fails. This has nothing to do with the PK or identity field. The problem I have: I would much rather have an option to ignore (silently not insert) the duplicate row, but continue to insert the rest of my data. Any known workarounds for this would be much appreciated. Maybe I am missing something?
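There is no SqlBulkCopy switch for this, but one hedged workaround is to let the unique index itself discard duplicates: rebuild it with IGNORE_DUP_KEY = ON (SQL Server 2005 syntax), after which duplicate CarName rows are dropped with a warning instead of failing the whole batch. Constraint and index names below are made up; the other common pattern is to bulk copy into a staging table and INSERT ... WHERE NOT EXISTS into Cars.

string prepare =
    "ALTER TABLE dbo.Cars DROP CONSTRAINT UQ_Cars_CarName; " +
    "CREATE UNIQUE INDEX UX_Cars_CarName ON dbo.Cars (CarName) WITH (IGNORE_DUP_KEY = ON);";
new SqlCommand(prepare, connection).ExecuteNonQuery();

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.Cars";
    bulkCopy.ColumnMappings.Add("CarName", "CarName");   // CarId stays identity-generated
    bulkCopy.WriteToServer(carNamesTable);                // duplicate CarName rows are now skipped
}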
Hi there, I'm trying to use SQL bulk copy to transfer data from an XML file to a table in one of my pages. On this page I'm doing two database-related operations. The first is a simple insert that returns a value, and the second is the SQL bulk copy data transfer. I'm using the same connection for both of them, and the SQL bulk copy always gives me a "login failed" error while the insert works fine. Do I need to set a specific setting on the SQL Server account so that it can use SQL bulk copy? Thank you
I have written an app that will allow you to send a query to Teradata, return the results into a Reader and then Bulk Copy that data into SQL Server 2005.
If the query returns a large dataset (e.g., 20,000,000 rows), then while that data is being bulk copied into SQL Server using the SqlBulkCopy class, users on other computers are prevented from logging into SQL Server Management Studio.
Those that are already logged in are effectively shut down as well: everything appears fine to the users, but their queries never finish running.
Everything immediately starts working as normal when either my program finishes or I shut my program down.
Is there any type of property of the SqlBulkCopy class, or any other function, that will prevent Management Studio from locking up?
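Not as a single switch that I know of, but committing the load in batches (and not taking a table lock) usually stops a 20,000,000-row copy from holding locks for the entire transfer, which is the most likely reason other sessions hang. A hedged sketch; names are placeholders:

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(targetConnectionString, SqlBulkCopyOptions.Default))
{
    bulkCopy.DestinationTableName = "dbo.TeradataExtract";
    bulkCopy.BulkCopyTimeout = 0;     // no timeout on the long-running load
    bulkCopy.BatchSize = 50000;       // each 50,000-row batch is sent and committed separately
    bulkCopy.NotifyAfter = 50000;     // optional progress reporting
    bulkCopy.SqlRowsCopied += delegate(object sender, SqlRowsCopiedEventArgs e)
    {
        Console.WriteLine("{0} rows copied", e.RowsCopied);
    };
    bulkCopy.WriteToServer(teradataReader);
}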
Hi, I am using SqlBulkCopy to copy data for all the tables from one database to another database. SqlBulkCopy runs just fine, but it throws an exception for one of the tables, which has a computed column.
For that table I am getting the following error: The column "Col4" cannot be modified because it is either a computed column or is the result of a UNION operator.
Since I am fetching table names at runtime, it's not possible to use ColumnMappings.
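The mappings do not have to be hard-coded, though: since the table names are only known at runtime, the columns can also be discovered at runtime and only the non-computed ones mapped. A sketch, assuming source and destination column names match:

string columnSql =
    "SELECT name FROM sys.columns " +
    "WHERE object_id = OBJECT_ID(@table) AND is_computed = 0";

List<string> columns = new List<string>();
using (SqlCommand cmd = new SqlCommand(columnSql, destinationConnection))
{
    cmd.Parameters.AddWithValue("@table", tableName);   // table name discovered at runtime
    using (SqlDataReader reader = cmd.ExecuteReader())
        while (reader.Read())
            columns.Add(reader.GetString(0));
}

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    bulkCopy.DestinationTableName = tableName;
    foreach (string column in columns)
        bulkCopy.ColumnMappings.Add(column, column);     // computed columns (like Col4) are simply never mapped
    bulkCopy.WriteToServer(sourceTable);
}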
I want to insert data into a Header and a Details table simultaneously using SqlBulkCopy. The Header table contains an identity column, and the Details table contains a foreign key to this identity column in the Header table. I want to use a DataTable as the data source in SqlBulkCopy. Can anybody help with this?
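SqlBulkCopy cannot hand back the identity values it generates, so one hedged approach is: load the headers, read the generated identities back by a natural key, stamp them onto the detail rows, then load the details. OrderNumber, OrderDate, ItemCode, and Quantity are made-up column names; the only requirement is some key in the header data that lets you find the generated HeaderId again.

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    using (SqlBulkCopy headerCopy = new SqlBulkCopy(conn))
    {
        headerCopy.DestinationTableName = "dbo.Header";
        headerCopy.ColumnMappings.Add("OrderNumber", "OrderNumber");
        headerCopy.ColumnMappings.Add("OrderDate", "OrderDate");
        headerCopy.WriteToServer(headerTable);            // HeaderId is generated by SQL Server
    }

    // Map the natural key back to the generated identity values.
    Dictionary<string, int> headerIds = new Dictionary<string, int>();
    using (SqlCommand cmd = new SqlCommand("SELECT OrderNumber, HeaderId FROM dbo.Header", conn))
    using (SqlDataReader reader = cmd.ExecuteReader())
        while (reader.Read())
            headerIds[reader.GetString(0)] = reader.GetInt32(1);

    foreach (DataRow detail in detailTable.Rows)
        detail["HeaderId"] = headerIds[(string)detail["OrderNumber"]];

    using (SqlBulkCopy detailCopy = new SqlBulkCopy(conn))
    {
        detailCopy.DestinationTableName = "dbo.Details";
        detailCopy.ColumnMappings.Add("HeaderId", "HeaderId");
        detailCopy.ColumnMappings.Add("ItemCode", "ItemCode");
        detailCopy.ColumnMappings.Add("Quantity", "Quantity");
        detailCopy.WriteToServer(detailTable);
    }
}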
I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.
For reading the data I have written dynamic T-SQL based on some conditions, and based on that it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.
I have increased and decreased the time to catch the error, but it is still there; I have also tried setting the CommandTimeout property to 0.
If I use 0 for CommandTimeout, then I get: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
Hi, I am using .NET SqlBulkCopy to insert several rows at once into my SQL table. Yesterday I created a simple trigger on this table to take the primary key of the inserted row and place it into another table. Fairly straightforward, I thought, but I keep getting this error when inserting 2 or more rows (it works fine with one row): "Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <=, >, >= or when the subquery is used as an expression. The statement has been terminated." To test this, I changed the SqlBulkCopy into individual INSERT statements in a for-each loop and then it works perfectly, so I know that it's definitely the SqlBulkCopy causing the problem, not the trigger itself. Here is my trigger code anyway:

alter trigger tg_InsertClaim on AuditItem
for insert
as
    declare @AuditItemID as int
    set @AuditItemID = (select i.AuditItemID from inserted i inner join dbo.AuditItem a ON i.AuditItemID = a.AuditItemID)
    insert into Claim (AuditItemID) VALUES (@AuditItemID)

Thanks!