SqlBulkCopy.WriteToServer Succeeds Or Fails Depending On The Size Of DataTable?
Mar 4, 2008
Hello,
I am trying to bulk copy some data from a text file to SqlServer. In my case, the table in SqlServer is simple. It has two columns: Symbol <nchar(5), Primary Key> and Company <nvarchar(50)>. Each row in the text file is Symbol and Company separated by a "#". Below is the code of my bulk copy:
public static void StartImport(string sourceFile)
{
    SqlBulkCopy bulkCopy = new SqlBulkCopy(connString_local, SqlBulkCopyOptions.TableLock);
    bulkCopy.DestinationTableName = "dbo.NasdaqSymbols";

    // Build the in-memory table to match dbo.NasdaqSymbols.
    DataTable dt = new DataTable();
    DataColumn dc = new DataColumn();
    dc.DataType = typeof(string);
    dc.ColumnName = "Symbol";
    dc.Unique = true;
    dt.Columns.Add(dc);

    dc = new DataColumn();
    dc.DataType = typeof(string);
    dc.ColumnName = "Company";
    dc.Unique = false;
    dt.Columns.Add(dc);

    // Read the source file line by line; each line is "Symbol#Company".
    StreamReader sr = new StreamReader(sourceFile);
    string input;
    while ((input = sr.ReadLine()) != null)
    {
        string[] s = input.Split(new string[] { "#" }, StringSplitOptions.None);
        DataRow dr = dt.NewRow();
        dr["Symbol"] = s[0].Trim();
        dr["Company"] = s[1].Trim();
        dt.Rows.Add(dr);
    }
    sr.Close();

    bulkCopy.WriteToServer(dt);
    bulkCopy.Close();
}
The problem is, I got the following exception when I called my StartImport method (thrown from SqlBulkCopy.WriteToServer): System.InvalidOperationException: The given value of type String from the data source cannot be converted to type nvarchar of the specified target column. It turns out the problem doesn't seem to be the String-to-nvarchar conversion itself, because when I use a source text file that contains only about a dozen rows, it works! I have no idea why SqlBulkCopy.WriteToServer works fine on a small set of data. Is there something I overlooked?
Thank you for your time and help.
Gary
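A note on the error above: this particular message is commonly raised when a value in the DataTable is longer than the destination column allows (here, Symbol is nchar(5) and Company is nvarchar(50)), which would explain why a small sample file works while the full file fails. A minimal diagnostic sketch, scanning the dt built in StartImport above before calling WriteToServer:

// Sketch: find values that would not fit the destination columns.
// The length limits (5 and 50) come from the table definition above.
foreach (DataRow row in dt.Rows)
{
    string symbol = (string)row["Symbol"];
    string company = (string)row["Company"];
    if (symbol.Length > 5 || company.Length > 50)
    {
        Console.WriteLine("Oversize row: '{0}' / '{1}'", symbol, company);
    }
}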
I'm trying to use the SQL Bulk Copy class to bulk import from a text file. I'm getting the following error: Line 24: bulkCopy.WriteToServer(CreateDataTableFromFile()); System.Data.SqlClient.SqlException: An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) I've even tried to allow remote connections through named pipes and restarted the database engine, but to no avail. Any inputs/suggestions?
My client has a number of jobs that are run overnight. We've set them up to email me when they're completed. Every morning I get in to a bunch of emails like this:

<quote>JOB RUN: 'Tech Pubs Email Notification' was run on 18/03/2006 at 00:00:00
DURATION: 0 hours, 0 minutes, 0 seconds
STATUS: Succeeded
MESSAGES: The job succeeded. The Job was invoked by Schedule 10 (SendMail). The last step to run was step 1 (Send Mail).</quote>

However, the most important job - the database backup - fails every time.

<quote>JOB RUN: 'DB Backup Job for DB Maintenance Plan 'DB Maintenance Plan1'' was run on 20/03/2006 at 18:00:00
DURATION: 0 hours, 0 minutes, 2 seconds
STATUS: Failed
MESSAGES: The job failed. The Job was invoked by Schedule 7 (Schedule1). The last step to run was step 1 (Step 1).</quote>

What's strange is that the job runs successfully if you kick it off manually (in EM: right-click and "Start Job")!!! Does anyone have any idea of why that might be? Where to look for diagnostic information? TIA, Edward
Hi, I am having a problem bulk updating a SQL Server table that has an identity column from a DataTable (which has no identity column) using SqlBulkCopy. I tried several approaches, but it neither shows any error nor updates the table. The identity value, however, seems to increase every time. Thanks. varun
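One commonly suggested approach for this situation is to map only the non-identity columns explicitly, so the server generates the identity values itself. A minimal sketch, assuming a hypothetical table dbo.MyTable(Id identity, Name, Amount) and a DataTable dt with Name and Amount columns:

using System.Data;
using System.Data.SqlClient;

// Sketch: bulk insert into a table with an identity column, letting SQL
// Server assign the identity values. The table name, column names, and
// connection string are hypothetical placeholders.
static void BulkInsertWithoutIdentity(DataTable dt, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.MyTable";
            // Map only the non-identity columns; the identity column (Id)
            // stays unmapped so the server fills it in.
            bulkCopy.ColumnMappings.Add("Name", "Name");
            bulkCopy.ColumnMappings.Add("Amount", "Amount");
            bulkCopy.WriteToServer(dt);
        }
    }
}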
Test #1 Returns 20971 rows * 25 = 524,275
Test #2 Returns 14169 rows * 37 = 524,253
Test #3 Returns 6808 rows * 77 = 524,216
Test #4 Returns 5140 rows * 102 = 524,280
Given the similarity of the total byte counts returned, I would assume that a buffer or something is being overrun - could this be a configuration parameter?
If I perform the select against the linked table directly, like this:
Select * (for fields list) From LURCH_PARADB.S102D4LM.PARADB.BLHDR
I get all rows returned. The issue only exists when accessing the views that are created against the linked server.
Filling a DataTable from a SQL query: if the query returns some null values, a problem occurs with the DataTable. Is it possible to use a DataTable with some null values in it? Thanks
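For what it's worth, a DataTable can hold database NULLs; they surface as DBNull.Value rather than null references. A small sketch of a null-safe read, with a hypothetical query and connection string:

using System;
using System.Data;
using System.Data.SqlClient;

// Sketch: fill a DataTable from a query that may return NULLs, then read
// the values safely. Query and connection string are placeholders.
static void ReadWithNulls(string connectionString)
{
    DataTable dt = new DataTable();
    using (SqlDataAdapter adapter = new SqlDataAdapter(
        "SELECT Symbol, Company FROM dbo.NasdaqSymbols", connectionString))
    {
        adapter.Fill(dt); // NULLs arrive as DBNull.Value, not null
    }

    foreach (DataRow row in dt.Rows)
    {
        // Check with IsNull (or compare against DBNull.Value) before casting.
        string company = row.IsNull("Company") ? "(none)" : (string)row["Company"];
        Console.WriteLine(company);
    }
}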
Pump file data into SQL Server
Move file to "archive" directory (file system task)
Delete file (file system task)
End Loop
Unmap Drive (batch file)
The map/unmap code is in a batch file:
Map: c:\windows\system32\net use \\10.10.10.10\ShareName MyPassword /USER:MyUserName /YES
Unmap: c:\windows\system32\net use \\10.10.10.10\ShareName /DELETE /YES
Here are the results when running this package:
1. Running in BIDS on a separate workstation: everything OK.
2. Running on the server by right-clicking on the package in Integration Services (SSMS) and choosing "run": everything OK.
3. Running as a job with SQL Agent: the package succeeds but no action was taken on the files; the files in "ShareName" are still there, so no data was pumped into SQL Server.
Now, the difference is that the SQL Agent jobs are running using a domain account proxy. I'm not sure how that would affect things, though--I have the tasks in the package set to fail the package if they fail, and they are not failing; the drives are being mapped OK.
The computer with the share is non-domain, but that shouldn't matter--I am specifying the local username and password in the batch file as you can see, and as you can see it works from the workstation in BIDS on a separate machine, and works on the server too as long as I don't run it as a job. The batch file sits on both the server and the local workstation at the same local path.
Any idea why the files aren't actioned when run as a job?
Hi All, I am trying to restore a DB from my production to a standby server. It gives me the message "Msg 3105, Level 16, State 1: Data on dump will not fit into current database. Need 6500 Mbyte database."
The production server DB size is 5000MB, and I have increased the size of the standby DB to 6500MB, but I still get the same message...
I am trying to resize a database's initial log file from 500MB to 2MB. I'm using:
ALTER DATABASE <DBNAME> MODIFY FILE ( NAME = <DBLOGFILENAME>, SIZE = 2 )
And I'm getting "MODIFY FILE failed. Specified size is less than current size." I tried going into the database properties and setting the log file to 2MB, but it doesn't keep the changes.
I am using SqlBulkCopy to copy an Excel spreadsheet into a SQL Express database table. The copying went okay, but the only problem I have is that a few records in one of the columns are NULL. The column contains both numeric and alphanumeric fields in the spreadsheet. After copying, the numeric fields got copied with no complication, but the alphanumeric fields all became NULL. Can someone help me with this problem?
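This is a classic symptom of the Jet/Excel provider guessing a column's type from its first few rows: in a mixed column it picks numeric and returns NULL for the non-numeric cells. A common workaround is to add IMEX=1 to the Extended Properties so mixed columns are read as text. A sketch, with a placeholder file path and sheet name:

using System.Data.OleDb;

// Sketch: open an Excel sheet with IMEX=1 so mixed (numeric + text)
// columns come back as text instead of NULLs. File path is a placeholder.
string excelConnStr =
    @"Provider=Microsoft.Jet.OLEDB.4.0;" +
    @"Data Source=c:\data\MySheet.xls;" +
    @"Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1""";

using (OleDbConnection conn = new OleDbConnection(excelConnStr))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", conn))
{
    conn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        // Hand the reader to SqlBulkCopy.WriteToServer as usual.
    }
}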
Hi, I have one doubt. Is it possible to transfer data from one SQL Server to another SQL Server over the LAN? (The two SQL Server databases are in different cities.)
Hello, I was curious if anyone knew if using SqlBulkCopy in code required any special permissions on the database side. I wasn't sure if more permissions than writing capabilities were needed. Thanks.
I need to copy large amounts of data between SQL databases, and between SQL and Access. I've been reading a lot of good explanations of SQLBulkCopy, but the only good examples I've found are written in C# and I work in VB. I'm very new to ASP.NET 2.0 (about a month) and came from Classic ASP, not .NET 1.1. I'm still getting my feet wet and I'm afraid it doesn't take much to confuse me. Can someone point me to a good article or tutorial that includes a clear example written in VB? Diane
Hi, I am trying to use SqlBulkCopy to insert the contents of a DataTable into a temporary table in the database. The content of the DataTable comes from a CSV file. I can add the rows to the DataTable, but it seems they are added as strings, and when I try WriteToServer(datatable) it chokes because my table has some int fields. Not sure how to do this.

private static TimeSpan DoBulkCopy(string filePath)
{
    Stopwatch stopWatch = new Stopwatch();
    stopWatch.Start();
    StreamReader sr = new StreamReader(filePath);
    prepareTable();
    string fullFileStr = sr.ReadToEnd();
    sr.Close();
    sr.Dispose();
    string[] lines = fullFileStr.Split('\n');
    DataTable dt = new DataTable();
    string[] sArr = lines[0].Split(',');
    foreach (string s in sArr)
    {
        dt.Columns.Add(new DataColumn());
    }
    DataRow row;
    string finalLine = "";
    foreach (string line in lines)
    {
        row = dt.NewRow();
        finalLine = line.Replace(Convert.ToString('\r'), "");
        row.ItemArray = finalLine.Split(',');
        dt.Rows.Add(row);
    }
    SqlConnection cn = new SqlConnection(System.Configuration.ConfigurationManager.AppSettings["connectionString"].ToString());
    System.Data.SqlClient.SqlBulkCopy bc = new System.Data.SqlClient.SqlBulkCopy(cn, SqlBulkCopyOptions.TableLock, null);
    bc.BatchSize = dt.Rows.Count;
    cn.Open();
    bc.DestinationTableName = "tmpTable";
    bc.WriteToServer(dt); // <--------------- it dies here
    cn.Close();
    bc.Close();
    TimeSpan ts = stopWatch.Elapsed;
    stopWatch.Stop();
    return ts;
}

private static void prepareTable()
{
    SqlConnection cn = new SqlConnection(System.Configuration.ConfigurationManager.AppSettings["connectionString"].ToString());
    string sql = @"if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[tmpTable]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[tmpTable];
CREATE TABLE [dbo].[tmpTable] ([Remote] [int], [KFP] [int]) ON [PRIMARY]";
    SqlCommand cmd = new SqlCommand(sql, cn);
    cn.Open();
    cmd.ExecuteNonQuery();
    cn.Close();
    cmd.Dispose();
}
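Since the destination table has int columns, one fix that is often suggested is to give the DataTable typed columns up front, so WriteToServer sends parsed values rather than raw strings. A sketch of just the table-building part, reusing the lines array from the code above and the two-column layout of tmpTable (it assumes every line is data, not a header):

// Sketch: build the DataTable with typed columns so SqlBulkCopy can map
// them onto the int columns of tmpTable. Column names match the
// CREATE TABLE in prepareTable above.
DataTable dt = new DataTable();
dt.Columns.Add("Remote", typeof(int));
dt.Columns.Add("KFP", typeof(int));

foreach (string line in lines)
{
    string[] fields = line.Trim().Split(',');
    if (fields.Length < 2 || fields[0].Length == 0) continue; // skip blanks

    DataRow row = dt.NewRow();
    row["Remote"] = int.Parse(fields[0]); // parse instead of assigning strings
    row["KFP"] = int.Parse(fields[1]);
    dt.Rows.Add(row);
}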
I have a problem using SqlBulkCopy for updating tables; in fact, I can't update any table. I use this class to insert big groups of records, but I would like to know how I can configure SqlBulkCopy to update rows - that is, if a record already exists (the primary key exists), update it.
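SqlBulkCopy itself only inserts; a common pattern for updates is to bulk copy into a staging table and then run one set-based UPDATE (or, on SQL Server 2008 and later, a MERGE) against the real table. A sketch with placeholder table and column names:

using System.Data;
using System.Data.SqlClient;

// Sketch: emulate "bulk update" by loading a staging table and issuing a
// single set-based UPDATE. Table and column names are placeholders.
static void BulkUpdate(DataTable dt, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // 1) Bulk copy the new values into a staging table with the
        //    same schema as the target.
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.TargetStaging";
            bulkCopy.WriteToServer(dt);
        }

        // 2) One set-based UPDATE joining staging to target on the key.
        string sql =
            "UPDATE t SET t.Company = s.Company " +
            "FROM dbo.Target t JOIN dbo.TargetStaging s ON t.Symbol = s.Symbol";
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}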
Does SqlBulkCopy have any adverse effects on its target table's indexes? I like the performance gain but am worried about creating unnecessary table scans if the indexes/stats are not updated properly after it completes...
I'm using SqlBulkCopy. Does anyone know how I can find out which row (by its column values) is throwing a duplicate primary key error when I call bulkCopy.WriteToServer(datatable1)? Thanks
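SqlBulkCopy won't report which row violated the key, so one workaround is to check the DataTable against the existing keys yourself, either before the copy or after a failure. A sketch, assuming a hypothetical key column named Symbol and placeholder table/connection names:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: report DataTable rows whose key already exists in the target
// table. Table name, key column, and connection string are placeholders.
static void ReportDuplicates(DataTable dt, string connectionString)
{
    HashSet<string> existing = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT Symbol FROM dbo.NasdaqSymbols", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                existing.Add(reader.GetString(0));
        }
    }

    foreach (DataRow row in dt.Rows)
    {
        if (existing.Contains((string)row["Symbol"]))
            Console.WriteLine("Duplicate key: {0}", row["Symbol"]);
    }
}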
I have a collection of around 16,000 records and have been trying to find the best way to update the information in the DB. I have done a lot of reading about both BulkCopy and batch update, but haven't come to any clear conclusion as to which performs better. I am not doing any inserting: I'm just getting a DataSet from the DB, changing the values, and then wanting to update the DB again. Thanks for any help. Mick
I'm in the final throes of redesigning a web application, and one of the major improvements is adding an admin page so the users can maintain the data and system structure without the need for us to get involved. One of the areas I'd held back on was promotion of data from SQL server to SQL server - once the .NET 2 framework came in and we moved over to VS2005, I was in a position to do this work. I have tested it and it works perfectly well for small tables, but as soon as I try it on a reasonably large table (860,000 rows) I get the following error in VS's output window:

"A first chance exception of type 'System.Data.SqlClient.SqlException' occurred in System.Data.dll. An exception of type 'System.Data.SqlClient.SqlException' occurred in System.Data.dll but was not handled in user code. Additional information: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."

The application data is stored on SQL 2005 servers, and I am running the application locally. I have tried running the copy from table to table on the same server (different databases) and I get the same result, which leads me to believe it may be memory related. Running the SQL in SQL Server Management Studio takes just 15 to 20 seconds, but using VS it falls over in just under a minute. My connection string has a timeout of 3600. Here's my C# code:

// Execute reader...
using (IDataReader vReader = vCmd.ExecuteReader())
{
    // Create SqlBulkCopy...
    SqlBulkCopy vBulkData = new SqlBulkCopy(aTargetConn);
    // Set destination table name...
    vBulkData.DestinationTableName = aTableName;
    // Write data...
    vBulkData.WriteToServer(vReader);
}

aSourceConn and aTargetConn are the appropriate SqlConnections, and aTableName is the table to be populated with data (previously backed up and emptied of contents). Any help/advice/suggestions gratefully received - if any more info is needed, please ask. Thanks, Martin
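One thing worth noting here: the connection string timeout does not govern the copy itself. SqlBulkCopy has its own BulkCopyTimeout property, which defaults to 30 seconds - which would fit a failure at just under a minute. A sketch of the same block with the timeout raised and a batch size set (the values shown are illustrative, not recommendations):

// Sketch: raise the bulk copy's own timeout (default 30 seconds) and
// commit in batches rather than one giant operation.
using (IDataReader vReader = vCmd.ExecuteReader())
{
    SqlBulkCopy vBulkData = new SqlBulkCopy(aTargetConn);
    vBulkData.DestinationTableName = aTableName;
    vBulkData.BulkCopyTimeout = 0;    // 0 = no timeout; or e.g. 3600
    vBulkData.BatchSize = 10000;      // commit in chunks
    vBulkData.WriteToServer(vReader);
}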
I am trying to import a CSV file into a SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:

using (OleDbConnection dconn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\mystuff\;" +
    @"Extended Properties=""text;HDR=No;FMT=Delimited"""))
{
    using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
    {
        try
        {
            dconn.Open();
            using (OleDbDataReader dreader = dcmd.ExecuteReader())
            {
                try
                {
                    using (SqlConnection dconn2 = new SqlConnection(
                        @"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
                    {
                        using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
                        {
                            try
                            {
                                dconn2.Open();
                                bc.DestinationTableName = "dbo.mytable";
                                bc.WriteToServer(dreader);
                            }
                            finally
                            {
                                dconn2.Close();
                            }
                        }
                    }
                }
                finally
                {
                    dreader.Close();
                }
            }
        }
        finally
        {
            dconn.Close();
        }
    }
}
} } finally { dconn.Close(); } } } A couple of the columns for the destination table use a bit datatype. The CSV files uses the strings "1" and "0" to represent these.When I run this code, it throws this exception:Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. ---> System.FormatException: Failed to convert parameter value from a String to a Boolean. ---> System.FormatException: String was not recognized asa valid Boolean. at System.Boolean.Parse(String value) at System.String.System.IConvertible.ToBoolean(IFormatProvider provider) at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider) at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType) --- End of inner exception stack trace --- at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType) at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaDatametadata) --- End of inner exception stack trace --- at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaDatametadata) at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader) at MyClass.Main()It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way. Is there any way to change this behavior? I discovered if you change the "1" to "true" and "0" to "false" in the CSV file it will accept them.
I am trying to use SqlBulkCopy to copy from an Excel spreadsheet to a table in a SQL database, and it is all working fine, but I now have to change the connection to the SQL database to an ODBC connection, and it is now erroring. This is the error I get: keyword not supported: 'dsn'. Is it possible to use SqlBulkCopy when using an ODBC connection to the SQL database?
I have a scenario whereby I'd like to insert multiple rows into a table on a SQL Server database as efficiently and easily as possible. After some research, it looked like .NET 3.0's SqlBulkCopy class would do what I want. I've tried to set something up, but it's not working - it's not even throwing an error. The code executes, but it simply hasn't done the insert by the end of it! My table structure is simple. The name of the table is LPSTUnavailableDate. It has just two columns, one of them an auto-populated ID field:
I have encountered a very frustrating situation when trying to use SqlBulkCopy. I have two Excel files that I am trying to import into two tables in a MSSQL Server 2005 Express DB. One Excel file has 5,000 rows, while the other file has 500,000 rows. I was able to import the smaller file successfully using this VB.NET code:

Protected Sub L26ExcelToSQL()
    'Declare variables
    Dim sSQLTable As String = "Local26Members"
    Dim sExcelFileName As String = "Full Local 26 List Formatted.xls"
    Dim sWorkbook As String = "[Sheet1$]"
    Dim sSqlConnectionString As String = ConfigurationManager.ConnectionStrings("SiteSqlServer").ConnectionString.ToString

    'Execute a query to erase any previous data from our destination table
    Dim sClearSQL = "DELETE FROM " & sSQLTable
    Dim SqlConn As SqlConnection = New SqlConnection(sSqlConnectionString)
    Dim SqlCmd As SqlCommand = New SqlCommand(sClearSQL, SqlConn)
    SqlConn.Open()
    SqlCmd.ExecuteNonQuery()
    SqlConn.Close()

    'Series of commands to bulk copy data from the Excel file into our SQL table
    Dim OleDbConn As OleDbConnection = New OleDbConnection(sExcelConnectionString)
    Dim OleDbCmd As OleDbCommand = New OleDbCommand(("SELECT * FROM " & sWorkbook), OleDbConn)
    OleDbConn.Open()
    Dim dr As OleDbDataReader = OleDbCmd.ExecuteReader()
    Dim bulkCopy As SqlBulkCopy = New SqlBulkCopy(sSqlConnectionString)
    bulkCopy.DestinationTableName = sSQLTable
    bulkCopy.WriteToServer(dr)
    OleDbConn.Close()
End Sub

However, when I tried to import the 500,000-row Excel file, I got the following error:

Server Error in '/L26' Application.
A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
[SqlException (0x80131904): A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)]
   System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +925466
   System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +800118
   System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +186
   System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) +556
   System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj) +164
   System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected) +34
   System.Data.SqlClient.TdsParserStateObject.ReadBuffer() +44
   System.Data.SqlClient.TdsParserStateObject.ReadByte() +17
   System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +79
   System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() +1336
   System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) +916
   System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader) +151
   _Default.CSVToSQL() in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:440
   _Default.ButtonTest3_Click(Object sender, EventArgs e) in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:905
   System.Web.UI.WebControls.Button.OnClick(EventArgs e) +105
   System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +107
   System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +7
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +11
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
   System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1746
Version Information: Microsoft .NET Framework Version: 2.0.50727.1433; ASP.NET Version: 2.0.50727.1433

After I received this error message, I tried viewing my database through the MSSQL control panel utilized by my hosting provider (WebHost4Life). However, I was unable to connect to the database and received this error:

Microsoft OLE DB Provider for SQL Server error '80040e14': Database 1496 cannot be autostarted during server shutdown or startup. /getDBinfo.asp, line 29

Now here is the most frustrating/mysterious part. I figured that maybe the error messages were a result of the large size of the second Excel file, so just for testing purposes I created a new table in my MSSQL database. The table has just two fields, both set to varchar(50). I then created a test Excel file that had one row, with the word "test" in the first and second columns. When I tried using the code above to import the test Excel data into the test table, I got the same exact error as I did with the 500,000-row file! Please help; I'm really stumped, and I am not sure why I am having so much trouble replicating the success I had with the 5,000-row file. Any suggestions are much appreciated. -Bryan
I have created an assembly which I load into SQL 2005. However, if I set my connection string to "context connection = true", I get an error saying something like "this feature could not be used in this context". So I changed my function to insert each row... and now the transfer takes 4x as long. Before I made the change I was using the bulk copy by specifying the actual connection string, but I also had to specify the password in the string, and since I wanted to get away from that, I attempted the context route. So... is there any other way of using the bulk copy feature, or something like it, with the context connection?
Private Shared Function BulkDataTransfer2(ByVal _tblName As String, ByRef _dt As DataTable, ByRef emailLog As String) As Boolean
Is there a JDBC equivalent of the SqlBulkCopy command?
Simply using batched INSERTs, it takes days to insert 1M rows into SQL Server. However, using a C# client that uses SqlBulkCopy, I can load it in about 1 hour.
I have developed a replacement for some old bcp-based applications in the .NET Framework that uses the SqlBulkCopy class.
I have run into some difficulties with code page translation:
The original BCP client runs with OEM code page 437; thus the data "ëÄÆòÖ" gets loaded as "d-¦=+". The DB is SQL_Latin1_General_CP1_CI_AS, and the column is varchar.
I have been unable to perform any code page encoding in .NET that yields the same result. I want to emulate this behaviour in my database loader, but as yet I have not been able to find a way...
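One way to experiment with this is System.Text.Encoding: round-trip the string through the two code pages involved and compare the output against what bcp produced. A sketch - note the direction of the code-page pairing is an assumption that would need verifying against real data:

using System;
using System.Text;

// Sketch: emulate an OEM-437 <-> Windows-1252 mistranslation by encoding
// with one code page and decoding with the other. Which direction matches
// the old bcp behaviour is an assumption to test.
static string Translate(string source)
{
    byte[] bytes = Encoding.GetEncoding(437).GetBytes(source);   // OEM 437
    return Encoding.GetEncoding(1252).GetString(bytes);          // Latin1 (CP1252)
}

// Example: Console.WriteLine(Translate("ëÄÆòÖ"));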
I have the following table: create table tTest ( x varchar(20) COLLATE SQL_Latin1_General_CP1_CI_AS )
I need to populate this table using SqlBulkCopy, however some symbols are inserted with mistakes.
Here is an example of the code that I'm using:

using (SqlBulkCopy copier = new SqlBulkCopy(ci.ConnectionString))
{
    copier.DestinationTableName = "tTest";
    DataTable tbl = new DataTable();
    tbl.Columns.Add(new DataColumn("x"));
SqlBulkCopy does not seem to have much flexibility. If your table has columns that don't allow duplicates, you are apparently screwed. It's a shame there is no switch or param setting you can pass the SqlBulkCopy class (if there is, please let me know!!). Example: say I have a table "Cars" with fields "CarId" (identity) and "CarName" (varchar), and the CarName field has a unique constraint. Now, I have a DataTable that contains a bunch of CarNames to insert. If there are duplicates on CarName, the entire insert fails. This has nothing to do with the PK or identity field. The problem I have: I would much rather have an option to ignore or silently not insert that duplicate row, but continue to insert the rest of my data. Any known workarounds for this would be much appreciated. Maybe I am missing something?
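There is indeed no "ignore duplicates" option on SqlBulkCopy, but the DataTable can be de-duplicated before the copy. One sketch, using DataView.ToTable to keep distinct CarNames - note this only removes duplicates within the DataTable itself, not against rows already on the server:

using System.Data;
using System.Data.SqlClient;

// Sketch: drop duplicate CarNames from the in-memory table before the
// bulk copy so the unique constraint is not violated by the batch itself.
static void InsertDistinctCars(DataTable cars, string connectionString)
{
    // Distinct projection over the CarName column.
    DataTable distinct = new DataView(cars).ToTable(true, "CarName");

    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.Cars";
            bulkCopy.ColumnMappings.Add("CarName", "CarName"); // CarId is identity
            bulkCopy.WriteToServer(distinct);
        }
    }
}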
Hi there, I'm trying to use a SQL bulk copy to transfer data from an XML file to a table in one of my pages. In this page I'm doing two database-related operations. The first is a simple insert that returns a value, and the second is the SQL bulk copy data transfer. I'm using the same connection for both of them, and the SQL bulk copy always gives me a "login failed" error, while the insert is fine. Do I need to set a specific setting on the SQL Server account so that it can use SQL bulk copy? Thank you
I have programmatically created a SqlConnection that begins a SqlTransaction. During the first part of this SqlTransaction, the contents of a table are deleted. The next part uses the SqlBulkCopy object to copy data from another database (in the form of a DataTable). The delete goes through fine, but the SqlBulkCopy always generates a SqlException with the message "Unexpected existing transaction." I cannot think of anything I am doing wrong. The code looks at an XML file for instructions on each transaction. Each transaction is composed of tasks. Each task will pull data from a different type of database (MVR.Command is a Factory Database object). Please view the code below and tell me if you can spot what I am doing wrong:

using (SqlConnection destinationConnection = new SqlConnection(MVR.ConnectionSource.GetConnectionString(destinationServiceName)))
{
    destinationConnection.Open();

    using (SqlTransaction transaction = destinationConnection.BeginTransaction(IsolationLevel.Snapshot, "Transport"))
    {
        transaction.Save("Beginning");

        int totalTasks = 0;
        int successfulTasks = 0;

        foreach (XmlNode taskNode in transactionNode.SelectNodes("Tasks/Task"))
        {
            totalTasks += 1;

            try
            {
                // Prepare the destination table (delete everything)
                int rowsDeleted = new SqlCommand("delete from " + destinationTablename, destinationConnection, transaction).ExecuteNonQuery();

                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    bulkCopy.DestinationTableName = destinationTablename;

        // Based on the success of all tasks, either commit or rollback
        if (successfulTasks == totalTasks)
        {
            transaction.Commit();
        }
        else
        {
            transaction.Rollback();
        }
    }
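The "Unexpected existing transaction" error fits how SqlBulkCopy is constructed above: when the connection already has an open transaction, the bulk copy has to be handed that transaction explicitly via the SqlBulkCopy(SqlConnection, SqlBulkCopyOptions, SqlTransaction) overload. A sketch of that one change (sourceDataTable is a placeholder for the DataTable being copied):

// Sketch: enlist the bulk copy in the already-open transaction instead of
// letting it conflict with it. Only the constructor call changes.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
    destinationConnection, SqlBulkCopyOptions.Default, transaction))
{
    bulkCopy.DestinationTableName = destinationTablename;
    bulkCopy.WriteToServer(sourceDataTable); // sourceDataTable: placeholder
}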
I have written an app that will allow you to send a query to Teradata, return the results into a Reader and then Bulk Copy that data into SQL Server 2005.
If the query produces a large result set (e.g., 20,000,000 rows), then while that data is being bulk copied into SQL Server using the SqlBulkCopy class, users on other computers are prevented from logging into SQL Server Management Studio. Those already logged in are effectively shut down as well: everything appears fine to them, but their queries never finish running. Everything immediately starts working as normal when my program either finishes or is shut down.
Is there any property on the SqlBulkCopy class, or any other function, that will prevent Management Studio from locking up?
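Two SqlBulkCopy settings typically drive this kind of blocking: SqlBulkCopyOptions.TableLock (a bulk update lock held for the whole operation) and an unbounded batch size (one giant transaction holding locks until the end). A sketch that batches the copy and avoids the table lock - whether this fully resolves the SSMS lockups depends on what the server is actually blocking on; the table name and reader are placeholders:

using System.Data;
using System.Data.SqlClient;

// Sketch: copy in bounded batches without TableLock so locks are released
// as the copy proceeds rather than held for the entire 20M-row load.
static void CopyInBatches(IDataReader teradataReader, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        // Default options: row locks, internal transaction per batch.
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable";
            bulkCopy.BatchSize = 10000;   // commit every 10,000 rows
            bulkCopy.BulkCopyTimeout = 0; // no timeout for a long-running copy
            bulkCopy.WriteToServer(teradataReader);
        }
    }
}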