I am doing a full snapshot on a couple of subscriptions, but for the last couple of days I have been getting the message below. I have to run the snapshot at night because the tables are very large and I do not want to block users. I am not sure which table the error is happening on. It used to work fine, but for the last couple of days this has started to happen. I need to re-sync my subscription database with the production database, but some of the tables are giving problems.
The process could not bulk copy out of table '[dbo].[syncobj_0x3735393934363031]'.
I/O error while writing BCP data-file
(Source: ODBC SQL Server Driver (ODBC); Error number: 0)
---------------------------------------------------------------------------------------------------------------
Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to understand a few considerations:
- Performance: how does it compare with T-SQL's BULK INSERT and the bcp utility? (See the sketch after this list.)
- SQL Server resource usage: how does a memory-based bulk copy affect server resources while it runs?
- Server-side behavior: when the server is busy, is the update delayed, or does IRowsetFastLoad::Commit(true) insert the rows right away?
- Row count: is there a limit on how many rows can be inserted with IRowsetFastLoad::InsertRow() before calling IRowsetFastLoad::Commit?
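For reference, here is a minimal sketch of the two alternatives the performance question compares, with placeholder table, file, and server names:

BULK INSERT dbo.BulkTarget
FROM 'C:\data\bulk.dat'
WITH (ROWS_PER_BATCH = 10000, TABLOCK);

and the bcp utility equivalent from a command prompt:

bcp MyDb.dbo.BulkTarget in C:\data\bulk.dat -c -T -S myserver -h "TABLOCK"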
Hi~, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (what is the maximum value of nCount)?
for (i = 0; i < nCount; i++) { pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData)); }
2. In the code sample above, is there a method that inserts a prepared array all at once (the BulkData array, without the for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
I have a question. As far as I know, bulk copying is not possible during a backup operation: if the backup starts first, the backup proceeds and the bulk copy fails; if the bulk copy starts first, the backup fails and the bulk copy continues. Today I was testing running bcp in DTS using the EXECUTE PROCESS TASK (with this task we can run any Win32 executable or batch file). I am trying to bcp out from one database (source) and bcp in to another database (destination). While the package was running, a backup was also running: I had started the database (destination) backup job, and while it was running I started another job to run the DTS package (I even ran the DTS package manually). Both jobs succeeded and the data was inserted into the table. Can anyone shed some light on this?
I want to bulk copy a table using SELECT INTO from a database on server1 to a database on server2. The FROM part of the SELECT INTO only allows three qualifiers (database, owner, and table) within the one server.
I remember seeing some option that allows one to bulk copy across servers, but I can't find it.
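One hedged sketch of doing this with a linked server (assuming a linked server named SERVER2 has already been set up with sp_addlinkedserver; the database and table names are placeholders). SELECT INTO cannot create a table on the remote server, so either push into an existing remote table:

INSERT INTO SERVER2.TargetDb.dbo.MyTable
SELECT * FROM SourceDb.dbo.MyTable;

or pull the remote data into a new local table:

SELECT * INTO SourceDb.dbo.MyTableCopy
FROM OPENQUERY(SERVER2, 'SELECT * FROM TargetDb.dbo.MyTable');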
Hi, I am using the command line for a bulk copy operation. I have a couple of tables with insert triggers that move data from one table to another. I was just wondering: are those triggers going to fire when I import data into the tables using the bcp command line?
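By default a bcp load does not fire insert triggers on the target table. On SQL Server 2000 and later you can ask for them with the FIRE_TRIGGERS hint; a sketch with placeholder names:

bcp MyDb.dbo.Orders in C:\data\orders.txt -c -T -S myserver -h "FIRE_TRIGGERS"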
With the following I try to save the content of an Excel sheet to a SQL table. This works perfectly with SQL Server Express but not with MSDE, which I also need to support. Here is the code:

String rootPath1 = Request.MapPath("~/Kontoauszug.xls");
String strConn = "Provider=Microsoft.Jet.OLEDB.4.0;" +
    "Data Source=" + rootPath1 + ";Extended Properties=Excel 8.0;";
OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM [Mappe1$]", strConn);
DataTable dtCustomers = new DataTable();
da.Fill(dtCustomers);

string ConnectionString = ConfigurationManager.ConnectionStrings["ConnectionString"].ToString();
using (SqlConnection destinationConnection = new SqlConnection(ConnectionString))
{
    // open the connection
    destinationConnection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection.ConnectionString))
    {
        bulkCopy.BatchSize = 500;
        bulkCopy.BulkCopyTimeout = 90;
        bulkCopy.DestinationTableName = "dbo.Auszug";
        bulkCopy.WriteToServer(dtCustomers);
        bulkCopy.Close();
    }
}

With MSDE I see the following error message: "Fehler bei der Anmeldung für den Benutzer 'sa'" (login failed for user 'sa'). Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.SqlClient.SqlException: Fehler bei der Anmeldung für den Benutzer 'sa'. Source Error:
Line 148:
Line 149: // Write from the source to the destination.
Line 150: bulkCopy.WriteToServer(dtCustomers);
Line 151:
Line 152:
Source File: d:\Inetpub\Www_root\XXXXXXXXXXXXXXXXXXXXXXXX.cs Line: 150 Stack Trace:
Can I still use the bcp command to load data into a table when the target table has a referential integrity constraint with another table (a parent-child relationship)? Thanks.
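For what it's worth: by default a bcp load skips CHECK and FOREIGN KEY constraint validation; the CHECK_CONSTRAINTS hint enforces them during the load. A sketch with placeholder names:

bcp MyDb.dbo.ChildTable in C:\data\child.txt -c -T -S myserver -h "CHECK_CONSTRAINTS"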
I have a bcp command which generates a .txt file perfectly. I just wanted to know how I can generate the text file in a distributed environment.
Assume that my SQL Server is running on machine A. I want bcp to generate the file on machine B. What permissions should I grant in order to generate it on machine B?
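One hedged approach: since bcp writes the file on the machine where bcp itself runs, you can run it on machine A but point the output at a UNC share on machine B; the account running bcp (the SQL Server service account or its proxy, if you launch bcp through xp_cmdshell) then needs write permission on that share. A sketch with placeholder names:

bcp MyDb.dbo.Stocks out \\MachineB\exports\stocks.txt -c -T -S MachineA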
Server: Msg 7399, Level 16, State 1, Line 1 OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error. The statement has been terminated.
I am a beginner in .NET visual programming, using C#.NET with SQL Server 2005.
I am facing a problem in SQL Server. Suppose I have the table given below.
------------------------------------------------
Table name is "A"
------------------------------------------------
1 Michle Administrator
2 John Consumer Finance Officer
3 Jackson Employer
4 Goeffery Employer
------------------------------------------------
Here the identity column is the first one. Now if I delete the 2nd row, the table becomes:
------------------------------------------------
Table name is "A"
------------------------------------------------
1 Michle Administrator
3 Jackson Employer
4 Goeffery Employer
------------------------------------------------
Now I have another table "B" with the same structure (3 columns, and the first column is the identity column). If I move all the data from table "A" to table "B", the original sequence is disturbed, and table "B" contains the data like this:
------------------------------------------------
Table name is "B"
------------------------------------------------
1 Michle Administrator
2 Jackson Employer
3 Goeffery Employer
------------------------------------------------
But I need the first column of table "B" (even though it is an identity column) to keep the same numbering as table "A", to preserve the relationships, like this:
------------------------------------------------
Table name is "B"
------------------------------------------------
1 Michle Administrator
3 Jackson Employer
4 Goeffery Employer
------------------------------------------------
Waiting for your response; I will be very thankful to you for solving my problem.
Note: the data is moved using the SQLBULKCOPY command.
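One hedged way to preserve the values: either construct the SqlBulkCopy with SqlBulkCopyOptions.KeepIdentity so the source identity values are kept, or do the copy in T-SQL with IDENTITY_INSERT. A sketch, with assumed column names Id, Name, and Role:

SET IDENTITY_INSERT dbo.B ON;

-- Explicit column list is required when inserting into an identity column.
INSERT INTO dbo.B (Id, Name, Role)
SELECT Id, Name, Role
FROM dbo.A;

SET IDENTITY_INSERT dbo.B OFF;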
We have the bulk copy option enabled for our DB and we really use it. Would it be possible to set up snapshot replication of particular tables over the Internet to a remote server, from which the data will only be retrieved and never changed? Also, is it necessary to have primary keys on all tables for this one-way snapshot replication? (For transactional replication they are needed, as far as I know.)
I am working on a SQL Server bulk copy program. I am getting data files from our vendors for shares and stocks. The data files are pipe-separated values. For example, the ASCII file format is:
8388182|"ACC consultanats"|"rating for the current financial year"|23
My doubt is: I have four columns in my SQL Server table, named stocks.
The third column, named memo1 in the data file, can hold a large volume of data; it may be up to one full A4-size page. One important thing: the data in the third column is not formatted.
Since it is very urgent, let me know what the format file for this type of data file would be, and the bulk copy utility command.
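A minimal sketch of a non-XML format file for that layout, assuming the quotes around the two text fields are treated as part of the field terminators, and using placeholder column names and lengths (the 8.0 on the first line targets SQL Server 2000; use 9.0 for SQL Server 2005):

8.0
4
1   SQLCHAR   0   12     "|\""     1   stockid   ""
2   SQLCHAR   0   100    "\"|\""   2   name      SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   8000   "\"|"     3   memo1     SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   12     "\r\n"    4   rating    ""

It could then be loaded with something like:

bcp MyDb.dbo.stocks in C:\data\stocks.txt -f C:\data\stocks.fmt -T -S myserver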
Hi, I have a text file that contains a header row. I want to import this text file into the pubs database, into the authors table. I usually use this code:
Exec master..xp_cmdshell "bcp pubs..authors in d:\data\authors.txt /c /Snameofserver /Usa /Ppassword"
from a Query Analyzer window. If the text file does not have a header, I am able to import the data, but if the text file does have a header, I get an error. I know that I can open the text file, delete the header, and then run the bcp process, but I do not want to do so. Is there something I can add to the bcp command above so that it accepts the header row and the bcp procedure succeeds? Thanks
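If the header is a single line, the /F (first row) switch should make bcp start loading at row 2 and skip it; a sketch using the same command:

Exec master..xp_cmdshell "bcp pubs..authors in d:\data\authors.txt /c /F 2 /Snameofserver /Usa /Ppassword"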
Can someone confirm that setting a DB to use the simple recovery model in SQL Server 2000 is the same as setting a DB to use 'trunc. log on chkpt.' and 'select into/bulk copy' in SQL Server 7?
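They should be equivalent; for reference, a sketch of the two settings side by side (the database name is a placeholder):

-- SQL Server 7.0
EXEC sp_dboption 'MyDb', 'trunc. log on chkpt.', 'true'
EXEC sp_dboption 'MyDb', 'select into/bulkcopy', 'true'

-- SQL Server 2000
ALTER DATABASE MyDb SET RECOVERY SIMPLE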
Hi guys, in my DB I have these three tables:
1. Stores
2. Products
3. Parts
Their structure is something like Stores ---- Products ---- Parts:
Stores: StoreId, StoreName
Products: ProductId, StoreId, ProductName
Parts: PartId, ProductId, PartName
Now, in my application I want to implement a bulk-copy operation so a user can copy products from one store to another, and when a product is copied to the new store, all of its parts should be copied too. In fact I need a method that inserts a product row into the Products table and, at the same time, copies its parts into the Parts table, repeating this step until all products are copied. How can I do that without cursors or loops? Thanks
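One hedged, set-based sketch, assuming ProductName is unique within a store (store ids 1 and 2 stand in for the source and target stores):

-- Copy the products of store 1 into store 2.
INSERT INTO Products (StoreId, ProductName)
SELECT 2, p.ProductName
FROM Products AS p
WHERE p.StoreId = 1;

-- Copy each product's parts, matching old and new product rows
-- by ProductName (hence the uniqueness assumption).
INSERT INTO Parts (ProductId, PartName)
SELECT np.ProductId, pa.PartName
FROM Products AS op
JOIN Parts AS pa ON pa.ProductId = op.ProductId
JOIN Products AS np ON np.ProductName = op.ProductName AND np.StoreId = 2
WHERE op.StoreId = 1;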
I'm trying to start a new publication. When the snapshot agent runs, it stops on a table with the error "Bulk Copy Failed". If I remove the table from the publication, it just moves the error to the next table it tries to copy. What could cause a bulk copy to fail during a snapshot?
I have a problem regarding transactional replication.
Let me explain my scenario.
I'm doing transactional replication between two databases.
When the publisher and subscriber are created, the data gets bulk copied from the publisher table to the subscriber table.
My main intention was to create replication between different tables with different fields, in which I succeeded.
But the main problem is that I want to stop this bulk copy from publisher to subscriber.
Scenario 1: my subscriber table may contain some previous data which will be replaced with the publisher data due to the bulk copy. I don't want this. I want to avoid the bulk copy and create procedures (for insert, update, and delete transactions) in the subscriber which will take care of replication.
I have achieved almost everything, but I am not able to avoid this bulk copy during the creation of the subscriber.
As far as I know, the only way I can stop the bulk copy is by creating the subscription without the snapshot agent. But then the procedures (for insert, update, and delete transactions) won't get created in the subscriber.
Help me with the above scenario; I need it urgently.
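One hedged direction to look at: creating the subscription with @sync_type = N'replication support only', which skips the snapshot bulk copy but still generates the insert/update/delete procedures at the subscriber (the subscriber tables must already contain the data). A sketch with placeholder names:

EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'SUBSCRIBERSERVER',
    @destination_db = N'SubscriberDb',
    @subscription_type = N'Push',
    @sync_type = N'replication support only';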
I have a set of records in application memory separated by a record terminator ''. I can write the memory stream to a local disk file and call the bcp API functions to load the file into SQL Server. But how do I transfer the in-memory data directly to SQL Server, without writing to a data file, using ODBC? I am not using any .NET Framework classes in my code. The SQL Server and the application server (generating the data records) are on two different physical servers connected through a network. I am trying to figure out the fastest and most efficient way to load the data to SQL Server from a remote application server. Thanks for your help.
I am trying to export a table to a .CSV file using the SQL 2005 BCP utility through a .NET web page. I get the following error: "An error occurred during the execution of xp_cmdshell. A call to 'CreateProcessAsUser' failed with error code: '1314'."
I have done the following steps:
-- Set up a proxy for a Windows account using the command: sp_xp_cmdshell_proxy_account '<winAccount>', '<password>'
-- EXEC sp_grantdbaccess '<winAccount>'
-- GRANT exec ON xp_cmdshell TO <winAccount>
-- The BCP command is inside a stored proc, so I run the stored proc as: Create Proc dbo.Export_Table with execute as '<winAccount>'
What am I missing? I can run the stored proc in SQL Management Studio using that Windows account. I don't get the error when I use the Visual Studio Development Web Server; the export file is created without any problem. I get the error only when I run my .NET web page through IIS. Please help. Thanks
I'm running SQL 7.0 SP3 on two different machines (one with additional hotfixes). I'm taking a nightly snapshot of imported data on Server1 and pushing it out to another SQL 7.0 server on our network, Server2. All but one table is copied successfully. On the final table, I receive the message "The process could not bulk copy into table '"%"'." Error Information Category: Data Source, Source: Server2, Number: 4813.
Full error message: "Expected the text length in data stream for bulk copy of text, ntext, or image data."
I've looked up 4813, but it's pretty ambiguous/generic. Also, when I SELECT from Server1 and INSERT INTO Server2 in Query Analyzer, I receive no errors. Does anyone have any insight?
Still in my republisher scheme: I added a trigger at the publisher server, created the snapshot, then reinitialized the subscription and started the merge agent, and the trigger went through. But at the republisher I followed the same technique that worked previously, and now it gives me the errors below when I start the merge agent.
The process could not deliver the snapshot to the Subscriber. (Source: Merge Replication Provider (Agent); Error number: -2147201001)
---------------------------------------------------------------------------------------------------------------
The process could not bulk copy into table '"dbo"."MSmerge_genhistory"'. (Source: ***************** (Agent); Error number: 20037)
---------------------------------------------------------------------------------------------------------------
Cannot insert duplicate key row in object 'MSmerge_genhistory' with unique index 'unc1MSmerge_genhistory'. (Source: ***************** (Data source); Error number: 2601)
---------------------------------------------------------------------------------------------------------------
Function sequence error (Source: ***************** (ODBC); Error number: 0)
---------------------------------------------------------------------------------------------------------------
Does anyone have any idea what to do, other than deleting the subscription?
I have set up an 'indexed view logged' article so that I can replicate my view to a table on the destination server. I need to do transactional replication without changing the data at the destination, so I used 'Keep existing object unchanged' in the article, but I always get: The process could not bulk copy into table '"nnn"."musictracks1"'.
I'm trying to set up transactional replication between 2 servers. This is a one-way replication: Server A to Server B, not Server B to Server A.
I am able to replicate all the tables except one. I added parameters to the agent so that it would create an output file, hoping for more or better information.
Here is the portion of the output containing the error causing the failure:
Agent message code 20037. The process could not bulk copy into table '"tblSuppContractFee"'.
[5/5/2006 8:02:10 PM]01sqlft003.distribution: {call sp_MSadd_distribution_history(4, 6, ?, ?, 0, 0, 0.00, 0x01, 1, ?, 6, 0x01, 0x01)}
Adding alert to msdb..sysreplicationalerts: ErrorId = 65, Transaction Seqno = 000075400000ff9b000b00000002, Command ID = 6
Message: Replication-Replication Distribution Subsystem: agent 01sqlft003-EDGE-01SQLFT004-4 failed. The process could not bulk copy into table
[5/5/2006 8:02:10 PM]01SQLFT004.EDGE_REPLICATION: exec dbo.sp_MSupdatelastsyncinfo N'01sqlft003',N'EDGE', N'', 0, 6, N'The process could not bulk copy into table ''"tblSuppContractFee"''.'
Can somebody help me find a solution for this error? I don't see any error text, and there are no resources available for the error code showing up in the log file.
The process could not bulk copy out of table '[dbo].[syncobj_0x3944323636373031]'.
Snapshot replication was running fine up until last week. Then the E drive on the system failed. The server was rebooted and the drive came back. Since then the replication has failed. (The E drive is the location of the database and log files.)
The database is approx. 12 GB. I googled the error message and found several references to the drive being full, so I tried to map the replication to a different drive. The replication fails with the same error.
I also tried to drop and recreate the replication process. Still the same error.
I need to move all of the contents of one database into another with the same schema, and it looks like this might be just what I need. But it is from 2007, so I wonder whether it is still current.
Also, having run it on another database to generate the script that will actually do the copying, I have a few questions. It looks like it generates statements to import the data twice. For example:
BULK INSERT [TaPerfGDB].[dbo].[i1]
FROM 'C:\Temp\i1.Dat'
WITH (FORMATFILE = 'C:\Temp\i1.FMT',
      BATCHSIZE = 1000000,
      ERRORFILE = 'C:\Temp\BI_i1.ERR',
      TABLOCK);
And a little later:
INSERT INTO [TaPerfGDB].[dbo].[i1]
SELECT *
FROM OPENROWSET(BULK 'C:\Temp\i1.Dat',
     FORMATFILE = 'C:\Temp\i1.Xml') AS t1;
That does not really make any sense to me. It also generates statements like this:
bcp "[TaPerfGDB].[dbo].[GDB_GEOMNETWORKS]" format nul -n -CRAW -f "C:TempGDB_GEOMNETWORKS.fmt" --S"PGALLUCC-M7" -T
What is the deal with the double hyphen by the server name? Won't it just see that as a comment? It can easily be fixed, but I am just surprised that it is still there after all these years. My purpose in doing this is a desperate attempt to salvage a database that sits on a server with multiple drive errors. These prevent backups, so I cannot just restore the database on the new server. That is why I want to try an approach that goes table by table, so that at least all the tables which are not touched by the drive errors can be moved.
It is a 3 TB database running on SQL Server 2008 R2 Standard Edition.