Parts
----------------
PartId, ProductId, PartName
Now, in my application I want to implement a bulk-copy operation so a user can copy products from one store to another, and when a product is copied to the new store, all of its parts should be copied too. In fact, I need a method that inserts a Product item into the Products table, copies its parts into the Parts table at the same time, and repeats these steps until all of the products are copied.
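A minimal sketch of one way this could work, assuming a Products table with columns (ProductId identity, StoreId, ProductName) to go with the Parts columns above; every name beyond Parts' own columns is a guess:

using System;
using System.Data.SqlClient;

class ProductCopier
{
    // Copies one product to another store, then its parts, inside the
    // caller's transaction so the pair succeeds or fails together.
    static void CopyProduct(SqlConnection conn, SqlTransaction tx,
                            int productId, int targetStoreId)
    {
        // Insert the product row and capture the new identity value.
        SqlCommand insertProduct = new SqlCommand(
            "INSERT INTO Products (StoreId, ProductName) " +
            "SELECT @targetStore, ProductName FROM Products WHERE ProductId = @id; " +
            "SELECT CAST(SCOPE_IDENTITY() AS int);", conn, tx);
        insertProduct.Parameters.AddWithValue("@targetStore", targetStoreId);
        insertProduct.Parameters.AddWithValue("@id", productId);
        int newProductId = (int)insertProduct.ExecuteScalar();

        // Copy the parts, re-pointing them at the new ProductId.
        SqlCommand insertParts = new SqlCommand(
            "INSERT INTO Parts (ProductId, PartName) " +
            "SELECT @newId, PartName FROM Parts WHERE ProductId = @id;", conn, tx);
        insertParts.Parameters.AddWithValue("@newId", newProductId);
        insertParts.Parameters.AddWithValue("@id", productId);
        insertParts.ExecuteNonQuery();
    }
}

Calling this in a loop over the selected ProductIds, all inside one SqlTransaction, gives the "repeat until all products are copied" behavior as a single all-or-nothing unit.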
Before implementing a memory-based bulk-copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to understand a few considerations:
- Performance: how it compares with T-SQL's BULK INSERT and the bcp utility.
- SQL Server resource usage: how a memory-based bulk copy affects server resources while it runs.
- Server-side behavior: when the server is busy, can the update be delayed, or does IRowsetFastLoad::Commit(TRUE) insert the rows right away?
- Row count: how many rows can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit must be called.
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(TRUE)? For example, how many rows can be inserted in the sample below (what is the maximum value of nCount)?
for (i = 0; i < nCount; i++) { pIFastLoad->InsertRow(hAccessor, (void*)&BulkData); }
2. In the code sample above, is there a way to insert the prepared array directly in one call (the BulkData array, without the for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the following T-SQL bulk copy options?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);
// Data source
// Create session
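For what it's worth on question 3: the SQLOLEDB provider has a rowset property, SSPROP_FASTLOADOPTIONS, whose string value is documented to accept bulk-load hints such as TABLOCK and ROWS_PER_BATCH, which looks like the closest OLE DB equivalent. If managed code were an option, SqlBulkCopy exposes the same two knobs directly; a minimal sketch with a hypothetical connection string and table name:

using System.Data;
using System.Data.SqlClient;

static void FastLoad(DataTable rows, string connectionString)
{
    // TableLock roughly corresponds to the TABLOCK hint.
    using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString,
               SqlBulkCopyOptions.TableLock))
    {
        bulk.BatchSize = 1000;  // roughly corresponds to ROWS_PER_BATCH
        bulk.DestinationTableName = "dbo.BulkTarget";
        bulk.WriteToServer(rows);
    }
}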
Still in my republisher scheme: I added a trigger at the publisher, created the snapshot, then reinitialized the subscription and started the merge agent, and the trigger went through. But at the republisher I followed the same technique that worked previously, and now I get these errors when I start the merge agent:
The process could not deliver the snapshot to the Subscriber. (Source: Merge Replication Provider (Agent); Error number: -2147201001)
The process could not bulk copy into table '"dbo"."MSmerge_genhistory"'. (Source: ***************** (Agent); Error number: 20037)
Cannot insert duplicate key row in object 'MSmerge_genhistory' with unique index 'unc1MSmerge_genhistory'. (Source: ***************** (Data source); Error number: 2601)
Function sequence error (Source: ***************** (ODBC); Error number: 0)
Does anyone have any idea what to do, other than deleting the subscription?
I have set up an 'indexed view logged' article so that I can replicate my view to a table on the destination server. I need to do transactional replication without changing the data at the destination, so I used 'Keep existing object unchanged' in the article properties, but I always get 'The process could not bulk copy into table '"nnn"."musictracks1"'.'
The process could not bulk copy out of table '[dbo].[syncobj_0x3944323636373031]'.
Snapshot replication was running fine up until last week. The e drive on the system failed. The server was rebooted and the drive returned. Since then the replication has failed. (e drive is the location of the database and log files)
The database is approx. 12 GB. I googled the error message and found several references to the drive being full, so I tried to map the replication to a different drive. The replication fails with the same error.
I also tried dropping and recreating the replication. Still the same error.
I am using the bulk copy command for exporting data, table by table, from a database to CSV files, and it was working fine. For the last 3-4 days, the data in the CSV files for some tables has been coming out as junk.
Can anyone please point me in the right direction?
What I am trying to do should be very straightforward:
Take a flat file, perform various transformations on various columns using the Script Component, then send the transformed (and untransformed) rows to a table in the database.
My question is: how do I do this using scripting? I have Kirk Haselden's book, Donald Farmer's SSIS scripting book, and the MSDN website, but I have yet to see an example of what I'm trying to do!
FILE SOURCE --> SCRIPT COMPONENT (synchronous transform) --> OLE DB DESTINATION
How do I account for all the columns that will be both transformed and untransformed, and get them into the table? That is the missing piece of information I can't find anywhere.
The closest thing I found was this code snippet. Do I need to use this syntax, e.g. Me.Output0Buffer.FirstName = ... (where FirstName is the actual column name)?
etc.
Then, once I hook up the Script Component to the OLE DB destination, which uses a connection manager to the table, will it insert FirstName with what I specify?
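For what it's worth: in a synchronous Script Component, input columns you mark as ReadWrite are modified in place inside ProcessInputRow, and every input column, transformed or not, flows through to the output automatically; Output0Buffer only comes into play for asynchronous outputs. A minimal sketch (shown in C# syntax; the SSIS 2005 designer generates VB.NET, where the pattern is identical, and FirstName is a hypothetical ReadWrite column):

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Transform only the columns that need it; untouched columns
    // pass straight through to the OLE DB destination.
    Row.FirstName = Row.FirstName.Trim().ToUpper();
}

With that in place, the OLE DB destination's column mappings pick up FirstName like any other column.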
I have a question. As per my knowledge, bulk copying is not possible during a backup operation: if the backup starts first, the backup runs and the bulk copy fails; if the bulk copy starts first, the backup fails and the bulk copy continues. Today I was testing bcp running inside DTS using the Execute Process task (with this task we can run any Win32 executable or batch file). I am trying to bcp out from one database (the source) and bcp in to another database (the destination). While the package was running, a backup was also running: I had started the backup job for the destination database, and while it was running I started another job to run the DTS package (I even ran the DTS package manually). Both jobs succeeded and the data was inserted into the table. Can anyone shed some light on this?
I want to bulk copy a table using 'SELECT INTO' from a database on server1 to a database on server2. The 'FROM' part of the SELECT INTO only allows three parameters (database, user and table) within the one server.
I remember seeing some option that allows one to bulk copy across servers, but I can't find it.
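One route, if .NET 2.0 is available, is to stream the rows between two connections with SqlBulkCopy; a minimal sketch with placeholder server, database, and table names (the destination table must already exist). The pure T-SQL alternative would be a linked server and four-part naming in the FROM clause.

using System.Data.SqlClient;

class CrossServerCopy
{
    static void Main()
    {
        using (SqlConnection src = new SqlConnection(
                   "Server=server1;Database=db1;Integrated Security=SSPI"))
        using (SqlConnection dst = new SqlConnection(
                   "Server=server2;Database=db2;Integrated Security=SSPI"))
        {
            src.Open();
            dst.Open();
            SqlCommand cmd = new SqlCommand("SELECT * FROM dbo.MyTable", src);
            using (SqlDataReader reader = cmd.ExecuteReader())
            using (SqlBulkCopy bulk = new SqlBulkCopy(dst))
            {
                // Stream rows from server1 straight into server2.
                bulk.DestinationTableName = "dbo.MyTable";
                bulk.WriteToServer(reader);
            }
        }
    }
}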
Hi, I am using the command line for a bulk copy operation. I have a couple of tables with insert triggers that move data from one table to another. I was just wondering: will those triggers fire when I import data into the tables using the bcp command line?
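As far as I recall, bulk loads do not fire insert triggers by default: command-line bcp needs the -h "FIRE_TRIGGERS" hint, and the managed API has the same opt-in. A sketch of the managed form, with a hypothetical connection string and table name:

using System.Data;
using System.Data.SqlClient;

static void LoadWithTriggers(DataTable rows, string connectionString)
{
    // FireTriggers is off by default, mirroring bcp's behavior.
    using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString,
               SqlBulkCopyOptions.FireTriggers))
    {
        bulk.DestinationTableName = "dbo.TargetTable";
        bulk.WriteToServer(rows);
    }
}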
I am doing a full snapshot on a couple of subscriptions but have been getting this message for the last couple of days. I have to do this at night because the tables are very large, so it does not block the users. I am not sure which table this error is happening on. It used to work fine, but for the last couple of days this has started to happen. I need to re-sync my subscription database with the production database, but some of the tables are giving problems.
The process could not bulk copy out of table '[dbo].[syncobj_0x3735393934363031]'.
I/O error while writing BCP data-file (Source: ODBC SQL Server Driver (ODBC); Error number: 0)
With the following I try to save the content of an Excel sheet to a SQL table. This works perfectly with SQL Server Express but not with MSDE, which I would also need. Here is the code:

String rootPath1 = Request.MapPath("~/Kontoauszug.xls");
String strConn = "Provider=Microsoft.Jet.OLEDB.4.0;" +
    "Data Source=" + rootPath1 + ";Extended Properties=Excel 8.0;";
OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM [Mappe1$]", strConn);
DataTable dtCustomers = new DataTable();
da.Fill(dtCustomers);

string ConnectionString = ConfigurationManager.ConnectionStrings["ConnectionString"].ToString();
using (SqlConnection destinationConnection = new SqlConnection(ConnectionString))
{
    // open the connection
    destinationConnection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection.ConnectionString))
    {
        bulkCopy.BatchSize = 500;
        bulkCopy.BulkCopyTimeout = 90;
        bulkCopy.DestinationTableName = "dbo.Auszug";
        bulkCopy.WriteToServer(dtCustomers);
        bulkCopy.Close();
    }
}

With MSDE I see the following error message: "Login failed for user 'sa'." (German: "Fehler bei der Anmeldung für den Benutzer 'sa'.")
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Login failed for user 'sa'.
Source Error:
Line 149:    // Write from the source to the destination.
Line 150:    bulkCopy.WriteToServer(dtCustomers);
Source File: d:\Inetpub\Www_root\XXXXXXXXXXXXXXXXXXXXXXXX.cs    Line: 150
Can I still use the bcp command to load data into a table when the target table has referential integrity with another table (a parent-child relationship)? Thanks.
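If I remember right, bcp does not validate check and foreign key constraints during the load by default (they are bypassed, and the constraints end up marked as not trusted), so the load itself should go through; the -h "CHECK_CONSTRAINTS" hint asks bcp to enforce them as rows arrive. A hypothetical example, with made-up database, table, and file names:

bcp MyDb.dbo.ChildTable in child.dat -c -T -h "CHECK_CONSTRAINTS"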
I have a bcp command that generates a .txt file perfectly. I just want to know how I can generate the text file in a distributed environment.
Assume that my SQL Server is running on machine A, and I want bcp to generate the file on machine B. What permissions should I grant in order to generate it on machine B?
Server: Msg 7399, Level 16, State 1, Line 1 OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error. The statement has been terminated.
I am a beginner in .NET visual programming, using C#.NET with SQL Server 2005.
I am facing a problem in SQL Server. Suppose I have the table given below.
------------------------------------------------
Table name is "A"
------------------------------------------------
1   Michle      Administrator
2   John        Consumer Finance Officer
3   Jackson     Employer
4   Goeffery    Employer
------------------------------------------------
Here the first column is the identity column. Now if I delete the second row, the table becomes:
------------------------------------------------
Table name is "A"
------------------------------------------------
1   Michle      Administrator
3   Jackson     Employer
4   Goeffery    Employer
------------------------------------------------
Now I have another table "B" with the same structure (3 columns, the first of which is the identity column). If I move all the data from table "A" to table "B", the original sequence is disturbed, and table "B" contains the data like this:
------------------------------------------------
Table name is "B"
------------------------------------------------
1   Michle      Administrator
2   Jackson     Employer
3   Goeffery    Employer
------------------------------------------------
But I need the first column of table "B", even though it is an identity column, to keep the same numbering as in table "A", to preserve the relationships, like this:
------------------------------------------------
Table name is "B"
------------------------------------------------
1   Michle      Administrator
3   Jackson     Employer
4   Goeffery    Employer
------------------------------------------------
Waiting for your response; I will be very thankful to you for solving my problem.
Note: the data is moved using the SqlBulkCopy class.
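A minimal sketch of what might help here, assuming table B already exists with A's schema and tableA is a DataTable filled from table A: the KeepIdentity option tells SqlBulkCopy to insert the source identity values (1, 3, 4) instead of letting B's identity column generate new ones.

using System.Data;
using System.Data.SqlClient;

static void MoveKeepingIdentity(DataTable tableA, string connectionString)
{
    // KeepIdentity: the destination keeps the source's identity values.
    using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString,
               SqlBulkCopyOptions.KeepIdentity))
    {
        bulk.DestinationTableName = "dbo.B";
        bulk.WriteToServer(tableA);
    }
}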
We have the bulk copy option enabled for our DB and we really use it. Would it be possible to set up snapshot replication over the Internet of particular tables to a remote server, from which the data will only be retrieved and never changed? Also, is it necessary to have PKs in all tables for this one-way snapshot replication? (For transactional replication it is needed, as far as I know.)
I am working on a SQL Server bulk copy program. I am getting data files from our vendors for shares and stocks. The data files are pipe-separated values; for example, the ASCII file format is:
8388182|"ACC consultanats"|"rating for the current financial year"|23
My doubt is about mapping this to the four columns in my SQL Server table, named stocks.
The third column, named memo1 in the data file, carries a large volume of data; it may be up to one full A4-size page. One important thing: the data in the third column is not formatted.
Since it is very urgent, please let me know what the format file for this type of data file should look like, and how to run the bulk copy program utility.
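Not guaranteed, but a non-XML format file along these lines is roughly what that layout calls for, assuming every row quotes the second and third fields exactly as in the sample (the first line is the bcp version: 8.0 for SQL Server 2000, 9.0 for 2005; all column names except memo1 are made up). The trick is that each field terminator also swallows the neighboring quote characters:

9.0
4
1   SQLCHAR   0   12     "|\""      1   StockId       ""
2   SQLCHAR   0   100    "\"|\""    2   CompanyName   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   8000   "\"|"      3   memo1         SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   12     "\r\n"     4   Rating        ""

The load itself would then be something like: bcp MyDb.dbo.stocks in stocks.dat -f stocks.fmt -T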
Hi, I have a text file that contains a header row. I want to import this text file into the pubs database, into the authors table. I usually use this code from a Query Analyzer window:
Exec master..xp_cmdshell "bcp pubs..authors in d:\data\authors.txt /c /Snameofserver /Usa /Ppassword"
If the text file does not have a header, I am able to import the data, but if the text file does have a header, I get an error. I know that I can open the text file, delete the header, and then run the bcp process, but I do not want to do so. Is there something I can add to the bcp command above so it accepts the header row and the bcp procedure succeeds? Thanks.
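If I am not mistaken, bcp's -F (first row) switch is meant for exactly this: telling bcp to start at row 2 makes it skip a single header line (the header still needs the same field layout as the data rows for bcp to step over it cleanly). The original command with only that switch added:
Exec master..xp_cmdshell "bcp pubs..authors in d:\data\authors.txt /c /F2 /Snameofserver /Usa /Ppassword"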
Can someone confirm that setting a database to use the simple recovery model in SQL Server 2000 is the same as setting a database to use 'trunc. log on chkpt.' and 'select into/bulk copy' in SQL Server 7?
I'm trying to start a new publication. When the snapshot agent runs, it stops on a table with the error "Bulk Copy Failed". If I remove the table from the publication, it just moves the error to the next table it tries to copy. What could cause a bulk copy to fail during a snapshot?
I have a problem regarding transactional replication.
Let me explain my scenario.
I'm doing transactional replication between two databases.
When the publisher and subscriber are created, the data gets bulk copied from the publisher table to the subscriber table.
My main intention was to create replication between different tables with different fields, in which I succeeded.
But the main problem is that I want to stop this bulk copy from publisher to subscriber.
Scenario 1: my subscriber table may contain some previous data, which will be replaced with publisher data due to the bulk copy. I don't want this. I want to avoid the bulk copy and create procedures (for insert, update and delete transactions) on the subscriber which will take care of replication.
I have achieved almost everything, but I am not able to avoid this bulk copy during the creation of the subscriber.
As far as I know, the only way I can stop the bulk copy is by creating the subscription without the subscription agent. But without the subscription agent, the procedures (for insert, update and delete transactions) won't get created on the subscriber.
Please help me with the above scenario; I need it urgently.
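One avenue worth checking (a sketch only, SQL Server 2005 syntax, placeholder names): sp_addsubscription's @sync_type parameter has a 'replication support only' setting, documented to generate the insert/update/delete procedures at the subscriber without bulk copying snapshot data, on the assumption that the subscriber already holds the schema and data:

exec sp_addsubscription
    @publication = 'MyPublication',
    @subscriber = 'MySubscriberServer',
    @destination_db = 'MySubscriberDb',
    @sync_type = 'replication support only'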
I have a set of records in application memory, separated by a record terminator. I can write the memory stream to a local disk file and call the bcp API functions to load the file into SQL Server. But how do I transfer the in-memory data directly to SQL Server, without writing a data file, using ODBC? I am not using any .NET Framework classes in my code. The SQL Server and the application server (which generates the data records) are on two different physical servers connected through a network. I am trying to figure out the fastest and most efficient way to load the data into SQL Server from a remote application server. Thanks for your help.
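One possibility, sketched with a made-up two-column record layout: the SQL Server ODBC driver exposes the bcp API directly (odbcss.h, odbcbcp.lib), and bcp_init accepts a NULL data-file name, after which rows are fed from memory one at a time with bcp_sendrow. Error handling is omitted, and hdbc must have had SQL_COPT_SS_BCP set to SQL_BCP_ON before connecting.

#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <odbcss.h>   /* bcp_init, bcp_bind, bcp_sendrow, bcp_done */

typedef struct { int id; char name[101]; } Record;

void BulkLoadFromMemory(HDBC hdbc, const Record *records, int count)
{
    Record row;   /* bound buffer reused for every row */
    int i;

    /* NULL data file: rows come from bcp_sendrow, not from disk. */
    bcp_init(hdbc, "dbo.MyTable", NULL, NULL, DB_IN);
    bcp_bind(hdbc, (LPCBYTE)&row.id, 0, sizeof(int), NULL, 0, SQLINT4, 1);
    bcp_bind(hdbc, (LPCBYTE)row.name, 0, SQL_VARLEN_DATA,
             (LPCBYTE)"", 1, SQLCHARACTER, 2);

    for (i = 0; i < count; i++)
    {
        row = records[i];   /* copy into the bound buffers */
        bcp_sendrow(hdbc);
    }
    bcp_done(hdbc);         /* flush and commit the final batch */
}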
I am trying to export a table to a .CSV file using the SQL 2005 bcp utility through a .NET web page. I get the following error: "An error occurred during the execution of xp_cmdshell. A call to 'CreateProcessAsUser' failed with error code: '1314'." I have done the following steps:
-- Set up a proxy for a Windows account using the command: sp_xp_cmdshell_proxy_account '<winAccount>', '<password>'
-- EXEC sp_grantdbaccess '<winAccount>'
-- GRANT exec ON xp_cmdshell TO <winAccount>
-- The bcp command is inside a stored proc, so I run the stored proc as: Create Proc dbo.Export_Table with execute as '<winAccount>'
What am I missing? I can run the stored proc in SQL Management Studio using that Windows account. I don't get the error when I use the Visual Studio development web server; the export file is created without any problem. I get the error only when I run my .NET web page through IIS. Please help. Thanks.
I'm running SQL 7.0 SP3 on two different machines (one with additional hotfixes). I'm taking a nightly snapshot of imported data on Server1 and pushing it out to another SQL 7.0 server on our network, Server2. All but one table is copied successfully. On the final table, I receive the message "The process could not bulk copy into table '"%"'." Error information: Category: Data Source, Source: Server2, Number: 4813.
Full error message: "Expected the text length in data stream for bulk copy of text, ntext, or image data."
I've looked up 4813, but it's pretty ambiguous/generic. Also, when I SELECT from Server1 and INSERT INTO Server2 in Query Analyzer, I receive no errors. Does anyone have any insight?