Pros: How To Bulk Delete And Bulk Insert?

Oct 11, 2000

I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (Yields appx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (Yields appx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (Yields appx. 200,000 records)

Updates and cursors just seem to be too slow.

So far I have done the following, but was wondering if anyone could think of a better way.
SELECT 6 million records that don't need to be deleted into a #TempTable
Use statements above to select into same #TempTable
DROP and recreate Original Table
SELECT 6 + 2 million records INTO original table.

This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
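A variation on the same idea (a sketch only; MyTable_new, MyTable_old, and the keep condition are hypothetical names) is to build the replacement table with SELECT INTO, which is minimally logged when the 'select into/bulkcopy' option is set, and then swap names with sp_rename instead of dropping and recreating:

-- 1. Copy the ~6 million rows to keep into a new permanent table (minimally logged).
SELECT *
INTO MyTable_new
FROM MyTable
WHERE <keep condition>

-- 2. Append the ~2 million replacement rows using the ten scaled-down queries.
INSERT INTO MyTable_new
SELECT 1, ShareholderID, Assets1 FROM MyTable
-- ... repeat for the remaining nine queries ...

-- 3. Swap names, keeping the old table around until the result is verified.
EXEC sp_rename 'MyTable', 'MyTable_old'
EXEC sp_rename 'MyTable_new', 'MyTable'

This skips the round trip through tempdb and the DROP/recreate step, though indexes and constraints still need to be rebuilt on the new table.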


Any comments are appreciated,

-Marc




How Do You Use An Identity Column When Doing A Bulk Insert Using The Bulk Insert Task Editor

Apr 18, 2008



Hello,

I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file to an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
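In plain T-SQL the usual trick is a format file that maps the file's fields onto every column except the identity one; the Bulk Insert Task can point at the same file through its format file option. A sketch, with hypothetical table, column, and path names:

-- Table: dbo.Target (Id int IDENTITY(1,1), Name varchar(100), Amount varchar(12))
-- Non-XML format file C:\data\input.fmt: the two file fields are mapped to table
-- columns 2 and 3, so identity column 1 is skipped and generates its own values.
--
--   9.0
--   2
--   1   SQLCHAR   0   100   "\t"     2   Name     SQL_Latin1_General_CP1_CI_AS
--   2   SQLCHAR   0   12    "\r\n"   3   Amount   ""
--
BULK INSERT dbo.Target
FROM 'C:\data\input.txt'
WITH (FORMATFILE = 'C:\data\input.fmt');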

Thanks.


Bulk Insert - Bulk Load Data Conversion Error

Jan 17, 2008

I'm having some issues with BULK INSERT.

This is the table:

CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)


This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"


and this is the sql:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'

with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')



so yeah, pretty simple. But whatever I do, I get this:

Msg 4864, Level 16, State 1, Line 1

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).



So what am I doing wrong?
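One way to isolate the problem (a sketch; it assumes the quotes around the values are what trips the conversion) is to land both fields in an all-nvarchar staging table and convert from there:

-- Stage everything as nvarchar so nothing can fail conversion on the way in.
CREATE TABLE #stage (GA_recno nvarchar(10) NULL, GA_desc nvarchar(50) NULL);

BULK INSERT #stage
FROM 'C:\temp\TextDump\GA_status.dta'
WITH (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar');

-- Strip the quotes and convert while copying into the real table.
INSERT INTO dbo.tmp_GA_status (GA_recno, GA_desc)
SELECT CAST(GA_recno AS int), REPLACE(GA_desc, N'"', N'')
FROM #stage;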


I Don't Suppose BULK UPDATE Exists?... Like BULK INSERT?

Sep 27, 2007

I have to update a field within a table of 60 records or so. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update like BULK INSERT, but I don't recall that it's possible that way.

Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?

Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
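The staging-table route is short enough that it may be the easy way after all. A sketch, with hypothetical table and column names, assuming the Excel sheet is saved as a tab-delimited file:

-- Stage the key/value pairs from the file.
CREATE TABLE #updates (KeyCol varchar(50) NOT NULL, NewValue varchar(100) NULL);

BULK INSERT #updates
FROM 'C:\data\updates.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

-- UPDATE ... FROM joins the staged values to the 60-odd target rows.
UPDATE t
SET t.TargetField = u.NewValue
FROM dbo.TargetTable AS t
JOIN #updates AS u ON u.KeyCol = t.KeyCol;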


Cannot Fetch A Row From OLE DB Provider BULK With Bulk Insert Task

Nov 23, 2005

Hi, folks:


DB Engine :: Bulk Update Is Recorded As Delete And Insert In CDC Tables?

Nov 18, 2015

I have a fundamental problem with how CDC works for bulk updates. When a CDC-enabled table is updated for a single row, my CDC system tables record it as an update (operations 3 & 4), which is perfect and what it should be. No complaints! But when I do a bulk update in the same CDC-enabled tables for the same columns, my CDC system tables record it as a delete and then an insert (operations 1 & 2). This is not correct, and this is my problem. We used triggers before CDC and did not face this problem; everything was fine with triggers other than performance. The way CDC handles the bulk update is a big problem for me, because based on the output of the CDC system tables we are doing some migration work to a legacy system.

It will be impossible for me to go and change my migration logic scripts because we have hundreds of procedures in it. Is it a known problem with CDC? Is there any solution in CDC so that when a bulk update happens on a table, the CDC system tables record it as updates? I don't think CDC 'net changes' helps in this situation, because the net change would show as a single inserted row. If this can't be done with CDC then I have to completely abandon CDC and go back to triggers.
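For diagnosis, the change rows can be grouped by __$operation (1 = delete, 2 = insert, 3 = update before-image, 4 = update after-image). A sketch, assuming a capture instance named dbo_MyTable:

DECLARE @from_lsn binary(10), @to_lsn binary(10);
SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_MyTable');
SET @to_lsn = sys.fn_cdc_get_max_lsn();

-- A bulk update recorded correctly shows operations 3 and 4;
-- one recorded as delete/insert shows 1 and 2 instead.
SELECT __$operation, COUNT(*) AS row_count
FROM cdc.fn_cdc_get_all_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all update old')
GROUP BY __$operation;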


Integration Services :: Bulk Insert Is Locking A File Delete Task

Jun 19, 2015

I have an SSIS package doing a bulk insert from a file. Then later on I'm trying to delete that file (in a File System Task), but I'm getting an error: [File System Task] Error: An error occurred with the following error message: "The process cannot access the file 'xyz' because it is being used by another process.". I'm wondering if there isn't some way to 'tweak' the bulk insert syntax so that it doesn't lock the file?


Bulk Insert Using Script And Not Bulk Insert Task

Nov 2, 2007



Does anyone know how to do a bulk insert using just the Script Task? I've been searching everywhere but can't seem to find a sample.


BULK INSERT ERROR Using Format File - Bulk Load Data Conversion Error

Jun 29, 2015

I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, therefore the row terminator is correct for line 7, right?

Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')

[code]...


Questions About Bulk Copy Insert Using 'Memory Based Bulk Copy Operations'

Feb 1, 2007

Hi~,

Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations.

- performance: compared with T-SQL's "BULK INSERT ..." and the bcp utility

- SQL Server's resource usage: the influence on server resources when running a memory-based bulk copy

- server-side behavior: when the server is busy, does delayed update mean IRowsetFastLoad::Commit(true) can insert right afterwards?

- row count: is there a limit on how many rows can be inserted by IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?

- any other guidelines


Error: 0xC002F304 At Bulk Insert Task, Bulk Insert Task: An Error Occurred With The Following Error Message: Cannot Fetch A Row

Apr 8, 2008


I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:


Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly.Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".

Task failed: Bulk Insert Task

In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages:
BULK INSERT TableName
FROM 'C:\Data\Db\TableName.bcp'
WITH (DATAFILETYPE='widenative');


What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following:
DataFileType: DTSBulkInsert_DataFileType_WideNative
RowTerminator: {CR}{LF}

Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help.
Paul


Questions About Memory Based Bulk Copy Operation(InsertRow Count,array Insert Directly,set Memory Based Bulk Copy Option)

Feb 15, 2007

Hi~, I have 3 questions about memory based bulk copy.

1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)?
For example, how many rows can be inserted in the sample below (i.e., the max value of nCount)?
for(i=0 ; i<nCount ; i++)
{
pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}

2. In the above code sample, isn't there a method for inserting a prepared array directly in one call (the BulkData array, not a for loop)?

3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);

-------------------------------------------------------
My solution is like this. Is it correct?

// CoCreateInstance(...);
// Data source
// Create session

m_TableID.uName.pwszName = m_wszTableName;
m_TableID.eKind = DBKIND_NAME;

DBPROP rgProps[1];
DBPROPSET PropSet[1];

rgProps[0].dwOptions = DBPROPOPTIONS_REQUIRED;
rgProps[0].colid = DB_NULLID;
rgProps[0].vValue.vt = VT_BSTR;
rgProps[0].dwPropertyID = SSPROP_FASTLOADOPTIONS;
rgProps[0].vValue.bstrVal = L"ROWS_PER_BATCH = 10000,TABLOCK";

PropSet[0].rgProperties = rgProps;
PropSet[0].cProperties = 1;
PropSet[0].guidPropertySet = DBPROPSET_SQLSERVERROWSET;

if (m_pIOpenRowset)
{
    if (FAILED(m_pIOpenRowset->OpenRowset(NULL, &m_TableID, NULL, IID_IRowsetFastLoad,
                                          1, PropSet, (LPUNKNOWN*)&m_pIRowsetFastLoad)))
    {
        return FALSE;
    }
}
else
{
    return FALSE;
}


Allow Single Row Delete From A Table But Not Bulk Delete

Sep 16, 2013

The requirement is: I should allow single row delete from a table but not bulk delete. An audit table should get updated if there is any single delete or single update. So I wrote the triggers as follows: for single and bulk delete

ALTER TRIGGER [dbo].[TRG_Delete_Bulk_tbl_attendance]
ON [dbo].[tbl_attendance]
AFTER DELETE
AS

[code]...

When I try to run the website, the database error I am getting is: Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 0, current count = 1.
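Since the trigger body is elided above, here is a minimal sketch of the blocking part (the audit insert is left as a comment; the names mirror the post). Note that a ROLLBACK inside a trigger is also the classic cause of the "transaction count after EXECUTE" error when the calling code does its own COMMIT afterwards:

ALTER TRIGGER [dbo].[TRG_Delete_Bulk_tbl_attendance]
ON [dbo].[tbl_attendance]
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- More than one row in the deleted pseudo-table means a bulk delete: refuse it.
    IF (SELECT COUNT(*) FROM deleted) > 1
    BEGIN
        RAISERROR('Bulk deletes are not allowed on tbl_attendance.', 16, 1);
        ROLLBACK TRANSACTION;
        RETURN;
    END;
    -- Single-row delete: record it in the audit table (hypothetical columns).
    -- INSERT INTO dbo.tbl_attendance_audit (...) SELECT ... FROM deleted;
END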


Bulk Delete

Nov 9, 2005

We are trying to delete data from a huge 75 million record table. It takes 4 hours to prune the data:

delete from Company where recordid in (select top 10000 recordid from recordid_Fed3 where flag = 0)

We have a loop that prunes 10,000 records at a time in a while loop. Let me know if there is a better way to achieve this.
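On SQL Server 2000, one common shape for this loop uses SET ROWCOUNT and stops itself via @@ROWCOUNT, so there is no separate existence check per pass (a sketch using the names from the post):

SET ROWCOUNT 10000;
WHILE 1 = 1
BEGIN
    DELETE FROM Company
    WHERE recordid IN (SELECT recordid FROM recordid_Fed3 WHERE flag = 0);
    IF @@ROWCOUNT = 0 BREAK;
END
SET ROWCOUNT 0;  -- always reset afterwards

What usually makes each batch cheap is an index on recordid_Fed3 (flag, recordid) and on Company (recordid); without them, every batch rescans both tables.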


Bulk Delete

Jun 20, 2007

I have data in the User table with these rows: UserID (PK), UserNumber, UserName, where UserID is the primary key.

UserID UserNumber UserName
1      101        Tom
2      101        Tom
3      102        Dick
4      102        Dick
5      103        Harry
6      103        Harry







Now I want to get output with UserID (PK), UserNumber, and UserName, deleting the repeated (second occurrence) row, like this...

UserID UserNumber UserName
1      101        Tom
3      102        Dick
5      103        Harry



Can anyone let me know?
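A sketch, assuming the table is named [User]: keeping the lowest UserID per UserNumber gives exactly the output above, and the same condition expressed as a self-join deletes the repeats in place.

-- The rows to keep:
SELECT MIN(UserID) AS UserID, UserNumber, UserName
FROM [User]
GROUP BY UserNumber, UserName;

-- Deleting the repeated (higher-UserID) occurrences:
DELETE u2
FROM [User] AS u2
JOIN [User] AS u1
  ON u1.UserNumber = u2.UserNumber
 AND u1.UserID < u2.UserID;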



Thanks in advance




Bulk Delete Performance

Jan 15, 2008

I have about 2-10 million records to delete. Is there any way to make this delete statement run faster?


while exists (select * from table_report WHERE report_date between @from_date and @to_date)
begin
    DELETE TOP (10000) FROM table_report WHERE report_date between @from_date and @to_date
end


Thanks,
-Ash


Bulk Delete Question

Mar 11, 2008

I have a question regarding bulk delete.

Is there any special command for bulk delete, or is it just a big delete statement?


Bulk Delete On Operational Data

Nov 23, 2005

Hello, I read several newsgroup articles about bulk delete, and I found one way is to:

- create a temporary table with all constraints of the original table
- insert rows to be retained into that temp table
- drop constraints on the original table
- drop the original table
- rename the temporary table

My purge is a daily job, and my question is how this works on a heavily loaded operational database. I mean, thousands of records are written into my tables (the same tables that I want to purge some rows from) every second. While I am doing the copy to the temp table and dropping the table, what happens to that operational data?

I also realized another way of doing the bulk delete is using BCP:

1) BCP out rows to be deleted to an archive file
2) BCP out rows to be retained
3) Drop indexes and truncate the table
4) BCP in rows to be retained
5) Create indexes

Again the same question: when I'm doing the BCP, is there any insertion blocking on my original table? What happens to the rows meant to be inserted in the meantime? Does BCP acquire an exclusive lock on the table which prevents any other insertion? Does anyone have experience with a BCP command for querying out 2 million records, and how long it will take? I appreciate your help.
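For scale, a bcp queryout export of the rows to retain would look like this (a sketch; server name, database, and filter column are hypothetical). bcp reads with ordinary shared locks by default, so it does not by itself take an exclusive lock that blocks concurrent inserts, though the later truncate-and-reload step certainly will:

bcp "SELECT * FROM MyDb.dbo.MyTable WHERE purge_flag = 0" queryout C:\archive\retained.dat -c -T -S MYSERVER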


Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS SQL Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also published in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed about a 3-5 hour outage period per week.

So far I've tested several methods of deleting following all best practices (batch deletes, using indexes in the delete's where clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on an 8 GB RAM, 4-processor machine that is not currently used or replicated.

I'm at a loss on a way to pare down the data with a delete, as the current purge script will take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.

Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.


Can I Insert/Update Large Text Field To Database Without Bulk Insert?

Nov 14, 2007

I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind.  I've tried using the new .WRITE() method in my update statement, but it cuts off the text after a while.  Is there a way to do this insert/update in SQL 2005 without resorting to BULK INSERT? It bloats the transaction log, and turning the logging off requires a call to sp_dboptions (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
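For what it's worth, the .WRITE clause can append chunks without rewriting the whole value, and such updates can be minimally logged, which addresses the log-bloat concern. A sketch with hypothetical names; note that .WRITE cannot be applied while the column is still NULL:

-- Initialize once, e.g. on insert: Body = N''
-- Then append each posted chunk (offset NULL means append to the end).
UPDATE dbo.Documents
SET Body .WRITE (@chunk, NULL, 0)
WHERE DocumentID = @id;

If the text still appears cut off, the truncation is often happening to the parameter (e.g. an nvarchar(4000) variable) before the update ever runs, not in the column itself.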


Transact SQL :: Performing Bulk Delete With Joining A Staging Table

Jun 23, 2015

I have a large table with 100 million records that has around 1 million duplicate records that need to be deleted.

I am running a script that creates a staging table called DuplicateTable that collects all the duplicates, and then I want to write an efficient delete statement.

Is it possible to write something like:

delete from OrigTable O join DuplicateTable D
on O.Key = D.key

Or do I have to run a loop on the DuplicateTable and run a delete statement record by record?
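No loop needed: T-SQL's DELETE ... FROM form is very close to what's written above, and it can be batched so locks and log growth stay manageable (a sketch):

-- Single statement:
DELETE O
FROM OrigTable AS O
JOIN DuplicateTable AS D ON O.[Key] = D.[Key];

-- Or in batches for a 100-million-row table:
WHILE 1 = 1
BEGIN
    DELETE TOP (50000) O
    FROM OrigTable AS O
    JOIN DuplicateTable AS D ON O.[Key] = D.[Key];
    IF @@ROWCOUNT = 0 BREAK;
END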


How To Insert Data From A File Into Table Having Two Columns-BULK INSERT

Oct 12, 2007



Hi,
I have a file which contains data as below:

3
123||
456||
789||

I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below.


BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')

but I want to insert the data into a table having two columns. If I try to insert the data into a table having two columns, it does not insert.

Can anyone help me with how to do this?
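One way (a sketch; the target column names are hypothetical) is to bulk insert into a one-column staging table, which matches the file's shape, and then copy into the two-column table, supplying the second column in the SELECT:

CREATE TABLE #staging (PhoneNumber varchar(20) NULL);

BULK INSERT #staging
FROM 'FILE_PATH'
WITH (KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '||');

-- The second column is supplied here (a NULL placeholder in this sketch).
INSERT INTO TABLE_NAME (PhoneNumber, OtherColumn)
SELECT PhoneNumber, NULL
FROM #staging;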

Thanks,
-Badri


Bulk Insert

Oct 27, 2006

Hi friends, I am trying a bulk insert using SQL Server 2000, with this code:

bulk insert xyz from 'D:\authors.txt' WITH (FIELDTERMINATOR = ',')

but it gives me an error saying:

Could not bulk insert because file 'D:\authors.txt' could not be opened. Operating system error code 21 (error not found).

I checked file security; it has been given full control of the file. Can anyone give me an idea about operating system error code 21 (error not found)? Thanks.
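Operating system error 21 is "the device is not ready": BULK INSERT opens the path on the server, not on your workstation, so 'D:\authors.txt' has to exist on the server's own D: drive. If the file lives on your machine, a UNC path the server can reach is the usual fix (a sketch; the share name is hypothetical, and the SQL Server service account needs read permission on it):

BULK INSERT xyz
FROM '\\myworkstation\share\authors.txt'
WITH (FIELDTERMINATOR = ',')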


Something Like A Bulk Insert...

Aug 28, 2007

Hi, I have an SP that inserts records in one table and then calls another insert SP on a second table. The first table is like a master table and the second is like a child table. After inserting the right record in the master table, I have to insert some records in the child table. These records differ from each other only by two of about ten fields, so what I want is not to call the second SP X times, but only one time. Is it possible?

Example:
Table1: Id (identity), Desc;
Table2: Id (identity), Id_table1, Id_TableX, Num, Field1, Field2, ... Field10.

In Table2 only Id_TableX and Num change every time; the others are all the same (for one record in Table1). How can I do this? Probably with a bulk insert and a bulk update? But can I make a bulk xxx without a file?
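It is possible without a file: a single INSERT ... SELECT can write all the child rows at once if the varying (Id_TableX, Num) pairs are assembled inline. A sketch using the post's table names (the literal pairs are hypothetical):

-- After inserting the master row:
DECLARE @NewId int
SET @NewId = SCOPE_IDENTITY()

-- One INSERT for all child rows; only Id_TableX and Num vary.
INSERT INTO Table2 (Id_table1, Id_TableX, Num, Field1, Field2 /* ... Field10 */)
SELECT @NewId, v.Id_TableX, v.Num, @Field1, @Field2
FROM (SELECT 10 AS Id_TableX, 1 AS Num
      UNION ALL SELECT 20, 2
      UNION ALL SELECT 30, 3) AS v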


BULK INSERT

Dec 28, 2007

Hi friends, I am using the BULK INSERT command with my table name, but I am facing an error. So is it possible to use BULK INSERT with a temporary table or a table variable? Please help me.
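BULK INSERT cannot target a table variable, but it can target a temporary table, which usually serves the same purpose (a sketch with hypothetical names):

CREATE TABLE #import (col1 varchar(50) NULL, col2 varchar(50) NULL)

BULK INSERT #import
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')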


Bulk Insert ?

Dec 20, 2000

How can I do a bulk insert without the transaction being logged?
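You can't switch logging off entirely, but you can get minimal logging: put the database in the bulk-logged (or simple) recovery model and take a table lock during the load. A sketch, using SQL Server 2000 syntax and hypothetical names (on 7.0 the equivalent knob is the 'select into/bulkcopy' option; minimal logging also depends on conditions such as an empty target or no indexes):

ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED

BULK INSERT dbo.Target
FROM 'C:\data\input.txt'
WITH (TABLOCK, FIELDTERMINATOR = ',')

ALTER DATABASE MyDb SET RECOVERY FULL  -- restore full logging afterwards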


Does DTS Do Bulk Insert?

Sep 15, 2000

Hi,

I want to move data from a text file to a SQL table. After DTS creates the table, does it use Bulk Insert to copy the data from the file to the table, or BCP?

Thanks,
Judith


Bulk Insert

May 19, 2000

I am trying to do a bulk insert from a data file into a linked Access database. When I run the query I get the error message:
'Server: Msg 4801, Level 16, State 81, Line 1
Bulk_main: The opentable system function on BULK INSERT table failed.'
Not sure what the problem is, because BOL just says to check Microsoft.com for updated error message information. However, when I went to the site there was no updated information. Has anyone else seen this error? If so, have you figured out the problem? Any help would be greatly appreciated. Thanks.


Jim


Bulk Insert Vs BCP

Sep 12, 2000

Hi,

Which is the faster method -- BULK INSERT or BCP?

And I assume DTS has no problem handling either one? The input files are between 400 meg and 1 gig.

Thanks,
Judith


Bulk Insert ?

Dec 5, 2000

How does one perform a bulk insert?
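The minimal form (a sketch; path and names are hypothetical, and the file must be visible from the server):

BULK INSERT dbo.TargetTable
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')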


BCP-vs-Bulk Insert-vs-DTS

Dec 28, 1999

We've been using Bulk Insert to load our tables. But, recently we encountered this error message "There is insufficient system memory to run this query. [SQLSTATE 42000] (Error 701). The step failed."

Then, our DBA suggested that we use BCP. It seems to work fine until the file size exceeds 30MB. This is the message I get "Starting copy... 0 rows copied. Network packet size (bytes): 4096 Clock Time (ms.): total 14984. Process Exit Code 0. The step succeeded." Is this a known problem?

Then, we decided to use DTS. DTS seems to be able to handle any file size but it's a slower process than the other 2. Any suggestions?


Bulk Insert

Mar 14, 2000

Can anyone help me with how to do a bulk insert from a text file into SQL Server, some facility similar to Oracle's SQL*Loader? Thank you in advance.


Bulk Insert

Jun 13, 2000

howdy:

I keep getting the following error msg:
<<Bulk insert data conversion error (truncation) for row 17, column 6 (col6).>>

I created a test file of 100 rows in Excel, saved it as tab-delimited, and then ran the following from Query Analyzer:

bulk insert ssdi..dmf FROM 'C:\dev\pb6\ccs\ssdi\test.txt'
WITH
(DATAFILETYPE='char',
FIELDTERMINATOR='\t',
ROWTERMINATOR='\n',
KEEPNULLS,
MAXERRORS=10,
TABLOCK)


here's my table:
create table dmf
( col1 char(9) not null,
col2 char(12) null ,
col3 char(10) null ,
col4 char(8) null ,
col5 char(8) null ,
col6 char(12) null ,
col7 char(12) null ,
constraint pk_dmf primary key (col1))  -- col1 holds the SSN, hence the key

note:
I don't always have values in the last 2 columns
I installed SP2.

TIA
deanna
