I am using the SQL bulk copy method in ASP.NET 2.0 to load a large amount of data into SQL Server. Everything works fine, except my transaction log fills up very quickly. Is there any way to keep the bulk load from writing full details to the transaction log?
Thanks in advance
Anz
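A common mitigation (a sketch only, assuming a database named MyDb and that the minimal-logging conditions are met, i.e. a table lock is taken and the target is a heap or an empty table) is to switch the database to the BULK_LOGGED recovery model around the load:

ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- run the bulk load here, e.g. BULK INSERT ... WITH (TABLOCK),
-- or SqlBulkCopy constructed with SqlBulkCopyOptions.TableLock

ALTER DATABASE MyDb SET RECOVERY FULL;

-- take a log backup afterwards to keep the log backup chain intact
BACKUP LOG MyDb TO DISK = 'C:\Backups\MyDb_log.trn';

This does not stop logging entirely (nothing does for a user database), but minimally logged bulk operations write far less to the log.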
I am working with SQL Server 7.0, and I need to transfer a bulk of data (millions of rows) from a remote Rdb database to SQL Server. What is the best approach other than using a comma-delimited flat file? Is there a way to create a database link and then use a copy script in SQL to copy the data directly? I would appreciate any help. Thank you.
I have installed an MSDE engine (SQL Server 7 Desktop) on an NT 4.0 SP6 workstation. I have around 17 MB of data I need to transfer from SQL Server 7 on an NT 4.0 SP6 server. I can't get anything to work. bcp complains, to the effect that there is no server attribute, which is true because it is a workstation. If I remove the -S option, it complains that I did not designate the server. This bcp script works just fine from SQL Server 6.5 on servers. DTS puts up a message that I do not have a license to use DTS with the SQL Server Desktop edition. I tried transferring the database to Access and then using the Upsizing Wizard, but the database is too big. I tried a bulk insert, but I got the vague message 'OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.' What is the best way to do this? Why am I having trouble with bcp and BULK INSERT?
In short, does the "Transfer SQL Server Objects Task" support distributed transactions?
I am trying to use a "Transfer SQL Server Objects Task" in a container, using a transaction on the container. The task is set to support the transaction. It is set up to copy table data from several tables from a non-domain server (SQL Server 2000) to a domain-based server (SQL Server 2005). I get an error stating, "This task can not participate in a transaction".
I am wondering if it means exactly what it says: that this task in SSIS can't participate in a transaction at all. Or does it mean that it won't in this scenario for some reason? I attempted a simple copy of data from MSSQL 2005 to MSSQL 2005 (same server) and the task still failed. MSDTC appears to be running properly on my machine (I can do a simple distributed transaction across a linked server to the 2000 server in Query Analyzer (QA)). Also, MSDTC appears to be working on both servers, based on distributed transaction query tests in QA.
Here's the error info...
SSIS package "Development BusinessContacts and Products Migration.dtsx" starting. Information: 0x4001100A at Copy BusinessContacts Data: Starting distributed transaction for this container. Error: 0xC002F319 at Copy BusinessContacts database table data 1, Transfer SQL Server Objects Task: This task can not participate in a transaction. Task failed: Copy BusinessContacts database table data 1 Information: 0x4001100C at Copy BusinessContacts database table data 1: Aborting the current distributed transaction. Information: 0x4001100C at Copy BusinessContacts Data: Aborting the current distributed transaction. SSIS package "Development BusinessContacts and Products Migration.dtsx" finished: Failure. The program '[4700] Development BusinessContacts and Products Migration.dtsx: DTS' has exited with code 0 (0x0).
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1

update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
I am using Transfer Manager in SQL 6.5 to copy a database and all objects with data to another server. Transfer Manager is not recreating all the stored procedures. This even happens when I use it to copy to another database on the same server. Any ideas?
Is it possible/advisable, when transferring very large amounts of data from server to server, to transfer the data to a new table first, and then alter the new table to add the indexes, defaults, etc. based on the original table?
If it is, which flow item would be used to transfer/alter the indexes and defaults?
I'm very new to ssis so the more detail you can give the better.
I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.
For reading the data I have written dynamic T-SQL based on some conditions, and based on that it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.
I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting the CommandTimeout property to 0.
If I use 0 for CommandTimeout, then I get: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error:

Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.

Does anybody have an idea?!
I have a sequence container; in my sequence container I have a script task to drop the existing tables. This sequence container is connected to another sequence container. All of these are in a For Each Loop container. When I run the package it works fine for the first loop, but it gives me an error on the second execution.
The message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
I have to update a field within a table of 60 records or so. Each record has a different field value; the type is varchar. I was given an Excel file with the field values and was thinking of a bulk update, along the lines of a bulk insert, but I don't recall that being possible.
Is the only way to create a table, bulk insert into it, and then merge the two tables together with an UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
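For what it's worth, the staging-table route is usually quick to write. A minimal sketch, assuming a hypothetical target table MyTable keyed on ID and the Excel data saved as a tab-delimited text file:

CREATE TABLE #Staging (ID int, NewValue varchar(100));

BULK INSERT #Staging
FROM 'C:\Data\values.txt'   -- hypothetical path to the exported Excel data
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

UPDATE t
SET t.MyField = s.NewValue
FROM dbo.MyTable AS t
INNER JOIN #Staging AS s ON s.ID = t.ID;

DROP TABLE #Staging;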
I have a table containing 8 million records. I need to replace 2 million of these records with a scaled-down query that goes something like:

SELECT 1, ShareholderID, Assets1 FROM MyTable   -- yields approx. 200,000 records
SELECT 2, ShareholderID, Assets2 FROM MyTable   -- yields approx. 200,000 records
. . .
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9 FROM MyTable   -- yields approx. 200,000 records
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way: SELECT the 6 million records that don't need to be deleted into a #TempTable; use the statements above to SELECT into the same #TempTable; DROP and recreate the original table; SELECT the 6 + 2 million records INTO the original table.
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
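One alternative (a sketch only; the filter and column names here are hypothetical) is to build the replacement in a new permanent table, which SELECT INTO can do with minimal logging under the SIMPLE or BULK_LOGGED recovery model, and then swap it in by renaming instead of dropping and recreating the original:

-- 1) Copy the ~6 million unchanged rows into a new table.
SELECT AssetCode, ShareholderID, Assets
INTO MyTable_New
FROM MyTable
WHERE ...;   -- rows that stay unchanged

-- 2) Append the ~2 million rewritten rows.
INSERT INTO MyTable_New SELECT 1, ShareholderID, Assets1 FROM MyTable;
-- ... repeat for the other nine SELECTs ...

-- 3) Rebuild indexes on MyTable_New, then swap the tables.
EXEC sp_rename 'MyTable', 'MyTable_Old';
EXEC sp_rename 'MyTable_New', 'MyTable';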
I have been trying to run Transfer Manager to transfer all of the data from the production database on one server to a test database on another server (to refresh it). In order to make sure it runs on the server, I have been scheduling it under EM, and I am pointing to the log on the destination server in the EM Transfer panel.
For some reason, I am getting the following message in the destination server error log:
99/02/16 10:24:41.42 ods   Error : 17824, Severity: 10, State: 0
99/02/16 10:24:41.42 ods   Unable to write to ListenOn connection '\\.\pipe\sql\query', loginname 'sa', hostname 'TEMP09'.
99/02/16 10:24:41.42 ods   OS Error : 232, The pipe is being closed.
99/02/16 10:24:41.42 spid17   Error : 1608, Severity: 21, State: 2
99/02/16 10:24:41.42 spid17   A network error was encountered while sending results to the front end. Check the SQL Server errorlog for more information.
I checked the event viewer error log and see no messages for today.
Can anyone advise me on what I need to do for this to run successfully?
Thanks. Any information furnished will be greatly appreciated.
I'm trying to use BULK INSERT for the first time and I'm getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, so the terminator is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
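For comparison, a minimal non-XML format file sketch for a two-field, tab-delimited, CRLF-terminated file (the second field name and the widths are assumptions; note that with a character-format data file, SQLCHAR is normally used even when the target column is nvarchar, while SQLNCHAR is for Unicode data files; 9.0 assumes SQL Server 2005):

9.0
2
1   SQLCHAR   0   6     "\t"     1   ASXCode      SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   "\r\n"   2   OtherField   SQL_Latin1_General_CP1_CI_AS

A terminator mismatch on the last field (e.g. "\n" instead of "\r\n") is a common cause of the truncation error on column 1 of the next row.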
Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know about some considerations:
- performance: how it compares with T-SQL's "BULK INSERT ..." and the bcp utility
- SQL Server resource usage: when running a memory-based bulk copy, the influence on the server's resources
- server-side action (behavior): when the server is busy, does delayed update mean the IRowsetFastLoad::Commit(true) method can return right after the insert?
- row count: is there a limit on the number of rows that can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit()?
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
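One workaround (a sketch, assuming the file's fields line up with the table's non-identity columns; all names here are hypothetical) is to load through a view that omits the identity column, since BULK INSERT can target a view:

CREATE TABLE dbo.ImportTarget
(
    RowID int IDENTITY(1,1),   -- identity column, not present in the text file
    Col1  varchar(50),
    Col2  varchar(50)
);
GO
CREATE VIEW dbo.ImportTarget_NoId
AS
SELECT Col1, Col2 FROM dbo.ImportTarget;
GO
BULK INSERT dbo.ImportTarget_NoId
FROM 'C:\Data\import.txt'      -- hypothetical path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');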
I have a full database backup up to the previous day and a transaction log file with today's transactions. My database has crashed. I have restored the previous day's full backup, but I am having difficulty restoring today's transactions from today's transaction log. What are the steps to restore the full database backup and one day's transaction log file? Note: there is no differential database backup and no transaction log backup.
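For reference, the usual sequence (a sketch, assuming the database is named MyDb and the crashed database's log file is still intact) is to back up the tail of the log first, restore the full backup WITH NORECOVERY, then apply the log WITH RECOVERY:

-- 1) Capture the tail of the log from the damaged database.
BACKUP LOG MyDb TO DISK = 'C:\Backups\MyDb_tail.trn' WITH NO_TRUNCATE;

-- 2) Restore the full backup without recovering the database.
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb_full.bak' WITH NORECOVERY;

-- 3) Apply the log backup and bring the database online.
RESTORE LOG MyDb FROM DISK = 'C:\Backups\MyDb_tail.trn' WITH RECOVERY;

The key point is that once the database is restored WITH RECOVERY, no further log can be applied, so the full restore must use NORECOVERY.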
I'm receiving the below error when trying to implement Execute SQL Task.
"The ROLLBACK TRANSACTION request has no corresponding BEGIN TRANSACTION." This error also happens on COMMIT as well and there is a preceding Execute SQL Task with BEGIN TRANSACTION tranname WITH MARK 'tran'
I know I can change the TransactionOption property from "Supported" to "Required"; however, I want to mark the transaction. I was copying the way the Import/Export Wizard does it, but I'm unable to figure out why its package works and mine doesn't.
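For what it's worth, a common cause of this symptom (an assumption about your setup, not a confirmed diagnosis): each Execute SQL Task may get a different pooled connection, so the COMMIT runs on a different session than the BEGIN and finds no open transaction. Setting RetainSameConnection=True on the connection manager keeps all the tasks on one session:

-- Execute SQL Task 1 (connection manager has RetainSameConnection = True)
BEGIN TRANSACTION tranname WITH MARK 'tran';

-- Execute SQL Task 2: the actual work (hypothetical statement)
UPDATE dbo.SomeTable SET SomeCol = 1;

-- Execute SQL Task 3: commits on the same session
COMMIT TRANSACTION tranname;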
I created a calculated measure in the cube, something like this: ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],[Measures].[Transaction Amount]), to get only spend transactions. Now I want to slice this measure by the same hierarchy to find the amount distribution across the different transaction types under the spend category. But the query behaves as if the measure has no relation to the hierarchy.
You can think of this as the query below:

WITH MEMBER SPEND AS
    ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],
     [Measures].[Transaction Amount])
SELECT
    NON EMPTY {SPEND} ON 0,
    NON EMPTY ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent]) ON 1
FROM [CUBE]
Hi~, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (i.e., the maximum value of nCount)?

for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}

2. In the above code sample, is there a method for inserting a prepared array all at once (the BulkData array directly, rather than a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?

BULK INSERT database_name.schema_name.table_name
FROM 'data_file'
WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?

// CoCreateInstance(...);
// Data source
// Create session
How do I make use of BEGIN TRANSACTION and COMMIT TRANSACTION in SSIS?
As I am not able to commit changes due to certain update commands, I want to explicitly write BEGIN and COMMIT statements. But when I make use of BEGIN and COMMIT in an OLE DB Command stage, it throws an error as follows:
HRESULT: 0x80004005
Description: "Syntax error or access violation".
It's definitely not a syntax error, as I executed it in SQL Server. Also, when I use it in an Execute SQL Task outside the data flow container it doesn't throw any error, but that task still doesn't serve my purpose of saving/committing the update changes in the database.
I am having some problems with SSIS transactions, even though I tried to imitate the concept that Jamie presented at http://www.sqlservercentral.com/columnists/jthomson/transactionsinsqlserver2005integrationservices.asp. My workflow is as follows:
*********************************
For Each ADO.Record in Oracle (transaction=not supported)
    If (Certain_Field_Value = 'A')
        Lookup Data in SQL DB with values from Oracle (transaction=not supported)
        Do Sequence A (start a transaction, transaction=required)
            INSERT/UPDATE some records in SQL DB (transaction=supported)
        Finish Sequence A (transaction should stop here)
        UPDATE Oracle DB (Execute SQL Task, transaction=not supported)
    If (Certain_Field_Value = 'B')
        Lookup Data in SQL DB with values from Oracle (transaction=not supported)
        Do Sequence B (start a transaction, transaction=required)
            INSERT/UPDATE some records in SQL DB (transaction=supported)
        Finish Sequence B (transaction should stop here)
        UPDATE Oracle DB (Execute SQL Task, transaction=not supported)
    If (Certain_Field_Value = 'C')
        ------------
        ------------
End ForEach Loop
*************************************

My requirements are that I want a separate transaction for each of Sequence A, B, C, etc. If the Sequence A transaction fails, the others should still continue, each in its own transaction. But I am getting an OLE DB error in the next task (e.g., in the Certain_Field_Value = 'B' branch) on "Lookup Data in SQL DB with values from Oracle"; the error message is ".......Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.". What is it that I am doing wrong?
Regards,
KyawAM
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction.
[OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator.]
OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a].
I tried http://support.microsoft.com/kb/839279, but it didn't help.
I am executing a stored procedure, something like this:

CREATE PROCEDURE TEST @test1 ..., @test2 ...
AS
BEGIN TRANSACTION ONE
    SELECT ...
    UPDATE ...
    ...
    TRUNCATE ...
    -- etc., and a lot of SELECT, UPDATE, and DELETE statements like this
    ...
COMMIT TRANSACTION ONE
Will the block of code I have between BEGIN TRANSACTION and COMMIT TRANSACTION be rolled back if it encounters any errors in the SELECT, UPDATE, or DELETE statements within transaction ONE? If not, how do I roll back if it encounters any errors between the BEGIN TRANSACTION and the COMMIT TRANSACTION?
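No, it will not roll back automatically in every case; by default many errors abort only the failing statement and the batch carries on. A sketch of the classic SQL Server 2000-era pattern is to check @@ERROR after each statement (alternatively, SET XACT_ABORT ON at the top makes most run-time errors roll back the whole transaction); table names here are hypothetical:

BEGIN TRANSACTION ONE

UPDATE dbo.SomeTable SET SomeCol = 1        -- hypothetical statement
IF @@ERROR <> 0 GOTO ErrHandler

DELETE FROM dbo.OtherTable WHERE Done = 1   -- hypothetical statement
IF @@ERROR <> 0 GOTO ErrHandler

COMMIT TRANSACTION ONE
RETURN

ErrHandler:
ROLLBACK TRANSACTION ONE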
----------------------------------------------
Dim conn, connectionstring

connectionstring = "Provider=SQLOLEDB.1; Data Source=blahblah; Initial Catalog=abc; User Id=osss; Password=xxx"

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open connectionstring
----------------------------------------------
It's basically just setting up the connection to the SQL Server. It works on my other pages, but on a few pages the last line (conn.Open connectionstring) triggers this error:
"Microsoft OLE DB Provider for SQL Server error '8004d00a'
New transaction cannot enlist in the specified transaction coordinator. "
I've never encountered this error before and I couldn't think of any solution... please help!!!
I had thought that if any statement failed within a BEGIN TRAN .. COMMIT TRAN block, then all the statements would be rolled back. But I am seeing different behavior (SQL Server 2000 8.00.2039).
For instance, run these statements to set up a test:

--DROP TABLE testTable1
--DROP TABLE testTable2
CREATE TABLE testTable1 (f1 varchar(1))
CREATE TABLE testTable2 (f1 varchar(1))
CREATE UNIQUE INDEX idx_tmptmp ON testTable1 (f1)
insert into testTable1(f1) values ('a')
So table testTable1 has a unique index on it..
Now try to run these statements:
--DELETE FROM testTable2
BEGIN TRANSACTION
insert into testTable1(f1) values ('a')
insert into testTable2(f1) values ('a')
COMMIT TRANSACTION
SELECT * FROM testTable2
The first insert fails on the unique index, but the second insert succeeds. Shouldn't the second insert roll back? How can I make the two operations atomic?
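For reference (building on the test tables above): a duplicate-key violation is a statement-level error by default, so the batch keeps running and the COMMIT still commits the second insert. SET XACT_ABORT ON makes such an error abort the batch and roll back the whole transaction:

SET XACT_ABORT ON
BEGIN TRANSACTION
insert into testTable1(f1) values ('a')   -- duplicate key: batch aborts, transaction rolls back
insert into testTable2(f1) values ('a')   -- never executes
COMMIT TRANSACTION

SELECT * FROM testTable2   -- now returns no rows

The other route is to check @@ERROR after each statement and issue ROLLBACK yourself.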