Batch Insert 10000 Rows At A Time And Commit

Jul 20, 2005

I want to insert 10000 rows at a time and commit in SQL Server. Is there a way to do it if the source tables have no ID fields?
What would be the most efficient method?

Thanks


Ajay
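
One approach, sketched below, assumes the source has some column (Col1 here) whose values uniquely identify a row - without that there is no way to tell which rows have already been copied. Table and column names are placeholders; SET ROWCOUNT caps each pass, and each pass gets its own commit:

DECLARE @rows int
SET @rows = 1
SET ROWCOUNT 10000                    -- cap every statement below at 10000 rows

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION

    INSERT INTO dbo.Target (Col1, Col2)
    SELECT s.Col1, s.Col2
    FROM dbo.Source s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.Col1 = s.Col1)

    SET @rows = @@ROWCOUNT            -- 0 when nothing is left to copy

    COMMIT TRANSACTION                -- commit after each 10000-row batch
END

SET ROWCOUNT 0                        -- restore the default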

View 1 Replies



Problem With Getdate() In Transaction Takes The Insert Time Instead Of The Commit Time

Nov 12, 2007

Hi,

We need to select rows from the database that have been recently inserted/updated. We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred). Triggers on the primary table are used to populate the update table.

If we insert or update the primary table in a transaction, we would expect the datetime of the insert/update to be the commit time; however, it seems that getdate() is evaluated when the insert/update statement runs rather than at the commit. This causes problems: we select rows based on LAST_UPDATE, and a commit may occur later, but the earlier insert timestamp is what gets saved to the database, so we miss that update.

We would like to know if there is any way to tell SQL Server not to evaluate getdate() until the commit, or any other way to get the commit to create the correct timestamp.

We are using default isolation level. We have tried using getdate(), current_timestamp and even {fn Now()} with the same results. SQL Queries that reproduce the problem are provided below:


/* Different functions to get current timestamp - all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp ) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
declare @time datetime
select @time = getdate()

insert into COMMIT_TEST_UPDATE (PKEY,LAST_UPDATE)
select PKEY, @time
from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
declare @time datetime
select @time = getdate()

update COMMIT_TEST_UPDATE
set LAST_UPDATE = @time
from COMMIT_TEST_UPDATE, deleted, inserted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur, so we don't log them; we just remove the rows from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
if ( select count(*) from deleted ) > 0
begin
delete COMMIT_TEST_UPDATE
from COMMIT_TEST_UPDATE, deleted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
end
GO
/* Delete any previous inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table - we see that the timestamp is 10 seconds older than the commit, in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT as it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE


Any help would be appreciated. We understand we could make changes to the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missing deals (assuming the commit time is larger than some artificial time window).

Regards,

Mark
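
One possible direction, sketched here rather than a drop-in fix: getdate() cannot be deferred to commit time, but a rowversion watermark lets the reader avoid missing rows that belong to still-open transactions. This assumes COMMIT_TEST_UPDATE carries the ROW_VERSION rowversion column from the commented-out definition above, that READER_STATE is a hypothetical one-row table holding the last version read, and that MIN_ACTIVE_ROWVERSION() is available on your SQL Server version:

DECLARE @floor binary(8), @ceiling binary(8)

SELECT @floor = LAST_VERSION_READ FROM dbo.READER_STATE   -- hypothetical watermark table
SET @ceiling = MIN_ACTIVE_ROWVERSION()                    -- lowest version that could still be uncommitted

SELECT u.PKEY, u.LAST_UPDATE
FROM dbo.COMMIT_TEST_UPDATE u
WHERE u.ROW_VERSION >= @floor
AND u.ROW_VERSION < @ceiling         -- anything at or above the ceiling is picked up on a later pass

UPDATE dbo.READER_STATE SET LAST_VERSION_READ = @ceiling

Because rows at or above the ceiling are simply deferred to the next pass, a long-running transaction delays its rows rather than losing them.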

View 8 Replies View Related

Insert Rows Into Four Tables At One Time

Nov 14, 2007

Hello, I have 4 tables: Customer, Customer_personal_info, Customer_Financial_info, and Customer_Other_info. The Customer table has a primary key, CustomerID, related to every other table with a foreign key. I want to insert data into the four tables using one form with TABs. I created a class and stored procedures to insert a row for each table. How do I execute all four classes using beginTrans-commitTrans-Rollback-EndTrans? Thanking you,
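
A minimal sketch of the database side, using SQL Server 2005 TRY/CATCH; the parameter and column lists are invented placeholders, and CustomerID is assumed to be an IDENTITY column. One procedure wraps all four inserts so they commit or roll back together:

CREATE PROCEDURE dbo.InsertCustomerAllInfo
    @Name varchar(100), @PersonalInfo varchar(100),
    @FinancialInfo varchar(100), @OtherInfo varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRY
        BEGIN TRANSACTION

        INSERT INTO Customer (Name) VALUES (@Name)

        DECLARE @CustomerID int
        SET @CustomerID = SCOPE_IDENTITY()   -- the new primary key for the child rows

        INSERT INTO Customer_personal_info  (CustomerID, Info) VALUES (@CustomerID, @PersonalInfo)
        INSERT INTO Customer_Financial_info (CustomerID, Info) VALUES (@CustomerID, @FinancialInfo)
        INSERT INTO Customer_Other_info     (CustomerID, Info) VALUES (@CustomerID, @OtherInfo)

        COMMIT TRANSACTION
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION   -- any failure undoes all four inserts
        DECLARE @msg nvarchar(2048)
        SET @msg = ERROR_MESSAGE()
        RAISERROR (@msg, 16, 1)                   -- re-raise so the caller sees the error
    END CATCH
END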

View 1 Replies View Related

Help With Cursor To Insert 100 Rows At A Time

Jan 17, 2007

Hi all,

Can one of you help me with using a cursor that would insert only 100 rows at a time from source table 1 to target table 2. I am not able to loop beyond the first 100 rows.

Here is what I have till now:


CREATE procedure Insert100RowsAtaTime
AS
SET NOCOUNT ON

declare @Col1 int
declare @Col2 char(9)
DECLARE @RETURNVALUE int
DECLARE @ERRORMESSAGETXT varchar(510)
DECLARE @ERRORNUM int
DECLARE @LOCALROWCOUNT int

declare Insert_Cur cursor local fast_forward
FOR
SELECT top 100 Col1,Col2 from Table1
WHERE Col1 not in ( SELECT Col1 /* Col1 is PK. This statement is used to prevent the same rows from being inserted in Table 2*/
from Table2)

set @RETURNVALUE = 0
set @ERRORNUM = 0

BEGIN

open Insert_Cur
fetch NEXT from Insert_Cur into @Col1, @Col2
while (@@FETCH_STATUS = 0)
begin /* begin/end so the error check and the next fetch stay inside the loop */
insert into Table2 (Col1,Col2) select @Col1,@Col2

SELECT @ERRORNUM = @@ERROR, @LOCALROWCOUNT = @@ROWCOUNT
IF @ERRORNUM = 0
BEGIN
IF @LOCALROWCOUNT >= 1
BEGIN
SELECT @RETURNVALUE = 0
END
ELSE
BEGIN
SELECT @RETURNVALUE = 1
RAISERROR ('INSERT FAILS',16, 1)
END
END
ELSE
BEGIN
SELECT @ERRORMESSAGETXT = description FROM [master].[dbo].[sysmessages]
WHERE error = @ERRORNUM /* @@ERROR was reset by the assignment above, so use the saved value */
RAISERROR (@ERRORMESSAGETXT, 16, 1)
SELECT @RETURNVALUE = 1
END

fetch NEXT from Insert_Cur into @Col1, @Col2
end

close Insert_Cur
deallocate Insert_Cur

RETURN @RETURNVALUE
END
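
For comparison, a minimal set-based sketch (SQL Server 2005 and later) that drops the cursor entirely; each pass copies up to 100 not-yet-copied rows and commits them:

DECLARE @done bit
SET @done = 0

WHILE @done = 0
BEGIN
    BEGIN TRANSACTION

    INSERT TOP (100) INTO Table2 (Col1, Col2)
    SELECT s.Col1, s.Col2
    FROM Table1 s
    WHERE s.Col1 NOT IN (SELECT Col1 FROM Table2)   -- Col1 is the PK, as in the original

    IF @@ROWCOUNT < 100 SET @done = 1               -- last (possibly partial) batch

    COMMIT TRANSACTION
END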

View 4 Replies View Related

Insert Two Rows Into Two Tables At The Same Time From A Formview

Jul 31, 2007

I have a formview that uses a predefined dataset based on a cross-table query. When the formview is in insert mode I need to insert the data into two separate tables. Essentially I have tblPerson and tblAddress, and my formview is capturing username, password, name, address line 1, address line 2, etc. I presume I need to use a stored procedure to insert a row into tblPerson and then insert a row into tblAddress. This is easy enough to do, but the tables use RI, and tblPerson has an incremental primary key which needs to be inserted into a foreign key field in my address row. How do I do this? I'm using SQL Server.
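
A minimal sketch of such a procedure, with invented column names: SCOPE_IDENTITY() captures the incremental key generated by the first insert so it can be used as the foreign key in the second.

CREATE PROCEDURE dbo.InsertPersonWithAddress
    @UserName varchar(50), @Password varchar(50), @Name varchar(100),
    @Address1 varchar(100), @Address2 varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRANSACTION

    INSERT INTO tblPerson (UserName, Password, Name)
    VALUES (@UserName, @Password, @Name)

    DECLARE @PersonID int
    SET @PersonID = SCOPE_IDENTITY()    -- the incremental primary key just created

    INSERT INTO tblAddress (PersonID, Line1, Line2)
    VALUES (@PersonID, @Address1, @Address2)

    COMMIT TRANSACTION
END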

View 3 Replies View Related

Transactions - Problem With Commit 2nd Time Round

May 27, 2006

Hope this is the right forum. I'm using Transaction=required on a page which inserts on multiple tables, 2 of which have a foreign key relationship. All works fine as long as I don't input erroneous data. However, I have a range check in the code, and if the range is exceeded, an exception is thrown and the transaction fails using ContextUtil.SetAbort(). I then correct the data, try to save, and get a foreign key constraint error. I have debugged, and the primary key table seems to be carrying out the insert OK (I'm retrieving the key at that point, and can see it). But when I use the key in the child table it fails and cites the foreign key relationship.
I suspect that having the same data for the primary key table the 2nd time around means it doesn't think it has to commit?
Grateful for any help. I'm using SQL Server 2005 by the way.

View 1 Replies View Related

Insert 9900 Records Out Of 10000 Records Using DTS

Nov 28, 2005

I tried to port 10000 records using DTS. After porting 9900 records it hit an error and exited without any result. But I want to keep the records that were ported before the error occurred. Please help me.

View 1 Replies View Related

Maximum Insert Commit Size

Oct 11, 2006



Hi, All,



if I set the "Maximum insert commit size" to 10 ( 0 is the default) in a OLE destination,

what does the 10 means? 10 records or 10 MB/KB of data?


Thanks

View 4 Replies View Related

Recommendation For Max Insert Commit Size

Jul 10, 2007

The DBA is not around and I would like to see if someone had a good recommendation on what the Maximum insert commit size (MICS) should be for an OLE DB Destination where the default of ZERO is not being used.

I want to use Fast Load and I want to use Redirect Row to catch the errors. I just performed a test where the OLE DB Destination was NOT set to Fast Load - it took FOREVER and I cannot have this kind of performance.

I know that this may be totally dependent on what is being inserted, but is there any problem with just setting this value to, say, 800,000?

The destination SQL database's recovery mode is set to SIMPLE as it is not a transactional database.

Suggestions?? Thx

View 4 Replies View Related

OLE DB Destination - Fast Load With Maximum Insert Commit Size

Sep 8, 2006

I'm seeing some strange behavior from the OLE DB Destination when using the "fast load" access mode and setting the "Maximum insert commit size".

When I do not set the "Rows per batch" or the "Maximum insert commit size", the package I'm working with inserts 123,070 rows using a single "insert bulk" statement. The data seems to flow through the pipeline until it gets to the OLE DB Destination and then I see a short pause. I'm assuming the pause is from the "insert bulk" statement handling all of the rows at once.

When I set the "Rows per batch" option but leave the "Maximum insert commit size" alone, I generally see the same behavior -- a single "insert bulk" statement that handles all 123,070. In this case, however, the "insert bulk" statement has a "ROWS_PER_BATCH" option appended to the statement that matches the "Rows per batch" setting. This makes sense. I'm assuming the "insert bulk" then "batches" the rows into multiple insert statements (although I'm unsure of how to confirm this). This version of the "insert bulk" statement appears to run in about the same time as the case above.

When I set the "Maximum insert commit size" option and leave the "Rows per batch" statement alone, I see multiple "insert bulk" statements being executed, each handling the lower of either the value I specify for the "Maximum insert commit size" or the number of rows in a single buffer flowing through the pipeline. In my testing, the number of rows in a buffer was 9,681. So, if I set the "Maximum insert commit size" to 5,000, I see two "insert bulk" statements for each buffer that flows into the OLE DB Destination (one handling 5,000 rows and one handling 4,681 rows). If I set the "Maximum insert commit size" to 10,000, I see a single "insert bulk" statement for each buffer that flows into the OLE DB Destination (handling 9,681 rows).

Now the problem. When I set the "Maximum insert commit size" as described in the last case above, I see LONG pauses between buffers being handled by the OLE DB Destination. For example, I might see one buffer of data flow through (and be handled by one or more "insert bulk" statements based on the "Maximum insert commit size" setting), then see a 2-3 minute pause before the next buffer of data is handled (with its one or more "insert bulk" statements being executed). Then I might see a 4-5 minute pause before the next buffer of data is handled. The pause between the buffers being passed through the OLE DB Destination (and handled via the "insert bulk" statements) is sometimes shorter, sometimes longer.

Using Profiler, I don't see any other activity going on within the database or within SQL Server itself that would explain the pauses between the buffers being handled by the OLE DB Destination and the resulting "insert bulk" statements...

Can anyone explain what is going on here? Is setting the "Maximum insert commit size" a bad idea? What are the differences between it and the "Rows per batch" setting and what are the recommended uses of these two options to try to improve the performance of the insert (particularly when handling millions of rows)?

TIA for any thoughts or information...

Dave Fackler

View 8 Replies View Related

Batch Insert

Jan 23, 2004

I'm using ASP.NET to do a SELECT, and I want to insert all the results into a table that is stored locally. I can put a stored procedure on the local database but can't put one on the other DB.

How would I achieve this batch insert? Is it possible?

thanks
Mark

View 4 Replies View Related

Performance Issues And Tuning (Rows Per Batch, MICS, Etc.)

Mar 6, 2008

I built an SSIS package to pull a large amount of data (over 4 million records) from our legacy system into a SQL Server for analysis. As it stands the package takes something like 30+ hours to run. I would really like to trim down the time it is taking. The process runs monthly.

We're connecting to a Cache database via an ODBC connection and dumping to an OLE DB connection to the SQL Server. Is there something I can change in the rows per batch or maximum insert commit size to improve performance? It looks like the ODBC connection returns chunks of about 3000 rows at a time.

Is anyone familiar with Cache who might know where I can get the best-performing ODBC driver?

What about the table that I'm dumping the data to? I assume no (or very few) indexes is the way to go for speed of dropping the data into the table.

I don't feel that 30 hours is an acceptable amount of time for this process to run. Am I off base here?

Any tips would be appreciated.

View 5 Replies View Related

Batch Insert (OLAP)

Feb 7, 2005

Hi, can anyone help me with a skeleton script (a sample one) for running a bulk insert in batches? I need to do it in batches as the input data is huge.

The logic is that I have to insert through bcp into a fact table.

After that, the batches execute 50,000 records at a time; if any of the batches fails I need to identify it and rerun from that point onwards. This is an OLAP thing.
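
A minimal skeleton, assuming the input is a flat file; the path, table name, and delimiters are placeholders. BATCHSIZE makes each 50,000-row batch its own transaction, so a failure only rolls back the current batch, and the load can be resumed by raising FIRSTROW past the rows already committed:

BULK INSERT dbo.FactTable
FROM 'C:\load\fact_data.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,    -- commit every 50,000 rows
    FIRSTROW        = 1         -- raise this to resume after a failed batch
)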

View 6 Replies View Related

How To Speed Up Batch Insert MSSQL

Mar 20, 2008

Hi,
I am in the middle of writing a console application that acquires data from dynamic odbc connections and inserts the results into a MSSQL database. I'm currently using a datareader to generate multiple calls to a stored procedure and batch execute them by means of a simple counter; this works fine.
However, I'm a little concerned that it seems to take so long to execute the sql stored procs and was wondering if anyone may know of any methods to help speed it up either on the app side or sql, or both.
I had a quick word with our DBA, who spoke briefly about some kind of process whereby the application fires the request across to the database and carries on, leaving the db to queue the request or something to that effect. He ran away before I could get any sort of sense out of him.
 Any help greatly appreciated
 Thanks

View 4 Replies View Related

Bulk Insert And Batch Size

Mar 31, 2015

How do you prevent this: "If you import a very large number of rows, dividing the data into batches can offer advantages. After each batch completes, the transaction is logged. If, for any reason, a bulk-import operation terminates before completion, only the current transaction (batch) is rolled back." How can I make the whole load roll back, instead of only the current transaction (batch) rolling back?
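
One way, sketched with placeholder names: leave BATCHSIZE out and run the import inside an explicit transaction, so the whole file is a single unit of work and any failure rolls back every row:

BEGIN TRANSACTION

BULK INSERT dbo.TargetTable
FROM 'C:\load\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')   -- no BATCHSIZE: one transaction for the whole load

IF @@ERROR <> 0
    ROLLBACK TRANSACTION   -- undo the entire load
ELSE
    COMMIT TRANSACTION

The trade-off is that the transaction log has to hold the entire load, which is exactly why batching is the usual advice for very large files.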

View 1 Replies View Related

JDBC Batch Insert + GetGeneratedKeys

Apr 13, 2007

I'm using MS JDBC driver to connect to SQLServer 2005 and trying to perform batch insert. Here is the code i'm using:




Code Snippet

try {
// Assume we have a valid con
con.setAutoCommit(false);

// this is the simplified SQL for the purposes of this example
StringBuffer sql = new StringBuffer();
sql.append("INSERT INTO temp_batchOpt (");
sql.append("temp)");
sql.append(" VALUES (?)");

PreparedStatement insertStatement =
con.prepareStatement(sql.toString(),
SQLServerStatement.RETURN_GENERATED_KEYS);

for (int i = 0; i < 10; i++) {

insertStatement.setInt(1,3);
insertStatement.addBatch();
}

int[] r = insertStatement.executeBatch();

// the correct number of insertions are made
if(r != null)
for(int x: r)
System.out.println("inserted "+x);

SQLServerResultSet keyRS = (SQLServerResultSet)insertStatement.getGeneratedKeys();
while (keyRS.next()) {
int id = keyRS.getInt(1);
System.out.println("**filling id ="+id);
}
con.commit();
} catch (BatchUpdateException bue) {
bue.printStackTrace();
}
catch(SQLException sqle){
sqle.printStackTrace();
}
finally {
// close the connection...
}



I'm getting the following exception:




Code Snippet

inserted 1
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
inserted -2
inserted 1
com.microsoft.sqlserver.jdbc.SQLServerException: The statement must be run before the generated keys are available.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getGeneratedKeys(Unknown Source)
at com.x.U.m.Transformer.prefetchData(Transformer.java:285)
at com.x.U.m.Transformer.init(Transformer.java:51)
at com.x.U.m.XMLParse.main(XMLParse.java:215)



I can get getGeneratedKeys() to work fine with a single update (not using batch update). Is batch + getGeneratedKeys not supported?

TIA
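
One server-side workaround, sketched in T-SQL (SQL Server 2005 and later) and assuming the identity column is named id: insert all the rows in a single statement and let an OUTPUT clause return the generated keys as an ordinary result set, which JDBC can read via executeQuery instead of getGeneratedKeys():

INSERT INTO temp_batchOpt (temp)
OUTPUT INSERTED.id   -- streams back the generated key for every inserted row
SELECT 3
FROM (SELECT 1 AS n UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5
      UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 UNION ALL SELECT 10) AS ten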

View 4 Replies View Related

SQL 2012 :: Deleting Large Batches Of Rows - Optimum Batch Size?

Oct 16, 2015

In another forum post, a poster was deleting large numbers of rows from a table in batches of 50,000.

In the bad old days ('80s - '90s), I used to have to delete rows in batches of 500, then 1000, then 5000, due to the size of the transaction rollback segments (yes - Oracle).

I always found that increasing the number of deleted rows in a single statement/transaction improved overall process speed - up to some magic point, at which some overhead in the system began slowing the deletes down, so that deleting a single batch of 10,000 rows took more than twice as much time as deleting two batches of 5,000 rows each.

Are there good rule-of-thumb numbers (or even better, some actual statistics and/or explanations) for how many records should be deleted in a single transaction/statement for optimum speed? 50,000 - 100,000 - 1,000,000 or unlimited? Are there significant differences between 2008, 2012, and 2014?
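
There is no universal answer; it depends on row width, indexes, triggers, and log throughput, so the usual advice is to measure on your own system. A minimal harness for timing one candidate batch size (the table, filter, and the 50,000 figure are placeholders to vary between runs):

DECLARE @t0 datetime2 = SYSDATETIME()

WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dbo.BigTable
    WHERE purge_date < '20150101'

    IF @@ROWCOUNT = 0 BREAK   -- nothing left to delete
END

SELECT DATEDIFF(second, @t0, SYSDATETIME()) AS elapsed_seconds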

View 9 Replies View Related

DB Engine :: Batch Insert During Full Backup

Sep 9, 2015

I have a full backup scheduled at 12:00 AM ET and have a batch import at 11:55 PM (5 min before the full backup) which takes 30 min to complete. Will the backup cover the data which is being imported?

View 2 Replies View Related

ODBC Batch Insert Error On Windows 64 Bit Machine.

Jan 16, 2008

I'm facing a SQL batch insert issue on the Windows 64-bit (IA-64) platform. I'm using 'Column-Wise Binding' of parameters. The example program hosted on the Microsoft site (http://msdn2.microsoft.com/en-us/library/ms709287.aspx) returns the error SQL_NEED_DATA for batch insert on the Windows 64-bit platform. The same code works fine on the 32-bit platform. I'm using SQL Server 2000 as the database and have tried both the native SQL Server ODBC driver and the Merant 5.2 ODBC drivers. Any help on this will be greatly appreciated.

View 5 Replies View Related

It Takes A Long Time To Insert The First Record Each Time When The Program Start

Dec 15, 2006

I am using VS2005 (VB) to develop a PPC WM5.0 Program. And I am using SQLCE 3.0. My PPC Hardware is in 400MHz.

The question is: when the program tries to insert the first record into the sdf database after each time the program is started, it takes a long time. Does anyone know why, and how can I fix it?

I will load the whole database into a dataset when the program start and do all the "Insert", "Update", "Delete" in this dataset and fill it into database after each action.

        cn.Open()
        sda = New SqlCeDataAdapter(SQL, cn) 'SQL = Select * From Table
        scb = New SqlCeCommandBuilder(sda) 
        sda.Update(dataset)
        cn.Close()

I checked sda.Update(); it takes about 0.08s to fill one record into the database normally. But:

1. Start the PPC Program

2. Load DB into dataset

3. Create a ONE new record in dataset

4. Fill back to DB

When I take these four steps, the filling time is almost 1s or even more, every time!

Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to fill back a dataset which only had one record inserted while the program is running. (Even when all inserted records are exactly the same in data, just different in the integer key.)

However, when I give up the dataset and use the following code:

            cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn) ' I have built the insert SQL before (Insert Into Table values(XXXXXXXXXXXXXXX All field)

           cmd.CommandType = CommandType.Text
            cmd.ExecuteNonQuery()
            cn.Close()
            StartTime = Environment.TickCount

 I found that it is still the same that the first inserted record takes more time, but just about 0.2s. And the normal insert time is around 0.02s. It is 4 times faster!!!

View 1 Replies View Related

Using SSIS 2005 To Strip Out Bad Rows In Excel And Then Insert Detailed Rows Into OLE DB Data Source

Apr 6, 2006

Environment:
 
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
 
Problem & Info:
 
I am bringing in an Excel file. I need to first strip out any non-detail rows, such as the breaks you see with totals and whatnot. I should in the end have only detail rows left before I start moving them into my SQL table. I'm not sure how to strip this information out in SSIS - specifically how to get down to the right component, and how to actually code the component to do this, based on my Excel file here: http://www.webfound.net/excelfile.xls

Then, I assume I just use a Flat File Source component or something to actually take the columns in the Excel and split them into an OLE DB data source to shove each column into a corresponding column in my SQL Server table. I have used a Flat File Source in the past to do so with a comma-delimited txt file but never tried it with an Excel file.
 
Desired Help:

 
How to perform
 
1)       stripping out all undesired rows
2)       importing each column into sql table

View 1 Replies View Related

SQLCODE -10000 (10k)

Mar 24, 2008

Hi,
Having this SQLCODE -10000 on the statement below, which refers to some mismatch between columns and host vars.
This runs from a MicroFocus program against a SQL Server 2005 table; all hosts are well defined, same picture.
The interesting thing is that if I change ws-volume to a literal, e.g.
B3M_VOLUME = 21, it will work fine, but it doesn't work with B3M_VOLUME = :WS-VOLUME

--------------------

EXEC SQL
SELECT *
INTO :DCLVB3M-BASE
FROM VB3M_BASE
WHERE B3M_REMSEQ = :WS-REMSEQ
AND B3M_LOCSEQ =
(SELECT B3M_LOCSEQ FROM VB3M_BASE
WHERE B3M_REMSEQ = :WS-REMSEQ
x2 AND B3M_VOLUME = :WS-VOLUME
x1 AND B3M_BUST <> :WS-BUST)
END-EXEC.

------------------
Also, can anybody point to a good place to see a list of all SQL codes?

Tx all
dai

View 2 Replies View Related

Update 10000 Fields

Nov 15, 2007

I have a list of 10000 websites that need to be updated with certain data.

So my question is, would I be able to get SQL to look up a certain list of websites, and then update the required fields?

It's probably an easy answer, thanks!
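
One common pattern, sketched with invented names: load the 10000-row list into a staging table, then update with a join on the website URL:

CREATE TABLE dbo.WebsiteUpdates (SiteURL varchar(500) PRIMARY KEY, NewValue varchar(500))

-- ...bulk load the 10000-row list into dbo.WebsiteUpdates...

UPDATE w
SET    w.SomeField = u.NewValue
FROM   dbo.Websites AS w
JOIN   dbo.WebsiteUpdates AS u ON u.SiteURL = w.SiteURL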

View 9 Replies View Related

Insert Rows With SQLServerce V3.5 Is Very Slow Can Anyone Help Insert Performance Poor

Feb 22, 2008



Hi All

I decided to change over from Microsoft Access Database file to the New SQLServerCe Compact edition. Although the reading of data from the database is greatly improved, the inserting of the new rows is extremely slow.

I was getting between 60 to 70 rows per sec using OLEDB and an Access Database but now only getting 14 to 27 rows per sec using SQLServerCe.

I have tried the code changes below and nothing seems to increase the speed. Any help would be appreciated, as I would prefer to use SQLServerCe since the database is much smaller and I'm used to SQL commands.

Details:
VB2008 Pro
.NET Frameworks 2.0
SQL Compact Edition V3.5
Encryption = Engine Default
Database Size = 128Mb (But needs to be changed to 999Mb)

Where Backup_On_Next_Run, OverWriteQuick, CompressAns are Booleans; all other column sizes are nvarchar, size 10 to 30, except for Full Folder Address, size 260

TRY1

Direct Insert Using Data Adapter.

Me.BackupDatabaseTableAdapter1.Insert(Group_Name1, Full_Folder_Address1, File1, File_Size_KB1, Schedule_To_Run1, Backup_Time1, Last_Run1, Result1, Last_Modfied1, Last_Modfied1, Backup_On_Next_Run1, Total_Backup_Times1, Server_File_Number1, Server_Number1, File_Break_Down1, No_Of_Servers1, Full_File_Address1, OverWriteQuick, CompressAns)

14 to 20 rows per second (Was 60 to 70 when using OLEDB Access)


TRY 2

Using Record Sets

Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)

Dim conn As SqlCeConnection = Nothing
Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"
Try
conn = New SqlCeConnection(strConn)
conn.Open()

Dim cmd As SqlCeCommand = conn.CreateCommand()

cmd.CommandText = "SELECT * FROM BackupDatabase"
cmd.ExecuteNonQuery()
Dim rs As SqlCeResultSet = cmd.ExecuteResultSet(ResultSetOptions.Updatable Or ResultSetOptions.Scrollable)

Dim rec As SqlCeUpdatableRecord = rs.CreateRecord()

rec.SetString(1, Group_Name1)
rec.SetString(2, Full_Folder_Address1)
rec.SetString(3, File1)
rec.SetSqlString(4, File_Size_KB1)
rec.SetSqlString(5, Schedule_To_Run1)
rec.SetSqlString(6, Backup_Time1)
rec.SetSqlString(7, Last_Run1)
rec.SetSqlString(8, Result1)
rec.SetSqlString(9, Last_Modfied1)
rec.SetSqlString(10, Latest_Modfied1)
rec.SetSqlBoolean(11, Backup_On_Next_Run1)
rec.SetSqlString(12, Total_Backup_Times1)
rec.SetSqlString(13, Server_File_Number1)
rec.SetSqlString(14, Server_Number1)
rec.SetSqlString(15, File_Break_Down1)
rec.SetSqlString(16, No_Of_Servers1)
rec.SetSqlString(17, Full_File_Address1)
rec.SetSqlBoolean(18, OverWriteQuick)
rec.SetSqlBoolean(19, CompressAns)
rs.Insert(rec)
Catch e As Exception
MessageBox.Show(e.Message)
Finally
conn.Close()
End Try
End Sub

20 to 24 rows per sec

TRY 3

Using SQL Commands Direct

Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)

Dim conn As SqlCeConnection = Nothing
Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"

Try
conn = New SqlCeConnection(strConn)
conn.Open()

Dim cmd As SqlCeCommand = conn.CreateCommand()
cmd.CommandText = CommandText1
'cmd.CommandText = "INSERT INTO BackupDatabase (€¦"
cmd.ExecuteNonQuery()

Catch e As Exception
MessageBox.Show(e.Message)
Finally
conn.Close()
End Try
End Sub

25 to 30, but mostly holds around 27 rows per sec

Is this the best you can get, or is there a better way? Any help would be greatly appreciated.

Kind Regards

John Galvin

View 3 Replies View Related

COMMIT TRAN Does Not Commit

Mar 28, 2008

Microsoft SQL Server Management Studio Express

select @@trancount
begin tran
select @@trancount
use ProdNetPerfMon
select @@trancount
update Nodes set Caption = 'xxxx' where Vendor = 'yyyy'
select @@trancount
commit tran
select @@trancount

Executes as expected including trancount without errors.
Nodes.Caption is updated but reverts after a few minutes.

sa privileges

What am I missing?

View 1 Replies View Related

390 Tables In Database And 10000 Transactions A Day Is It A Lot?

Oct 23, 2001

From the SQL Server point of view, are 390 tables in a database and 10000 transactions a day a lot?

At what point does performance start to slow down?

View 2 Replies View Related

Package Dies After About 10000 Seconds.

Apr 24, 2006

I have written a package that archives off old orders overnight. It appears that this package is failing after about 10000 seconds every time it is run. I don't think it is memory, as I am running it and checking for memory leaks.

Basic run down of package is

Execute SQL task to get orders to delete

In a for loop, loop over each order number

within the for loop there are 2 dataflow

dataflow 1

find related records in child tables (OLE DB connection using a query)

using a multi split, first

check (with a lookup) for records already in the archive database

only copy on a failed lookup

second

delete related records

dataflow 2

do the same but for the parent table



SP1 CTP is installed on server.

Any ideas?



View 4 Replies View Related

Updating More Than 10000 Records SQL Server 2000

Jul 20, 2005

Hi all

I just ran

UPDATE dbo.tbl_forecasted
SET update_ref = 0

but it only updated the first 10000 records. I'm wondering if anyone can tell me why this is, and whether I can get around it.

I have looked on Google and found a few references, but nothing in too much detail.

Thanks in advance
Andy
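
A common cause of an UPDATE stopping at a suspiciously round number is a leftover SET ROWCOUNT in the same session or script; a sketch of the symptom and the fix:

SET ROWCOUNT 10000                              -- if set earlier in the session, this caps every UPDATE/DELETE/INSERT
UPDATE dbo.tbl_forecasted SET update_ref = 0    -- stops after exactly 10000 rows

SET ROWCOUNT 0                                  -- clear the limit
UPDATE dbo.tbl_forecasted SET update_ref = 0    -- now touches every row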

View 2 Replies View Related

Format Number With Thousands Separator? 10000 --> 10,000

Oct 24, 2007



I would like to select a BIGINT type and get a formatted result with commas. Anyone have ideas?

declare @i bigint
set @i = 123456789

select @i

--Would like to get
123,456,789
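
One long-standing trick, a sketch: CONVERT style 1 on the money type inserts thousands separators, and the trailing '.00' is then stripped. Note that money tops out around 922 trillion, so this only covers bigint values inside that range:

DECLARE @i bigint
SET @i = 123456789

SELECT REPLACE(CONVERT(varchar(30), CAST(@i AS money), 1), '.00', '')
-- returns 123,456,789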

View 39 Replies View Related

Force An INSERT INTO To Insert Rows Orderly

Apr 25, 2007

Hi All,

From what I know, when you execute an
INSERT INTO table1 (field0)
SELECT field0 FROM table2 ORDER BY field0
the rows are inserted in whatever order the server engine thinks is best at that moment. Is there any way I can have field0 in table1 inserted in the order I need?

My problem is that I can not touch table1, it is used by a legacy application which I am not allowed to modify. I was thinking about creating a clustered index, adding an id field, etc, but I can not know how this will affect the front end application.

table1 and table2 have only a few hundred records.

table1 =
CREATE TABLE [FinalPlasma] ([name] [varchar] (2000) NULL )

table2 =
CREATE TABLE [Test_FinalPlasma] ([nameID] [int] NOT NULL ,
[name] [varchar] (2000) NULL
)
The update that is not giving the "good" order =

DELETE FROM FinalPlasma
INSERT INTO FinalPlasma SELECT name from dbo.Test_FinalPlasma

Unfortunately I can not order the [name] field after the update, because it looks something like

Donald McQ. Shaver
Mark Sheilds
R.J. Shirley
W.E. Sills
Kenneth A. Smee
A. Britton Smith
LCol Edward W. Smith
Harry V. Smith
M. E. Southern
Timothy A. Sparling
Spectrum Investment Management Limited


Thank you,

View 12 Replies View Related

Can Any One Tell Me How To Delete Data In Batches Of 10000 From A VLDB Table

Nov 14, 2001

Hi,
I would like to delete data from a 750-million-row table in chunks of 10000, without blocking the users. As ours is a 24/7 shop, I do not want to block the users for a long time.
Answer for this is highly appreciated.
Thanks
Samna
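
A minimal sketch in SQL 2000-era syntax; the table name and filter are placeholders. The short WAITFOR between batches gives other sessions a chance at the locks on a 24/7 system:

SET ROWCOUNT 10000                 -- cap each DELETE at 10000 rows

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.VeryLargeTable
    WHERE  archive_flag = 1

    SET @rows = @@ROWCOUNT

    WAITFOR DELAY '00:00:01'       -- brief pause between batches
END

SET ROWCOUNT 0                     -- restore the default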

View 3 Replies View Related

SQL 2012 :: Automating Server Patching For More Than 10000 Instances

Nov 4, 2014

Is there any best practice or 3rd-party tool for automating SQL Server patching for around 10000+ SQL instances?

It is very difficult to do this manually because each server will consume at least 1-2 hours, and if it is done completely manually then a large number of people might be required to accomplish the task, which doesn't look feasible at all.

View 5 Replies View Related

SSRS Report Generator Hangs If There Are Around 10000 Pages That Needs To Be Converted To A PDF

Sep 25, 2007



Hi,
I am using SSRS to generate a 4-page statement... So I have a plan whose number of participants is 3271 rows... so when we try to export them to a PDF we will have around 13084 pages... but we cannot export so many pages at one time (it craps out). So right now we are breaking it up by start part and stop part and creating a PDF.

Is there a way that SSRS can handle so many pages in one go without any problems? If so, how?

Regards
Karen

View 18 Replies View Related






