Help With Cursor To Insert 100 Rows At A Time
Jan 17, 2007
Hi all,
Can one of you help me with using a cursor that would insert only 100 rows at a time from source table 1 to target table 2? I am not able to loop beyond the first 100 rows.
Here is what I have till now:
CREATE procedure Insert100RowsAtaTime
AS
SET NOCOUNT ON
declare @Col1 int
declare @Col2 char(9)
DECLARE @RETURNVALUE int
DECLARE @ERRORMESSAGETXT varchar(510)
DECLARE @ERRORNUM int
DECLARE @LOCALROWCOUNT int
declare Insert_Cur cursor local fast_forward
FOR
SELECT top 100 Col1,Col2 from Table1
WHERE Col1 not in ( SELECT Col1 /* Col1 is PK. This statement is used to prevent the same rows from being inserted in Table 2*/
from Table2)
set @RETURNVALUE = 0
set @ERRORNUM = 0
BEGIN
open Insert_Cur
fetch NEXT from Insert_Cur into @Col1, @Col2
while (@@FETCH_STATUS = 0)
BEGIN -- loop body wrapped in BEGIN/END so the error check and the next FETCH run inside the loop
insert into Table2 (Col1,Col2) select @Col1,@Col2
SELECT @ERRORNUM = @@ERROR, @LOCALROWCOUNT = @@ROWCOUNT
IF @ERRORNUM = 0
BEGIN
IF @LOCALROWCOUNT >= 1
BEGIN
SELECT @RETURNVALUE = 0
END
ELSE
BEGIN
SELECT @RETURNVALUE = 1
RAISERROR ('INSERT FAILS',16, 1)
END
END
ELSE
BEGIN
SELECT @ERRORMESSAGETXT = description FROM [master].[dbo].[sysmessages]
WHERE error = @ERRORNUM /* use the error number captured right after the INSERT; @@ERROR has already been reset by this point */
RAISERROR (@ERRORMESSAGETXT, 16, 1)
SELECT @RETURNVALUE = 1
END
fetch NEXT from Insert_Cur into @Col1, @Col2
end
close Insert_Cur
deallocate Insert_Cur
RETURN @RETURNVALUE
END
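For what it's worth, a set-based batching loop is often suggested instead of a row-by-row cursor for this kind of task. The following is only a rough sketch using the Table1/Table2/Col1/Col2 names from the question; it copies up to 100 new rows per pass and stops once nothing is left to copy:

DECLARE @BatchRows int
SET @BatchRows = 1
WHILE @BatchRows > 0
BEGIN
    INSERT INTO Table2 (Col1, Col2)
    SELECT TOP 100 s.Col1, s.Col2
    FROM Table1 s
    WHERE s.Col1 NOT IN (SELECT Col1 FROM Table2) -- Col1 is the PK, so rows already copied are skipped
    SET @BatchRows = @@ROWCOUNT -- becomes 0 once a pass copies nothing, which ends the loop
END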
View 4 Replies
Nov 14, 2007
Hello, I have 4 tables: Customer, Customer_personal_info, Customer_Financial_info and Customer_Other_info. The Customer table has a primary key, CustomerID, which is related to every other table as a foreign key. I want to insert data into the four tables using one form with tabs. I created a class and stored procedures to insert a row for each table. How do I execute all four classes inside a single beginTrans-commitTrans-Rollback-EndTrans? Thanking you,
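If the four inserts can live in one stored procedure, a common option is an explicit transaction around all of them. The sketch below is only illustrative: apart from CustomerID and the table names mentioned above, the parameter and column names are made-up placeholders, and it assumes SQL Server 2005 or later for TRY/CATCH:

CREATE PROCEDURE InsertCustomerAllTables
    @CustomerID int,
    @PersonalInfo varchar(100),   -- placeholder columns; replace with the real ones
    @FinancialInfo varchar(100),
    @OtherInfo varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRY
        BEGIN TRANSACTION
            INSERT INTO Customer (CustomerID) VALUES (@CustomerID)
            INSERT INTO Customer_personal_info (CustomerID, Info) VALUES (@CustomerID, @PersonalInfo)
            INSERT INTO Customer_Financial_info (CustomerID, Info) VALUES (@CustomerID, @FinancialInfo)
            INSERT INTO Customer_Other_info (CustomerID, Info) VALUES (@CustomerID, @OtherInfo)
        COMMIT TRANSACTION
        RETURN 0
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION
        RETURN 1   -- signal failure so the caller knows the whole batch was rolled back
    END CATCH
END

The same all-or-nothing effect can also be had from the application side by running the four existing procedures on one connection inside a single ADO transaction (BeginTrans/CommitTrans/Rollback).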
View 1 Replies
View Related
Jul 31, 2007
I have a formview that uses a predefined dataset based on a cross table query. When the formview is in insert mode I need to insert the data into two separate tables. Essentially I have tblPerson and tblAddress, and my formview is capturing username, password, name, address line 1, address line 2, etc. I presume I need to use a stored procedure to insert a row into tblPerson and then insert a row into tblAddress. This is easy enough to do, but the tables use RI and tblPerson has an incremental primary key which needs to be inserted into a foreign key field in my address row. How do I do this? I'm using SQL Server.
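One pattern that fits this is a single stored procedure that inserts the person, captures the new identity with SCOPE_IDENTITY(), and uses it for the address row. This is just a sketch; the column names are assumptions rather than the real schema:

CREATE PROCEDURE InsertPersonWithAddress
    @Username varchar(50),
    @Password varchar(50),
    @Name varchar(100),
    @AddressLine1 varchar(100),
    @AddressLine2 varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @PersonID int

    BEGIN TRANSACTION
        INSERT INTO tblPerson (Username, Password, Name)
        VALUES (@Username, @Password, @Name)

        SET @PersonID = SCOPE_IDENTITY()   -- identity value generated by the insert above

        INSERT INTO tblAddress (PersonID, AddressLine1, AddressLine2)
        VALUES (@PersonID, @AddressLine1, @AddressLine2)
    COMMIT TRANSACTION
END

The FormView's insert can then call this single procedure with all the captured fields instead of two separate inserts.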
View 3 Replies
View Related
Jul 20, 2005
I want to insert 10000 rows at a time and commit in SQL Server. Is there a way to do it if the source tables have no id fields? What would be the most efficient method? Thanks, Ajay
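One approach sometimes used is a loop of TOP-limited INSERT ... SELECT statements, each committed in its own transaction. Since the question says the source tables have no id fields, the sketch below is only a rough illustration with made-up table names, and it assumes at least some column (Col1 here) can distinguish rows already copied; if nothing like that exists, the batches have to be driven another way (for example by staging the data first with an added key):

DECLARE @BatchRows int
SET @BatchRows = 1
WHILE @BatchRows > 0
BEGIN
    BEGIN TRANSACTION
        INSERT INTO dbo.TargetTable (Col1, Col2)
        SELECT TOP 10000 s.Col1, s.Col2
        FROM dbo.SourceTable s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.Col1 = s.Col1)
        SET @BatchRows = @@ROWCOUNT   -- rows copied in this batch
    COMMIT TRANSACTION                -- commit per batch keeps each transaction small
END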
View 1 Replies
View Related
Jun 28, 2004
Hi,
First of all, sorry about my poor English.
I have the following problem:
I have a stored procedure which declares a cursor over a select from some tables. I'm surprised because the SP takes over 10 seconds to run, while if I execute the cursor's select on its own it takes no more than one second. I've swapped in another, simpler select and it runs instantly, so it seems the problem is in the select, but only when I use a cursor to hold the rows. Why?
Thanks a lot.
View 3 Replies
View Related
Apr 12, 1999
Hi, is there a maximum number of rows that can be held in a cursor? If so, what is the maximum allowable number of rows in a cursor?
thanks
Ali
View 1 Replies
View Related
Dec 15, 2006
I am using VS2005 (VB) to develop a PPC WM5.0 Program. And I am using SQLCE 3.0. My PPC Hardware is in 400MHz.
The question is: when the program tries to insert the first record into the .sdf database after each program start, it takes a long time. Does anyone know why, and how I can fix it?
I load the whole database into a dataset when the program starts, do all the "Insert", "Update" and "Delete" operations in this dataset, and fill it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn) 'SQL = Select * From Table
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08s to write one record into the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I take these four steps each time, the fill time is almost 1s or even more!
Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to fill back a dataset containing only one inserted record while the program is running (even though all inserted records are exactly the same data, differing only in the integer key).
However, when I give up the dataset and using the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn) ' I have build the insert SQL before (Insert Into Table values(XXXXXXXXXXXXXXX All field)
cmd.CommandType = CommandType.Text
cmd.ExecuteNonQuery()
cn.Close()
StartTime = Environment.TickCount
I found that the first inserted record still takes more time, but only about 0.2s, and the normal insert time is around 0.02s. That is 4 times faster!
View 1 Replies
View Related
Nov 12, 2007
Hi,
We need to select rows from the database that have been recently inserted/updated. We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred). Triggers on the primary table are used to populate the update table.
If we insert into or update the primary table in a transaction, we would expect the datetime of the insert/update to reflect the commit; however, it seems the insert/update statement is evaluated earlier, so getdate() runs at the time of the statement instead of at the commit. This causes problems because we select rows based on LAST_UPDATE: the commit may occur later, but the earlier insert timestamp is what gets saved to the database, so we miss that update.
We would like to know if there is anyway to tell the SQL Server to not execute the function getdate() until the commit, or any other way to get the commit to create the correct timestamp.
We are using default isolation level. We have tried using getdate(), current_timestamp and even {fn Now()} with the same results. SQL Queries that reproduce the problem are provided below:
/* Different functions to get current timestamp all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp ) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
declare @time datetime
select @time = getdate()
insert into COMMIT_TEST_UPDATE (PKEY,LAST_UPDATE)
select PKEY, getdate()
from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
declare @time datetime
select @time = getdate()
update COMMIT_TEST_UPDATE
set LAST_UPDATE = getdate()
from COMMIT_TEST_UPDATE, deleted, inserted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur, so we don't log them; we just delete the corresponding row from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
if ( select count(*) from deleted ) > 0
begin
delete COMMIT_TEST_UPDATE
from COMMIT_TEST_UPDATE, deleted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
end
GO
/* Delete any previous inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table we see that the timestamp is 10 seconds older than the commit, in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT as it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE
Any help would be appreciated. We understand we could make changes to the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missing deals (assuming the commit time is larger than some artificial time window).
Regards,
Mark
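One approach sometimes suggested for this kind of problem is to drive the polling off the rowversion/timestamp column (already present on COMMIT_TEST_UPDATE above) together with MIN_ACTIVE_ROWVERSION(), instead of a datetime captured inside the trigger; rows from still-open transactions are then never read past. This is only a hedged sketch, and it assumes a build where MIN_ACTIVE_ROWVERSION() is available (SQL Server 2005 SP2 onwards):

/* sketch of the polling query the application would run */
DECLARE @LastSync binary(8)
SET @LastSync = 0x0000000000000000       /* in practice: the @MaxSafe value persisted after the previous poll */

DECLARE @MaxSafe binary(8)
SET @MaxSafe = MIN_ACTIVE_ROWVERSION()   /* no open transaction can still produce a rowversion below this */

SELECT PKEY
FROM COMMIT_TEST_UPDATE
WHERE [timestamp] >= @LastSync
  AND [timestamp] < @MaxSafe             /* rows from uncommitted transactions are excluded here... */

/* ...and because the next poll starts from @MaxSafe, a row whose transaction commits later
   is picked up by that later poll instead of being missed */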
View 8 Replies
View Related
Feb 2, 2002
Hi, I'm a newbie in SQL. Could somebody tell me how to walk through a table one record at a time without using a cursor, please?
Greatly appreciated.
Ann
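For what it's worth, one cursor-free way to step through rows is a WHILE loop keyed on the primary key; the sketch below is only illustrative and assumes a hypothetical table MyTable with an integer primary key ID:

DECLARE @ID int
SELECT @ID = MIN(ID) FROM MyTable      -- start at the first key

WHILE @ID IS NOT NULL
BEGIN
    -- ... do the per-row work here for the row WHERE ID = @ID ...

    SELECT @ID = MIN(ID) FROM MyTable WHERE ID > @ID   -- move on to the next key
END

That said, if the per-row work can be expressed as a single UPDATE or INSERT ... SELECT, a set-based statement is usually the better answer than any row-at-a-time loop.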
View 4 Replies
View Related
Dec 29, 2004
Hello,
I am using a cursor to navigate the data of a table...
Inside the WHILE @@FETCH_STATUS = 0 loop
I want to delete some rows from the table (a temporary table)
so that they are not processed.
The problem is that I want this deletion to affect the rows the cursor still has to fetch.
I declared a dynamic cursor but it does not work.
Does anyone know how I can do this?
Thanks :)
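A rough sketch of the idea, with made-up table and column names: a cursor declared as DYNAMIC (rather than STATIC, which works from its own copy of the rows) reflects rows deleted from the underlying temporary table while the loop runs, so those rows are no longer fetched:

CREATE TABLE #work (id int PRIMARY KEY, payload varchar(50))
-- ... #work gets populated here ...

DECLARE @id int
DECLARE work_cur CURSOR DYNAMIC FOR
    SELECT id FROM #work
OPEN work_cur
FETCH NEXT FROM work_cur INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    -- rows deleted here are reflected in the dynamic cursor, so they are not fetched later
    DELETE FROM #work WHERE id = @id + 1   -- example condition only

    FETCH NEXT FROM work_cur INTO @id
END
CLOSE work_cur
DEALLOCATE work_cur
DROP TABLE #work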
View 7 Replies
View Related
May 2, 2000
I wrote a stored procedure (see below) which populates a column (ORDERING) so that, within each group of rows sharing the same OID and DECISION_DATE, the row with the highest VOTES_REQUIRED gets ORDERING value 1, the second highest gets value 2, and so on.
In other words, I am creating a ranking (the ORDERING column) over subsets of the table that share the same OID and DECISION_DATE, based on the value of VOTES_REQUIRED.
The stored procedure works and I get the right results, but IT TAKES 3 HOURS 30 MINUTES to go through 38,000 rows. I have to run this stored procedure against a database with 200,000 rows (which may take tens of hours...).
Does anyone have an idea to improve the following code so that it runs efficiently?
Thank you in Advance.
**************************************************
CREATE PROCEDURE SP_ORDERING_TEST AS
declare @oldoid varchar(12)
declare @olddecision_date varchar(75)
declare @oid varchar(12)
declare @decision_date varchar(75)
declare @ordering varchar(1)
declare @ordering_count int
declare @votes_required varchar(12)
declare @Decision_id varchar(75)
set @oldoid = 'space'
set @olddecision_date = 'space'
set @oid = 'space'
set @decision_date = 'space'
set @votes_required='space'
set @Decision_id = 'space'
set @ordering = '0'
set @ordering_count = 0
declare review_test_cursor cursor for
select oid, decision_date, votes_required, ordering, decision_id from MYTABLE
open review_test_cursor
fetch review_test_cursor into @oid,@decision_date,@votes_required,
@ordering,@Decision_id
while (@@fetch_status = 0 )
begin
if @oldoid <> @oid or @olddecision_date <> @decision_date
begin
set @oldoid = @oid
set @olddecision_date = @decision_date
set @ordering_count=0
end
update MYTABLE
set ordering = CAST ((@ordering_count + 1) as VARCHAR)
where decision_id = @Decision_id
set @ordering_count = @ordering_count + 1
fetch review_test_cursor into @oid,@decision_date,@votes_required, @ordering,@Decision_id
end
close review_test_cursor
deallocate review_test_cursor
*************************
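On SQL Server 2005 or later, a single set-based UPDATE with ROW_NUMBER() usually replaces this kind of per-row cursor and runs in seconds rather than hours. The sketch below reuses the column names from the cursor above and assumes VOTES_REQUIRED holds numeric values stored as varchar; it is only a sketch, not a tested drop-in replacement:

;WITH ranked AS (
    SELECT decision_id,
           ROW_NUMBER() OVER (PARTITION BY oid, decision_date
                              ORDER BY CAST(votes_required AS int) DESC) AS rn
    FROM MYTABLE
)
UPDATE m
SET ordering = CAST(r.rn AS varchar(12))   -- 1 = highest VOTES_REQUIRED within each OID/DECISION_DATE group
FROM MYTABLE m
INNER JOIN ranked r ON r.decision_id = m.decision_id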
View 1 Replies
View Related
Sep 28, 2000
Hi, can someone plz give me an idea of how to proceed with this...
I've got a server side cursor that retrieves a list of customers from a sql server table which match a specific criteria (those who have had orders in the yr 2000)
For each customer, I need to retrieve a list of 5 top items purchased along with dollars spent on each item by that customer.
I've tried to do this in a loop that uses a counter, but my syntax was probably off, and I'm not sure this is better than a correlated join. Can someone show me a template of the appropriate syntax for this operation?
I appreciate suggestions, thanks again.
Irene
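On SQL Server 2005 or later this can be done without a loop by ranking items per customer. The table and column names below are made up for illustration, so treat it as a template rather than working code for the real schema:

SELECT CustomerID, ItemID, ItemDollars
FROM (
    SELECT od.CustomerID,
           od.ItemID,
           SUM(od.DollarAmount) AS ItemDollars,
           ROW_NUMBER() OVER (PARTITION BY od.CustomerID
                              ORDER BY SUM(od.DollarAmount) DESC) AS rn
    FROM dbo.OrderDetail od
    WHERE od.OrderDate >= '20000101' AND od.OrderDate < '20010101'   -- orders in the year 2000
    GROUP BY od.CustomerID, od.ItemID
) ranked
WHERE rn <= 5                       -- top 5 items per customer by dollars spent
ORDER BY CustomerID, ItemDollars DESC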
View 2 Replies
View Related
Apr 6, 2006
Environment:
Running this code on my PC via VS 2005
.Net version 2.0.50727 on the server (shown in IIS)
Code is in ASP.NET 2.0 and is a VB.NET Console application
SSIS 2005
Problem & Info:
I am bringing in an Excel file. I need to first strip out any non-detail rows, such as the breaks you see with totals and the like; in the end I should have only detail rows left before I start moving them into my SQL table. I'm not sure how to strip this information out in SSIS, specifically which component to use and how to configure it, based on my Excel file here: http://www.webfound.net/excelfile.xls
Then, I assume I just use a source component to take the columns in the Excel file and push each one into the corresponding column in my SQL Server table through an OLE DB destination. I have used a Flat File Source in the past to do this with a comma-delimited txt file, but never tried it with an Excel file.
Desired Help:
How to perform
1) stripping out all undesired rows
2) importing each column into sql table
View 1 Replies
View Related
Jul 23, 2005
Hi, I have a weird problem. I want to cursor through the values in a temporary table and use the values to do a select statement to insert into another temporary table. This select statement uses a LIKE clause, something like WHERE... When I take off the insert, still nothing comes back from the select; when I hardcode the values it works and I get results. Is there something wrong with appending a +'%' to a value read from a cursor?
DECLARE @DEPT VARCHAR(65)
SET @DEPT = "00201,00203"
DECLARE @TB_ABSENCES TABLE (DeptOrEmpId VARCHAR(65))
DECLARE @TB_DEPT TABLE ( V_DEPARTMENT_CODE VARCHAR(128) )
INSERT INTO @TB_DEPT (V_DEPARTMENT_CODE)
SELECT V_DEPT FROM [ISIS].[dbo].[FU_GET_DEPTS_FROM_STRING](',', @DEPT)
DECLARE DEPTS CURSOR FAST_FORWARD FOR
SELECT V_DEPARTMENT_CODE+'%' FROM @TB_DEPT
OPEN DEPTS
FETCH NEXT FROM DEPTS INTO @DEPT_CODE
WHILE @@FETCH_STATUS = 0
BEGIN
--INSERT INTO @TB_ABSENCES TABLE
SELECT Code from TB_EMPLOYEE_DEPARTMENT T2
WHERE T2.V_HIERARCHY_CODE LIKE @DEPT_CODE + '%'
FETCH NEXT FROM DEPTS INTO @DEPT_CODE
END
CLOSE DEPTS
DEALLOCATE DEPTS
View 2 Replies
View Related
Aug 9, 2014
I don't understand using a dynamic cursor.
Summary
* The fetch next statement returns multiple rows when using a dynamic cursor on the sys.dm_db_partition_stats.
* As far as I know a fetch-next-statement always returns a single row?
* Using a static cursor works as expected.
* This happens on our production OLTP server as well as on a local SQL Server instance.
Now the Skript to reproduce the whole thing.
create database objects
-- create the partition function
create partition function fnTestPartition01( smallint )
as range right for values ( 1, 2, 3, 4, 5, 6, 7, 8 , 9, 10 ) ;
[Code]....
Why does the fetch statement return more than 1 row? It returns the whole result of the select statement. When using a STATIC cursor instead, I get the first row of the cursor as I would expect. Selecting a "normal" user table with a dynamic cursor, I get the first row only, again as expected.
View 8 Replies
View Related
Oct 29, 2015
This stored procedure gets some executable queries from the select statement; the cursor fetches each row, executes the query, and inserts the query into table_3 marked as 'E'. After 17:00 the stored procedure stops executing the queries and just inserts the remaining queries from the select statement into table_3 marked as 'C'.
I don't know why the output in table_3 is quite different from what I expect. The stored procedure comes out with two exactly identical queries, one marked as C and the other marked as E.
CREATE PROCEDURE procedure1
AS
DECLARE cursor_1 CURSOR FOR
SELECT
'This is a executable query'
FROM table_1
DECLARE @table_2
DECLARE @stoptime DATETIME = NULL;
[Code] ....
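Since the procedure body is truncated above, the following is only a guess at the intended shape: a sketch of a cursor loop that executes each query until a 17:00 cutoff and records 'E' for executed queries and 'C' for the ones only collected afterwards (column names are placeholders):

DECLARE @qry nvarchar(4000)
DECLARE @stoptime datetime
SET @stoptime = CONVERT(datetime, CONVERT(varchar(10), GETDATE(), 120) + ' 17:00')   -- today at 17:00

DECLARE cursor_1 CURSOR FOR
    SELECT QueryText FROM table_1
OPEN cursor_1
FETCH NEXT FROM cursor_1 INTO @qry
WHILE @@FETCH_STATUS = 0
BEGIN
    IF GETDATE() < @stoptime
    BEGIN
        EXEC (@qry)                                                   -- run the query
        INSERT INTO table_3 (QueryText, Status) VALUES (@qry, 'E')    -- executed
    END
    ELSE
        INSERT INTO table_3 (QueryText, Status) VALUES (@qry, 'C')    -- collected only, past the cutoff

    FETCH NEXT FROM cursor_1 INTO @qry
END
CLOSE cursor_1
DEALLOCATE cursor_1

Note that with a per-row check like this, the same query text fetched twice (for example if table_1 contains duplicates) could legitimately end up once as 'E' and once as 'C' if the cutoff is crossed between the two fetches.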
View 8 Replies
View Related
May 15, 2015
This is my code and I don't know why this error keeps coming out. PS: I used a cursor to execute the query. The error shown is in bold:
DECLARE RegCreatedDate CURSOR FOR
SELECT DISTINCT (CONVERT(NVARCHAR, CreatedDate, 103))
FROM CA_Registration WHERE Month(CreatedDate)= @paMonthIn AND YEAR(CreatedDate)=@paYearIn
OPEN RegCreatedDate
FETCH NEXT FROM RegCreatedDate INTO @RegCreatedDate
WHILE @@FETCH_STATUS = 0
[Code] ....
View 9 Replies
View Related
Oct 19, 2006
Hi,
I was just wondering if anybody has come across this behaviour where closing a fast forward, read-only cursor takes an abnormally long time. I am running SQL Server 2005 Standard Edition.
Thanks
Nand
View 1 Replies
View Related
May 1, 2007
I want to do an insert and a delete in a cursor, but it's not working for me. Can anyone check the code below and help me solve the problem?
DECLARE @LINKID AS INT
DECLARE @INITCOUNT AS INT
DECLARE @NumberOfDuplicates AS INT
SELECT @InitCount = (SELECT COUNT(*) FROM [SSDTemp])
Declare DedupeDataInSSDTempTableNewCur Cursor For
SELECT LINKID FROM SSD WHERE LINKID IS NOT NULL
OPEN DedupeDataInSSDTempTableNewCur
FETCH NEXT FROM DedupeDataInSSDTempTableNewCur
into @LINKID
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @CNT AS INT
SELECT @CNT = COUNT(LINKID) FROM SSD WHERE LINKID=@LINKID
if (@cnt =1) begin
INSERT INTO SSD (
[CallID],
[LinkID], [FirstName], [MiddleInit], [LastName],
[Suffix], [DateOfBirth], [SSN], [PhoneNumber],
[Addr1], [Addr2], [City], [State], [Zip] )
SELECT top 1
[CallID], [LinkID], [FirstName], [MiddleInit],
[LastName], [Suffix], [DateOfBirth], [SSN],
[PhoneNumber], [Addr1], [Addr2], [City], [State], [Zip] from SSDTemp
where linkid = @LINKID
DELETE TOP (1) FROM ssdtemp WHERE linkid = @LINKID
end
if (@cnt =0) begin
INSERT INTO SSD (
[CallID],
[LinkID], [FirstName], [MiddleInit], [LastName],
[Suffix], [DateOfBirth], [SSN], [PhoneNumber],
[Addr1], [Addr2], [City], [State], [Zip] )
SELECT top 2
[CallID], [LinkID], [FirstName], [MiddleInit],
[LastName], [Suffix], [DateOfBirth], [SSN],
[PhoneNumber], [Addr1], [Addr2], [City], [State], [Zip] from SSDTemp
where linkid = @LINKID
DELETE TOP (2) FROM ssdtemp WHERE linkid = @LINKID
End
SELECT @NumberOfDuplicates = @InitCount - (SELECT COUNT(*) FROM [SSDTemp])
FETCH NEXT FROM DedupeDataInSSDTempTableNewCur
INTO @LINKID
END
CLOSE DedupeDataInSSDTempTableNewCur
DEALLOCATE DedupeDataInSSDTempTableNewCur
View 2 Replies
View Related
Feb 22, 2008
Hi All
I decided to change over from Microsoft Access Database file to the New SQLServerCe Compact edition. Although the reading of data from the database is greatly improved, the inserting of the new rows is extremely slow.
I was getting between 60 to 70 rows per sec using OLEDB and an Access Database but now only getting 14 to 27 rows per sec using SQLServerCe.
I have tried the code changes below and nothing seems to increase the speed. Any help would be appreciated, as I would prefer to use SQLServerCe: the database is much smaller and I'm used to SQL commands.
Details:
VB2008 Pro
.NET Frameworks 2.0
SQL Compact Edition V3.5
Encryption = Engine Default
Database Size = 128Mb (But needs to be changes to 999Mbs)
Backup_On_Next_Run, OverWriteQuick and CompressAns are Booleans; all other columns are nvarchar, size 10 to 30, except for Full Folder Address, which is size 260.
TRY1
Direct Insert Using Data Adapter.
Me.BackupDatabaseTableAdapter1.Insert(Group_Name1, Full_Folder_Address1, File1, File_Size_KB1, Schedule_To_Run1, Backup_Time1, Last_Run1, Result1, Last_Modfied1, Last_Modfied1, Backup_On_Next_Run1, Total_Backup_Times1, Server_File_Number1, Server_Number1, File_Break_Down1, No_Of_Servers1, Full_File_Address1, OverWriteQuick, CompressAns)
14 to 20 rows per second (Was 60 to 70 when using OLEDB Access)
TRY 2
Using Record Sets
Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)
Dim conn As SqlCeConnection = Nothing
Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"
Try
conn = New SqlCeConnection(strConn)
conn.Open()
Dim cmd As SqlCeCommand = conn.CreateCommand()
cmd.CommandText = "SELECT * FROM BackupDatabase"
cmd.ExecuteNonQuery()
Dim rs As SqlCeResultSet = cmd.ExecuteResultSet(ResultSetOptions.Updatable Or ResultSetOptions.Scrollable)
Dim rec As SqlCeUpdatableRecord = rs.CreateRecord()
rec.SetString(1, Group_Name1)
rec.SetString(2, Full_Folder_Address1)
rec.SetString(3, File1)
rec.SetSqlString(4, File_Size_KB1)
rec.SetSqlString(5, Schedule_To_Run1)
rec.SetSqlString(6, Backup_Time1)
rec.SetSqlString(7, Last_Run1)
rec.SetSqlString(8, Result1)
rec.SetSqlString(9, Last_Modfied1)
rec.SetSqlString(10, Latest_Modfied1)
rec.SetSqlBoolean(11, Backup_On_Next_Run1)
rec.SetSqlString(12, Total_Backup_Times1)
rec.SetSqlString(13, Server_File_Number1)
rec.SetSqlString(14, Server_Number1)
rec.SetSqlString(15, File_Break_Down1)
rec.SetSqlString(16, No_Of_Servers1)
rec.SetSqlString(17, Full_File_Address1)
rec.SetSqlBoolean(18, OverWriteQuick)
rec.SetSqlBoolean(19, CompressAns)
rs.Insert(rec)
Catch e As Exception
MessageBox.Show(e.Message)
Finally
conn.Close()
End Try
End Sub
20 to 24 rows per sec
TRY 3
Using SQL Commands Direct
Private Sub InsertRecordsIntoSQLServerce(ByVal Group_Name1 As String, ByVal Full_Folder_Address1 As String, ByVal File1 As String, ByVal File_Size_KB1 As String, ByVal Schedule_To_Run1 As String, ByVal Backup_Time1 As String, ByVal Last_Run1 As String, ByVal Result1 As String, ByVal Last_Modfied1 As String, ByVal Latest_Modfied1 As String, ByVal Backup_On_Next_Run1 As Boolean, ByVal Total_Backup_Times1 As String, ByVal Server_File_Number1 As String, ByVal Server_Number1 As String, ByVal File_Break_Down1 As String, ByVal No_Of_Servers1 As String, ByVal Full_File_Address1 As String, ByVal OverWriteQuick As Boolean, ByVal CompressAns As Boolean)
Dim conn As SqlCeConnection = Nothing
Dim CommandText1 As String = "INSERT INTO BackupDatabase (Group_Name, Full_Full_Folder_Adress, File1,File_Size_KB, Schedule_To_Run, Backup_Time, Last_Run, Result, Last_Modfied, Latest_Modfied, Backup_On_Next_Run, Total_Backup_Times, Server_File_Number, Server_Number, File_Break_Down, No_Of_Servers, Full_File_Address, OverWrite, Compressed) VALUES ('" & Group_Name1 & "', '" & Full_Folder_Address1 & "', '" & File1 & "', '" & File_Size_KB1 & "', '" & Schedule_To_Run1 & "', '" & Backup_Time1 & "', '" & Last_Run1 & "', '" & Result1 & "', '" & Last_Modfied1 & "', '" & Latest_Modfied1 & "', '" & CStr(Backup_On_Next_Run1) & "', '" & Total_Backup_Times1 & "', '" & Server_File_Number1 & "', '" & Server_Number1 & "', '" & File_Break_Down1 & "', '" & No_Of_Servers1 & "', '" & Full_File_Address1 & "', '" & CStr(OverWriteQuick) & "', '" & CStr(CompressAns) & "')"
Try
conn = New SqlCeConnection(strConn)
conn.Open()
Dim cmd As SqlCeCommand = conn.CreateCommand()
cmd.CommandText = CommandText1
'cmd.CommandText = "INSERT INTO BackupDatabase (..."
cmd.ExecuteNonQuery()
Catch e As Exception
MessageBox.Show(e.Message)
Finally
conn.Close()
End Try
End Sub
25 to 30, but mostly holds around 27 rows per sec.
Is this the best you can get, or is there a better way? Any help would be greatly appreciated.
Kind Regards
John Galvin
View 3 Replies
View Related
Jun 11, 2008
I have a stored procedure which contains the following part. It gives a syntax error on the line where I get the @@identity value; when I delete that line the command is successful. But I need to get the identity from the insert statement, assign it to a variable and use it afterwards.
Can anybody tell me how to do this within a stored procedure, or what the error in the following piece of code is? (#tblSalesOrders is a temporary table which contains a set of records from the original table.)
DECLARE @soNo1 INT
DECLARE @CursorOrders CURSOR
SET @CursorOrders = CURSOR FAST_FORWARD FOR
select fldSoNo from #tblSalesOrders
declare @newSONO1 int
OPEN @CursorOrders
FETCH NEXT FROM @CursorOrders INTO @soNo1
WHILE @@FETCH_STATUS = 0
BEGIN
----for each salesorder insert to salesorderline
insert into tblSalesOrders (fldWebAccountNo,fldDescription,fldSoDate,fldGenStatus) select fldWebAccountNo,fldDescription,fldSoDate,fldGenStatus from #tblSalesOrders where fldSoNo =@soNo1;
set @newSONO1=SCOPE_IDENTITY;
-------in this section again create another cursor for another set of records and insert to a table passing identity value return from the above insert --------------------------
SELECT @intErrorCode = @@ERROR
IF (@intErrorCode <> 0) GOTO PROBLEM
FETCH NEXT FROM @CursorOrders INTO @soNo1
END
CLOSE @CursorOrders
DEALLOCATE @CursorOrders
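For what it's worth, SCOPE_IDENTITY() is a function and is written with parentheses when it is called; a minimal sketch of the relevant line, keeping the variable names used above:

set @newSONO1 = SCOPE_IDENTITY();   -- identity value generated by the preceding insert, in this scope

-- @newSONO1 can then be passed on to the inner cursor/inserts described in the commented section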
View 2 Replies
View Related
Apr 25, 2007
Hi All,
From what I know, when you execute an
INSERT INTO table1 (field0)
SELECT field0 FROM table 2 ORDER BY field0
the rows are inserted in whatever order the server engine thinks is best at that moment. Is there any way I can have field0 in table1 inserted in the order I need?
My problem is that I can not touch table1, it is used by a legacy application which I am not allowed to modify. I was thinking about creating a clustered index, adding an id field, etc, but I can not know how this will affect the front end application.
table1 and table2 have only a few hundred records.
table1 =
CREATE TABLE [FinalPlasma] ([name] [varchar] (2000) NULL )
table2 =
CREATE TABLE [Test_FinalPlasma] ([nameID] [int] NOT NULL ,
[name] [varchar] (2000) NULL
)
The update that is not giving the "good" order =
DELETE FROM FinalPlasma
INSERT INTO FinalPlasma SELECT name from dbo.Test_FinalPlasma
Unfortunately I can not order the [name] field after the update, because it looks something like
Donald McQ. Shaver
Mark Sheilds
R.J. Shirley
W.E. Sills
Kenneth A. Smee
A. Britton Smith
LCol Edward W. Smith
Harry V. Smith
M. E. Southern
Timothy A. Sparling
Spectrum Investment Management Limited
Thank you,
View 12 Replies
View Related
Mar 6, 2015
I have a stored procedure. In the SP I am using a cursor to load data from a parent table into several child tables.
I have attached the script with this message.
My problem is how to use a direct, set-based SELECT and INSERT (or load) to speed up the process instead of the cursor.
USE [IconicMarketing]
GO
/****** Object: StoredProcedure [dbo].[SP_DMS_INVENTORY] Script Date: 3/6/2015 3:34:03 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] ....
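Since the attached script is not included here, the following is only a generic sketch of replacing a row-by-row cursor load with one set-based INSERT ... SELECT per child table; the table and column names are placeholders:

INSERT INTO dbo.ChildTableA (ParentID, ColX)
SELECT p.ParentID, p.ColX
FROM dbo.ParentTable p
WHERE NOT EXISTS (SELECT 1 FROM dbo.ChildTableA c WHERE c.ParentID = p.ParentID)

-- repeat one INSERT ... SELECT per child table; each statement loads every qualifying
-- row in one pass instead of one row per cursor iteration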
View 3 Replies
View Related
Feb 26, 2015
I am writing a query to return some production data. Basically I need to insert either 1 or 2 rows into a table variable, based on whether the production part makes 1 or 2 items (the raw data does not carry this directly; it comes from a lookup in my database).
I can retrieve all the source data I need easily, but when I come to insert it into the table variable I need to insert 1 record if it's a single part or 2 records if it's a twin part. I know I could use a cursor, but I'm sure there has to be an easier way!
Below is the code i have at the moment
declare @startdate as datetime
declare @enddate as datetime
declare @Line as Integer
DECLARE @count INT
set @startdate = '2015-01-01'
set @enddate = '2015-01-31'
[Code] .....
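One set-based way to get "one row for single parts, two rows for twins" is to join the source to a two-row derived table. The sketch below is self-contained with made-up data, and it assumes an ItemsPerPart value (1 or 2) can be looked up per part:

DECLARE @Source TABLE (PartNumber varchar(20), ItemsPerPart int)   -- hypothetical lookup result
INSERT INTO @Source VALUES ('PART-A', 1)   -- single part
INSERT INTO @Source VALUES ('PART-B', 2)   -- twin part

DECLARE @Result TABLE (PartNumber varchar(20), ItemNo int)

INSERT INTO @Result (PartNumber, ItemNo)
SELECT s.PartNumber, n.ItemNo
FROM @Source s
INNER JOIN (SELECT 1 AS ItemNo UNION ALL SELECT 2) n
        ON n.ItemNo <= s.ItemsPerPart      -- 1 row for single parts, 2 rows for twins

SELECT * FROM @Result   -- PART-A appears once, PART-B twice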
View 1 Replies
View Related
Jul 23, 2004
Hi,
I am working on a SQL Server table designed by a partner company and cannot change the data structures. There is a table with a list of people available for calling out to a security system (keyholders).
From a web form, I need to allow my users to change the telephone calling order of the keyholders in the table.
The two important fields are AccountCode nVarChar and CallOrder nVarChar - where AccountCode + CallOrder must be unique.
As an example, the table may contain records with the following data..
1234, 1, Fred
1234, 2, Bert
1234, 3, Bob
If the user wants to make Bob the number 1 keyholder, Fred number 2 and Bert number 3 - what is the best practice for me to approach the update ?
Is this a job for ADO.Net or T-SQL ?
Thanks in advance.
Steve.
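It can be done either way; below is a T-SQL sketch of the pure database side. The table and column names and the example values are assumptions, and the rows are first parked at temporary CallOrder values so the unique (AccountCode, CallOrder) pair is never violated while the re-sequencing runs:

BEGIN TRANSACTION
    -- park the existing rows on temporary values (assumes the column is wide enough)
    UPDATE Keyholders
    SET CallOrder = 'X' + CallOrder
    WHERE AccountCode = '1234'

    -- apply the new calling order
    UPDATE Keyholders SET CallOrder = '1' WHERE AccountCode = '1234' AND KeyholderName = 'Bob'
    UPDATE Keyholders SET CallOrder = '2' WHERE AccountCode = '1234' AND KeyholderName = 'Fred'
    UPDATE Keyholders SET CallOrder = '3' WHERE AccountCode = '1234' AND KeyholderName = 'Bert'
COMMIT TRANSACTION

From ADO.NET the same statements can be sent as one parameterised command or wrapped in a SqlTransaction, so the choice is mostly about where you prefer the logic to live.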
View 3 Replies
View Related
Aug 11, 2004
At how many result rows should an asynchronous method be used to populate an MSSQL cursor, so that the best performance is obtained at any time and at any result offset?
I want to make the cursor fast at all times, no matter how many results are returned.
View 2 Replies
View Related
Jul 27, 2007
There is a stored procedure that inserts a row into the 'Vendors' table. Is it possible that two different calls to this SP happen at the same time and, as a result, each call inserts its row at exactly the same time?
View 1 Replies
View Related
Jul 3, 2000
Hi all -
is there a way to process a file x records at a time?
We have a table that I need to append to an existing table. The date columns are currently in char but must be converted to datetime for the existing table. The problem is I have bad data. There are 3 million rows where the date field isn't valid for SQL's datetime format. Since this is the data I have, I have to work with it. I would like for SQL to just insert a null if it comes upon a bad date. Currently when it encounters a field that isn't valid, it stops the process with an error.
I have tried to work around it below, but there is still something "hanging". I would like to be able to insert just one million rows at a time; if it errors, I can look at that million, find the error, fix it and continue on.
Any suggestions? Or if you have a better idea all together I would love to see it.
SQL Server 7.0, SP2
,CASE
when
(substring(check_date,1,4) not between '1997' and '2000' or
substring(check_date,5,2) not between '01' and '12' or
substring(check_date,7,2) not between '01' and '31') THEN null
ELSE cast(check_date as datetime)
END AS check_date
Thanks,
Michelle
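One simpler guard that may work here is ISDATE(), which is available on SQL Server 7.0; a hedged variant of the CASE above, returning NULL for anything the CAST could not handle (including out-of-range years):

,CASE
    WHEN ISDATE(check_date) = 1 THEN CAST(check_date AS datetime)   -- valid date text
    ELSE NULL                                                        -- unparseable values become NULL
 END AS check_date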
View 1 Replies
View Related
Jun 9, 2007
I have a table of machine data that captures fault codes, the time the machine stopped and the time the machine started. I can easily calculate the downtime, but how do I take the last start time and subtract it from the next stop time (one row from the next row) to get an uptime? I need it grouped by short date and fault, as I intend to use Excel as a reporting tool and pulling in all the data will not fit. What is the most efficient way?
Thanks,
Lee
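On SQL Server 2005 or later, one way is to number the stoppage rows per machine and join each row to the next one. The sketch below uses a hypothetical MachineEvents table with one row per stoppage (MachineID, FaultCode, StopTime, StartTime), so treat it as a template only; on SQL Server 2012+ LEAD/LAG would simplify it further:

;WITH ordered AS (
    SELECT MachineID, FaultCode, StopTime, StartTime,
           ROW_NUMBER() OVER (PARTITION BY MachineID ORDER BY StopTime) AS rn
    FROM dbo.MachineEvents
)
SELECT CONVERT(varchar(10), nxt.StopTime, 101) AS ShortDate,            -- group/report on this in Excel
       nxt.FaultCode,
       DATEDIFF(minute, prv.StartTime, nxt.StopTime) AS UptimeMinutes   -- last start to next stop
FROM ordered prv
INNER JOIN ordered nxt
        ON nxt.MachineID = prv.MachineID
       AND nxt.rn = prv.rn + 1
ORDER BY ShortDate, nxt.FaultCode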
View 8 Replies
View Related
Sep 25, 2015
The stored procedure takes ~10 seconds per call. The execution plan shows index seeks or clustered index scans almost everywhere, with no table spools, lazy spools or key lookups. I cannot share the code or execution plan due to company rules and regulations. The issue is that I am dealing with a result set of around 50,000 records, out of which I have to return 20 records at a time (which is also customizable, i.e. 40 / 60 / 150 records per page). The application cannot handle all 50k records, so I have to return 20 records for every stored procedure call.
The result set changes according to the start date and end date I receive as parameters. In the application there are a few column filters, namely Country (around 50 countries) and Outcome (around 6 to 10 values). These filters show values in a drop-down (like Excel) depending on the distinct values in those columns, and they are populated on every page if no filter value is selected. The issue is that whenever the user sorts or filters any records, this stored procedure is called and every time I have to deal with ~50,000 records.
Current Code :
Step 1 ) Get the required result set in temp table .
Step 2) Compute the results on some business rules . (Outcome and SharesAvailable calculation - see attachment)
Step 3) Populate the filter columns (Country, Outcome); these values will be comma separated.
Step 4) Dynamic query to get required result set i.e if user wants only 10 records in single page then TOP 10 . Sorting can be applied on any column mentioned in screenshot.
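For the paging itself, if the server is SQL Server 2012 or later, OFFSET/FETCH after the sort is one option. The sketch below assumes the ~50,000-row intermediate result from steps 1 and 2 lives in a temp table; the names are placeholders:

DECLARE @PageNumber int = 1      -- requested page
DECLARE @PageSize   int = 20     -- 20 / 40 / 60 / 150, as configured

SELECT r.*
FROM #Results r                            -- temp result set built in steps 1 and 2
ORDER BY r.SortColumn DESC                 -- whichever column the user sorted on
OFFSET (@PageNumber - 1) * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY

The filter drop-down values (Country, Outcome) can be returned as separate small result sets built once per call, so they do not need to ride along with every 20-row page.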
View 9 Replies
View Related
Oct 6, 2006
hi,
Here I am with a table containing 5 columns and 100 rows. I want to delete all rows at the same time. Please suggest a way to do this.
One can never consent to creep,when one feels an impulse to soar
RAMMOHAN
View 5 Replies
View Related
Jul 14, 2006
Hi guys, this might be the simplest thing, but I am a newbie to databases. I need to find only the rows modified within a certain time period in a database. As I understand it, adding a WHERE clause for the time period might be an option, but I might be wrong here. I wanted to know whether there is any other option. Can triggers or anything else help me in this matter? Regards, Abhijeet
View 1 Replies
View Related