I have a table named sales with 20 rows in total; 5 of those rows match the stor_id I am filtering on.
I assume that when I create a cursor like the one below, the cursor will contain those 5 rows. Now if I print a statement, I assume I will get the statement 5 times (one for each row). Am I correct?
Well, if that is the concept, I am still getting only one printed statement. I do not know why - am I doing something wrong?
Thanks for the help.
CREATE PROCEDURE p_cursor @ord_nbr char(4)
AS
declare rst cursor for
    select * from sales
    where stor_id = 123
open rst
fetch rst
if (@@fetch_status = 0)
    print ' I may have 5 rows '
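For comparison, here is a minimal sketch of the usual fetch loop, assuming the goal is one PRINT per row; FETCH only returns a single row per call, so it has to be repeated inside a WHILE loop:

declare rst cursor for
    select * from sales
    where stor_id = 123
open rst
fetch next from rst                -- get the first row
while (@@fetch_status = 0)
begin
    print ' I may have 5 rows '    -- runs once per fetched row
    fetch next from rst            -- advance to the next row
end
close rst
deallocate rst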
We're very interested in having our application use SQL/e but we can't have the 4 gig limit. It makes sense to me that SQL/e simply should not be able to access a file over the network, and then you wouldn't have any reason to put a 4 gig limit on it.
At that point it becomes a very flexible alternative for remote users that need to have a large amount of data (e.g. documents, images, etc.) with them.
We'd love to start building an abstraction layer so that we can support both SQL Server and SQL/e, covering both network and remote users, without the nightmare that is SQL Server Express installation (courtesy of the Windows Installer group's bugs...).
I want to continuously monitor a source table throughout the day and, as data becomes available, process it and insert it into one of a number of tables.
I have tried achieving this using a FOR LOOP container and setting the halt condition so that it is never satisfiable. However, this has a couple of problems:
1) It runs in a tight loop and consequently degrades system performance enormously.
2) I can't get transactions to work. I would like each iteration of the loop to spawn a new transaction under which the tasks in the loop can run. Therefore, if one of the tasks fails during such an iteration, only the updates affected by that iteration are lost.
Ideally, I would like to be able to put a wait statement within the loop container so that it runs every couple of seconds, and I would also like to implement transactions as described above.
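For what it's worth, here is a rough T-SQL sketch of the same idea - poll with WAITFOR DELAY instead of spinning, and wrap each pass in its own transaction. Table and column names (SourceTable, TargetTable, IsProcessed) are made up, and the routing to multiple target tables is collapsed to a single target for brevity:

-- Hypothetical polling loop; the real work may live in the SSIS package instead.
WHILE 1 = 1
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        -- Claim the newly arrived rows and copy them in one atomic statement.
        UPDATE dbo.SourceTable
        SET    IsProcessed = 1
        OUTPUT inserted.Col1, inserted.Col2
        INTO   dbo.TargetTable (Col1, Col2)
        WHERE  IsProcessed = 0;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- A failure only loses this iteration's work; the loop keeps going.
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    END CATCH;

    WAITFOR DELAY '00:00:05';   -- sleep a few seconds instead of spinning
END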
I have transactional replication set up from Server A to Server B. I want to move the subscriber from B to C. What would be the best approach?
1. Back up the DB from Server B and restore it on Server C, then set up replication between A and C.
2. Set up transactional replication between A and C alongside A and B, confirm A to C is working fine, and then remove B.
If I go with approach 2, I have to replicate approx. 70 GB of data, so if I run both replications on Server A, that would stress it with about 140 GB of data moving out. How do I control this large movement? Can the replication be synchronized manually?
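If approach 1 is workable, one way to avoid pushing a second full copy over the wire is to initialize the new subscription from the backup restored on Server C. A rough sketch (publication, database, and path names are placeholders), assuming the publication was created with @allow_initialize_from_backup = 'true':

-- Run on the publisher (Server A) after restoring B's backup on Server C.
EXEC sp_addsubscription
    @publication       = N'MyPublication',      -- placeholder names throughout
    @subscriber        = N'ServerC',
    @destination_db    = N'MyDatabase',
    @subscription_type = N'Push',
    @sync_type         = N'initialize with backup',
    @backupdevicetype  = N'disk',
    @backupdevicename  = N'\\backupshare\MyDatabase.bak';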
The secondary server for my availability group was recycled. When SQL Server came back online the data movement for a database was suspended. The error log shows:
"AlwaysOn Availability Groups data movement for database 'XXXXXXXXX' has been suspended for the following reason: "system" (Source ID 5; Source string: 'SUSPEND_FROM_RESTART'). To resume data movement on the database, you will need to resume the database manually. For information about how to resume an availability database, see SQL Server Books Online."
I was able to resume data movement with no issue. I would like to understand the technical reason why the data movement was put into the suspended state and left there after the recycle. I searched for an article that would list possible reasons (BOL, Google, Bing, etc.), but I just could not find much information out there on 'SUSPEND_FROM_RESTART'.
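For reference, resuming is a single statement on the affected replica (the database name below is the placeholder from the error message):

-- Run on the secondary replica whose database shows as "Not Synchronizing".
ALTER DATABASE [XXXXXXXXX] SET HADR RESUME;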
I am aware that TDE protects data at rest, not data in motion during communication (unless you use encrypted communication channels such as SSL certificates). Hence I am thinking of exporting data from a TDE-encrypted database to a database on an instance where TDE is not enabled or supported. I believe this works, and I just need to take care of the relationships between tables. The target database is hosted on SQL Server 2012 Standard Edition, on which TDE is not supported.
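That matches how TDE behaves: pages are decrypted transparently as they are read, so any ordinary export sees plain data. A minimal sketch, assuming a linked server to the target instance (all names are placeholders); bcp, SSIS, or the Import/Export Wizard would work just as well:

-- [TargetServer] is a hypothetical linked server; table and column names are placeholders.
-- TDE decrypts transparently on read, so the copied rows arrive unencrypted.
INSERT INTO [TargetServer].[TargetDb].dbo.Customers (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM dbo.Customers;   -- source table in the TDE-encrypted database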
Hello everyone,

There's an interesting SQL problem I've come across that I'm currently banging my head against. Given the following table that contains item location information populated every minute:

location_id  date_created
===========  ============
5            2000-01-01 01:00  <-- Don't need
5            2000-01-01 01:01  <-- Don't need
5            2000-01-01 01:02  <-- Need
7            2000-01-01 01:03  <-- Don't need
7            2000-01-01 01:04  <-- Need
5            2000-01-01 01:05  <-- Need
2            2000-01-01 01:06  <-- Don't need
2            2000-01-01 01:07  <-- Need
7            2000-01-01 01:08  <-- Need

how would you generate a result set that returns the item's location history *without* duplicating the same location if the item has been sitting in the same room for a while? For example, the result set should look like the following:

location_id  date_created
===========  ============
5            2000-01-01 01:02
7            2000-01-01 01:04
5            2000-01-01 01:05
2            2000-01-01 01:07
7            2000-01-01 01:08

This is turning out to be a finger twister and I'm not sure if it could be done in SQL; I may have to resort to writing a stored proc.

Regards,
Anthony
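A set-based approach seems possible without a stored proc. On a version without window functions, a correlated subquery can keep a row only when the next reading (by date_created) is at a different location or does not exist; a sketch, assuming the table is named item_location:

-- Keeps the last row of each run of consecutive identical locations.
SELECT t.location_id, t.date_created
FROM item_location AS t
WHERE NOT EXISTS (
        SELECT *
        FROM item_location AS nxt
        WHERE nxt.date_created = (SELECT MIN(later.date_created)
                                  FROM item_location AS later
                                  WHERE later.date_created > t.date_created)
          AND nxt.location_id = t.location_id
      )
ORDER BY t.date_created

On SQL Server 2012 or later, LEAD() over date_created would express the same "is the next location different" check more directly.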
I'm getting this error on expanding the Databases node in SQL Server Management Studio Express Object Explorer:
Failed to retrieve data for this request. (Microsoft.SqlServer.Express.SmoEnum) An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.Express.ConnectionInfo) Could not continue scan with NOLOCK due to data movement. (Microsoft SQL Server, Error: 601)
A similar error message also appeared when I tried opening an existing data connection within Database Explorer in Visual Web Developer Express. (However, the web application ran fine and it managed to access the database normally.)
These errors only appeared recently. Any ideas how to go about solving this issue?
In my organization, we have a database which has grown to over 2 GB. The application we are working on has been in development since the days of asp / cb 6 and, in short, it is the baby of many developers. It is a kind of ERP with no documentation.

Problem 1:
=========
Now, to reduce the size of the database, we have identified some orphan procedures, tables, and columns. Is it possible for me to create a DTS package which backs up all the orphan objects to some other database and deletes them from the actual one? Moreover, we still have doubts that some of the fields we think are orphans might not be, so we also need another DTS package which rolls back all the changes. (Any ideas for that?)

Problem 2:
=========
We want another DTS package which would move data from one database to another, and a matching rollback DTS package for it. (Any ideas for that?)

Problem 3:
=========
How can we use DTS programming to do these tasks? What benefits do I get from DTS programming over the DTS Wizard?
We have a stored procedure that failed with: Could not continue scan with NOLOCK due to data movement
We are running with ISOLATION LEVEL READ UNCOMMITTED. There are other jobs running, some of which might be hitting these tables (all using the ROWLOCK hint - though I know that's not guaranteed); however, this stored proc would not be going near the same rows. But even if they were, we'd be happy with either the before-look or the after-look. This needs to be a low-impact job and should have minimal impact on the other jobs, so we can't take out locks. Is there any hint we can use to do this? E.g. can we tell the query to just wait until the data has stopped moving and then try again?
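There does not appear to be a hint meaning "wait and retry", but one option (assuming SQL Server 2005 or later) is to catch error 601 and retry the statement after a short pause; a rough sketch with a placeholder query:

-- Hypothetical retry wrapper: the SELECT and table name are placeholders.
DECLARE @attempts int;
SET @attempts = 0;

WHILE @attempts < 5
BEGIN
    BEGIN TRY
        SELECT COUNT(*) FROM dbo.SomeTable WITH (NOLOCK);   -- placeholder query
        BREAK;                                               -- success, stop retrying
    END TRY
    BEGIN CATCH
        -- 601 = "Could not continue scan with NOLOCK due to data movement"
        IF ERROR_NUMBER() <> 601
            RETURN;                        -- some other failure: give up (or re-raise as needed)
        SET @attempts = @attempts + 1;
        WAITFOR DELAY '00:00:01';          -- brief pause before retrying
    END CATCH
END

The other common route is enabling READ_COMMITTED_SNAPSHOT on the database, which lets readers use row versions instead of NOLOCK and avoids error 601 entirely, but that is a database-level change.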
STATIC Defines a cursor that makes a temporary copy of the data to be used by the cursor. All requests to the cursor are answered from this temporary table in tempdb; therefore, modifications made to base tables are not reflected in the data returned by fetches made to this cursor, and this cursor does not allow modifications
It says that modifications are not allowed in a static cursor. I have a question regarding that.
Static Cursor

declare ll cursor global static for
    select name, salary from ag
open ll
fetch from ll
while @@FETCH_STATUS=0
    fetch from ll
update ag set salary=200 where 1=1
close ll
deallocate ll
In "AG" table, "SALARY" was 100 for all the entries. When I run the Cursor, it showed the salary value as "100" correctly.After the cursor was closed, I run the query select * from AG.But the result had updated to salary 200 as given in the cursor. file says modifications is not allowed in the static cursor.But I am able to update the data using static cursor.
Hello,

I have a test database with table A containing 10,000 rows and a table B containing 100,000 rows. Rows in B are "children" of rows in A - each row in A has 10 related rows in B (i.e. B has a foreign key to A).

Using ODBC I am executing the following loop 10,000 times, expressed below in pseudo-code:

"select * from A order by a_pk option (fast 1)"
"fetch from A result set"
"select * from B where fk_to_a = 'xxx' order by b_pk option (fast 1)"
"fetch from B result set" repeated 10 times

In the above pseudo-code 'xxx' is the primary key of the current A row. NOTE: it is not a mistake that we are repeatedly doing the A query and retrieving only the first row.

When the queries use fast-forward-only cursors this takes about 2.5 minutes. When the queries use dynamic cursors this takes about 1 hour. Does anyone know why the dynamic cursor is killing performance?

Because of the SQL Server ODBC driver it is not possible to have nested/multiple fast-forward-only cursors, hence I need to explore other alternatives. I can only assume that a different query plan is getting constructed for the dynamic cursor case versus the fast-forward-only cursor, but I have no way of finding out what that query plan is.

All help appreciated.
Kevin
I'm trying to implement an sp_MSforeachsp; however, when I call sp_MSforeach_worker I get the following errors. Can you please explain this problem to me so I can overcome the issue?
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 31
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 32
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16917, Level 16, State 1, Procedure sp_MSforeach_worker, Line 153
Cursor is not open.
here is the stored procedure:
Alter PROCEDURE [dbo].[sp_MSforeachsp]
@command1 nvarchar(2000)
, @replacechar nchar(1) = N'?'
, @command2 nvarchar(2000) = null
, @command3 nvarchar(2000) = null
, @whereand nvarchar(2000) = null
, @precommand nvarchar(2000) = null
, @postcommand nvarchar(2000) = null
AS
/* This procedure belongs in the "master" database so it is accessible to all databases */
/* This proc returns one or more rows for each stored procedure */
/* @precommand and @postcommand may be used to force a single result set via a temp table. */
declare @retval int
if (@precommand is not null) EXECUTE(@precommand)
/* Create the select */
EXECUTE(N'declare hCForEachTable cursor global for
DECLARE @DBNAME nvarchar(100), @SQLCMD nvarchar(2000)

DECLARE DBCur CURSOR FOR SELECT U_OB_DB FROM [@OB_TB04_COMPDATA]
OPEN DBCur
FETCH NEXT FROM DBCur INTO @DBNAME
WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @SQLCMD = 'SELECT T0.CARDCODE, T0.U_OB_TID AS TRANSID, T0.DOCNUM AS INV_NO, '
                   + 'T0.DOCDATE AS INV_DATE, T0.DOCTOTAL AS INV_AMT, T0.U_OB_DONO AS DONO '
                   + 'FROM ' + @DBNAME + '.dbo.OINV T0 WHERE T0.U_OB_TID IS NOT NULL'
    EXEC(@SQLCMD)
    PRINT @SQLCMD
    FETCH NEXT FROM DBCur INTO @DBNAME
END
CLOSE DBCur
DEALLOCATE DBCur
Part 2
SELECT T4.U_OB_PCOMP AS PARENTCOMP, T0.CARDCODE, T0.CARDNAME, ISNULL(T0.U_OB_TID,'') AS TRANSID,
       T0.DOCNUM AS SONO, T0.DOCDATE AS SODATE, SUM(T1.QUANTITY) AS SOQTY,
       T0.DOCTOTAL - T0.TOTALEXPNS AS SO_AMT, T3.DOCNUM AS DONO, T3.DOCDATE AS DO_DATE,
       SUM(T2.QUANTITY) AS DOQTY, T3.DOCTOTAL - T3.TOTALEXPNS AS DO_AMT
INTO #MAIN
FROM ORDR T0
JOIN RDR1 T1 ON T0.DOCENTRY = T1.DOCENTRY
LEFT JOIN DLN1 T2 ON T1.DOCENTRY = T2.BASEENTRY AND T1.LINENUM = T2.BASELINE AND T2.BASETYPE = T0.OBJTYPE
LEFT JOIN ODLN T3 ON T2.DOCENTRY = T3.DOCENTRY
LEFT JOIN OCRD T4 ON T0.CARDCODE = T4.CARDCODE
WHERE ISNULL(T0.U_OB_TID,0) <> 0
GROUP BY T4.U_OB_PCOMP, T0.CARDCODE, T0.CARDNAME, T0.U_OB_TID, T0.DOCNUM, T0.DOCDATE,
         T3.DOCNUM, T3.DOCDATE, T0.DOCTOTAL, T3.DOCTOTAL, T3.TOTALEXPNS, T0.TOTALEXPNS
My question is: how do I join part 1 and part 2? Is that possible?
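One possibility, sketched below, is to have part 1 INSERT ... EXEC its per-database results into a temp table instead of returning them directly, and then join that temp table to #MAIN. The column names, data types, and join keys here are guesses and would need adjusting:

-- Collect part 1's output so it can be joined later (structure is illustrative).
CREATE TABLE #INV (CARDCODE nvarchar(20), TRANSID nvarchar(20), INV_NO int,
                   INV_DATE datetime, INV_AMT numeric(19,6), DONO nvarchar(20))

-- Inside the DBCur loop, replace EXEC(@SQLCMD) with:
INSERT INTO #INV EXEC(@SQLCMD)

-- After part 2 has built #MAIN, join the two temp tables:
SELECT M.*, I.INV_NO, I.INV_DATE, I.INV_AMT
FROM #MAIN M
LEFT JOIN #INV I ON I.TRANSID = M.TRANSID AND I.CARDCODE = M.CARDCODE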
I'm new to cursors, and I'm not sure what's wrong with this code; it runs forever, and when I stop it I get cursor-open errors.
declare Q cursor for select systudentid from satrans
declare @id int
open Q
fetch next from Q into @id
while @@fetch_status = 0
begin
    declare c cursor for
    Select b.ssn, SaTrans.SyStudentID, satrans.date, satrans.type, SaTrans.SyCampusID,
           Amount = Case SaTrans.Type
                        When 'P' Then SaTrans.Amount * -1
                        When 'C' Then SaTrans.Amount * -1
                        Else SaTrans.Amount
                    END
    From SaTrans, systudent b
    where satrans.systudentid = b.systudentid
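The snippet ends abruptly, but assuming the structure shown, two things stand out: the outer loop never fetches its next row (so @@fetch_status never changes), and the inner cursor c is re-declared on every pass without being opened, closed, or deallocated. A hedged sketch of the shape the loop usually needs (the inner SELECT is abbreviated, and the extra @id filter is a guess):

open Q
fetch next from Q into @id
while @@fetch_status = 0
begin
    declare c cursor for
        select b.ssn, SaTrans.SyStudentID   -- ...rest of the inner SELECT...
        from SaTrans, systudent b
        where satrans.systudentid = b.systudentid
          and satrans.systudentid = @id     -- guess: limit the inner cursor to the current student

    open c
    -- fetch from c and do the per-row work here
    close c
    deallocate c                  -- otherwise the next DECLARE fails because the cursor already exists

    fetch next from Q into @id    -- without this the outer loop never ends
end
close Q
deallocate Q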
I'm having a difficult time here trying to figure out what to do. I need a way to scroll through a recordset and display the results with both forward and backward movement on a web page (PHP using ADO/COM). I know that if I use a client-side cursor, all the records get shoved to the client every time that stored procedure is executed; if this database grows big, won't that be an issue? I know that I can set up a server-side cursor that will only send the record I need to the front end, but I've been reading around and a lot of people say never to use a server-side cursor because of performance issues. So I guess I'm weighing network performance needs with the client-side cursor vs. server performance with the server-side cursor. I am really confused - which one should I use?
-Jim
I hope this is the appropriate forum for this question; if not, then I apologize. I've got a SQL Server 2000 stored procedure that returns data to be used in a Crystal report in Visual Studio 2005. Most of the stored procedure works well, but there is a point where I need to calculate an average number of days between a group of date pairs. I'm not familiar with cursors, but I think that I will need to use one to achieve the result I am looking for, so I came up with the code below, which is a snippet from my stored procedure. In this part of the code, the sp looks at the temporary table #lmreport (which holds all of the data that is returned at the end to Crystal) and, for every row in the table where the terrid is 'T' (the territory is domestic), it selects all of those territories from the territory table and loops through them to determine the date averages (by calling a nested stored procedure, also included below) for each territory, and then updates #lmreport with that data. When I try to run the stored procedure, I get "The column prefix '#lmreport' does not match with a table name or alias name used in the query." on the line indicated. Does anyone have any idea what might be wrong or if this will even work the way I need it to? Thank you in advance.
I need some help with the concept of a cursor, as I see it being used in a stored procedure I need to maintain. Here is some code from the stored proc. Can someone tell me what is going on here? I have left out some of the SQL, but have isolated the cursor stuff.

Open MarketCursor   -- How is MarketCursor loaded with data ?
FETCH NEXT FROM MarketCursor INTO ItemID, @Item, @Reguest
WHILE @@FETCH_STATUS = 0
BEGIN
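The cursor gets its data from the DECLARE ... CURSOR FOR <select> statement that must appear earlier in the procedure (presumably in the SQL that was left out); OPEN runs that SELECT, and each FETCH then walks through its result set. A purely hypothetical declaration, just to show the shape:

-- Hypothetical: the real column and table names live in the omitted part of the proc.
DECLARE @ItemID int, @Item varchar(50), @Reguest varchar(50)

DECLARE MarketCursor CURSOR FOR
    SELECT ItemID, ItemName, RequestCode     -- this SELECT defines what each FETCH returns
    FROM dbo.MarketItems

OPEN MarketCursor                            -- executes the SELECT, positions before the first row
FETCH NEXT FROM MarketCursor INTO @ItemID, @Item, @Reguest
WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row work happens here
    FETCH NEXT FROM MarketCursor INTO @ItemID, @Item, @Reguest
END
CLOSE MarketCursor
DEALLOCATE MarketCursor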
I have something like update table set field = ... where field = ..., and for each entry that was affected by this query I want to insert an entry into another table. I have always done this with cursors - is there a more efficient way? For some reason cursors run a lot slower on my SQL 2005 server than on the SQL 2000 server...
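On SQL Server 2005, the OUTPUT clause can do this in one set-based statement, no cursor needed; the table and column names below are made up for illustration:

-- Hypothetical names: dbo.Orders is the table being updated, dbo.OrderAudit is the "other" table.
UPDATE dbo.Orders
SET    Status = 'Shipped'
OUTPUT inserted.OrderID, inserted.Status, GETDATE()
INTO   dbo.OrderAudit (OrderID, NewStatus, ChangedAt)
WHERE  Status = 'Packed';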
Hi, I have created a cursor, but I want to use it in my ASP.NET programming: whenever an insert or delete command runs, I want to execute my cursor. How can I do that using ASP.NET with C#? Waiting for a reply, thanks.
Hello: I am trying to define a cursor as follows:
DECLARE EmployeeList CURSOR FOR dbo.GetRecord(@EmployeeID, @CurrentDate)
Can't I use a UDF in the CURSOR FOR? Help please. Thank you.
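A cursor has to be declared FOR a SELECT statement, so assuming dbo.GetRecord is a table-valued function, wrapping the call in a SELECT should work:

DECLARE EmployeeList CURSOR FOR
    SELECT *                                        -- cursor definitions require a SELECT
    FROM dbo.GetRecord(@EmployeeID, @CurrentDate);  -- TVF used as a row source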
Hello, I'm trying to construct a cursor that will sequentially increment a number and then update a column with the incremented number. My problem is that all the rows in the table are being updated with the base number + 1, so all rows are updated with 278301. BUT what I really want is for only the items with an adrscode of 'BILL TO' to be given an incremented number. For example, if there are only five rows out of 100 with adrscode = 'BILL TO', then only those five rows should be updated, and the values of custnmbr should be 278301, 278302, 278303, and so on. I could really use some help with this cursor:

Declare @CustomerName as char(60), @seqno as int, @BaseSeqno as int
set @BaseSeqno = 278300

declare c cursor for
    select custnmbr from NXOFcustomers
    Where adrscode = 'BILL TO'
    order by custnmbr

open c
fetch next from c into @CustomerName
while @@fetch_status = 0
begin
    set @seqno = @BaseSeqno + 1
    update NXOFcustomers set custnmbr = @seqno
    Where custnmbr = @CustomerName
    fetch next from c into @CustomerName
end
close c
deallocate c
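The number never grows because @BaseSeqno is never changed, so @BaseSeqno + 1 is 278301 on every pass. A hedged sketch of one fix - increment the running counter itself inside the loop (a STATIC cursor is used here so that rewriting custnmbr cannot feed rows back into the cursor):

Declare @CustomerName as char(60), @seqno as int

declare c cursor static for              -- static snapshot of the 'BILL TO' rows
    select custnmbr from NXOFcustomers
    where adrscode = 'BILL TO'
    order by custnmbr

set @seqno = 278300
open c
fetch next from c into @CustomerName
while @@fetch_status = 0
begin
    set @seqno = @seqno + 1              -- 278301, 278302, 278303, ...
    update NXOFcustomers set custnmbr = @seqno
    where custnmbr = @CustomerName
    fetch next from c into @CustomerName
end
close c
deallocate c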
Declare c_cursor Cursor Scroll For
    Select card_id From cardcreator where card_every_cat = @cat_id

Open c_cursor

/* Scroll to the randomly selected row and populate into output parameters */
Fetch absolute @iRandomRecord From c_cursor Into @vi_cardid1
print @vi_cardid1

/* Check the fetch status; if the end of the record set has been reached, begin from the first row */
select @@fetch_status
if (@@fetch_status = 0)
Begin
    Fetch next from c_cursor into @vi_cardid2
    print @vi_cardid2
End
if (@@fetch_status = 0)
Begin
    Fetch next from c_cursor into @vi_cardid3
End
print @vi_cardid3

/* Close and deallocate cursor */
Close c_cursor
Deallocate c_cursor
I need to have at least three records, but if the random value starts the cursor near the end, the cursor will not wrap to the beginning.
Is there a way to wrap the cursor to the beginning if its fetch status is not zero?
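Since c_cursor is declared SCROLL, FETCH FIRST is available, so one option is to check @@FETCH_STATUS after each FETCH NEXT and restart from the first row when it is non-zero; a sketch using the same cursor and variables:

Fetch next from c_cursor into @vi_cardid2
if (@@fetch_status <> 0)
    Fetch first from c_cursor into @vi_cardid2   -- wrap around to the start of the result set

Fetch next from c_cursor into @vi_cardid3
if (@@fetch_status <> 0)
    Fetch first from c_cursor into @vi_cardid3   -- same wrap-around for the third record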
I am sure I am not the first one to ask this. I have two tables; what I would like to do is update the second table using the values in the first table where T1.id = T2.id. Normally I would use a cursor to loop through table two to achieve this, but is it possible to do this without using a cursor?
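A single joined UPDATE should do it; a minimal sketch, with Table1/Table2 and the value column standing in for the real names:

-- Set-based alternative to the cursor loop (table/column names are placeholders).
UPDATE T2
SET    T2.value = T1.value
FROM   Table2 AS T2
JOIN   Table1 AS T1
  ON   T1.id = T2.id;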
Table 1 has numerous resumes for each person. Each resume has a unique id, i.e.:

Table 1
res_id  fname  lname  userid  pwd  address  city  state  etc...
100     John   Doe    jd      ok   xxxx     xxxx  xx     xxxx
104     Sally  May    sm      sm   ccccc    cc    c      cc ccc
643     John   Doe    jd      ok   ssss     null  null
1003    John   Doe    jd      ok   123 elm  Nome  AK     ...
5000    Tom    Cat    tc      tc   null     null  null
I need to insert into Table 2 only the demographic information for each person appearing in Table 1. The catch is that Table 2 doesn't have the same unique id that appears in Table 1; userid and pwd are unique in Table 2 but appear numerous times in Table 1.
Table 2
new_ident  userid  pwd  address  city  state  etc..
10         jd      ok   123 elm  Nome  AK  ....
11  Sally May  sm  sm  ccccc  cc  c  cc ccc
12         tc      tc   null  null  null
Basically I need to choose the most current max(res_id) occurrence for John Doe above to get only one row out of his three rows. Then I need to get all the other unique rows from Table 1.
I hope that is clear. I was considering a cursor. Any ideas?
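A set-based version looks possible: join Table 1 back to a derived table holding MAX(res_id) per userid/pwd and insert the matching rows. A sketch with assumed table/column names, and assuming new_ident is an identity column on Table 2:

-- Keep only the most recent resume per userid/pwd, then copy the demographics.
INSERT INTO Table2 (userid, pwd, address, city, state)
SELECT t1.userid, t1.pwd, t1.address, t1.city, t1.state
FROM   Table1 AS t1
JOIN   (SELECT userid, pwd, MAX(res_id) AS max_res_id
        FROM   Table1
        GROUP  BY userid, pwd) AS latest
  ON   latest.userid     = t1.userid
 AND   latest.pwd        = t1.pwd
 AND   latest.max_res_id = t1.res_id;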
Is using a cursor the only way to do the update in this case? I'm updating TableA.ID with TableB.New_id where TableA.ID = TableB.ID. TableA has 2.5 million records and TableB has 500,000 records. Doing it this way brings the system to its knees and takes forever. Any suggestions are welcome.
declare mrn_cur cursor for
    select dealer_ident, dealer_id from dealer
    for read only

declare @result int
declare @temp_ident int
declare @temp_id int
declare @temp_var int

open mrn_cur
fetch mrn_cur into @temp_ident, @temp_id

while (@@fetch_status = 0)
begin
    begin transaction
    update label set dealer_id = @temp_ident where dealer_id = @temp_id
    commit tran
    fetch mrn_cur into @temp_ident, @temp_id
end
close mrn_cur
deallocate mrn_cur
go
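A set-based alternative, assuming (as the cursor logic suggests) that dealer carries the old value in dealer_id and the new value in dealer_ident, is a single joined UPDATE; if log growth from touching 2.5 million rows in one statement is a concern, the same update can be run in batches:

-- One pass instead of one transaction per row (column roles inferred from the cursor logic).
UPDATE label
SET    label.dealer_id = d.dealer_ident
FROM   label
JOIN   dealer AS d
  ON   label.dealer_id = d.dealer_id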
I am writing an application for a training facility. I have three tables in particular that I'm concerned with.
Here they are with the relevant keys:
1. Registrations - contains a customerid and a classid
2. Sessions - contains a classid (multiple sessions can exist for a class)
3. Attendence - contains a session and a customerid
Whenever I insert a session record for a class, I want to automatically create a corresponding record in the Attendence table for every student in the class. My only thought is to create an insert trigger on the Sessions table that creates a cursor containing the customerid of every student registered in the class. Then I can walk through the cursor and insert an Attendence record for each student.
I really don't want to use a cursor if I can help it, but I can't think of a way to write a single INSERT statement to put into my trigger. Is there a way to do this without using a cursor?
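For what it's worth, a set-based trigger seems feasible because the inserted pseudo-table can be joined straight to Registrations; a minimal sketch, assuming column names like SessionID, ClassID, and CustomerID (adjust to the real schema):

-- Hypothetical column names; one Attendence row per registered student per new session.
CREATE TRIGGER trg_Sessions_Insert
ON dbo.Sessions
AFTER INSERT
AS
BEGIN
    INSERT INTO dbo.Attendence (SessionID, CustomerID)
    SELECT i.SessionID, r.CustomerID
    FROM   inserted AS i                   -- handles multi-row inserts too
    JOIN   dbo.Registrations AS r
      ON   r.ClassID = i.ClassID;
END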