"Cursor provide row-by-row level processing and it will store the result sets in 'TEMPDB' database".
Because of this, when a cursor is used in triggers or stored procedures, will performance go up or come down? I would be thankful for a good explanation of this.
Hello, I have a test database with table A containing 10,000 rows and a table B containing 100,000 rows. Rows in B are "children" of rows in A: each row in A has 10 related rows in B (i.e. B has a foreign key to A). Using ODBC I am executing the following loop 10,000 times, expressed below in pseudo-code:

"select * from A order by a_pk option (fast 1)"
"fetch from A result set"
"select * from B where fk_to_a = 'xxx' order by b_pk option (fast 1)"
"fetch from B result set" repeated 10 times

In the above pseudo-code 'xxx' is the primary key of the current A row. NOTE: it is not a mistake that we are repeatedly doing the A query and retrieving only the first row.

When the queries use fast-forward-only cursors this takes about 2.5 minutes. When the queries use dynamic cursors this takes about 1 hour. Does anyone know why the dynamic cursor is killing performance? Because of the SQL Server ODBC driver it is not possible to have nested/multiple fast-forward-only cursors, hence I need to explore other alternatives.

I can only assume that a different query plan is getting constructed for the dynamic cursor case versus the fast-forward-only cursor, but I have no way of finding out what that query plan is.

All help appreciated.
Kevin
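For reference, the two cursor types being compared can also be declared directly in T-SQL, which makes it easier to look at their behaviour and plans in Query Analyzer. This is only a rough sketch: the ODBC driver actually uses API server cursors, and the table and column names (A, a_pk) are taken from the pseudo-code above.

-- Fast-forward-only: read-only, forward-only; typically the cheapest cursor type.
DECLARE fast_cur CURSOR FAST_FORWARD FOR
    SELECT * FROM A ORDER BY a_pk
OPEN fast_cur
FETCH NEXT FROM fast_cur
CLOSE fast_cur
DEALLOCATE fast_cur

-- Dynamic: membership and values are re-evaluated as you fetch, which is far more expensive.
DECLARE dyn_cur CURSOR DYNAMIC FOR
    SELECT * FROM A ORDER BY a_pk
OPEN dyn_cur
FETCH NEXT FROM dyn_cur
CLOSE dyn_cur
DEALLOCATE dyn_cur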
Hi, I created a view on two huge tables. I tried to run a simple SELECT statement on this view and it took several hours to obtain the result. How can I improve the performance of a view? The view should make use of the indexes built on both tables, am I right? Thanks.
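On the indexing question: an ordinary view is just a stored SELECT, so when it is queried SQL Server expands it and can only use whatever indexes already exist on the two base tables; the view itself adds nothing. If the join itself is the bottleneck, one option sometimes worth testing is an indexed view, which materialises the join. A minimal sketch with invented table and column names, since the real tables are not shown in the post:

-- Hypothetical names throughout; adjust to the real schema.
CREATE VIEW dbo.vw_BigJoin
WITH SCHEMABINDING                    -- required before the view can be indexed
AS
SELECT b.B_Id, a.A_Id, a.SomeColumn, b.OtherColumn
FROM dbo.BigTableB AS b
JOIN dbo.BigTableA AS a ON a.A_Id = b.A_Id
GO
-- The unique clustered index materialises the join result; its key must uniquely
-- identify each row of the view (here B_Id is assumed to be BigTableB's primary key).
CREATE UNIQUE CLUSTERED INDEX IX_vw_BigJoin ON dbo.vw_BigJoin (B_Id)
-- On non-Enterprise editions the query has to reference the view WITH (NOEXPAND) to use this index.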
I'll just throw out my question: how can I increase SSIS performance?
I have a really heavy job with thousands of records in my base selection; then I perform some lookups (I replaced most of them with SQL) and derived columns (again, I replaced as much as possible with SQL). Finally, after a slowly changing dimension task, I do an update/insert on a given table. Is there a trick to speed up the lookups and inserts (something like manipulating the buffer sizes, just asking)? The fact is that I replaced a script task with pure SQL joins and gained 6 of the 12 hours this job took.
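If buffer sizes are the lever being asked about, the data flow task does expose DefaultBufferMaxRows and DefaultBufferSize properties that can be tuned. For the lookups themselves, the biggest wins usually come from pushing the join into the OLE DB source query, along the lines of the script-task replacement already mentioned. A rough sketch with invented table and column names:

-- Hypothetical illustration only: StagingFact, DimCustomer and their columns are made up.
-- Doing the lookup as a join in the source query keeps the work in the database engine
-- instead of performing row-by-row lookups in the pipeline.
SELECT  s.OrderID,
        s.OrderAmount,
        d.CustomerKey            -- value that would otherwise come from a Lookup transform
FROM    dbo.StagingFact AS s
LEFT JOIN dbo.DimCustomer AS d
        ON d.CustomerBusinessKey = s.CustomerBusinessKey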
I am writing an ASP-based application that creates a dynamic query, executes it, and displays the results. I was thinking about writing a stored procedure to increase performance. How much can the SP help me boost query response time?
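For what it's worth, the gain from a stored procedure usually comes from plan reuse and fewer round trips rather than a dramatic speed-up of the query text itself; a parameterized procedure also avoids re-parsing the dynamic SQL on every request. A rough sketch with invented object names:

-- Hypothetical sketch of moving a dynamic ASP query into a parameterized procedure;
-- the Orders table and its columns are invented for illustration.
CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerID int
AS
SET NOCOUNT ON
SELECT OrderID, OrderDate, TotalAmount
FROM dbo.Orders
WHERE CustomerID = @CustomerID
ORDER BY OrderDate DESC
GO
-- From ASP/ADO the call would be roughly: EXEC dbo.usp_GetOrdersByCustomer @CustomerID = 42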
I have 3 tables (A, B, C) with millions of records (A ca. 5 million, B and C ca. 10 million each). I have created a join between them:
select some fields (A, B, C)
FROM A as a
JOIN B as b ON a.a1 = b.a1 AND a.a2 = b.a2
JOIN C as c ON b.b1 = c.b1 AND b.b2 = c.b2
WHERE fieldtime <= date/time
But it takes too much time: after two and a half hours it is still running.
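Before anything else it is worth checking that the join and filter columns are indexed; with tables of this size, a missing index on the join keys can easily explain a multi-hour runtime. A sketch only, since the real column types and which table owns fieldtime are not shown:

-- Indexes matching the join and filter columns used above; adjust to the real schema.
CREATE INDEX IX_B_a1_a2 ON B (a1, a2)         -- supports JOIN B ON a.a1 = b.a1 AND a.a2 = b.a2
CREATE INDEX IX_C_b1_b2 ON C (b1, b2)         -- supports JOIN C ON b.b1 = c.b1 AND b.b2 = c.b2
CREATE INDEX IX_A_fieldtime ON A (fieldtime)  -- assumes fieldtime belongs to A and is the only filter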
Dear all, I'm using SQL Server 2005 Standard Edition. I have the following stored procedure that is executed against two tables, RecordedCalls and RecordedCallsTags. The table RecordedCalls has more than 10,000,000 records and RecordedCallsTags has about 7,500,000 records. The lines marked in baby blue are dynamic (a dynamic WHERE statement) and vary every time this stored procedure is executed; the condition may contain 7 columns, or 10 columns, or 2 columns, etc. Now I want to create non-clustered indexes on the columns used in the WHERE statement, but the DTA suggests different indexing whenever the WHERE statement changes. So what is the right way to create the indexes: one index on all the columns at once, or separate indexes on each column? Sometimes the DTA suggests 5 columns together in one index when I'm using 5 conditions. I can't accumulate all the possible indexes, since the WHERE statement always varies from situation to situation. Below is the SP:
CREATE TABLE #tempLookups (ID int identity(0,1),Code NVARCHAR(100),NameE NVARCHAR(500),NameA NVARCHAR(500))
CREATE TABLE #tempTable (ID int identity(0,1),TypesCount INT,CallsType NVARCHAR(50))
INSERT INTO #tempLookups SELECT Code, NameE, NameA FROM lookups WHERE [Type] = 'CALLTYPES' ORDER BY Ordering ASC
INSERT INTO #tempTable SELECT COUNT(DISTINCT(RecordedCalls.ID)) As TypesCount,RecordedCalls.CallType as CallsType
FROM RecordedCalls LEFT OUTER JOIN RecordedCallsTags ON RecordedCalls.ID = RecordedCallsTags.CallID
WHERE RecordedCalls.ID <= '9369907'
AND (RecordedCalls.CallDate BETWEEN cast ('01 Jan 1910 00:00:00:000' as datetime ) AND cast ( '01 Jan 2210 00:00:00:000' as datetime ))
AND (RecordedCalls.Duration BETWEEN 0 AND 1000000)
AND RecordedCalls.ChannelID NOT IN('62061','62062','62063','62064','64110','64111','64112','64113','64114','69860','69861','69862','69863','69866','69867','69868')
AND RecordedCalls.ServerID NOT IN('2')
AND RecordedCalls.AgentID NOT IN('1000010000')
AND (RecordedCallsTags.TagID is null OR RecordedCallsTags.TagID NOT IN('100','200'))
AND RecordedCalls.IsDeleted='false'
GROUP BY RecordedCalls.CallType
SELECT IsNull(#tempTable.TypesCount, 0) AS TypesCount, CASE('English')
WHEN 'Arabic' THEN #tempLookups.NameA
ELSE #tempLookups.NameE
END AS CallsType FROM
#tempTable RIGHT OUTER JOIN #tempLookups ON #tempTable.CallsType = #tempLookups.Code
DROP TABLE #tempLookups
DROP TABLE #tempTable
Thanks all, Tayseer
Any suggestions on how to create efficient indexes?
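One common compromise, rather than creating a separate index for every WHERE combination the DTA proposes, is a single composite index keyed on the columns that appear in almost every call and are reasonably selective, with the remaining columns carried as included columns, plus a narrow index supporting the join to the tags table. This is a sketch only, based on the predicates shown above; the right key order depends on the actual data distribution.

-- Composite index on the predicates that appear in (almost) every call.
CREATE NONCLUSTERED INDEX IX_RecordedCalls_Common
    ON RecordedCalls (IsDeleted, CallDate, CallType)
    INCLUDE (Duration, ChannelID, ServerID, AgentID)

-- Supports the LEFT OUTER JOIN and the TagID filter.
CREATE NONCLUSTERED INDEX IX_RecordedCallsTags_CallID_TagID
    ON RecordedCallsTags (CallID, TagID)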
I have a stored procedure below that loops through a table and updates another table. I used to do this client-side in VB, but was hoping for better performance by doing everything server-side.
Question:
Am I doing something wrong, since the query is this slow? Why are cursors so bad?
Thanks
David
ALTER PROCEDURE usp_setinternational
AS
Declare @acnt_no int
declare @cntry_code varchar(10)
declare @counter int
declare @error int
declare MyC cursor
Local
for
SELECT ACNT_NO, COUNTRY_CODE FROM INTNL_ACNT
open MyC
fetch next from MyC into @acnt_no,@cntry_code
select @counter = 1
while @@fetch_status = 0
begin
update xform2 set
address_type='m'
where acnt_no = @acnt_no
update addr set delivery='m', cntry_cde = @cntry_code
where addr_id in (select addr_id from acnt_addr where acnt_no = @acnt_no)
-- advance to the next account and finish the loop
fetch next from MyC into @acnt_no, @cntry_code
end
close MyC
deallocate MyC
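For comparison, the same work can be expressed as two set-based UPDATE statements, which normally performs far better than the row-by-row cursor. This is a sketch that assumes the table and column names used in the procedure above.

-- Set-based sketch of the same updates.
UPDATE xform2
SET    address_type = 'm'
WHERE  acnt_no IN (SELECT acnt_no FROM INTNL_ACNT)

-- If an address belongs to more than one international account, the cursor version simply
-- applied the last country code fetched; here one of them is picked by the join.
UPDATE a
SET    a.delivery  = 'm',
       a.cntry_cde = i.country_code
FROM   addr AS a
JOIN   acnt_addr AS aa ON aa.addr_id = a.addr_id
JOIN   INTNL_ACNT AS i ON i.acnt_no  = aa.acnt_no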
Hi friends, I am using a cursor to fetch data row by row, but the main problem is that it takes a very long time to execute. Can anybody help me replace the cursor with some other method or trick so that I can make it perform fast? My query is like this:
--update cdr set destinationid = null
DECLARE @CDRID BIGINT;
DECLARE @CodeDestID BIGINT;
DECLARE @TempCodeDestID BIGINT;
DECLARE @CtryCode VARCHAR(20);
DECLARE @MATCH INT;
DECLARE @dialed_digits NVARCHAR(50);
DECLARE @FormattedNbr VARCHAR(50);
DECLARE @CountryCode VARCHAR(2);
DECLARE CDR_cursor CURSOR FOR
    SELECT CDRID, dialed_digits FROM mydata.dbo.CDR WHERE DestinationID IS NULL
OPEN CDR_cursor
FETCH NEXT FROM CDR_cursor INTO @CDRID, @dialed_digits
WHILE @@FETCH_STATUS = 0
BEGIN
SET @FormattedNbr = replace(LTRIM(RTRIM(@dialed_digits)), '"', '');
SET @CountryCode = substring(@FormattedNbr, 0, 3);
SET @CodeDestID = null;
SET @MATCH = 0;
DECLARE Dest_cursor CURSOR FOR SELECT DestinationID, Code FROM mydata.dbo.ctryCode WHERE code Like LTRIM(RTRIM(@CountryCode)) +'%' Order by code Desc
OPEN Dest_cursor
FETCH NEXT FROM Dest_cursor INTO @TempCodeDestID, @CtryCode
WHILE @@FETCH_STATUS = 0
BEGIN
IF @MATCH <> 1
BEGIN
SET @MATCH = PATINDEX(@CtryCode + '%', @FormattedNbr);
IF @MATCH = 1
BEGIN
print @TempCodeDestID
SET @CodeDestID = @TempCodeDestID
END
END
FETCH NEXT FROM Dest_cursor INTO @TempCodeDestID, @CtryCode
END
CLOSE Dest_cursor
DEALLOCATE Dest_cursor
IF @MATCH = 1
BEGIN
--Updating the maximum possible match
UPDATE mydata.dbo.CDR SET DestinationID = @CodeDestID WHERE CDRID = @CDRID
SET @CodeDestID = null;
SET @MATCH = 0;
END
FETCH NEXT FROM CDR_cursor INTO @CDRID, @dialed_digits
END
CLOSE CDR_cursor
DEALLOCATE CDR_cursor
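For anyone looking to get rid of the nested cursors entirely: the same "longest matching code" logic can be expressed as a single set-based UPDATE. A sketch, assuming SQL Server 2005 or later for CROSS APPLY and the same tables and columns as above:

-- For each unmatched CDR row, pick the longest country code that is a prefix of the number.
UPDATE c
SET    c.DestinationID = m.DestinationID
FROM   mydata.dbo.CDR AS c
CROSS APPLY (SELECT TOP (1) d.DestinationID
             FROM   mydata.dbo.ctryCode AS d
             WHERE  REPLACE(LTRIM(RTRIM(c.dialed_digits)), '"', '') LIKE d.Code + '%'
             ORDER BY LEN(d.Code) DESC) AS m
WHERE  c.DestinationID IS NULL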
The cursor at the bottom iterates only to print the number of rows. The problem is in the select. This takes 30 seconds to iterate through 1242 records. But if I add a TOP 1000000 (or whatever number) to the select, the same iteration takes less than 1 second. I've tested each query without a cursor, and both have the same cost and performance (not exactly the same plan). Note that I got the same performance improvement by declaring the cursor as STATIC. Why is the TOP affecting the cursor iteration so much?
STATIC Defines a cursor that makes a temporary copy of the data to be used by the cursor. All requests to the cursor are answered from this temporary table in tempdb; therefore, modifications made to base tables are not reflected in the data returned by fetches made to this cursor, and this cursor does not allow modifications
It says that modifications are not allowed with a static cursor. I have a question regarding that.
Static Cursor
declare ll cursor global static
for select name, salary from ag
open ll
fetch from ll
while @@FETCH_STATUS = 0
    fetch from ll
update ag set salary = 200 where 1 = 1
close ll
deallocate ll
In "AG" table, "SALARY" was 100 for all the entries. When I run the Cursor, it showed the salary value as "100" correctly.After the cursor was closed, I run the query select * from AG.But the result had updated to salary 200 as given in the cursor. file says modifications is not allowed in the static cursor.But I am able to update the data using static cursor.
I'm trying to implement an sp_MSforeachsp; however, when I call sp_MSforeach_worker I get the following errors. Can you please explain this problem to me so I can overcome the issue?
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 31
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16958, Level 16, State 3, Procedure sp_MSforeach_worker, Line 32
Could not complete cursor operation because the set options have changed since the cursor was declared.
Msg 16917, Level 16, State 1, Procedure sp_MSforeach_worker, Line 153
Cursor is not open.
here is the stored procedure:
Alter PROCEDURE [dbo].[sp_MSforeachsp]
@command1 nvarchar(2000)
, @replacechar nchar(1) = N'?'
, @command2 nvarchar(2000) = null
, @command3 nvarchar(2000) = null
, @whereand nvarchar(2000) = null
, @precommand nvarchar(2000) = null
, @postcommand nvarchar(2000) = null
AS
/* This procedure belongs in the "master" database so it is accessible to all databases */
/* This proc returns one or more rows for each stored procedure */
/* @precommand and @postcommand may be used to force a single result set via a temp table. */
declare @retval int
if (@precommand is not null) EXECUTE(@precommand)
/* Create the select */
EXECUTE(N'declare hCForEachTable cursor global for
Part 1
DECLARE @DBNAME NVARCHAR(100), @SQLCMD NVARCHAR(2000)   -- variables used by the loop below
DECLARE DBCur CURSOR FOR SELECT U_OB_DB FROM [@OB_TB04_COMPDATA]
OPEN DBCur
FETCH NEXT FROM DBCur INTO @DBNAME
WHILE @@FETCH_STATUS = 0 BEGIN
SELECT @SQLCMD = 'SELECT T0.CARDCODE, T0.U_OB_TID AS TRANSID, T0.DOCNUM AS INV_NO, '
    + 'T0.DOCDATE AS INV_DATE, T0.DOCTOTAL AS INV_AMT, T0.U_OB_DONO AS DONO '
    + 'FROM ' + @DBNAME + '.dbo.OINV T0 WHERE T0.U_OB_TID IS NOT NULL'
EXEC(@SQLCMD)
PRINT @SQLCMD
FETCH NEXT FROM DBCur INTO @DBNAME
END
CLOSE DBCur
DEALLOCATE DBCur
Part 2
SELECT T4.U_OB_PCOMP AS PARENTCOMP, T0.CARDCODE, T0.CARDNAME, ISNULL(T0.U_OB_TID, '') AS TRANSID,
    T0.DOCNUM AS SONO, T0.DOCDATE AS SODATE, SUM(T1.QUANTITY) AS SOQTY,
    T0.DOCTOTAL - T0.TOTALEXPNS AS SO_AMT, T3.DOCNUM AS DONO, T3.DOCDATE AS DO_DATE,
    SUM(T2.QUANTITY) AS DOQTY, T3.DOCTOTAL - T3.TOTALEXPNS AS DO_AMT
INTO #MAIN
FROM ORDR T0
JOIN RDR1 T1 ON T0.DOCENTRY = T1.DOCENTRY
LEFT JOIN DLN1 T2 ON T1.DOCENTRY = T2.BASEENTRY AND T1.LINENUM = T2.BASELINE AND T2.BASETYPE = T0.OBJTYPE
LEFT JOIN ODLN T3 ON T2.DOCENTRY = T3.DOCENTRY
LEFT JOIN OCRD T4 ON T0.CARDCODE = T4.CARDCODE
WHERE ISNULL(T0.U_OB_TID, 0) <> 0
GROUP BY T4.U_OB_PCOMP, T0.CARDCODE, T0.CARDNAME, T0.U_OB_TID, T0.DOCNUM, T0.DOCDATE, T3.DOCNUM, T3.DOCDATE, T0.DOCTOTAL, T3.DOCTOTAL, T3.TOTALEXPNS, T0.TOTALEXPNS
My question is: how do I join part 1 and part 2? Is there a possibility?
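One possible way to combine them is to have Part 1 insert its results into a temp table instead of just executing the dynamic SELECT, and then join that temp table to #MAIN from Part 2. A sketch is below; the column data types and the join column (TRANSID) are assumptions.

-- Capture Part 1 into a temp table so it can be joined to #MAIN.
CREATE TABLE #PART1 (CARDCODE NVARCHAR(50), TRANSID NVARCHAR(50), INV_NO INT,
                     INV_DATE DATETIME, INV_AMT NUMERIC(19, 6), DONO NVARCHAR(50))

-- Inside the database loop, replace EXEC(@SQLCMD) with:
INSERT INTO #PART1 (CARDCODE, TRANSID, INV_NO, INV_DATE, INV_AMT, DONO)
EXEC(@SQLCMD)

-- After both parts have run, join the two temp tables.
SELECT m.*, p.INV_NO, p.INV_DATE, p.INV_AMT
FROM #MAIN AS m
LEFT JOIN #PART1 AS p
       ON p.CARDCODE = m.CARDCODE AND p.TRANSID = m.TRANSID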
Choose an EEO-1 classification: increase the salaries of all employees that have the selected EEO-1 classification by 10%.
Increase all employees' salaries by 5%.
Hi! I am new to SQL, but trying to teach myself. I have had lots of help from you guys and appreciate it very much. Today, I was trying to do the following:
In my "Employees" table, I have a column called classifications. I want to increase the salaries of the employees with a classification of EEO-1 by 10%, but I have not figured out how to do so.
My second question is how would I increase all employees' salaries by 5%?
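A sketch of both updates, assuming the Employees table has the classifications column mentioned above plus a salary column (the salary column name is a guess):

-- Raise salaries by 10% for the chosen EEO-1 classification.
UPDATE Employees
SET    salary = salary * 1.10
WHERE  classifications = 'EEO-1'

-- Raise everyone's salary by 5%.
UPDATE Employees
SET    salary = salary * 1.05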
Hello all, I have what I think is a unique problem that I'm hoping some of you can help me with.
I need to create a record number that is incremented by 1 whenever someone adds a new record to the database. For example, records numbered 1, 2, 3 are in the database. When a user adds a new record, SQL takes the last record number, 3 in this case, and adds 1 to it, thus producing 4.
Also, I need the ability to reuse deleted record numbers for new records. Using the example above, say a user deletes record number 2. Whenever someone adds a new record, SQL would see the missing number and assign the new record that number.
I hope I'm making sense here. Does anyone have any ideas about this? Any articles on the web that someone could point me to?
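One way to sketch the "reuse the lowest free number" part is shown below. The table and column names are invented, and under concurrent inserts you would need additional locking (or a dedicated key table) to stop two sessions grabbing the same number.

-- Invented names: MyRecords (recordno int).
DECLARE @NewRecNo int

SELECT @NewRecNo =
    CASE WHEN NOT EXISTS (SELECT 1 FROM MyRecords WHERE recordno = 1)
         THEN 1                                      -- 1 itself is free (or the table is empty)
         ELSE (SELECT MIN(t.recordno) + 1            -- otherwise the smallest gap, or MAX + 1
               FROM MyRecords AS t
               WHERE NOT EXISTS (SELECT 1 FROM MyRecords AS x
                                 WHERE x.recordno = t.recordno + 1))
    END

-- @NewRecNo can then be used in the INSERT for the new record.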
Can someone tell me how I can write a stored procedure that will automatically increment the value of the index by 001? I am attempting to build a table for a menu system, and I need to increment the index value of the document by 001 when it is submitted to the database. I also need the script to obtain the last index value of the node before inserting the new value.
All ideas appreciated.
Sincerely,
Tim
If interested, this request is based upon an article written by Michael Feranda: http://www.pscode.com/vb/scripts/ShowCode.asp?txtCodeId=7321&lngWId=4
We are getting the following error on an SSIS package: "54 Diagnostic VirtualSQLName DomainUserName Load Daily 859CF005-CB7F-47D8-8432-AE7C074B343C 1A986F18-343F-4424-ABAB-AC6575187DF3 2007-02-14 10:05:42.000 2007-02-14 10:05:42.000 0 0x Based on the system configuration, the maximum concurrent executables are set to 4. "
Basically it is saying that we have reached the maximum number of concurrent executables. How can we increase this value, and where?
I am really starting to like CTEs. My only qualm is that you can only use them in the first statement after you declare them. I wish they would remain active for the duration of the SQL batch just like everything else (variables, local temp tables, etc.). Is there any trick so that you can use them multiple times? Maybe define them as a string and use dynamic SQL or something like that? Hopefully (I have not researched the new version of SQL Server) in SQL Server 2008 the scope has increased. Anyway, any help would be appreciated.
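As far as I know there is no way to widen a CTE's scope beyond its single statement; the usual workarounds are to repeat the CTE definition or to materialise its result into a temp table once and reuse that for the rest of the batch. A small sketch with invented names:

-- Materialise the CTE's result once so several later statements can reuse it.
;WITH ActiveOrders AS
(
    SELECT OrderID, CustomerID, OrderDate   -- hypothetical table and columns
    FROM   dbo.Orders
    WHERE  Status = 'Active'
)
SELECT *
INTO   #ActiveOrders
FROM   ActiveOrders

-- The temp table now plays the role of the CTE for the rest of the batch.
SELECT COUNT(*) FROM #ActiveOrders
SELECT CustomerID, MIN(OrderDate) AS FirstOrder FROM #ActiveOrders GROUP BY CustomerID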
I am receiving an error from my ODBC driver “Maximum number of DBPROCESSes already allocated.”
I confirmed that there are 25 connections and that this is the default. This is caused by error message 10029, SQLEDBPS, when the maximum number of simultaneously open DBPROCESS structures exceeds the current setting. I would like to increase this maximum.
I have found only two ways to do this. One is using dbsetmaxprocs using C and the other is using SqlSetMaxProcs using Visual Basic. My problem is that I am interfacing to SQL Server using a third party tool that is doing the lower level programming.
Is there some way that I can increase the maximum number of DB processes for all databases that are part of the SQL Server 7 environment, or can I set this value using a program that is called from a stored procedure?
Any ideas in this area will be greatly appreciated.
Hi all, I encountered error 1105 (Can't allocate space for object .....) while running a query. I dumped the transaction log with no_log successfully and retried my query, but I still encounter error 1105.
I checked my database size and the following is the info: Data size: 2500MB; Data space available: 753.39MB; Log space: 500MB; Log space available: 500MB (this was achieved after dumping the transaction log twice; it was around 480MB before).
Thus it looks like the data segment is full! I expanded the size of disk device D, on which segment A resides, by 50MB. When I ran sp_helpdevice D, the expansion was reflected. Next I ran sp_extendsegment A, D. However, running sp_helpsegment A doesn't show that segment A has expanded. Please help! How do I expand the size of the segment (without creating a new disk device, if possible)? Is there any other way to resolve this problem?
Also, how do I check whether a particular segment is full, e.g. the percentage usage of the data/log segment?
I seem to remember that the font size and type can be controlled in SQL Server 6.5, and that the setting controls the fonts displayed throughout the program: Enterprise Manager, Query Tool, etc.
I have just re-installed 6.5 and the fonts are way too small.
We are using SQL Server 7.0; our users connect to SQL Server remotely using a Visual Basic application. When all the users are connected, memory utilization keeps increasing, but at 1,836,848 KB it stops growing, i.e. memory utilization becomes stagnant. The actual amount of physical memory is around 5GB, but SQL Server does not increase its usage past this threshold, and the performance of the server is badly affected. Can anyone please suggest a possible solution?
I am using Windows Server 2008 R2. On this server, the used space on the C:\ drive is increasing day by day; as far as I know, I would have to format the system. Are there any other ways to free up space and remove unwanted things from my server?
I am trying to increase the price of a product by a user-entered percentage for items with Dishwasher in the ItemDesc.
Below is the procedure I have, which doesn't show any errors in SQL Developer.
CREATE OR REPLACE PROCEDURE AdjustPrice(
    pItemDesc IN ITEM.ItemDesc%TYPE,
    pPercent  IN NUMBER)
IS
BEGIN
    UPDATE Item
    SET    ItemPrice = ItemPrice + ItemPrice * pPercent / 100
    WHERE  ItemDesc = pItemDesc;
END;
This is my run script:
BEGIN AdjustPrice ('%Dishwasher%',10); END;
I think the problem is with the way I have done the run script to filter to items with Dishwasher in the description. I tried LIKE and that didn't work either.
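In case it helps, here is a sketch of the same procedure using LIKE so the wildcards in the parameter are treated as a pattern rather than a literal value (the table and column names come from the post):

-- With '=' the parameter '%Dishwasher%' only matches rows whose ItemDesc is literally that string.
CREATE OR REPLACE PROCEDURE AdjustPrice(
    pItemDesc IN ITEM.ItemDesc%TYPE,
    pPercent  IN NUMBER)
IS
BEGIN
    UPDATE Item
    SET    ItemPrice = ItemPrice + ItemPrice * pPercent / 100
    WHERE  ItemDesc LIKE pItemDesc;
    -- If case sensitivity is the issue, UPPER(ItemDesc) LIKE UPPER(pItemDesc) may be needed.
END;
/
-- Called as before:
-- BEGIN AdjustPrice('%Dishwasher%', 10); END;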