I want to write a stored procedure in which I will update all the calendar week values (stored in a table). To do this I will have to loop through all the records in the calendar table.
I am just wondering whether I should use a cursor or a TEMP table (because of performance concerns)?
As this will be part of a data warehouse, all the processing will be carried out on the SERVER (no client involvement).
Can anyone tell me which would be the best solution (cursor or temp table) in my case, and why?
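Before choosing either, it may be worth checking whether the update can be done as one set-based statement with no loop at all; a sketch with made-up table and column names (dbo.Calendar, CalendarDate, CalendarWeek):

-- Set-based update: no cursor and no temp table, if the week value can be
-- derived from the calendar date itself.
UPDATE dbo.Calendar
SET CalendarWeek = DATEPART(wk, CalendarDate)

On SQL Server 2008 and later, DATEPART(iso_week, ...) is available instead if ISO week numbers are required.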
I have a task of rewriting all the old procs that were built on cursors. Management does not want me to remove the cursors completely, since that would mean rewriting everything; they want to use a temp table and run a cursor over the temp table. There are many cursors in each proc. Any suggestions on how to go about it? I am against the use of cursors anywhere in it, even if that calls for rewriting the whole database again. I ran some tests comparing the old proc and the semi-new proc, which runs a cursor over a temp table populated by the cursor query, and it performed even worse than the cursor-only version. What does this mean? Is it really such a bad idea to run a cursor over a temp table?
DECLARE @AssessBenefitID INT
DECLARE @ActualQuantity INT
DECLARE @ExpectedQuantity INT
DECLARE @ActualQuality VARCHAR(2000)
DECLARE @ExpectedQuality VARCHAR(2000)
DECLARE @AssessFlag CHAR

DECLARE CUR_BENEFIT CURSOR FOR
    SELECT AssessBenefitID, ProjectBenefitID, AssessFlag
    FROM PMPT_AssessBenefit
    WHERE ProjectBenefitID = @ProjectBenefitID

OPEN CUR_BENEFIT
FETCH NEXT FROM CUR_BENEFIT INTO @AssessBenefitID, @ProjectBenefitID, @AssessFlag

WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @ActualQuantity = Quantity FROM dbo.PMPT_AssessBenefit WHERE AssessFlag = 'A' AND AssessBenefitID = @AssessBenefitID AND ProjectBenefitID = @ProjectBenefitID
    SELECT @ExpectedQuantity = Quantity FROM dbo.PMPT_AssessBenefit WHERE AssessFlag = 'E' AND AssessBenefitID = @AssessBenefitID AND ProjectBenefitID = @ProjectBenefitID
    SELECT @ActualQuality = Quality FROM dbo.PMPT_AssessBenefit WHERE AssessFlag = 'A' AND AssessBenefitID = @AssessBenefitID AND ProjectBenefitID = @ProjectBenefitID
    SELECT @ExpectedQuality = Quality FROM dbo.PMPT_AssessBenefit WHERE AssessFlag = 'E' AND AssessBenefitID = @AssessBenefitID AND ProjectBenefitID = @ProjectBenefitID
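For what it's worth, that whole loop can usually be collapsed into one set-based query by pivoting on AssessFlag; a sketch, untested against the real schema, assuming @ProjectBenefitID is the procedure parameter used in the snippet above:

-- One row per AssessBenefitID with the actual and expected values side by side.
SELECT
    AssessBenefitID,
    MAX(CASE WHEN AssessFlag = 'A' THEN Quantity END) AS ActualQuantity,
    MAX(CASE WHEN AssessFlag = 'E' THEN Quantity END) AS ExpectedQuantity,
    MAX(CASE WHEN AssessFlag = 'A' THEN Quality  END) AS ActualQuality,
    MAX(CASE WHEN AssessFlag = 'E' THEN Quality  END) AS ExpectedQuality
FROM dbo.PMPT_AssessBenefit
WHERE ProjectBenefitID = @ProjectBenefitID
GROUP BY AssessBenefitID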
Hi, I have two tables within a SQL database. The first table has an ID column and a column which lists one or more email identifiers for a second table, e.g.

ID   Email
--   ----------------
1    AS1 AS11
2    AS2 AS3 AS4 AS5
3    AS6 AS7

The second table has a column with an email identifier and another column which lists one email address for that particular identifier, e.g.

ID    EmailAddress
---   ------------------
AS1   abcstu@emc.com
AS2   abcstu2@emc.com
AS3   abcstu3@emc.com
AS4   abcstu4@em.com
AS5   abcstu5@emc.com
AS6   abcstu6@emc.com
AS7   abcstu7@emc.com
AS11  abcstu8@emc.com

I need to create a stored procedure or function that:
1. Selects an Email from the first table, based on a valid ID,
2. Splits the Email field of the first table (using the space separator) so that there is an array of Emails, and then
3. Selects the relevant EmailAddress value from the second table, based on a valid Email stored in the array.

Is there any way that this can be done directly within SQL Server using a stored procedure/function without having to use cursors?
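It can be done without a cursor. On SQL Server 2016 and later, STRING_SPLIT handles the splitting; on older versions a numbers-table or XML-based splitter would be needed instead. A sketch, assuming the two tables are named EmailGroups(ID, Email) and EmailAddresses(ID, EmailAddress):

-- Sketch for SQL Server 2016+ (STRING_SPLIT); table names are assumptions.
CREATE PROCEDURE dbo.GetEmailAddresses
    @ID INT
AS
BEGIN
    SELECT a.EmailAddress
    FROM EmailGroups AS g
    CROSS APPLY STRING_SPLIT(g.Email, ' ') AS s   -- splits 'AS2 AS3 AS4 AS5' into rows
    JOIN EmailAddresses AS a
        ON a.ID = s.value
    WHERE g.ID = @ID
END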
Part 1

-- Assumed variable declarations; the original post omitted them, so the types are a guess.
DECLARE @DBNAME NVARCHAR(128)
DECLARE @SQLCMD NVARCHAR(1000)

DECLARE DBCur CURSOR FOR
    SELECT U_OB_DB FROM [@OB_TB04_COMPDATA]

OPEN DBCur
FETCH NEXT FROM DBCur INTO @DBNAME

WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @SQLCMD = 'SELECT T0.CARDCODE, T0.U_OB_TID AS TRANSID, T0.DOCNUM AS INV_NO, '
                   + 'T0.DOCDATE AS INV_DATE, T0.DOCTOTAL AS INV_AMT, T0.U_OB_DONO AS DONO '
                   + 'FROM ' + @DBNAME + '.dbo.OINV T0 WHERE T0.U_OB_TID IS NOT NULL'
    EXEC(@SQLCMD)
    PRINT @SQLCMD

    FETCH NEXT FROM DBCur INTO @DBNAME
END

CLOSE DBCur
DEALLOCATE DBCur
Part 2
SELECT
    T4.U_OB_PCOMP AS PARENTCOMP,
    T0.CARDCODE,
    T0.CARDNAME,
    ISNULL(T0.U_OB_TID, '') AS TRANSID,
    T0.DOCNUM AS SONO,
    T0.DOCDATE AS SODATE,
    SUM(T1.QUANTITY) AS SOQTY,
    T0.DOCTOTAL - T0.TOTALEXPNS AS SO_AMT,
    T3.DOCNUM AS DONO,
    T3.DOCDATE AS DO_DATE,
    SUM(T2.QUANTITY) AS DOQTY,
    T3.DOCTOTAL - T3.TOTALEXPNS AS DO_AMT
INTO #MAIN
FROM ORDR T0
JOIN RDR1 T1 ON T0.DOCENTRY = T1.DOCENTRY
LEFT JOIN DLN1 T2 ON T1.DOCENTRY = T2.BASEENTRY AND T1.LINENUM = T2.BASELINE AND T2.BASETYPE = T0.OBJTYPE
LEFT JOIN ODLN T3 ON T2.DOCENTRY = T3.DOCENTRY
LEFT JOIN OCRD T4 ON T0.CARDCODE = T4.CARDCODE
WHERE ISNULL(T0.U_OB_TID, 0) <> 0
GROUP BY T4.U_OB_PCOMP, T0.CARDCODE, T0.CARDNAME, T0.U_OB_TID, T0.DOCNUM, T0.DOCDATE, T3.DOCNUM, T3.DOCDATE, T0.DOCTOTAL, T3.DOCTOTAL, T3.TOTALEXPNS, T0.TOTALEXPNS
My question is: how do I join Part 1 and Part 2? Is there any possibility?
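One way, as a sketch that is untested against the original schema: capture the dynamic per-database results from Part 1 into a temp table with INSERT ... EXEC, then join that temp table to #MAIN. The column types and the join keys (CARDCODE/TRANSID/DONO) below are assumptions.

-- Collect the Part 1 results into a temp table; adjust types to the real OINV columns.
CREATE TABLE #INV
(
    CARDCODE NVARCHAR(15),
    TRANSID  NVARCHAR(30),
    INV_NO   INT,
    INV_DATE DATETIME,
    INV_AMT  NUMERIC(19, 6),
    DONO     NVARCHAR(30)
)

-- Inside the Part 1 cursor loop, replace EXEC(@SQLCMD) with:
INSERT INTO #INV (CARDCODE, TRANSID, INV_NO, INV_DATE, INV_AMT, DONO)
EXEC (@SQLCMD)

-- After Part 2 has populated #MAIN, join the two result sets:
SELECT M.*, I.INV_NO, I.INV_DATE, I.INV_AMT
FROM #MAIN AS M
LEFT JOIN #INV AS I
    ON  I.CARDCODE = M.CARDCODE
    AND I.TRANSID  = M.TRANSID
    AND I.DONO     = M.DONO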
I think this is a very simple question, however, I don't know the answer. What is the difference between a regular Temp table and a Global Temp table? I need to create a temp table within an sp that all users will use. I want the table recreated each time someone accesses the sp, though, because some of the same info may need to be inserted and I don't want any PK errors.
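For reference, the scoping difference in a couple of lines (the names are arbitrary):

-- #Local: visible only to the session/procedure that created it and dropped
-- automatically when that scope ends, so every caller of the SP gets its own
-- fresh copy and there are no PK collisions between users.
CREATE TABLE #Local (ID INT PRIMARY KEY)

-- ##Global: one shared table visible to all sessions; it lives until the
-- creating session ends and no other session is still referencing it.
CREATE TABLE ##Global (ID INT PRIMARY KEY)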
I am inserting something into a temp table without creating it first, but this does not give any compilation error. Only when I execute the stored procedure do I get the error message that the temp table is invalid. Shouldn't this result in a compilation error rather than an error at execution time?
-- Create the procedure and insert into the temp table without creating it first.
-- No compilation error.
CREATE PROC testTemp
AS
BEGIN
    INSERT INTO #tmp (dt)
    SELECT GETDATE()
END
Only on calling the proc does this give an execution error.
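That is deferred name resolution: the table reference is only checked when the statement actually runs, not when the procedure is created. A minimal sketch showing that the same proc runs cleanly once the temp table is created before the INSERT (testTemp2 is a hypothetical variant):

CREATE PROC testTemp2
AS
BEGIN
    -- With the CREATE TABLE in place, the INSERT no longer fails at run time.
    CREATE TABLE #tmp (dt DATETIME)

    INSERT INTO #tmp (dt)
    SELECT GETDATE()

    SELECT dt FROM #tmp
END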
Simple example:

declare @tTable table (col1 int)
insert into @tTable (col1) values (1)
select * from @tTable
Works perfectly in SQL Server Management Studio, and the database connection is OK too, as I can generate a PP table using complex (or simple) queries without difficulty.
But when I try to get this same result into a PP table I get an error; the same happens when I replace the table variable with a temporary table.
Message: OLE DB or ODBC error. .... The current operation was cancelled because another operation in the transaction failed.
If the source has a new column, the script generated by SqlPackage.exe recreates the table in the background, moving the data through temporary storage. If the table is big, such an approach can cause issues.
An example of the script is below; in the source project I added the columns [MyColumn_LINE_1] and [MyColumn_LINE_5].
Is there any way I can make it generate an ALTER statement instead?
BEGIN TRANSACTION;

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

SET XACT_ABORT ON;

CREATE TABLE [dbo].[tmp_ms_xx_MyTable] (
    [MyColumn_TYPE_CODE] CHAR (3) NOT NULL,
[Code] ....
The same script is generated regardless of whether the table has data, and regardless of whether its PK is clustered or nonclustered.
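For comparison, the in-place change being asked for would look something like this (the target table name and the column data types are placeholders; they would have to match the dacpac definition):

ALTER TABLE [dbo].[MyTable]
    ADD [MyColumn_LINE_1] CHAR (3) NULL,
        [MyColumn_LINE_5] CHAR (3) NULL;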
The SP UserPersist_GetByCriteria does a "SELECT * FROM tbl_User WHERE gender = @Gender AND culture = @Culture", so why am I receiving this error when both tables have the same structure?
The error is being reported as coming from UserPersist_GetByCriteria on the "SELECT * FROM tbl_User" line.
I want to insert the data from a temp table into another table. The only condition is that it needs to be sorted by tool number and tool date. For example, if we have ten records for tool number 1000, they should be ordered by tool number and then by tool_dt. Neither table has any primary keys. Please find my code below; I removed all the unnecessary columns to keep it simple.

INSERT INTO tool_summary (tool_nbr, tool_dt)
SELECT tool_nbr, tool_dt
FROM #tool
ORDER BY tool_nbr, tool_dt

But this query is not working as expected; the data is getting shuffled.
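A short sketch of why this happens and the usual fixes: rows in a SQL Server table have no guaranteed physical order, so an ORDER BY on the INSERT does not make later reads come back sorted. Either sort when reading, or, if a persisted sequence is needed, add an IDENTITY column (row_seq below is a hypothetical column) so the ORDER BY controls how the identity values are assigned:

-- Option 1: sort at read time.
SELECT tool_nbr, tool_dt
FROM tool_summary
ORDER BY tool_nbr, tool_dt

-- Option 2: persist a sequence number; ORDER BY on an INSERT...SELECT is
-- guaranteed to drive the order in which IDENTITY values are handed out.
ALTER TABLE tool_summary ADD row_seq INT IDENTITY(1, 1)

INSERT INTO tool_summary (tool_nbr, tool_dt)
SELECT tool_nbr, tool_dt
FROM #tool
ORDER BY tool_nbr, tool_dt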
We have a job that loads data from an Oracle DB into our SQL Server 2000 DB twice a day. The schedule has just changed, so now there is a possibility of my west coast users being impacted when it runs at 5 PM PST and my east coast users being impacted when it runs at 7 AM EST. As a workaround, I have developed a DTS package that loads the data into temp tables instead of the real tables, i.e. Oracle -> XTable_temp instead of Oracle -> XTable. The load sometimes takes about an hour to an hour and a half, so this solution works great, but I then want to lock the table, drop it, and rename the temp table to XTable. The pseudo code would be:
Begin Transaction
Lock Table XTable
Drop XTable
Alter Table XTable_temp rename to XTable
Release Lock XTable
End Transaction
Create XTable_temp
I see two issues with this solution. 1) I think that even if I can lock XTable, the lock would be released when the table is dropped and XTable_temp is being renamed. 2) I can't find a command to rename a table.
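On the rename question: sp_rename exists in SQL Server 2000 and does exactly that. A rough sketch of the swap, assuming the drop and rename can be wrapped in one transaction (the DROP takes a schema lock that is held until COMMIT, which effectively provides the "lock" step from the pseudo code; whether sp_rename is allowed inside an explicit transaction should be verified on 2000):

BEGIN TRANSACTION

    -- Drop the old table; the schema lock is held until COMMIT, blocking readers.
    DROP TABLE XTable

    -- Rename the freshly loaded staging table into place.
    EXEC sp_rename 'XTable_temp', 'XTable'

COMMIT TRANSACTION

-- Recreate the staging table for the next load (definition not shown here).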
I want to pass the 'inserted' table from a trigger into an SP. I think I need to do this by dumping the inserted table into a temporary table and passing the temp table. However, I need to do this for many tables, and I don't want to list all the column names for each table/trigger (a maintenance nightmare).
Can I dump the 'inserted' table to a temp table WITHOUT specifying the column names?
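A sketch of the column-free dump: SELECT INTO creates the temp table with whatever columns the inserted pseudo-table has, so no column list is needed (the temp table and procedure names below are placeholders):

-- Inside the trigger body: copy the inserted rows without naming any columns.
SELECT *
INTO #TriggerRows
FROM inserted

-- A temp table created here is visible to procedures called from this scope.
EXEC dbo.ProcessTriggerRows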
I need to decide what is better to use in SQL 2000 stored procedures: a global temp table (I can't use a local one) or a permanent table. I extract data from a linked server table and update several tables on our server. Those procedures are scheduled to run every 3 hours.
Another question: for some reason, when I used a global temp table I wasn't able to schedule multiple steps with each step executing one of the stored procedures. I think global temp tables should be visible to other stored procedures, right?
Hi everyone, I'm fairly new to SQL and right now I am struggling with a script. I am trying to extract data from a normal table into a temporary table, update it in the temporary table, and then put it back into the normal table. I'll display my code; let me know what you think, any suggestions are appreciated. Thanks a lot.
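As a rough outline of that pattern (the table and column names below are invented, since the actual script isn't shown here):

-- 1. Extract into a temp table.
SELECT CustomerID, Balance
INTO #Work
FROM dbo.Customers

-- 2. Update the data in the temp table.
UPDATE #Work
SET Balance = Balance * 1.05

-- 3. Put it back into the normal table, joining on the key column.
UPDATE c
SET c.Balance = w.Balance
FROM dbo.Customers AS c
JOIN #Work AS w ON w.CustomerID = c.CustomerID

DROP TABLE #Work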
Hi, I want to know whether there is any performance advantage to using derived tables over temp tables; please advise which one is better to use. Also, can I create indexes on, and insert/update records in, derived tables?
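For reference, the two forms side by side (table and column names are made up). A derived table exists only inside the statement that defines it, so it cannot be indexed or have rows inserted or updated on its own; a temp table can.

-- Derived table: defined inline, visible only to this one statement.
SELECT d.CustomerID, d.OrderCount
FROM (
    SELECT CustomerID, COUNT(*) AS OrderCount
    FROM dbo.Orders
    GROUP BY CustomerID
) AS d
WHERE d.OrderCount > 10

-- Temp table: materialized in tempdb, so it can be indexed and modified later.
SELECT CustomerID, COUNT(*) AS OrderCount
INTO #OrderCounts
FROM dbo.Orders
GROUP BY CustomerID

CREATE INDEX IX_OrderCounts ON #OrderCounts (OrderCount)

UPDATE #OrderCounts
SET OrderCount = OrderCount + 1
WHERE CustomerID = 42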
Hi all, hope someone can help me... I'm trying to highlight the advantages of using table variables as opposed to temp tables within a single scope. My manager seems to believe that table variables are not advantageous because they reside in memory. He also seems to believe that temp tables do not use memory... Does anyone know how SQL Server could read data from a temp table without passing the data contained therein through memory? Is this a valid advantage/disadvantage of table variables vs temp tables?
In a previous post "Could #TempTable within SP cause lock on tempdb?" http://forums.microsoft.com/msdn/showpost.aspx?postid=2691763&siteid=1
It was obvious that we have to keep the use of #Temp tables to a minimum. Let's assume that some of the temp tables are really difficult to replace and we have to live with them.
Would it be easier on tempdb if the #TempTable is replaced by a table variable? Or do they all end up in tempdb?
I am trying to update a table in one database with data from a temporary table which I created in tempdb.
I want to update field1 in the table with tempfield1 from the #temp_table.
The code looks something like this:
Use master
UPDATE [dbname].dbo.table
SET [dbname].dbo.table.field1 = [tempdb].dbo.#temp_table.tempfield1
WHERE ( [dbname].dbo.table.field2 = [tempdb].dbo.#temp_table.tempfield2
    AND [dbname].dbo.table.field3 = [tempdb].dbo.#temp_table.tempfield3
    AND [dbname].dbo.table.field4 = [tempdb].dbo.#temp_table.tempfield4 )
I get the following error:

Msg 4104, Level 16, State 1, Line 2
The multi-part identifier "tempdb.dbo.#temp_table.tempfield2" could not be bound.
Msg 4104, Level 16, State 1, Line 2
The multi-part identifier "tempdb.dbo.#temp_table.tempfield3" could not be bound.
Msg 4104, Level 16, State 1, Line 2
The multi-part identifier "tempdb.dbo.#temp_table.tempfield4" could not be bound.
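A sketch of a form that binds correctly: reference the temp table through a FROM ... JOIN instead of fully qualified [tempdb] names in the SET/WHERE. A local #temp_table is visible from any database in the same session, so no tempdb prefix is needed; the table and field names here are taken from the post.

UPDATE t
SET t.field1 = tmp.tempfield1
FROM [dbname].dbo.[table] AS t
JOIN #temp_table AS tmp
    ON  tmp.tempfield2 = t.field2
    AND tmp.tempfield3 = t.field3
    AND tmp.tempfield4 = t.field4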
I am querying a table in Oracle; the server connection to the Oracle database is determined by certain criteria. How can I put the results from the Oracle query into a temp table?
This is the code I'm using for the query:
DECLARE @cmd VARCHAR(500)
DECLARE @Year VARCHAR(25)
SET @Year = '2006'
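The post's code stops after the declarations, but one common pattern is to build an OPENQUERY pass-through string and feed it to INSERT ... EXEC. A sketch only: the linked server name, the Oracle table, and the columns below are placeholders.

-- Build the pass-through query for the chosen linked server.
SET @cmd = 'SELECT * FROM OPENQUERY(ORACLE_LINKED_SERVER, '
         + '''SELECT order_id, order_year FROM some_schema.some_table '
         + 'WHERE order_year = ' + @Year + ''')'

-- The temp table must exist with a matching column list before INSERT ... EXEC.
CREATE TABLE #OracleData
(
    order_id   INT,
    order_year INT
)

INSERT INTO #OracleData (order_id, order_year)
EXEC (@cmd)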
Currently I'm using a cursor in a stored procedure to retrieve data from one table and put it into another table, like so:
declare @HTR money
declare @Uniqueid varchar(15)

declare cur_HTR cursor for
    select uniqueid, sum(hours_to_resolve) as thtr
    from com_trail_helpdesk_module
    group by uniqueid

open cur_HTR
fetch next from cur_HTR into @Uniqueid, @HTR

while @@fetch_status = 0
begin
    update com_hpd_helpdesk_history
    set total_hours_to_resolve = @HTR
    where case_id_ = @Uniqueid
      and dateasat = @dateasat

    fetch next from cur_HTR into @Uniqueid, @HTR
end

close cur_HTR
deallocate cur_HTR
...
This is taking about 45 minutes each time to do 21k records. Is there a faster, better way to do this?
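The usual set-based rewrite of that loop, sketched with the same table and column names (assuming @dateasat is a parameter of the procedure, as in the original):

-- Aggregate once, then update every matching row in a single statement.
update h
set h.total_hours_to_resolve = t.thtr
from com_hpd_helpdesk_history as h
join (
    select uniqueid, sum(hours_to_resolve) as thtr
    from com_trail_helpdesk_module
    group by uniqueid
) as t on t.uniqueid = h.case_id_
where h.dateasat = @dateasat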
Hi all, I am creating a temporary table using the following stored procedure, but when I compile the stored procedure I get the following error:

Server: Msg 2714, Level 16, State 1, Procedure SP1, Line 15
There is already an object named '#TEMP2' in the database.

The stored procedure is:

ALTER PROC SP1
    @SELECT VARCHAR(15) = NULL
AS
BEGIN
    IF @SELECT = 'FOLDER'
    BEGIN
        SELECT DISTINCT(HIERARCHY_ID), HIERARCHY_NAME, HIERARCHY_DESCRIPTION, HIERARCHY_PARENT_ID
        INTO #TEMP2
        FROM BM_HIERARCHY_MASTER
    END
    ELSE IF @SELECT = 'PAGE'
    BEGIN
        SELECT DISTINCT(HIERARCHY_ID), HIERARCHY_NAME, HIERARCHY_DESCRIPTION, HIERARCHY_PARENT_ID
        INTO #TEMP2
        FROM BM_HIERARCHY_MASTER
    END
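One way around Msg 2714, as a sketch: a single procedure cannot contain two SELECT ... INTO statements that create the same temp table, so create #TEMP2 once up front and switch only the INSERT (the column data types below are assumptions and would need to match BM_HIERARCHY_MASTER):

ALTER PROC SP1
    @SELECT VARCHAR(15) = NULL
AS
BEGIN
    -- Created once, so the procedure compiles without the duplicate-object error.
    CREATE TABLE #TEMP2
    (
        HIERARCHY_ID          INT,
        HIERARCHY_NAME        VARCHAR(255),
        HIERARCHY_DESCRIPTION VARCHAR(255),
        HIERARCHY_PARENT_ID   INT
    )

    IF @SELECT = 'FOLDER'
        INSERT INTO #TEMP2
        SELECT DISTINCT HIERARCHY_ID, HIERARCHY_NAME, HIERARCHY_DESCRIPTION, HIERARCHY_PARENT_ID
        FROM BM_HIERARCHY_MASTER
    ELSE IF @SELECT = 'PAGE'
        INSERT INTO #TEMP2
        SELECT DISTINCT HIERARCHY_ID, HIERARCHY_NAME, HIERARCHY_DESCRIPTION, HIERARCHY_PARENT_ID
        FROM BM_HIERARCHY_MASTER
END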
Which one is better to use: a user-defined function or a temp table?
I am working on SQL Server 2005, and I have a user defined function that returns a table. I need to run queries on this table.
I am not sure if I should run the function once and store the results in a temp table and run queries on the temp table or call the UDF multiple times. I am concerned about performance.
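A sketch of the materialize-once approach (dbo.fn_GetData, its argument, and the column names are placeholders for the actual UDF):

-- Run the table-valued function once and keep its output in a temp table.
SELECT *
INTO #FnResults
FROM dbo.fn_GetData(1)

-- Optionally index the temp table to help the follow-up queries
-- (SomeKeyColumn is a placeholder for a real column of the UDF's result).
CREATE CLUSTERED INDEX IX_FnResults_Key ON #FnResults (SomeKeyColumn)

-- All further queries read the temp table instead of re-invoking the UDF.
SELECT SomeKeyColumn, COUNT(*) AS Cnt
FROM #FnResults
GROUP BY SomeKeyColumn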