Query Runs In Sub Second Time Until I Put It In A Stored Procedure Then It Takes 2 Minutes.
Jun 25, 2007
The query below runs in sub-second time if I don't call it as a stored procedure. I have looked at the execution plans for both the query and the query as a stored procedure, and they are the same.
When I put the query into a stored procedure it takes over 2 minutes to run.
All feedback (even the ugly stuff) is more than welcome. I want to master this issue and forever put it behind me.
This is the SQL when I just execute it outright:

DECLARE
    @WebUserID nvarchar(20)
    ,@DocumentTypeID int
    ,@RouteID nvarchar(10)
    ,@CustomerID nvarchar(15)
    ,@DocumentIDPrefix nvarchar(20)
    ,@StartDate datetime
    ,@EndDate datetime
    ,@OversoldOnly bit
    ,@DexCustomersOnly bit
    ,@DeviationsOnly bit
    ,@CashNoPaymentOnly bit
    ,@SignatureName nvarchar(45)
    ,@SortExpression varchar(200)
    ,@StartRowIndex int
    ,@MaximumRows int

SET @WebUserID = 'manager'
SET @DocumentTypeID = 0
SET @DocumentIDPrefix = '%'
SET @StartDate = '04/17/2007'
SET @EndDate = '04/19/2007'
SET @OversoldOnly = 0
SET @DexCustomersOnly = 0
SET @DeviationsOnly = 0
SET @CashNoPaymentOnly = 0
SET @SortExpression = ''
SET @StartRowIndex = 0
SET @MaximumRows = 20;
WITH OrderedDocumentHistory AS
(
    SELECT
        dh.DocumentHistoryID
        ,dh.DocumentID
        ,dh.DocumentTypeID
        ,dh.DocumentTypeDesc
        ,dh.RouteID
        ,dh.RouteDesc
        ,dh.CustomerID
        ,dh.CustomerName
        ,dh.DocDate
        ,ISNULL(dc.HasReceipt, 0) AS 'HasReceipt'
        ,ddt.Description AS 'SignatureReason'
        ,a.Amount
        ,ROW_NUMBER() OVER (ORDER BY dh.DocDate DESC) AS 'RowNumber'
    FROM
        DocumentHistory dh
        INNER JOIN Customers c ON dh.CustomerID = c.CustomerID
        INNER JOIN DeviationTypes ddt ON dh.DriverDeviationTypeID = ddt.DeviationTypeID
        INNER JOIN
        (
            SELECT
                DocumentHistoryID
                ,(COALESCE(SUM((CONVERT(INT, Units + DeviationUnits)) * (UnitPrice - UnitDiscount)) + SUM((CONVERT(INT, Cases + DeviationCases)) * (CasePrice - CaseDiscount)), 0.0)) AS Amount
            FROM
                DocumentHistoryItems dhia
            GROUP BY
                dhia.DocumentHistoryID
        ) AS a ON a.DocumentHistoryID = dh.DocumentHistoryID
        LEFT OUTER JOIN
        (
            SELECT DISTINCT
                dca.DocumentID
                ,1 AS 'HasReceipt'
            FROM
                DocumentCollections dca
        ) AS dc ON dh.DocumentID = dc.DocumentID
    WHERE
        dh.DocDate BETWEEN @StartDate AND @EndDate
        AND (dh.DocumentTypeID = @DocumentTypeID OR @DocumentTypeID IS NULL)
        AND (dh.RouteID = @RouteID OR @RouteID IS NULL)
        AND (dh.CustomerID = @CustomerID OR @CustomerID IS NULL)
        AND dh.DocumentID LIKE @DocumentIDPrefix
        AND CASE WHEN @OversoldOnly = 1 THEN ISNULL( (SELECT TOP 1 (dhio.DeviationUnits + dhio.DeviationCases) FROM DocumentHistoryItems dhio WHERE dh.DocumentHistoryID = dhio.DocumentHistoryID AND (dhio.DeviationUnits > 0 OR dhio.DeviationCases > 0)), 0) ELSE 1 END > 0
        AND CASE WHEN @DexCustomersOnly = 1 THEN c.DEXEnable ELSE 'Y' END = 'Y'
        AND CASE WHEN @DeviationsOnly = 1 THEN ISNULL( (SELECT TOP 1 (dhio.DeviationUnits + dhio.DeviationCases) FROM DocumentHistoryItems dhio WHERE dh.DocumentHistoryID = dhio.DocumentHistoryID AND (dhio.DeviationUnits != 0 OR dhio.DeviationCases != 0)), 0) ELSE 1 END != 0
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN dh.Terms ELSE 'CHECK/CASH' END = 'CHECK/CASH'
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN (SELECT MAX(dhio.AlcoholPct) FROM DocumentHistoryItems dhio WHERE dhio.DocumentHistoryID = dh.DocumentHistoryID) ELSE 1 END > 0
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN ISNULL(dc.HasReceipt, 0) ELSE 0 END = 0
        AND (dh.SigName = @SignatureName OR @SignatureName IS NULL)
        AND (c.WarehouseID IN (SELECT WarehouseID FROM WebUserWarehouses WHERE WebUserID = @WebUserID)
             OR @WebUserID IS NULL)
)

SELECT
    DocumentHistoryID
    ,DocumentID
    ,DocumentTypeDesc
    ,RouteID
    ,RouteDesc
    ,CustomerID
    ,CustomerName
    ,DocDate
    ,Amount
    ,HasReceipt
    ,SignatureReason
FROM
    OrderedDocumentHistory
WHERE
    RowNumber BETWEEN (@StartRowIndex + 1) AND (@StartRowIndex + @MaximumRows)

Here is the SQL for creating the stored procedure:

CREATE Procedure w_DocumentHistory_Select
(
    @WebUserID nvarchar(20)
    ,@DocumentTypeID int
    ,@RouteID nvarchar(10)
    ,@CustomerID nvarchar(15)
    ,@DocumentIDPrefix nvarchar(20)
    ,@StartDate datetime
    ,@EndDate datetime
    ,@OversoldOnly bit
    ,@DexCustomersOnly bit
    ,@DeviationsOnly bit
    ,@CashNoPaymentOnly bit
    ,@SignatureName nvarchar(45)
    ,@SortExpression varchar(200)
    ,@StartRowIndex int
    ,@MaximumRows int
)
AS
SET NOCOUNT ON

IF LEN(@SortExpression) = 0 OR @SortExpression IS NULL
    SET @SortExpression = 'Number DESC'

IF @StartRowIndex IS NULL
    SET @StartRowIndex = 0

IF @MaximumRows IS NULL
    SELECT
        @MaximumRows = COUNT(dh.DocumentHistoryID)
    FROM
        DocumentHistory dh;
WITH OrderedDocumentHistory AS
(
    SELECT
        dh.DocumentHistoryID
        ,dh.DocumentID
        ,dh.DocumentTypeID
        ,dh.DocumentTypeDesc
        ,dh.RouteID
        ,dh.RouteDesc
        ,dh.CustomerID
        ,dh.CustomerName
        ,dh.DocDate
        ,ISNULL(dc.HasReceipt, 0) AS 'HasReceipt'
        ,ddt.Description AS 'SignatureReason'
        ,a.Amount
        ,CASE
            WHEN @SortExpression = 'Number DESC' THEN (ROW_NUMBER() OVER (ORDER BY dh.DocumentID DESC))
            WHEN @SortExpression = 'Number ASC' THEN (ROW_NUMBER() OVER (ORDER BY dh.DocumentID ASC))
            WHEN @SortExpression = 'CustomerName DESC' THEN (ROW_NUMBER() OVER (ORDER BY dh.CustomerName DESC))
            WHEN @SortExpression = 'CustomerName ASC' THEN (ROW_NUMBER() OVER (ORDER BY dh.CustomerName ASC))
            WHEN @SortExpression = 'CompletedDate DESC' THEN (ROW_NUMBER() OVER (ORDER BY dh.DocDate DESC))
            WHEN @SortExpression = 'CompletedDate ASC' THEN (ROW_NUMBER() OVER (ORDER BY dh.DocDate ASC))
            WHEN @SortExpression = 'RouteDescription DESC' THEN (ROW_NUMBER() OVER (ORDER BY dh.RouteDesc DESC))
            WHEN @SortExpression = 'RouteDescription ASC' THEN (ROW_NUMBER() OVER (ORDER BY dh.RouteDesc ASC))
        END AS 'RowNumber'
    FROM
        DocumentHistory dh
        INNER JOIN Customers c ON dh.CustomerID = c.CustomerID
        INNER JOIN DeviationTypes ddt ON dh.DriverDeviationTypeID = ddt.DeviationTypeID
        INNER JOIN
        (
            SELECT
                DocumentHistoryID
                ,(COALESCE(SUM((CONVERT(INT, Units + DeviationUnits)) * (UnitPrice - UnitDiscount)) + SUM((CONVERT(INT, Cases + DeviationCases)) * (CasePrice - CaseDiscount)), 0.0)) AS Amount
            FROM
                DocumentHistoryItems dhia
            GROUP BY
                dhia.DocumentHistoryID
        ) AS a ON a.DocumentHistoryID = dh.DocumentHistoryID
        LEFT OUTER JOIN
        (
            SELECT DISTINCT
                dca.DocumentID
                ,1 AS 'HasReceipt'
            FROM
                DocumentCollections dca
        ) AS dc ON dh.DocumentID = dc.DocumentID
    WHERE
        dh.DocDate BETWEEN @StartDate AND @EndDate
        AND (dh.DocumentTypeID = @DocumentTypeID OR @DocumentTypeID IS NULL)
        AND (dh.RouteID = @RouteID OR @RouteID IS NULL)
        AND (dh.CustomerID = @CustomerID OR @CustomerID IS NULL)
        AND dh.DocumentID LIKE @DocumentIDPrefix
        AND CASE WHEN @OversoldOnly = 1 THEN ISNULL( (SELECT TOP 1 (dhio.DeviationUnits + dhio.DeviationCases) FROM DocumentHistoryItems dhio WHERE dh.DocumentHistoryID = dhio.DocumentHistoryID AND (dhio.DeviationUnits > 0 OR dhio.DeviationCases > 0)), 0) ELSE 1 END > 0
        AND CASE WHEN @DexCustomersOnly = 1 THEN c.DEXEnable ELSE 'Y' END = 'Y'
        AND CASE WHEN @DeviationsOnly = 1 THEN ISNULL((SELECT TOP 1 (dhio.DeviationUnits + dhio.DeviationCases) FROM DocumentHistoryItems dhio WHERE dh.DocumentHistoryID = dhio.DocumentHistoryID AND (dhio.DeviationUnits != 0 OR dhio.DeviationCases != 0)), 0) ELSE 1 END != 0
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN dh.Terms ELSE 'CHECK/CASH' END = 'CHECK/CASH'
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN (SELECT MAX(dhio.AlcoholPct) FROM DocumentHistoryItems dhio WHERE dhio.DocumentHistoryID = dh.DocumentHistoryID) ELSE 1 END > 0
        AND CASE WHEN @CashNoPaymentOnly = 1 THEN ISNULL(dc.HasReceipt, 0) ELSE 0 END = 0
        AND (dh.SigName = @SignatureName OR @SignatureName IS NULL)
        AND (c.WarehouseID IN (SELECT WarehouseID FROM WebUserWarehouses WHERE WebUserID = @WebUserID)
             OR @WebUserID IS NULL)
)
SELECT
    DocumentHistoryID
    ,DocumentID
    ,DocumentTypeDesc
    ,RouteID
    ,RouteDesc
    ,CustomerID
    ,CustomerName
    ,DocDate
    ,Amount
    ,HasReceipt
    ,SignatureReason
FROM
    OrderedDocumentHistory
WHERE
    RowNumber BETWEEN (@StartRowIndex + 1) AND (@StartRowIndex + @MaximumRows)
Here is the code for calling the stored procedure:

DECLARE @RC int
DECLARE @WebUserID nvarchar(20)
DECLARE @DocumentTypeID int
DECLARE @RouteID nvarchar(10)
DECLARE @CustomerID nvarchar(15)
DECLARE @DocumentIDPrefix nvarchar(20)
DECLARE @StartDate datetime
DECLARE @EndDate datetime
DECLARE @OversoldOnly bit
DECLARE @DexCustomersOnly bit
DECLARE @DeviationsOnly bit
DECLARE @CashNoPaymentOnly bit
DECLARE @SignatureName nvarchar(45)
DECLARE @SortExpression varchar(200)
DECLARE @StartRowIndex int
DECLARE @MaximumRows int

SET @WebUserID = 'manager'
SET @DocumentTypeID = 0
SET @DocumentIDPrefix = '%'
SET @StartDate = '04/17/2007'
SET @EndDate = '04/19/2007'
SET @OversoldOnly = 0
SET @DexCustomersOnly = 0
SET @DeviationsOnly = 0
SET @CashNoPaymentOnly = 0
SET @SortExpression = ''
SET @StartRowIndex = 0
SET @MaximumRows = 20;

EXECUTE @RC = [Odom].[dbo].[w_DocumentHistory_Select]
    @WebUserID
    ,@DocumentTypeID
    ,@RouteID
    ,@CustomerID
    ,@DocumentIDPrefix
    ,@StartDate
    ,@EndDate
    ,@OversoldOnly
    ,@DexCustomersOnly
    ,@DeviationsOnly
    ,@CashNoPaymentOnly
    ,@SignatureName
    ,@SortExpression
    ,@StartRowIndex
    ,@MaximumRows
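
For what it's worth, the symptom described here (sub-second ad hoc, minutes inside a procedure with the same text and a seemingly identical plan) is the classic parameter-sniffing pattern, aggravated by the catch-all (column = @param OR @param IS NULL) predicates above. Below is a minimal sketch of the two mitigations most often suggested, written against cut-down, hypothetical procedures rather than the real w_DocumentHistory_Select:

-- (a) Statement-level recompile (SQL Server 2005 and later): each call is
--     optimized for the parameter values actually passed.
CREATE PROCEDURE dbo.usp_DocHistory_Recompile
    @StartDate datetime,
    @EndDate   datetime
AS
    SELECT DocumentHistoryID
    FROM   dbo.DocumentHistory
    WHERE  DocDate BETWEEN @StartDate AND @EndDate
    OPTION (RECOMPILE);
GO

-- (b) Copy the parameters into local variables: the optimizer then estimates
--     from average statistics instead of the first sniffed values.
CREATE PROCEDURE dbo.usp_DocHistory_LocalVars
    @StartDate datetime,
    @EndDate   datetime
AS
    DECLARE @sd datetime, @ed datetime;
    SELECT @sd = @StartDate, @ed = @EndDate;

    SELECT DocumentHistoryID
    FROM   dbo.DocumentHistory
    WHERE  DocDate BETWEEN @sd AND @ed;
GO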
I have an SSIS package that, when run from Visual Studio, takes 1 minute or less to complete. When I schedule this package to run as a SQL Server job it takes 5+ minutes and sometimes hangs, complaining about buffers.
The server is a four-way 3 GHz Xeon (dual core) with 8 GB of RAM, and this was the only package running.
When I look in the log I see that the package is running and processing data, although very very very very very slowly.
Run in a query window directly on that server, this update takes no time at all:
update xxx_TableName_xxx set d_50 = 'DE',modify_timestamp = getdate(),modified_by = 1159
where enc_id in
('C24E6640-D2CC-45C6-8C74-74F6466FA262',
'762E6B26-AE4A-4FDB-A6FB-77B4782566C3',
'D7FBD152-F7AE-449C-A875-C85B5F6BB462')
But from the linked server this takes 8 minutes:
update [xxx_servername_xxxx].xxx_DatabaseName_xxx.dbo.xxx_TableName_xxx set d_50 = 'DE',modify_timestamp = getdate(),modified_by = 1159
where enc_id in
('C24E6640-D2CC-45C6-8C74-74F6466FA262',
'762E6B26-AE4A-4FDB-A6FB-77B4782566C3',
'D7FBD152-F7AE-449C-A875-C85B5F6BB462')
What settings or whatever would cause this to take so much longer from the linked server?
Edit: Other queries from the linked server do not have this behavior. In the stored procedure, where we have examined how long each query/update takes, this particular query is the culprit for the time eaten. I thought it had to do specifically with this table; however, as stated, when a query window is opened directly onto that server the update takes no time at all.
2nd Edit: Could it have to do with the linked server setting Collation Compatible? Right now it is set to false. I also asked this question in a message below, but figured I should put it up here.
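
One workaround often suggested for slow remote updates is to push the whole statement to the linked server instead of letting the local server drive it, for example with EXEC ... AT (this needs SQL Server 2005 or later and the linked server's RPC Out option enabled). A sketch using the placeholder names from the post:

-- Execute the update on the remote server itself, so only the statement
-- text crosses the link rather than row traffic.
EXEC ('update xxx_DatabaseName_xxx.dbo.xxx_TableName_xxx
          set d_50 = ''DE'', modify_timestamp = getdate(), modified_by = 1159
        where enc_id in (''C24E6640-D2CC-45C6-8C74-74F6466FA262'',
                         ''762E6B26-AE4A-4FDB-A6FB-77B4782566C3'',
                         ''D7FBD152-F7AE-449C-A875-C85B5F6BB462'')')
    AT [xxx_servername_xxxx];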
I have an SP that was taking very little time (about 34 seconds). But suddenly it is stuck. It is running and running and running, but it is not LOCKED nor SUSPENDED; it is always RUNNABLE. I have done index and statistics optimization, but nothing helped. I looked into the execution plan and everything seems OK. All the time is spent in 3 indexes that are Index Seeks, not Table Scans!!! So why is it stuck? I do not know how much time it would take because I have to stop it. (SQL Server 2008 R2; the database was migrated from SQL Server 2000.)
I have been googling all day and not found a reasonable solution to my problem.
Here is what I am trying to do. The T-SQL code is included at the bottom of this message.
- The user can specify columns that are stored in a CVColumn table (the data for which comes from an external source). For example, we may provide LastName and FirstName columns and the user could add ShoeSize and FavoriteColor.
- When we get the data from the other source, we create it as a dataset in the *code* with each user-defined field as a column of the table and each person as a row in the table. This is used *everywhere* including binding to grids, etc... So the dataset may look like this: PersonID LastName FirstName ShoeSize FavoriteColor
- The dba has required that the data be stored in a normalized fashion. Since we don't know what the columns are - the data is stored in the *table* as: PersonID ColumnName ColumnValue
- When the user gets the data from the external source the first time, the code loops through every person row and every column of the row to insert fields into the table using an Insert stored proc. This performs reasonably well.
- When the code retrieves the data from the database, it needs to reformat the data into one row per person, one column per field so that all of the other code can do the binding, etc.
- So I created a temp table with the appropriate rows and columns and then updated that table with the appropriate data. *This* is the stored proc that has been running now for almost an hour.
- There *must* be a way to do this that performs better?
Here is my T-SQL:
DECLARE @columnNamesWithSizes varchar(8000)
DECLARE @delim varchar(1)
DECLARE @values varchar(8000)

-- Get the set of column names
SET @columnNamesWithSizes = 'swiCMKey VARCHAR(4092), swiCMContactManager VarChar(100), swiCMContactManagerID int, swiCMFolder VarChar(4092), swiCMCVFolder VarChar(4092)'
SELECT @columnNamesWithSizes = COALESCE(@columnNamesWithSizes + ',', '') + ColumnName + ' VARCHAR(50)'
FROM CVColumn

-- Create the table with the desired set of columns
EXEC ('CREATE TABLE ##tempTable ( ' + @columnNamesWithSizes + ')')

-- Insert the primary key into each row
EXEC ('INSERT INTO ##tempTable (swiCMKey) SELECT DISTINCT ContactID as swiCMKey FROM CVContact')

-- Use a cursor to loop through all of the records
SET @delim = ''''
DECLARE @swiCMKey VARCHAR(4096)
DECLARE contactList CURSOR FOR SELECT swiCMKey FROM ##tempTable
OPEN contactList
FETCH NEXT FROM contactList INTO @swiCMKey

-- Loop until all rows in temp table have been processed.
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @values = null

    -- Retrieve all of the columns for this contact
    SELECT @values = COALESCE(@values + ',', '') + ColumnName + '=' + @delim + COALESCE(Value, '') + @delim
    FROM CVContact
    WHERE CVContact.ContactID = @swiCMKey

    -- Perform the update to the temp table with the column values
    EXEC ('UPDATE ##tempTable SET ' + @values + ' WHERE swiCMKey=' + @delim + @swiCMKey + @delim)

    -- Get the next one
    FETCH NEXT FROM contactList INTO @swiCMKey
END
CLOSE contactList
DEALLOCATE contactList

-- Return the results
SELECT * from ##tempTable

-- Kill the temp table
DROP TABLE ##tempTable
Any ideas or other suggestions would be much appreciated!
(BTW - up until today the data from the code-based dataset was stored as XML and then read back as XML. However, we found with 80,000+ people that it took 5 minutes to read the XML. But that is faster than this stored proc!<G>)
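
For comparison, the same pivot can be done in one set-based statement instead of a cursor. A minimal sketch against the CVContact (ContactID, ColumnName, Value) layout described above, shown with a fixed column list for clarity; the real column list would still have to be built dynamically from CVColumn, as in the original script:

-- One row per contact, one column per user-defined field.
SELECT  ContactID AS swiCMKey,
        MAX(CASE WHEN ColumnName = 'LastName'      THEN Value END) AS LastName,
        MAX(CASE WHEN ColumnName = 'FirstName'     THEN Value END) AS FirstName,
        MAX(CASE WHEN ColumnName = 'ShoeSize'      THEN Value END) AS ShoeSize,
        MAX(CASE WHEN ColumnName = 'FavoriteColor' THEN Value END) AS FavoriteColor
FROM    CVContact
GROUP BY ContactID;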
I have a query that runs in 0 seconds on its own and a stored procedure containing that same query that takes 16 seconds. From what I can tell they should be the same.
It doesn't recompile when I run the stored procedure; I checked that.
Ok, I'll admit right off the bat that I never suspected that I'd ever raise this complaint, much less worry about how to fix the "problem" associated with it!
We're preparing to take a large set of changes (projects) to PeopleSoft Financials from development to test. The code is still somewhat rough, but it has been "desk checked" to ensure that it does what the developers think that it ought to do, and they've blessed it at that point. The code is now moving into the test phase, and the QA team is finding locking/blocking issues that we've never seen in this code before... Sort of a "lock avalanche" where no one process locks for very long, but many of them block one another to the point where applications actually "freeze" while almost never hitting a deadlock.
My solution was to create a "blitzkrieg" query / stored procedure that would periodically sample master.dbo.sysprocesses, master.dbo.sysdatabases, and apply one of the dm_ functions to gather information on locking, blocking, and deadlocking. My procedure runs nicely (it never hangs) and gets about 99.3% of the data that I want.
The problem is that the blasted query / stored procedure runs either too fast or too slow, depending on how you look at it. Because the dm_ function takes a few ms to run, a row can appear either as a false positive or as a missing row because of timing: either the culprit shows up as a blocker but, by the time the victim spid is evaluated, the block has cleared; or the row is skipped and, by the time the victim is evaluated, the block has occurred.
The whole process runs in well under 100 ms when there is nothing to report, and I've never seen it run 200 ms yet under the worst conditions it has faced, so the code is fast... The problem is that I really don't want to try to enforce any kind of locking to resolve the issue, because that locking would impact performance and that is EXACTLY what I do NOT want to do.
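
For what it's worth, reading the blocker and the victim in a single statement narrows (though cannot fully close) the timing window described above, because both sides come from the same pass over the DMV. A minimal sketch, assuming SQL Server 2005 or later:

-- One sampling pass: every currently blocked request with its blocker.
SELECT r.session_id          AS victim_spid,
       r.blocking_session_id AS blocker_spid,
       r.wait_type,
       r.wait_time,
       r.wait_resource
FROM   sys.dm_exec_requests r
WHERE  r.blocking_session_id <> 0;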
I have a very complex stored procedure called by a job that is scheduled to run every night. Its execution sometimes takes 1 or 2 hours and sometimes 7 hours or more.
So, if it has been running for more than 4 hours I stop the job and run the procedure from a query window, and it never takes more than 2 hours.
Can anyone help me identify the problem? I want to run it from the job and not have to worry about it.
Some more information:
- It is SQL 2000 Enterprise with SP4 in a cluster (it happens the same way on any node).
- The SQL Server and SQL Agent services run using a domain account that has full administrative access.
- When I connect to a query window I also use a Windows account.
- There are no locks or processes blocking or being blocked while the job is running.
- Using Task Manager, processor activity is OK: no more than 30% on any processor.
I have a stored procedure (which I will tag on at the end for those interested) which takes at least 15 minutes to run when executed, but completes in 1 minute when the T-SQL statement is run in Query Analyser. Why is this?
I suspect that it may be connected to table indexing, but why then is this bypassed when QA is used?
Any advice appreciated.
Derek
*************************************************************************
IF EXISTS (SELECT * FROM sysobjects WHERE id = object_id(N'dbo.sp_ValidateAIGL') AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE dbo.sp_ValidateAIGL
GO

-- *** Global Non Program Specific Data Errors ***
-- CHECK - that there are records in the DEB_IGL_PAYROLL_OUTPUT file.....none and the routine failed...
IF NOT EXISTS (SELECT * FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 100, 'No records were processed by the IGL run!'

SELECT * FROM #IGLErrors

-- CHECK - search for any records where the employee's EXPENSE_CODE is NULL
INSERT INTO #IGLErrors
SELECT DISTINCT NULLIF(EmpNo, ''), 2,
       'Employee "' + COALESCE(NULLIF(RTRIM(EmpNo), ''), '<Missing Employee>') + '" (Organisation Unit - ' + COALESCE(RTRIM(OrgUnitCode), '<No Organisation Unit>') + ') does not have a EXPENSE_CODE Code.'
FROM tbl_OUT_Payroll
WHERE NULLIF(ExpenseCode, '') IS NULL
  AND IGLProgramID = @IGLProgramID

SELECT * FROM #IGLErrors

-- CHECK - check that the BALANCE of DEBITs match the balance of CREDITs
IF (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 1 AND IGLProgramID = @IGLProgramID)
   <> (SELECT SUM(Cash) FROM tbl_OUT_Payroll WHERE IsCredit = 0 AND IGLProgramID = @IGLProgramID)
    INSERT INTO #IGLErrors
    SELECT NULL, 3, 'The total cash value for DEBIT elements does not match the total cash for CREDIT elements.'

SELECT * FROM #IGLErrors

-- *** Program 1 and 2 errors ***
IF @IGLProgramID IN (1, 2)
BEGIN
    -- CHECK - search for any records where the employee's COST_CENTRE is NULL
    INSERT INTO #IGLErrors
    SELECT DISTINCT NULLIF(EmpNo, ''), 1,
           'Employee "' + NULLIF(RTRIM(EmpNo), '') + '" (Organisation Unit = ' + RTRIM(OrgUnitCode) + ') does not have a COST_CENTRE Code.'
    FROM tbl_OUT_Payroll
    WHERE NULLIF(CostCenter, '') IS NULL
      AND IGLProgramID = @IGLProgramID

    SELECT * FROM #IGLErrors

    -- Check for EMPLOYEEs that were not transfered to the PAYROLL output (usually caused by missing ORG_UNITs or not picked up in
    -- the DEB_VIEW_APPOINTEE view...)
    INSERT INTO #IGLErrors
    SELECT DISTINCT EMP_NO, 11, 'Employee "' + RTRIM(EMP_NO) + '" was excluded from the summary. Check their Organisation Unit codes!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE NOT EMP_NO IN (SELECT DISTINCT EmpNo FROM tbl_OUT_Payroll WHERE IGLProgramID = @IGLProgramID)
      AND PERIOD_NO = @PeriodNo
      AND TAX_YEAR = @TaxYear

    SELECT * FROM #IGLErrors

    -- Check that there are no ELEMENTS in the COSTING_OUTPUT table that don't exist in the tbl_IGLElements table
    INSERT INTO #IGLErrors
    SELECT DISTINCT ELEMENT, 12, 'Element "' + RTRIM(ELEMENT) + '" does not exist in the IGL Interface Elements table!'
    FROM PSELive.dbo.COSTING_OUTPUT
    WHERE ELEMENT NOT IN (SELECT DISTINCT Element FROM tbl_IGLElements)
      AND PERIOD_NO = @PeriodNo

    SELECT * FROM #IGLErrors

END

-- *** Add a error to indicate the number of errors ***
IF EXISTS (SELECT * FROM #IGLErrors)
    INSERT INTO #IGLErrors
    SELECT 0, 0, 'Warning, there are ' + CAST(Count(*) AS VarChar(5)) + ' recorded errors!'
    FROM #IGLErrors

-- Transfer the records to the ErrorsLog table ready for the user to view...
DELETE FROM tbl_SYSErrorsLog
INSERT INTO tbl_SYSErrorsLog (IGLProgramID, OutputLogID, KeyField, ErrorID, Description)
SELECT @ProgramLogID, @IGLPeriodID, KeyField, ErrorID, Description
FROM #IGLErrors
ORDER BY ErrorID

DROP TABLE #IGLErrors

SELECT * FROM tbl_SYSErrorsLog ORDER BY ErrorID

--SET NOCOUNT OFF
GO
GRANT EXECUTE ON dbo.sp_ValidateAIGL TO Public
GO
Select 'PIT_ID' = CASE WHEN Best_BID_DATA.PIT_ID IS NOT NULL
                       THEN Best_BID_DATA.PIT_ID
                       ELSE Best_OFFER_DATA.PIT_ID END,
       Best_Bid_Data.Bid_Customer, Best_Bid_Data.Bid_Size, Best_Bid_Data.Bid_Price,
       Best_Bid_Data.Bid_Order_Id, Best_Bid_Data.Bid_Order_Version, Best_Bid_Data.Bid_ProductId,
       Best_Bid_Data.Bid_TraderId, Best_Bid_Data.Bid_BrokerId, Best_Bid_Data.Bid_Reference,
       Best_Bid_Data.Bid_Indicative, Best_Bid_Data.Bid_Park,
       Best_Offer_Data.Offer_Customer, Best_Offer_Data.Offer_Size, Best_Offer_Data.Offer_Price,
       Best_Offer_Data.Offer_Order_Id, Best_Offer_Data.Offer_Order_Version, Best_Offer_Data.Offer_ProductId,
       Best_Offer_Data.Offer_TraderId, Best_Offer_Data.Offer_BrokerId, Best_Offer_Data.Offer_Reference,
       Best_Offer_Data.Offer_Indicative, Best_Offer_Data.Offer_Park
from
(
    Select PITID PIT_ID, CustomerId Bid_Customer, Size Bid_Size, Price Bid_Price,
           orderid Bid_Order_Id, Version Bid_Order_Version, ProductId Bid_ProductId,
           TraderId Bid_TraderId, BrokerId Bid_BrokerId, Reference Bid_Reference,
           Indicative Bid_Indicative, Park Bid_Park
    From OrderTable C
    Where version = (select max(version) from OrderTable where orderid = c.orderid)
      and BuySell = 'B'
      and Status <> 'D'
      and Park <> 1
      and PitId in (select distinct pitid from MarketViewDef Where MktViewId = 4)
      and Price = ( Select max(Price) From OrderTable cc
                    where version = (select max(version) from OrderTable where orderid = cc.orderid)
                      and PitId = c.PitId and BuySell = 'B' and Status <> 'D' and Park <> 1 )
      and Orderdate = ( Select min(Orderdate) From OrderTable dd
                        where version = (select max(version) from OrderTable where orderid = dd.orderid)
                          and PitId = c.PitId and BuySell = 'B' and Status <> 'D' and Price = c.Price and Park <> 1 )
      and OrderId = ( select top 1 OrderId from OrderTable ff
                      Where version = (select max(version) from OrderTable where orderid = ff.orderid)
                        and orderid = ff.orderid and PitId = c.PitId and BuySell = 'B' and Status <> 'D'
                        and Price = c.Price and Orderdate = c.Orderdate and Park <> 1 )
) Best_Bid_Data
full outer join
(
    Select PITID PIT_ID, CustomerId Offer_Customer, Size Offer_Size, Price Offer_Price,
           orderid Offer_Order_Id, Version Offer_Order_Version, ProductId Offer_ProductId,
           TraderId Offer_TraderId, BrokerId Offer_BrokerId, Reference Offer_Reference,
           Indicative Offer_Indicative, Park Offer_Park
    From OrderTable C
    Where version = (select max(version) from OrderTable where orderid = c.orderid)
      and BuySell = 'S'
      and Status <> 'D'
      and Park <> 1
      and PitId in (select distinct pitid from MarketViewDef Where MktViewId = 4)
      and Price = ( Select min(Price) From OrderTable cc
                    where version = (select max(version) from OrderTable where orderid = cc.orderid)
                      and PitId = c.PitId and BuySell = 'S' and Status <> 'D' and Park <> 1 )
      and Orderdate = ( Select min(Orderdate) From OrderTable dd
                        where version = (select max(version) from OrderTable where orderid = dd.orderid)
                          and PitId = c.PitId and BuySell = 'S' and Status <> 'D' and Price = c.Price and Park <> 1 )
      and OrderId = ( select top 1 OrderId from OrderTable ff
                      Where version = (select max(version) from OrderTable where orderid = ff.orderid)
                        and orderid = ff.orderid and PitId = c.PitId and BuySell = 'S' and Status <> 'D'
                        and Price = c.Price and Orderdate = c.Orderdate and Park <> 1 )
) Best_Offer_Data ON Best_Bid_Data.Pit_Id = Best_Offer_Data.Pit_Id
On SQL 2012 (64bit) I have a CLR stored procedure that calls another, T-SQL stored procedure.
The CLR procedure passes a sizeable amount of data via a user-defined table type, i.e. a table-valued parameter. It passes about 12,000 rows with 3 columns each.
For some reason the call of the procedure is very, very slow. I mean just the call, not the procedure itself.
I changed the procedure to do nothing (return 1 in the first line). So with all parameters set, the call from

command.ExecuteNonQuery()

to

create proc usp_Proc1 @myTable myTable READONLY
as
begin
    return 1
end

takes 8 seconds. I measured all the other steps (creating the data table in the CLR, creating the SQL parameter, adding it to the command, executing the stored procedure) and all of them work fine and very fast.
When I trace the procedure call in SQL Profiler I get a line like this for each row of the data table (12,000 rows):
SP:StmtCompleted -- Encrypted Text.
As I said, it is not the procedure itself or the creation of the data table that takes so long; it is really only the passing of the data table to the procedure.
I have a table tblCustTrans which contains:

custid int
transid int
startdate datetime
value int

custid, transid and startdate form a composite primary key. The table contains more than 10 million records. Now I want to fetch records with:

select * from tblCustTrans
where startdate >= '10/10/2006 10:00:00'
  and startdate <= '10/10/2006 11:00:00'

This statement is taking more than 2 hours to fetch the data. Is there a way to fetch the records in less time?
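
With the composite key leading on custid, a range predicate on startdate alone has nothing to seek on, so the whole table gets scanned. A minimal sketch of one common approach, assuming the table and column names above (the index name is made up, and INCLUDE needs SQL Server 2005 or later):

-- Supporting index for date-range lookups.
CREATE NONCLUSTERED INDEX IX_tblCustTrans_startdate
    ON tblCustTrans (startdate)
    INCLUDE (custid, transid, value);

-- Unambiguous, half-open date range avoids boundary and format surprises.
SELECT custid, transid, startdate, value
FROM   tblCustTrans
WHERE  startdate >= '2006-10-10T10:00:00'
  AND  startdate <  '2006-10-10T11:00:00';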
I execute a pretty big SQL query which joins multiple tables, and I reviewed the indexes on all these tables. I am happy with the result when I run the query using SSMS locally on the server, i.e. where my SQL Server database is installed: it takes 4 seconds to get around 17,000 records. If I run the same query over the network from my desktop using SSMS, i.e. I connect to the above-mentioned SQL Server using SSMS, it takes more than 60 seconds. Not sure how to solve this. If someone could help me, it would be of great help.
I have an SSIS package containing approx. 10 tasks in the control flow. One of these tasks is a rather large data flow containing around 50 transformations plus 3 sources and two destinations. Around 10 of these are Script Components and another 10 are Union All transformations. The rest are primarily lookups, conditional splits and derived column transformations.
The XML file containing the package is approx. 4.5 MB. As I am developing the package, it is becoming increasingly slow to work with as more transformations are added to the data flow. Now, it takes 8 minutes every time I have to open the package for development (DelayValidation is even set to true) and DTExec (not the debugger) uses the same amount of time before it starts executing the package. It also takes a very long time to edit the data flow, as I typically have to wait 1-2 minutes every time the designer has to "commit" a change.
Does anyone have any idea what can be done to speed up the package - both with regard to development and execution?
I have a problem with JDBC 2005 (1.1) running against SQL Server 2005 Express Edition (SP2). Sometimes a statement takes a long time (more than 10 seconds); sometimes the same statement takes just a few seconds. It is very unpredictable. The query we have a problem with is, most of the time, a join SQL statement.
We are having problems getting the ADO.NET client to fail over in a timely manner in a web application. Here's what happens:
1) Database fails over in 3 to 5 seconds.
2) Clients attempt to log in to the primary and receive "SQL Server not available" errors.
3) The clients slowly start to connect and succeed without errors on the secondary.
4) We still see SQL Server login errors on the primary, even after the failover.
5) When we fail back to the primary, it all happens within a few seconds (no problems).
If we recycle IIS, this problem goes away
Here's what we have tried:
1) Turning off connection pooling (has no effect). We can also see the connections get killed on the primary server.
2) Applying this hotfix (http://support.microsoft.com/kb/944099).
And yes, we are running a script to sync up the SIDs, so that doesn't appear to be the problem. What seems to be happening is that the primary server is getting stuck in the ADO.NET application's memory. The problem is we don't know how to fix it.
BTW, our hardware is SQL Server 2005, Windows Server 2003, .NET 2.0 (and .NET 3.5)
Let me know if you need additional information. Thanks in advance
I have one stored procedure and it takes 10 minutes to execute. My stored procedure has 7 input parameters and one temp table (I get the data into the temp table using the input parameters), and I also used SET NOCOUNT ON. But if I copy the whole code of the SP and execute it as a regular SQL statement in Query Analyzer, I get the result in 4 seconds. I am really puzzled by this.
What could be the reason why the SP takes so much longer than the query? Unfortunately I can't post the code here.
After a differential restore I start the Remedy service. It starts in a few seconds.
After a full restore the same service takes 15 minutes to start. Both things are done through SQL Server Agent. Even manually restarting the service takes 15 minutes after a full restore. Why is it happening this way?
Hi, I have an interesting situation. I have created a stored procedure which has a select union query and accepts some parameters. When I execute this procedure it takes 8 minutes. When I copy the script from the stored procedure and run it directly in Query Analyzer it takes 2 1/2 minutes. The same number of rows (about 13,000) is returned in the result set either way.
I cannot figure this out; it is almost the same thing, except that in Query Analyzer I declare the parameters as variables and set their values.
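
Two things are commonly compared in this situation: whether the two connections actually run with the same SET options (a different option set gets its own cached plan; Query Analyzer defaults SET ARITHABORT ON while most client libraries leave it OFF), and the fact that declaring the values as local variables, as described above, changes the estimates the optimizer works from. The first is easy to rule in or out by running the following on both connections and comparing the output, in particular ARITHABORT:

-- Lists the SET options in effect for the current connection.
DBCC USEROPTIONS;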
I need to create a stored procedure to select records whose cal_callin_tm (a datetime field in the table) is within 10 minutes of the current system datetime. I want to update the cal_callin_tm field with the current system time and pass back the cal_assign_off_p field to my VS 2008/C# Windows form. I'm new at SQL code; I know the basic stuff, but this requires more knowledge than I have. I need from you experts the best practice for coding the SQL. I have some junk code below that I have started with; help me out.
Thanks a bunch,
CREATE PROCEDURE SP_DpOffCallIn
(
    @offname varchar(40) = null OUTPUT
)
AS
SELECT *
FROM dpcalldtl
WHERE cal_callstat = 'ON SCENE'
  AND DATEDIFF(minute, cal_callin_tm, GETDATE()) < 10
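
Building on that filter, here is a minimal sketch of the remaining two requirements (stamping the matched rows with the current time and returning cal_assign_off_p through the output parameter). These statements would sit inside the procedure body, and the column names are simply the ones quoted in the post:

-- Hand one matching assignment back through @offname
-- (if several rows match, the last one assigned wins).
SELECT @offname = cal_assign_off_p
FROM   dpcalldtl
WHERE  cal_callstat = 'ON SCENE'
  AND  DATEDIFF(minute, cal_callin_tm, GETDATE()) < 10;

-- Stamp those same rows with the current system time.
UPDATE dpcalldtl
SET    cal_callin_tm = GETDATE()
WHERE  cal_callstat = 'ON SCENE'
  AND  DATEDIFF(minute, cal_callin_tm, GETDATE()) < 10;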
Hi there... I wrote a SP to check for different types of exceptions in a few database tables. When I was writing the scripts, everything seemed to execute fairly quickly and I was satisfied with the performance. When I completed the scripts and compiled them into a stored procedure and ran it (using Exec), it took a lot longer to run than I thought it would. So I went through each section of the script and ran each portion individually to see which part was taking so long.... but all the scripts ran very quickly. The individual scripts, run separately, took a combined total of 0:26 to run.... but the SP was taking 1:30 to run. (????) So then I took ALL the script contained in the SP and ran it by itself in the Query Analyzer.... it took 0:27 to run. (??????)
So basically... the script that I wrote takes 27 seconds to execute, when run by itself in the Query Analyzer... but when I take that very same script and turn it into a Store Procedure and run it, it takes a minute and a half.
Any ideas why?? I thought SP's were supposed to run faster because they're pre-compiled.
When I execute a stored procedure it generally takes about half a second to run, but sometimes it takes 20 to 30 seconds. I am the only one using the server, so I know it is not due to other traffic. I have looked at Profiler and nothing looks out of the ordinary. Another observation is that the slow executions are always near each other: I will have about 10 fast executions, then 3 slow ones, and then back to fast ones. Has anyone seen anything like this before?
We have a customer table and an address table. Each customer can have 1 or more addresses in the address table.
The task is to synchronize customer information from 2 entirely separate systems. I have a stored procedure that reads a text file (exported from the 'master' system) and updates all the customer records in the second system (which is in the sql server db). The queries themselves work. The data is accurate. My issue is with how long it takes the stored procedure to run. With over 11,000 records in the text file, the stored procedure sometimes takes over 3 hours to run.
That seems excessive to me. (and probably to those of you snickering in the back)
As an overview: my stored procedure does the following.
1) creates a temp table
2) dumps the data from the text file into the temp table
3) updates the address table with data from the temp table (records already there)
4) inserts records into the address table from the temp table (records not already there)
5) updates the customer table with an address id
6) drops the temp table
Any help/suggestions are appreciated. If you need more info, let me know.
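
Steps 3 and 4 are where row-by-row processing usually hurts most; a single set-based UPDATE and a single INSERT against the temp table normally cover them. A minimal sketch with made-up column names, since the real schema isn't shown here:

-- Step 3: update addresses that already exist (hypothetical columns).
UPDATE a
SET    a.Street = t.Street,
       a.City   = t.City,
       a.Zip    = t.Zip
FROM   AddressTable a
JOIN   #ImportTemp t ON t.CustomerID = a.CustomerID
                    AND t.AddressType = a.AddressType;

-- Step 4: insert addresses that are not there yet.
INSERT INTO AddressTable (CustomerID, AddressType, Street, City, Zip)
SELECT t.CustomerID, t.AddressType, t.Street, t.City, t.Zip
FROM   #ImportTemp t
WHERE  NOT EXISTS (SELECT 1
                   FROM   AddressTable a
                   WHERE  a.CustomerID  = t.CustomerID
                     AND  a.AddressType = t.AddressType);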
I needed to create a stored procedure to lock a user who makes 3 incorrect entries of his password. I did it successfully. Now the problem I have is that I want to lock the user only if he makes the 3 incorrect entries within 30 minutes.
I created a field named "FirstEntryTime" of type datetime that saves the date of the first incorrect entry. I tried to make an if statement:
if (@timesOfEntry <= 2 AND DATEDIFF(MINUTE, FirstEntryTime, GETDATE()) <= 30)
Begin
    Update myTable set ...
End
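
A minimal sketch of that rule end to end, using hypothetical FailedCount, FirstEntryTime and IsLocked columns on a Users table (all names are made up):

CREATE PROCEDURE dbo.usp_RecordFailedLogin
    @UserID int
AS
BEGIN
    SET NOCOUNT ON;

    -- Start a new 30-minute window if there is none, or the last one expired.
    UPDATE Users
    SET    FailedCount = 1,
           FirstEntryTime = GETDATE()
    WHERE  UserID = @UserID
      AND  (FirstEntryTime IS NULL
            OR DATEDIFF(MINUTE, FirstEntryTime, GETDATE()) > 30);

    -- Otherwise count this attempt inside the current window.
    IF @@ROWCOUNT = 0
        UPDATE Users
        SET    FailedCount = FailedCount + 1
        WHERE  UserID = @UserID;

    -- Lock the account on the third failure within the window.
    UPDATE Users
    SET    IsLocked = 1
    WHERE  UserID = @UserID
      AND  FailedCount >= 3
      AND  DATEDIFF(MINUTE, FirstEntryTime, GETDATE()) <= 30;
END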
I need to call a SQL Server stored procedure, which takes over five hours to run, from an asp.NET web page. This procedure then calls a DTS package which is what takes 5+ hours to run. I need the user to be able to click on the 'run' button and have the page kick off the stored procedure (or the DTS Package if that will work instead) and display a message saying the load has begun and to check a 'status' link.
Right now I get a page timeout because it's waiting for results.
Any help would be greatly appreciated! Thanks, cat72
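
One commonly used way to get that behaviour is to wrap the long-running procedure (or the DTS package itself) in a SQL Agent job and have the page merely start the job: msdb.dbo.sp_start_job returns as soon as the job is queued, so the request finishes immediately and the load runs on in the background while the page shows the "check status" message. A one-line sketch (the job name here is made up):

-- Kick off the pre-created Agent job and return right away.
EXEC msdb.dbo.sp_start_job @job_name = N'LoadDTSPackage';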
I have a stored procedure that syncs 2 databases. It runs fine if I go into the stored procedures and execute it manually; there are a few errors, but it doesn't choke on them and completes with the desired end result. However, if I create a job to run this procedure, it fails. The error log shows this:
Date,Source,Severity,Step ID,Server,Job Name,Step Name,Notifications,Message,Duration,Sql Severity,Sql Message ID,Operator Emailed,Operator Net sent,Operator Paged,Retries Attempted 10/03/2007 16:06:00,sync,Error,0,MACHINENAME,sync,(Job outcome),,The job failed. The Job was invoked by User MACHINENAMEAdministrator. The last step to run was step 1 (sync).,00:00:19,0,0,,,,0 10/03/2007 16:06:00,sync,Error,1,MACHINENAME,sync,sync,,Executed as user: companyname. ... 0) aspnet Member Exists [SQLSTATE 01000] (Message 0) DNN User Exists [SQLSTATE 01000] (Message 0) Updaing Roles __noelle(5281) [SQLSTATE 01000] (Message 0) aspnet User Exists [SQLSTATE 01000] (Message 0) aspnet Member Exists [SQLSTATE 01000] (Message 0) DNN User Exists [SQLSTATE 01000] (Message 0) Updaing Roles 0(5036) [SQLSTATE 01000] (Message 0) aspnet User Exists [SQLSTATE 01000] (Message 0) aspnet Member Exists [SQLSTATE 01000] (Message 0) New DNN User [SQLSTATE 01000] (Message 0) aspnet User Exists [SQLSTATE 01000] (Message 0) aspnet Member Exists [SQLSTATE 01000] (Message 0) DNN User Exists [SQLSTATE 01000] (Message 0) Updaing Roles 0000000(3795) [SQLSTATE 01000] (Message 0) aspnet User Exists [SQLSTATE 01000] (Message 0) aspnet Member Exists [SQLSTATE 01000] (Message 0) DNN User Exists [SQLSTATE 01000] (Message 0) Updaing Roles 007(8) [SQLSTATE 01000] (Message 0) aspnet User Exists [SQLSTATE 01000] (Message 0) aspnet Member Exis... The step failed.,00:00:19,14,3621,,,,0
So the error is 3621, and the severity is 14 which as I understand it means Insufficient Permissions.
I have already tried making the Job Agent service log on as MACHINENAME\Administrator instead of System, and giving the user here (companyname) more permissions, but I am at a loss as to why this runs OK when manually executed but fails when run as a job. I created the job by scripting the procedure with "Execute to a file" and having the job run that file as a step; I have also tried copying and pasting the code from the stored procedure into the job step. It always fails for the same reason.
I must say I am very new to SQL, I am a network admin and have inherited this database and server from another client and have to get this sync task to run daily at certain intervals.
Please can anyone shed light on this issue? Thanks,
I used a stored procedure in my report. If I run the SP in Management Studio (on my PC; the database is on a SQL server) it takes only several minutes, but from Reporting Services (also on my PC), when I put it in the data tab and execute it, it takes forever and actually never finishes. I want to know why it's taking so long to execute from Reporting Services while it returns data instantly from Management Studio. There is a cursor in the SP; I don't know whether this is the culprit. Anyone know why? Thanks!
Below is the sp. --------------------------------------------------------------------
I am writing sales management software using C# in .NET 2005. The program used to work properly with SQL Server 2000. I decided to write a "file version" of the software using SQL Server Express 2005, so I detached the DB from 2000 and attached it to 2005 using the AttachDBFileName clause in the connection string.
But the problem is that when the program executes something like EXEC [AddOrder] ..., two rows are inserted instead of one. And when I use Server Explorer in VS2005 to launch the SP, it works fine. I should mention again that the same code works correctly when I change the connection string and force it to use SQL Server 2000.
I have a rather complex sp that runs for 4 minutes in SQL2000. It computes some messy oil and gas revenue/cost transactions. It involves lots of calls to functions that return single values of all types and also returns recordsets. There are all kinds of joins with temp and memory tables, etc. Just a mess, but it works. However, in SQL2005, it runs and apparently hangs, as it never ends.
I have run the Upgrade Advisor and otherwise, have not found any information that tells me that there are issues with functions or SQL-specific functions, tables, etc. that might cause this. Does anyone on this forum have some pointers on where I might look for assistance on this matter? Surely someone knows something about things working differently in 2005.