How does one prevent a long-running procedure from crapping out in CLR?
I am trying to do a pull from a remote data source and it works, except that I have to break my stored procedure call into several smaller calls. I would like to do everything in one shot, but I get a ThreadAbortException when I try to fetch a lot of data.
I am using SQL Server 2005 and have an endpoint that exposes some stored procedures as web methods.
One particular stored procedure I have exposed takes a long time to execute: about 10-15 minutes. While it is OK that the stored procedure takes this long, it is not desirable for the HTTP request that executed it to wait that long.
What I want is to call the stored procedure and have the call return immediately while the proc continues its work. I will call another stored proc at a later time to retrieve the results of the first one, which will store them in a temp table. I am thinking of using SQL Server Service Broker to achieve this.
Is there a better way to achieve this? And how does SQL Server process Service Broker requests? I don't want the query to be executed while the server is busy. Are there any hints I need to give Service Broker to be able to do this?
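Service Broker does fit this pattern: the web method just queues a message and returns, and an activation procedure on the queue does the slow work. Below is a minimal sketch; all object names are hypothetical, dbo.MyLongRunningProc stands in for your slow proc, and it assumes that proc writes its results somewhere the second call can read them later.

CREATE QUEUE LongWorkQueue;
CREATE SERVICE LongWorkService ON QUEUE LongWorkQueue ([DEFAULT]);
GO
-- Activation proc: drains one message, runs the slow work, ends the dialog.
CREATE PROCEDURE dbo.ProcessLongWork
AS
BEGIN
    DECLARE @handle uniqueidentifier;
    RECEIVE TOP (1) @handle = conversation_handle FROM LongWorkQueue;
    IF @handle IS NOT NULL
    BEGIN
        EXEC dbo.MyLongRunningProc;   -- hypothetical: stores its results in a table
        END CONVERSATION @handle;
    END
END
GO
ALTER QUEUE LongWorkQueue WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.ProcessLongWork,
    MAX_QUEUE_READERS = 1,            -- at most one worker runs at a time
    EXECUTE AS OWNER);
GO
-- The exposed web method only queues a request and returns immediately:
DECLARE @dialog uniqueidentifier;
BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE LongWorkService TO SERVICE 'LongWorkService';
SEND ON CONVERSATION @dialog;

There is no built-in "only run when the server is idle" hint; MAX_QUEUE_READERS only caps concurrency, so any load-based throttling has to be your own logic inside the activation proc.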
One of my stored procs, taking one parameter, runs for over 2 minutes. But if I run the same script from the stored proc with the same parameter value hardcoded, the query finishes in a couple of seconds. The execution plans are different as well. Any reason why this could happen? TIA.
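This is the classic symptom of parameter sniffing: the proc's cached plan was compiled for some earlier parameter value, while the hardcoded script gets a plan built for the value you are testing with. A sketch of two common workarounds; the proc, table, and column names here are placeholders, not your actual code.

ALTER PROCEDURE dbo.SlowProc @ProjectId int
AS
SELECT *
FROM dbo.BigTable
WHERE ProjectId = @ProjectId
OPTION (RECOMPILE);   -- build a plan for the actual value on every call

-- or, on SQL Server 2008 and later, keep one cached plan tuned for the
-- "average" value instead:
-- OPTION (OPTIMIZE FOR (@ProjectId UNKNOWN));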
I have a stored proc that is taking way too long to process, over 50 minutes.
What I need to do is go through 180K rows; if the projId and langId match the previous row, add 1 to a counter, and if not, reset the counter.
At the moment I am doing the following:
DECLARE @Id int, @ProjectId int, @LangCode char(6)
DECLARE @CurProjectId int, @CurLangCode char(6)
DECLARE @incrementer int

DECLARE Order_cursor CURSOR FOR
    SELECT Id, langCode, projectid FROM #tempTable

OPEN Order_cursor
SET @incrementer = 1
FETCH NEXT FROM Order_cursor INTO @Id, @LangCode, @ProjectId
WHILE @@FETCH_STATUS = 0
BEGIN
    IF (@CurProjectId = @ProjectId) AND (@CurLangCode = @LangCode)
    BEGIN
        SET @incrementer = @incrementer + 1
    END
    ELSE
    BEGIN
        SET @incrementer = 1
    END

    UPDATE #tempTable SET edbOrder = @incrementer WHERE Id = @Id

    SET @CurProjectId = @ProjectId
    SET @CurLangCode = @LangCode
    FETCH NEXT FROM Order_cursor INTO @Id, @LangCode, @ProjectId
END
CLOSE Order_cursor
DEALLOCATE Order_cursor
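Assuming the cursor's intent is to number each row within its (projectid, langCode) group in Id order, the whole loop can usually be replaced by one set-based UPDATE through a ranking function, which should take seconds rather than 50 minutes. A sketch:

WITH Numbered AS (
    SELECT edbOrder,
           ROW_NUMBER() OVER (PARTITION BY projectid, langCode
                              ORDER BY Id) AS rn
    FROM #tempTable
)
UPDATE Numbered
SET edbOrder = rn;

Note the cursor's SELECT has no ORDER BY, so its numbering order was never guaranteed anyway; the ORDER BY Id above is an assumption about the intended order.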
I have a stored procedure that normally takes about 5 hours to complete:
DELETE tblX WHERE PROC_DT < DATEADD(day, -93, GETDATE())
tblX has about 55 million records and has an index on PROC_DT.
I have this running as a scheduled task. Over the weekend, the task executed and it is still running 56+ hours later. Does anybody have any ideas as to where I should look for the problem? I am afraid to kill the process because of the rollback time.
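For future runs, the usual way to keep a purge like this killable is to delete in small batches, so each transaction (and any rollback) stays short and the log space can be reused between batches. A sketch, valid on SQL Server 2005 and later (on 2000, SET ROWCOUNT achieves the same effect):

DECLARE @rows int;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (10000)
    FROM tblX
    WHERE PROC_DT < DATEADD(day, -93, GETDATE());
    SET @rows = @@ROWCOUNT;
END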
I have a stored procedure being called from Visual Cafe 4.0 that takes over 30 minutes to run. Is there any way to background this so that control returns to the browser that the JFC applet is running in? The result set is saved to local disk and an email message sent to the user on completion. Thanks, Dave.
I have a stored procedure in SQL 2005 that purges data, and may take a few minutes to run. I'd like to report back to the client with status messages as the sp executes, using PRINT statements or something similar. I imagine something similar to BACKUP DATABASE, where it reports on percentage complete as the backup is executing.
I can't seem to find any information on how to do this. All the posts I can find say it's not possible and that PRINT output is returned only after the procedure completes. However, it would seem possible, since BACKUP DATABASE, for example, does exactly this.
Is there any way to send status-type messages to the client while the sp is executing?
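It is possible: PRINT output is buffered, but RAISERROR at severity 0 with the WITH NOWAIT option is flushed to the client immediately, which is the standard way to stream progress from a long-running proc. A one-line sketch (the counter variables are placeholders you would maintain in your loop):

RAISERROR ('Purged batch %d of %d...', 0, 1, @batch, @totalBatches) WITH NOWAIT;

On the client side these arrive as informational messages (for example, ADO.NET raises SqlConnection.InfoMessage for them), so the app can update a status display while the proc runs.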
We have just implemented our new app and I need to improve performance; we are the victims of not having had an adequate stress-testing tool before launch. We are experiencing some slowness, but no blocking issues. Is there an easy way to track all queries or stored procs that take longer than a specified time to run?
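On SQL Server 2005 and later, a plain SELECT against the DMVs shows what is currently running and for how long, no Profiler needed. A sketch listing statements that have been running longer than 10 seconds:

SELECT r.session_id,
       r.total_elapsed_time / 1000.0 AS elapsed_seconds,
       t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.total_elapsed_time > 10000    -- milliseconds
  AND r.session_id <> @@SPID;

For an after-the-fact view, sys.dm_exec_query_stats keeps cumulative timings per cached plan.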
I have a stored procedure in SQL Server 2008 (the stored proc is actually stored there with a name) and I can run it with the 'exec storedproc_name insert_date' command (my stored proc needs a date to run).
The stored proc just creates a temp table with some data (but we can ignore this bit).
I only need a way to run this stored proc remotely (I don't care about getting the data back, I just need it to run on the server).
Is there any way of doing this? Preferably via a unix system? I just need a way to run the 'exec' command. Returning data etc. isn't needed.
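One option, assuming the Microsoft sqlcmd utility (or FreeTDS's isql/tsql as an alternative) is available on the unix box; the server name, credentials, database, and date below are all placeholders:

sqlcmd -S myserver -U myuser -P mypassword -d mydatabase -Q "EXEC storedproc_name '2012-01-31'"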
I'm getting an error when I run the stored proc with a string parameter in an Execute SQL Task object.
This is the only code I have:
exec sp_udt_keymaint 'table1'
I also set the 'IsStoredProcedure' property to 'True', though when I edit the Execute SQL Task object I can see that this property is disabled.
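With an OLE DB connection, the Execute SQL Task usually wants the value passed through the Parameter Mapping page rather than inlined in the statement. A sketch of the common pattern (the variable name is hypothetical):

-- SQLStatement for the Execute SQL Task:
EXEC sp_udt_keymaint ?
-- then, on the task's Parameter Mapping page, map a string variable
-- (e.g. User::TableName) to parameter name/index 0.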
Hi. I've got a report with 4 different sections, with the datasets coming from tables that are populated via a stored procedure. I'd love it if the first thing this report did was run that stored procedure, so the data would be available for the actual reporting piece. Is that possible? And if so, how do I make it work?
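One common trick, sketched below with a hypothetical proc name: check "Use single transaction when processing the queries" on the report's data source, which makes the datasets run serially in the order they are defined, and make the first dataset the one that calls the populating proc.

-- Dataset 1 (runs first when the data source uses a single transaction):
EXEC dbo.PopulateReportTables;
SELECT 1 AS done;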
For reasons beyond the scope of my question, is there a way to run this command within a stored procedure from a low-privileged user login? I can grant the login the db_ddladmin role and the proc runs, but I'd rather not give out that level of permission to what is basically a glorified web-access login.
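One approach, sketched below, is to create the procedure WITH EXECUTE AS OWNER, so the privileged statement runs under the owner's permissions while the web login only needs EXECUTE on the proc itself. The object and login names here are hypothetical, and the ALTER INDEX is just a stand-in for whatever your command is.

CREATE PROCEDURE dbo.MaintenanceProc
WITH EXECUTE AS OWNER
AS
BEGIN
    -- the statement that otherwise required db_ddladmin goes here, e.g.:
    ALTER INDEX ALL ON dbo.SomeTable REBUILD;
END
GO
GRANT EXECUTE ON dbo.MaintenanceProc TO WebLogin;

Module signing with a certificate is the stricter alternative if EXECUTE AS is too broad for your audit requirements.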
Please forgive the simplicity of this question; I am not the dba type. When I connect to a server and look at my connection attributes in Activity Monitor, the user column shows the correct information for my domain\username. When I run a certain stored procedure in that connection, the domain\username changes to another person's. We are not using EXECUTE AS, SETUSER, or anything special to explicitly change the user. The stored procedure is in a schema that is owned by dbo (principal_id = 1; I verified by checking sys.database_principals).
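It is worth checking whether the proc itself was created WITH EXECUTE AS; that clause travels with the proc's definition, so nobody has to issue EXECUTE AS explicitly at run time. A sketch (the proc name is a placeholder):

SELECT OBJECT_NAME(m.object_id) AS proc_name,
       p.name AS executes_as
FROM sys.sql_modules AS m
JOIN sys.database_principals AS p
    ON p.principal_id = m.execute_as_principal_id
WHERE m.object_id = OBJECT_ID('dbo.TheProc');

If that returns a row, the proc runs as that principal; no row means it runs as the caller and the cause lies elsewhere.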
If I start a long-running query on a background thread, is there a way to abort the query so that it does not continue running on SQL Server?
The query would be running on SQL Server 2005 from a Windows Forms application using the BackgroundWorker component, so it would have been started from the BackgroundWorker's DoWork event using ADO.NET. If the user clicks an abort button in the UI, I want the query to die so that it does not continue to use SQL Server resources.
Hullo all. I have two machines: one is a standalone machine and the other is on a domain network. How can I run a stored procedure/job on the standalone machine from the domain machine? Running the procedure as a domain user results in a failed job/stored proc, and creating a SQL login and attempting to run it as that user also fails. How can I solve this problem? Please e-mail me at wayde@sunnygrp.com if you have any thoughts...
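Since domain credentials don't carry over to the standalone box, one sketch is a linked server on the domain machine that maps every local login to a SQL login that exists on the standalone machine (the server name, login, and password below are placeholders):

EXEC sp_addlinkedserver @server = 'STANDALONE01';
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'STANDALONE01',
     @useself = 'FALSE', @rmtuser = 'sqluser', @rmtpassword = 'password';
-- four-part name runs the remote proc:
EXEC STANDALONE01.MyDatabase.dbo.MyProc;

This assumes the standalone instance allows SQL authentication and that the mapped login has EXECUTE permission on the proc.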
I am working with a large application and am trying to track down a bug. I believe an error that occurs in the stored procedure is bubbling back up to the application and is causing the application not to run. Don't ask why, but we do not have some of the source code that was used to build the application, so I am not able to trace into the code. So basically I want to examine the stored procedure. If I run the stored procedure through Query Analyzer, I get the following error message:

Msg 2758, Level 16, State 1, Procedure GetPortalSettings, Line 74
RAISERROR could not locate entry for error 60002 in sysmessages.
(1 row(s) affected)
(1 row(s) affected)

I don't know if that error message is enough to stop the application from running. Does anyone know? If the RAISERROR occurs midway through the stored procedure, does the stored procedure terminate execution? Also, is there a way to trace into a stored procedure through Query Analyzer?

As a side note, below is a small portion of my stored proc where the error is being raised:

SELECT @PortalPermissionValue = ISNULL(MAX(PermissionValue), 0)
FROM Permission, PermissionType, #Groups
WHERE Permission.ResourceId = @PortalId
  AND Permission.PartyId = #Groups.PartyId
  AND Permission.PermissionTypeId = PermissionType.PermissionTypeId

IF @PortalPermissionValue = 0
BEGIN
    RAISERROR (60002, 16, 1)
    RETURN -3
END
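Msg 2758 itself just means error 60002 was never registered on this server; registering it makes the intended message appear. A sketch (the message text is a placeholder, since the real text lived on the original server):

EXEC sp_addmessage @msgnum = 60002, @severity = 16,
     @msgtext = 'You do not have permission to access this portal.';

As to the other question: a severity-16 RAISERROR does not by itself abort the procedure; execution continues unless the proc explicitly returns, as this one does with RETURN -3.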
I have a pretty complex query that aggregates lots of data and inserts multiple rows of that data into a reporting table. When I call this SPROC from SQL Server Management Studio, it executes in under 3 seconds. When I try to execute the same SPROC using .NET's SqlCommand object, the query runs indefinitely until the CommandTimeout is reached. Why would this SPROC behave differently with the same inputs when called from .NET? Thanks for your help!
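A frequent cause: SSMS connects with SET ARITHABORT ON while SqlClient does not, so the two callers get separate plan-cache entries, and the .NET one can be stuck with a plan sniffed for unfavorable parameter values. A quick way to reproduce it in SSMS (the proc and parameter names are placeholders for your own):

SET ARITHABORT OFF;
EXEC dbo.MyReportProc @ReportDate = '20100101';

If it is suddenly slow in SSMS too, you are looking at a plan problem (see the OPTION (RECOMPILE) / OPTIMIZE FOR sketch earlier in this page), not a .NET problem.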
Hi everyone... I'm trying to execute this update statement and it takes an eternity. Any ideas on how to rewrite or speed it up?
It's a several-step process; below is everything that I run, one step at a time. The final update statement is what takes so long. It should only affect about 2600 rows out of a potential 9000, which is why I'm confused about the response time.
select d.olddevicename, de.device, d.newdevicename into #temp9 from dns d, devices de where de.device = d.olddevicename
update #temp9 set device = newdevicename where olddevicename = device
update devices set device = #temp9.device from #temp9, devices where #temp9.device in (select #temp9.device from #temp9, devices where #temp9.olddevicename = devices.device)
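For what it's worth, the three steps look collapsible into a single joined UPDATE, which also avoids the IN subquery that is likely what is slow. A sketch, assuming the intent is "replace each device name that appears as an old name in dns with its new name":

UPDATE de
SET de.device = d.newdevicename
FROM devices AS de
INNER JOIN dns AS d
    ON d.olddevicename = de.device;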
I have three scheduled jobs: one runs once a day, one runs once per hour, and another runs every 17 minutes. It is a NetIQ application. I scheduled a SQL Server maintenance job last night which ran at 2:00 AM and 4:00 AM. This morning I came into the office and found all my jobs still running; they were all blocked by the first three jobs. I had to kill all of them. This afternoon I kicked off one of my many DTS packages, which usually runs in about 40 minutes, but it failed. I tried several times with no luck. I suspected that one of the user tables or one of the stored procedures was corrupted. After I recycled the server, dropped the table and the stored procedure, and recreated them, the package ran fine. The stored procedure involves many updates and inserts.
The question I have is: is it possible that this problem was caused by my killing the unfinished jobs (especially the SQL maintenance job)?
NOTE: the SQL maintenance job does not include backup of the database and transaction log.
My backups are running 5-6 hours on SQL 2000. I'm sure they used to take only an hour or so. On another server, backing up the same database (both about 50 gig), the backup takes only 45 minutes to an hour. What can I look at to see why it's taking so long?
Trying to come up with a way to monitor (without Profiler, hopefully with a job and a select statement) a specific SQL job that may cause a problem if its duration is too long. It seems that there is an sp called sp_sqlagent_log_jobhistory that shoves a record into sysjobhistory, but only after all the job steps have run. Anyone tried this before?
Hello gurus. I am using SQL 2005 and one job's status is "executing" in the job monitor. How can I check how long this job has been running? Please advise.
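For both of the last two posts, a plain query against msdb shows the Agent jobs that are currently executing and how long they have been at it. A sketch:

SELECT j.name,
       a.start_execution_date,
       DATEDIFF(minute, a.start_execution_date, GETDATE()) AS running_minutes
FROM msdb.dbo.sysjobactivity AS a
JOIN msdb.dbo.sysjobs AS j
    ON j.job_id = a.job_id
WHERE a.start_execution_date IS NOT NULL
  AND a.stop_execution_date IS NULL;

sysjobactivity keeps rows from earlier Agent sessions, so to be strict, also filter on the most recent session_id from msdb.dbo.syssessions to exclude rows left over from a previous Agent restart.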
I've got a server (SQL 2K, Win2K) where the backups have started running long. The database is a bit largish: 150GB or so. Up until last month, the backups were taking on the order of 4 to 5 hours, depending on the level of activity on the server. I'm using a T-SQL script in the SQL Agent to run the backups: native SQL backup to an AIT tape drive.
Now, for no apparent reason, the backups are taking on the order of 24 to 26 hours. The backups complete successfully, no errors, just taking an outrageously long time to complete. DBCCs check out AOK, no problems with the database.
No changes to the machine. No hardware changes. No software changes. Weird. Multiple tape media have been tried; it's not a case of a tape going bad. We've had no problems with this box for almost 4 years. Now it's gettin' jiggy with us!
Any ideas on where to start with this one? Thanks in advance.
Hi. I'm fairly new to SSRS, and very new to this forum. I have a report based on a stored procedure. I've optimized the procedure so that it runs in 2-4 minutes (previously over half an hour). However, when I run the report that calls the sp, it runs forever (well over 45 minutes in some cases), and the users basically give up on it. Any ideas why this happens and what steps I can take to improve performance?
I need to execute a long-running package (it takes about 16 hours to finish) to load a data warehouse for the first time with all historical data. It is a master package that executes other packages; I log the start time and the finish time of the package in a table to manage future incremental loads.
I executed the package on the SQL Server where it is saved, but after it had been running for 8 hours, a new instance of the package started automatically. Then two more packages started, one every two hours.
I have set MaxConcurrentExecutables = 4; could this setting cause this strange behavior?
I'm trying to optimize a long-running (several hours) query. This query is a cross join on two tables. Table 1 has 3 fields: ROWID, Lat and Long. Table 2 has Name, Addr1, Addr2, City, State, Zip, Lat, Long.
Both tables have LatRad (Lat in radians) and LonRad (Lon in radians). Sin and cos values of Lat and Lon are calculated and stored, to be used in the distance formula.
What I'm trying to do here is find the nearest dealer (Table 2) for each of Table 1's rows. The Select statement takes a long time to execute, as there are about 19 million recs in table 1 and 1250 rows in table 2. I ran into log issues (filling the transaction log), so I'm currently using table variables and have split the process into 100,000 recs at a time. I cross join and calculate the distance (@DistValues), then find the minimum distance (tablevar2) for each rowid, and then the result is inserted into another table (ResultTable).
distance = 3963.1 *
    CASE WHEN CAST(S.SinLat * T.SinLat + S.CosLat * T.CosLat * COS(T.LonRad - S.LonRad) AS numeric(20,15))
              NOT BETWEEN -1.0 AND 1.0
         THEN 0.0
         ELSE ACOS(CAST(S.SinLat * T.SinLat + S.CosLat * T.CosLat * COS(T.LonRad - S.LonRad) AS numeric(20,15)))
    END
FROM dbo.TopNForProcess T, dbo.Table2 S
WHERE ISNULL(T.Lat, 0) <> 0 AND ISNULL(T.Lon, 0) <> 0
Insert into @MinDistance
Select DataSeqno,Min(distance) From @DistValues Group by DataSeqno
Insert into ResultTable (DataSeqno,Lat2,Lon2,StoreNo,LAt1,Long1,distance)
Select D.DataSeqno, D.Lat2, D.Lon2, D.StoreNo, D.LAt1, D.Long1, M.distance from @DistValues D Inner Join @MinDistance M on D.DataSeqno = M.DataSeqno and D.Distance = M.Distance
I created a view called TopNForProcess, which looks like this. This cut down the processing time compared to when I had it as a subquery.
SELECT TOP (100000) DataSeqno, lat, Lon, LatRad, LonRad, SinLat, cosLat, SinLon, CosLon FROM Table1 WHERE (DataSeqno NOT IN (SELECT DataSeqno FROM dbo.ResultTable)) AND (ISNULL(lat, 0) <> 0) AND (ISNULL(Lon, 0) <> 0)
I have indexes on Table1 (one on RowId, and another on Lat and Lon) and on Table2 (Lat and Long).
Is there any way this can be optimized/improved? This is already in a stored procedure.
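One targeted change worth trying: the NOT IN (SELECT DataSeqno FROM dbo.ResultTable) in the view re-evaluates ResultTable for every candidate row, gets more expensive as ResultTable fills up, and silently returns nothing if DataSeqno is ever NULL there. NOT EXISTS, backed by an index on ResultTable(DataSeqno), usually behaves much better. A sketch of the reworked view body:

SELECT TOP (100000)
       t.DataSeqno, t.lat, t.Lon, t.LatRad, t.LonRad,
       t.SinLat, t.CosLat, t.SinLon, t.CosLon
FROM dbo.Table1 AS t
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.ResultTable AS r
                  WHERE r.DataSeqno = t.DataSeqno)
  AND ISNULL(t.lat, 0) <> 0
  AND ISNULL(t.Lon, 0) <> 0;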
I have a SQL procedure that can take several minutes to complete. I allow users to initiate the process through a web site and view a progress bar. When the process is running, though, the site slows to a crawl or times out completely. That long running process seems to block all other queries on the database. Is there a way to give this process a low priority or somehow throttle its resource use so that the other web processes can get a chance to run in a timely manner? Thanks for any advice.
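If this is SQL Server 2008 or later, Resource Governor can cap the purge's CPU share so it cannot starve the web queries. A minimal sketch; the pool, group, and login names are hypothetical, and the classifier function must live in master:

CREATE RESOURCE POOL PurgePool WITH (MAX_CPU_PERCENT = 20);
CREATE WORKLOAD GROUP PurgeGroup USING PurgePool;
GO
CREATE FUNCTION dbo.rg_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    IF SUSER_SNAME() = 'purge_user'   -- hypothetical login used by the purge
        RETURN 'PurgeGroup';
    RETURN 'default';
END
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rg_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;

That said, if the stall is lock blocking rather than CPU or I/O pressure, throttling will not help; enabling READ_COMMITTED_SNAPSHOT on the database, so readers are not blocked by the purge's writes, is the more likely fix.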