Long story, and I won't bore you with it. What I need to do is call a stored procedure and, from within that stored procedure, kick off other stored procedures and return before those other procedures have finished running.
I know I can execute a job, but my volume would be about one job per second, each with a life of 2-5 seconds. So in a very short period of time I can stack up a large number of jobs.
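One alternative that avoids creating a job per call (on SQL Server 2005 or later) is Service Broker activation: the outer procedure just sends a message and returns immediately, and an activation procedure does the long-running work in the background. A minimal sketch, assuming hypothetical object names (AsyncWorkQueue, AsyncWorkService, usp_LongRunningWork) and that Service Broker is enabled in the database:

-- one-time setup (hypothetical names)
CREATE QUEUE AsyncWorkQueue;
CREATE SERVICE AsyncWorkService ON QUEUE AsyncWorkQueue ([DEFAULT]);
GO
-- activation procedure: picks up a queued message and runs the real work
CREATE PROCEDURE dbo.usp_ProcessAsyncWork
AS
BEGIN
    DECLARE @handle UNIQUEIDENTIFIER;
    RECEIVE TOP (1) @handle = conversation_handle FROM AsyncWorkQueue;
    IF @handle IS NOT NULL
    BEGIN
        EXEC dbo.usp_LongRunningWork;   -- hypothetical inner procedure doing the slow work
        END CONVERSATION @handle;
    END
END;
GO
ALTER QUEUE AsyncWorkQueue WITH ACTIVATION
    (STATUS = ON, PROCEDURE_NAME = dbo.usp_ProcessAsyncWork,
     MAX_QUEUE_READERS = 5, EXECUTE AS OWNER);
GO
-- inside the outer procedure: queue the work and return without waiting
DECLARE @handle UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @handle
    FROM SERVICE AsyncWorkService TO SERVICE 'AsyncWorkService'
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @handle (N'go');

msdb.dbo.sp_start_job would also return immediately, but at roughly one call per second the job objects and job history pile up quickly; a queue with a handful of activation readers keeps the backlog in one place instead.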
We are facing a lot of problems with blocking; can anyone help us with this? The problem is as follows.
We have SQL Server 7.0 running on NT 4.0, with three web servers and five application servers accessing it. Until yesterday everything was fine. Suddenly, today, more than 18 processes were blocked by others (in a chain). First I killed some of the blocking processes and things were fine, but then it started again and processes keep getting blocked; I found that all of the blocking processes are running from the web servers. I ran SQL Profiler to get some information, but with no luck. I don't understand why this suddenly happened, because we haven't modified anything. Is there any way to overcome this situation? This is a production server, and because of it users are getting slow response or no response at all.
Here is what I want to know: why did it happen, and how do I trace the problem and fix it?
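As a starting point for tracing it, something like the following rough sketch against master..sysprocesses (which is what is available on SQL Server 7.0) shows who is blocked and who sits at the head of the chain; the head blocker's last statement can then be pulled with DBCC INPUTBUFFER:

-- blocked sessions and their blockers
SELECT spid, blocked, status, waittime, hostname, program_name, cmd
FROM master..sysprocesses
WHERE blocked <> 0
   OR spid IN (SELECT blocked FROM master..sysprocesses WHERE blocked <> 0)
ORDER BY blocked, spid;

-- last statement submitted by the head blocker (replace 999 with that spid)
DBCC INPUTBUFFER(999);

Sudden blocking with no code changes is often a changed query plan after data growth or stale statistics, or a transaction left open by one of the web servers; the head blocker's input buffer usually narrows it down.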
Over the past few days I have been trying to work with SQL Mobile and replication, and now I have a big problem.
While replication is running in my little application, the PDA users don't want to stop their work. So I tried to implement some of the routines from this online article: http://msdn2.microsoft.com/en-us/library/2ysxae29.aspx
The code works fine as long as I don't start another SQL task. As soon as I run any SELECT, INSERT, UPDATE, or DELETE statement, the database crashes with the error message: "The database file may be corrupted. Run the repair utility..."
Can anyone give me a tip on what I must do to fix this problem?
I'm currently writing a Windows application where a LOT of threads connect to the database and send small amounts of data. I'd like to ask you for the best approach to this.
The way it's implemented now:
Each thread uses its own SqlConnection, sends a few bytes of data, and closes the SqlConnection. Each thread has to send initialization data, which I guess is a few times bigger than the actual data, and then closing the connection sends even more bytes that could be avoided.
The way I'd like it to be:
There is only one SqlConnection and all threads share it. It's opened when the application starts and closed when the application exits. This way I send the initialization data only once, and by doing this I can save some bandwidth.
Do SqlConnection operations have to be in a critical section? Can a connection be used in parallel by many threads? Maybe there is a better way to do it? Is it recommended to close the SqlConnection as soon as possible after a query? If so, why?
I've found a pretty good code example at http://www.dotnetfun.com for an asynchronous trigger.
I've gone through the code to understand how to write my own async trigger with Service Broker, but the code isn't working! It seems that the stored procedure doesn't receive the messages in the queue, even though the queue does get filled.
Message Type
CREATE MESSAGE TYPE myMsgXML VALIDATION = WELL_FORMED_XML;

Contract
CREATE CONTRACT myContractANY (myMsgXML SENT BY ANY)

Queue
CREATE QUEUE myQueue WITH STATUS = ON, RETENTION = ON,
    ACTIVATION (STATUS = ON, PROCEDURE_NAME = sp_myServiceProgram, MAX_QUEUE_READERS = 5, EXECUTE AS SELF)

Service
CREATE SERVICE myService ON QUEUE myQueue (myContractANY)

Procedure (grepped from http://www.dotnetfun.com/)
CREATE PROC sp_myServiceProgram
AS
-- This procedure will get triggered automatically
-- when a message arrives at the queue.
-- Let's retrieve any messages sent to us here:
DECLARE @XML XML,
        @MessageBody VARBINARY(MAX),
        @MessageTypeName SYSNAME,
        @ID INT,
        @COL2 VARCHAR(MAX);
DECLARE @Queue TABLE (
    MessageBody VARBINARY(MAX),
    MessageTypeName SYSNAME);

WHILE (1 = 1)
BEGIN
    WAITFOR (
        RECEIVE message_body, message_type_name
        FROM myQueue
        INTO @Queue
    ), TIMEOUT 5000;

    -- If no messages exist, then break out of the loop:
    IF NOT EXISTS(SELECT * FROM @Queue) BREAK;

    DECLARE c_Test CURSOR FAST_FORWARD FOR SELECT * FROM @Queue;
    OPEN c_Test;
    FETCH NEXT FROM c_Test INTO @MessageBody, @MessageTypeName;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Let's only deal with messages of Message Type myMsgXML:
        IF @MessageTypeName = 'myMsgXML'
        BEGIN
            SET @XML = CAST(@MessageBody AS XML);
            -- Now let's save the XML records into the historical table:
            INSERT INTO tblDotNetFunTriggerTestHistory
            SELECT tbl.rows.value('@ID', 'INT') AS ID,
                   tbl.rows.value('@COL2', 'VARCHAR(MAX)') AS COL2,
                   GETDATE() AS UPDATED
            FROM @XML.nodes('/inserted') tbl(rows);
        END
        FETCH NEXT FROM c_Test INTO @MessageBody, @MessageTypeName;
    END
    CLOSE c_Test;
    DEALLOCATE c_Test;

    -- Purge the temporary in-proc table:
    DELETE FROM @Queue;
END

Send Message in an Update Trigger
SELECT @XML = (SELECT * FROM inserted FOR XML AUTO);

-- Send the XML records to the Service Broker queue:
DECLARE @DialogHandle UNIQUEIDENTIFIER,
        @ConversationID UNIQUEIDENTIFIER;

/* The target Service Broker service is the same service as the
   initiating service; however, you can set up this type of trigger
   to send messages to a remote server or another database. */
BEGIN DIALOG CONVERSATION @DialogHandle
    FROM SERVICE myService
    TO SERVICE 'myService'
    ON CONTRACT myContractANY;

SEND ON CONVERSATION @DialogHandle
    MESSAGE TYPE myMsgXML (@XML);

-- Let's detect an error state for this dialog
-- and roll back the entire transaction if one is detected:
IF EXISTS(SELECT * FROM sys.conversation_endpoints
          WHERE conversation_handle = @DialogHandle AND state = 'ER')
    RAISERROR('Dialog in error state.', 18, 127);
ELSE
BEGIN
    -- I want to list the queue after the trigger, so I disabled
    -- END CONVERSATION @DialogHandle;
    COMMIT TRAN;
END

The problem is that the procedure doesn't even get started! So I tried to receive from the queue manually:
WAITFOR ( RECEIVE message_body, message_type_name FROM myQueue INTO @Queue ), TIMEOUT 5000;
and I always run into the timeout and get nothing back. A SELECT * FROM myQueue does give me some results. Why can't I receive? I would be grateful for help, or at least a good tutorial; I haven't found one yet. Thanks and greetings, Karsten
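One thing to keep in mind: with RETENTION = ON, messages that have already been received stay visible in the queue until the conversation ends, so SELECT * FROM myQueue can show rows that RECEIVE will never hand back. Beyond that, a few hedged checks that often explain "activation never fires / RECEIVE times out" (catalog views as in SQL Server 2005):

-- is the queue still enabled for receive and for activation?
-- (a poison message, or a failing activation proc, can disable it)
SELECT name, is_receive_enabled, is_activation_enabled
FROM sys.service_queues
WHERE name = 'myQueue';

-- are messages stuck on the sender side instead of being delivered?
SELECT to_service_name, transmission_status
FROM sys.transmission_queue;

-- is the broker enabled in this database at all?
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name = DB_NAME();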
After failing over to the DR replica, all the databases are out of sync. The DR replicas were set up as asynchronous; the other two are set up as synchronous. Is this by design? No data has been updated in any of them, as they are test databases, so all the databases should be identical, with no data loss.
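For what it's worth, after a failover to an asynchronous replica the databases on the remaining replicas are typically left suspended and have to be resumed before they synchronize again. A quick way to see the per-database state (standard HADR DMVs; YourDb below is a placeholder):

SELECT ag.name AS ag_name,
       ar.replica_server_name,
       DB_NAME(drs.database_id) AS db_name,
       drs.synchronization_state_desc,
       drs.synchronization_health_desc,
       drs.is_suspended
FROM sys.dm_hadr_database_replica_states AS drs
JOIN sys.availability_replicas AS ar ON ar.replica_id = drs.replica_id
JOIN sys.availability_groups  AS ag ON ag.group_id   = drs.group_id
ORDER BY ag.name, ar.replica_server_name, db_name;

-- if a database shows as suspended after the failover, it usually has to be resumed:
-- ALTER DATABASE YourDb SET HADR RESUME;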
I have a table which contains approximately 300,000 records. I need to import this data into another table by executing a stored procedure. This stored procedure accepts the values from the table as parameters. My current solution reads the table with a cursor and executes the stored procedure for each row. This takes far too long, approximately 5-6 hours. I need to make it better. Can anyone help? Samir
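If the stored procedure's logic can be expressed as a set operation, a single INSERT ... SELECT over all 300,000 rows will normally finish in minutes rather than hours, because it avoids 300,000 separate procedure calls. A minimal sketch with hypothetical table and column names:

-- set-based import: one statement instead of one procedure call per row
INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
SELECT s.Col1,
       s.Col2,
       s.Col3
FROM dbo.SourceTable AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable AS t
                  WHERE t.Col1 = s.Col1);   -- skip rows already imported, if re-runs are possible

If the procedure has to stay because it carries business logic, batching the cursor loop into explicit transactions of a few thousand rows at a time usually still helps.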
If for any reason the AG fails over to the asynchronous node, how does replication behave? Since the data will not be in sync with the previous primary replica, how will replication work? I think we would have to set replication up again from scratch, as there's a high chance the subscribers are more up to date than the new primary replica, because failover to this node causes data loss. How can we keep replication in sync without setting it up again? Can we achieve this?
I have a 2-node cluster with 4 cores each, running 3 instances of SQL Server 2008 R2 Enterprise with 60 databases in total, 20 on each instance. I need to set up mirroring for each of these databases to a secondary server that also has 4 cores and 3 instances.
What I understand is that in this case the mirror server will provide a maximum of 512 worker threads, and the 60 mirrored databases would consume 240 of those threads.
What needs to be checked to determine the feasibility of going ahead with an asynchronous mirroring setup as described above?
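Two things worth checking up front are the instance's actual worker-thread ceiling and the current mirroring mode of the databases; both can be read directly from standard views in SQL Server 2008 R2:

-- how many worker threads this instance can actually create
SELECT max_workers_count FROM sys.dm_os_sys_info;

-- the configured 'max worker threads' value (0 = let SQL Server decide based on CPUs)
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'max worker threads';

-- mirroring mode and role per database on this instance
SELECT DB_NAME(database_id) AS db_name,
       mirroring_state_desc,
       mirroring_role_desc,
       mirroring_safety_level_desc   -- OFF = asynchronous (high performance), FULL = synchronous
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;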
I get this error when I look at the state of my SQLresults object. Have I coded something wrong?

Item = In order to evaluate an indexed property, the property must be qualified and the arguments must be explicitly supplied by the user.

conn.Open()
Dim strSql As String
I have upgraded a MS SQL database from 6.5 to 7.0. The database functioned fine in 6.5; now I have a table that is locking due to a blocking process. If I kill the process all is fine, but I am trying to determine what is causing the process to hang. Has anyone experienced any similar situations?
How do I kill a process? I turned off the workstation, but the process is still visible in the sysprocesses table and in the Current Activity window. I can't kill it (and I can't restart the server, because of the users).
If anyone knows anything about this problem, please let me know. Thanks.
I have a SQL Server with about 100 users connected. When I run a long-running process, it severely impacts user performance. The long-running process is a stored procedure with several cursors.
Are there SQL configuration settings that would reduce the impact of these long-running processes on other users?
/* The following stored procedure helps to kill all processes in a particular database, except the current process */
CREATE PROC Sp_KillAllProcessInDB
    @DbName VARCHAR(100)
AS
IF DB_ID(@DbName) IS NULL   -- note: "= NULL" never evaluates to true; use IS NULL
BEGIN
    PRINT 'Database does not exist'
END
ELSE
BEGIN
    DECLARE @spId VARCHAR(30)

    DECLARE TmpCursor CURSOR FOR
        SELECT 'KILL ' + CONVERT(VARCHAR, spid) AS spId
        FROM master..sysprocesses
        WHERE DB_NAME(dbID) = @DbName AND spid <> @@SPID AND spid > 50 AND dbID <> 0

    OPEN TmpCursor
    FETCH NEXT FROM TmpCursor INTO @spId

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC (@spId)
        FETCH NEXT FROM TmpCursor INTO @spId
    END

    CLOSE TmpCursor
    DEALLOCATE TmpCursor
END
/* The above procedure is handy for clearing out connections before quickly switching a database to single-user mode */
I often need to disconnect users before backing up a database or setting it to restricted users. Each time, I have to go to Current Activity and kill one process at a time. Is there a way I can kill all processes on a database, or force out all connections to it?
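One way that avoids killing spids one by one is to let ALTER DATABASE do it: WITH ROLLBACK IMMEDIATE drops the other connections and rolls back their open transactions in one step. A minimal sketch (YourDb is a placeholder):

-- take exclusive access, dropping all other connections
ALTER DATABASE YourDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- ... run the backup or other maintenance here ...

-- let everyone back in afterwards
ALTER DATABASE YourDb SET MULTI_USER;

-- RESTRICTED_USER works the same way if dbo/sysadmin access should stay open:
-- ALTER DATABASE YourDb SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;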
Can you kill processes in SQL Server 2000 without stopping SQL Server and restarting it? I am using sp_who2 to get a list of active users, and I see some accounts that are logged off but still showing up; I am trying to find a way to kill those sessions and the processes they are running.
When I look at the output of sp_who2, I see the status of some processes, such as the log writer and checkpoint, shown as "SUSPENDED", and I also found the same status for "IF @@TRANCOUNT>0 COMMIT TRAN".
Please let me know what this means in the context of SQL Server 2005.
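SUSPENDED simply means the session's worker is waiting on something rather than using CPU; for background tasks like the log writer and checkpoint that is their normal idle state, and for user sessions the wait type tells you what they are waiting for. On SQL Server 2005 you can see it directly:

-- what each suspended request is actually waiting on
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,
       r.wait_time,
       r.blocking_session_id
FROM sys.dm_exec_requests AS r
WHERE r.status = 'suspended';

If a user batch such as "IF @@TRANCOUNT > 0 COMMIT TRAN" shows up as suspended, the wait_type and blocking_session_id columns are the first place to look.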
I need to know exactly which VBA lines create and destroy SQL Server processes, i.e. those visible in Enterprise Manager under Management > Current Activity > Process Info.
Why? I am experiencing strange behaviour with processes that are created when I create a DAO Database object with the following line:

Set m_ResDatabase = DBEngine.Workspaces(0).OpenDatabase(strDSN, False, False, strODBC)

This creates the process as expected. However, the following lines don't always close the ensuing process:

If Not m_ResRecordSet Is Nothing Then
    m_ResRecordSet.Close
    Set m_ResRecordSet = Nothing
End If
If Not m_ResDatabase Is Nothing Then
    m_ResDatabase.Close
    Set m_ResDatabase = Nothing
End If
If Not m_ResWorkspace Is Nothing Then
    m_ResWorkspace.Close
    Set m_ResWorkspace = Nothing
End If

It seems as if SQL Server keeps hold of the first two processes and then will release any subsequent ones.
Can anyone shed any light on this, or point me to any good web pages where I might find some answers?
Regards, Chris
I have a foreach loop which contains a call to an Execute Process task. I want the Execute Process task to run once for each element in my foreach, but I want to run as many of those Execute Process tasks in parallel as my server can handle. What is the best way to do this? Any help would be greatly appreciated.
I had to move my developed software onto another computer in order to demo this. The new computer has IIS installed as well as VS2005 and SQL Server 2005. When I try to execute I get this error.
A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)
I have worked on this problem for two days now and have to demo it first thing Monday. Any help would be appreciated.
I am receiving an error from my ODBC driver “Maximum number of DBPROCESSes already allocated.”
I confirmed that there are 25 connections and that this is the default. This is caused by error message 10029, SQLEDBPS, when the maximum number of simultaneously open DBPROCESS structures exceeds the current setting. I would like to increase this maximum.
I have found only two ways to do this. One is using dbsetmaxprocs using C and the other is using SqlSetMaxProcs using Visual Basic. My problem is that I am interfacing to SQL Server using a third party tool that is doing the lower level programming.
Is there some way that I can increase the maximum number of DB processes for all databases that are part of the SQL Server 7 environment, or can I set this value using a program that is called from a stored procedure?
Any ideas in this area will be greatly appreciated.
I wish to select processes from sysprocesses that are SLEEPING and older than a certain time (say 10 minutes) so that I may KILL them. I can get the query to do the select, but how do I KILL the process? I have tried selecting the SPID into a local variable and then trying KILL @var_name, but I get "Incorrect syntax near '@var_name'".
I have tried all of the resources that I can find, but without success. Is this possible? If so, how do I go about doing it?
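That syntax error is expected: KILL only accepts a literal spid, not a variable, so the usual workaround is to build the command as a string and run it with EXEC. A sketch, assuming the sysprocesses columns (status, last_batch) behave as on SQL Server 2000/2005:

DECLARE @spid INT,
        @sql  VARCHAR(50);

DECLARE kill_cur CURSOR FAST_FORWARD FOR
    SELECT spid
    FROM master..sysprocesses
    WHERE status = 'sleeping'
      AND DATEDIFF(minute, last_batch, GETDATE()) > 10   -- idle longer than 10 minutes
      AND spid > 50;                                      -- leave system spids alone

OPEN kill_cur;
FETCH NEXT FROM kill_cur INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'KILL ' + CONVERT(VARCHAR(10), @spid);
    EXEC (@sql);   -- dynamic SQL gets around KILL not accepting a variable
    FETCH NEXT FROM kill_cur INTO @spid;
END
CLOSE kill_cur;
DEALLOCATE kill_cur;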
Sorry, I didn't quite know where this should go. I'm reading the Kimball Group's "Microsoft Data Warehouse Toolkit". Has anyone got any experience in defining business processes? I'm struggling a little to look at my employer's recruitment business and work out exactly what the real processes are. Or maybe one of you has read something somewhere about this specific skill?
------------------------
Me: What do you want to know from your data warehouse?
Client: Err...Emm...Everything
Me: OK, that's great. That's all I need to know. I'll see you when it's done.
We currently have a routine that "forks" out (to use the Unix term) T-SQL commands to run asynchronously via SQL Agent jobs. Each T-SQL command gets its own job, and the job starts immediately after creation. Sometimes we can have too many of these jobs running at the same time, and the box crawls along until the jobs finish up. Is there a way we can limit the number of active jobs running under the SQL Agent at one time? Or is there a way to limit the number of active (runnable) processes on SQL Server in general?
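There is no built-in cap on concurrent Agent jobs, but the forking routine can throttle itself by counting how many T-SQL job steps the Agent is currently running before it submits another one. A hedged sketch (the program_name pattern below is what SQL Agent normally reports; verify it on your instance):

-- count job steps currently executing under SQL Agent
SELECT COUNT(*) AS running_agent_steps
FROM master..sysprocesses
WHERE program_name LIKE 'SQLAgent - TSQL JobStep%';

-- before creating/starting the next job, wait until the count drops below a cap
WHILE (SELECT COUNT(*)
       FROM master..sysprocesses
       WHERE program_name LIKE 'SQLAgent - TSQL JobStep%') >= 10   -- arbitrary cap
BEGIN
    WAITFOR DELAY '00:00:05';
END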
Is there a common way to document ETL (packages / data flows)? I looked for documentation templates on the internet but didn't find anything useful.
I'm just a newbie with XML. We're now moving all our DTS packages to SSIS. Most of them are simply processes that take a plain file and load the data into a SQL table, or the other way around. So I wonder what role XML will play here.
I mean, are there big differences between storing data in the XML format and in nvarchar?
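The short version: an nvarchar column just stores characters, while the xml data type checks well-formedness on insert, stores the data in an internal parsed form, and can be queried and indexed on the server. A small sketch (SQL Server 2005+, hypothetical table name) showing the difference:

CREATE TABLE dbo.StagingDemo
(
    Id     INT IDENTITY PRIMARY KEY,
    AsText NVARCHAR(MAX),   -- plain characters: no validation, no XML methods
    AsXml  XML              -- must be well-formed; supports XQuery methods and XML indexes
);

INSERT INTO dbo.StagingDemo (AsText, AsXml)
VALUES (N'<row id="1" name="abc"/>', N'<row id="1" name="abc"/>');

-- only the xml column can be shredded on the server:
SELECT AsXml.value('(/row/@name)[1]', 'VARCHAR(50)') AS name
FROM dbo.StagingDemo;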