Blocks On SQL Server
Nov 26, 2001
I need to test a block-alert and would like to test this by causing a block on our test server. Anyone have a script handy that will do this?
Hello, we use two instances of MS SQL Server 2000 (version 8.00.760) on a 4-processor Windows 2000 (SP4) machine. A DTS package on one instance completely blocks the work on the other instance. Both instances have two dedicated processors, and the two processors of the blocked instance sit idle. How is this possible? Has anyone seen the same behavior of SQL Server and found a solution for this problem? Thanks in advance for any answers. Jürgen Simonsen
I'm trying to set the query to send email ONLY when it returns records of blocks, and I can't seem to get this going.
declare @blocks varchar(max)
set @blocks = (SELECT spid,
sp.[status],
loginame [Login],
hostname,
blocked BlkBy,
sd.name DBName,
[Code] ....
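A minimal sketch of one way to gate the mail: only call sp_send_dbmail when a blocked spid actually exists. It assumes SQL 2005+ with Database Mail configured; the profile name and recipient address are placeholders, and the sysprocesses/sysdatabases lookup just mirrors the snippet above.

IF EXISTS (SELECT 1 FROM master.dbo.sysprocesses WHERE blocked <> 0)
BEGIN
    DECLARE @blocks varchar(max);

    -- Build a one-line-per-victim text report of the blocked sessions
    SELECT @blocks = ISNULL(@blocks + CHAR(13) + CHAR(10), '')
           + 'spid ' + CAST(sp.spid AS varchar(10))
           + ' blocked by ' + CAST(sp.blocked AS varchar(10))
           + ' in ' + sd.name
    FROM master.dbo.sysprocesses AS sp
    JOIN master.dbo.sysdatabases AS sd ON sd.dbid = sp.dbid
    WHERE sp.blocked <> 0;

    -- Only reached when blocking exists, so mail goes out only then
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'DBA',               -- placeholder profile
         @recipients   = 'dba@example.com',   -- placeholder address
         @subject      = 'Blocking detected',
         @body         = @blocks;
END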
I want to display data in blocks. Below is the CREATE/INSERT script and the format of the desired output.
CREATE TABLE [dbo].[Test_AA](
[CounterpartID] [varchar](2000) NULL,
[CounterpartName] [varchar](2000) NULL,
[SATName] [varchar](2000) NULL,
[NameOfBO] [varchar](2000) NULL,
[Code] ....
Result Should be like This :
ID:17388
Name:Scottie Berwick
SATName:Scottie Berwick
NameOfBO:
SATAccountNumber:
[Code] ....
We have a massive database with an almost massive amount of traffic to and from it.
I've been requested to implement sliding window partitioning with two partitions, an active and a passive one. I managed to test this on a very small testbed last month.
I have currently moved 97k of the table onto the partition function, leaving me another 26k to go.
I'm using the following stored procedure to implement the sliding window:
CREATE PROCEDURE [dbo].[ManageFactSlidingWindow](@pFunction nvarchar(max),@pSchema nvarchar(max),@FG nvarchar(max),@moveDays int)
/*****************************************************************************
PROCEDURE NAME: [ManageFactSlidingWindow]
AUTHOR: Arshad Ali
CREATED: 02/24/2013
DESCRIPTION: This stored procedure manages sliding window for the partitioned table
VERSION HISTORY:
DATE EMAIL Company DESCRIPTION
[Code] .....
When I try to move the partition by even a single day, I get loads of locks.
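For reference, the heart of a sliding window on a two-partition table is usually a SWITCH of the stale partition into an empty staging table, followed by a MERGE and SPLIT of the boundary. SWITCH itself is a metadata operation, but it needs a schema-modification lock on both tables, so it will queue behind and then block any open readers. A bare sketch; the table, function, scheme, filegroup and boundary values here are placeholders, not your actual names:

-- 1. Move the old partition out to an empty, identically structured staging table
ALTER TABLE dbo.FactTable SWITCH PARTITION 1 TO dbo.FactTable_Staging;

-- 2. Drop the now-empty boundary and create the next one
ALTER PARTITION FUNCTION pfFact() MERGE RANGE ('20130224');
ALTER PARTITION SCHEME psFact NEXT USED [FG_Active];
ALTER PARTITION FUNCTION pfFact() SPLIT RANGE ('20130226');

-- 3. Archive or simply drop the switched-out rows without touching the live table
TRUNCATE TABLE dbo.FactTable_Staging;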
I know blocks and deadlocks are different, but how related are they? It seems like whenever I get reports of deadlocks I also have blocks, and the blocks grow as time passes.
Hi, Everyone
Can anyone tell me how to find out who is blocking the table or record? The Lock Blocks counter is extremely high in Performance Monitor.
Regards,
I would like to log blocks to see what causes them.
One hack of a way to do it is to run code similar to what populates the blk column in sp_who, as a job every 5 seconds. It would store the results of that sp_who-style output whenever any of the rows has a value greater than zero in the blk column.
This way, I can see what was going on at the time there was a problem.
Does anyone have a better strategy for this?
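A rough sketch of that job step, assuming the SQL 2000-style sysprocesses view (which is what sp_who reads) and a pre-created logging table in an admin database; both of those names are placeholders:

-- Job step, scheduled every few seconds; writes only when blocking exists
IF EXISTS (SELECT 1 FROM master.dbo.sysprocesses WHERE blocked <> 0)
BEGIN
    INSERT INTO AdminDB.dbo.BlockLog
        (LogTime, spid, blocked, loginame, hostname, dbid, waitresource, cmd)
    SELECT GETDATE(), spid, blocked, loginame, hostname, dbid, waitresource, cmd
    FROM master.dbo.sysprocesses
    WHERE blocked <> 0
       OR spid IN (SELECT blocked FROM master.dbo.sysprocesses WHERE blocked <> 0);
END

The second branch of the WHERE also captures the head blocker itself, not just its victims, which makes the log easier to read afterwards.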
Is it possible to go through the history blocks and change field names? i.e.,
I want to change "PS001_Pump1.RT_Daily" to "PS001_Pump1.Runtime_Daily"
so I can access the historical data through the new tag name.
If so, is this done using an update query?
I need to find a way to send mail in blocks of 100 and 500 at intervals of 120 seconds. Does anyone know of any code for this?
Thanks!
Etymon
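A rough sketch of one way to do this, assuming Database Mail on SQL 2005+ and a dbo.Recipients table with a Sent flag; the table, columns, profile and subject are all placeholders:

DECLARE @BatchSize int, @list varchar(max);
SET @BatchSize = 100;   -- or 500

WHILE EXISTS (SELECT 1 FROM dbo.Recipients WHERE Sent = 0)
BEGIN
    SET @list = NULL;

    -- Build a semicolon-separated recipient list for the next batch
    SELECT TOP (@BatchSize) @list = ISNULL(@list + ';', '') + EmailAddress
    FROM dbo.Recipients
    WHERE Sent = 0
    ORDER BY RecipientID;

    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'DBA',                  -- placeholder profile
         @recipients   = @list,
         @subject      = 'Batch mailing',
         @body         = 'Message body goes here';

    -- Flag the batch just sent so the next pass moves on
    ;WITH batch AS (
        SELECT TOP (@BatchSize) Sent
        FROM dbo.Recipients WHERE Sent = 0 ORDER BY RecipientID
    )
    UPDATE batch SET Sent = 1;

    WAITFOR DELAY '00:02:00';                    -- 120 seconds between batches
END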
Dear All, we are running SQL 2000 Server and make use of xp_sendmail. For whatever reason the mail service can run into problems, and it looks like the statement below does not finish:
EXEC @Status = master..xp_sendmail @recipients=@TOList, @copy_recipients=@CCList, @subject='the subject goes here', @message=@MailText, @no_output=TRUE
Unfortunately the statement is in an update trigger, and hence it blocks the table for any further updates. My questions are: Can I achieve a kind of timeout check in my trigger in order to bypass the xp_sendmail call? In general, sending mail in a trigger may not be a good idea; how can this be solved better? Any hint is highly welcome. Regards, Rolf
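On doing it better: a common pattern is to have the trigger only queue a row and let a scheduled job make the actual xp_sendmail call, so the UPDATE never waits on the mail subsystem. A rough sketch; the MailQueue table (with an identity MailID column) and its columns are placeholders:

-- Inside the trigger: just record the request (fast, cannot hang on the mail service)
INSERT INTO dbo.MailQueue (Recipients, CopyRecipients, MailText, QueuedAt)
VALUES (@TOList, @CCList, @MailText, GETDATE())

-- In a scheduled job, outside any user transaction: drain the queue
DECLARE @MailID int, @To varchar(255), @CC varchar(255), @Body varchar(8000), @Status int
SELECT TOP 1 @MailID = MailID, @To = Recipients, @CC = CopyRecipients, @Body = MailText
FROM dbo.MailQueue ORDER BY MailID

EXEC @Status = master..xp_sendmail
     @recipients      = @To,
     @copy_recipients = @CC,
     @subject         = 'the subject goes here',
     @message         = @Body,
     @no_output       = 'TRUE'

DELETE FROM dbo.MailQueue WHERE MailID = @MailID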
Hi
I was trying to clean up some conversations in Service Broker and caused a lot of blocking that I seem unable to kill. There was one conversation that I was not able to end, so I wanted to restart the SQL service. But I can't even restart the SQL service; I get the following in Event Viewer:
Timeout occurred while waiting for latch: class 'SERVICE_BROKER_TRANSMISSION_INIT', id 00000001A2B03540, type 2, Task 0x0000000000C2EDA8 : 0, waittime 5400, flags 0xa, owning task 0x00000002DEBCA5C8. Continuing to wait.
Has anyone come across this?
thanks
Paul
The following two stored procedures work fine. I need to add try/catch blocks to them. How do I do it? Please modify my stored procedures with try and catch blocks; in the catch block I need to call the procedure called ErrorMsg, which takes the error severity.
Procedure 1
-----------
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
-- =============================================
-- Author: C.R.P. RAJAN
-- Create date: March 26, 2008
-- Description: PROCEDURE FOR IMPLEMENTING SOFT PURGE
-- =============================================
ALTER PROCEDURE [dbo].[SP_SOFTPURGE] @POID INT
AS
BEGIN
    UPDATE POMASTER SET FLAG=1 WHERE POID=@POID
    SELECT * FROM POMASTER WHERE FLAG=0
END
Procedure 2
-----------
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
-- =============================================
-- Author: C.R.P. RAJAN
-- Create date: March 26, 2008
-- Description: Stored Procedure for Hard Purge
-- =============================================
ALTER PROCEDURE [dbo].[SP_HARDPURGE] @PoId INT
AS
BEGIN TRAN
INSERT INTO ARCHIVE SELECT * FROM POMASTER WHERE POID = @POID
IF @@ERROR != 0 BEGIN ROLLBACK TRAN RETURN END
DELETE FROM POMASTER WHERE POID = @POID
IF @@ERROR != 0 BEGIN ROLLBACK TRAN RETURN END
COMMIT TRAN
SELECT * FROM POMASTER
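A sketch of the hard purge rewritten with TRY/CATCH. Since the post doesn't show how ErrorMsg is declared, the call at the end simply passes the standard error functions; its parameter list is an assumption to adjust:

ALTER PROCEDURE [dbo].[SP_HARDPURGE] @PoId INT
AS
BEGIN
    BEGIN TRY
        BEGIN TRAN
        INSERT INTO ARCHIVE SELECT * FROM POMASTER WHERE POID = @PoId
        DELETE FROM POMASTER WHERE POID = @PoId
        COMMIT TRAN
        SELECT * FROM POMASTER
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRAN
        -- Assumed signature; pass whatever ErrorMsg actually expects
        DECLARE @Msg nvarchar(4000), @Sev int
        SELECT @Msg = ERROR_MESSAGE(), @Sev = ERROR_SEVERITY()
        EXEC dbo.ErrorMsg @Msg, @Sev
    END CATCH
END

SP_SOFTPURGE can take the same shape: wrap the UPDATE and SELECT in BEGIN TRY ... END TRY and reuse the same CATCH block.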
When I run reports on a SQL Server 6.5 database which involve the creation of temp tables, the whole tempdb is locked and all other users are blocked.
The reports involve queries which fetch data into temporary tables, such as:
SELECT c1 , c2, c3 ...
INTO #tmp1
FROM T1
WHERE C1 = xxx
AND .........
When any of these queries are run, it blocks all other operations using TEMPDB, like creation of temporary tables or queries using sorts etc.
It appears that the query locks the system tables (sysobjects, syscolumns) in TEMPDB.
However, the above query works fine if it is constructed as,
CREATE TABLE #tmp1 (a1 ..., a2 ...., a3 ..... )
INSERT INTO #tmp1
SELECT c1, c2, c3
FROM T1
WHERE c1 = xxx
AND .........
Could someone please tell me what actually is happening in the first scenario. Is there a way to avoid the blocking of TEMPDB usage other than the workaround mentioned above (which means I have to change all my queries)?
Thanks
Nishant
I'm trying to place records into groups of 100, so I need a query that will return records but not based upon an autonumber type field. For instance, if I had a set of records that had been sequentially numbered (by autonumber) with IDs to greater than 3,600 but there were actually only 300 IDs remaining ranging from 1 to 3600 (say the others were deleted), then I would want to use a query to make three groups of 100 of the remaining IDs.
I can use SELECT TOP 100 for the first 100 records. How do I get subsequent groups of 100?
It's probably obvious and I am thinking too hard.
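One way on SQL Server 2005 or later is to number the surviving IDs with ROW_NUMBER and divide by 100; a sketch with placeholder table and column names:

-- Assign every remaining ID to a group of 100
SELECT ID,
       (ROW_NUMBER() OVER (ORDER BY ID) - 1) / 100 + 1 AS GroupNo
FROM dbo.MyRecords;

-- Or pull just one block, e.g. the third group of 100
SELECT ID
FROM (SELECT ID, ROW_NUMBER() OVER (ORDER BY ID) AS rn
      FROM dbo.MyRecords) AS x
WHERE x.rn BETWEEN 201 AND 300;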
I wanted to set up a mechanism that would transfer blocks of records (a few dozen to in rare cases a few thousand), with slight modification, from one database to another. It's a sort of custom partial archiving process that would be triggered from a web-based admin application. Records in the target db would be identical except:
-- the primary key in the source table, an identity field, would be just an integer in the target table
-- the target table has an extra field, an integer batch ID supplied by the web application that triggers the process
It's a simple, if not efficient, matter to do it within the web application: query the source table, suck the records into memory, and insert them one by one into the target db. This will be an infrequent process which can be done at off-hours, so a bit of inefficiency is not the end of the world. But I wondered if there is a more sensible, orthodox approach:
-- Could this process be done, and done efficiently, as a stored procedure with the batch ID passed as a parameter?
-- Is there any way to do a bulk insert from a recordset or array in memory using ADO and SQL? And if so, is that better than inserting records one by one?
Advice on the best general approach would be appreciated, and I will try to figure out the details.
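On the first question: yes, a single INSERT ... SELECT inside a stored procedure, with the batch ID passed as a parameter, keeps the whole copy on the server and avoids pulling rows into the web tier at all. A rough sketch, with every object and column name a placeholder for the real schema:

CREATE PROCEDURE dbo.ArchiveBatch
    @BatchID int,
    @FromID  int,   -- placeholder selection criteria for the block to move
    @ToID    int
AS
BEGIN
    SET NOCOUNT ON;

    -- The source identity value lands in a plain int column in the target,
    -- and every copied row carries the caller-supplied batch ID.
    INSERT INTO ArchiveDB.dbo.TargetTable (SourceID, Col1, Col2, BatchID)
    SELECT s.ID, s.Col1, s.Col2, @BatchID
    FROM dbo.SourceTable AS s
    WHERE s.ID BETWEEN @FromID AND @ToID;
END

From the web application this becomes one command execution rather than a row-by-row loop, which is usually faster and keeps the logic in one place.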
Hello,
I'm relatively new to SSIS programming and am experiencing a serious problem with a package (6618 KB large) containing 5 data flows. These data flows all start from a DataReader source and flow through multiple transformations. While editing the fifth data flow the development environment suddenly became unworkably slow. After any modification made to the data flow it takes several minutes before I can continue editing; meanwhile, in Task Manager the devenv process takes a very large portion (50 to 100%) of the CPU.
Building the package, shutting down SSIS and even rebooting the PC don't make any difference. When you close the package and reopen it, this takes at least 10 minutes.
We work with Microsoft SQL Server Integration Services Designer
Version 9.00.1399.00.
Does anyone know if there is a limit to the number of transformations in a flow or to the size of a package? Could that be the problem?
Before this problem occurred I was multiplying transformations by copying, renaming and altering them within a data flow. Could SSIS have some problem with that?
Any suggestion for solving this problem, other than chucking it all away and restarting, will be gratefully accepted.
With kind regards,
Paul Baudouin
Hi:
Somebody told me that if I format my database disk with 64K NTFS blocks, I can get better performance. Is that true? Is there any problem in SQL Server with this block size?
We have been working for a long time on optimizing this script, and the latest modification I made was splitting the script into small blocks and using UNION ALL. This helped a lot, but it still takes some 45 minutes in the production environment. I'm not able to find anything more in this script to optimize.
I have several stored procedures with a similar query. The SELECT clauses are identical, but the FROM and WHERE clauses are different.
Since the SELECT clause is long and complicated (and often changes), I want to place the SELECT clause in a separate file and then have the stored procedures include the block of text for the SELECT clause when they compile.
Is this possible? Looking through the docs and searching online, I don't see any way to do this.
I have the following SQL, and I want to log the number of visits from clients. A session is a block of time; for example, any time between 9 and 12 is a morning session, 12 to 5 is an afternoon session, etc. I need to count the number of sessions in a month, not the appointments within each session. Make sense?
select sr.resourceid, st.TimeStart, datename(m,sr.scheduledate) as sesmonth, datename(yy,sr.scheduledate) as year,
(CASE
WHEN DATEPART(hour, st.TimeStart) BETWEEN 0 AND 12 THEN 'MORNING SESSION'
WHEN DATEPART(hour, st.TimeStart) BETWEEN 12 AND 17 THEN 'AFTERNOON SESSION'
WHEN DATEPART(hour, st.TimeStart) BETWEEN 17 AND 24 THEN 'EVENING SESSION'
[Code] ....
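One way to count session blocks rather than appointments is to make the resource/day/session combination distinct first, then count those rows per month. A sketch reusing the column names from the snippet above; the table names and the join between them are assumptions:

SELECT sesmonth, [year], COUNT(*) AS SessionCount
FROM (
    SELECT DISTINCT
           sr.resourceid,
           CONVERT(varchar(10), sr.scheduledate, 120) AS sesday,   -- collapse to one row per day
           DATENAME(m,  sr.scheduledate) AS sesmonth,
           DATENAME(yy, sr.scheduledate) AS [year],
           CASE
               WHEN DATEPART(hour, st.TimeStart) < 12 THEN 'MORNING SESSION'
               WHEN DATEPART(hour, st.TimeStart) < 17 THEN 'AFTERNOON SESSION'
               ELSE 'EVENING SESSION'
           END AS session
    FROM dbo.ScheduleResource AS sr
    JOIN dbo.ScheduleTime AS st ON st.resourceid = sr.resourceid    -- assumed join
) AS s
GROUP BY sesmonth, [year];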
I need help writing SQL code to convert blocks of columns into rows.
Example,
Id A1 A2 B1 B2
1 1 1 1 1
2 2 2 2 2
3 3 3 3 3
into
Id Group Value
1 A 1
1 A 1
1 B 1
1 B 1
2 A 2
2 A 2
2 B 2
2 B 2
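A sketch of one way to do this on SQL Server 2008 or later with CROSS APPLY and a VALUES constructor, taking the letter prefix of each column as the group; the table name dbo.SourceWide is a placeholder:

SELECT t.Id, v.Grp AS [Group], v.Value
FROM dbo.SourceWide AS t
CROSS APPLY (VALUES ('A', t.A1),
                    ('A', t.A2),
                    ('B', t.B1),
                    ('B', t.B2)) AS v(Grp, Value)
ORDER BY t.Id, v.Grp;

Each source row fans out into four rows, one per column, which matches the Id/Group/Value shape shown above.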
Hi. Periodically I need to run a delete statement that deletes old data. The problem is that this can time out using ODBC (via the CDatabase and CRecordset classes in legacy code). Also, while it's running the delete, the table it's operating on is locked and my application can't continue to run and operate on rows not affected by the delete.
Are there any workarounds for this? Can the timeout be set in the connect string?
Thanks,
Brian
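One common workaround is to delete in small batches so no single statement holds its locks for long or trips the ODBC query timeout; a rough sketch assuming SQL 2005+ syntax, with the table and date column as placeholders:

DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    -- Each pass removes at most 5000 rows and then releases its locks
    DELETE TOP (5000) FROM dbo.MyTable
    WHERE PurgeDate < DATEADD(day, -90, GETDATE());

    SET @rows = @@ROWCOUNT;
END

On the client side, MFC's CDatabase::SetQueryTimeout is the usual place to raise the statement timeout; the timeout settings in the connect string generally govern login rather than query execution.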
I've read that Microsoft.ApplicationBlocks.Data for .NET v2 can't be deployed to a mobile app, and my experience bears that out. So I'm wondering if there are application blocks for Windows Mobile 5 that would know how to talk to both SQL Server Mobile and SQL Server 2005. I see OpenNETCF has a port, but as far as I can tell it only addresses SQL Server Mobile and not talking to a SQL Server 2005 remote database. I can use the OpenNETCF version for my SQL Server Mobile requirements, but I'm hoping there is an encapsulation of SqlClient calls for communication with my server db.
thanks
braden
I wanted to set up a mechanism that would transfer blocks of records (a few dozen to in rare cases a few thousand), with slight modification, from one database to another. It's a sort of custom partial archiving process that would be triggered from a web-based admin application in plain old ASP (not .net alas). Records in the target db would be identical except:
-- the primary key in the source table, an identity field, would be just an integer in the target table
-- the target table has an extra field, an integer batch ID supplied by the web application that triggers the process
It's a simple, if not efficient, matter to do it within the web application: query the source table, suck the records into memory, and insert them one by one into the target db. This will be an infrequent process which can be done at off-hours, so a bit of inefficiency is not the end of the world. But I wondered if there is a more sensible, orthodox approach:
-- Could this process be done, and done efficiently, as a stored procedure with the batch ID passed as a parameter?
-- Is there any way to do a bulk insert from a recordset or array in memory using plain ASP, ADO and SQL? And if so, is that better than inserting records one by one?
I realize that the ASP.NET tableadapter and dataset features might provide a good solution, but in the short run I can't rewrite the whole application. Advice on the best general approach from an ASP-ADO platform would be appreciated, and I will try to figure out the details.
Hi All,
Code:
begin try
    begin transaction trans1
    -- some transaction
    begin try
        begin transaction trans2
        -- some transaction
        -- Raise Error
        -- commit transaction trans2
    end try
    begin catch
        -- Raise error
    end catch
    -- some transaction
    -- commit transaction trans1
end try
begin catch
    -- Rollback trans1
end catch
In the above code, when an error is raised in transaction trans2 the immediate catch is not invoked, whereas the outer catch is being invoked. I don't know why. Can anybody help? Thanks.
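For what it's worth, whether a CATCH block fires depends on how the error is raised: RAISERROR with severity 10 or lower is informational and does not transfer control, while severity 11-19 inside a TRY block jumps to the nearest enclosing CATCH. A minimal runnable illustration (transactions left out to isolate the behavior):

BEGIN TRY
    PRINT 'outer try';
    BEGIN TRY
        RAISERROR('just a message', 10, 1);   -- severity 10: not caught, execution continues
        RAISERROR('real error',     16, 1);   -- severity 16: control jumps to the inner CATCH
        PRINT 'never reached';
    END TRY
    BEGIN CATCH
        PRINT 'inner catch: ' + ERROR_MESSAGE();   -- prints 'real error'
    END CATCH
    PRINT 'outer try continues';
END TRY
BEGIN CATCH
    PRINT 'outer catch: ' + ERROR_MESSAGE();       -- reached only if the inner CATCH itself fails
END CATCH

So if the outer CATCH is the one firing in your code, it is worth checking the severity used in the inner RAISERROR and whether the error is actually raised inside the inner TRY block.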
Hi, we use a database with about 40 related tables. Some tables contain as much as 30,000 records. We use Access 97 as an interface to the database. Recently we have the problem that when we want to insert a row into one specific table (always the same one) the database blocks.
Details:
- about 10% of the data was inserted by copying from Excel; before this action there was no problem, though there is no evidence that this causes the problem.
- inserting rows via the Query Analyzer works fine, but via Access causes trouble.
- the tempdb log file has grown to 48 MB.
Does anyone have ideas about what is going on and what I can do to solve the problem? TAV, Jan Willems
We have data that consists of an employee number, a start time and a finish time, similar to the example below:
EMP    STARTTIME             ENDTIME
00001  10-Feb-2012 06:00:00  10-Feb-2012 10:00:00
00002  10-Feb-2012 07:15:00  10-Feb-2012 10:00:00
00003  10-Feb-2012 08:00:00  10-Feb-2012 10:00:00
I am trying to come up with a procedure in SQL that will give me each 15 minute block throughout the day and a count of how many employees are expected to be at work at the start of that 15 minute block. So, given the example above I would like to return
10-Feb-2012 00:00:00    0
10-Feb-2012 00:15:00    0
10-Feb-2012 06:00:00    1
10-Feb-2012 06:15:00    1
[code]....
I'm not too worried if the date part is not included in the result as this could be determined elsewhere, but how can I do this grouping/counting?
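A sketch of one approach on SQL 2005+: generate the 96 quarter-hour starts for the day, then count the employees whose shift covers each block start. The table name dbo.Shifts stands in for wherever EMP/STARTTIME/ENDTIME actually live:

DECLARE @day datetime;
SET @day = '20120210';

WITH QuarterHours AS (
    -- 96 fifteen-minute block starts covering the whole day
    SELECT TOP (96)
           DATEADD(minute, 15 * (ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1), @day) AS BlockStart
    FROM master.dbo.spt_values
)
SELECT q.BlockStart,
       COUNT(s.EMP) AS EmployeesAtWork
FROM QuarterHours AS q
LEFT JOIN dbo.Shifts AS s
       ON s.STARTTIME <= q.BlockStart
      AND s.ENDTIME   >  q.BlockStart
GROUP BY q.BlockStart
ORDER BY q.BlockStart;

master.dbo.spt_values is only used as a convenient row source; any numbers table of at least 96 rows works just as well.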
My server is a dual AMD x64 2.19 GHz with 8 GB RAM running under Windows Server 2003 Enterprise Edition with service pack 1 installed. We have SQL 2000 32-bit Enterprise installed in the default instance. AWE is enabled using Dynamically configured SQL Server memory with 6215 MB minimum memory and 6656 maximum memory settings.
I have now installed, side by side, SQL Server 2005 Enterprise Edition in a separate named instance. Everything is running fine, but I believe SQL Server 2005 could run faster and I need to ensure I am giving it plenty of resources. I realize AWE is not needed with SQL Server 2005, and I have seen suggestions to grant the SQL Server account the 'lock pages in memory' right. This box only runs the SQL 2000 and SQL 2005 server databases, and I would like to ensure, if possible, that each is splitting the available memory equally, at least until we can retire SQL Server 2000 next year. Any suggestions?
Hi,
We have an old machine which holds a SQL Server 2000 database. We need to migrate the whole database to a new machine which has SQL Server 2005.
When we tried to move the whole database using the Import and Export Wizard, only tables could be selected to import/export. However, we want to import/export the whole database, including tables, stored procedures, views, etc. Which tool should we use?
Thanks.
Hi,
When I am trying to access a SQL Server 2000 database from another machine I get this error:
Server: Msg 17, Level 16, State 1 [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied
but I can access the database on the same server, and from that server I can access other databases on a different server.