According to BOL you can configure an Alert to notify you when the blocked process threshold has been exceeded:
SQL Server 2005 Books Online
blocked process threshold Option
Use the blocked process threshold option to specify the threshold, in seconds, at which blocked process reports are generated. The threshold can be set from 0 to 86,400. By default, no blocked process reports are produced. This event is not generated for system tasks or for tasks that are waiting on resources that do not generate detectable deadlocks. For more information about deadlock detection, see Detecting and Ending Deadlocks.
You can define an alert to be executed when this event is generated. So for example, you can choose to page the administrator to take appropriate action to handle the blocking situation.
Can someone provide some direction on exactly how this is done? Does it require Service Broker and a queue?
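The usual recipe on 2005: set the option with sp_configure, then attach a SQL Server Agent WMI alert to the BLOCKED_PROCESS_REPORT event. Agent handles the underlying event notification plumbing, so you do not have to build the Service Broker queue yourself. A sketch, assuming a default instance and an existing operator named 'DBA':

Code:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'blocked process threshold', 20;   -- seconds
RECONFIGURE;

EXEC msdb.dbo.sp_add_alert
    @name = N'Blocked process report',
    @wmi_namespace = N'\\.\root\Microsoft\SqlServer\ServerEvents\MSSQLSERVER',
    @wmi_query = N'SELECT * FROM BLOCKED_PROCESS_REPORT';

EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Blocked process report',
    @operator_name = N'DBA',
    @notification_method = 1;   -- 1 = email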
We are using SQL Server 6.5 and currently have about 100 user connections at any given point in time. The application is Visual Basic 5.0 based and allows users to create MS Word documents from within it. The document names are stored in a table which basically acts as a reference table. Every time a document needs to be attached to a record, this table is hit with an insert/update query. This results in an exclusive page/table lock and ends up as a blocked process, which finally results in a major halt for all the system users.
Manually killing these blocked processes frees up the resources and brings things back to normal, although it is disruptive to the users.
Any clue as to why the blocked processes are not able to free themselves up? Are we missing something in our SQL configuration that would help us unblock these processes?
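For what it's worth, the blocked column in master..sysprocesses goes back at least as far as 6.5 and shows who is waiting on whom; a hedged sketch for finding the head of a blocking chain (column availability may differ slightly by version):

Code:
-- sessions that are blocked, and the spid blocking them
select spid, blocked, status, cmd, hostname, program_name
from master..sysprocesses
where blocked <> 0

-- head blockers: blocking others but not blocked themselves
select spid, status, cmd, hostname, program_name
from master..sysprocesses
where blocked = 0
  and spid in (select blocked from master..sysprocesses where blocked <> 0)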
If I kill a blocked process, why does the current activity window still show the process? Both processes, blocking and blocked, are scheduled tasks. Also, the blocked process is still listed as a running task in the manage scheduled task window.
We have tables that load overnight. Sometimes a user will try to run a query against a table while it is loading and a deadlock occurs. I do not notice this until I get into the office, and by this time many tables have not loaded. Is there a way to have SQL 6.5 automatically kill deadlocked processes?
I have a question about blocked processes. My manager wants to know how to determine the following when a blocked process is encountered:
The job, program and statement that are causing the blocking. I have figured out how to determine the database involved, but not much else. For instance, the blocked process report has a waitresource field; how do you decipher it? The report also tells you the client application, but in this case it just states Micro Focus Net Express as the client, not the job or program that is running.
I have an inputbuf listed, but it just shows the database id and the object id, and I am not sure how to look up the object id in this case. I am guessing sys.objects, but I do not seem to find it there.
I have read many things on MSDN about the blocked process report, but they do not seem to go into great detail.
I seem to have all the pieces, but do not exactly know how to quickly tie them all together.
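A hedged sketch for tying them together on 2005 SP2 and later (the numeric values below are hypothetical examples, not ones from your report; OBJECT_NAME with a database-id argument needs SP2):

Code:
-- waitresource like OBJECT: 6:1977058079:0  (db_id : object_id : index_id)
SELECT DB_NAME(6) AS db_name, OBJECT_NAME(1977058079, 6) AS object_name;

-- waitresource like KEY: 6:72057594038845440 (...)  (db_id : hobt_id)
-- run this in the database in question:
SELECT OBJECT_NAME(p.object_id) AS object_name
FROM sys.partitions p
WHERE p.hobt_id = 72057594038845440;

-- the statement behind a spid named in the report
DBCC INPUTBUFFER(53);

The client application name comes from the connection string's Application Name attribute, which is why everything shows as Micro Focus Net Express; unless the jobs set their own application names, you would map them back via hostname and login plus the input buffer.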
I am looking to create an upload process that will check for an Excel file in a specific directory and, if it exists, upload the file and populate the database. I have never done this before (only data extracts). How would I build the upload process? Is this something that can be done with SQL Server 2012?
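If the workbook layout is fixed, one sketch on the pure T-SQL side (assuming the ACE OLE DB provider is installed, 'ad hoc distributed queries' is enabled, and hypothetical file, sheet and table names; xp_fileexist is undocumented but widely used). For a recurring process, an SSIS package with a Foreach Loop container and an Excel Source is the more robust option.

Code:
DECLARE @exists int;
EXEC master.dbo.xp_fileexist N'C:\Uploads\Inbound.xlsx', @exists OUTPUT;

IF @exists = 1
    INSERT INTO dbo.TargetTable (Col1, Col2)
    SELECT Col1, Col2
    FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                    'Excel 12.0;Database=C:\Uploads\Inbound.xlsx;HDR=YES',
                    'SELECT * FROM [Sheet1$]');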
Why is the PasswordRecovery control creating another sqlservr process after I successfully enter the user name? For a reason I have not found yet, it keeps the database in read-only mode, and when I answer my security question correctly it gives me the error: Failed to update database "C:\INETPUB\WWWROOT\VER1.0.0.1\APP_DATA\ASPNETDB.MDF" because the database is read-only. I have read other posts regarding the read-only message and I am not sure if they are doing the same thing, but why does this happen, and what is the solution other than not using the control?
Is there any threshold manager in MS SQL Server? Like for the transaction log, where you can add a stored proc that will dump the tran log every time a threshold is hit.
I have a table with 188,376 rows, data size = 3012 KB and index size = 5884 KB. LE threshold maximum is set to 2000 and LE threshold percent to 20%. I have an index on that table and observed that it is not getting used. I would like to know whether the SQL optimizer uses the index based on the cost of the query plan, or does a table scan once the LE threshold limit is reached, overriding the optimized plan.
Guys, I am really stuck on this one. Any help or suggestions would be appreciated. We have a large table which seemed to just hit some kind of threshold. The query is somewhat responsive when there are NO indexes on the table. However, when we index email the query takes forever.

FACTS
- The problem is very "data specific". I cannot recreate the problem using different data.
- There is only a problem when I index email on the base table.
- The problem goes away when I add "AND b.email IS NOT NULL" to the inner join condition. It does not help when I add the logic to the WHERE clause.

DDL
CREATE TABLE base (bk char(25), email varchar(100))
create clustered index icx on base(bk)
create index ix_email on base(email)
CREATE TABLE filter (bk char(25), email varchar(100))
create clustered index icx on filter (bk)
create index ix_email on filter (email)

Query
SELECT b.bk, b.email
FROM base b WITH(NOLOCK)
INNER JOIN filter f ON f.email = b.email
--and f.email is not null

Data Profile
--35120500, 35120491, 14221553
SELECT COUNT(*), COUNT(DISTINCT bk), COUNT(DISTINCT email)
FROM base
--16796199, 16796192, 14221553
SELECT COUNT(*), COUNT(DISTINCT bk), COUNT(DISTINCT email)
FROM base
WHERE email IS NOT NULL
--250552, 250552, 250205
SELECT COUNT(*), COUNT(DISTINCT bk), COUNT(DISTINCT email)
FROM filter
--250208, 250208, 250205
SELECT COUNT(*), COUNT(DISTINCT bk), COUNT(DISTINCT email)
FROM filter
WHERE email IS NOT NULL
Hi all, I have a large data set of points situated in 3D space. I have a simple primary key and an x, y and z value. What I would like is an efficient method for finding the group of points within a threshold. So far I have tested the following, however it is very slow.

select *
from locations a full outer join locations b
on a.ID < b.ID and a.X-b.X<2 and a.Y-b.Y<2 and a.Z-b.Z<2
where a.ID is not null and b.ID is not null

If anyone knows of a more efficient method to arrive at this result it would be most appreciated. Thanks in advance, Bevan
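A likely first fix before any tuning, assuming the intent is points within 2 units on every axis: the differences need ABS(), otherwise any pair where a's coordinate is far below b's passes the test. A sketch:

Code:
select a.ID, b.ID
from locations a
join locations b
  on  a.ID < b.ID
  and abs(a.X - b.X) < 2
  and abs(a.Y - b.Y) < 2
  and abs(a.Z - b.Z) < 2

An index on one coordinate (say X) then lets the join seek a range such as b.X between a.X - 2 and a.X + 2 before the remaining predicates filter.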
I am doing workload analysis on SSAS Tabular (2012). I have perfmon logs captured and want to run them through PAL. I am looking for the threshold file for SSAS Tabular 2012/2014.
The stored procedure below is used to create a vertical benchmark line on the X axis, which has an hour scale. I use the stored procedure to find which temperature crosses or equals the threshold temperature (340), then plot the vertical benchmark line at the hour the first temperature is equal to or greater than 340 degrees and less than 1000 degrees.
The logic below works if a temperature is equal to or greater than 340 degrees and less than 1000 degrees. THE ISSUE is that I have 8 temperatures; if none of them crosses the 340-degree threshold, I need to set a default value for my vertical line. In other words, if the highest temperature is 180 and my threshold is 340, then set my vertical line on the highest temperature closest to 340.
I tried removing my Where clause (but then it breaks the logic for those temperatures that are equal to or greater than 340). I tried using Case When but this didn't give me what I want either. I tried UNION as well. All giving me results I don't want.
Here is what I am looking for:
This first example is one where there was a temperature that was equal to or greater than the threshold of 340 degrees. This is CORRECT
If 8 temperatures did not equal or cross the threshold then give me the hour of the highest temperature close to the threshold but do not return 0.
For Example:
temp1 92
temp2 108
temp3 0
temp4 284 <<< this is the closest to the threshold, so give me the hour when this occurred
temp5 2192 * Remember I can only count temperatures less than 1000 degrees. Anything above 1000 degrees means there is nothing in the oven, so it is a false positive.
temp6 102
temp7 0
temp8 12
Code:
CREATE PROCEDURE [dbo].[AgeScoreCardThreshold_JJ_12232013]
    -- Add the parameters for the stored procedure here
    @LicenseNumber int = NULL,
    @Lot varchar(50) = NULL
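For the fallback, one possible shape, assuming a hypothetical unpivoted readings table (one row per temperature with the hour it was taken): readings of 1000 degrees or more are excluded as false positives, readings at or above 340 win first by earliest hour, and otherwise the highest remaining temperature wins.

Code:
SELECT TOP (1) ReadingHour, Temp
FROM dbo.OvenTemps                                   -- hypothetical table
WHERE Temp < 1000                                    -- ignore empty-oven readings
ORDER BY
    CASE WHEN Temp >= 340 THEN 0 ELSE 1 END,         -- prefer rows that crossed the threshold
    CASE WHEN Temp >= 340 THEN ReadingHour END,      -- earliest crossing hour
    Temp DESC;                                       -- else the temperature closest to 340 from below

If the 8 temperatures are columns on one row, UNPIVOT (or CROSS APPLY with a VALUES list) gets them into this shape first.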
I've got an SSIS solution file with the project deployment model in VS 2013 and would like to deploy it to SSISDB in different environments. All these days I followed the regular way: create a project in SSISDB and deploy to it. Now I want to find out if I can automate this process, so I have some questions:
1. Can we automate the process of creating a project on SSISDB based on our SSIS project name? It would work like this: when we do a deployment it checks whether the project exists on SSISDB, based on our SSIS project name; if the project exists we just deploy the packages in the project, and if it does not exist in SSISDB we create the project and deploy the packages.
2. Can we also automate the process of creating environments? Traditionally we manually create the environment variables under the environment tab of SSISDB, but can we make that part of the deployment too? Like when we are releasing to the Dev server we check whether that particular Dev variable exists on that server; if it exists we just update the existing value, and if it does not exist we create it. (A sketch covering both follows below.)
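Both are scriptable against the SSISDB catalog procedures, so they can run from a deployment job; a sketch (folder, project, environment and variable names are hypothetical, and the .ispac path would come from your build output):

Code:
USE SSISDB;
GO
-- 1. create the folder if missing; deploy_project itself creates or upgrades the project
IF NOT EXISTS (SELECT 1 FROM catalog.folders WHERE name = N'MyFolder')
    EXEC catalog.create_folder @folder_name = N'MyFolder';

DECLARE @ispac varbinary(max) =
    (SELECT BulkColumn FROM OPENROWSET(BULK N'C:\Build\MyProject.ispac', SINGLE_BLOB) AS f);

EXEC catalog.deploy_project
     @folder_name    = N'MyFolder',
     @project_name   = N'MyProject',
     @project_stream = @ispac;

-- 2. create the environment and its variable only when absent
IF NOT EXISTS (SELECT 1
               FROM catalog.environments e
               JOIN catalog.folders f ON f.folder_id = e.folder_id
               WHERE f.name = N'MyFolder' AND e.name = N'Dev')
    EXEC catalog.create_environment
         @folder_name = N'MyFolder', @environment_name = N'Dev';

IF NOT EXISTS (SELECT 1
               FROM catalog.environment_variables v
               JOIN catalog.environments e ON e.environment_id = v.environment_id
               WHERE e.name = N'Dev' AND v.name = N'ConnString')
    EXEC catalog.create_environment_variable
         @folder_name = N'MyFolder', @environment_name = N'Dev',
         @variable_name = N'ConnString', @data_type = N'String',
         @sensitive = 0, @value = N'Data Source=DevServer;...',
         @description = N'';

Updating an existing variable's value would go through catalog.set_environment_variable_value in the ELSE branch.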
Can you use the query below to set up a high CPU utilisation alert for both named and default instances? Or do I need to make any changes here (@wmi_namespace = N'\\.\ROOT\CIMV2')?
USE [msdb]
GO
EXEC msdb.dbo.sp_add_alert
    @name = N'CPU_WM_Utilization_Check',
    @message_id = 0,
    @severity = 0,
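ROOT\CIMV2 is machine-scoped (processor load is a server-level counter), so the same alert should serve default and named instances alike; it is only the SQL Server events namespace (\\.\root\Microsoft\SqlServer\ServerEvents\<InstanceName>) that is instance-specific. A hedged completion of the alert; the WQL event query and the 90% threshold are assumptions:

Code:
EXEC msdb.dbo.sp_add_alert
    @name = N'CPU_WM_Utilization_Check',
    @message_id = 0,
    @severity = 0,
    @enabled = 1,
    @wmi_namespace = N'\\.\ROOT\CIMV2',
    @wmi_query = N'SELECT * FROM __InstanceModificationEvent WITHIN 60
                   WHERE TargetInstance ISA ''Win32_Processor''
                   AND TargetInstance.LoadPercentage > 90';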
Referencing an article regarding MAXDOP and cost threshold for parallelism from Brent Ozar's website: [URL] .....
We have 2 physical CPUs with 4 cores each and hyper-threading enabled. Looking through Task Manager, under the Performance tab, I see 16 CPU threads. We have set the MAXDOP value to 4.
Reading further, the cost threshold for parallelism setting is recommended at 50 as a starting point.
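Both are instance-wide sp_configure settings; a sketch applying the values mentioned above:

Code:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 4;
EXEC sp_configure 'cost threshold for parallelism', 50;
RECONFIGURE;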
I have a table that is made up of the sum of medical, mental health and pharmacy claims. I would like to query it to find instances when the sum of the three claim types is greater than a predetermined threshold.
For example:
Patient 1 Medical = 10,000 (could be 10 records at 1,000 each)
Patient 1 Mental Health = 5,000
Patient 1 Pharmacy = 15,000
Patient 2 Medical = 1,000
Patient 2 Mental Health = 0
Patient 2 Pharmacy = 500
Threshold is 25,000
If I queried the above sample table I would get one record: Patient 1 30,000 - because 10,000+5,000+15,000 = 30,000 and is greater than the threshold.
I am not sure that a HAVING clause would work, though.
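A HAVING clause is actually the right fit, since the filter applies to the aggregate rather than to the detail rows; a sketch with hypothetical table and column names:

Code:
SELECT PatientID, SUM(ClaimAmount) AS TotalClaims
FROM dbo.Claims                    -- hypothetical: one row per claim, all three types together
GROUP BY PatientID
HAVING SUM(ClaimAmount) > 25000;   -- the predetermined threshold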
I am trying to script a case when to achieve the following.
I have a table of measures, each with a certain threshold. The threshold direction can be either > or <, so I want to create a field that shows whether the measure hits its threshold, to be picked up later in SSRS. So a nested CASE WHEN?
CASE WHEN M.[Threshold Direction] = '>' THEN
    CASE WHEN A.[Value] > M.[Threshold] THEN 'GREEN'
    CASE WHEN A.[Value] < M.[Threshold] THEN 'RED'
    ELSE '' END END END AS 'Condition'

Is this at all possible?
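It is possible; the snag in the attempt above is that an inner CASE needs its own WHEN/ELSE/END rather than a second bare CASE. One working shape:

Code:
CASE M.[Threshold Direction]
    WHEN '>' THEN CASE WHEN A.[Value] > M.[Threshold] THEN 'GREEN' ELSE 'RED' END
    WHEN '<' THEN CASE WHEN A.[Value] < M.[Threshold] THEN 'GREEN' ELSE 'RED' END
    ELSE ''
END AS [Condition]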
I was trying to extract data from the source server using an OLE DB Source and a SQL Server Destination when I encountered this error:
"Transaction (Process ID 135) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
What must be done so that, even if the table being queried is locked, I wouldn't experience any deadlocks?
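If dirty reads are acceptable for this extract, one common sketch is to read at READ UNCOMMITTED in the OLE DB Source query, so the select takes no shared locks and has little left to deadlock on (table and column names hypothetical):

Code:
SELECT Col1, Col2
FROM dbo.SourceTable WITH (NOLOCK);   -- per-table equivalent of READ UNCOMMITTED

If dirty reads are not acceptable, enabling READ_COMMITTED_SNAPSHOT on the source database is the cleaner route.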
Hello all, I am running into an interesting scenario on my desktop. I'm running developer edition on Windows XP Professional (9.00.3042.00 SP2 Developer Edition). OS is autopatched via corporate policy and I saw some patches go in last week. This machine is also a hand-me-down so I don't have a clean install of the databases on the machine but I am local admin.
So, starting last week after a forced remote reboot (also a policy), I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. Friday, however, I know I shut my machine down nicely, and this morning when I booted up I was in the same state I was last Wednesday. 7 of the 18 databases on my machine came up with:
FCB::Open: Operating system error 32 (The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation.

It also logs:

FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32 (The process cannot access the file because it is being used by another process.).
I've caught references to the auto close feature being a possible culprit, no dice as the databases in question are set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone. As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing's set to read only or archive as I've caught on other forums as possible gremlins. I have sufficient disk space and the databases are set for unrestricted growth.
Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING it would make more sense to me than the hit-or-miss behavior I'm experiencing now.
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.

What I'm trying to do is convert this cmd, which works, into an Execute Process Task:

D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log

The above DOS cmd works 100%. However, when I use the Execute Process Task I get this error:

[Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".

There are two package variables: User::gsPreplogInput = ex.log and User::gsPreplogOutput = out.log.
How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I tried to place the entire command in a batch file and specified the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried to specify the exe directly in the Executable property and the arguments as the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file, when run from the command line, works perfectly and unzips the file, so there is absolutely no problem with the command. I believe it is just the setup of the variables in the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
I am designing a utility which will keep two similar databases in sync. In other words, copying the new data from db1 to db2 and updating the old data from db1 to db2.
For this I am making use of the tablediff utility, which, when provided with server, database and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task and the process parameters I am providing are:
The customer.bat file will have the following code:

tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"

The .sql file will be generated at C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: the Execute Process Task is working fine, i.e., the tables are being compared correctly and the .sql file is being generated as desired. But the task itself is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".
Some of you may suggest just setting ForceExecutionResult = Success (in fact this is what I am doing now, just to get the program working), but this is not what I desire.
I'm pulling data from an Oracle db and loading it into MS SQL 2008. For my data type checks during the load process, what are the options to ensure the data being processed won't fail? I want to verify against the target data types first and, if the format is valid, load the row into the destination table; otherwise mark it with an error flag and push it into an errors table, all at the row level. One way I can think of is to load into a staging table, then get the source and destination column data types, compare them, and proceed.

Or should I just try loading the data directly and, if it fails, troubleshoot (which could be difficult, as I wouldn't know what caused the error)?
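On 2008 (no TRY_CONVERT until 2012), the staging-table route is the usual answer: land every column as varchar, flag rows that fail validation, then convert only the clean rows. A sketch with hypothetical names; note that ISNUMERIC accepts some strings (e.g. '1e4', '$') that can still fail a specific CONVERT:

Code:
-- flag rows whose text will not convert to the target types
UPDATE s
SET ErrorFlag = 1
FROM dbo.Staging s
WHERE ISDATE(s.OrderDate) = 0
   OR ISNUMERIC(s.Amount) = 0;

-- route the bad rows, then load the clean ones
INSERT INTO dbo.LoadErrors (OrderDate, Amount)
SELECT OrderDate, Amount FROM dbo.Staging WHERE ErrorFlag = 1;

INSERT INTO dbo.Target (OrderDate, Amount)
SELECT CONVERT(datetime, OrderDate), CONVERT(decimal(18, 2), Amount)
FROM dbo.Staging
WHERE ErrorFlag = 0;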
I am having a table locking issue that I need to start paying attention to, as it is getting more frequent.
The problem is that the data in the tables is live finance data that needs to be changed and viewed in near real time, so what I have picked up so far is that using table hints may not be a good idea.
I have a guy at work telling me that introducing a data access layer is the only way to solve this. I am not convinced, but I haven't enough knowledge to back my own feeling up (ASP system, not .NET).
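One alternative to table hints that needs no application change is row versioning at the database level: readers get the last committed version instead of blocking behind writers, and the data stays transactionally consistent (unlike NOLOCK). A sketch, database name hypothetical:

Code:
ALTER DATABASE FinanceDb
SET READ_COMMITTED_SNAPSHOT ON
WITH ROLLBACK IMMEDIATE;   -- disconnects in-flight sessions; run in a quiet window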
I get the error message below. Books online doesn't say any more. Can anyone explain? I am the only user connected to the db at the time, no jobs are executing.
Cannot shrink log file 2 (log) because all logical log files are in use.
(1 row(s) affected)
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
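The message means the shrink ran into the active virtual log file. The usual pattern, sketched with hypothetical names, is to back up the log so its active VLFs can wrap around, then shrink again (possibly repeating once or twice):

Code:
DBCC LOGINFO;                                         -- Status = 2 rows are VLFs still in use

BACKUP LOG MyDb TO DISK = N'C:\Backup\MyDb_log.trn';  -- full/bulk-logged recovery
DBCC SHRINKFILE (2);                                  -- file id 2 is the log file from the error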