SQL Tools :: How To Trace Or Monitor Previous Actions In 2012
Sep 12, 2015
How do I trace or monitor previous actions in SQL Server 2012? For example, a DELETE command was run previously, and I want to know who ran that command on the server.
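One possibility, if the relevant log records have not yet been truncated, is to read the transaction log of that database with the undocumented fn_dblog function and map the transaction SID back to a login. This is only a sketch under those assumptions; DELETEs issued inside an explicit transaction show up as 'user_transaction' rather than 'DELETE':

-- Read the active portion of the current database's transaction log (undocumented function).
SELECT [Transaction ID],
       [Begin Time],
       [Transaction Name],
       SUSER_SNAME([Transaction SID]) AS LoginName   -- who started the transaction
FROM fn_dblog(NULL, NULL)
WHERE Operation = 'LOP_BEGIN_XACT'
  AND [Transaction Name] = 'DELETE';                 -- autocommit DELETE statements

Going forward, a SQL Server Audit or an Extended Events session is a more reliable way to capture who runs DML, since the log space is eventually reused.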
View 2 Replies
I am looking for the right tools to do application monitoring. I'm hoping to find one single tool that can do the entire job, but if it does not exist then a few monitoring apps would do as well. I need the ability to do the standard things like testing for ping, checking for Windows services running and restarting them after some threshold is met, and WMI.
Some of the trickier things I need it to do are:
* Parse log files looking for specific error codes, with the ability to set an alert only if it sees that error X times over some period of time.
* Run custom SQL queries, like row counts and max values returned from a query, and alert if that condition is met X times over some period of time.
* Run an external app that does its own custom testing and act on its results, which could mean parsing the result file from the app.
Keep in mind cost is not the issue right now, so this can be anything from freeware to some type of enterprise solution that our company can use. Does anyone know of any tools that you can point me towards? Thanks...
View 1 Replies View Related
I have an SSAS cube that I want to add some actions to. I've had problems adding a reporting action to the cube, so I decided just to add a URL action instead. Start simple and build on the concepts...
So I add a new action, give it a name, set the Target Type to Cells and the Target object to All Cells. I've put no condition on the action since I want it to appear all the time.
The action content type is set to URL and the action expression is set to [URL]. I've also set a caption of "Google" under the additional properties and said that the caption is MDX (I'm aware that it isn't, but I do intend to expand on this...).
I then build and deploy my cube, open Excel (2010) and create a pivot table off the back of the cube, but when I right-click the cells in the pivot table and go to "Additional Actions" it tells me that there are none specified.
[URL]
I have a cube where I would like to define some actions. I have done report actions, but they are limiting in terms of what I am trying to achieve. I want a popup window from the action on the relevant cells. At the moment I can use the URL action type to open the link in a new window, which is OK, but I want this as a popup.
View 0 Replies View Related
I have been using Database Mail for quite a few years now. I started using it with SQL Server 2005, where it worked fine. Then we went over to SQL 2012 and the problems started. At first I could not get any e-mails out, and I read a post by Microsoft that a patch was needed but not available yet. That was in 2012. Presumably the patch came out with Windows updates and my DB Mail started working. Now, however, I have the problem that some e-mails do get sent but the majority fail.
I use this so that whenever a user captures certain data to update a person's status on the database, an automatic e-mail is sent to that person to inform them of their status at this company.
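A sketch of one way to see why individual messages fail, assuming Database Mail logging is at its default level (the views below are standard msdb objects):

-- List failed Database Mail items together with the logged error description.
SELECT  f.mailitem_id,
        f.recipients,
        f.subject,
        f.send_request_date,
        e.description            -- the error text logged for the mail item
FROM msdb.dbo.sysmail_faileditems AS f
LEFT JOIN msdb.dbo.sysmail_event_log AS e
       ON e.mailitem_id = f.mailitem_id
ORDER BY f.send_request_date DESC;

The description column usually names the failing step (SMTP connection, relay rejection, and so on), which narrows down whether the problem is the profile, the account, or the mail server.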
When I try to open Activity Monitor from SSMS I receive the message "Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED)) (mscorlib)". - more details below.
I have a SQL Server 2012 Enterprise SP1 installed in an Active/Passive cluster configuration on Windows 2008 R2 Enterprise SP1. The problem happens using sa and a domain administrator.
------- more details -------
Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED)) (mscorlib)
at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo)
at System.Management.ManagementScope.InitializeGuts(Object o)
at System.Management.ManagementScope.Initialize()
[Code] .....
This is for SQL Server 2005 SP4 Build 5266. We have been having performance issues in production. There are tight deadlines to be met and it is important that they are solved promptly.
Yesterday we replicated the situation in the acceptance testing environment. The jobs take 8 hours to run and we started at 2:00 PM.
Just before the jobs ran I set up a SQL Server Profiler trace to catch processes with a duration longer than 12 seconds. I set it to save the results to a database table.
Last night I checked the table at 5:00 PM and there were entries in the table. However, I could be mistaken.
At 9:00 PM I checked the table and it was empty.
This morning I arrived at work and checked SQL Server Profiler. The trace was still running and, within SQL Server Profiler, there were hundreds of results. I stopped the trace. However, checking the table, it is empty.
I thought I would be able to save the trace results to a file. When I chose "Save As" from the file menu, all the options are greyed out (trace file, trace template, trace table, etc).
The results are there but there is no way of saving them and no way of exporting them. How could this have happened?
Is there a location where SQL Server Profiler saves the results in a temporary space? If so, I may be able to open them and retrieve them. How can I save the results? Why are all my options greyed out?
I am customizing SQL Server Management Tools 2012 for mass deployment. The client has asked to remove the Customer Feedback option from the Help menu. How do I disable that?
View 6 Replies View Related
Is there any T-SQL script to monitor replication (transactional, snapshot) with its current status? I have tried the script below, but it gives this error:
"Msg 8164, Level 16, State 1, Procedure sp_MSload_tmp_replication_status, Line 80
An INSERT EXEC statement cannot be nested."
DECLARE @srvname VARCHAR(100)
DECLARE @pub_db VARCHAR(100)
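The nesting error usually means the captured procedure itself performs an INSERT ... EXEC, so it cannot be wrapped in another one. A sketch of an alternative, assuming the script runs on the distributor (the publisher name below is hypothetical), is to call the monitoring procedure directly and read its result set as-is:

USE distribution;
-- Returns publication status, agent warnings and latency information per publication.
EXEC dbo.sp_replmonitorhelppublication
     @publisher = N'MyPublisherServer';   -- hypothetical publisher name

Its output can be consumed directly by the calling application or report, which avoids the INSERT ... EXEC wrapper that triggers the nesting error.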
I have created an Extended Events session and want to frequently monitor the events that I have filtered. The problem I am facing is that I am not able to clear the previous data as we do in SQL Trace; I can see all the data accumulated so far, but whenever I start a test I want only the current data to be displayed. The workaround is to delete the session and create a new one every time, which I do not want to do. I can see the CLEAR DATA FROM TABLE button but it is disabled all the time.
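A possible workaround, assuming the session uses an in-memory target such as ring_buffer (the session name below is hypothetical): stopping and restarting the session discards the data collected so far without dropping the session definition.

-- Stop the session; data held in memory targets (e.g. ring_buffer) is discarded.
ALTER EVENT SESSION [MyTestSession] ON SERVER STATE = STOP;
-- Start it again to begin collecting fresh data for the next test run.
ALTER EVENT SESSION [MyTestSession] ON SERVER STATE = START;

For an event_file target the old .xel files remain on disk; the restart simply begins writing a new file.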
View 2 Replies View Related
SQL Server 2012 Data Tools was working fine for me, but something must have changed; now every time I try to create a new SSIS project I get:
The server threw an exception. (Exception from HRESULT: 0x80010105 (RPC_E_SERVERFAULT)).
When I try to open an existing project I get:
exception has been thrown by the target of an invocation
external component has thrown an exception (SSISUpgrade)
The issue seems to arise only with SSIS projects. I have already uninstalled SQL Server 2012 and reinstalled it, and that didn't work. I tried to install Visual Studio 2012 Data Tools with BI and that also crashes when I try to create an SSIS project. The output of SELECT @@VERSION is:
Microsoft SQL Server 2012 - 11.0.2100.60 (X64)
Feb 10 2012 19:39:15
Copyright (c) Microsoft Corporation
Enterprise Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
SQL Data Tools page info:
Microsoft SQL Server Integration Services Designer
Version 11.0.2100.60
Microsoft Visual Studio 2010
Version 10.0.40219.1 SP1Rel
Microsoft .NET Framework
Version 4.5.51641 SP1Rel
I have this script running to list all FK relationships, but it only works for a single database. I have tried to make it work across databases.
DECLARE @Tname varchar(40) = 'CustAccount' -- <@>< Table Name to trace
-------------------------------------------------
SELECT DISTINCT No ,
ist1.table_catalog AS [Parent DB],
ist1.table_schema AS ParentSchema,
ParentTable ,
ReferencedColumnName ,
ConstraintName ,
link,
ForeignTable,
ForeignKeyColumn ,
[Code] ....
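In case it is useful, here is a minimal sketch of the same idea built on the catalog views; it can be run in each database of interest (or wrapped in a loop such as the undocumented sp_MSforeachdb), and the @Tname variable mirrors the one above:

DECLARE @Tname sysname = N'CustAccount';  -- table to trace, as above

SELECT  fk.name                                  AS ConstraintName,
        DB_NAME()                                AS ParentDB,
        OBJECT_SCHEMA_NAME(fk.parent_object_id)  AS ParentSchema,
        OBJECT_NAME(fk.parent_object_id)         AS ParentTable,
        pc.name                                  AS ForeignKeyColumn,
        OBJECT_NAME(fk.referenced_object_id)     AS ReferencedTable,
        rc.name                                  AS ReferencedColumnName
FROM sys.foreign_keys        AS fk
JOIN sys.foreign_key_columns AS fkc ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns             AS pc  ON pc.object_id = fkc.parent_object_id
                                   AND pc.column_id = fkc.parent_column_id
JOIN sys.columns             AS rc  ON rc.object_id = fkc.referenced_object_id
                                   AND rc.column_id = fkc.referenced_column_id
WHERE OBJECT_NAME(fk.referenced_object_id) = @Tname
   OR OBJECT_NAME(fk.parent_object_id)     = @Tname;

Note that SQL Server does not allow foreign keys that span databases, so "inter-DB" here can only mean running the same query in each database rather than finding cross-database constraints.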
I am in the middle of capturing a workload to try and tune a SQL instance and was wondering what kinds of sizes people capture in terms of traces. I am only 1 day into a capture and I believe a typical workload would be a week long capture and I am already at 10GB of files. I am only capturing rpc_completed and sql_batch_completed.
What sizes of workloads do other people capture, and where do you analyse them? Do you have a dedicated server for this kind of thing? At present I am looking to use my local PC. Also, what rollover file sizes do people tend to use? I am currently using 1GB.
Where would I find my particular database's error log file, event log file and trace file, etc.?
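A sketch of a few queries that report these locations (the SERVERPROPERTY option shown is available from SQL Server 2012 onward; the Windows event logs themselves are viewed through Event Viewer rather than T-SQL):

-- Path of the current SQL Server error log.
SELECT SERVERPROPERTY('ErrorLogFileName') AS ErrorLogPath;

-- Path of the default trace files (the "trace file" most often asked about).
SELECT path FROM sys.traces WHERE is_default = 1;

-- Read the current error log from T-SQL (0 = current log, 1 = SQL Server log).
EXEC master.dbo.xp_readerrorlog 0, 1;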
View 5 Replies View Related
What are the ideal CPU count and max degree of parallelism for a 3rd-party database server? The server has 12 CPUs, 32GB RAM and all database sizes add up to < 30GB, so they can all fit in memory (I tried to force this by doing a SELECT * from every table). On certain payroll days, the CPU gets maxed out at 100% for a few seconds.
MAXDOP was originally set to the default 0. We later changed it to 8 based on several 'best-practices' articles. However the vendor suggests to change it to 1 (no parallelism), while others suggest changing it to 4, so that one run-away query doesn't hog most of the CPUs.
I'd like to find out how many CPUs are actually being used by queries. There is a Degree of Parallelism event documented at [URL].... The BinaryData column values mean:
0x00000000, indicates a serial plan running in serial.
0x01000000, indicates a parallel plan running in serial.
>= 0x02000000, indicates a parallel plan running in parallel.
What does "parallel plan running in serial" mean?
I see a lot of 0x01000000 and a few 0x08000000 values in my trace. How can I determine whether one query is hogging CPUs, and whether reducing MAXDOP to 4 will work?
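A sketch of one way to see how many worker threads each running request is using right now (any request with more than one task is executing in parallel); this is illustrative only, not a complete answer to the MAXDOP question:

SELECT  r.session_id,
        r.status,
        r.cpu_time,
        COUNT(*) AS worker_tasks      -- > 1 means the request is running in parallel
FROM sys.dm_exec_requests AS r
JOIN sys.dm_os_tasks      AS t
      ON t.session_id = r.session_id
     AND t.request_id = r.request_id
WHERE r.session_id <> @@SPID
GROUP BY r.session_id, r.status, r.cpu_time
ORDER BY worker_tasks DESC;

Sampling this during the payroll spike shows whether a single session is consuming most of the schedulers, which is the situation a lower MAXDOP (or Resource Governor) is meant to limit.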
Somehow someone turned on an audit on the SQL Server and it is filling up our hard drive, eventually shutting down SQL Server. I have been trying to Google how to shut this audit off but have come up with no viable solution yet. How can I turn this trace off? Each file is named AuditTrace plus a date, and they are created every other minute. I went into SQL Profiler and can pull up the files, but it does not say how to shut the trace off.
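If the files are .trc files, they most likely come from a server-side trace. A sketch of how to find and stop it (the trace id 2 below is hypothetical; use whichever id the first query returns for the offending path):

-- List non-default traces and where they are writing.
SELECT id, path, max_size, is_rollover, start_time
FROM sys.traces
WHERE is_default = 0;

-- Stop the offending trace and then delete its definition.
EXEC sp_trace_setstatus @traceid = 2, @status = 0;  -- 0 = stop
EXEC sp_trace_setstatus @traceid = 2, @status = 2;  -- 2 = close and remove the definition

If it turns out to be C2 auditing or a SQL Server Audit rather than a trace, the equivalents are sp_configure 'c2 audit mode', 0 and ALTER SERVER AUDIT ... WITH (STATE = OFF), respectively.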
View 9 Replies View Related
I am getting deadlocks in my production environment. I took the deadlock information from a trace file and found the deadlock graph, but I am unable to work out the exact scenario. I am attaching the deadlock trace file.
View 5 Replies View Related
Is there a way to set up a trace to show only direct T-SQL statements run on my server? Note that I don't want to capture procedure calls or the statements called within the procs.
Many people are firing direct SQL statements at the server, and some are coming from Entity Framework as well. I just want to capture those.
I have an SSIS package set up that takes a file from a location on the network drive and transfers it over FTP to another location.
When I manually run the package, the file is transferred with no errors, but when the job is run automatically (via Job Activity Monitor) the transfer fails.
I have set the ProtectionLevel of the package to "EncryptSensitiveWithUserKey" and also converted the package to a Development Model. The settings for the FTP connection are saved within the package.
What am I missing? Below is the error message:
Executed as user: UHBInfoSQLAgent. Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 32-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 08:43:02 Error: 2014-10-13 08:43:03.72
Code: 0xC001405F Source: ResearchWebsite
[Code] .....
We have killed a job which is now in a KILLED/ROLLBACK state. Job Activity Monitor is not showing any running jobs, but I can see the SPID of that job. When I tried to kill it again it gave the message 'command completed successfully', but I am not able to get the percentage or time remaining for the rollback to complete.
Another DBA tried to create a snapshot and it got stuck; I believe this was because of the rollback, as both were using the same database.
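One thing worth trying (the SPID 64 below is hypothetical): KILL with the STATUSONLY option reports the rollback progress and an estimated completion time for a session that is already in KILLED/ROLLBACK state, without issuing the kill again.

KILL 64 WITH STATUSONLY;
-- Typically returns something like:
-- SPID 64: transaction rollback in progress. Estimated rollback completion: 80%. Estimated time remaining: 10 seconds.

Note that rolling back a long-running operation can take as long as, or longer than, the original work did.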
I have an interesting problem. A number of SPIDs are being blocked by a single SELECT statement. The statement is the same whether returned from sp_who2, sysprocesses, sp_whoisactive or DBCC INPUTBUFFER. It is not waiting on anything and its status is sleeping.
Clearly it is not 'just' a sleeping select statement as I can see over a thousand locks held by the spid on 2 user databases and tempdb. I'm working on the theory there is a begin transaction with a bunch of statements and no closing commit. But I want to be able to prove that. How can I show what statements were previously executed as part of this transaction?
Additional Info: SQL 2012 Enterprise Edition. This is a test server but is a reproduction of a live issue. At this point the application team cannot isolate the code causing the problem, only the set of processes the code resides in.
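You cannot recover the earlier statements of the transaction after the fact without something that was already capturing them, but you can at least confirm the open transaction, when it began, and the last batch the session sent. A sketch, with a hypothetical session id of 123:

SELECT  st.session_id,
        at.transaction_id,
        at.transaction_begin_time,      -- when the suspect transaction was opened
        t.text AS most_recent_batch     -- last batch received on that connection
FROM sys.dm_tran_session_transactions AS st
JOIN sys.dm_tran_active_transactions  AS at
      ON at.transaction_id = st.transaction_id
JOIN sys.dm_exec_connections          AS c
      ON c.session_id = st.session_id
CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE st.session_id = 123;   -- the blocking SPID

To prove which statements ran inside the transaction, set up an Extended Events or server-side trace filtered on that application's login before reproducing the issue, since completed statements are not retained by the engine.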
I am getting events in the default trace saying there are missing column statistics on a column...
1. The column is the primary key column (identity).
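A sketch of how to pull those events out of the default trace, so you can see exactly which column and database they refer to (standard system objects; the only assumption is that the default trace is enabled):

DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT  tg.StartTime,
        tg.DatabaseName,
        tg.TextData          -- names the column the optimizer wanted statistics on
FROM sys.fn_trace_gettable(@path, DEFAULT) AS tg
JOIN sys.trace_events AS te
      ON te.trace_event_id = tg.EventClass
WHERE te.name = 'Missing Column Statistics'
ORDER BY tg.StartTime DESC;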
I am attempting to create a new trace but I get the following error message: "failed to start a new trace".
I have been doing some digging and as I understand it, I had to find the directory Profiler uses for temporary files. So, I typed the following in the command window "SET TMP" and I received the following reply:
C:\Users\Ross\AppData\Local\Temp
Now, according to the forum: [URL] ...
I am supposed to check that the system folder pointed to by the TMP environment variable exists and is not crammed with files.
Well, when I went to the directory C:\Users\Ross\AppData\Local\Temp, it is indeed full of both files and directories. The size is 16.3 MB and has 133 files and 63 folders.
When I had a look at the Environment Variables window and chose TMP, the value is "%USERPROFILE%\AppData\Local\Temp", which according to my limited understanding is the equivalent of C:\Users\Ross\AppData\Local\Temp.
So, what I am wondering is am I supposed to totally clear out this directory? I am not too keen on doing this because I don't want to stuff my PC up.
Currently there are various teams accessing the database. For costing reasons, we need to track usage. Is there an efficient way to monitor user access to the database? Can we track which user has executed which query (SELECT, INSERT, etc.), the login time and similar parameters?
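A sketch of one built-in option, SQL Server Audit with a database audit specification. The audit names, file path and database name here are hypothetical, and database audit specifications require Enterprise Edition on older versions:

-- Server-level audit writing to rollover files.
CREATE SERVER AUDIT UsageAudit
TO FILE (FILEPATH = N'D:\Audit\', MAXSIZE = 256 MB, MAX_ROLLOVER_FILES = 20);
ALTER SERVER AUDIT UsageAudit WITH (STATE = ON);
GO
USE MyDatabase;   -- hypothetical: the database being costed
GO
-- Record every SELECT/INSERT/UPDATE/DELETE issued by any user in this database.
CREATE DATABASE AUDIT SPECIFICATION UsageAuditSpec
FOR SERVER AUDIT UsageAudit
ADD (SELECT, INSERT, UPDATE, DELETE ON DATABASE::MyDatabase BY public)
WITH (STATE = ON);
GO
-- Read the captured activity: who, when, which statement.
SELECT event_time, server_principal_name, database_name, statement
FROM sys.fn_get_audit_file(N'D:\Audit\*.sqlaudit', DEFAULT, DEFAULT);

For login times specifically, sys.dm_exec_sessions (login_time, login_name) covers currently connected sessions.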
View 5 Replies View Related
Is there anywhere in SSMS to see the start time of a currently running job (currently running its first step) without needing to run a query?
If not, why would Microsoft decide to not show it in the job activity monitor?
SSMS = Sql Server Management Studio
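If a query turns out to be acceptable after all, a common sketch against the standard SQL Server Agent tables in msdb is:

-- Currently executing Agent jobs and when they started.
SELECT  j.name               AS job_name,
        ja.start_execution_date
FROM msdb.dbo.sysjobactivity AS ja
JOIN msdb.dbo.sysjobs        AS j
      ON j.job_id = ja.job_id
WHERE ja.start_execution_date IS NOT NULL
  AND ja.stop_execution_date IS NULL
  AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions);  -- latest Agent session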
I want to monitor the replicated row count of an object (table): if the publication table's count is not equal to the subscriber table's count, it has to send a mail with the count difference.
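A minimal sketch of that check, assuming a linked server to the subscriber and an existing Database Mail profile (the server, database, table, profile and recipient names are all hypothetical):

DECLARE @pub_count INT, @sub_count INT, @body NVARCHAR(500);

SELECT @pub_count = COUNT(*) FROM PublisherDB.dbo.MyTable;          -- publisher copy
SELECT @sub_count = COUNT(*) FROM SUBSCRIBER.SubDB.dbo.MyTable;     -- via linked server

IF @pub_count <> @sub_count
BEGIN
    SET @body = N'Row count mismatch on MyTable. Publisher = '
              + CAST(@pub_count AS NVARCHAR(20))
              + N', Subscriber = '
              + CAST(@sub_count AS NVARCHAR(20))
              + N', Difference = '
              + CAST(ABS(@pub_count - @sub_count) AS NVARCHAR(20));

    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = N'DBMailProfile',
         @recipients   = N'dba@example.com',
         @subject      = N'Replication row count mismatch',
         @body         = @body;
END;

Scheduled as an Agent job, this gives the alert; for validation at scale, the built-in sp_publication_validation procedure or the tablediff utility are heavier-weight options.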
View 9 Replies View Related
I set up a trace with the events RPC:Completed, SQL:BatchCompleted, SQL:BatchStarting, and SQL:StmtCompleted.
When I issue the statement: SELECT * FROM XyzView there is nothing captured in Profiler. If I script out the view and then execute the select statement that defines the view, it does show up in Profiler.
I've tried adding a lot of the other events, i.e. SP:StmtCompleted and the various other StmtStarting events and the trace still does not capture anything.
Am I capturing the wrong events or is this known behavior? My goal is to see what the overhead is for using a view versus persisting the results of the view as a table and referencing that instead. The view in question is against static data, joins 9 tables, and is referenced a lot.
I can use the stats generated when I execute the select that defines the view but I still find this to be curious behavior so I assume I'm doing something wrong.
Is there a DBCC trace flag that will capture all error messages in the log?
e.g. when inserting data into a table, a PK violation occurs, throwing Msg 2627
...similar to trace flag 1222, which captures deadlock info...
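I am not aware of a general-purpose trace flag for this, but an Extended Events session on the error_reported event gives much the same effect. A sketch (the session and file names are hypothetical, and the predicate can be narrowed to error_number = 2627 if only PK violations matter):

CREATE EVENT SESSION CaptureErrors ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.sql_text, sqlserver.username, sqlserver.client_hostname)
    WHERE severity >= 14          -- user errors such as Msg 2627 (PK violation)
)
ADD TARGET package0.event_file (SET filename = N'CaptureErrors')
WITH (STARTUP_STATE = ON);

ALTER EVENT SESSION CaptureErrors ON SERVER STATE = START;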
After carefully analyzing the situation for almost a month, I pulled the trigger and made an exception at work: we enabled trace flag 2371. We have some tables with billions of rows, and outdated statistics were causing horrible plans. I tried several methods to update those, but they did not solve the problem or were too CPU intensive, causing other issues.
Anyway, one of the side effects I am seeing so far is average vCPU went down by almost 40%. Nothing out of usual (besides the flag) has been enabled or was executed. So my assumption is, CPU hungry plans are now gone or reduced.
In the past, I've combined server side traces with Perfmon successfully, which is pretty useful, I know that. I would like to do the same with Extended Events, so I can correlate and analyze the server side as well.
View 4 Replies View Related
At one of our client sites a weird log shipping issue has come up. While monitoring the two servers I noticed that although the log shipping report says both servers are in sync, and the report displays information related to both backup and restore, it does not show information related to copy, i.e. when the last file was copied; the last-copied-file column is blank. The same happens when I execute the proc "sp_help_log_shipping_monitor"; I get the same result...
When I expand the copy job history to analyse it, I find that although the job has executed successfully, reading each step in depth shows that no .trn file was copied.
My copy directory is on the secondary server itself, where the .trn files are placed, and the files are being restored from this same location.
SQL Server and Agent on both servers run under the same domain account...
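A sketch of where the report and sp_help_log_shipping_monitor read the copy information from, which can show whether the copy job is updating the monitor at all (standard msdb table on the secondary/monitor server):

-- What the monitor believes about the copy and restore state.
SELECT  secondary_server,
        secondary_database,
        last_copied_file,
        last_copied_date,
        last_restored_file,
        last_restored_date
FROM msdb.dbo.log_shipping_monitor_secondary;

If last_copied_file stays NULL while the restore columns advance, one possible explanation is that the backup share and the copy destination are the same folder, so the copy job finds nothing new to copy and never updates the monitor.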
CREATE TABLE #MyTable
(
Teams VARCHAR(10),
StartDate DATETIME,
Count INT
)
INSERT INTO #MyTable (Teams,StartDate,Count)
SELECT 'Team A', '01/01/2014',10
[Code] ....
'Team A' Has No records for '01/02/2014' and '01/03/2014' so I need to insert values of '01/01/2014' StartDate for Team A for the missing dates
So 'Team A' will now have 3 records
Team A    2014-01-01 00:00:00.000    10
Team A    2014-01-02 00:00:00.000    10
Team A    2014-01-03 00:00:00.000    10
'Team B' Has No records for '01/03/2014' so I need to insert values of '01/02/2014' StartDate for Team B for the missing date
So 'Team B' will now have 3 records
Team B    2014-01-01 00:00:00.000    30
Team B    2014-01-02 00:00:00.000    40
Team B    2014-01-03 00:00:00.000    40
As for 'Team C' we have values for all 3 dates, no inserts needed.
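A sketch of one way to generate the missing rows, carrying each team's most recent earlier Count forward. It only assumes the #MyTable definition above; the full INSERT of sample data is elided in the post, so the query works off whatever rows exist:

;WITH AllDates AS (SELECT DISTINCT StartDate FROM #MyTable),
      AllTeams AS (SELECT DISTINCT Teams FROM #MyTable)
INSERT INTO #MyTable (Teams, StartDate, [Count])
SELECT  t.Teams,
        d.StartDate,
        ca.[Count]                               -- value carried forward from the latest earlier row
FROM AllTeams AS t
CROSS JOIN AllDates AS d
CROSS APPLY (SELECT TOP (1) m.[Count]
             FROM #MyTable AS m
             WHERE m.Teams = t.Teams
               AND m.StartDate <= d.StartDate
             ORDER BY m.StartDate DESC) AS ca
WHERE NOT EXISTS (SELECT 1
                  FROM #MyTable AS x
                  WHERE x.Teams = t.Teams
                    AND x.StartDate = d.StartDate);

SELECT * FROM #MyTable ORDER BY Teams, StartDate should then show three rows per team, matching the expected output above.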
Given this table:
DECLARE @table TABLE (HolidayDate DATE, HolidayName NVARCHAR(50))
INSERT INTO @table
( HolidayDate, HolidayName )
VALUES ('2012-01-01','New Years Day'),
('2012-01-16', 'MLK Day'),
('2012-02-20', 'Presidents Day'),
[code]....
How do I get a result set that shows a new column called PreviousHolidayDate with the corresponding holiday's date from the previous year?
HolidayDate    HolidayName       PreviousHolidayDate
1/1/2012       New Years Day     NULL
1/16/2012      MLK Day           NULL
2/20/2012      Presidents Day    NULL
4/6/2012       Good Friday       NULL
5/28/2012      Memorial Day      NULL
[code]....
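A sketch of one way to do this with a self join on the holiday name, assuming the elided rows will eventually include the prior year's holidays; with only 2012 rows present, PreviousHolidayDate is NULL for every row, as in the expected output above:

SELECT  cur.HolidayDate,
        cur.HolidayName,
        prev.HolidayDate AS PreviousHolidayDate    -- same holiday, one year earlier
FROM @table AS cur
LEFT JOIN @table AS prev
       ON prev.HolidayName = cur.HolidayName
      AND YEAR(prev.HolidayDate) = YEAR(cur.HolidayDate) - 1
ORDER BY cur.HolidayDate;

Matching on the name rather than DATEADD(YEAR, -1, HolidayDate) matters because floating holidays such as Memorial Day fall on a different date each year.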