Custom Events While Retaining Original Event Arguments (SqlDataSourceStatusEventArgs)
Dec 26, 2007
Hi,
I am looking to implement a custom event handler that will also retain the original event arguments (in addition to several custom arguments).
Specifically, I am looking to pass custom arguments into a SqlDataSourceStatusEventHandler, but also want to be able to access the Command.Parameters.
I have implemented a new Event Arguments class (derived from System.EventArgs), new Event class and delegate, but do not know how to retain the SqlDataSourceEventArgs. I would really appreciate your suggestions!
I have an SqlDataSource control on my aspx page; it is connected to the database by a built-in procedure that returns a string dependent upon an ID passed in. I have the following code, which is not complete; I would appreciate any help to produce the correct code for the code file.
Code:
Function GetCategoryName(ByVal ID As Integer) As String
    sdsCategoriesByID.SelectParameters("ID").Direction = Data.ParameterDirection.Input
    sdsCategoriesByID.SelectParameters.Item("ID").DefaultValue = 3
    sdsCategoriesByID.Select() ' <<<< THIS LINE COMES UP WITH ERROR
End Function
The error is as follows: Argument not specified for parameter 'arguments' of 'Public Function Select(arguments As System.Web.UI.DataSourceSelectArguments) As System.Collections.IEnumerable'.
Help, I have not got much more hair to lose! Thanks, Steve
I have a table that holds log event records, which keep getting appended. I need to get the duration of each event 0, which has other events before and after it. How can I do this - to get the duration of each event 0 and the cumulative duration?
ID | Event | Date
1 | 1 | 2015-06-21 21:01:44.457
2 | 1 | 2015-06-21 21:01:44.457
3 | 0 | 2015-06-21 21:02:04.780
4 | 1 | 2015-06-21 21:02:32.600
5 | 0 | 2015-06-21 21:02:57.967
6 | 1 | 2015-06-21 21:03:30.513
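A minimal sketch of one way to do it, assuming SQL Server 2012+ (for LEAD and windowed running totals) and a placeholder table dbo.EventLog with the ID, Event, and Date columns above: take each Event = 0 row's duration as the gap until the next logged row, then run a cumulative sum over those gaps.
Code:
-- Duration of each Event = 0 row = time until the next logged event.
;WITH ordered AS (
    SELECT ID, [Event], [Date],
           LEAD([Date]) OVER (ORDER BY [Date], ID) AS NextDate
    FROM dbo.EventLog
)
SELECT ID, [Date],
       DATEDIFF(SECOND, [Date], NextDate) AS DurationSeconds,
       SUM(DATEDIFF(SECOND, [Date], NextDate))
           OVER (ORDER BY [Date], ID ROWS UNBOUNDED PRECEDING) AS CumulativeSeconds
FROM ordered
WHERE [Event] = 0;
On SQL Server 2005/2008 the same idea works with a self-join or OUTER APPLY in place of LEAD.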
Hi everyone, I was wondering how to get the 'Log Events' info to go to the Windows Event Viewer. Is it possible, or is it just private info for SSIS?
I am backing up all my databases (around 50 servers and 500 databases) using scheduled jobs. I back up my transaction log files at two-hour intervals and the databases on a daily basis.
I have set the "Write to Windows Application event log" option to "When the job fails" on the Notifications tab while creating each job.
As per the documentation, SQL Server automatically records some events in the Windows Application event log. After successful completion of a log backup or DB backup, an event gets logged in the Windows Application event log. And when a job fails, I get two events logged in the Windows Application event log: one due to my setting and the other automatically by SQL Server.
Due to this, the Windows event log file is growing much faster, and I have to clear it every 3-4 days.
My question: is there a way to get only failure events logged in the Windows Application event log, so that successful backup jobs do not go to the Windows Application event log at all?
Can anyone suggest what I should do?
I am using SQL Server 2000 with SP3 on Windows 2000 Advanced server. Some of my databases are still using Version 7.0/6.5 on NT.
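One hedged pointer: the per-job notification is stored as @notify_level_eventlog on the job (0 = never, 1 = on success, 2 = on failure, 3 = always), so setting it to 2 on every backup job keeps job-success entries out of the Application log. The job name below is a placeholder. Note this does not suppress the "Database backed up" messages SQL Server itself writes; in later versions trace flag 3226 suppresses those, but I'm not certain it applies to SQL Server 2000.
Code:
-- Sketch: have SQL Server Agent write an event log entry only on failure.
EXEC msdb.dbo.sp_update_job
    @job_name = N'LogBackup_MyDb',   -- placeholder job name
    @notify_level_eventlog = 2;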
I was wondering if anyone knew of a way to disable the following: the SQLISPackage start/finish events sent to the Windows event logs every time an SSIS package is executed and completed.
Hi all - so I know how to raise a custom event or information message from a script. Is this the only way it can be done? I'd like to throw an informative message when a package starts, grab that in the event handler, and send out an email message when the package starts and when it finishes. I would like to pass on some of the package's variables, so a custom event would be the choice. I know if I throw a script in there I can raise OnInformation, OnError, and OnWarning, but can it be done any other way?
I am trying to use XQuery to get event data back from Extended Events. I am trying to use some sample data from Grant Fritchey, but I am getting NULL records back. Below is the query - I just want to retrieve a distinct list of the client_hostname and client_app_name.
WITH xEvents AS (
    SELECT object_name AS xEventName,
           CAST(event_data AS xml) AS xEventData
    FROM sys.fn_xe_file_target_read_file('C:\LoginTrace\Shared_0*.xel', NULL, NULL, NULL)
)
SELECT DISTINCT TOP 1000
       xEventName,
       xEventData.value('(/event/data[@action_name=''Client_APP_Name'']/value)[1]', 'varchar') AS Client_APP_Name,
       xEventData.value('(/event/data[@action_name=''Client_Host_Name'']/value)[1]', 'varchar') AS Client_Host_Name
FROM xEvents
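A hedged guess at why the values come back NULL: in the .xel XML, actions are serialized as /event/action nodes with a @name attribute (and the names are lowercase, e.g. client_app_name), not as /event/data nodes with @action_name; also, .value() with a bare 'varchar' truncates to a single character. Assuming the same file path, something like this should return the values:
Code:
WITH xEvents AS (
    SELECT object_name AS xEventName,
           CAST(event_data AS xml) AS xEventData
    FROM sys.fn_xe_file_target_read_file('C:\LoginTrace\Shared_0*.xel', NULL, NULL, NULL)
)
SELECT DISTINCT TOP 1000
       xEventName,
       xEventData.value('(/event/action[@name=''client_app_name'']/value)[1]', 'varchar(4000)') AS Client_APP_Name,
       xEventData.value('(/event/action[@name=''client_hostname'']/value)[1]', 'varchar(4000)') AS Client_Host_Name
FROM xEvents;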
We know we can use the lock_deadlock and xml_deadlock_report events to capture deadlock info; however, I also want to capture the execution plans for all of the SPIDs in the deadlock graph. How do I output the execution plans to the Extended Events trace results - is there an action for the execution plan, or a workaround? If there is no built-in action for the execution plan, can we add customized info to the Extended Events results file? For example, when a deadlock-related event happens, we could run a query to get some info and add it, along with other info such as sql_text and dbname, to the event trace results file. The reason is that if we also know the execution plans when the deadlock happens, it is useful for tuning the queries based on those plans to reduce deadlocks.
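There is no built-in action that attaches the full plan to the deadlock event itself. As a hedged sketch of one workaround, assuming SQL Server 2012+ (the event, action, and target names are from the standard packages; the session name, file path, and database filter are placeholders), you can capture post-execution plans in the same session and correlate them with the deadlock report afterwards. Note that query_post_execution_showplan is expensive and needs a tight predicate:
Code:
-- Sketch: deadlock reports plus execution plans in one XE session.
CREATE EVENT SESSION DeadlockWithPlans ON SERVER
ADD EVENT sqlserver.xml_deadlock_report,
ADD EVENT sqlserver.query_post_execution_showplan
(
    ACTION (sqlserver.sql_text, sqlserver.database_name, sqlserver.session_id)
    WHERE sqlserver.database_name = N'MyAppDb'   -- placeholder filter
)
ADD TARGET package0.event_file (SET filename = N'C:\Temp\DeadlockWithPlans.xel');

ALTER EVENT SESSION DeadlockWithPlans ON SERVER STATE = START;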
One of my plans is to develop a custom task component to log errors into a destination provided as a parameter to the component. Now, how can I restrict the use of this component to the event handler tab only? Can something like this be done?
Also, extending this a little further, is the following possible?
The custom component should expose custom properties which allow the user to add the destinations for logging the errors. It will be a predefined list (Windows event log, text file, SQL table, etc.) and the user will be allowed to check whichever ones he/she wants. Then, if the user has selected SQL table, ask for the OLE DB connection and the table name; if text file is selected, ask for the path of the file.
Apologies if I am asking too many questions around the same thing (error handling). There may be a better way to achieve what I am trying to achieve, but I have no idea about it. Your guidance will be of great help.
Again, thanks a lot for helping this extra-curious guy who wants to try to develop generalized components if possible.
I'm trying to implement a custom log table. To keep the discussion simple, let's say I only have 1 column in this table and all I want to write in it are
"Start" when the package starts "Error" when it encounters an error "Finish" when the package finishes. Even if there was an error, I still want to enter "Finish'.
My Control Flow has 3 task objects, 2 Execute SQL Tasks, and 1 Data Flow Task in between them.
The first Execute SQL Task does an insert statement for the Start and the second Execute SQL Task does an insert for the Finish.
To capture any package errors, I also have an Execute SQL Task (to insert "Error") in the event handler for OnError. I see that when I cause an error in my package it can raise multiple OnError events, which will invoke my Execute SQL Task multiple times. (This is good because it will allow me to write a line per error event with the error description.)
The problem I have is: how do I write the "Finish" log when I have an error? If I put the insert for the Finish in the same Execute SQL Task as the errors, then it will write a "Finish" for every error. But I can't put it anywhere else, because the package never makes it there - it stops at the OnError event handler.
Or is there a way for me to tell the package to do the 2nd Execute SQL Task all the time?
Lastly, is there a better way to do this kind of custom logging?
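For reference, a minimal sketch of the log table and the three statements the Execute SQL Tasks could run (all names are placeholders, and the '?' is an Execute SQL Task parameter marker assumed to be mapped to the System::ErrorDescription variable). The usual trick for the "Finish" problem is to keep the final Execute SQL Task in the Control Flow but set its incoming precedence constraint to Completion instead of Success, so it runs whether or not the Data Flow failed:
Code:
-- Placeholder log table: one row per event.
CREATE TABLE dbo.PackageLog (
    LogID    int IDENTITY(1,1) PRIMARY KEY,
    Message  varchar(1000) NOT NULL,
    LoggedAt datetime NOT NULL DEFAULT GETDATE()
);

INSERT INTO dbo.PackageLog (Message) VALUES ('Start');        -- first task
INSERT INTO dbo.PackageLog (Message) VALUES ('Error: ' + ?);  -- OnError handler task
INSERT INTO dbo.PackageLog (Message) VALUES ('Finish');       -- final task, Completion constraint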
I'm confused about the e.AffectedRows documentation. The documentation says that e.AffectedRows is "the number of rows affected by a database operation. The default value is -1." I am getting -1 returned after a successful update. Am I supposed to get -1 after a successful operation, or 1 because 1 row was affected by the update?
Hello experts. I have been searching for anything about this but have found very little. What events are logged in the SQL Server error logs aside from successful/failed logins; backup, restore, and recover database; and SQL Server start/initialization? Can we configure it to log other events, like CREATE or DBCC events, for example? If so, how? Thanks a lot.
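Two hedged pointers: any message raised WITH LOG goes to the error log (and the Application event log), and sp_altermessage can mark an existing message as always-logged; the message ID below is a placeholder user-defined message. Auditing CREATE or DBCC activity as such is normally done with a trace rather than the error log.
Code:
-- Force a message into the SQL Server error log when raised:
RAISERROR ('Custom audit message', 10, 1) WITH LOG;

-- Mark an existing message (placeholder ID 50001) as always-logged:
EXEC sp_altermessage 50001, 'WITH_LOG', 'true';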
For my DB connection managers, the password is not being saved. I have 'remember password' checked for each connection manager, but when I close BIDS and open it again, I have to re-enter the password. Even when I import the dtsx file into SQL Server Integration Services, it's not in the connection manager information. I have to re-enter the password every time prior to running it.
What is causing this? I don't want to have to enter the password every time I run it, because the jobs are going to be scheduled with no user interaction unless one fails.
Hello, I need some help to solve an issue we have with SSIS, perhaps someone could help us.
We need to extract data from a remote database. We would like to use SSIS to extract data from that DB. But our actions are very restricted in that DB; we can query the DB views and create (local, global) temp tables only.
So, we created an SSIS package with two SQL execution tasks. In the first task, we create two local temp tables and insert some results into them. In our second task, we need to access the values stored in the temp tables, so we have the two tasks connected and we set RetainSameConnection to true.
But when we execute, the second task always fails, reporting "invalid object name" (referring to the temp table names). It looks like each task is using a different DB connection, making it impossible for us to use the temp tables we created in the first task. We tried both ADO.NET and OLE DB without much luck so far.
Any ideas how to solve this problem? Any hints and recommendations are much appreciated.
As a side note, we are trying to avoid global temp tables because our test team will need to test our code, and if the code is run by both SSIS and our test team at the same time, then we'll have problems. Unless we synchronize access to the tables, or pass some sort of caller identifier to prefix or suffix each temp table name. No option looks very elegant so far, but anyway.
So, do you know if just one database connection could be used for the whole flow (or not)? And if yes, how?
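Yes - with one connection manager shared by both tasks and its RetainSameConnection property set to True, the tasks reuse the same physical session, so a local temp table created in the first task is still visible in the second. A hedged sketch of the two statements (view and column names are placeholders); setting DelayValidation = True on the second task also helps, since design-time validation may run on a separate connection that cannot see the temp table:
Code:
-- Task 1: create and fill the local temp table.
CREATE TABLE #Stage (ID int, Val varchar(50));
INSERT INTO #Stage (ID, Val)
SELECT ID, Val FROM dbo.SomeView;   -- placeholder source view

-- Task 2 (same physical connection, so #Stage is still in scope):
SELECT ID, Val FROM #Stage;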
Is it possible to upgrade from 6.5 to 7.0 and have all the logins that were granted the ability to make a trusted connection in 6.5 be created with the same capability in 7.0?
When I did it the logins were created as standard logins in 7.0
Hey guys, I have a table full of data that has duplicate records except for two date columns (date1 and date2). What I would like to do is remove the duplicates while retaining the most recent record. How can I do this?
So record 1 looks like this:
Code:
John | Smith | 08/08/2000 | 10/10/2000
Record 2 looks like this:
Code:
John | Smith | 08/10/2005 | 10/10/2005
I'd like to remove the first instance and keep the second (most recent one).
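A minimal sketch, assuming SQL Server 2005+ and placeholder names (dbo.People with FirstName, LastName, date1, date2): number the rows in each duplicate group newest-first and delete everything but row 1.
Code:
-- Keep only the most recent row per FirstName/LastName pair.
;WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY FirstName, LastName
                              ORDER BY date1 DESC, date2 DESC) AS rn
    FROM dbo.People
)
DELETE FROM ranked WHERE rn > 1;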
My SQL Server 2005 SP4 on Windows 2008 R2 is flooded with the errors below:
Date: 10/25/2011 10:55:46 AM
Log: SQL Server (Current - 10/25/2011 10:55:00 AM)
Source: spid
Message: Event Tracing for Windows failed to send an event. Send failures with the same error code may not be reported in the future. Error ID: 0, Event class ID: 54, Cause: (null).
Is there a way I can trace where this is coming from? When I check the input buffer for these SPIDs, it looks like it is tracing everything. All the general application DMLs are coming in on these SPIDs.
I have been testing with the WMI Event Watcher Task, so that I can identify a change to a file. The WQL is thus:
SELECT * FROM __InstanceModificationEvent within 30 WHERE targetinstance isa 'CIM_DataFile' AND targetinstance.name = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\AdventureWorks.bak'
This polls every 30 secs, and on the SSIS event (ActionAtEvent in the WMI task is set to fire the SSIS event) I have a simple script task that runs a message box.
My understanding is that the event polls every 30 s, and if there is a change to the AdventureWorks.bak file then the event is triggered and the script task runs, producing the message. However, when I run the package the message occurs every 30 s, meaning the event is continually firing even though there has been NO change to the AdventureWorks.bak file.
Am I correct in my understanding of how this should work, and if so, why is the event firing when it should not?
I have a table employee_test containing sample data. The rows with EmployeeID = 6 are duplicates. I want to delete the duplicates, retaining one row for EmployeeID = 6. Note: I don't want to use a temporary table. I want to do this using a single query, or at most in an SP query batch. Please advise.
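A hedged single-statement sketch, assuming SQL Server 2005+ and that the EmployeeID = 6 rows are exact duplicates of each other: number the duplicates and delete all but one directly through the CTE, with no temp table involved.
Code:
-- Delete all but one of the duplicate EmployeeID = 6 rows.
;WITH d AS (
    SELECT ROW_NUMBER() OVER (PARTITION BY EmployeeID
                              ORDER BY EmployeeID) AS rn
    FROM dbo.employee_test
    WHERE EmployeeID = 6
)
DELETE FROM d WHERE rn > 1;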
I'm designing a database to store information about jobs that are in progress at a property. More than one job can be in progress at a property at one time and each different kind of job can contain different data, although they all share some common fields such as StartDate.
So I have a table that stores the property details PropertyDetails:
*ID
PropertyAddress
PropertyPostCode
Then I have a table that stores all of the jobs' shared details:
*PropertyID
*JobID
*JobType - these three make up a compound primary key
StartDate
EndDate
Then I have individual tables for each of the Jobs, for example BuildingWork:
*JobID
BuildingContractor
InsuranceCompany
Which works great, and enables me to query all basic job details from one table (JobDetails) rather than multiple tables for every job type.
BUT: I don't know how to enforce the referential integrity of the database. Obviously I can use a constraint to cascade deletes from the PropertyDetails table to the JobDetails table through the PropertyID, but there doesn't appear to be a way to then cascade the deletes from the JobDetails table to the individual job tables, as JobDetails has no idea what tables are there.
If I store the relevant individual table name as the JobType in the JobDetails table, could I use a trigger to somehow delete the related record from that table?
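Yes, a trigger can do that. A hedged sketch against the layout above (only BuildingWork is shown; each job-type table would get a matching DELETE, and an AFTER DELETE trigger also fires for rows removed by the cascade from PropertyDetails):
Code:
-- Remove child rows from each job-type table when JobDetails rows go away.
CREATE TRIGGER trg_JobDetails_Delete ON dbo.JobDetails
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    DELETE bw
    FROM dbo.BuildingWork AS bw
    INNER JOIN deleted AS d
        ON d.JobID = bw.JobID AND d.JobType = 'BuildingWork';
    -- ...repeat for the other job-type tables.
END;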
Hi All, I am writing an SP where I need to pass in a value to maintain the records of the last n days. In this SP I am deleting from a couple of tables based on the value passed to the SP. For example, if the SP is passed the value 10, then only the TOP 10 records are maintained and the rest are deleted. I have formed the following logic, which I feel can be improved vastly. I create a temp table and
CREATE TABLE #TempAuditTbl (Rownum int PRIMARY KEY, Orderid uniqueidentifier)
INSERT INTO #TempAuditTbl
SELECT ROW_NUMBER() OVER (ORDER BY orderdate desc) AS rownum, Orderid FROM Orders
DELETE Orders FROM Orders INNER JOIN #TempAuditTbl adt ON adt.Orderid = Orders.Orderid AND rownum > @TopnRows
DROP TABLE #TempAuditTbl
OR
DELETE FROM Orders WHERE orderid NOT IN (SELECT TOP (@TopnRows) OrderID FROM Orders ORDER BY OrderDate DESC)
This way I am able to keep the top n records. Which of these two solutions is more efficient? Is there a more efficient way to achieve the same? Please help.
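For comparison, a hedged single-statement variant (assuming SQL Server 2005+ and the same Orders table): deleting through a ROW_NUMBER CTE avoids both the temp table and the NOT IN subquery.
Code:
-- Keep the @TopnRows newest orders; delete the rest in one statement.
;WITH ranked AS (
    SELECT ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS rownum
    FROM dbo.Orders
)
DELETE FROM ranked WHERE rownum > @TopnRows;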
I want the formulae defined in Reporting Services to be retained when I export the report to Excel, and I want to know the best possible way to achieve this.
Scenario: I have a report which has 4 columns. Column 1 and Column 2 are fetched from the database. Column 3 is empty, and Column 4 is computed with a formula (defined using expressions) over the previous 3 columns.
My requirement is that upon rendering the report, I download it to Excel, the end user enters some values in Column 3, and based on the values entered, the Column 4 formula is recomputed. But when I download the report to Excel, my formula expression is not retained.
Kindly let us know if there is any means of doing this in Reporting Services itself; otherwise, please suggest an appropriate alternative, either through a third party or in any other way.
Hi all, I've developed a few reports. I'm passing values to a few parameters in a report from a menu report. When I click on the "View Report" button, the parameter values are reset to their defaults even though I've not specifically changed any of them; thus the report is missing a few parameters and is not able to execute properly. This error occurs only in the web environment, after publishing the reports; they work fine in the developer suite (Visual Studio).
Server 2003 SE SP1 5.2.3790; SQL Server 2000 SP4, 8.00.2187 (latest hotfix rollup). We fixed one issue, but it brought up another: the fix we applied stopped the ServicesActive access failure, but now we have a failure on MSSEARCH. The users this is affecting do NOT have admin rights on the machine; they are SQL developers. We were getting:
Event Type: Failure Audit
Event Source: Security
Event Category: Object Access
Event ID: 560
Date: 5/23/2007
Time: 6:27:15 AM
User: domain\user
Computer: MACHINENAME
Description: Object Open:
Object Server: SC Manager
Object Type: SC_MANAGER OBJECT
Object Name: ServicesActive
Handle ID: -
Operation ID: {0,1623975729}
Process ID: 840
Image File Name: C:\WINDOWS\system32\services.exe
Primary User Name: MACHINE$
Primary Domain: Domain
Primary Logon ID: (0x0,0x3E7)
Client User Name: User
Client Domain: Domain
Client Logon ID: (0x0,0x6097C608)
Accesses: READ_CONTROL
          Connect to service controller
          Enumerate services
          Query service database lock state
I am creating a custom transformation component, and a custom user interface for that component.
In my custom UI, I want to show the custom properties, and allow users to edit these properties similar to how the advanced editor shows the properties.
I know in my UI I need to create a "Property Grid". In the properties of this grid, I can select the object whose data I want to display; however, the only objects that appear are the objects I have already created within this UI, not the actual component object with the custom properties.
How do I go about getting the properties for my transformation component listed in this property grid?
Hi, I have been working for two days to devise a method to export a SQL table into CSV format. I have tried using bcp with the format option to keep the column names, but I'm unable to transfer the file with column names, and I'm also having problems with columns holding decimal data. Can anyone suggest how to automate the data transfer (by using an SP) while retaining column names? Thanks, Noor
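One common workaround, as a hedged sketch (database, table, column, and path names are placeholders, and xp_cmdshell must be enabled): bcp never writes a header row itself, so the header is generated as the first branch of a UNION ALL, and the decimal column is cast to varchar so both branches are union-compatible. Strictly, an extra sort key is needed to guarantee the header sorts first; this is the simple version:
Code:
-- Export dbo.MyTable to CSV with a header row, via bcp over xp_cmdshell.
DECLARE @cmd varchar(4000);
SET @cmd = 'bcp "SELECT ''ID'', ''Amount'' '
         + 'UNION ALL SELECT CAST(ID AS varchar(20)), CAST(Amount AS varchar(30)) '
         + 'FROM MyDb.dbo.MyTable" queryout C:\Temp\MyTable.csv -c -t, -T';
EXEC master..xp_cmdshell @cmd;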
I have an SSIS package that uses an FTP connection manager. When running the package in BIDS, it runs fine and maintains the password for the remote ftp site's user account. Once I deploy the package and attempt to run it, it fails with the following error:
Started: 4:06:15 PM
Error: 2015-08-27 16:06:20.09  Code: 0xC001602A  Source: Export and FTP New Jobs Connection manager "FTP Connection Manager"  Description: An error occurred in the requested FTP operation. Detailed error description: The password was not allowed.  End Error
Error: 2015-08-27 16:06:20.09  Code: 0xC002918F  Source: FTP Jobs Listing to Concur FTP Task  Description: Unable to connect to FTP server using "FTP Connection Manager".  End Error
DTExec: The package execution returned DTSER_FAILURE (1).
I've tried Don't Save Sensitive With Password and that still fails.
Does the FTP connection manager just not retain passwords outside of BIDS?