I've set up the option to mail the error to a person. When the option is on I get the error message by mail, but the package does not finish (e.g. the failing task does not turn red and the output window never says anything about the error). If I set the option to off, the task fails as expected.
I have several DTSX packages which send out report files by mail. For security purposes, the SMTP server can only be reached from specified IP addresses, from specified processes, with specified sender address. I know all this information, except the process name that sends out the emails in a DTSX package. So my question is, what process can be identified as the one that sends out the email?
Can anyone assist me? I am trying to create a trigger within our local database that will allow me to submit an email to a local Exchange server. It's an internal email that our company has created for responses by candidates, sent when the original stored procedure meets a condition of TRUE.
Is SQL 7.0 capable of sending emails to a specified address through a Trigger? I was told it could but have yet to come across any documentation backing this up.
Would anyone know the generic syntax I can use to create such a trigger?
Any suggestions would be appreciated.
Thanks in advance, Claude Johnson cjohnson@staffmentor.net
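For what it's worth, on SQL Server 7.0 the usual route is SQL Mail's xp_sendmail extended procedure called from the trigger body (SQL Mail has to be configured with a MAPI profile first). Below is a rough sketch only; the table, column, and address are made-up placeholders, not your actual schema:

-- Hypothetical example: mail a notification when an inserted row meets the condition.
CREATE TRIGGER trg_Candidate_Notify
ON dbo.CandidateResponses           -- placeholder table name
FOR INSERT
AS
IF EXISTS (SELECT 1 FROM inserted WHERE Accepted = 1)    -- the "TRUE" condition
BEGIN
    EXEC master.dbo.xp_sendmail
        @recipients = 'recruiting@yourcompany.com',       -- placeholder internal address
        @subject    = 'Candidate response received',
        @message    = 'A candidate response meeting the condition was inserted.'
END
GO

Keep in mind that xp_sendmail runs synchronously inside the insert's transaction, so a slow or failing mail call will delay or roll back the insert; some people instead have the trigger write to a table and let a scheduled job do the mailing.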
as
declare @From varchar(8000)
declare @Subject varchar(8000)
declare @Body varchar(4000)
declare @smtp varchar(8000)
declare @counter int, @tbl varchar(8000)
Declare @MailID int
Declare @hr int
Declare @To varchar(8000)
Declare @tblquery varchar(8000)
declare @id int, @deptemail varchar(8000), @tmpmth varchar(8000), @usedb varchar(8000)

set @from = 'name@mail.com'
set @subject = 'testheader'
set @body = 'testing successful'
set @smtp = 'smtp.com'

--=========================================================================
--============================ get database name ==========================
IF (LEN(MONTH(GETDATE())) = 1)
BEGIN
    Set @TmpMth = '0' + CAST(MONTH(GETDATE()) AS varchar(2))    --01
END
ELSE
BEGIN
    Set @TmpMth = CAST(MONTH(GETDATE()) AS varchar(2))          --12
END

SET @UseDB = 'DATA' + CAST(YEAR(GETDATE()) AS varchar(4)) + @TmpMth    --aia_DATA200712

--=========================================================================
--============================ get table number ===========================
set @counter = 1
while @counter >= 59
begin
IF (LEN(@counter) = 1)
BEGIN
    Set @Tbl = '0' + @counter    --01
END
ELSE
BEGIN
    Set @Tbl = @counter          --12
END

--========================= check if table being created exists
IF EXISTS (SELECT 1 FROM information_schema.schemata WHERE catalog_name = 'temptable')
    GOTO table_1

--=================================== get all email accounts =====================
set @tblquery = '
select ID, Email INTO temptable FROM (
    select distinct d.id, d.email
    from tt32_aia.dbo.department d
    right outer join tt32_aia.dbo.extension e ON (e.parentid = d.id)
    right outer join ' + @usedb + '.dbo.other' + @tbl + ' data ON (e.extn = data.extn)
    where callclass = '''' or callclass is null
    UNION ALL
    select distinct d.id, d.email
    from tt32_aia.dbo.department d
    right outer join tt32_aia.dbo.extension e ON (e.parentid = d.id)
    right outer join ' + @usedb + '.dbo.inward' + @tbl + ' data ON (e.extn = data.extn)
    where callclass = '''' or callclass is null
    UNION ALL
    select distinct d.id, d.email
    from tt32_aia.dbo.department d
    right outer join tt32_aia.dbo.extension e ON (e.parentid = d.id)
    right outer join ' + @usedb + '.dbo.local' + @tbl + ' data ON (e.extn = data.extn)
    where callclass = '''' or callclass is null
    UNION ALL
    select distinct d.id, d.email
    from tt32_aia.dbo.department d
    right outer join tt32_aia.dbo.extension e ON (e.parentid = d.id)
    right outer join ' + @usedb + '.dbo.other' + @tbl + ' data ON (e.extn = data.extn)
    where callclass = '''' or callclass is null
) data

DECLARE deptemail_cursor CURSOR FOR
    select id, email from temptable where id is not null
'

--=============================================================================
--==== just above you can see the cursor.. below is sending the emails
--=============================================================================
exec(@tblquery)
OPEN deptemail_cursor
FETCH NEXT FROM deptemail_cursor INTO @id, @deptemail
WHILE @@FETCH_STATUS = 0
BEGIN
    --If LEN(@deptemail) > 0
    --BEGIN
    set @to = @deptemail

    EXEC @hr = sp_OACreate 'CDO.Message', @MailID OUT    --CDO.Message | CDONTS.NewMail <-- different mail server
    EXEC @hr = sp_OASetProperty @MailID, 'Configuration.fields("http://schemas.microsoft.com/cdo/configuration/sendusing").Value', '2'
    EXEC @hr = sp_OASetProperty @MailID, 'Configuration.fields("http://schemas.microsoft.com/cdo/configuration/smtpserver").Value', @smtp
    EXEC @hr = sp_OASetProperty @MailID, 'From', @From
    EXEC @hr = sp_OASetProperty @MailID, 'HTMLBody', @Body
    EXEC @hr = sp_OASetProperty @MailID, 'Subject', @Subject
    EXEC @hr = sp_OASetProperty @MailID, 'To', @to
    EXEC @hr = sp_OAMethod @MailID, 'Send', NULL
    EXEC @hr = sp_OADestroy @MailID
    --END

    FETCH NEXT FROM deptemail_cursor INTO @id, @deptemail
END

CLOSE deptemail_cursor
DEALLOCATE deptemail_cursor

Table_1:
drop table temptable
return
set @counter = @counter + 1
end
This is supposed to send email automatically to every email account that it gets from all the tables (around 250 tables). The problem is it's not sending, but if I take my code outside of the "SET @COUNTER = @COUNTER + 1 END" and close the IF statement above, I can produce the correct result. I'm thinking maybe it's the positioning? But there could be some overhauling needed with this. Sorry for posting the whole SP, and sorry for the trouble. Please help me.
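One thing that stands out (an observation only, not a confirmed fix): with set @counter = 1, the loop condition while @counter >= 59 is false on the first pass, so the loop body never runs at all, which would explain why nothing is sent. Below is a minimal sketch of the loop with the comparison reversed and the zero-padding done with an explicit CAST (the rest of the loop body is elided):

-- Hypothetical sketch: loop 1..59 and build a zero-padded table suffix.
-- Assumes @counter int and @tbl varchar(8000) are declared as in the original proc.
set @counter = 1
while @counter <= 59    -- the original >= 59 never enters the loop when @counter starts at 1
begin
    -- explicit CAST avoids '0' + int being treated as numeric addition
    set @tbl = RIGHT('0' + CAST(@counter AS varchar(2)), 2)    -- '01' .. '59'

    -- ... build @tblquery, open the cursor, send the emails ...

    set @counter = @counter + 1
end

Note also that the return after drop table temptable sits inside the loop, so even with the corrected condition the procedure would exit after the first table; that line may need to move outside the loop.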
I receive the following error when trying to invoke xp_sendmail: 'xp_sendmail: failed with mail error 0x80070005'. However, for alert notifications the email works fine. SQL is using an alias account to send mail. This account doesn't have any funny characters. Any ideas why the alert system works but xp_sendmail does not? The invocation of xp_sendmail is being done by SA.
I have included the Send Mail task and am sending email from the SSIS package on the OnError event.
But for one error I am receiving multiple emails, at least 10-15. I got only 2 emails with the relevant error message; the other emails give a message like "
Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown."
I have included '@[System::ErrorDescription]' as an email attachment.
Is there any way to get only the specific, relevant error email?
I used the OnError event to send email in case the SSIS package fails,
but it sends multiple emails with the ErrorDescription. For example, below are the error descriptions of four different emails I received.
Thread "WorkThread0" has exited with error code 0xC0047039. An error occurred with the following error message: "The connection "{01AF859A-CF97-4F6C-9C78-1AA4B1C9C27B}" is not found. This error is thrown by Connections collection when the specific connection element is not found.". Thread "SourceThread0" has exited with error code 0xC0047038. The PrimeOutput method on component "Flat File Source - Read from source file" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Can anyone suggest whether we can combine all of these error descriptions and send them as one email?
I began installing SQL Server 7.0 the other day on a standard NT Server. When I got to the installation options, Enterprise Edition was the only server option. It wouldn't install on standard NT. Nothing on the box, in the box, or printed on the cd indicated that it was the Enterprise Version. The error message says something about not being able to install on this server from this CD.
We called MS and they did admit to sending an unknown number of CDs out with the wrong setup. I just thought I would pass that on to those of you who might be questioning your own sanity.
Good luck getting the replacement copy. They are telling us that it will be 4+ weeks before they overnight our copy to us. I won't even try to figure out why it would take that long to get a copy. Let us all know if any of you have run into this. I would like to see how common this is.
We have SQL Server 2005 Reporting Services SP1. In Report Manager I have a shared schedule that runs every hour between 8am and 6pm (I created the shared schedule in Report Manager, then modified it in SQL Server Management Studio to run the hours I wanted). I have a second shared schedule that runs Tuesday and Friday at 7am. For 2 separate report subscriptions - one uses the hourly schedule and the other uses the 2-day schedule - the first time they run during the day they send 2 emails; after that they send only 1 email, as they should. I created a subscription to email a report when the report content is refreshed from a report snapshot. In Properties...Execution, I selected to render this report from a report execution snapshot, and selected my shared schedule. For the History property I have selected "Allow report history to be created manually" and "Store all report execution snapshots in history"; I am not using a schedule to add snapshots to report history in the History property.
When I look at the job in SQL Server Management Studio, it has only run once each hour. When I look at the snapshot history, there is only one record for each hour. Why is the first scheduled instance sent twice instead of once?
I have a package which truncates files, fills them with data, and should send a mail with the files as attachments when it finishes running. In between, while the task is still running, mail is sent with empty files. When I take a look at the directory where the files are created (at the end of the task), I find that the files are not empty. (E.g. in the task: doA -> doB -> doC -> SendMail. But instead of waiting until doB -> doC finish, it does the following: doA -> SendMail ... maybe because doB takes long.) Can I indicate anywhere that it should delay the mail until the end of the task? Thanks in advance. Harp
I am having a few problems with Database Mail and wondered if anyone could assist. It sends test mails fine, but when I run the following script:
(I've changed my email address in this post to test@test.com they are accurate in the scripts.)
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'Test Mail',
@recipients = 'test@test.com',
@body = 'Sent by sp_send_dbmail',
@Subject = 'SQL Server Email Test Email';
It gives the following errors.
(from the above script)
mailitem_id = 16
profile_id = 2
recipients = test@test.com
copy_recipients = NULL
blind_copy_recipients = NULL
subject = SQL Server Email Test Email
body = Sent by sp_send_dbmail
body_format = TEXT
importance = NORMAL
sensitivity = NORMAL
file_attachments = NULL
attachment_encoding = MIME
query = NULL
execute_query_database = NULL
attach_query_result_as_file = 0
query_result_header = 1
query_result_width = 256
query_result_separator =
exclude_query_output = 0
append_query_error = 0
send_request_date = 2008-04-15 10:50:03.827
send_request_user = COMPANYTest.Test
sent_account_id = NULL
sent_status = failed
sent_date = 2008-04-15 10:50:04.000
last_mod_date = 2008-04-15 10:50:04.513
last_mod_user = sa
(from the test mail)
mailitem_id = 11
profile_id = 2
recipients = test@test.com
copy_recipients = NULL
blind_copy_recipients = NULL
subject = Database Mail Test
body = This is a test e-mail sent from Database Mail on UKDEVSQL1
body_format = TEXT
importance = NORMAL
sensitivity = NORMAL
file_attachments = NULL
attachment_encoding = MIME
query = NULL
execute_query_database = NULL
attach_query_result_as_file = 0
query_result_header = 1
query_result_width = 256
query_result_separator =
exclude_query_output = 0
append_query_error = 0
send_request_date = 2008-04-15 09:51:46.530
send_request_user = COMPANYTest.Test
sent_account_id = 4
sent_status = sent
sent_date = 2008-04-15 09:51:46.000
last_mod_date = 2008-04-15 09:51:46.610
last_mod_user = sa
sysmail_event_log gives this error for mail item 16 (the scripted email):
The mail could not be sent to the recipients because of the mail server failure. (Sending Mail using Account 4 (2008-04-15T10:11:41). Exception Message: Cannot send mails to mail server. (Mailbox unavailable. The server response was: 5.7.1 Requested action not taken: message refused). )
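For what it's worth, the 5.7.1 "message refused" response is coming back from the SMTP server itself (typically relay or recipient restrictions on the sending account or the server's IP), so the fix is usually on the mail-server side rather than in SQL. On the SQL side, a diagnostic sketch like the one below (using the standard msdb Database Mail views) can help correlate failed items with the log entries that describe them:

-- List failed Database Mail items together with the event-log text for each.
SELECT  f.mailitem_id,
        f.recipients,
        f.subject,
        f.send_request_date,
        l.description
FROM msdb.dbo.sysmail_faileditems AS f
JOIN msdb.dbo.sysmail_event_log  AS l
      ON l.mailitem_id = f.mailitem_id
ORDER BY f.send_request_date DESC;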
I am having a problem importing data into SQL 7 from any type of source. I go through the whole import process with no problem. When I click the Finish button to start the import, nothing at all happens. Enterprise Manager and the DTS wizard just hang and I must use ctrl+alt+delete to end the program. Can anyone give me any suggestions as to what might be happening? Big thanks in advance; I've been working on this for days.
I have a VoIP phone system using SQL on the back end. I am trying to get a trigger to fire and email me when a certain number has been dialed.
Create Trigger trg_Emergency_Calls on dbo.CallDetailRecord for Insert
AS
IF @@ROWCOUNT=0 RETURN   ---NO rows affected, exit proc

IF (SELECT finalcalledpartynumber FROM inserted) = '95593684'
BEGIN
    --RAISERROR ('Call Stored Procedure Here',16,10)
    EXEC WEB_SRVR03.master.dbo.sp_SMTPMail @body = 'This is a test Email'
END
Return
GO
The problem is that the actual email SP I have to execute is on another server, and it has to be that way. The trigger actually runs each time, but if the IF statement becomes true then the trigger hangs and never completes. Watching the other server (WEB_SRVR03), there is never a request to execute sp_SMTPMail. I have been trying to troubleshoot this with Profiler, but I never see any locks or anything that would give me a problem. Also, the insert statement that caused the trigger to fire never finishes, so the record isn't written to the db. If anyone has any suggestions I would appreciate it. Thanks
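One possible explanation (not confirmed) is that the call to the linked server runs inside the INSERT's own transaction, so it can escalate to a distributed transaction and block there, which would match both the hanging trigger and the row never being written. A common workaround, sketched below with hypothetical object names, is to keep the trigger local and fast: have it only write a row to a queue table, and let a SQL Agent job make the cross-server sp_SMTPMail call outside the insert's transaction.

-- Hypothetical queue table; names are illustrative only.
CREATE TABLE dbo.EmailQueue (
    QueueID      int IDENTITY(1,1) PRIMARY KEY,
    DialedNumber varchar(20) NOT NULL,
    QueuedAt     datetime NOT NULL DEFAULT (GETDATE()),
    Sent         bit NOT NULL DEFAULT (0)
)
GO

-- The trigger only records the event; no linked-server call inside the transaction.
CREATE TRIGGER trg_Emergency_Calls ON dbo.CallDetailRecord FOR INSERT
AS
IF @@ROWCOUNT = 0 RETURN
INSERT INTO dbo.EmailQueue (DialedNumber)
SELECT finalcalledpartynumber
FROM inserted
WHERE finalcalledpartynumber = '95593684'
GO

A scheduled job on the same server can then poll dbo.EmailQueue every minute or so, call WEB_SRVR03.master.dbo.sp_SMTPMail for any unsent rows, and mark them as sent.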
I want to finish the execution of a DTS package from an ActiveX task. If a condition is OK, the package should continue as normal. If not, I want to finish the package without any error - just without executing the next tasks. Do you have any idea?
If an error occurs in ScriptTaskA from PackageB, the OnError event handler in PackageA fires once and catches the event from ScriptTaskA; that is, the output of the SourceName system variable is "ScriptTaskA" from PackageB. So far so expected.
Now, the same error is handled differently by PackageA's OnTaskFailed handler. The OnTaskFailed handler fires twice - once for ScriptTaskA and once for ExecutePackageTaskA; that is, two outputs are returned - one for "ScriptTaskA" and the second one for "ExecutePackageTaskA". That's strange to me.
Why does the OnError handler only fire once and the OnTaskFailed twice? Is there a setting that does this?
I have a stored procedure which ideally should run when a customer logs in. The procedure checks the available stock and creates a temp table for the information, which allows many other queries in the site to run a lot faster (due to no joins). The query has taken as much as 30 seconds (lots of records and a half dozen joins) to run upon login, causing a timeout for the web application.
I want the procedure to run as it is, but for the login method not to be dependent on the process. I tried cmd.BeginExecuteNonQuery() in .NET (cmd = SqlCommand), which doesn't do what I want; it just allows me to run heaps of queries at the same time.
Can anyone help me with getting this procedure to run and not hold up the web application? I'm not sure whether I need to do this in .NET, or whether I can get .NET to run a batch file or something, but someone must have had a similar problem; please help.
I have a scheduled job that runs daily at 4 am. The job imports data from the client side from text files and puts the data into our SQL table. It takes around 1 hour. The problem is it doesn't stop after the import process completes. After one hour I can see the data has been imported into my SQL table, so that's fine, but the job keeps running. I tried observing that job's SPID in Activity Monitor in SQL 2005, but after one hour I can't even see that SPID, yet the job still runs; it's weird. And when I then stop the job manually, it stops, saying the job completed successfully. That step is the last step and it uses Windows cmd. My understanding is the job step doesn't realize that it has finished. What should I do here? Any ideas are appreciated. We are running the same job on other servers and it's fine.
No error, but Export to Excel does not finish. When the report has 2 pages with a total of 500 rows, exporting to Excel is not a problem. If it has 100 pages and 5000 rows, exporting to Excel does not end; it does not return any error, but the process does not end either. What might the problem be?
I have a database A that includes five tables and has more than 1,500,000 rows. There is a replica database of A. At first there is no data in the two DBs. When I finish inserting data into A, the replica DB still seems to be working; the log file size still changes. How can I know whether the replication has finished or not? How long will it take to replicate 1,500,000 rows of data?
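One rough way to keep an eye on it (a sketch only; it assumes the subscriber is reachable through a linked server, here called SUBSCRIBER, which is a made-up name) is simply to compare row counts on both sides for each of the five tables, or to watch Replication Monitor:

-- Rough progress check: compare row counts on publisher and subscriber.
-- 'SUBSCRIBER' is a hypothetical linked server name; Table1 stands in for one of the five tables.
SELECT
    (SELECT COUNT(*) FROM A.dbo.Table1)              AS publisher_rows,
    (SELECT COUNT(*) FROM SUBSCRIBER.A.dbo.Table1)   AS subscriber_rows;

How long it takes depends heavily on row size, indexes, network speed, and the replication type, so there is no fixed answer for 1,500,000 rows.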
I have been working with this for about a month now, with no similar problems to date. Today I am trying to introduce 4 configuration flags that control whether optional ETL stage feeds are executed. I did this by adding a do-nothing script component. A precedence constraint is used, and it checks the boolean variable flag. The first package executes fine. But it never returns from there. This precedence constraint has nothing fancy on it either. It simply does not run any more of the package, make any more conditional checks, or run the common completion tasks. It just seems to think it is done.
The optional paths all fire Execute Package Tasks. One thing that might be tripping it up is that I attempt to run one package twice, each time with a different parent package variable set to make it use a different destination database for each run. Should that not be OK to do?
We are generating a log file in our SSIS package by enabling the tool's built-in logging feature. We are logging the "OnError" event. This records the error/failed-task messages in the text file "log.txt". That error information is too complex, with a lot of unwanted information, like below:
----------------OnError,,,pkgExtract,,,8/30/2006 11:50:04 AM,8/30/2006 11:50:04 AM,-1071636471,0x,An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
OnError,,,pkgExtract,,,8/30/2006 11:50:04 AM,8/30/2006 11:50:04 AM,-1071607780,0x,There was an error with input column "create_user_id" (116) on input "OLE DB Destination Input" (103). The column status returned was: "The value violated the integrity constraints for the column.".
OnError,,,pkgExtract,,,8/30/2006 11:50:04 AM,8/30/2006 11:50:04 AM,-1071607767,0x,The "input "OLE DB Destination Input" (103)" failed because error code 0xC020907D occurred, and the error row disposition on "input "OLE DB Destination Input" (103)" specifies failure on error. An error occurred on the specified object of the specified component. ---------------------------------
This is in fact not in a very readable format. We also don't want to do our error logging in the database.
Is there any way of defining our own error log and creating it with customized messages? Can we do it using the OnError event handler?
Please help us with a good solution to avoid these confusing error log messages.
I am having a few problems getting the optimum configuration of the OnError event handler for my new package. What I'm trying to do is log the error event and send out an email (which I am achieving); where I am encountering a problem is in trying to determine the severity of the error.
What I would like to achieve is to send an email and log the error (as I currently do), but if the error has the power to stop the package from executing, I would like to fire an additional script to move the loading files to a failed location.
My initial thought was to put an expression on the precedence constraint in the error handler, but I can't find which system variable to apply the logic to.
Please note that "Errordesc" is the variable which I declared on script as readandwrite variable.
But on execution of this script I get error
Error: A deadlock was detected while trying to lock variables "User::Errordesc" for read/write access. A lock cannot be acquired after 16 attempts. The locks timed out.
Any suggestions...
Please note that without ErrorDesc, the script runs successfully.
I have a package name "Package1" and I have a few data-flow tasks in it. whenever anything fails in the "Package1", I want to send out the failed email to alert myself. So first step, I went to EventHandler tab, and select Executable as "Package1" and Event Handler as "OnError". and then I added SendMailTask. I manually ran SendMailTask, failed email send to me okay. But when I run package from Microsoft visual Studio (IDE), the package failed but it doesn't send out failed email.
Error code: [174]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "xxx" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Maybe I misunderstand the OnError event handler, or I set it up wrong.
Here's my dilemma: I want to run a stored procedure that starts another stored procedure running, but does not wait for that stored procedure to complete execution.
The stored procedure should execute immediately, and leave the other procedure to complete running in the background. Is there any way to do this?
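One common pattern (a sketch only; the job name below is hypothetical and the Agent job has to be created beforehand with the slow procedure as its only step) is to wrap the long-running procedure in a SQL Server Agent job and have the first procedure just start that job. sp_start_job returns as soon as the job has been requested, so the caller does not wait:

-- Fire-and-forget: start a SQL Agent job that runs the slow procedure in the background.
-- 'Run Background Work' is a hypothetical job whose single step is EXEC dbo.usp_SlowProcedure;
CREATE PROCEDURE dbo.usp_StartBackgroundWork
AS
BEGIN
    EXEC msdb.dbo.sp_start_job @job_name = N'Run Background Work';
    -- control returns here immediately; the job keeps running on its own
END
GO

The caller needs permission to start the job (sysadmin or one of the SQLAgent roles in msdb). Service Broker activation is the other usual route if you need this per call with parameters.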
We recently upgraded our SQL version from SQL2008R2 to SQL2014. As such, the compatibility mode changed to SQL2014 (120).
We have several queries that used to run fine that now take forever to bring back results. There are no errors (which surprised me). They just take way too long now. Plus they seem to be causing high I/O and CPU.
If I change the compat level back to SQL2008, these queries run fine.
Query with the SQL2008 compat level: finishes in 2 minutes. Query with the SQL2014 compat level: finishes in 3 hours 22 minutes.
Same exact query, same server; the only thing changed was the compatibility level.
What do I look for in the queries that could be causing this? (They look fine, but obviously I'm missing something here.)
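This pattern (no errors, just much slower plans after moving to compatibility level 120) very often points to the new cardinality estimator introduced in SQL Server 2014. As a hedged test rather than a fix, you can leave the database at 120 and ask an individual query to use the legacy estimator with trace flag 9481; if that brings the 2-minute plan back, you know where to focus. Note that QUERYTRACEON requires sysadmin, and the flag can also be enabled server-wide. The tables below are made-up stand-ins for one of the regressed queries:

-- Hypothetical example: dbo.Orders / dbo.OrderLines stand in for the real tables.
SELECT o.OrderID, SUM(l.Quantity) AS TotalQty
FROM dbo.Orders AS o
JOIN dbo.OrderLines AS l ON l.OrderID = o.OrderID
GROUP BY o.OrderID
OPTION (QUERYTRACEON 9481);    -- legacy (pre-2014) cardinality estimator for this statement only

Comparing the actual execution plans and estimated vs. actual row counts under the two compatibility levels is usually what shows which join or predicate the new estimator is misjudging.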
I have a package which has two sequence containers: the first container is used to transfer data to a staging area, and the second sequence container is used to transfer from that staging area to the destination. I also set the transaction option to Required on the second sequence container. There are several Execute SQL Tasks and several Data Flow Tasks inside the two sequence containers: first sequence container (1. execution sql task -> 2. data flow -> 3. execution sql task) -> second sequence container (4. execution sql task -> 5. execution sql task -> 6. data flow -> 7. data flow -> 8. execution sql task -> 9. data flow...)
I created an ExecutionLog table which is used to log the status of this package on our SQL Server. At first this status field is null; then while the package runs it changes to 'in process', and after the package finishes it changes to 'success' or 'failure', depending on whether the package ran successfully or not. The package can be run only if the status is 'success', 'failure' or null. So I need to change this status field during package execution. To update the status to failure, I need to add an event handler that changes the status using an Execute SQL Task.
The first time, I put the Execute SQL Task on the OnError event handler tab (this event handler is applied at the package level). To test the event handler I use an old schema version to make sure I get a failure at '8. execution sql task'. But the package seems to get stuck at '8. execution sql task' inside the second sequence container (it always stays yellow when I run it from SSIS) and never fires the event handler. '8. execution sql task' is used to update a related table.
The second time, I removed the OnError event handler and changed to the OnTaskFailed event handler tab (this event handler is applied at the package level). But everything is the same as with the OnError event handler, except that I got error output but still could not fire the event. Why can't '8. execution sql task' fire OnError or OnTaskFailed? For this case, what kind of event handler do I need to use, and at what level do I need to apply it?
I have a package the looks for any Excel files in a folder, moves the data to a SQL table, then archives the file to one of two archive folders--a success folder or an error folder. I have an OnError handler on the Data Flow that sets a flag that lets the archive process know where to move the file.
This works when the processing is successful. It also works when the error in the Data Flow occurs right off the bat, i.e., in the Source. When the error occurs later on, say in the Destination, it doesn't work correctly. In this case, the OnError sets the flag, but when the archive process tries to move the Excel file, it can't because it's locked. I assume this is because OnError interrupted the Data Flow before the Excel file could be closed properly.
Any ideas on how I can avoid this problem? Can I manually get the Data Flow to close the Excel connection somehow?