We have an application comprising a number of SSIS packages that run every few hours. When a package fails, we would like an email to be sent to a pre-configured address with the required information.
I would like to use the Send Mail task with an SMTP server such as smtp.sbcglobal.yahoo.com, but this server requires authentication, so I need to supply my user_id and password. I have not been able to figure out where to configure the user_id and password so the SMTP server will authenticate me.
If an SMTP connection cannot be used, is there any other way to notify the admin of the failure?
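In case it is useful: the Send Mail task's SMTP connection manager only supports anonymous or Windows authentication, so there is no place to enter a user_id and password. A common workaround is a Script Task that sends the mail through System.Net.Mail, which does accept credentials. A minimal sketch, with server, port, addresses, and credentials as placeholders (in SSIS 2005 the Script Task is VB.NET, but the same calls apply; shown here in C# for clarity):
-----
using System.Net;
using System.Net.Mail;

// Sketch: send mail through an authenticated SMTP server.
// Server, port, addresses, and credentials are placeholders.
SmtpClient client = new SmtpClient("smtp.sbcglobal.yahoo.com", 25);
client.Credentials = new NetworkCredential("user_id", "password");
MailMessage message = new MailMessage(
    "from@example.com", "admin@example.com",
    "SSIS package failed", "See the package log for details.");
client.Send(message);
-----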
Hi, I have scheduled a job that runs every minute. If the job does not succeed, I would like my front-end application to be notified. I am not sure whether this is a reasonable approach, but I am thinking of populating a table in SQL Server (only if the scheduled job fails) and then reading that table every minute from the front-end application.
So, is it possible to populate a table when a job fails? I do not see any option for this in the job's properties.
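For what it's worth, you may not need the job itself to populate a table: SQL Agent already records every run in msdb.dbo.sysjobhistory, which the front end could poll instead. A rough sketch, assuming SqlClient and a placeholder job name (run_status = 0 means failed; step_id = 0 is the job outcome row; you would also want to filter run_date/run_time down to runs since the last check):
-----
using System;
using System.Data.SqlClient;

// Sketch: ask msdb whether the job has any failed runs recorded.
// Connection string and job name are placeholders.
const string query =
    @"SELECT COUNT(*)
      FROM msdb.dbo.sysjobhistory h
      JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
      WHERE j.name = @jobName AND h.run_status = 0 AND h.step_id = 0";

using (SqlConnection conn = new SqlConnection(
    "Server=myServer;Database=msdb;Integrated Security=true"))
using (SqlCommand cmd = new SqlCommand(query, conn))
{
    cmd.Parameters.AddWithValue("@jobName", "MyEveryMinuteJob");
    conn.Open();
    int failures = (int)cmd.ExecuteScalar();
    if (failures > 0)
    {
        // notify the front-end application here
    }
}
-----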
I configured SQL Agent and SQL Mail to use the profile that I created by logging in as the SQL Agent domain admin service account, and the mail service was working. Now, when I create an operator with my mail account and test it, Agent reports that the mail was sent successfully, but I never actually receive it.
I have numerous jobs running on my SQL Server machine. Because my company will not sanction an e-mail account for a machine (damn them!!), I have no quick way of knowing whether a job has failed. Does anyone have any suggestions on how I can create a file or SQL table that is updated every time a job fails? (Writing to the NT application log is not an option because Operations refuse to trawl through it.)
I tried using Query Notification on my computer at home:
* Win XP Pro with all the SPs and hotfixes
* SQL 2005 with SP1 and hotfix
Query Notification worked fine.
Then I tried using it at work:
* Win XP Pro with all the SPs and hotfixes
* SQL 2005 with SP1 and hotfix
and I see the following error in the SQL Server log file, and the notification does not reach the client app:
----------------------------------------------------------
Date: 9/1/2006 10:18:30 AM
Log: SQL Server (Current - 9/1/2006 10:18:00 AM)
Source: spid17s
Message: An exception occurred while enqueueing a message in the target queue. Error: 15404, State: 19. Could not obtain information about Windows NT group/user 'domain\myuser', error code 0x6e.
----------------------------------------------------------
A similar error shows up in the machine's Event Log.
I am sysadmin and full OS admin on both boxes. The difference is that the computer at home is standalone while the one at work is part of a domain.
Does anybody have any experience/advice on how to ensure that the success or failure of a SOAP service call is returned to the calling app?
Consider a client that calls a SOAP service and then goes down before it can receive the SOAP response, even though the service has done the work. Similarly, the SOAP service may perform the task, but a failure on the return path makes the client think the process failed.
What would be the best way to ensure that the client is notified to avoid the call having to be made again?
Are there middleware tools that can be used to provide a form of message queuing for SOAP service calls?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be a SQL 2000 DTS package because it rebuilds SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes and shows success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and is not available to an event handler. It also looks unsafe to follow it with a SQL task that checks the SQL 2000 system tables for the DTS package's log, because the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up to be an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure the DTS package has completed before I start any other tasks that check the SQL 2000 log of its execution?
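One workaround that might cover both questions, assuming dtsrun.exe returns a non-zero exit code when the package fails (worth verifying on your build before relying on it): replace the Execute DTS 2000 Package task with an Execute Process task that runs dtsrun, so SSIS waits for the process to end and fails the task on a bad exit code. The same idea sketched in C# (server and package names are placeholders):
-----
using System;
using System.Diagnostics;

// Sketch: run the SQL 2000 DTS package via dtsrun.exe, wait for it to
// finish, and treat a non-zero exit code as failure. Names are placeholders.
ProcessStartInfo psi = new ProcessStartInfo(
    "dtsrun.exe", "/S MyServer /E /N MyDtsPackage"); // /E = trusted connection
psi.UseShellExecute = false;

using (Process proc = Process.Start(psi))
{
    proc.WaitForExit(); // blocks until the DTS run actually ends
    if (proc.ExitCode != 0)
        throw new Exception("DTS package failed, exit code " + proc.ExitCode);
}
-----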
I have developed an SSIS package for ETL purposes. I invoke the package from a .NET console application by referencing the ManagedDTS assembly. I am able to execute the package on SQL Server 2005 Developer Edition and it runs fine to completion.
But when I try to execute the package on SQL Server 2005 Standard Edition, by invoking it through the .NET console application, the status of the package is failure.
Can anyone help me figure out how to overcome this problem?
I'm working on an application that involves various components, two of which are an SSIS package and a C# Windows service.
The SSIS package imports data from a number of structured flat files into database tables and then runs a number of stored procedures that rearrange/normalize the data and perform certain calculations, the results of which are also stored in the database. The package will run at regular intervals as new structured flat files become available.
The C# Windows service is a caching server, and the data it needs to cache is the results of the stored procedures run by the SSIS package. I'm therefore looking for a way to notify the C# app when the SSIS package has completed.
I've looked at SQL Server's Query Notifications but, if I've understood correctly, they depend on the client code issuing a SQL statement and attaching a SqlDependency object that checks for changes to the result set of that particular query, which is not really what I'm after.
I've considered SSIS events, but again I think that for a C# app to monitor SSIS events, the package needs to have been executed from within the C# app. If there were a way for the SSIS package, running completely independently of the C# process, to fire an event that the C# Windows service could subscribe to, that would be ideal.
I don't know whether other options, like SQL Server Notification Services or even something quite crude like writing a flag file that the C# service can read, would be better?
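If you do end up with the flag-file option, it is less crude than it sounds: make the last task in the package write a small file, and have the service watch the folder with FileSystemWatcher instead of polling. A sketch, where the folder, file name, and RefreshCache method are all placeholders of mine:
-----
using System.IO;

// Sketch: react when the SSIS package drops its completion flag file.
// Folder, file name, and RefreshCache() are placeholders.
FileSystemWatcher watcher = new FileSystemWatcher(@"C:\ssis\flags", "package_done.flag");
watcher.Created += delegate(object sender, FileSystemEventArgs e)
{
    RefreshCache();          // hypothetical cache-reload method in the service
    File.Delete(e.FullPath); // reset the flag for the next run
};
watcher.EnableRaisingEvents = true;
-----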
I have an SSIS package with an Execute Package Task that calls a child package. I am trying to have the master/parent package complete its execution regardless of the outcome (failure or success) of the child package. The overall structure of the master package is:
1. Perform Pre-load tasks (stored procedure).
2. Execute Package Task (call child package)
3. Perform Post-load Tasks (stored procedure)
I have tried everything and cannot get the results that I want. I have tried combinations of FailParentOnFailure, ForceExecutionValue, etc. The master package stops at the child's failure; I would like it to resume to completion.
SQL 2012. I am working on understanding the <AlwaysOn> feature in more depth. I have already worked with FCI (Failover Cluster Instance) and would like to understand more about <AlwaysOn>.
Browsing the internet and documentation, I am gaining more knowledge about <AlwaysOn>, but one doubt persists:
1) Combination of FCI + AlwaysOn (2 SQL servers for the FCI providing the clustered instance SQLCluster\SQLReporting + at least 1 SQL server for <AlwaysOn>: SQLAOn\SQLReporting)
If the FCI instance is called SQLCluster\SQLReporting, all the connections are mapped to this name.
But if the <AlwaysOn> feature is added to the system, then in case of failure the <SQL Instance AlwaysOn> name must be used, so I am obliged to re-map all my connections to SQLAOn\SQLReporting.
QUESTION: Is this correct?
If yes, I do not understand the usefulness of the <AlwaysOn> feature if it leads to re-mapping all the connections.
In my opinion, there should be a <MAIN> SQL instance name covering both <FCI> + <AlwaysOn>.
Is there any built-in way of kicking off a job in SQL Server 2005 Agent whenever a package/job completes in Oracle? Are there any mechanisms (triggers? a Microsoft queue? event notifications?) to automate running a job on the SQL side? Any articles or knowledge-base articles would be appreciated too.
If not, are there any built-in standardized polling techniques? Or are there any timers in SSIS? That way I could delay executing a child package until a certain record has been inserted into a control table in Oracle. I don't want to write an inefficient loop that blocks all other processing on the server and iterates once every second.
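I don't know of a built-in Oracle-to-SQL-Agent notification, but on the timer question: a loop that sleeps between checks will not block other processing the way a tight one-iteration-per-second loop does. A hedged sketch, where the OLE DB connection string and the control table/column are assumptions of mine purely for illustration:
-----
using System;
using System.Data.OleDb;
using System.Threading;

// Sketch: poll an Oracle control table until the expected record appears,
// sleeping between checks. Connection string, table, and column are made up.
bool ready = false;
while (!ready)
{
    using (OleDbConnection conn = new OleDbConnection(
        "Provider=OraOLEDB.Oracle;Data Source=ORA;User Id=me;Password=pw"))
    using (OleDbCommand cmd = new OleDbCommand(
        "SELECT COUNT(*) FROM control_table WHERE batch_complete = 1", conn))
    {
        conn.Open();
        ready = Convert.ToInt32(cmd.ExecuteScalar()) > 0;
    }
    if (!ready)
        Thread.Sleep(TimeSpan.FromMinutes(1)); // yield instead of spinning
}
-----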
When I try to run any DTS package that copies stored procedures from one database to another, it fails and I get this message:
Need to run the object to perform this operation [SQL-DMO]Code execution exception: EXCEPTION_ACCESS_VIOLATION
The problem appeared a couple of weeks ago; the same package that now fails ran with no problems for some six months or so.
When I searched the Microsoft site, I found a bug fix that matched the problem exactly, but the workaround and the SP3 installation (which supposedly fixes the problem) didn't work. Please help!!
I backed up my database on the live server, copied it to the backup server, and restored it successfully. I then set up a DTS export on the live server to copy objects and data to the backup database. The job keeps failing; each time, it reports "ALTER TABLE statement conflicted with COLUMN FOREIGN KEY constraint ......". However, there is nothing wrong with any foreign key: I have checked, and the data in the tables and the key are definitely fine (I removed the key and added it again while checking the data). I am also able to DTS the supposedly problematic tables as a separate DTS job. I believe the job is inserting data into the table that has the foreign key before it inserts data into the table with the primary key, and this is causing the problem. As an aside, this job used to work perfectly well.
Can someone offer a suggestion on what I should do?
I have a DTS package that keeps failing with the error message "A connection with the transaction manager was lost". The DTS package uses two different servers as connections, and it has "Use transactions", "Rollback transaction on failure", and "Join transaction if present" ticked. I have checked that the MSDTC service is running on both servers, and the following options are enabled on both: Network DTC Access, Allow Inbound, and Allow Outbound. Does anybody know what is causing this error message?
Because of recent changes in the company, I have inherited the SQL Server. I have a little bit of experience, but I will be knee deep in study for the foreseeable future. In the meantime, I do have to manage the SQL Server and deal with all of the problems. It has been fairly self-sufficient until now. I have a regularly scheduled SSIS package that has recently been erroring out. I just don't even know where to start figuring this out. When I look at the History of this two step job, I see that it's failing on the second step, which is a T-SQL script. I am way more of a GUI kinda guy (at least for now) and I am lost when looking at the script. I know that I am not giving enough info in order for someone to help me, but my problem is I just don't know where to even look. Can someone at the very least point me to a place where I can get a better clue as to what is happening and why???
Thanks for listening and God bless anyone who understands at all what I'm even trying to ask!!!!!
Package Load Failure Package 'Microsoft.SqlServer.Tools.PublishWizard.VSCommands.MenuCommandsPackage, MenuAndCommands, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' has failed to load properly (GUID = {A38CF415-6AD2-45F6-8132-9CC33EBE0629}).
I get this error after I have opened VS and then open the Server Explorer Window.
I have been using VS2005 Professional since December 2005 and have never encountered this before. I have never had a beta version of anything on this PC. I have been searching the Net but haven't found a solution. At this point, I haven't tried reinstalling VS because even if that fixes it, what is the guarantee that I will not encounter it again? I would rather find and fix the cause.
Does anyone have any ideas?
Additional: I have now uninstalled and reinstalled VS2005 and SQL Server 2005 including service packs and am still getting the error.
Additional: When I ran devenv.exe, the section of the resulting log referencing the GUID indicated an invalid Package Load Key:
I'm using a Foreach container to run certain tasks, which are inside a Sequence container. I'd like to know whether there is a way to have a task fail without failing the Foreach: I don't need the whole process to fail, just the current iteration, so I can log it.
Has anyone seen this error or know what it could be? I got it when executing a package. It works on my dev machine, but failed with this error on another test machine/environment.
The session was canceled. (Exception from HRESULT: 0x800700F0) (Microsoft Visual Studio)
------------------------------
Program Location:

at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo)
at Microsoft.DataWarehouse.VsIntegration.Interop.NativeMethods.ThrowOnFailure(Int32 hr, Int32[] expectedHRFailure)
at Microsoft.DataWarehouse.VsIn
I am running a DTS package from a stored procedure using xp_cmdshell. The DTS package begins with a SQL task that deletes records from two tables (this works fine), but the data transfer task that imports records from a SQL Anywhere 5.0 database gives me the error "Unable to connect to database server: Unable to start database engine". The weird thing is that from Enterprise Manager I can execute the DTS package and it works fine. What am I missing here?
I'm trying to build a deployment package. I have a number of .dtsx packages in a project that share a connection configuration file. When I build, the error states: 'Could not copy file "whatever.dtsconfig" to the deployment utility output directory. ... The file already exists.'
I have a package that is failing because of a truncation error. By default (and I leave the default for ALL my packages), if one row fails processing, the entire package should fail and nothing should be loaded into the database. But instead I am getting a partial load.
I have confirmed the "Rows per batch" value (blank) and the "Maximum insert commit size" value (0) in the OLE DB Destination Editor, so I have no idea what is going on. Are there any other properties I should be checking?
I am developing an SSIS package and need execution to continue even if one of the tasks within the package fails. I have an OnError event handler for this task, which fires when it fails, but I want the rest of the package to continue.
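One technique worth trying (test it first; I am going from memory of how the Propagate variable behaves): inside the OnError event handler, set the System::Propagate variable to False so the error does not bubble up and fail the package. It can be set in the handler's Variables window, or from a Script Task within the handler with Propagate listed under ReadWriteVariables. A sketch based on the C# Script Task template (SSIS 2008 onwards; the VB equivalent is the same two lines):
-----
// Script Task placed inside the failing task's OnError event handler,
// with System::Propagate in the Script Task's ReadWriteVariables list.
public void Main()
{
    // Stop this error from propagating to the package level,
    // so the rest of the package keeps running.
    Dts.Variables["System::Propagate"].Value = false;
    Dts.TaskResult = (int)ScriptResults.Success;
}
-----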
I have a job that moves a file to an import directory, where the SSIS package picks it up and processes it. 95% of the time it works flawlessly and fast.
The other 5% of the time the process fails; the file appears to be inaccessible to SQL Server. I run it again and it works perfectly. It seems to be completely hit and miss.
The last step in the package deletes the file on completion. When the package fails, the file (as directed) is not deleted.
I've included a snippet of the error log; I can post the whole log here if it would help.
Any ideas what could possibly be going wrong? I hate inconsistent failures.
Warning: 2008-04-06 15:05:30.80 Code: 0x80070002 Source: Data Flow Task Flat File Source [317] Description: The system cannot find the file specified. End Warning
Error: 2008-04-06 15:05:30.80 Code: 0xC020200E Source: Data Flow Task Flat File Source [317] Description: Cannot open the datafile "D:\Processes 4\0720080105260140306.txt". End Error
Error: 2008-04-06 15:05:30.80 Code: 0xC004701A Source: Data Flow Task DTS.Pipeline Description: component "Flat File Source" (317) failed the pre-execute phase and returned error code 0xC020200E. End Error
I had a Send Mail task linked to my Sequence containers in my package and it was working fine: every time a container failed, it would send an email to me.
At some point all Failure precedence constraints stopped working. Failure constraints work if I add brand-new tasks, but they don't work with the existing tasks. The task that fails turns red and execution stops; the next task on the failure path is never executed.
I am not sure what caused it to stop working, and I cannot find anything in the log.
I have a package that contains a Foreach Loop container; in this container I have SQL tasks and Execute Package tasks, ending with a Send Mail task. If there is something wrong with the SMTP server, or it's down, the Send Mail task fails the package. I don't want this to happen: if the Send Mail task fails, I want the package to continue its execution.
--I thought of using an event handler... but I don't know whether that works...
That is, I am executing the package from code, using the Application object and the Package object to load it from its path.
So my code looks something like this:
-----
// Requires a reference to the ManagedDTS assembly
// (Microsoft.SqlServer.Dts.Runtime namespace).
string pkgLocation = @"<package_path>/Package1.dtsx";
Application app = new Application();
// Load the package from the file system; null = no events listener.
Package pkg = app.LoadPackage(pkgLocation, null);
// Execute returns a DTSExecResult (Success, Failure, ...).
DTSExecResult pkgResults = pkg.Execute();
Console.WriteLine(pkgResults.ToString());
Console.ReadKey();
-----
My package reads in a flat file (located on another server), transforms it, and saves the results to a database on that same server. And mind you, I can execute this package just fine when I run it manually from within BIDS.
But when I execute the above code from my C# solution (compiled to a command-line executable), I always get a result of "Failure".
Can somebody point out what I am doing wrong here?
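The DTSExecResult alone will not tell you much, but the package's Errors collection usually will. A small addition right after Execute():
-----
// After a Failure result, list what actually went wrong.
foreach (DtsError err in pkg.Errors)
{
    Console.WriteLine("{0} / {1}: {2}", err.Source, err.SubComponent, err.Description);
}
-----
My first guess would be a permissions or ProtectionLevel difference between BIDS and the command-line context, but the error text should tell you.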
I am using SQL Server 2012 SP1. I have built an SSIS package that imports flat-file data from various files into SQL Server. I have got it to do everything I want when things go well, and am now working on what it should do when it encounters a failure executing specific tasks and containers. For example, I have a Foreach Loop container that executes a dedicated stored procedure for each CSV file in the target folder. If any of the stored procedures fail to run for any reason, I want to carry out certain actions.
For the most part I think I will be fine using the event handlers. What I can't seem to find is how to tell the package to stop executing after a failure event, once the actions defined by the relevant event handler have been carried out. Or perhaps it isn't necessary, since that would be the default behaviour on a failure?
Hi everybody, I'm a newbie in SSIS; I have been using DTS in SQL 2000. I'm trying to learn how to execute an SSIS package from the C# code of an ASP.NET web server. Here's my case:
1. An SSIS package with a simple data transformation from one table to CSV is stored in SQL Server 2005 storage.
2. For simplicity, the CSV is placed in the root of C: as a .txt file.
3. I haven't used SSIS configuration files.
4. Protection level of the package = EncryptSensitiveWithUserKey.
5. It executes OK from Business Intelligence Development Studio.
6. I've created a console application with this code:
7. When I connect through Remote Desktop Connection, I can successfully execute this console application on the SQL Server host machine.
8. When I try executing it from the computer where I developed the package, and where I successfully executed it from Business Intelligence Development Studio, I get FAILURE as the result.
The SQL connection parameters are the same in the console application and in the Business Intelligence Development Studio SSIS project.
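Two things that might help diagnose this. First, pass an events listener when loading and executing so the error text is captured; a sketch using the ManagedDTS API, where the listener class name and the server/package names are mine:
-----
using System;
using Microsoft.SqlServer.Dts.Runtime;

// Sketch: capture error details while loading and executing the package.
class ConsoleEvents : DefaultEvents
{
    public override bool OnError(DtsObject source, int errorCode,
        string subComponent, string description, string helpFile,
        int helpContext, string idofInterfaceWithError)
    {
        Console.WriteLine("Error in {0}: {1}", subComponent, description);
        return false; // as in the usual ManagedDTS samples
    }
}

class Runner
{
    static void Main()
    {
        ConsoleEvents events = new ConsoleEvents();
        Application app = new Application();
        // Load from SQL Server storage; server and package names are placeholders.
        Package pkg = app.LoadFromSqlServer("MyPackage", "myServer", null, null, events);
        DTSExecResult result = pkg.Execute(null, null, events, null, null);
        Console.WriteLine(result);
    }
}
-----
Second, EncryptSensitiveWithUserKey ties the package's sensitive data (such as connection passwords) to the user and machine that saved it, so a FAILURE from a different machine while the host machine works is exactly the pattern I would expect the error text to explain.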
(1) The package contains a For Loop task (in which all the logic is contained) that loops through a particular folder for Excel files. Within the For Loop:
(2) It pulls data from an Excel file into SQL tables (Data Transformation task).
(3) It runs a stored proc to validate the data (Execute SQL task).
(4) On success of the Execute SQL task, a Script task moves the file to the success or reject folder based on the value returned from the sproc.
(5) On failure of the Execute SQL task, a Script task moves the file to the bad-format failure folder.
NOTE: I have set the MaximumErrorCount property of (1) the For Loop, (3) the Execute SQL task, and the package itself to 0, in order to deal with badly formatted Excel files. I do not want the package to stop for every missing tab in an Excel file or data entry error; I simply want the badly formatted file to be moved to a special folder.
PROBLEM: The on-failure logic is never executed. I have two options after step (3): on success do step (4), on failure do step (5). When step (3) fails, the loop simply iterates to the next file and step (5) is never executed.
Is this because I changed the MaximumErrorCount property? What am I doing wrong with the precedence logic?