I have three control flow tasks that are executing in consecutive order. Tasks 1 and 3 will always execute, but sometimes (based on an expression) task 2 will not. I would like to use precedence constraints in such a way that task 3 will execute regardless of whether task 2 executes, but in the event task 2 does execute, task 3 will only execute AFTER task 2 completes. Is there a way to accomplish this without setting the disabled property of task 2 at runtime?
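(For concreteness, the layout I have in mind is sketched below. This is just one possible arrangement, assuming the expression lives on the two constraints leaving task 1, and with both of task 3's inbound constraints set to Logical OR so that either path alone is enough:

Task1 --(expression is false)--------------------------------> Task3
Task1 --(expression is true)--> Task2 --(on success)---------> Task3)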
I have 3 data flows connected sequentially in my workflow, ahead of the dimension and fact processing, which check data from different databases; if the conditions are not met, they write to a log table. If anything is written to the log table, I have to quit and finish the workflow. How can I do that?
Hello, I've recently begun exploring Integration Services and have a question regarding a Control Flow element.
I'm trying to understand precedence constraints. Specifically, the logical AND and logical OR components. If I have two constraints A and B, where A is using a logical AND and B is using a logical OR, how are the constraints evaluated? The Evaluation Operation is set to "Expression and Constraint" for both constraints.
If my understanding is correct, path A would be followed if both A AND B are true. Path B would be followed if A OR B is true. Is this correct?
Has anyone seen precedence constraints disappear in a package after closing and opening again? In this case, it's not package-wide. Only constraints inside one Foreach Loop container disappeared. Any idea what causes this to happen?
P.S. Yes, I did Save All. Most of these constraints had been saved in the package for more than a week anyway.
I have a package that executes 2 SQL tasks and then another SSIS package. My precedence constraints between SQL task 1 and SQL task 2, and between SQL task 2 and the SSIS package, check that @ErrorValue == 0.
@ErrorValue is set after each sql task (both of which execute stored procs) with the value of the output parameter for the stored proc. I have run both stored procs on the server and both return 0 in the output parameter.
When I run the package the two sql tasks execute as expected (tasks turn green indicating success) but the package stops at that point and the SSIS package task never executes.
If I remove the second SQL task and connect the first SQL task to the package task with the @ErrorValue == 0 precedence constraint, the package task still does not execute (even though the same constraint worked to get SQL task 2 to execute).
The package only runs if I remove the expression in the second precedence constraint (between sql task 2 and the package) thus changing it to simply 'on success'.
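(For reference, a minimal sketch of the setup described above; the proc name and body are illustrative, not the real code:

CREATE PROCEDURE dbo.Step1 @ErrorValue INT OUTPUT AS
BEGIN
    SET NOCOUNT ON;
    -- the real work happens here
    SET @ErrorValue = 0;  -- 0 = success
END

The output parameter is mapped to the package variable on each Execute SQL Task's Parameter Mapping page with direction Output, and each constraint is set to "Expression and Constraint" with the expression @ErrorValue == 0.)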
Is it possible to use objects in precedence constraints? For example, I have a collection object that may or may not be populated in a script at some point and want to be able to execute different paths depending upon whether the Count property equals zero or not.
I realize I could very easily just create another int variable and populate it accordingly, but I am curious about this. I couldn't get it to work, but then I am really just starting out here and not sure whether I am doing it wrong.
I have about 12 sequence containers mapped out to execute separately based on a precedence constraint and an expression. These lie at the same level and order in my sequence, and only one of them will execute, based on an expression. After any one of these executes, I have a subsequent sequence container that I'm attempting to execute. I've set the precedence constraint from all twelve of the prior sequence containers to this single sequence container that I would like to run after any of the 12. My problem is that the package will only allow one of these twelve sequence containers to act as a predecessor of this second, single sequence container at a time. The package lets me graphically attach the precedence constraint from all 12 to this single sequence container, but when the package runs, it fails to follow through to it. I'm trying to figure out why this is the case and how I can get what I would like to work -- work. Thanks.
I have created a task that uses a ForEachFile loop. Within that loop I load 5 files. Once those 5 files have loaded successfully, I want to run a stored procedure, which I placed into a script task. The script task has 5 precedence constraints; each is a success constraint on the load of one of the files. When I run the task, the 5 files load, but the script task will not start. I have gone so far as to check each load individually to see if it would start the script task, which each of them does. It's only when I add more than 1 precedence constraint that the script task will not run. The precedence constraints are set to AND, so they are all required. They all succeed. What am I doing wrong?
Hey everyone, I have used the Set Values option of the Execute Package Utility many times to change values of variables in the package with no problem before; however, I have run into a stumbling block when coupling this with precedence constraints. I have a package that goes down a different path based on a precedence constraint, and it works perfectly if the default value of a variable is used. However, when I pass in a new value via Set Values in the Execute Package Utility, the package errors out with this:
The expression "@[User::IVR_FLAG] == 1" must evaluate to True or False. Change the expression to evaluate to a Boolean value.
IVR_FLAG is an int with a default value of 1, and if it is left that way (or changed to another value in the package, as long as the default is used), this error does not occur and the package goes down the proper path. But when Set Values is used to change IVR_FLAG at run time, the above error occurs, which doesn't make sense, since the expression should still evaluate to either true or false. Anyone have any thoughts or suggestions? Thanks, Adrian
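(For comparison, the dtexec command-line equivalent of Set Values uses the same property-path syntax; a sketch, with an assumed package file name:

dtexec /F MyPackage.dtsx /SET \Package.Variables[User::IVR_FLAG].Properties[Value];2

As I understand it, the value after the semicolon is passed in as text and converted to the variable's type, which is why I'd expect the expression to keep evaluating to a Boolean.)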
I had a Send Mail task linked to my sequence containers in my package and it was working fine. Every time a container failed, it would send an email to me.
At some point, all failure constraints stopped working. Failure constraints work if I add brand-new tasks, but with the existing tasks they don't. The task that fails turns red and execution stops; the next failure task is not executed.
I am not sure what triggered this to stop working, and I cannot get anything in the log.
This is more a medium-level annoyance than a problem. When I save a package, close it, and later re-open it, the precedence constraints end up all over the place, making my neatly laid-out diagram a shambles.
Call me anally retentive if you like (you won't be the first), but I find this extremely annoying. Does anybody else have this problem? Does anyone have a solution?
I have a variable, a data source, and a Conditional Split transformation that checks count(*). If the count == 0, the flow goes to a Script Component that changes the variable to False (initially it is True) and writes to a log file.
Then I check that variable in a precedence constraint in the workflow: if variable == True, then success. But whenever I run the package, my data flow turns green even when the condition is not met (i.e., count == 0), so my variable's value is "False". Really, if the condition isn't met, my script shouldn't run. Am I missing something?
I know this is probably a flick of a switch, but I cannot figure out which switch. The setup is SQL Server / stored procedures / DAL / BLL (skipped for testing) / PL. The stored procedure queries from only one table, and two columns are ignored because they are being phased out. I can run the stored procedure and preview the data in the DAL, but when I create a page with an ODS linked to the DAL and a GridView, I get this error. I checked every column that does not allow nulls, and they all have values. I checked unique columns (ID is the only unique one, and is Identity=Yes in the table definition). I checked foreign-key columns for values that are not in the foreign table, and there are none. Any ideas why I get this? "Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints."
Hi, I am getting the above error when trying to load a report into my web application. I have tracked the error down to one specific field in my database. Even though this field is an NVarChar field of size 30, there seems to be an issue returning the value from the field. I can write it into the database with no problems, but when I try to get it out of the database it returns the above error. For example: "MOB 401.908.804" - fails; "0401.907.324" - okay; "8239 9082 (pager)" - fails. Anyone got an idea on how to fix this? Regards, Peter.
I've got a DTS job which has lots of tasks in it. I've also got quite a few workflow arrows, and I've noticed that a task won't execute if it has both Failure and Success workflow pointing to it. It can have multiple failures pointing to it and it will execute, but it can't have multiple successes or a combination of workflows.
Does anyone know a way to get around this, or to change the 'AND'-ing that seems to apply to the workflow going into a task?
Now this one I don't think will be fixed by changing the length of a variable.
Anyway, in this nifty DTS package I've created, I have it set so that on the failure of a SQL task, DTS should send me an e-mail letting me know it failed. The SQL statement in the DTS task is "EXEC WEB_Check_Files". In the stored proc I then call RAISERROR with a user-defined error message with severity 10. When I call RAISERROR, the DTS package logs everything just fine but skips past the "on fail" e-mail message. It refuses to execute it and I don't know why. I've spent the last hour or two going through docs looking at RAISERROR. I tried RETURN 1 to see if that would do anything, and nada. I also don't have "Fail package on first error" checked.
Don't know if it matters but I'm executing the package through the "play" button in Enterprise Manager.
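(One thing I've started to suspect since writing this: severity 10 and below is informational in SQL Server and does not put the statement into an error state, so the step can still finish "successfully". A minimal sketch of what I could try instead, raising at severity 11 or higher so the task actually fails:

RAISERROR ('WEB_Check_Files: file check failed.', 16, 1))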
I hope I've done due diligence before posting for help. I've combed a lot of Google and dBforums search results with no luck.
Anywho, this *seems* like it should be simple: I have a Transform Data Task, into which I've introduced deliberate errors, followed by an ActiveX Script Task that is supposed to fire based on the failure of the Data Transform Task. The second task is joined to the first via an "On Failure" workflow step.
The problem: The second task never fires. The first task fails as expected, but the second one just shows the "Not Run" indicator in the package results after executing.
Here is a graphical illustration of the package and results: http://www.bountifulware.com/blogs/rex/dtsproblems.html
I've experimented with the transaction settings in the package properties, as well as the transaction settings in the workflow properties of each step. I don't particularly want the package as a whole to be couched in a transaction, but if that is part of the equation for making the "On Failure" step fire, I'll happily go along. Also, in the Data Transform task with the deliberate errors, I have the max error count set high, as I want the task to continue logging errors for each record that chokes. I've tried various settings there as well, however.
Thank you in advance -- I'll try to contribute more and leech less after this.
I would like to build a workflow system where 100 processes each request an item from a DB table of roughly 1,000,000 items, process that item, and move it to the next state. The problem with the current implementation I tried is that I get deadlocks.
The DB table looks like:
CREATE TABLE Transactions (
    itemid CHAR(32),
    status TINYINT NOT NULL DEFAULT 0,
    result INT NOT NULL DEFAULT 0,
    lockby TINYINT NOT NULL DEFAULT 0,
    .... (etc.)
    PRIMARY KEY (refno)
);
CREATE INDEX IxStatus ON Transactions (status);
Each process (with its own ID) does 4 steps:
1) update transactions set status=1, lockby=<ID>
   from (select top 1 itemid from transactions where status=0) as t1
   where t1.itemid = transactions.itemid
2) select itemid from transactions where status=1 and lockby=<ID>
3) process item
4) update transactions set status=2,result=<RESULT> where itemid=<ITEMID>
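(One pattern I am considering for step 1, to stop two processes claiming the same row: add UPDLOCK and READPAST hints so each process locks the row it takes and skips rows already locked by others. A sketch against the same table, not tested here:

update transactions
   set status = 1, lockby = <ID>
  from (select top 1 itemid
          from transactions with (updlock, readpast)
         where status = 0) as t1
 where t1.itemid = transactions.itemid)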
I know the idea was to separate workflow and data flow, but I have come across a scenario where it would be useful for a branch of a data flow to wait until another branch has finished.
I have some transactional data which records events for the start and end of a session. I want to build a list of unique sessions with the start and end date. I currently have the list of events sorted by time, followed by a conditional split for the start and end events. I can then insert all of the start events, and I would like to wait until all of the starts are inserted before updating them with their relevant end times (see the sketch after these questions).
Is this achievable?
Does anyone else think it would be a good idea to be able to set precendence across multiple branches of a data flow?
Does anyone have a better solution?
I know this is the wrong forum, but is there a way to model this against the transactional data in SSAS? I will move this question to the SSAS forum if anyone thinks this would work!
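(The fallback I have in mind, if a data flow branch can't wait on another: insert only the start events in the data flow, then apply the end times afterwards with an Execute SQL Task. A sketch with hypothetical table and column names:

UPDATE s
   SET s.end_time = e.event_time
  FROM dbo.Sessions AS s
  JOIN dbo.SessionEvents AS e
    ON e.session_id = s.session_id
 WHERE e.event_type = 'End';)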
Can anyone provide a sample for workflow in SSIS, and show how to execute packages conditionally, i.e. if package1 succeeds then execute package2, else execute package3? (Much like writing a batch file in the DataStage ETL tool.)
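(The shape I'm after, with illustrative task names:

ExecutePackage1 --(on success)--> ExecutePackage2
ExecutePackage1 --(on failure)--> ExecutePackage3)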
Is there a way in order to execute a subscribed report based on a certain criteria?
For example, send the report to users when data exists in the report; if no data is returned by the query the report executes, then do not send the report to users.
My current situation is that users say this should not happen: since no pertinent information is contained in the report, why would they receive an email with blank data in it?
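(What I'm describing sounds like a data-driven subscription, where the recipient list comes from a query and nothing is sent when that query returns no rows; I gather that feature depends on the edition. A sketch of such a recipient query, with hypothetical table and column names:

SELECT u.Email
  FROM dbo.ReportUsers AS u
 WHERE EXISTS (SELECT 1 FROM dbo.ReportData AS d);)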
Hello, I have a SQL Server 2000 DTS package in which the first step executes a batch file. The batch file contains FTP commands that log into an FTP server, and pull down whatever file is there.
I set up a failure workflow to send an email if the step fails. When I have a SQL Server job run this package and there is no file to download, the whole package fails without the failure workflow firing.
For the step (DTSStep_DTSCreateProcessTask_1), I have the 'FailPackageOnError' property set to -1. In the package properties, I have the check box for 'Fail Package on First Error' cleared.
What do I need to do so that the failure workflow occurs when the step fails?
Trouble with workflow. Hello. I have a DTS package that executes some tasks of the type "Execute Package Task". Each task has a condition of being executed only in case of success. I understand that only if the precedent task ends successfully will the next one be processed. My problem is that the DTS continues even if one of the previous steps fails. Would you have any idea? My scenario is something like this (EP: Execute Package Task; ES: Execute SQL Task):
EP#1 (on success) ----> EP#2 (on success) ----> ....
I created a DTS package in SQL 2000 using Enterprise Manager. I have defined a SQL task to drop some temporary tables on failure of another SQL task. The same temporary tables also need to be dropped on success of another task. The on-success workflow is working fine, but when I add the on-failure workflow to the table-dropping task, the temp table drop does not get executed at all, either on success or on failure. Please help me out.
I just noticed a strange display problem in one of my SSIS jobs.
I created a job which contains a sequence container. Within the sequence container there is one Execute SQL Task and one Foreach Loop Container. Within the Foreach Loop Container there are 4 tasks which are connected with precedence constraints.
Now my problem: if I load the SSIS job and open the sequence container while the Foreach Loop Container is already open, the precedence constraints won't be displayed. If I close and re-open the Foreach Loop Container, the precedence constraints are displayed again.
My first assumption was that it might be a problem with the display drivers of the computer; however, the problem also appears on every other computer.
Does anyone know how to solve this display problem without closing/reopening the container?
I know this sounds bizarre, but hey... this is DTS, right?
For ALL of my local packages on a particular server, when I open them with DTS designer on my workstation, everything is fine - I can even execute them. When I log onto the server locally, and open them, the workspace is empty!!!
They are stored as local packages. As I said, I can execute them within designer on my workstation, but if I try to run them via dtsrun, I get an error saying "No Steps have been defined for the transformation Package." And when I look at them on the server, that appears to be true.
One last thing (I know you've heard this before)... Everything was working fine last week - that darn Santa.
I have just installed MS Office XP Developer and MS SQL Server 2000, as I would like to use the workflow aspects of this. The problem is that the installation of XP Developer will not install "Workflow Services for SQL Server": it says one needs SQL Server 7.0 SP2, but I have SQL Server 2000 SP2.
Hi, we are currently trying to import the data from one DB to multiple DBs, and these multiple DBs' connection strings must be configurable. I can do this by reading the connection strings from a dtsConfig file and executing for each DB. Is it possible to execute for all the DBs in parallel (multithreaded)?
Hi, not sure my subject title makes it clear what I want but here it is.
I have a workflow which basically looks at an Excel file in a folder on the local drive and then does loads of stuff to it. Every time I want to process a different Excel file in the same location, all I have to do is change the value of a single variable, which is just the name of the Excel file.
Is there a way to make this automatic? For example, could I somehow put my whole workflow inside a loop that looks inside that local folder and, one by one, gets the name of each file, assigns the name to that variable, and then runs the flow, continuing until it gets to the end?
any help would be greatly appreciated...thanks!!!!
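(What I'm hoping exists is something like a Foreach Loop container with a File enumerator: it would iterate the files in that folder and map each file name into my variable on each pass, and the Excel connection manager's ConnectionString could then be driven by an expression. A sketch of such an expression, assuming an .xls file and a hypothetical User::FileName variable holding the full path:

"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::FileName] + ";Extended Properties=\"Excel 8.0;HDR=YES\";")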
Is it possible to use SB (Service Broker) as part of a workflow engine?
SB sends messages backwards and forwards on a conversation, so I am guessing there is some way the application can "work out" which person to send a particular message to?
I've seen examples like an expense form being filled in and sent to a manager; presumably you can specify whether the manager accepts or rejects the expense form?
I know there is Windows Workflow, but I thought using SB might be an alternative?
Hi all, I am looking for any information about using SSIS as a workflow automation (job engine) tool. My company is looking into buying a 3rd party app to do our job scheduling and I think that SSIS could do all that we need. The only issue they found with SSIS is the lack of a GUI/dashboard to view all jobs at once. We need more flexibility than the job scheduler in SQL Server Agent will allow. I have heard that it is possible to build a C# GUI app that can serve as a job engine front page.
What we need is a way to view all jobs in the system and be able to start, stop, and pause jobs manually in a graphical interface. I know of a few companies that are doing this, but I am unable to find anything online about it. My bosses are ready to give SSIS a shot if I can prove that we can build such an interface. Does anyone have first-hand knowledge of such an application, or any tips on where I should look?
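(For what it's worth, if the jobs end up scheduled under SQL Server Agent, the kind of query such a front end might poll is sketched below, using the msdb system tables; the subquery keeps only the latest Agent session:

SELECT j.name, ja.start_execution_date, ja.stop_execution_date
  FROM msdb.dbo.sysjobactivity AS ja
  JOIN msdb.dbo.sysjobs AS j ON j.job_id = ja.job_id
 WHERE ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions);)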