I've got a DTS job which has lots of tasks in it. I've also got quite a few workflow arrows, and I've noticed that a task won't execute if it has both Failure and Success workflows pointing to it... It can have multiple Failures pointing to it and it will execute, but it can't have multiple Successes or a combination of workflow types...
Does anyone know a way to get around this, or to change the 'AND'ing that seems to apply to the workflows going into a task?
Now this one I don't think will be fixed by changing the length of a variable.
Anyway, in this nifty DTS package I've created, I have it set so that on the failure of a SQL task, DTS should send me an e-mail letting me know it failed. The SQL statement in the DTS task is "EXEC WEB_Check_Files". In the stored proc I then call RAISERROR w/ a user-defined error message w/ severity 10. When I call RAISERROR, the DTS package logs everything just kosher but skips past the "on fail" e-mail message. It refuses to execute it and I don't know why. I've spent the last hour or two going through the docs looking at RAISERROR. I tried RETURN 1 to see if that would do anything, and nada. I also don't have "Fail package on first error" checked.
Don't know if it matters but I'm executing the package through the "play" button in Enterprise Manager.
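For what it's worth, a likely culprit here is the severity: RAISERROR at severity 10 or below is treated by SQL Server as an informational message, not an error, so the Execute SQL step still reports success and the "on fail" workflow never gets a chance to fire. A minimal sketch of the difference (message text illustrative):

-- Severity 10: informational only; the calling DTS step still succeeds.
RAISERROR ('WEB_Check_Files: file check failed', 10, 1)

-- Severity 11-16: a genuine error; the step fails, so On Failure workflow fires.
RAISERROR ('WEB_Check_Files: file check failed', 16, 1)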
I hope I've done due diligence before posting for help. I've combed a lot of Google and dBforums search results with no luck.
Anywho, this *seems* like it should be simple: I have a Transform Data Task, into which I've introduced deliberate errors, followed by an ActiveX Script Task that is supposed to fire based on the failure of the Data Transform Task. The second task is joined to the first via an "On Failure" workflow step.
The problem: The second task never fires. The first task fails as expected, but the second one just shows the "Not Run" indicator in the package results after executing.
Here is a graphical illustration of the package and results: http://www.bountifulware.com/blogs/rex/dtsproblems.html
I've experimented with the transaction settings in the package properties, as well as the transaction settings in the workflow properties of each step. I don't particularly want the package as a whole to be couched in a transaction, but if that is part of the equation for making the "On Failure" step fire, I'll happily go along. Also, in the Data Transform task with the deliberate errors, I have the max error count set high, as I want the task to continue logging errors for each record that chokes. I've tried various settings there as well, however.
Thank you in advance -- I'll try to contribute more and leech less after this.
I would like to build a workflow system where 100 processes each request an item from a DB of ~1,000,000 items, process that item, and move it to the next state. The problem with the current implementation I tried is that I get deadlocks...
The DB table looks like:
CREATE TABLE Transactions (
    itemid CHAR(32),
    status TINYINT NOT NULL DEFAULT 0,
    result INT NOT NULL DEFAULT 0,
    lockby TINYINT NOT NULL DEFAULT 0,
    .... (etc.)
    PRIMARY KEY (refno)
);
CREATE INDEX IxStatus ON Transactions (status)
Each process (with its own ID) does 4 steps:
1) update transactions set status=1, lockby=<ID>
   from (select top 1 itemid from transactions where status=0) as t1
   where t1.itemid=transactions.itemid
2) select itemid from transactions where status=1 and lockby=<ID>
3) process item
4) update transactions set status=2,result=<RESULT> where itemid=<ITEMID>
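One pattern that usually avoids these deadlocks is to claim and read the item in a single statement with locking hints, so two workers never block each other on the same row. A sketch, assuming SQL Server 2005 or later for the OUTPUT clause (on 2000, SET ROWCOUNT 1 plus the same hints is the usual substitute); @ID stands for the worker's own ID as above:

DECLARE @claimed TABLE (itemid CHAR(32));

-- READPAST skips rows already locked by other workers instead of waiting on
-- them, and OUTPUT returns the claimed itemid, collapsing steps 1 and 2.
UPDATE TOP (1) Transactions WITH (ROWLOCK, READPAST)
SET    status = 1, lockby = @ID
OUTPUT inserted.itemid INTO @claimed
WHERE  status = 0;

SELECT itemid FROM @claimed;   -- step 3 processes this item; step 4 is unchanged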
I know the idea was to separate workflow and dataflow, but I have come across a scenario where it would be useful for one branch of a data flow to wait until another branch has finished.
I have some transactional data which records events for the start and end of a session. I want to build a list of unique sessions with the start and end date. I currently have the list of events sorted by time, followed by a conditional split for the start and end events. I can then insert all of the start events and would like to wait until all of the starts are inserted before updating them with their relevant end times.
Is this achievable?
Does anyone else think it would be a good idea to be able to set precedence across multiple branches of a data flow?
Does anyone have a better solution?
I know this is the wrong forum, but is there a way to model this against the transactional data in SSAS? I will move this question to the SSAS forum if anyone thinks this would work!
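As an aside, the start/end pairing described above can also be done set-based after the data flow: land both event types in a staging table and let a follow-on Execute SQL Task do the matching, which sidesteps the need for one branch to wait on the other. A sketch where all table and column names are illustrative:

-- Insert a session row for every start event...
INSERT INTO Sessions (SessionID, StartTime)
SELECT SessionID, EventTime
FROM   StagedEvents
WHERE  EventType = 'Start';

-- ...then stamp each session with its matching end event.
UPDATE s
SET    s.EndTime = e.EventTime
FROM   Sessions AS s
JOIN   StagedEvents AS e
       ON e.SessionID = s.SessionID
WHERE  e.EventType = 'End';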
Can anyone provide a sample for workflow in SSIS? And how do I execute packages conditionally, e.g. if package1 succeeds then execute package2, else execute package3 (much like writing a batch file in the DataStage ETL tool)?
Hello, I have a SQL Server 2000 DTS package in which the first step executes a batch file. The batch file contains FTP commands that log into an FTP server, and pull down whatever file is there.
I set up a failure workflow to send an email if the step fails. When I have a SQL Server job run this package and there is no file to download, the whole package fails without the failure workflow result firing.
For the step (DTSStep_DTSCreateProcessTask_1), I have the 'FailPackageOnError' property set to -1. In the package properties, I have the check box for 'Fail Package on First Error' cleared.
What do I need to do so that the failure workflow occurs when the step fails?
Trouble with Workflow: Hello. I have a DTS package that executes some tasks of the type "Execute Package Task". Every task has the condition of being executed only in case of success. I understand that only if the precedent task ends successfully will the next one be processed. My problem is that the DTS continues even if one of the previous steps fails. Would you have any idea? My scenario is something like this (EP: Execute Package Task; ES: Execute SQL Task):
EP#1 (on success) ----> EP#2 (on success) ----> ....
I have 3 data flows connected sequentially in my workflow, before processing dimensions and facts, which check data from different databases; if the conditions are not met, they write to a log table. So if one writes to the log table, I have to quit and finish the workflow. How can I do that?
I created a DTS package in SQL 2000 using Enterprise Manager. I have defined a SQL task to drop some temporary tables on failure of another SQL task. The same temporary tables also need to be dropped on success of another task. The on-success workflow works fine, but when I add the on-failure workflow to the temporary-table-dropping task, the drop doesn't get executed at all, for either success or failure. Please help me out.
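This matches DTS's habit of AND-ing all inbound workflow arrows: a step with both an On Success and an On Failure constraint can never have both satisfied, so it never runs. A common workaround is two copies of the same idempotent cleanup task, one per arrow; a sketch, with the table name illustrative:

-- Safe to run from either branch, and harmless if the table is already gone.
IF OBJECT_ID('dbo.Staging_Temp') IS NOT NULL
    DROP TABLE dbo.Staging_Temp;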
I just noticed a strange display problem in one of my SSIS jobs.
I created a job which contains a sequence container. Within the sequence container there is one "Execute SQL Task" and one "Foreach Loop Container". Within the "Foreach Loop Container" there are 4 tasks which are connected with precedence constraints.
Now my problem is that if I load the SSIS job and open the sequence container while the "Foreach Loop Container" is already open, then the precedence constraints won't be displayed. If I close and open the "Foreach Loop Container" again, then the precedence constraints are displayed again.
My first assumption was that it might be a problem with the display drivers of the computer... however, the problem also appears on other computers.
Does anyone know how to solve this display problem without closing/reopening the container?
I know this sounds bizarre, but hey... this is DTS, right?
For ALL of my local packages on a particular server, when I open them with DTS designer on my workstation, everything is fine - I can even execute them. When I log onto the server locally, and open them, the workspace is empty!!!
They are stored as local packages. As I said, I can execute them within designer on my workstation, but if I try to run them via dtsrun, I get an error saying "No Steps have been defined for the transformation Package." And when I look at them on the server, that appears to be true.
One last thing (I know you've heard this before)... Everything was working fine last week - that darn Santa.
I have just installed MS Office XP Developer and MS SQL Server 2000, as I would like to use the workflow aspects of this. The problem is that the XP Developer installation will not install "Workflow Services for SQL Server": it says one needs SQL Server 7.0 SP2, but I have SQL Server 2000 SP2.
Hi, we are currently trying to import data from one DB into multiple DBs, and these multiple DBs' connection strings must be configurable. I can do this by reading the connection strings from a dtsconfig file and executing the import for each DB. Is it possible to execute for all the DBs in parallel (multithreaded)?
Hi, not sure my subject title makes it clear what I want but here it is.
I have a workflow which basically looks at an Excel file in a folder on the local drive and then does loads of stuff to it. Every time I want to process a different Excel file in that same location, all I have to do is change the value of a single variable, which is just the name of the Excel file.
Is there a way to make this automatic? For example... could I somehow put my whole workflow inside a loop that looks inside that local folder and, one by one, gets the name of each file, assigns it to that variable, and then runs the flow... and continues to do that until it gets to the end?
any help would be greatly appreciated...thanks!!!!
Is it possible to use SB as part of a workflow engine?
SB sends messages backwards and forwards on a conversation, so I am guessing that there is some way the application can "work out" which person to send a particular message to?
I've seen examples like an expense form being filled in and sent to a manager, presumably you can specify if the manager accepts or rejects the expense form?
I know there is Windows Workflow, but I thought using SB might be an alternative?
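For reference, the routing itself is just conversations between services; it's the application reading each queue that decides what "accept" or "reject" means for the expense form. A minimal sketch under the DEFAULT contract, with all queue and service names illustrative:

CREATE QUEUE SubmitterQueue;
CREATE QUEUE ApproverQueue;
CREATE SERVICE SubmitterService ON QUEUE SubmitterQueue;
CREATE SERVICE ApproverService  ON QUEUE ApproverQueue ([DEFAULT]);

-- The submitter opens a conversation and sends the expense form.
DECLARE @dialog UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE SubmitterService
    TO SERVICE 'ApproverService'
    ON CONTRACT [DEFAULT]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @dialog ('<expense id="42" amount="100.00" />');

-- The approver's app RECEIVEs from ApproverQueue, decides, and SENDs an
-- accept/reject message back on the same conversation handle.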
I have three control flow tasks that are executing in consecutive order. Tasks 1 and 3 will always execute, but sometimes (based on an expression) task 2 will not. I would like to use precedence constraints in such a way that task 3 will execute regardless of whether task 2 executes, but in the event task 2 does execute, task 3 will only execute AFTER task 2 completes. Is there a way to accomplish this without setting the disabled property of task 2 at runtime?
Hi all, I am looking for any information about using SSIS as a workflow automation (job engine) tool. My company is looking into buying a 3rd party app to do our job scheduling and I think that SSIS could do all that we need. The only issue they found with SSIS is the lack of a GUI/dashboard to view all jobs at once. We need more flexibility than the job scheduler in SQL Server Agent will allow. I have heard that it is possible to build a C# GUI app that can serve as a job engine front page.
What we need is a way to view all jobs in the system and be able to start, stop, and pause all jobs manually in a graphical interface. I know of a few companies that are doing this, but I am unable to find anything online about it. My bosses are ready to give SSIS a shot if I can prove that we can build such an interface. Does anyone have any first-hand knowledge of such an application, or any tips on where I should look?
Is it possible to import the SSIS workflow diagram into Visio? Is there a way to extract the images and import them into SSIS for use in creating a workflow, or better yet, is there a way to re-engineer the workflow in SSIS with Visio?
For an initial step, I would like to re-create the SSIS workflow in Visio for presentations.
I am pulling my hair out, having built my first application using WWF. I have moved the application off the development machine to a machine ready for production, so the persistence store will now be on a remote server.
I cannot create a connection string for the persistence store that will allow me to attach to the .mdf on the remote server.
the original connection string in development was:
Data Source=.\SQLEXPRESS;AttachDbFilename='C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WorkflowPersistenceStore.mdf';Integrated Security=True;Connect Timeout=30;User Instance=False
I cannot create an equivalent string that does not return an error like this one:
An attempt to attach an auto-named database for file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WorkflowPersistenceStore.mdf' failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
I did simply copy the WorkFlowPersistenceStore.mdf from the development machine to the server machine (completely independent environments). It has worked in the past for other non-WWF databases; is there something particular about the WWF database that would have prevented this?
I have tried making the persistence store permanently attached (in the development environment) but this returns a different error that implies that a permanently attached database is not allowed - so I assume that the persistence service HAS to have a detached database - Am I correct?
I have tried a UNC name that is the same on both the server machine and the client machine, e.g. \\MACHINE\wwf\WorkflowPersistenceStore.mdf - but that still returns an error - is that not allowed? Does the file path for an attached database have to be recognisable by the client machine, the server machine, or both?
MUST the User Instance be True or False, or does it not really matter?
It is probably obvious that I am not confident in the whole area of connection strings and the effect of different parameters and the truth is I am not certain that I have not completely confused the SQL Server with the various attempts I have made.
Is there someone who could confirm that I am in the right area and could give me an example of what SHOULD work in a situation I have described so that I know that I am roughly right and to keep trying or I am barking up the wrong tree?
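For what it's worth, the persistence store does not have to be a detached user-instance file: the usual pattern is to create the database on the server once (WF ships schema and logic scripts for this) and point the persistence service at it by catalog name rather than file path. A sketch of such a connection string, with the server name illustrative:

Data Source=MYSERVER;Initial Catalog=WorkflowPersistenceStore;Integrated Security=True;Connect Timeout=30

AttachDbFilename and User Instance=True are really a local SQL Express convenience; with a remote server, the file path has to make sense to the remote SQL Server service itself, which is why the auto-attach attempts above fail.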
I have a "Execute DTS Package 2000 " task in SSIS. The SQL 2000 DTS has one task which precedence is "completion". Using SQL2000 it works properly, but when I invoke it from SSIS it doesn€™t respect the precedence. So, when the task above fails it ends the DTS execution.
Is it possible to configure the task to respect the precedences?
Today MDS can send emails only when there are validation failures. I need to send an email to some users when a value changes, but I don't want to mark the record as failed; it's not a validation. So I'm looking for a custom workflow to do this. Are there libraries of workflows available?
Are there restrictions on what a DBA can do on a Developer Edition?
I'm asking because I get really bad connection problems when changing service accounts, client aliases, and port numbers, and when connecting to different components like SSIS, Reporting Services, the database engine, etc...
In SQL 2000, changing these things is a breeze; are there some guidelines for making these changes in 2005?
Just want to know whether you guys do the following as a DBA:
1. Set up logging to track database and table sizes (data size, index size), so you can measure growth.
2. Record the indexing for each database. (I think this is an overkill task.)
3. Record database settings such as create statistics, update statistics, etc.
These tasks could be automated to run every month, for example, and record the results into an Administration database.
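For the first item, a scheduled snapshot is straightforward; a sketch assuming SQL Server 2005+ catalog views, with the Administration.dbo.DbSizeHistory table name illustrative:

CREATE TABLE Administration.dbo.DbSizeHistory (
    CapturedAt DATETIME      NOT NULL,
    DbName     SYSNAME       NOT NULL,
    SizeMB     DECIMAL(18,2) NOT NULL
);

-- Run monthly from a job: one row per database, data and log files combined
-- (size is in 8 KB pages, hence the * 8 / 1024 conversion to MB).
INSERT INTO Administration.dbo.DbSizeHistory (CapturedAt, DbName, SizeMB)
SELECT GETDATE(), DB_NAME(database_id), SUM(size) * 8 / 1024.0
FROM   master.sys.master_files
GROUP  BY database_id;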
If I'm in the Data Flow tab in VS 2005, how can I select only certain components to run for testing? I tried highlighting the ones that I want to run, but it runs all of them in the tab... Some of the components I want to take out for testing and then maybe put back in later, but if I delete the tasks I don't want to run, I end up having to recreate them.
Where do I look for the Web Assistant-created jobs on the server? I created a web page using the Web Assistant which is supposed to get updated each time a value changes in a particular table, but I can't find the task which I created. Where do I look for that? I hadn't used this wizard before. Thanks for any help! Sheila.
Would anyone be aware of any place I could find some good information on creating DTS custom tasks? I've come across a couple of articles from SQL Server Magazine, but nothing too substantive... Better yet, if anyone has had any success (great or small), I'd like to hear from you: what your custom task does, what you did, and how difficult it was. I'm just trying to get an idea of how much work I have ahead of me...
If I have 2 scheduled tasks set for the same time (perhaps accidentally), will the SQL Executive start one and queue the other until the first is complete, and then run the 2nd task? Or will they both be started simultaneously?
I have been running the following production job successfully for a long time. It now fails, and the Task History Last Error Message displays 'No Message'. The log file (C:\MSSQL\LOG\Maint_TombV50.txt) shows it ran successfully, with a Return Code 0.
To all: if I have a scheduled task that is owned by 'sa', how can I assign permissions to allow another user, even the database dbo, to register the SQL Server and view the scheduled tasks?
Hello, I need to create tasks which will wake up daily to back up 4 databases. My questions are:
1. Can I create 1 task to back up all the databases using T-SQL commands? For example:
   DUMP DATABASE test1 TO test1_backup
   DUMP TRANSACTION test1 WITH TRUNCATE_ONLY
   DUMP DATABASE test2 TO test2_backup
   DUMP TRANSACTION test2 WITH TRUNCATE_ONLY
   etc.
If that's possible, can I just put these commands into the 'Command:' text box in the 'New Task' window, without writing the T-SQL into a text file and executing it with the ISQL program through CmdExec?
2. If the first doesn't work: can I do the same job but create a separate task for each backup, running the same T-SQL command for the particular database? And if I can, is it possible to schedule all 4 tasks to run at the same time, or should I put an interval between them?
3. If 1 and 2 are both false, give me your smart advice.
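For what it's worth, a single task's Command box can hold several statements like the above, so one daily task covering all 4 databases is fine. On later releases the equivalent BACKUP syntax would be (device names as in the question):

BACKUP DATABASE test1 TO test1_backup
BACKUP LOG test1 WITH TRUNCATE_ONLY
BACKUP DATABASE test2 TO test2_backup
BACKUP LOG test2 WITH TRUNCATE_ONLY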
Does anybody know how to transfer all the tasks on one server to another? Our development database will be transferred/copied to a new production box, and that includes all the tasks we've created. We have almost a hundred tasks defined and we don't want to create them manually. If someone has done this before, please give me a hint; I'd appreciate it very much!