Hi all,
I am looking for any information about using SSIS as a workflow automation (job engine) tool. My company is looking into buying a third-party app to do our job scheduling, and I think that SSIS could do all that we need. The only issue we have found with SSIS is the lack of a GUI/dashboard for viewing all jobs at once; we need more flexibility than the job scheduler in SQL Server Agent will allow. I have heard that it is possible to build a C# GUI app that can serve as a job engine front end.
What we need is a way to view all jobs in the system and to start, stop, or pause any job manually from a graphical interface. I know of a few companies that are doing this, but I am unable to find anything online about it. My bosses are ready to give SSIS a shot if I can prove that we can build such an interface. Does anyone have any firsthand knowledge of such an application, or any tips on where I should look?
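For what it's worth, even if Agent's scheduler itself is too limiting, the start/stop/monitor plumbing a C# dashboard would wrap is already exposed through msdb. A minimal sketch of the calls behind a "view all / start / stop" front end, assuming the SSIS packages run as Agent jobs (the job name is made up):

-- Enumerate all jobs and their current status, for the dashboard grid
EXEC msdb.dbo.sp_help_job;
-- Start or stop a job by name behind the GUI's buttons
EXEC msdb.dbo.sp_start_job @job_name = 'NightlyLoad';
EXEC msdb.dbo.sp_stop_job @job_name = 'NightlyLoad';

Pause is the one with no direct equivalent; you would have to model it yourself (e.g., disable the job with sp_update_job @job_name = 'NightlyLoad', @enabled = 0).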
I have recently become a release manager for SSRS in our company. Since then I've been swamped with requests to migrate reports, permissions, and subscription lists from the development environment to production.
Each time I have to do it manually with a lot of clicks. It is a real pain...
So, maybe... maybe there is an automation tool out there to help me? Does anybody know?
This tool or software package should move a report file along with its permissions and subscription lists from one server to another.
Can anyone provide a sample workflow in SSIS, and explain how to execute packages conditionally? For example: if package1 succeeds then execute package2, else execute package3 (much like writing a batch file in the DataStage ETL tool).
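In the SSIS designer, this if/else is exactly what Success and Failure precedence constraints between Execute Package tasks express. For the batch-file flavor, here is a rough T-SQL sketch driven by dtexec exit codes, assuming the packages live on the file system and xp_cmdshell is enabled (the paths are made up; dtexec exits 0 on success):

DECLARE @rc INT;
-- Run package1 and capture the dtexec exit code via xp_cmdshell's return value
EXEC @rc = master.dbo.xp_cmdshell 'dtexec /F "C:\Packages\Package1.dtsx"', no_output;
IF @rc = 0
    EXEC master.dbo.xp_cmdshell 'dtexec /F "C:\Packages\Package2.dtsx"', no_output;
ELSE
    EXEC master.dbo.xp_cmdshell 'dtexec /F "C:\Packages\Package3.dtsx"', no_output;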
When I move - but do not migrate - packages to 2005, this still works. I don't have a problem with permissions or any of the other problems I have seen posted in this forum. But if I migrate a package to SSIS, the package now appears in sysdtspackages90 instead of sysdtspackages (assuming I remove the old version), and "LoadFromSQLServer(...)" can't find it anymore. So it seems like one of the following is true:
a) There is a different ProgID I should use in sp_OACreate besides DTS.Package. I tried SSIS.Package and DTS.Package90, which seemed like reasonable possibilities, but they don't work. Is there something that will work? Or is "LoadFromSQLServer(...)" not supported for SSIS packages?
b) OLE Automation doesn't work with SSIS packages at all. OLE Automation is enabled, and it works on non-migrated packages. Did Microsoft decide that this capability was not needed in SSIS?
I see numerous postings saying not to use the sp_OA routines. I understand there are other, possibly more secure ways to execute an SSIS package from a stored procedure, but I need to know whether it is an absolute requirement to use those other methods.
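For reference, here is the classic pattern that still works for me on the non-migrated packages (server and package names changed; flag 256 is DTSSQLStgFlag_UseTrustedConnection, and error checks on @hr are omitted for brevity):

DECLARE @hr INT, @pkg INT;
EXEC @hr = sp_OACreate 'DTS.Package', @pkg OUTPUT;
EXEC @hr = sp_OAMethod @pkg, 'LoadFromSQLServer', NULL,
    @ServerName = 'MYSERVER', @ServerUserName = '', @ServerPassword = '',
    @Flags = 256, @PackageName = 'MyPackage';
EXEC @hr = sp_OAMethod @pkg, 'Execute';
EXEC sp_OADestroy @pkg;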
Is it possible to import the SSIS workflow diagram into Visio? Is there a way to extract the images and import them into SSIS for use in creating a workflow, or better yet, is there a way to re-engineer the workflow in SSIS with Visio?
As an initial step, I would like to re-create the SSIS workflow in Visio for presentations.
I have a "Execute DTS Package 2000 " task in SSIS. The SQL 2000 DTS has one task which precedence is "completion". Using SQL2000 it works properly, but when I invoke it from SSIS it doesn€™t respect the precedence. So, when the task above fails it ends the DTS execution.
Is it possible to configure the task to respect the precedences?
I am currently doing Microsoft's hands-on training tutorial for SSIS. I have realized that it's not as comprehensive an ETL tool as it was advertised to me; I see it more as a packaging tool for all imports and exports, which include DML scripts.
In my scenario I have to convert 10 source data tables to destination schema tables. I normally use DML scripts to convert the data; once I have the mapping between the source and destination tables, it does not take more than an hour to write the DML scripts. But when I try to use SSIS, it takes me more time to create the data flow tasks and build a package. Also, does SSIS support all the SQL Server 2005 string functions?
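The middle ground I have been trying is to keep the set-based conversion as plain DML and let SSIS supply only the workflow, with one Execute SQL Task per script. A made-up example of the kind of statement each task would hold:

-- One conversion per Execute SQL Task; table and column names are invented
INSERT INTO dbo.DimCustomer (CustomerKey, FullName)
SELECT c.CustomerID,
       LTRIM(RTRIM(c.FirstName)) + ' ' + LTRIM(RTRIM(c.LastName))
FROM staging.Customer AS c;

Since an Execute SQL Task runs its statement on the server, the full T-SQL 2005 string function set is available there; it is the SSIS expression language in Derived Column transformations that has its own, smaller set.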
We have a web application using SQL Server 2005 to store the data and serve it to forms built with .NET 2.0. The forms are used by customers to enter fifteen or so items of data. An important part of what the app does is to validate these 15 items against data brought in from the main application, which is in Access 2003 (for example: did the customer enter a valid part #, did they actually purchase that part number in the qty stated, etc.).
I want to schedule a nightly copy of the lookup data from Access (two different databases) into tables in SQL Server. The copy process would simply overwrite the existing lookup data in SQL Server with the new data each night. My guess is that about 250 MB of data would be copied. The two databases are identical in structure but different in data; they are used in two different countries.
The process as I see it would be to:
1. Delete the existing lookup data.
2. Copy in the data from database 1.
3. Copy in the data from database 2.
I need to use something with enough programming power to resolve issues with primary keys (e.g., not copying the Access table columns which contain them), and perhaps to populate a column in the SQL Server lookup tables indicating which Access database a given record came from.
If we use SSIS, then I will need to learn it, which is great as long as I am going down the right path in the first place.
Would SSIS be the right tool? Is there something more appropriate? Is there a good tutorial?
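To make the question concrete, here is roughly what the nightly refresh would look like scripted directly in T-SQL with OPENROWSET against the Access files. This is a sketch assuming the Jet OLE DB provider is on the server and the 'Ad Hoc Distributed Queries' option is enabled; file, table, and column names are made up:

-- 1. Delete the existing lookup data
TRUNCATE TABLE dbo.PartsLookup;
-- 2. and 3. Copy in each country's data, tagging which Access database it came from
INSERT INTO dbo.PartsLookup (PartNumber, QtyPurchased, SourceDb)
SELECT PartNumber, QtyPurchased, 'Country1'
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
     'C:\Data\Country1.mdb'; 'Admin'; '',
     'SELECT PartNumber, QtyPurchased FROM Parts');
INSERT INTO dbo.PartsLookup (PartNumber, QtyPurchased, SourceDb)
SELECT PartNumber, QtyPurchased, 'Country2'
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
     'C:\Data\Country2.mdb'; 'Admin'; '',
     'SELECT PartNumber, QtyPurchased FROM Parts');

An SSIS package would express the same three steps as an Execute SQL Task followed by two data flows, which buys logging and per-row error handling as the volume grows.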
At our company we are considering building an architecture for file imports and processing, and are evaluating both BizTalk and SSIS.
My understanding from reading the material out there is that BizTalk is more suited to integrating applications and real-time communication of information, while SSIS is more suited to bulk loads into databases/data warehouses and to data manipulation.
Currently we are somewhat along the lines described above, but there is a desire to standardize on one technology for importing files and data manipulation, and I am not sure that is practical. There is also a debate as to whether BizTalk handles logic better than SSIS.
I have read the article on the Microsoft site that outlines the above - http://www.microsoft.com/technet/prodtechnol/biztalk/2004/whitepapers...
However, I would like input from people who have actually used both of these tools in the real world for ETL processes and could provide some insight to help us make an informed decision.
I am trying to migrate our processing from command-line scripts and FoxPro to SQL Server, so I need to run the SSIS packages using dtexec. I copied the dtexec executable and a few missing DLLs to our production servers, but I can't execute the packages. I don't want to install the full client tools (particularly Management Studio/Business Intelligence Development Studio) on our production servers due to the overhead and limited system disk space.
Can somebody tell me what the minimum install would be so I would be able to run SSIS packages using the dtexec or dtexecui tools? I would also like to install some of the other command-line client tools, like osql.
Hi, I am less of a technical person and more of an analyst, and right now I am investigating various tools/options for a new conversion project I will be leading for an insurance client. One of the tools the client wants to use is SSIS, but the source and target databases are not on SQL Server; the plan is to build a staging SQL Server database for transformation. Does SSIS support this kind of ETL process, where both the source and target systems are non-SQL Server?
I've got a DTS job which has lots of tasks in it. I've also got quite a few workflow arrows, and I've noticed that a task won't execute if it has both Failure and Success workflow pointing to it... It can have multiple failures pointing to it and it will execute, but it can't have multiple successes or a combination of workflow types...
Does anyone know a way to get around this, or to change the 'AND'-ing that seems to apply to the workflow going into a task?
Now this one I don't think will be fixed by changing the length of a variable.
Anyway, in this nifty DTS package I've created, I have it set up so that on the failure of a SQL task, DTS should send me an e-mail letting me know it failed. The SQL statement in the DTS task is "EXEC WEB_Check_Files". In the stored proc I then call RAISERROR with a user-defined error message at severity 10. When I call RAISERROR, the DTS package logs everything just fine but skips past the "on fail" e-mail message; it refuses to execute it and I don't know why. I've spent the last hour or two going through the docs looking at RAISERROR. I tried RETURN 1 to see if that would do anything, and nada. I also don't have "Fail package on first error" checked.
I don't know if it matters, but I'm executing the package through the "play" button in Enterprise Manager.
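One detail that may be the culprit: severities 1 through 10 are informational messages in SQL Server, not errors, so a RAISERROR at severity 10 completes the statement successfully as far as the client (here, the DTS task) is concerned. A sketch of what the proc would need to raise for the task to actually fail and the "on fail" workflow to get a chance to fire:

-- Severity 11 or higher is treated as a real error by the calling task
RAISERROR('WEB_Check_Files: required file check failed', 16, 1)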
I hope I've done due diligence before posting for help. I've combed a lot of Google and dBforums search results with no luck.
Anywho, this *seems* like it should be simple: I have a Transform Data Task, into which I've introduced deliberate errors, followed by an ActiveX Script Task that is supposed to fire on the failure of the Transform Data Task. The second task is joined to the first via an "On Failure" workflow step.
The problem: the second task never fires. The first task fails as expected, but the second one just shows the "Not Run" indicator in the package results after execution.
Here is a graphical illustration of the package and results: http://www.bountifulware.com/blogs/rex/dtsproblems.html
I've experimented with the transaction settings in the package properties, as well as the transaction settings in the workflow properties of each step. I don't particularly want the package as a whole to be couched in a transaction, but if that is part of the equation for making the "On Failure" step fire, I'll happily go along. Also, in the Data Transform task with the deliberate errors, I have the max error count set high, as I want the task to continue logging errors for each record that chokes. I've tried various settings there as well, however.
Thank you in advance -- I'll try to contribute more and leech less after this.
I would like to build a workflow system where 100 processes each request an item from a DB of ~1,000,000 items, process that item, and move it to the next state. The problem with the current implementation I tried is that I get deadlocks...
The DB table looks like:
CREATE TABLE Transactions (
    itemid CHAR(32),
    status TINYINT NOT NULL DEFAULT 0,
    result INT NOT NULL DEFAULT 0,
    lockby TINYINT NOT NULL DEFAULT 0,
    .... (etc.)
    PRIMARY KEY (refno)
);
CREATE INDEX IxStatus ON Transactions (status);
Each process (with its own ID) does four steps:
1) update transactions set status=1, lockby=<ID>
   from (select top 1 itemid from transactions where status=0) as t1
   where t1.itemid = transactions.itemid
2) select itemid from transactions where status=1 and lockby=<ID>
3) process item
4) update transactions set status=2,result=<RESULT> where itemid=<ITEMID>
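The alternative I am considering is to collapse steps 1 and 2 into a single atomic claim with locking hints, so each worker skips rows the others already hold locks on. A sketch in SQL Server 2005 syntax, not yet tested under load:

DECLARE @id TINYINT;
SET @id = 7; -- this worker's ID
-- Claim one unprocessed row and return its key in the same statement;
-- READPAST skips locked rows instead of blocking on them
UPDATE TOP (1) Transactions WITH (READPAST, UPDLOCK, ROWLOCK)
SET status = 1, lockby = @id
OUTPUT inserted.itemid
WHERE status = 0;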
I know the idea was to separate workflow and data flow, but I have come across a scenario where it would be useful for one branch of a data flow to wait until another branch has finished.
I have some transactional data which records events for the start and end of a session. I want to build a list of unique sessions with their start and end dates. I currently have the list of events sorted by time, followed by a conditional split on the start and end events. I can then insert all of the start events, and I would like to wait until all of the starts are inserted before updating them with their relevant end times.
Is this achievable?
Does anyone else think it would be a good idea to be able to set precedence across multiple branches of a data flow?
Does anyone have a better solution?
I know this is the wrong forum, but is there a way to model this against the transactional data in SSAS? I will move this question to the SSAS forum if anyone thinks this would work!
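The fallback I can see is to stop synchronizing the branches inside the data flow: let the data flow insert the start events, then do the end times as a set-based update in an Execute SQL Task that runs after the data flow completes. A sketch with made-up table and column names:

-- Load the start events first (today's insert branch)
INSERT INTO dbo.Sessions (SessionId, StartTime)
SELECT SessionId, EventTime FROM dbo.Events WHERE EventType = 'start';
-- Then close each session from its matching end event
UPDATE s
SET s.EndTime = e.EventTime
FROM dbo.Sessions AS s
JOIN dbo.Events AS e
  ON e.SessionId = s.SessionId AND e.EventType = 'end';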
Hello, I have a SQL Server 2000 DTS package in which the first step executes a batch file. The batch file contains FTP commands that log into an FTP server and pull down whatever file is there.
I set up a failure workflow to send an email if the step fails. When a SQL Server job runs this package and there is no file to download, the whole package fails without the failure workflow firing.
For the step (DTSStep_DTSCreateProcessTask_1), I have the 'FailPackageOnError' property set to -1. In the package properties, I have the 'Fail Package on First Error' check box cleared.
What do I need to do so that the failure workflow occurs when the step fails?
Trouble with Workflow: Hello. I have a DTS package that executes several "Execute Package" tasks. Each task has a condition to execute only on success, and I understand that the next task should be processed only if the precedent task ends successfully. My problem is that the DTS continues even if one of the previous steps fails. Would you have any idea? My scenario is something like this (EP = Execute Package task, ES = Execute SQL task):
EP#1 (on success) ----> EP#2 (on success) ----> ....
I have three data flows connected sequentially in my workflow before processing dimensions and facts. They check data from different databases and, if the conditions are not met, write to a log table. So if anything is written to the log table, I have to quit and finish the workflow. How can I do that?
I created a DTS package in SQL 2000 using Enterprise Manager. I have defined a SQL task to drop some temporary tables on failure of another SQL task; the same temporary tables also need to be dropped on success of another task. The on-success workflow works fine, but when I add the on-failure workflow to the temp-table-dropping task, the drop does not get executed at all, either on success or on failure. Please help me out.
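From what I have read since, DTS ANDs multiple inbound precedence constraints, so a single task fed by both a success arrow and a failure arrow can never run (both conditions cannot be true at once). The workaround seems to be one copy of the cleanup task per branch, with an idempotent drop so running either copy is safe (table names are made up):

-- Safe to run from either the success or the failure branch
IF OBJECT_ID('dbo.tmp_Import1') IS NOT NULL DROP TABLE dbo.tmp_Import1;
IF OBJECT_ID('dbo.tmp_Import2') IS NOT NULL DROP TABLE dbo.tmp_Import2;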
I just noticed a strange display problem in one of my SSIS jobs.
I created a job which contains a sequence container. Within the sequence container there is one "Execute SQL Task" and one "Foreach Loop Container". Within the "Foreach Loop Container" there are four tasks connected with precedence constraints.
Now my problem is that if I load the SSIS job and open the sequence container while the "Foreach Loop Container" is already open, the precedence constraints are not displayed. If I close and reopen the "Foreach Loop Container", the precedence constraints are displayed again.
My first assumption was that it might be a problem with the computer's display drivers... however, the problem also appears on every other computer.
Does anyone know how to solve this display problem without closing/reopening the container?
I know this sounds bizarre, but hey... this is DTS, right?
For ALL of my local packages on a particular server: when I open them with DTS Designer on my workstation, everything is fine - I can even execute them. But when I log onto the server locally and open them, the workspace is empty!!!
They are stored as local packages. As I said, I can execute them within designer on my workstation, but if I try to run them via dtsrun, I get an error saying "No Steps have been defined for the transformation Package." And when I look at them on the server, that appears to be true.
One last thing (I know you've heard this before)... Everything was working fine last week - that darn Santa.
I have just installed MS Office XP Developer and MS SQL Server 2000, as I would like to use the workflow aspects of these. The problem is that the Office XP Developer installation will not install "Workflow Services for SQL Server"; it says one needs SQL Server 7.0 SP2, but I have SQL Server 2000 SP2.
Hi, we are currently trying to import data from one DB into multiple DBs, and the connection strings for these multiple DBs must be configurable. I can do this by reading the connection strings from a dtsconfig file and executing the package for each DB. Is it possible to execute it for all the DBs in parallel (multithreaded)?
Hi, I'm not sure my subject title makes clear what I want, but here it is.
I have a workflow which basically looks at an Excel file in a folder on the local drive and then does loads of stuff to it. Every time I want to process a different Excel file in the same location, all I have to do is change the value of a single local variable, which is just the name of the Excel file.
Is there a way to make this automatic? For example... could I somehow put my whole workflow inside a loop that looks inside that local folder and, one by one, gets the name of each file, assigns it to that variable, and then runs the flow... and continues doing that until it gets to the end?
any help would be greatly appreciated...thanks!!!!
Is it possible to use Service Broker (SB) as part of a workflow engine?
SB sends messages back and forth on a conversation, so I am guessing there is some way the application can "work out" which person to send a particular message to?
I've seen examples like an expense form being filled in and sent to a manager; presumably you can specify whether the manager accepts or rejects the expense form?
I know there is Windows Workflow, but I thought using SB might be an alternative?
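The plumbing I picture for the expense-form example is something like the sketch below, with all names invented. SB itself only gives ordered, reliable delivery between the queues; the routing and the accept/reject decision live in whatever activation procedure or application reads ManagerQueue:

CREATE MESSAGE TYPE [//Expense/Submit] VALIDATION = WELL_FORMED_XML;
CREATE MESSAGE TYPE [//Expense/Decision] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Expense/Approval]
    ([//Expense/Submit] SENT BY INITIATOR,
     [//Expense/Decision] SENT BY TARGET);
CREATE QUEUE EmployeeQueue;
CREATE QUEUE ManagerQueue;
CREATE SERVICE EmployeeService ON QUEUE EmployeeQueue ([//Expense/Approval]);
CREATE SERVICE ManagerService ON QUEUE ManagerQueue ([//Expense/Approval]);

-- Submit an expense form; the manager's handler later SENDs an
-- [//Expense/Decision] back on the same conversation
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE EmployeeService
    TO SERVICE 'ManagerService'
    ON CONTRACT [//Expense/Approval]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h
    MESSAGE TYPE [//Expense/Submit]
    (N'<expense employee="jdoe" amount="42.00" />');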
I have three control flow tasks that are executing in consecutive order. Tasks 1 and 3 will always execute, but sometimes (based on an expression) task 2 will not. I would like to use precedence constraints in such a way that task 3 will execute regardless of whether task 2 executes, but in the event task 2 does execute, task 3 will only execute AFTER task 2 completes. Is there a way to accomplish this without setting the disabled property of task 2 at runtime?