Dataflow With Workflow
Nov 13, 2006
I know the idea was to separate workflow and dataflow, but I have come across a scenario where it would be useful for one branch of a dataflow to wait until another branch has finished.
I have some transactional data which records events for the start and end of a session. I want to build a list of unique sessions with their start and end dates. I currently have the list of events sorted by time, followed by a conditional split for the start and end events. I can then insert all of the start events, and I would like to wait until all of the starts are inserted before updating them with their relevant end times.
Is this achievable?
Does anyone else think it would be a good idea to be able to set precedence across multiple branches of a data flow?
Does anyone have a better solution?
I know this is the wrong forum, but is there a way to model this against the transactional data in SSAS? I will move this question to the SSAS forum if anyone thinks this would work.
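For what it's worth, if waiting on the other branch proves awkward, a set-based alternative is to build the session list in a single Execute SQL Task instead of two dataflow branches. A minimal sketch, with hypothetical table, column, and event-type names:

-- Hypothetical names: SessionEvents(SessionID, EventType, EventTime), Sessions(SessionID, StartTime, EndTime).
-- Aggregating starts and ends in one pass avoids having to sequence the two branches.
INSERT INTO Sessions (SessionID, StartTime, EndTime)
SELECT e.SessionID,
       MIN(CASE WHEN e.EventType = 'Start' THEN e.EventTime END) AS StartTime,
       MAX(CASE WHEN e.EventType = 'End'   THEN e.EventTime END) AS EndTime
FROM   SessionEvents AS e
GROUP BY e.SessionID;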
Philip Coupar
View 7 Replies
Apr 17, 2007
Dear Friends,
I need to execute a SQL query inside a data flow (not in the control flow) and need the records returned to continue through the dataflow. In my case I can't use a Lookup, an OLE DB Command, or anything else like that.
I need to execute a query and use the returned records in the dataflow; with OLE DB Command I can't see the fields returned. :-(
How can I do it? Using a script? Can I use a Script Component that receives 2 parameters as input and gives me the fields returned from the query as output?
Thanks!!
View 38 Replies
Jul 12, 2005
Hi all,
I've got a DTS job which has lots of tasks in it. I've also got quite a few workflow connections, and I've noticed that a task won't execute if it has both Failure and Success workflow pointing to it. It can have multiple failures pointing to it and it will execute, but it can't have multiple successes or a combination of workflows.
Does anyone know a way to get around this, or to change the 'AND'-ing that seems to apply to the workflow going into a task?
Thanks.
View 2 Replies
Jan 12, 2005
Howdy,
Now this one I don't think will be fixed by changing the length of a variable.
Anyway, in this nifty DTS package I've created, I have it set so that on the failure of a SQL task, DTS should send me an e-mail letting me know it failed. The SQL statement in the DTS task is "EXEC WEB_Check_Files". In the stored proc I then call RAISERROR with a user-defined error message at severity 10. When I call RAISERROR, the DTS package logs everything just kosher but skips past the "on fail" e-mail message. It refuses to execute it and I don't know why. I've spent the last hour or two going through docs looking at RAISERROR. I tried RETURN 1 to see if that would do anything, and nada. I also don't have "Fail package on first error" checked.
Don't know if it matters but I'm executing the package through the "play" button in Enterprise Manager.
So... anyone got an idea? I'm fresh out.
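For what it's worth, RAISERROR at severity 10 or below is informational only and does not put the statement into an error state, which may be why DTS treats the step as a success; severities 11-16 raise a real error the step can fail on. A minimal sketch (the procedure name is from the post above, the message text is made up):

-- Inside WEB_Check_Files:
-- RAISERROR ('File check failed.', 10, 1)   -- severity 10: logged, but the step still succeeds
RAISERROR ('File check failed.', 16, 1)      -- severity 16: the step fails and the On Failure e-mail can fire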
Yall rock!
Thanks,
Me
View 1 Replies
Aug 13, 2004
I hope I've done due diligence before posting for help. I've combed a lot of Google and dBforums search results with no luck.
Anywho, this *seems* like it should be simple: I have a Transform Data Task, into which I've introduced deliberate errors, followed by an ActiveX Script Task that is supposed to fire based on the failure of the Data Transform Task. The second task is joined to the first via an "On Failure" workflow step.
The problem: The second task never fires. The first task fails as expected, but the second one just shows the "Not Run" indicator in the package results after executing.
Here is a graphical illustration of the package and results: http://www.bountifulware.com/blogs/rex/dtsproblems.html
I've experimented with the transaction settings in the package properties, as well as the transaction settings in the workflow properties of each step. I don't particularly want the package as a whole to be couched in a transaction, but if that is part of the equation for making the "On Failure" step fire, I'll happily go along. Also, in the Data Transform task with the deliberate errors, I have the max error count set high, as I want the task to continue logging errors for each record that chokes. I've tried various settings there as well, however.
Thank you in advance -- I'll try to contribute more and leech less after this.
View 4 Replies
Feb 5, 2008
I would like to build a workflow system where 100 processes each request an item from a DB of roughly 1,000,000 items, process that item, and move it to the next state. The problem with the current implementation I tried is that I get deadlocks.
The DB table looks like:
CREATE TABLE Transactions (
    itemid CHAR(32),
    status TINYINT NOT NULL DEFAULT 0,
    result INT NOT NULL DEFAULT 0,
    lockby TINYINT NOT NULL DEFAULT 0,
    -- ... (etc.)
    PRIMARY KEY (refno)
);
CREATE INDEX IxStatus ON Transactions (status);
Each process (with its own ID) performs 4 steps:
1) update transactions set status=1, lockby=<ID> from
(select top 1 itemid from transactions where status=0) as t1
where t1.itemid=transactions.itemid
2) select itemid from transactions where status=1 and lockby=<ID>
3) process item
4) update transactions set status=2,result=<RESULT> where itemid=<ITEMID>
Does anyone have a suggestion on how to do this without the deadlocks?
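One pattern that may reduce the deadlocking is to claim a row atomically with READPAST/UPDLOCK hints, so each worker skips rows that other workers have already locked instead of blocking on them. A sketch against the table above; the worker id value is illustrative:

DECLARE @id CHAR(32), @worker TINYINT;
SET @worker = 7;   -- each process would pass its own id

BEGIN TRAN;

-- Claim one unprocessed item: READPAST skips rows already locked by other workers,
-- UPDLOCK/ROWLOCK holds the claimed row until commit so nobody else can take it.
SELECT TOP 1 @id = itemid
FROM   Transactions WITH (UPDLOCK, READPAST, ROWLOCK)
WHERE  status = 0;

IF @id IS NOT NULL
    UPDATE Transactions
    SET    status = 1, lockby = @worker
    WHERE  itemid = @id;

COMMIT TRAN;

-- The worker then processes @id and finishes with:
-- UPDATE Transactions SET status = 2, result = <RESULT> WHERE itemid = @id;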
View 3 Replies
Jun 14, 2006
Hi
Can anyone provide a sample for workflow in SSIS, and show how to execute packages conditionally, e.g. if package1 succeeds then execute package2, else execute package3 (much like writing a batch job in the DataStage ETL tool)?
Thanks & Regards
Jegan.T
View 5 Replies
Apr 16, 2007
Hello,
I have a SQL Server 2000 DTS package in which the first step executes a batch file. The batch file contains FTP commands that log into an FTP server, and pull down whatever file is there.
I set up a failure workflow to send an email if the step fails. When I have a SQL Server job run this package and there is no file to download, the whole package fails without the failure workflow firing.
For the step (DTSStep_DTSCreateProcessTask_1), I have the 'FailPackageOnError' property set to -1. In the package properties, I have the check box for 'Fail Package on First Error' cleared.
What do I need to do so that the failure workflow occurs when the step fails?
Thank you for your help!
cdun2
View 4 Replies
Jan 16, 2007
Trouble with Workflow
Hello. I have a DTS package that executes some tasks of the type "Execute Package Task".
Every task has a condition to be executed only in case of success. I understand that only if the preceding task ends successfully should the next one be processed. My problem is that the DTS continues even if one of the previous steps fails. Would you have any idea?
My scenario is something like this:
EP: Execute Package Task
ES: Execute SQL Task
EP#1 (on success) ----> EP#2 (on success) ----> ...
So EP#1 fails, but EP#2 is also executed.
View 5 Replies
May 1, 2008
Hi,
I have 3 data flows connected sequentially in my workflow, before the dimension and fact processing, which check data from different databases and write to a log table if the conditions are not met. If anything is written to the log table, I have to quit and finish the workflow.
How can I do that?
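One possible pattern: follow the three data flows with an Execute SQL Task that raises an error when the log table has rows, and hang the dimension and fact processing off its success constraint, so a logged failure stops the rest of the workflow. The table and column names here are hypothetical:

-- Hypothetical log table; an error here fails the task so the downstream
-- dimension/fact tasks (connected on success) never run.
IF EXISTS (SELECT 1 FROM dbo.EtlValidationLog WHERE ProcessedFlag = 0)
    RAISERROR ('Validation rows were logged; stopping the load.', 16, 1);

Alternatively, a precedence constraint with an expression on a row-count variable would achieve the same thing without raising an error.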
thanks,
J
View 3 Replies
Feb 1, 2007
I created a DTS package in SQL 2000 using Enterprise Manager.
I have defined a SQL task to drop some temporary tables on failure of another SQL task. The same temporary tables also need to be dropped on success of another task. The on-success workflow is working fine, but when I add the on-failure workflow to the temporary-table-dropping task, the temp table drop does not get executed at all, either on success or on failure. Please help me out.
View 5 Replies
Jul 27, 2006
Hi,
I just noticed a strange display problem in one of my SSIS jobs.
I created a job which contains a sequence container. Within the sequence container there is one Execute SQL Task and one Foreach Loop Container. Within the Foreach Loop Container there are 4 tasks which are connected with precedence constraints.
Now my problem is that if I load the SSIS job and open the sequence container while the Foreach Loop Container is already open, the precedence constraints won't be displayed. If I close and open the Foreach Loop Container again, the precedence constraints are displayed again.
My first assumption was that it might be a problem with the display drivers of the computer; however, the problem also appears on any other computer.
Does anyone know how to solve this display problem without closing/reopening the container?
Thanks,
StSt
View 1 Replies
Dec 29, 2000
I know this sounds bizarre, but hey... this is DTS, right?
For ALL of my local packages on a particular server, when I open them with DTS designer on my workstation, everything is fine - I can even execute them. When I log onto the server locally, and open them, the workspace is empty!!!
They are stored as local packages. As I said, I can execute them within designer on my workstation, but if I try to run them via dtsrun, I get an error saying "No Steps have been defined for the transformation Package." And when I look at them on the server, that appears to be true.
One last thing (I know you've heard this before)... Everything was working fine last week - that darn Santa.
PLEASE HELP!!!
Thanks,
Andy
View 2 Replies
Aug 22, 2002
I have just installed MS Office XP Developer and MS SQL Server 2000, as I would like to use the workflow aspects of this. The problem is that the installation of XP Developer will not install "Workflow Services for SQL Server", as it says one needs SQL Server 7.0 SP2, but I have SQL Server 2000 SP2.
Has anybody come across something like this?
Thanks
Donnacha
View 1 Replies
Aug 22, 2007
HI,
We are currently trying to import the data from one DB into multiple DBs, and the connection strings for these multiple DBs must be configurable. I can do this by reading the connection strings from the dtsConfig file and executing the import for each DB. Is it possible to execute for all the DBs in parallel (multithreaded)?
Thanks,
Sabari.
View 6 Replies
Aug 21, 2007
Hi, not sure my subject title makes it clear what I want but here it is.
I have a workflow which basically looks at an Excel file in a folder on the local drive and then does loads of stuff to it. Every time I want to process a different Excel file in the same location, all I have to do is change the value of a single local variable, which is just the name of the Excel file.
Is there a way to make this automatic? For example, could I somehow put my whole workflow inside a loop that looks inside that local folder and, one by one, gets the name of each file, assigns the name to that variable, and then runs the flow, and continues to do that until it gets to the end?
any help would be greatly appreciated...thanks!!!!
andy
View 3 Replies
Jun 14, 2007
Hi,
This is a bit of an open-ended question, but is it possible to use SB as part of a workflow engine?
SB sends messages backwards and forwards on a conversation, so I am guessing that there is some way the application can "work out" which person to send a particular message to?
I've seen examples like an expense form being filled in and sent to a manager; presumably you can specify whether the manager accepts or rejects the expense form?
I know there is Windows Workflow, but I thought using SB might be an alternative?
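Service Broker itself only queues and delivers messages; deciding which person (e.g. which manager) a message goes to, and handling accept/reject, is up to the code that reads the queue. Purely as an illustration, a send might look like this, assuming the services, contract, and message type have already been created (all of the names are made up):

-- Assumes //Expense/SubmitterService, //Expense/ApproverService,
-- //Expense/Contract and //Expense/Request already exist.
DECLARE @handle UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @handle
    FROM SERVICE [//Expense/SubmitterService]
    TO SERVICE   '//Expense/ApproverService'
    ON CONTRACT  [//Expense/Contract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @handle
    MESSAGE TYPE [//Expense/Request]
    (N'<expense employee="jsmith" amount="142.50" manager="mjones" />');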
MrPeds
View 2 Replies
Mar 31, 2006
hi,
I have a simple database. To test it I have the database locally, and I have the same database on the server.
Currently, when I update the database, I copy the whole database (the mdf file, including the data) to the server.
I would like to copy only the tables and not the data to the server. What is the best and easiest way to do it?
regards,
rnv
View 7 Replies
Jan 3, 2008
I have three control flow tasks that are executing in consecutive order. Tasks 1 and 3 will always execute, but sometimes (based on an expression) task 2 will not. I would like to use precedence constraints in such a way that task 3 will execute regardless of whether task 2 executes, but in the event task 2 does execute, task 3 will only execute AFTER task 2 completes. Is there a way to accomplish this without setting the disabled property of task 2 at runtime?
View 4 Replies
May 12, 2008
Hi all,
I am looking for any information about using SSIS as a workflow automation (job engine) tool. My company is looking into buying a 3rd party app to do our job scheduling and I think that SSIS could do all that we need. The only issue they found with SSIS is the lack of a GUI/dashboard to view all jobs at once. We need more flexibility than the job scheduler in SQL Server Agent will allow. I have heard that it is possible to build a C# GUI app that can serve as a job engine front page.
What we need is a way to view all jobs in the system and be able to start, stop, pause all jobs manually in a graphical interface. I know of a few companies that are doing this but I am unable to find anything on line about it. My bosses are ready to give SSIS a shot if I can prove that we can build such an interface. Does anyone have any first hand knowledge of such an application or have any tips on where I should look?
Thanks in advance!
-- Craig ***
View 4 Replies
Jan 25, 2008
Is it possible to import the SSIS workflow diagram into Visio? Is there a way to extract the images and import them into SSIS for use with creating a workflow, or better yet, is there a way to re-engineer the workflow in SSIS with Visio?
For an initial step, I would like to re-create the SSIS workflow in Visio for presentations.
Dan
View 5 Replies
Feb 14, 2008
I am pulling my hair out having built my first application using WWF. I have moved the application off the development machine to a machine ready for production, and the persistence store will now be on a remote server.
I cannot create a connection string for the persistence store that will allow me to attach to the .mdf on the remote server.
The original connection string in development was:
Data Source=.\SQLEXPRESS;AttachDbFilename='C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WorkflowPersistenceStore.mdf';Integrated Security=True;Connect Timeout=30;User Instance=False
I cannot create an equivalent string that does not return an error like this one:
An attempt to attach an auto-named database for file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WorkflowPersistenceStore.mdf' failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
I did simply copy the WorkflowPersistenceStore.mdf from the development machine to the server machine (completely independent environments). It has worked in the past for other non-WWF databases; is there something particular about the WWF database that would have prevented this?
I have tried making the persistence store permanently attached (in the development environment), but this returns a different error that implies that a permanently attached database is not allowed, so I assume that the persistence service HAS to have a detached database. Am I correct?
I have tried a UNC name that is the same on both the server machine and the client machine, e.g. \\MACHINE\wwf\WorkflowPersistenceStore.mdf, but that still returns an error. Is that not allowed? Does the file path for an attached database have to be recognisable by the client machine, the server machine, or both?
MUST the User Instance be True or False, or does it not really matter?
It is probably obvious that I am not confident in the whole area of connection strings and the effect of the different parameters, and the truth is I am not certain that I have not completely confused the SQL Server with the various attempts I have made.
Is there someone who could confirm that I am in the right area and give me an example of what SHOULD work in the situation I have described, so that I know I am roughly right and should keep trying, or whether I am barking up the wrong tree?
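Not an authoritative answer, but one thing that may be worth testing: attach WorkflowPersistenceStore.mdf once on the remote server and point the persistence service at it with an ordinary server/catalog connection string, since AttachDbFilename and User Instance are really aimed at a local SQL Express instance. A sketch, with file paths that are only assumptions:

-- Run once on the remote server; paths are assumptions for illustration.
CREATE DATABASE WorkflowPersistenceStore
ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WorkflowPersistenceStore.mdf')
FOR ATTACH;

-- The persistence service could then use a connection string along the lines of:
-- Data Source=REMOTESERVER;Initial Catalog=WorkflowPersistenceStore;Integrated Security=True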
Thanks in anticipation!
View 3 Replies
Feb 1, 2008
I have a "Execute DTS Package 2000 " task in SSIS. The SQL 2000 DTS has one task which precedence is "completion". Using SQL2000 it works properly, but when I invoke it from SSIS it doesn€™t respect the precedence. So, when the task above fails it ends the DTS execution.
Is it possible to configure the task to respect the precedences?
View 6 Replies
Sep 18, 2015
Today MDS can send emails only when there are validation failures. I need to send an email to some users when a value changes, but I don't want to mark the record as failed; it's not a validation. So I'm looking for a custom workflow to do this. Are there any workflow libraries available?
View 2 Replies
Nov 7, 2007
Hi,
I'm wondering if it is possible to update specific rows in a database table using dataflow tasks.
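One way to do this inside a data flow is the OLE DB Command transformation, which runs a parameterised statement once per input row; the table, columns, and ? mappings below are purely illustrative:

-- Illustrative only: each ? is mapped to an input column in the
-- OLE DB Command's column mappings.
UPDATE dbo.MyTable
SET    SomeColumn = ?
WHERE  KeyColumn  = ?;

Since this runs row by row, for large row counts it is often faster to stage the rows and run a single set-based UPDATE afterwards.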
thanks
View 9 Replies
Dec 11, 2007
When I use a table name or view name variable in the data source component and the destination component, how can I manage the output columns and the input columns?
Yes, I can use a default value to initialise the columns, but when the variable value changes, errors occur.
Thanks a lot!
View 3 Replies
Dec 24, 2007
Hi Pals,
Here is my scenario: in my ETL process I have one Data Flow task.
Assuming that I have 10 clean records in my source database, I need to load all 10 records into my target table.
Is there any means of cross-checking the number of rows in the source table against the number of rows loaded into my target table?
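Inside the data flow, a Row Count transformation can capture the number of rows that passed through into a package variable; for a simple cross-check after the load, an Execute SQL Task along these lines (table names hypothetical) could compare the counts and fail on a mismatch:

-- Hypothetical table names; fail the step if the counts differ.
DECLARE @src INT, @tgt INT;
SELECT @src = COUNT(*) FROM SourceDb.dbo.SourceTable;
SELECT @tgt = COUNT(*) FROM TargetDb.dbo.TargetTable;

IF @src <> @tgt
    RAISERROR ('Row count mismatch: source %d, target %d.', 16, 1, @src, @tgt);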
Any suggestions are greatly appreciated.
Thanks & Regards.
View 6 Replies
May 23, 2007
Hi all,
In my data flow I set the OLE DB Source, which is a table on my extract server, and I need to do some transformations and stage the table, which will be a dimension in the staging DB.
Q1 - I need only 3 columns from the source table; which transformation do I need to use to extract just those 3 columns?
Q2 - Two of the 3 columns I will need to pass through as-is, with no changes at all. The third column has values like "BOSTON....". (I have a vague idea of what I need to do; I need some solid suggestions/advice to kick off.) The plan is to use a Replace function on this city column (as forum member Spirit1 advised, thanks!) to take out the dots, then apply a condition: if BOSTON, assign the code "BOS", which will be City_Code. This City_Code then has to be looked up in City_Dimension to get the City_Key_Number for Boston, and lastly both the City_Code and the City_Key_Number have to be passed through to the destination dimension.
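Within the data flow this would typically be a Derived Column (for the REPLACE and the BOSTON -> BOS condition) followed by a Lookup against City_Dimension. As a rough T-SQL sketch of the same logic, with every table and column name below guessed from the description:

-- All names are guesses from the description above.
SELECT  s.Col1,
        s.Col2,
        city.City_Code,
        d.City_Key_Number
FROM    dbo.SourceTable AS s
CROSS APPLY (SELECT CASE WHEN REPLACE(s.City, '.', '') = 'BOSTON'
                         THEN 'BOS'
                         ELSE REPLACE(s.City, '.', '')
                    END AS City_Code) AS city
LEFT JOIN dbo.City_Dimension AS d
       ON d.City_Code = city.City_Code;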
Any ideas /suggestions will be appreciated.
Thanks in advance...!!
ravi
View 5 Replies
May 23, 2006
Greetings!
I am attempting to implement the following CASE statement BEFORE getting the data into my destination table, but I don't know how to create an expression for it.
In the mapping section of my OLE DB destination component I can only do mapping but I can't actually manipulate the data before it gets to the destination table.
What do I have to do to implement :
case
when SOPD.PRICE_TOP_NUMBER is NULL then -8
else SOPD.PRICE_TOP_NUMBER
end AS price_top_number,
before it goes to the destination column?!
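If the source is an OLE DB Source using a SQL command, the simplest place for this logic may be the source query itself (ISNULL is equivalent to the CASE above); a Derived Column transformation with a conditional expression is the in-flow alternative. A sketch, with the table name assumed from the alias:

-- Table name assumed from the SOPD alias above.
SELECT ISNULL(SOPD.PRICE_TOP_NUMBER, -8) AS price_top_number
FROM   dbo.SOPD AS SOPD;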
Thanks for your help in advance.
View 11 Replies
Nov 7, 2007
Hello every one,
I have a simple dataflow from source to target.
My source is a raw file and my target is an Oracle table.
The problem I'm facing in SSIS is that it's taking more than 10 minutes to load almost 30-40k records, and I do not have any transformation activity; this is just a simple dump from source to target.
Please let me know of any settings I should take care of to speed up the process.
Thank you
View 5 Replies
Apr 25, 2007
This issue seems very silly. Did anyone come across this before?
I have been transferring data from a text file to a table using an OLE DB destination.
The number of records in the text file is 2,091,650.
It was running just fine a couple of days ago when the incoming data was a little smaller than this (around 300,000); now it seems to have a problem.
Here is the flow:
1. File System task -> I copy the file to a different location.
2. Execute SQL task -> truncate tables.
3. Data Flow task -> I check only for the errors in this data flow, and all valid rows I transfer to a different text file.
4. Data Flow task -> a flat file source connects to the new text file created earlier. Here I convert fields to their respective data types and insert new records or update existing ones.
I don't know what's going on.
When I run my package it runs through the first three steps perfectly fine. When it comes to the fourth step it just sits there; it doesn't get to the tasks within this data flow at all, and at the beginning I have the flat file source.
What could be the reason?
When I look at the progress tab I can see the progress of the other tasks, but when it comes to this task it just shows the start time and sits there.
View 4 Replies
Dec 12, 2006
I am transferring data from a text file to SQL Server. I use a Data Flow task for transferring my text files.
Here is what I do:
1. Add a text file source. What I want to achieve here is: if the text file contains the column names in the first row, I should skip that row, and if it does not contain column names in the first row, just transfer it. How can this be achieved?
2. Add one more column to my text file which should contain the status (insert or update). How can this be done?
3. Before transferring data to the destination, I want to know if the record exists; if it exists I just want to update it instead of inserting, and if it is a new record I want to insert it, with the status in the new column set accordingly (see the sketch below).
Please help...
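For the insert-or-update in step 3, one common pre-MERGE pattern is to land the file in a staging table first and then run a set-based update followed by an insert of the new rows, setting the status column at the same time. All table and column names below are hypothetical:

-- Hypothetical names: dbo.Staging holds the rows from the text file,
-- dbo.Destination is the final table, BusinessKey identifies a record.
UPDATE d
SET    d.SomeColumn = s.SomeColumn,
       d.Status     = 'update'
FROM   dbo.Destination AS d
JOIN   dbo.Staging     AS s ON s.BusinessKey = d.BusinessKey;

INSERT INTO dbo.Destination (BusinessKey, SomeColumn, Status)
SELECT s.BusinessKey, s.SomeColumn, 'insert'
FROM   dbo.Staging AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.Destination AS d
                   WHERE  d.BusinessKey = s.BusinessKey);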
View 5 Replies
Jul 16, 2007
Hi all,
I have encountered a SQL 2005 deadlock issue while executing a dataflow in a SSIS package. The deadlock happens when I have two columns indexed; if I don't have the indexes, the deadlock does not happen.
Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Transaction (Process ID 67) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
The above causes the rest of the dataflow execution to be terminated.
What my dataflow does is to extract data from 14 flat files and then insert the records into a single table (no primary key, but with two columns indexed).
Can anyone please advise how I can avoid deadlock with indexes in a table?
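One thing that sometimes helps here, offered only as a suggestion to test: disable the two non-clustered indexes before the parallel loads and rebuild them afterwards, so the concurrent inserts are not contending on index pages. Index and table names below are made up:

-- Made-up index and table names; run before the data flow...
ALTER INDEX IX_Target_Col1 ON dbo.TargetTable DISABLE;
ALTER INDEX IX_Target_Col2 ON dbo.TargetTable DISABLE;

-- ...and rebuild after the load completes.
ALTER INDEX ALL ON dbo.TargetTable REBUILD;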
Thank you and much appreciated!
View 6 Replies