I have read several postings about various modes of trapping errors (SQLIS.com, MSDN, etc.), but none seem to directly address what I am looking for.
Coming from a Java/C# background, I am looking for a way to trap errors that arise within the SSIS control flow, much as those languages do:
try {
    do something
} catch (AnExceptionType myException) {
    handle my exception
}
/* My code at this point is unperturbed by the exception unless I explicitly re-raise it out of the scope of the exception handler. */
To make the analogy in SSIS, I want to be able to handle an error within a "container" and not have to handle the same error in surrounding containers.
Example:
I have a "Foreach" container (call it container FEC) that contains several other containers. One of the subordinate containers is a "For Loop" (call it FLC). The FLC in turn has some nested tasks, some of which are expected to fail and therefore I want to handle in a graceful manner. The tasks that are expected to fail have a "fail" constraint that links them to a task that I want to occur when the failure occurs, and that works, but the failure is not trapped as it percolates out of the container to the FEC. I also tried to trap it with event handler, but that is also an incorrrect trail to follow.
I don't want the failure to percolate up to the FEC. I have set the maximum error count to a reasonable value for the FLC, and my "program" does not exceed that value; however, the FEC still sees the error, so it fails. How do I keep the FEC from seeing the error (without raising the maximum error count on the FEC)?
BTW, I am using a Script Task to set a variable value indicating success or failure for those tasks where I can set the maximum error count high enough (allow the error to occur, then let the failure/success precedence constraint pass control to the Script Task so that the variable can be set). This is only a partial solution.
I am new to SSIS, in fact to the MS world having been a code slinger for Java and Oracle. So far I have been very impressed with SSIS. Analogous structures that I expect to find in modern development environments have been within easy reach. This is my first serious challenge. Please help.
Hello, I encountered an interesting situation. I have a GridView and a SqlDataSource with a delete function. When I delete a record, a foreign key violation error is raised. I would like to trap this error and give the user a friendly message. If I were using ADO.NET I could use Try/Catch, but there seems to be no way to do the same thing with the data source. Does anyone know? Thank you, J
I thought this SP would return 547 (foreign key constraint violation) when column 'thing' was being referenced in another table. Instead, when the front-end application code calls this SP, it gets a 1 from the DELETE statement itself. In other words, my RETURN statement never seems to get executed. Is there any way of achieving this? In other words, I want to trap error 547 and return it to the front end.
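A minimal sketch of the usual pattern: capture @@ERROR on the very next statement after the DELETE (any later statement resets it) and return that. The procedure, table, and parameter names below are placeholders, not the poster's actual schema.

CREATE PROCEDURE dbo.DeleteThing
    @ThingID int
AS
BEGIN
    SET NOCOUNT ON          -- keeps extra "rows affected" results out of the way
    DECLARE @err int

    DELETE FROM dbo.Thing WHERE ThingID = @ThingID
    SET @err = @@ERROR      -- must be captured immediately after the DELETE

    IF @err <> 0
        RETURN @err         -- 547 on a foreign key violation
    RETURN 0
END

The front end then reads the procedure's return status rather than anything coming back from the DELETE itself.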
Hi, I have a stored proc StoredProc1:

INSERT INTO Table1
SELECT * FROM View1
RETURN @@ERROR

StoredProc1 is used in another sp, StoredProcMain:

(some code before) ...
EXEC @iResult = StoredProc1
IF @iResult <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN @iResult
END
... (continue)

So I want to roll back if StoredProc1 is not successful. Then I ran into a problem. I added a column to Table1 but forgot to update View1 to add the equivalent column. When I executed StoredProc1, I got "Insert Error: Column name or number of supplied values does not match table definition." But the error is NOT trapped. It seems the instruction RETURN @@ERROR returns 0 and StoredProcMain goes on as if there wasn't an error. How can I trap this error? Thanks, Walter
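This particular error aborts execution of StoredProc1 before its RETURN @@ERROR ever runs, so the caller only sees a return status. The classic SQL 2000-style workaround is to check @@ERROR in the caller right after the EXEC, in addition to the return value. A sketch using the names from the post:

DECLARE @err int

EXEC @iResult = StoredProc1
SELECT @err = @@ERROR            -- picks up errors that aborted StoredProc1 itself

IF @err <> 0 OR @iResult <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN COALESCE(NULLIF(@err, 0), @iResult)
END

On SQL Server 2005 and later, wrapping the EXEC in BEGIN TRY ... BEGIN CATCH is an alternative.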
I am planning to write a DTS package which alters a table and outputs any error messages if the ALTER statement fails.
I have created an Execute SQL Task in which I have written the following command:
ALTER TABLE Employee ADD EmpStatus char(4) DEFAULT 'A' NOT NULL;
I have created a workflow to write the error message to a text file, but I am having trouble trapping the error message produced by the ALTER statement (like "Column names in each table must be unique. Column name specified in the table more than once").
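One way to report the problem without relying on capturing the engine's own message text is to test the condition first and raise a custom message, which the on-failure workflow can then write to the text file. A sketch, using the table and column from the post and SQL 2000-compatible system tables:

IF EXISTS (SELECT 1 FROM syscolumns
           WHERE id = OBJECT_ID('Employee') AND name = 'EmpStatus')
    RAISERROR ('Column EmpStatus already exists on table Employee.', 16, 1)
ELSE
    ALTER TABLE Employee ADD EmpStatus char(4) DEFAULT 'A' NOT NULL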
I have read some ideas on this, but nothing is working for me. I have a SQLDataSource bound to a FormView. I need to use the FormView to insert new rows. When I type new values, all is well. When I type a duplicate, I get a runtime primary key error. That's fine, but how do I trap that? Overriding Page_Error doesn't work for me. Anyone, please?
I would like to trap the return value from a CmdExec job step that is scheduled. The CmdExec returns 0 if it succeeds and something other than 0 if it doesn't.
Can I raise an error from a command file? The command file calls a console application (i.e. one with no interface).
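For what it's worth: a command file can force a failing exit code with something like "exit 1" (or "exit /b 1"), and a CmdExec step is marked as failed whenever the process exit code differs from the step's "process exit code of a successful command" (0 by default). The outcome is then recorded in msdb, where it can be inspected afterwards. A sketch (the job name is a placeholder):

SELECT TOP 1 h.run_status,   -- 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled
             h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = 'MyCmdExecJob'   -- placeholder job name
  AND h.step_id > 0             -- step rows only, not the overall job-outcome row
ORDER BY h.instance_id DESC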
Hi, I am running some scripts in files using sqlcmd via a SQL Server Agent job. If sqlcmd generates an error (for example, if it is unable to connect) then the job fails. However, if the T-SQL within the script is invalid (syntax, name resolution, etc.) the job completes, reporting success. If sqlcmd is invoked directly via the query window then no error is raised either; instead a result set is returned reporting the error. Does anyone know why, and whether it is possible to get the error recognised by the job?

invalid_sql.sql:
-- The below is not actually valid SQL.
do SOME stuff, innit!

sqlcmd:
EXEC master.dbo.xp_cmdshell 'sqlcmd -S my_server -i C:\invalid_sql.sql'

Cheers
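A hedged suggestion: sqlcmd has a -b switch ("on error batch abort") that makes it exit with a non-zero ERRORLEVEL when the script raises an error, so the failure becomes visible to whatever launched it, whether that is a CmdExec job step or xp_cmdshell. A sketch of the query-window version:

DECLARE @rc int
EXEC @rc = master.dbo.xp_cmdshell 'sqlcmd -b -S my_server -i C:\invalid_sql.sql'
IF @rc <> 0   -- xp_cmdshell should report a non-zero status when sqlcmd exits with an error
    RAISERROR ('sqlcmd reported a failure running invalid_sql.sql.', 16, 1)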
Hello to all, I've the following problem. I've an SP called as a SQL Server Agent job every minute. It runs pretty nicely, but from time to time the job is aborted, and I don't know why. From my logging, which is implemented in the DB, I know at which point it is happening, but I don't know the exact error. It is for sure some SQL Server exception. I wanted to track this error, but after reading all the news and help, and performing some tests, I've found out that it is almost impossible to catch the error in T-SQL code (for example in this SP) and write it to a table for later review. Reading the great documentation from Erland Sommarskog, I know there is no way to catch this error in T-SQL, because usually SQL Server terminates execution of the code immediately (which I also found in my tests). Now, my question: since I'm calling this SP continuously in Server Agent as a job scheduled to run every minute, is there any way to trap this error at that level, in SQL Server Agent, and THEN save it somewhere in my DB? I'm calling the SP as the 'command' of the job step, as 'exec sp_name_of_procedure'. If I try something like this:

declare @err int
set @err = 0
exec sp_name_of_procedure
set @err = @@error
if @err <> 0
begin
    insert into tbl_logger (sql_error, msg) values (@err, 'SQL raised an error')
end

will it work, or will SQL Server treat the whole code as one batch and terminate after the call to the SP? Thank you in advance for your reply. Greetings, Mateusz
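A sketch of an alternative, assuming the server is SQL Server 2005 or later (on 2000, a batch-aborting error will still terminate the step before any check runs): wrap the call in TRY/CATCH inside the job step and log from the CATCH block, reusing the tbl_logger table from the post:

BEGIN TRY
    EXEC sp_name_of_procedure
END TRY
BEGIN CATCH
    INSERT INTO tbl_logger (sql_error, msg)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE())
END CATCH

TRY/CATCH catches most errors that plain @@ERROR checking misses, although a few (compile errors in the same scope, connection-terminating errors) still get past it.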
Hi there, I am converting a large PL/SQL project to Transact-SQL and have hit an issue as follows: I have a PL/SQL procedure that converts a string to a date. The procedure does not know the format of the date in the string, so it tries lots of formats until the conversion succeeds. After trying each potential format it uses the Oracle 'EXCEPTION WHEN OTHERS' construct to trap the failure so it can try another format. Is it possible to do this with SQL Server? If I do a CONVERT and the value is not in one of the standard formats, it fails. This is part of a background scheduled process and I cannot afford the procedure to bomb out. I suspect the answer is that I cannot do this and will need to impose some control over the string being received (from various external systems!!) to ensure it is in a specific known format. Even knowing it will be one of the known SQL Server formats is not enough, since if the first format I try is not correct the process will crash. Any ideas? Thanks
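Two hedged options, depending on version. On SQL Server 2005, each CONVERT attempt can be wrapped in BEGIN TRY ... BEGIN CATCH, which plays the same role as EXCEPTION WHEN OTHERS. On SQL Server 2012 and later, TRY_CONVERT returns NULL instead of raising an error, so the "try formats until one works" logic collapses into a COALESCE. A sketch of the latter (the style codes listed are just examples):

DECLARE @s varchar(30), @d datetime
SET @s = '31/12/2007'

SELECT @d = COALESCE(
    TRY_CONVERT(datetime, @s, 103),   -- dd/mm/yyyy
    TRY_CONVERT(datetime, @s, 101),   -- mm/dd/yyyy
    TRY_CONVERT(datetime, @s, 120))   -- yyyy-mm-dd hh:mi:ss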
I have a DTS package that I'm moving over to SSIS. In place of migrating this package, I've choosen to recreate it. This package moves data from an Informix database to a SQL database.
In the old package the first task was to make a simple connection to the Informix database and if the task failed, it would send an email and stop the package.
The biggest reason for this is that the Unix server that I'm getting the Informix data from forces the user passwords to be reset every 90 days. So in my old package, if I forgot to change the password and the connection started to fail, it would send me an email.
In my new package, SSIS performs a validation before starting. There are a number of tasks that use the connection to the Informix database. Under testing, if I put in a bad password, the validation process generates a validation error. I've tried catching this validation error using the error handling events, but I've had no luck. I can send out notifications on PreValidation and PostValidation, but OnError appears not to fire for a validation error.
Might anyone have any suggestions on a proper way to validate and be able to send out email notification if a connection fails? Any assistance would be appreciated.
I need to pass a parameter from the control flow to a data flow. The data flow will use this parameter to get data from an Oracle source.
I have an Execute SQL Task in the control flow to assign a value to the parameter; the next step is a data flow which needs to take a parameter in the SQL statement used to query the Oracle source.
The SQL looks like this:
select * from ccst_acctsys_account
where to_char(LAST_MODIFIED_DATE, 'YYYYMMDD') > ?
The problem is that the OLE DB Source editor doesn't have anything for mapping the parameter.
I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.
I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.
Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table from which I retrieved the data in the Execute SQL Task (I have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination in the Data Flow?
Also, how would I update the original source table where source.id = objRecs.id?
Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.
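For the update part, one option (an assumption about what would fit, not the only approach) is an Execute SQL Task inside the Foreach container, or an OLE DB Command in the Data Flow, running a parameterized statement whose ? placeholders are mapped to the loop variables. The table and column names below are placeholders:

UPDATE dbo.SourceTable
SET    ProcessedFlag = 1    -- whatever column needs updating
WHERE  id = ?               -- map the variable that holds objRecs.id here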
Dear all! My package has a Data Flow Task. In the Data Flow Task, I use a Script Component and an OLE DB Destination to transform data from a txt file into the database. Within the Data Flow Task, I want to call a File System Task to move the file to a folder, or indeed any task from the "Control Flow" tab. Does SSIS support this? Please show me how, if it does. Thanks
I'm currently setting variables at the package level with an Execute SQL task. This works fine. However, I'm now starting to think about restartability midway through a package. It would be nice to have the variable(s) needed in a data flow set within the data flow, so that I only have to restart that task.
Is there a way to do that using an SQL statement as the source of the value in a data flow?
OR, when using checkpoints will it save variable settings so that they are available when the package is restarted? This would make my issue a moot point.
Hi all! I recently started working with SSIS and one of the things that is puzzling me the most is what's the best way to go:
A small control flow with large data flow tasks, or a control flow with more, but smaller, data flow tasks? Any help will be greatly appreciated. Thanks, Ricardo
I am new to SSIS. Can anyone tell me the difference between the control flow and the data flow? If all the transformations are done using the data flow, then why do we use the control flow? Sorry if I am asking a very basic question.
I am having a hard time with what appears to be something simple. I want to import an Excel spreadsheet into a table on a daily basis from a command line. I created a package with the Import Wizard in SQL Server Management Studio and saved it. Since I want a clean table each day, my process needs to create a temp table and import from the Excel file into the temp table. If that is successful, delete the original table and rename the temp table to the original name. The point of this process is to provide a fail-safe if there is some unforeseen problem downloading the data on a particular day.
When I run the package, the first thing it does is delete the original table. I know this because the process shows that the time it finished is before anything else has started or finished. The time shown for the completion of the data flow task is about 2 minutes after that.
This is maddening!!! The one thing I do not want to happen, I cannot seem to prevent. I have my control flow precedence set on success. Why does it do this?
Is it possible to set up a Control Flow at the solution level rather than at the package level? I'd like to set up a Control Flow that truncates multiple tables in a staging database and then runs multiple packages that reload those tables. Each package has a Control Flow tab that seems to be specific to the package. Is it possible to set something up that governs the execution of multiple packages?
Essentially I have an incoming file, each line in the file is a record. The records share the same initial key fields for the first 10 columns, then the field structure varies depending on a rectype and sequence number.
My initial plan was to load the keys into fields and load the remaining data into a long varchar field.
Then the stored procedure would evaluate the Rectype and Seqno of each record and chop up the Varchar accordingly.
So I set up a cursor to read the temporary table, do a fetch into variables, and go to evaluate the variables.
I want to be able to use a CASE statement to evaluate the fields and then perform various logic, but it's giving me fits because it seems like CASE only really works in Select statements, and won't really allow you to do any sort of GOTO logic.
I chopped the following SQL up and put in a rough cut of what I thought I was doing.
FETCH NEXT FROM Transaction_Cursor INTO @eepssn, @rectype, @seqno, @data

WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @companycode = cmpcompanycode,
           @eeccoid = eeccoid,
           @eeceeid = eeceeid,
           @eecempno = eecempno
    FROM company, empcomp
    WHERE eeceeid = (SELECT eepeeid FROM emppers WHERE eepssn = @eepssn)
      AND eecemplstatus = 'A'
      AND cmpcoid = eeccoid

    CASE @RECTYPE + @SEQNO
        WHEN '001001' THEN GOTO parse_pretax
        WHEN '002001' THEN GOTO parse_loan
        ELSE SELECT @RECTYPE + @SEQNO + ' not recognized!'
    END

process_it:
    INSERT INTO foo2 (empno, companycode, amt, pct)
    VALUES (@eecempno, @companycode, @eeamt, @eepct)

    FETCH NEXT FROM Transaction_Cursor INTO @eepssn, @rectype, @seqno, @data
END

CLOSE Transaction_Cursor
DEALLOCATE Transaction_Cursor
goto bypass
parse_pretax: let @eepct = substring(@data,1,5)
goto process_it
parse_loan: let @eeamt = substring(@data,27,11)
goto process_it
bypass:
I could sketch it out a little bit better in Northwind or Pubs, but I think I just need a smack upside the head and a little edification.
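For the record-type branching itself, a sketch of how the same structure is usually written in T-SQL: CASE is an expression, not a flow-of-control statement, so it cannot target labels, but IF ... ELSE IF does the same job (variable and table names are taken from the snippet above; the assignments assume the same implicit conversions as the original):

IF @rectype + @seqno = '001001'
    SET @eepct = SUBSTRING(@data, 1, 5)
ELSE IF @rectype + @seqno = '002001'
    SET @eeamt = SUBSTRING(@data, 27, 11)
ELSE
    SELECT @rectype + @seqno + ' not recognized!'

INSERT INTO foo2 (empno, companycode, amt, pct)
VALUES (@eecempno, @companycode, @eeamt, @eepct)

Alternatively, the per-type parsing can be folded into the INSERT itself as CASE expressions in the VALUES list.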
Hi, I would like to have a decision-maker component in the Control Flow to decide which Data Flow to run. For example: if myVar = A then run Data Flow A; if myVar = B then run Data Flow B; ...
We have the Conditional Split component available in the data flow but not in the Control Flow. What component in the control flow can be used to do the same job that the Conditional Split does in the data flow? Or what is a workaround for this?
I have a problem when using nested loops in my Control Flow. The package contains an outer Foreach Loop using the Foreach File Enumerator, which in my test case will loop over two files found in a directory. Inside this loop is another Foreach Loop using the Foreach NodeList Enumerator. Before entering the inner loop a variable, xpath, is set to a value that depends on the current file, i.e. /file[name = '@CurrentFileName']/content. The NodeList Enumerator is set to use this variable as its OuterXPathString. Now, this is what happens:
First Iteration: The first file is found and the value of xpath = /file[name = 'test1.txt']/content. When the inner loop is entered it iterates over the content elements under the file with name test1.txt as expected.
Second Iteration: The second file is found and the value of xpath = /file[name = 'test2.txt']/content. When the inner loop is entered it unexpectedly still iterates over the content elements under the file with name test1.txt.
My question is: Should it not be possible to change the loop condition of an inner loop in an outer loop such that the next time it is entered it will be done based on the new condition? It seems that the xpath variable is read once, the first time, and never again. If that is the case, does anyone know of a workaround?
I recently installed SQL Server 2005 with Integration Services. While trying to drag control flow items into the control flow pane, I get the error below:
The task with the name 'Execute SQL task' and creation name 'Microsoft.SqlServer.Dts.Tasks.ExecuteSqlTask, Microsoft.SqlServer.SqlTask, Version=9.0.242.0, Culture=neutral, PublicKeyToken=7895cd6666345' is not registered for use on this computer.
Contact Information:
Execute SQL Task
Can anybody please help me understand why this happens and how to solve this problem?
Hello, My package "splits" into a number of work flows by the use of precedent management.
After a number of steps I want to combine the flows back to one and finish the package.
What is the mechanism to combine the flows back into a single, linear process so that I do not have to define identical finishing tasks multiple times, once for each flow leg?
I have run into a problem while creating a simple UDF on SQL Server 2000.
Code Snippet
CREATE FUNCTION [dbo].[GetSectionNum] (@section varchar(4))
RETURNS varchar(2)
AS
BEGIN
    DECLARE @sTemp varchar(2), @s char
    DECLARE @count int
    DECLARE @length int

    SET @length = LEN(@section)
    SET @count = 1

    WHILE (@count <= @length)
    BEGIN
        SET @s = SUBSTRING(@section, @count, 1)
        IF (ISNUMERIC(@s))
        BEGIN
            SET @sTemp = @sTemp + @s
        END
        SET @count = @count + 1
    END

    IF (LEN(@sTemp) = 1)
    BEGIN
        SET @sTemp = '0' + @sTemp
    END

    RETURN @sTemp
END

When I perform a syntax check I receive an error: "Error 156: Incorrect syntax near the keyword 'BEGIN'." I have narrowed the problem down to the IF statement inside the WHILE block; if I remove that IF statement the syntax check succeeds. This is the first UDF I have written, so I'm swimming in uncharted water. Thanks ahead of time for your help.
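The likely culprit: in T-SQL, IF needs a Boolean expression, and ISNUMERIC returns an int, so the test has to be written as a comparison. A second, unrelated issue in the function as posted is that @sTemp starts out NULL, so concatenating onto it keeps it NULL unless it is initialized first. A sketch of the two changes:

SET @sTemp = ''                 -- initialize; NULL + @s would stay NULL

-- inside the WHILE loop:
IF (ISNUMERIC(@s) = 1)          -- compare to 1; an int is not a Boolean in T-SQL
BEGIN
    SET @sTemp = @sTemp + @s
END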
I am using the event handling mechanism to do my custom logging. This works fine. Using OnPreExecute and OnPostExecute, my log tables fill up with the start and end dates of all the containers and tasks in the complete package hierarchy.
However, what I am missing (e.g. in a system variable) is the source ID of the parent container that started the task or container. In other words, in my logging reports, I would like to build up a tree starting with the topmost package, like the progress indication in the IDE.
The built-in Logging features do not log this information either as far as I can tell.
I have a package which has 3 data flows and a script component to write a log file. The 3 data flows run in sequence (meaning upon success, the next one runs). If one of the first 2 data flows fails, control goes to the script, and upon completion of the 3rd, control goes to the script to create the log file.
Now, my question is: within my script component, is there any way to find out which data flow ran last and what its status was? What I know is that I could add an ActiveX task between each data flow and the script component to update a variable, and then use that variable in the script component to know the last data flow that ran. But what I want to know is: is there any system variable (or something similar) which will do the same thing without the need for an additional task?
Hi all of you, I have no idea whether the following is an issue or not; anyway:
I begin at the Control Flow layer.
I've created a Sequence Container and inside it I've got two groups: one contains a SQL task and the other a Data Flow task. Both are linked by a completion connector. Up to here everything is fine. But when I collapse my Sequence Container, the arrow for those tasks remains: you can see the Sequence Container "closed" and the arrow sitting there on its own.
Not very esthetic, and not practical.
Any clarification or thought will be, as usual, welcomed.
A branching element is critical to any process flow. Currently, as far as I know, there's only a Conditional Split data flow element; there is no direct way I can branch out in a control flow.
In some cases, I can branch by using the conditional operator (? :) to create either a dynamic SQL string for each path, or a package name for an Execute Package task, and so on.
This approach is not always good enough.
Are there others out there who want a Conditional Control Flow element?
I am somewhat new to SSIS and have a question on branching/control flow. We have 3 manufacturing facilities on the AS400. The requirement was to create independent SSIS packages so that if certain facilities were down, the other facilities would not be interrupted or fail. To be honest, I didn't want to maintain 3 * (each plant package) packages, so I decided to create one that takes a command-line parameter and creates separate batch files (a requirement of our archaic scheduling system) that pass in the plant to be processed.
Everything works fine: the package analyzes the passed parameter with no problem and evaluates it when calling stored procedures in a SQL Task. I am required (I think - let me know if I'm wrong here) to have 3 separate Data Flow Tasks depending on which plant is being processed, because they use different connections (same server, different DBs - AS400). Which Data Flow Task runs is based on a precedence constraint that evaluates the passed-in plant number.
After the Data Flow Tasks are complete, I'd like to merge back into one path again (even though only 1 out of N will run at a time) because the remaining steps are all simply stored proc calls with different parameters. It seemed silly to me to have the same tasks repeated 3 different times. I looked into starting tasks from events, but I don't know a lot about that, and it seems as though I would still be maintaining 3 different paths anyway.
Can anyone point me in the right direction?
P.S. Even if the solution is to have 1 Data Flow Task, I'm still interested in knowing how to converge paths back together.