Foreach Loop: Renaming Files On Success And Failure - Where? How?
Feb 23, 2008
Umm, sorry, I've read and seen the tutorials but somehow I am missing this.
I have a foreach container. Inside it is a data flow task with an XML source, a data conversion (because of, urgh, Unicode) and an OLE DB destination.
By design (and for this simple example), I get a violation if I attempt to load without first deleting entries from my table. No biggie; I would just like this simple package to rename my file to extension .good or .bad depending on the success of each loop.
Where and what do I need for this?
I can't import from XML files using a foreach loop. I load an XML file with a generated XSD. When I map the file to the table it has no errors. If I now go back and change to a different XML file, I get an error:
"Error 1 Validation error. Data Flow Task: DTS.Pipeline: input column "COLUMNNAME" (129) has lineage ID 2115 that was not previously used in the Data Flow task. Package.dtsx 0 0"
This is for testing purposes. When I run the foreach loop it does not work. Ironically, I do the exact same thing in another foreach loop with a completely different XML and it works fine.
I was trying to insert some rows from one table into another table in a different database. I was using an Execute SQL task along with a Foreach Loop container.
In my Execute SQL task I am using this query:
SET IDENTITY_INSERT dbo.Table1 ON
INSERT INTO dbo.Table1
SELECT * FROM DB2.dbo.Table2 WHERE TableKey = ?
When it is executed I get this error: failed with the following error: "Syntax error, permission violation, or other nonspecific error". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
The same query, when executed in Management Studio, is successful.
The properties I set:
For Each Loop Editor settings:
1) Collection: a) Enumerator set to Foreach ADO Enumerator; b) ADO Object Source Variable: User::ObjectVariablename; c) Checked "Rows in the first table"
2) Variable Mapping: new Int Variable2, with Index = 0 to set it to the first column
3) Expression: left blank
Execute SQL Task Editor settings:
1) General: a) TimeOut: 0; b) CodePage: 1252; c) ResultSet: None; d) SQLSourceType: Direct input; e) SQLStatement: SET IDENTITY_INSERT dbo.Table1 ON INSERT INTO dbo.Table1 SELECT * FROM DB2.dbo.Table2 WHERE TableKey = ?; f) BypassPrepare: False
2) Parameter Mapping: Variable Name: the new Integer Variable2; Direction: Input; DataType: Long; ParameterName: 0
Can somebody help me in this regard?
Reference: a) http://www.whiteknighttechnology.com/cs/blogs/brian_knight/archive/2006/03/03/126.aspx
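One thing worth checking, offered only as a hedged sketch (the column names below are placeholders, not taken from the post): SQL Server normally requires an explicit column list on the INSERT when IDENTITY_INSERT is ON, and switching it back OFF in the same batch keeps the session clean. Setting BypassPrepare to True on the task is also commonly suggested when a parameterized multi-statement batch fails. Something along these lines, still using the OLE DB "?" parameter marker:

SET IDENTITY_INSERT dbo.Table1 ON;

-- An explicit column list (including the identity column) is required here;
-- TableKey, Col1, Col2 are hypothetical column names.
INSERT INTO dbo.Table1 (TableKey, Col1, Col2)
SELECT TableKey, Col1, Col2
FROM DB2.dbo.Table2
WHERE TableKey = ?;

SET IDENTITY_INSERT dbo.Table1 OFF;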
Hi - I'm new to SSIS and am having problems figuring out how to do the following.
I need to load data from flat files into SQL Server 2005 and have created the data flows OK, but my data files are *not* located in a single directory, so I cannot use the Foreach File enumerator option in the Foreach Loop container collection. Please correct me if I'm wrong.
My approach has been to execute a SQL command to get the filenames from another database table and to use the Foreach ADO enumerator option, mapping the returned filenames to a project-scoped variable (data type Object, since it is a rowset).
My problem comes when I edit the properties of the connection manager to try to use that variable for the ConnectionString property in the expression editor. I get an error because the data type of the variable is not supported in an expression.
Can anyone tell me how to correct this or outline another way to solve my problem?
I have a package that loops over ~60 Excel files in a directory. Each file has three named ranges in it, which I import into different tables. Sometimes the package runs without a hitch, sometimes it chokes. But it is intermittent.
If I pull the control flow components out of the foreach loop and point the Excel connection manager to the specific Excel file that has caused the package to choke, I get a message in the data flow component pointing to the named range: "the metadata of the following output columns does not match the metadata of the external columns... Do you want to replace the metadata of the output columns with the metadata of the external columns?" When I choose 'Yes', the file will be loaded. Then I can put the control flow components back into the foreach loop and the file will run again, successfully, along with some more, until it chokes again...
So, first of all, does anyone have any insight into this? Sometimes, some days, these files will load with no problems - these exact files that I am having to reload constantly. Other times, like today, it is a battle.
Otherwise, is there a way to get Integration Services to handle the metadata issue on the fly?
Any ideas, resources, references, war stories, or good clean jokes would be appreciated, Kathryn
My requirement is that I have to read 2 sets of files from a folder. For example, I have to read all files starting with either 'a' or 'b' only. In the Foreach Loop, if I specify 'a*,b*', it does not work. Instead of the comma (,), I tried colon, semi-colon and pipe characters as well; none of them work. So I am using 2 loops now. But I would like to know: is there any way to do it using a single loop?
In all of our extract packages, we use a Foreach Loop container to grab files from the AS/400 sitting out in a certain directory. For this particular package, we have specified that the files should be named RP*.*. We know there are several such files out in the directory. The package runs without error and completes, but says no files were found in the directory with that name. What could be causing this issue? Thanks!
Hi, I am trying to load a bunch of Excel files into a table and am running into trouble.
I create a sequence container --> add a Foreach Loop control task and configure a variable; that part is OK, as I am displaying the file names through a script.
I am stuck on how to configure the next data flow task to load into a destination. I create an Excel source and then map the expression to the variable, but the destination SQL connection is not able to see this one.
Well, I am trying to follow the MSDN Help example on how to loop through Excel files and it doesn't work. The variables have project scope. What can I do?
I'm trying to capture the file name and insert it into the database with a Foreach Loop container and an Execute SQL task... However, when I run it, I get an error with the input parameter.
In my SQL Task, the parameter mapping is: Variable name: User::variable, Direction: Input, Data Type: VARCHAR, Parameter Name: @xVariable, Parameter Size: -1
ConnectionType: OLE DB; SourceType: Direct input; Statement: insert into xtable(xcolumn) values(@xVariable)
Does anyone have any other suggestions on how to capture the filename and insert it into the database?
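For reference, a hedged sketch only (with an OLE DB connection, as described above, the Execute SQL Task expects positional "?" markers whose parameter names are ordinals 0, 1, 2... rather than @-names):

-- Execute SQL Task statement (OLE DB connection), using the positional marker:
INSERT INTO xtable (xcolumn) VALUES (?);
-- Parameter Mapping: Variable Name = User::variable, Direction = Input,
-- Data Type = VARCHAR, Parameter Name = 0, Parameter Size = -1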
I use a Foreach Loop Container in an SSIS package. The package has to look in the directory 'f:\backups' for backup files and copy them into another folder. In my development environment it works fine. But if I run it on the SQL Server with the SQL Server Agent, the package always logs that the folder is empty. Unfortunately the message is always 'empty folder', even if I define 'f:\blabla' - a folder that does not actually exist!
As the file specification I tried both *.* and *.bak. My assumption is that the SQL Server Agent does not have enough rights on that folder. But on the other hand, the Agent is able to create backup files in this folder. The SQL Server Agent runs under the Network Service account.
How does the Foreach Loop container with the Foreach ADO enumerator in an SSIS package perform compared to the use of cursors in stored procedures?
Are there any articles comparing them?
I understand a lot of factors can affect performance; however, what is the expected performance of the Foreach ADO enumerator loop for a large dataset? What is Microsoft's recommendation for that - recommended or not recommended (for large datasets - over a million records)?
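By way of illustration only (the table and column names below are hypothetical), the row-by-row work that either a cursor or a Foreach ADO enumerator performs can often be replaced by a single set-based statement, which is usually the better-performing option on million-plus-row sets:

-- Row-by-row approach: one UPDATE per key, issued once per loop iteration.
-- Set-based alternative: one statement over the whole set.
UPDATE t
SET t.Status = s.Status
FROM dbo.TargetTable AS t
INNER JOIN dbo.StagingTable AS s
    ON s.TargetKey = t.TargetKey;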
I have no "Foreach File Enumerator" option in the Enumerator property of the Foreach Loop component.
I have this enumerator in the C:\Program Files\Microsoft SQL Server\90\DTS\ForEachEnumerators folder.
Also, I checked the GAC - it is not there. I tried to execute gacutil.exe -iF ForEachFileEnumerator.dll, but it failed with "Failure adding assembly to the cache: The module was expected to contain an assembly manifest." It seems it is not a managed enumerator.
Please help me.
Also, information on how to register unmanaged enumerators is welcome!
I have a stored proc that uses xp_cmdshell to boot off a batch file on the NT side of the box (box OS is Windows 2000 Advanced Server).
Here is the pertinent code:

/*----- Kick off the NT bat job to suck over the data through the web service pipe */
SELECT @NTCommand = 'D:\TradeAnalysis\WondaDataStore\Jobs\PullFromWONDA_InstitutionalRankings.bat '
    + CONVERT(varchar(10), @ReqDate, 101)
EXECUTE @e_error = master.dbo.xp_cmdshell @NTCommand
SELECT @m_error = CASE WHEN ISNULL(@e_error, 0) <> 0 THEN (@e_error + 50000) ELSE @@Error END
IF @m_error <> 0 GOTO ErrorHandler

The trouble is that the batch file is failing (a soft error, caught internally to the batch file, which then kills itself, screaming loudly all the way).
The batch file is using the following "voice" in which to scream in pain as it dies (a.k.a. using this code to terminate itself, which kills the cmd shell and returns 13 as an error code):

REM WonDBService.exe says we failed
Date /T
Time /T
EXIT /B 13
Meanwhile, back at the ranch (errr... back in the stored procedure), what is being returned is a ZERO: in the first code block, @e_error is being set to ZERO when xp_cmdshell returns from the bat file.
So, now that I have ruled out the obvious *LOL* how can I get my xp_cmdshell to realize it has failed miserably at the one, tiny, simple, not-too-much-to-ask, job that it is designed to do?
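One hedged workaround sketch (the WONDA_FAILED marker is hypothetical and would have to be ECHOed by the batch file itself): instead of relying on xp_cmdshell's return value, capture the batch file's console output into a table and inspect it.

CREATE TABLE #CmdOutput (Line varchar(4000) NULL)

-- @NTCommand is built exactly as in the original procedure
INSERT INTO #CmdOutput (Line)
EXECUTE master.dbo.xp_cmdshell @NTCommand

-- The batch file would need to ECHO WONDA_FAILED (hypothetical marker) before it EXITs
IF EXISTS (SELECT 1 FROM #CmdOutput WHERE Line LIKE '%WONDA_FAILED%')
    GOTO ErrorHandler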
I have a number of stored procedures that run one after the other. How do you code it so as to get the success or failure of each, so that some logic can be applied accordingly? I've heard of the TRY...CATCH structure, but I am new and have yet to use it. How many different ways can success or failure be handled in code?
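A minimal hedged sketch of the TRY...CATCH pattern (the procedure names are placeholders); older alternatives include checking @@ERROR after each statement or capturing a return code with EXEC @rc = procname:

BEGIN TRY
    EXEC dbo.usp_StepOne;   -- hypothetical procedure names
    EXEC dbo.usp_StepTwo;
END TRY
BEGIN CATCH
    -- Control jumps here as soon as a statement in the TRY block raises an error
    PRINT 'Step failed: ' + ERROR_MESSAGE();
    -- apply whatever logic is appropriate: log, skip, compensate, or re-raise
END CATCH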
When I run my package, a task will fail, however, the package will claim that it was successful. Why is this, and how can I trigger a failed package when one task fails?
Please excuse my ignorance as I'm a complete noob when it comes to vb.net.
I have 2 script tasks, each connected to an upstream task via Success and Failure constraints. Each script assigns a value to a variable, depending on whether the task succeeds or fails.
My code thus far is:
Code snippet:

Public Sub Main()
    Dts.Variables("strEmailBody").Value = "Business Model Reporting Control Complete - Status = Success"
    Dts.TaskResult = Dts.Results.Success
End Sub
What I want to do is use a single script task that, depending on the success or failure of the package, sets the variable value accordingly:
If there are no errors Then
"Success"
Else
"Failure"
I've tried
Code snippet:

If CBool(Dts.Results.Success) Then ...
But whilst it compiled, it didn't evaluate correctly at runtime.
Can anyone suggest where I'm going wrong? Again I'm totally new to .net and I'm surprised I've gotten this far!
Does anybody have any experience/advice on how to ensure that SOAP service call success/failures are returned to the calling app?
Consider a client that calls a SOAP service and goes down during the call, so it is unable to receive the SOAP response even though the work has been done by the service. Similarly, the SOAP service may perform the task, but a failure in the return makes the client think the process failed.
What would be the best way to ensure that the client is notified to avoid the call having to be made again?
Are there middleware tools that can be used to provide a form of message queuing for SOAP service calls?
I have an SSIS package that gets data from SQL Server, does a data conversion and inserts into an OLE DB AS400 destination. There is a flag column in the SQL table that has to be updated to true once the records are inserted in AS400. How do I do that in SSIS?
SQL OLE DB source ---> data conversion ---> AS400 OLE DB destination ---> update SQL table flag column
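One hedged sketch of that last step: an Execute SQL Task connected to the Data Flow Task with a Success precedence constraint, running something like the statement below (table and column names are placeholders):

-- Runs only after the data flow that loads AS400 succeeds
UPDATE dbo.SourceTable
SET TransferredFlag = 1
WHERE TransferredFlag = 0;   -- or restrict to the keys that were just loaded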
I have an SSIS package that executes a stored procedure. In that stored procedure is a TRY/CATCH block. If the TRY isn't successful, it goes to the CATCH block, which does a rollback. So when I execute the SSIS package, it tells me that the stored procedure ran successfully, because essentially there were no errors and everything ran fine; but in reality, every time it goes into the CATCH block and does a rollback, I want the SSIS package to fail as well. How would I send back a failure to the SSIS package from the stored procedure?
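A hedged sketch of one common approach - re-raise the error from the CATCH block so the calling Execute SQL Task sees a failure (the CATCH body below is illustrative, not the poster's actual procedure):

BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    DECLARE @msg nvarchar(2048);
    SELECT @msg = ERROR_MESSAGE();

    -- Severity 16 is enough for the error to propagate to the SSIS task and fail it
    RAISERROR(@msg, 16, 1);
END CATCH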
I have configured SMTP mail in SQL Server and configured e-mail notification for a scheduled job when the job fails. Can I configure e-mail so that it is sent from the scheduled job on both success and failure of the job?
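A hedged sketch using the msdb job procedures (the job and operator names below are placeholders; notify level 3 means "whenever the job completes", i.e. on both success and failure):

EXEC msdb.dbo.sp_update_job
    @job_name = N'MyScheduledJob',                   -- hypothetical job name
    @notify_level_email = 3,                         -- 1 = on success, 2 = on failure, 3 = always
    @notify_email_operator_name = N'DBA Operator';   -- hypothetical operator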
We are using SSIS to load some 100k records from a flat file to an Oracle destination. We are using the Oracle 10g client. But during execution, after some 5 or 6 hours, with 900k records uploaded, we get the message "Package execution completed." In the execution results there is no message related to success or failure, and the tasks in the Data Flow were yellow in color. What might be the problem? Any information regarding this case would be helpful for us.
I need to execute about a dozen packages from another package... how do I dynamically pass the dozen package names to the parent package and execute them using a foreach loop?
The idea is to store the names of the packages in a text file and set the file connection property by reading each package name from the text file... In this way I can just configure/edit the text file from time to time to control which packages and units I want to execute.
The foreach loop below runs fine and it sends emails as expected, but the SP does not update the Companies table. Any thoughts? How do I cause the SP to execute?

PROCEDURE newdawn.EmailSentDate
    @CID int
AS
    UPDATE Companies
    SET Companies.LastEmailDate = (GetDate())
    WHERE tblCompanies.C_ID = @CID
RETURN

foreach (GridViewRow row in GridView3.Rows)
{
    pstrTo = row.Cells[2].Text;
    CEmail = row.Cells[2].Text;
    CName = row.Cells[3].Text;
    CID = row.Cells[1].Text;
    CState = TextBox1.Text;
    CCity = row.Cells[5].Text;
    CCat = row.Cells[4].Text;
    CCID = row.Cells[0].Text;

    SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["localhomeexpoConnectionString2"].ConnectionString);
    SqlCommand cmd = new SqlCommand("EmailSentDate", con);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@CID", CCID);
    try
    {
        System.String mailServerName = "mail.domain.com";
        // REST of code sends email to the email address found in each row

...it all works fine, but the SP above does not fire.
Sorry that I open a new thread again, but I didn't find any good tutorials about the Foreach Loop container. I would like to read every row from one table and do something. I read something about the Execute SQL task... Can someone give me a little instruction, please?
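For orientation only, a hedged sketch of the usual pattern (the table and column names below are placeholders): an Execute SQL Task with ResultSet = Full result set stores a query like this in an Object variable, and a Foreach Loop with the Foreach ADO enumerator then walks the rows, mapping each column to a package variable:

-- Execute SQL Task (ResultSet = Full result set), result stored in an Object variable
SELECT CustomerID, CustomerName
FROM dbo.Customers;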
I am using a data flow task to import from a flat file to a database, but I need to use a Foreach Loop container to import multiple files from a specific folder, and all of them will be inserted into the same table.
Help with looping in an SSIS package. Scenario: We have a web app that lets our existing clients insert new locations into a table (Clients) in a SQL Server DB. This table has an Identity column as the PK. This table also has another ID field, HBID, that is the PK for another table (HLocation) in another database system (Sybase). The HBID field is given a default value of '1' during an insert operation in our (SQL Server) table Clients. The Sybase database uses a Sequence table for inserting a PK into the table. Whenever our clients insert a new record into Clients (SQL Server), I need to generate a new HBID from the Sequence table in Sybase, update the HBID in the Clients table (SQL Server) and then finally insert the record with the new HBID into table HLocation (Sybase). I have devised an SSIS package for this as follows. Note all variables are scoped at the package level.

Execute SQL Task #1, which drives a For Each Loop, with the following properties:

General: Connection Type: OLE DB; Connection: SQL Server DB; SQL SourceType: Direct Input; SQLStatement: SELECT COID, HBID, Name, DateCreated FROM Client WHERE HBID = '1'
ResultSet: Full result set
Result Set: Result Name = 0, Variable Name = USER::rsClient, Variable Type = Object
ForEachLoop with the following properties:
Collection: Foreach ADO Enumerator; ADO Object Source Variable = USER::rsClient; Enumeration Mode = Rows in the first table. Variable Mappings (Variable : Index): COID : 0, HBID : 1, Name : 2, DateCreated : 3
Inside of the ForEachLoop I have another Execute SQL Task to generate a new HBID from Sybase set up as following.
Execute SQL Task #2
Connection Type: OLE DB Connection: Sybase SQL SourceType: Direct Input SQLStatement: UPDATE Autoinc SET INC_LAST = (INC_LAST+1) WHERE INC_KEY ='HBID'; SELECT INC_LAST AS NewHBID FROM Autoinc WHERE INC_KEY ='HBID'
ResultSet = SingleRow
Result Set: Result Name = NewHBID VariableName = USER::NewHBID, VariableType = Int32
Also inside the ForEachLoop is a Script Task that has all of my variables as ReadWrite: COID, HBID, Name, DateCreated and NewHBID. I concatenate the values into a string and pass the string into a message box to make sure they are looping correctly, and they are. For example, the results might look like the following:
12698, 1, John Doe Trucking, 10/1/2007, 14550
13974, 1, Joe Smith Trucking, 10/1/2007, 14551
10542, 1, Dave Jones Trucking, 10/1/2007, 14552
etc.
The values 14550 -14552 are the new HBID being generated in the loop.
The problem is that when I try to Update the HBID in the Client table (SQL Server) with another Execute SQL Task I keep getting the same NewHBID number. In this case 14550 would be updated for every record instead of the next number in the loop.
I have set up Execute SQL Task #3 as follows:
General: Connection Type: OLE DB Connection: SQL Server DB SQL SourceType: Direct Input SQLStatement: UPDATE Client SET HBID = ? WHERE HBID = '1'(SELECT COID, HBID, Name, DateCreated FROM Client)
ResultSet: = Full ResultSet
Result Set: Result Name = 0 VariableName = USER::rsNewClient, VariableType = Object
Parameter Mapping: VariableName USER::NewHBID, Direction = INPUT, DataType = Long ParameterName = 0
I've tried putting Execute SQL Task #3 inside of the ForEachLoop, connecting it to the output of the ForEachLoop. I've tried setting up a data flow with a Derived Column using USER::NewHBID as the expression.
I still get the same results, 14550 added to every row.
Can any one help or shed some light?
Any and all suggestions will be deeply appreciated!
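For illustration only, a hedged sketch of what Execute SQL Task #3's statement could look like if it were keyed on the row currently in the loop, with a second parameter mapped to the User::COID variable (this is a sketch of the parameterized UPDATE pattern, not the poster's tested solution):

-- Parameter 0 -> User::NewHBID, Parameter 1 -> User::COID (both set in the task's Parameter Mapping)
UPDATE Client
SET HBID = ?
WHERE COID = ?
  AND HBID = '1';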