Hi, is it possible to copy a row from Input0_ProcessInputRow into another variable of a synchronous script component? I have a script component that has three outputs. What I would like to do is the following:
On the first iteration, copy the entire row into a variable.
For the other rows, if the business key columns or the type 2 columns differ from the previous row (the one copied at the previous iteration):
1. direct the previous row (from the variable) to the second output
2. direct the current row to the first output
3. copy the current row into the variable
If the business key columns and the type 2 columns are both the same:
1. direct the row to the third output
My problem is that when I declare a variable as Input0Buffer and copy the current row into it, the variable is updated at each subsequent iteration. What I want is to keep the value of the previous row. I know I can do it by copying the content of every column into separate variables, but since some script components will have 150+ columns, I would like to find a simpler way (something like a copy of the structure of the input buffer).
The reason I use a script component is that the lookups are not updated dynamically: the rows I need to process may not exist in the target table, and once inserted, they are not retrieved subsequently by the buffer.
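For later readers, here is one way to attack the "keep the previous row without 150 variables" problem: snapshot each row into a Dictionary using reflection over the generated buffer class. This is only a sketch under assumptions — the buffer type (Input0Buffer) is whatever your component generates, and the BusinessKey column in the comparison is a made-up placeholder for your own key logic:

    Imports System.Collections.Generic
    Imports System.Reflection

    Public Class ScriptMain
        Inherits UserComponent

        ' Holds a detached copy of the previous row's column values.
        Private previousRow As Dictionary(Of String, Object)

        ' Copy every column property of the generated buffer into a plain
        ' dictionary, honouring the companion "_IsNull" flags.
        Private Function SnapshotRow(ByVal Row As Input0Buffer) As Dictionary(Of String, Object)
            Dim copy As New Dictionary(Of String, Object)
            Dim flags As BindingFlags = BindingFlags.Public Or BindingFlags.Instance Or BindingFlags.DeclaredOnly
            For Each p As PropertyInfo In Row.GetType().GetProperties(flags)
                If p.Name.EndsWith("_IsNull") Then Continue For
                Dim isNullProp As PropertyInfo = Row.GetType().GetProperty(p.Name & "_IsNull")
                If isNullProp IsNot Nothing AndAlso CBool(isNullProp.GetValue(Row, Nothing)) Then
                    copy(p.Name) = Nothing
                Else
                    copy(p.Name) = p.GetValue(Row, Nothing)
                End If
            Next
            Return copy
        End Function

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            If previousRow Is Nothing OrElse _
               Not Object.Equals(previousRow("BusinessKey"), Row.BusinessKey) Then
                ' ... direct the previous/current rows to the proper outputs here ...
            End If
            previousRow = SnapshotRow(Row)   ' keep this row for the next iteration
        End Sub
    End Class

Because the dictionary holds plain values rather than a reference into the buffer, it survives the buffer moving on to the next row.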
In a Data Flow, I need to use an SSIS variable of type "Object" inside a Script Component and assign to it the content of n variables of string type. On exiting the script, the variable of type Object should contain something like the following lines:
AAAAAAAAAAAAAAAAAAAAAAAAAAAAA
BBBBBBBBBBBBBBBBBBBBBBBBBBBBB
CCCCCCCCCCCCCCCCCCCCCCCCCCCCC
DDDDDDDDDDDDDDDDDDDDDDDDDDDDD
...
On exiting the data flow I will use the Object variable in a Script Task, reading each element in a cyclic fashion. Has anyone experienced something like this? Could anyone provide an example? Thanks in advance!
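A sketch of one common pattern (all names here are invented): build an ArrayList, add the strings to it, and assign it to the Object variable in PostExecute; the Script Task later casts it back.

    ' In the Script Component, with the string variables listed in
    ' ReadOnlyVariables and the Object variable (User::MyObjectVar) in
    ' ReadWriteVariables:
    Public Overrides Sub PostExecute()
        Dim list As New System.Collections.ArrayList()
        list.Add(Me.Variables.VarA)   ' "AAAAAAAA..."
        list.Add(Me.Variables.VarB)   ' "BBBBBBBB..."
        list.Add(Me.Variables.VarC)   ' "CCCCCCCC..."
        Me.Variables.MyObjectVar = list
    End Sub

    ' Later, in a Script Task after the data flow:
    ' Dim list As ArrayList = CType(Dts.Variables("MyObjectVar").Value, ArrayList)
    ' For Each s As String In list
    '     ' ... process each element in turn ...
    ' Next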
My current project copies source tables to destination tables. The structure is identical on both sides, so dragging source and destination components is a rather repetitive process.
So I want to create a component which should do the following:
1. select the source connection
2. select the source tables
3. select the destination connection
4. select the destination tables
5. map each source table to a destination table
6. validate the column mappings based on the names
As an enhancement, I would also like to support creating and mapping custom values.
So I would really appreciate any kind of input in creating this component. Currently I am reading tutorials on creating SSIS components.
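Not an answer on the custom-component route itself, but one alternative worth weighing: repetitive source-to-destination copies can also be generated with the SSIS runtime API instead of a hand-built pipeline component. A rough, untested sketch — the class IDs are the SQL Server 2005 ProgIDs, and the connection string and output path are placeholders; wiring up the connection, calling ReinitializeMetaData, and mapping columns take more plumbing than shown:

    Imports Microsoft.SqlServer.Dts.Runtime
    Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper

    Module GeneratePackage
        Sub Main()
            Dim pkg As New Package()
            Dim conn As ConnectionManager = pkg.Connections.Add("OLEDB")
            conn.Name = "Source"
            conn.ConnectionString = "Provider=SQLNCLI;Data Source=myserver;" & _
                                    "Initial Catalog=mydb;Integrated Security=SSPI;"

            ' Add a data flow task and reach its pipeline.
            Dim exe As Executable = pkg.Executables.Add("STOCK:PipelineTask")
            Dim pipe As MainPipe = CType(CType(exe, TaskHost).InnerObject, MainPipe)

            ' Add an OLE DB source; a destination is added the same way with
            ' "DTSAdapter.OLEDBDestination.1", then connected via
            ' pipe.PathCollection.New() and mapped column-by-column on name.
            Dim src As IDTSComponentMetaData90 = pipe.ComponentMetaDataCollection.New()
            src.ComponentClassID = "DTSAdapter.OLEDBSource.1"   ' 2005 ProgID, verify on your install
            Dim inst As CManagedComponentWrapper = src.Instantiate()
            inst.ProvideComponentProperties()

            Dim app As New Application()
            app.SaveToXml("C:\generated\CopyTables.dtsx", pkg, Nothing)
        End Sub
    End Module

Run once per table pair, this gives you generated .dtsx files instead of a reusable component, which may or may not fit your deployment model.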
I have developed a simple package utilizing a script component and a variable. What I am trying to do is declare a variable in the parent package that can be changed at run time to either 1 or 0. If 1, I want the package to succeed and branch off to run some child packages; if 0, branch the other way and run another child package. Either way, I want to load one of the two child packages, not both.
I declare a variable in my parent package like IsExist, Int32, with default 0.
Then I have this code, thanks partly to Jamie Thomson's "SSIS: Writing to a variable from a script task".
I also place the variable name in the ReadWriteVariables in the script component.
The problem now is that it is always failing: it stops at the script task and fails. This is not what I want; I want the package to execute another task upon failure. Does anyone know what I am doing wrong?
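One hedged observation: a Script Task that returns Failure fails the package by default (MaximumErrorCount), which matches the symptom. A common alternative is to let the task always succeed and branch with expression-based precedence constraints (@IsExist == 1 on one arm, @IsExist == 0 on the other). If you do want the task result driven by the variable, the Main would look roughly like this, with IsExist listed in ReadOnlyVariables — but note a Failure result still bubbles up unless FailParentOnFailure/MaximumErrorCount are adjusted:

    Public Sub Main()
        ' Branch on the parent-package variable; 1 = success path, 0 = failure path.
        If CType(Dts.Variables("IsExist").Value, Integer) = 1 Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub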
I am writing a custom transformation component that utilises a user variable.
Before using the variable at run time I am checking that the variable exists, but it would be nice to be able to add it if it does not. I cannot find any documentation on the subject, though I can see that the Variables class is derived from a ReadOnlyCollectionBase.
Is there a way to add a user variable with package scope from a custom component, either at run time or design time?
I am trying to load data into a SQL table from a flat file. I have a script component which validates the data, and I fire errors when validation fails. Once I encounter an error, I want the whole package to be aborted. What I have done is declare a variable called errorcount (scope: package) which I increment every time an error is raised; a Conditional Split then checks whether this count is greater than 0, and if so I do not update the table. This is the only way I think this will work.
The error I am getting is: "The collection of variables locked for read and write access is not available outside of PostExecute."
I understand they are in the same data flow and that's why I get this error, but how can I bypass it?
My data flow: Flat File Source -> Script Component -> Conditional Split (if errorCount == 0) -> SQL table.
How can I redo the data flow so that this error is not raised?
I have tried adding an output column (blnContinue) and using it in the conditional statement, but since that is stored at row level, rows whose blnContinue is true still get processed. I don't want that; I want to stop package execution even if a single validation fails.
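One sketch of a workaround (the variable name is from the post, the validation call is a placeholder): keep the count in a local field, copy it to the package variable in PostExecute where read/write variables are legal, and use FireError when you want the data flow — and, with the default MaximumErrorCount, the package — to stop on the first bad row:

    Private badRows As Integer = 0

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        If Not RowIsValid(Row) Then        ' RowIsValid = your existing checks
            badRows += 1
            Dim cancel As Boolean = False
            ' Failing the component aborts the data flow immediately.
            ComponentMetaData.FireError(0, "Validation script", _
                "Row failed validation.", String.Empty, 0, cancel)
        End If
    End Sub

    Public Overrides Sub PostExecute()
        ' Read/write variables are only available here.
        Me.Variables.errorcount = badRows
    End Sub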
Hello, I have a 'ForEach Loop Container' that does a select from a table and I use Variable Mappings to map each row set result to Package Variables in SSIS. This works fine. However, in the Data Flow, I have a Script component in which I need to access the values of those variables that have been set in the ForEach Loop Container. How do I do this?
Perhaps I'm on the wrong track but I guess I need to access the ADO Object Source variable set in the ForEach Loop Container? Through my own experimentation, I know that accessing the VariableDispenser collection only returns whatever default values that happen to be assigned to the variables in the SSIS GUI (not the values assigned during each iteration of the ForEach Loop).
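In case a sketch helps (the variable name is invented): list the mapped variables in the Script Component's ReadOnlyVariables property and read them through Me.Variables. Because the whole data flow re-executes on every loop iteration, the variable lock happens per iteration, so you should get that iteration's values rather than the design-time defaults you see through VariableDispenser:

    ' ReadOnlyVariables = "CustomerID" (set in the component's properties)
    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim currentId As Integer = Me.Variables.CustomerID
        ' ... use the current iteration's value ...
    End Sub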
I have a script component that contains a script similar to the one shown below:
Public Sub Main()
    Dim sqlConn As New SqlConnection("Server=myserver;Database=mydb;Trusted_Connection=True;")
    Dim SQLStmt As String
    SQLStmt = "select top 10 CountID from mytable"
    Dim sqlComm As New SqlCommand(SQLStmt, sqlConn)
    sqlConn.Open()
    Dim r As SqlDataReader = sqlComm.ExecuteReader()
    While r.Read()
        Dim MyValue As Integer
        If r("CountID") IsNot DBNull.Value Then
            MyValue = CInt(r("CountID"))
            ' HERE I WANT TO WRITE MyValue TO A PACKAGE VARIABLE AS A ROW
        End If
    End While
    r.Close()
    sqlConn.Close()
    Dts.TaskResult = Dts.Results.Success
End Sub
Once I have a value assigned to MyValue, I want to write it as a row to a package variable (data type Object, value System.Object), which will act as an ADO recordset destination that I can then loop through later.
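A sketch of one way to do that (the variable name is invented and must appear in the task's ReadWriteVariables): fill a DataTable instead of walking a DataReader, then hand the table to the Object variable — the Foreach ADO enumerator can iterate a DataTable directly:

    Imports System.Data
    Imports System.Data.SqlClient

    Public Sub Main()
        Dim dt As New DataTable()
        Using sqlConn As New SqlConnection("Server=myserver;Database=mydb;Trusted_Connection=True;")
            Dim da As New SqlDataAdapter("select top 10 CountID from mytable", sqlConn)
            da.Fill(dt)   ' Fill opens and closes the connection itself
        End Using
        Dts.Variables("MyRecordset").Value = dt   ' Object-typed package variable
        Dts.TaskResult = Dts.Results.Success
    End Sub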
I have a package that has a data flow task. This task imports data from a DB2 database (using the IBM OLE DB provider for DB2) and adds it to a SQL Server database table. This package was created on the server; then, through version control (TFS source control), I check out the package on my local machine, and when I open the package I get the following three errors.
Error 1 Validation error. Import Account Num from BMGP_BDR: DTS.Pipeline: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
Error 2 Error loading BMAG Download Xref Tables - bmag.dtsx: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of component "DataReader Source" (1113) is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]] at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper90 wrapper, Int32 lPipelineVersion)
Error 3 Error loading BMAG Download Xref Tables - bmag.dtsx: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I have a package which reads an Access file from a folder. My connection manager to this file uses the .NET Providers for OLE DB \ Microsoft Jet 4.0 OLE DB Provider.
The package works from my computer, but when I execute it on the server as a SQL Agent job, I get:
The component metadata for "component "DataReader Source" (1) could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I copied the mdb file to a folder on the server which my packages have no problem reading data from.
My packages run under the same domain account as defined in proxies.
In my SSIS package I loop over multiple flat files to load data into a target table, using one variable to hold the current file name. I have an error log table used to log the invalid records that come from the various files.
All of this works fine, except that I also want to log the file name in my error log table. How can I assign the package variable's value to a column in the script component, so that I can map the output column to the target table? I have already created a Filename column on the script component's Output Columns tab.
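A minimal sketch, assuming the variable is named User::FileName, it is listed in the component's ReadOnlyVariables, and Filename is the output column created on the Output Columns tab:

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Stamp every (error) row with the file currently being processed.
        Row.Filename = Me.Variables.FileName
    End Sub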
I am creating a custom data flow component for SSIS for use in a package. I've got some custom properties that I am exposing using the supplied advanced editor (no custom property editor here).
Some of my properties are enumerated types, and I have deciphered how to get those properties to show as dropdown lists of their respective enumerations. (For those of you who may be looking as hard as I did as to how to accomplish this, see the end of this post.)
I also have a few properties which request SSIS package variable names - such as an file name variable. However, I can't figure out how to tell the advanced editor that the property is looking for an SSIS variable, so that it can show a dropdown list of package variables, much like virtually any other Microsoft supplied Data Flow component can.
Is there a Type Converter I could specify for those custom properties? Is there another way to instruct SSIS that my custom property is expecting a variable? Or do I need to code a custom UI for editing my Data Flow Task?
To create a dropdown list of values for a custom property that represents an enum, do the following:
1. Create your enum definition, such as "public enum ThisIsMyEnum { one, two }"
2. Create a new class that inherits from TypeConverter, such as "public class MyEnumConverter : TypeConverter"
3. Override "CanConvertFrom", and return true if "sourceType == typeof(string)"
4. Override "CanConvertTo", and return true if "destinationType == typeof(string)"
5. Override "ConvertFrom", and return the enum value (such as "one" or "two" in my example) that corresponds to the string passed in the parameter "value"
6. Override "ConvertTo", and return a string that corresponds to the enum value passed in the parameter "value"
7. Override "GetStandardValuesSupported" and return true
8. Override "GetStandarValuesExclusive" and return true to indicate that ONLY the enum values should be accepted
9. Override "GetStandardValues", and return a new StandardValuesCollection constructed with Enum.GetValues() of your enum, such as "return new StandardValuesCollection(Enum.GetValues(typeof(ThisIsMyEnum)));"
10. Just above your "public enum" declaration, add a "TypeConverter" attribute to link your type converter to your enum, such as "[TypeConverter(typeof(MyEnumConverter))]"
11. In "ProvideComponentProperties", after you've created your custom property like this: "IDTSCustomProperty90 propEnum = ComponentMetaData.CustomPropertyCollection.New()", add another line to specify the TypeConverter property of the property to the full assembly name of your type converter, like so: "propEnum.TypeConverter = typeof(MyEnumConverter).AssemblyQualifiedName;"
I have a Data Flow Script Component(Destination Type) and in the properties I have a read/write variable called User::giRowCount
User::giRowCount is populated by a Row Count Component previously in the Data Flow.
After reading http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=97494&SiteID=1 it is very clear that you can actually only use variables in the PostExecute of a Data Flow Script Component or you will get an error "Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException: The collection of variables locked for read and write access is not available outside of PostExecute."
What I need to do is create the file in PreExecute and write the number of records (= User::giRowCount) as the second line of the header. I also need to read a read/write variable such as gsFilename there, to save me hardcoding the path.
So both variables must go in the PreExecute sub. Are there any workarounds? Here is the complete script component that creates a file with header, data and trailer.
Public Class ScriptMain
    Inherits UserComponent

    'Dim fs As FileStream
    Dim fileName As String = "F:\FilePickUp\Myfilename.csv"
    'Dim fileName = (Me.Variables.gsFilename.ToString)
    Dim myFile As FileInfo = New FileInfo(fileName)
    Dim sw As StreamWriter = myFile.CreateText
    Dim sbRecord As StringBuilder = New StringBuilder

    Public Overrides Sub PreExecute()
        sbRecord.Append("RECORD_START").Append(vbNewLine)
    End Sub

    Public Overrides Sub ParsedInput_ProcessInputRow(ByVal Row As ParsedInputBuffer)
        ' (row-level data appends omitted from the original post)
    End Sub

    Public Overrides Sub PostExecute()
        sbRecord.Append("RECORD_COUNT").Append(vbTab)
        sbRecord.Append(Me.Variables.giRowCount.ToString).Append(vbNewLine)
        sbRecord.Append("RECORD_END").Append(vbNewLine)
        'Now write to file before next record extract
        sw.Write(sbRecord.ToString)
        'Clear contents of String Builder
        sbRecord.Remove(0, sbRecord.Length)
    End Sub
End Class
I am trying to develop an SSIS package which will read records from a flat file and insert them into a destination table. I have some validations written in a script component, and I have declared two read/write variables with package-level scope. When I try to assign a value to one of the variables in the script component and run the package, the package throws the error "The collection of variables locked for read and write access is not available outside of PostExecute".
What should be done to overcome this problem? Please help me in this regard.
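The workaround most often suggested for both of these cases (a sketch; the variable names are from the posts): skip the ReadOnlyVariables/ReadWriteVariables lists and lock the variables manually through the component's VariableDispenser wherever you need them. One caveat for the row-count case: a Row Count component only writes User::giRowCount when the data flow finishes, so no PreExecute anywhere can see the final count — hence the usual trick of writing a placeholder header and patching the count afterwards (or appending it in PostExecute, as the script above already does).

    Public Overrides Sub PreExecute()
        ' Manual lock: legal outside PostExecute, unlike Me.Variables.
        ' (IDTSVariables90 comes from Microsoft.SqlServer.Dts.Runtime.Wrapper.)
        Dim vars As IDTSVariables90 = Nothing
        Me.VariableDispenser.LockOneForRead("gsFilename", vars)
        Dim fileName As String = CStr(vars("gsFilename").Value)
        vars.Unlock()
        ' ... create the file and write the header lines here ...
    End Sub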
Hi all, I'm on a project which uses a lot of views for joining 2 or more tables. Using the MERGE component in SSIS would be a huge effort because it only has 2 inputs and I have to SORT the input too. Isn't it possible to have a VIEW-like component that joins more than 2 tables and doesn't need sorting? (I've thought about creating views in the database engine, but that breaks my data flow in SSIS and isn't a practical solution.)
I am writing a custom dataflow transformation component and I need to get the name of the preceding component.
I have been trying to find a way to get a reference to the Package object, the MainPipe object, or the IDTSPath90 object (connecting to the IDTSInput90 of my component) from within my component, because I think from there I can get to the information I want.
No idea where this bug crept in from. Have been using SSIS for 1.5 years now without hitting this problem.
I had a script component opening an XML document and parsing it using XPath. I added some code that uses StreamReader / StreamWriter (closing one stream before starting the other). The code works without issue in my C# app.
And it ran without issue 2-3 times in SSIS. Then suddenly after running my package again, the script component says it completes successfully, yet nothing happens. I set a breakpoint on the first line of code - it never hits it. I add a msgbox as the first line of code - and it never displays.
I then close my package / exit out of ssis ... and then re-open it. When i open my script component, all of my code is GONE. All references that I added are gone.
I tried adding the streamreader/writer process to a dll I created from my c# app ... and added the DLL to the package -- same result.
I can reproduce this on 2 different computers.
Anyone experience this problem ? Any idea how to stop it ? Or debug it ?
Here is a slimmed down code sample of what causes the error :
Public Class ScriptMain

    Public Sub Main()
        Try
            Dim xmlDoc As New XmlDocument
            xmlDoc.Load("c:\ulkasync_86281519_20070628045850225_4.xml")
            MsgBox("xmlLoaded") ' this doesn't display once the package starts "acting up"
        Catch ex As Exception
            MsgBox(ex.Message)
            UpdateXML("c:\ulkasync_86281519_20070628045850225_4.xml", ex.Message)
        End Try
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Sub UpdateXML(ByVal fileName As String, ByVal message As String)
        Try
            Dim invalidChar As String = message.Trim().Substring(message.Trim().IndexOf("0x"), 4)
            Dim rd As StreamReader = New StreamReader(fileName)
            Dim xml As String = rd.ReadToEnd()
            xml = xml.Replace(invalidChar, String.Empty)
            xml = xml.Replace("", String.Empty) ' (search literal lost in the original post)
            xml = xml.Replace("<![CDATA[<![CDATA[", "<![CDATA[")
            xml = xml.Replace("]]>]]>", "]]>")
            MsgBox("replaced")
            rd.Close()
            Dim wr As StreamWriter = New StreamWriter(fileName)
            wr.Write(xml)
            wr.Close()
            Dim xdoc As XmlDocument = New XmlDocument()
            xdoc.Load(fileName)
        Catch ex As Exception
            UpdateXML(fileName, ex.Message)
        End Try
    End Sub

End Class
I set up DB mirror between a primary (SQL1) and a mirror (SQL2); no witness. I have a problem when I issue command:
alter database DBmirrorTest
Set Partner = N'TCP://SQL2.mycom.com:5022';
go
The error message is:
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
I have the steps below prior to the command. (Note that both servers' service accounts use the same domain account. The domain account I login to do db mirror setup is a member of the local admin group.)
1. backup database DBmirrorTest on SQL1
2. backup database log
3. copy db and log backup files to SQL2
4. restore db with norecovery
5. restore log with norecovery
6. create endpoints on both SQL1 and SQL2
CREATE ENDPOINT [Mirroring]
STATE=STARTED
AS TCP (LISTENER_PORT = 5022, LISTENER_IP = ALL)
FOR DATABASE_MIRRORING (ROLE = PARTNER)
7. enable mirror on mirror server SQL2
:connect SQL2
alter database DBmirrorTest
Set Partner = N'TCP://SQL1.mycom.com:5022';
go
8. Enable mirror on primary server SQL1
:connect SQL1
alter database DBmirrorTest
Set Partner = N'TCP://SQL2.mycom.com:5022';
go
This is where I got the error.
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
Hi! I did:
alter database mydb set single_user with rollback immediate;
exec sp_detach_db @dbname='mydb', @keepfulltextindexfile='true';
Then I tried to copy the files to a new location (other drives, same server) but got: "Cannot copy <myfile>: Access is denied. Make sure the disk is not full or write-protected and that the file is not currently in use."
I also tried renaming the file, without success. I also tried with the database service stopped (not preferred), without success.
How do I find out which process is locking the files? Best regards
If I have a given database (a model) and I want to copy this database within the same database instance, is it OK to copy the mdf and ldf files and attach the files under a new database name in the same instance?
I am attempting to use the copy wizard to copy databases from SQL Server 2005 to SQL Server 2008 R2 w/ FP1.
The copy fails with a login failure to SQL Server 2005. I have a user id & password under Windows for both servers. I have a user id and password under SQL security with the called for admin security rights.
The 2005 server has two instances, 20 databases, two dozen maintenance plans, and over a hundred users. I really would like to use the utility so I don't have to recreate everything manually.
Before implementing memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations:
- performance: how does it compare with T-SQL's "BULK INSERT ..." and the bcp utility?
- SQL Server resource usage: when running a memory-based bulk copy, how are the server's resources affected?
- server-side behavior: when the server is busy, are updates delayed, or does IRowsetFastLoad::Commit(true) make the rows available right after the insert?
- row count: is there a limit on how many rows can be inserted with IRowsetFastLoad::InsertRow() before calling IRowsetFastLoad::Commit?
I'm working on an SSIS package that uses a VB.NET script to grab some XML from a web service (I'd explain why I'm not using a Web Service Task here, but I'd just get angry). I then want to assign the XML string to a package variable, which gets sent along to a Data Flow Task containing an XML Source that points at said variable. When I copy the XML string into the variable value in the script, a quick watch on the variable (as in Dts.Variables("MyXML").Value) makes it look as though the new value has been copied, but when I step out of that task and look at the package explorer, the variable still has its original value.
I think the problem is that the data flow's XML Source has a lock on the variable, so the script task isn't affecting it. Does anyone have any experience with this kind of problem, or know a workaround?
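One thing worth trying (a sketch, not a confirmed fix): lock the variable for write explicitly in the script instead of relying on the ReadWriteVariables list, so a stale or conflicting lock surfaces as an explicit error:

    ' Inside the Script Task's Main, after fetching the XML:
    Dim vars As Variables = Nothing
    Dts.VariableDispenser.LockOneForWrite("MyXML", vars)
    vars("MyXML").Value = xmlString   ' xmlString = the web service response
    vars.Unlock()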
I have a SQL Task that updates running totals on a record inserted using a Data Flow Task. The package runs without error, but the actual row does not calculate the running totals. I suspect that the inserted record is not committed until the package completes and the SQL Task is seeing the previous record as the current. Here is the code in the SQL Task:
DECLARE @DV INT;
SET @DV = (SELECT MAX(DateValue) FROM tblTG);
DECLARE @PV INT;
SET @PV = @DV - 1;
I've not been successful in passing an SSIS global variable to a declared parameter, but is it possible to do this:
DECLARE @DV INT;
SET @DV = ?;
DECLARE @PV INT;
SET @PV = @DV - 1;
I have almost 50 references to these parameters in the query so a substitution would be helpful.
I'm new to SSIS, but have been programming in SQL and ASP.Net for several years. In Visual Studio 2005 Team Edition I've created an SSIS that imports data from a flat file into the database. The original process worked, but did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global value (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post. Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
CREATE PROCEDURE dbo.p_CheckImportFileCreateDate
    -- (Header reconstructed from the package's parameter mapping described
    -- below; the VARCHAR length is a guess.)
    @ProcessName VARCHAR(50),
    @FileCreateDate DATETIME,
    @IsNewFile BIT OUTPUT
AS
DECLARE @CreateDateInTable DATETIME

SELECT @CreateDateInTable = FileCreateDate
FROM tbl_ImportFileCreateDate
WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant global variables in the package are defined as follows:
Name           : Scope          : Data Type : Value
FileCreateDate : (Package Name) : DateTime  : 1/1/2000
IsNewFile      : (Package Name) : Boolean   : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General:
Name = Compare Last File Create Date
Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
TimeOut = 0
CodePage = 1252
ResultSet = None
ConnectionType = OLE DB
Connection = MyServerDataBase
SQLSourceType = Direct input
IsQueryStoredProcedure = False
BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts, based on posts I was able to locate:
SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping:
Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty. Expressions is empty.
When I run this in debug mode with this SQL statement ... exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the following error message appears.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ... exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object. ". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
I am in the middle of taking course 2073B (Programming a Microsoft SQL Server 2000 Database). I noticed that in Module 9: Implementing User-Defined Functions, exercise 2, page 25, step 2 is not returning the correct answer.
Select employeeid,name,title,mgremployeeid from dbo.fn_findreports(2)
It returns rows for manager IDs 2 and 5, and I think it should return results only for manager ID 2. The query results for step 1 are correct, but not for step 2.
Somewhere in the code I think it should compare the @InEmployeeID with the previous @InEmployeeID and then add a counter; if the two are not the same, reset the counter. Then maybe add an IF statement or a CASE statement. Can you help with the logic? Thanks!
Here is the code of the function in the book:
/*
** fn_FindReports.sql
**
** This multi-statement table-valued user-defined
** function takes an EmployeeID number as its parameter
** and provides information about all employees who
** report to that person.
*/
USE ClassNorthwind
GO
/*
** As a multi-statement table-valued user-defined
** function it starts with the function name,
** input parameter definition and defines the output
** table.
*/
CREATE FUNCTION fn_FindReports (@InEmployeeID char(5))
RETURNS @reports TABLE
(
    EmployeeID char(5) PRIMARY KEY,
    Name nvarchar(40) NOT NULL,
    Title nvarchar(30),
    MgrEmployeeID int,
    processed tinyint DEFAULT 0
)
-- Returns a result set that lists all the employees who
-- report to a given employee directly or indirectly
AS
BEGIN
    DECLARE @RowsAdded int

    -- Initialize @reports with direct reports of the given employee
    INSERT @reports
    SELECT EmployeeID, Name = FirstName + ' ' + LastName, Title, ReportsTo, 0
    FROM EMPLOYEES
    WHERE ReportsTo = @InEmployeeID

    SET @RowsAdded = @@rowcount

    -- While new employees were added in the previous iteration
    WHILE @RowsAdded > 0
    BEGIN
        -- Mark all employee records whose direct reports are going to be
        -- found in this iteration
        UPDATE @reports
        SET processed = 1
        WHERE processed = 0

        -- Insert employees who report to employees marked 1
        INSERT @reports
        SELECT e.EmployeeID, Name = FirstName + ' ' + LastName, e.Title, e.ReportsTo, 0
        FROM employees e, @reports r
        WHERE e.ReportsTo = r.EmployeeID AND r.processed = 1

        SET @RowsAdded = @@rowcount

        -- Mark all employee records whose direct reports have been
        -- found in this iteration
        UPDATE @reports
        SET processed = 2
        WHERE processed = 1
    END
    RETURN -- Provides the value of @reports as the result
END
GO