How To Retrieve Global Variables In An ActiveX Script Task Using VBScript In SSIS
Oct 27, 2006
I need to retrieve the Global Variables set in my package configuration file from within an ActiveX Script Task in an SSIS package. In DTS, I could access the Global Variables and execute a SQLXMLBulkLoad; the tail end of that function looked like this:
    ' ...objBL (the SQLXMLBulkLoad COM object) was created and executed earlier in the function...
    Main = DTSTaskExecResult_Success
    Set objBL = Nothing
End Function
=========================================
I have tried using the Script Task to write this in VB.NET; however, MSXML 4.0 is not exposed within the limited object model of the Script Task designer. I have also built a Data Flow using the XML Source, but it takes quite a bit of effort to have the Data Flow components parse the XML (with 10 hierarchical nodes), transform each node, and feed a SQL Server Destination. This works, but the XML Source component requires a hardcoded reference to the XSD schema file and does not allow a Global Variable to be used. (They do provide this functionality for the XML file source, though.)
My requirement is to allow for the Global Variable to be passed for the Schema file at runtime. The only way I can think of is to recreate what I was doing in DTS where I could simply pull in the XML and XSD Global Variables and execute the SQLXMLBulkLoad in VB Script.
Any ideas on how to write this in VBScript within the ActiveX Script Task in SSIS?...
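One possible workaround, sketched under assumptions: even though the Script Task designer does not expose an MSXML/SQLXML reference, the SQLXML Bulk Load COM object can still be reached from a VB.NET Script Task through late binding with CreateObject, and the XML and XSD paths can come straight from package variables. The variable names, ProgID version, connection string, and log path below are assumptions, not values from the original post.

Public Sub Main()
    ' Assumed String package variables holding the two file paths,
    ' listed in the task's ReadOnlyVariables.
    Dim xmlFile As String = CType(Dts.Variables("User::XmlFilePath").Value, String)
    Dim xsdFile As String = CType(Dts.Variables("User::XsdFilePath").Value, String)

    ' Late-bound SQLXML 4.0 bulk load object; no project reference required.
    Dim objBL As Object = CreateObject("SQLXMLBulkLoad.SQLXMLBulkload.4.0")
    objBL.ConnectionString = "provider=SQLOLEDB;data source=MyServer;database=MyDb;integrated security=SSPI"
    objBL.ErrorLogFile = "C:\Temp\bulkload_errors.xml"

    ' Execute takes the schema (XSD) file first, then the XML data file.
    objBL.Execute(xsdFile, xmlFile)

    objBL = Nothing
    Dts.TaskResult = Dts.Results.Success
End Sub

This relies on Option Strict being off (the Script Task default) so the late-bound property sets and method call compile.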
I am creating a DTS package to import a .txt file into SQL Server. Everything is in place, but the last record of the text file needs to be deleted before the import, and I need help with that part.
I would like to delete the last record from a fixed-width text file before importing it into SQL Server. The number of rows will vary from file to file.
Can anyone suggest the best way to do this?
I understand that I have to use the FileSystemObject (FSO) to open and read the file, but I am not sure of the best way to proceed after that.
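If moving this step out of the ActiveX task is an option (for example into an SSIS Script Task), here is a minimal VB.NET sketch of the same idea: read every line, then rewrite the file without the last one. The variable name User::ImportFilePath is an assumption.

Imports System.IO

Public Sub Main()
    ' Assumes a String package variable named ImportFilePath with the full path
    ' to the fixed-width text file, listed in ReadOnlyVariables.
    Dim filePath As String = CType(Dts.Variables("User::ImportFilePath").Value, String)

    ' Read all records, then write the file back without the last record.
    Dim lines As String() = File.ReadAllLines(filePath)
    If lines.Length > 0 Then
        Dim trimmed(lines.Length - 2) As String   ' zero-length if there was only one line
        Array.Copy(lines, trimmed, lines.Length - 1)
        File.WriteAllLines(filePath, trimmed)
    End If

    Dts.TaskResult = Dts.Results.Success
End Sub

The FSO approach mentioned above works the same way in VBScript: read the lines into memory, then write everything except the final record back out.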
Hey all, I have a stored procedure which needs one variable as a parameter. I am trying to call this stored procedure from my DTS task, and my parameter is defined as a Global Variable in DTS. Here is the SP call within my DTS task:
It gives me an error that the DTSGlobalVariables function is not defined. In this case, how can I pass the value of Client Id, which is my global variable, to my SP?
I'm trying to pass a global variable from a DTS package to the child packages that it calls using ExecutePackage tasks. I have selected the child's global variable on the Inner Global Variable tab and I have selected the parent's global variable on the Outer Global Variable tab. That doesn't work. Whatever I type into the Value column of the Inner Global Variable tab gets passed to the child package. How do I get the parent's global variable passed to the child package? Do I need to set the value on the Inner Global Variable tab to some special word to make it look for the parent's global variable? If I set it to nothing, nothing gets passed.
I have been able to make this work using an ActiveX Script task. I can set the Inner Global Variable value of the Task object to the parent's global variable value, but that's not the clean solution I'm looking for. There must be a simple way to do this because Microsoft's documentation brags about this feature, but they don't explain exactly how to do it.
I have an FTP task in a Foreach container and am setting the RemotePath using an expression (works great). I thought I could use this to start learning some of the scripting functionality in SSIS (in a Script Task), so I found some code in this forum (thanks, original posters!) and tried my hand at some coding... The intent was to create a variable and then dynamically overwrite the expression in the FTP task from the script (I know I don't need to do this; I just wanted to use it for learning purposes)....
I have a variable named varFTPDestPathFileName (string) and want to set it to the value of varFTPDestPath (string) + varFTPFileName (string). Note: all variables are scoped at the package level (could this be the problem?). I did not assign any of the variables to ReadOnly or ReadWrite on the Script Task Editor page (it seems to me that doing this in the code is a whole lot cleaner [and self-documenting] than on the Task Editor page)...
I keep getting the following error: "The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there."
Here is the script:
Public Sub Main()
    Dim vars As Variables

    ' Lock for Read/Write the variables we are going to use
    Dts.VariableDispenser.LockForRead("User::varFTPDestPath")
    Dts.VariableDispenser.LockForRead("User::varFTPFileName")
    Dts.VariableDispenser.LockForWrite("User::varSourcePathFileName")
    Dts.VariableDispenser.GetVariables(vars)

    ' Set Value of varSourcePathFileName   <<--- ERROR OCCURS HERE
    vars("User::varSourcePathFileName").Value = _
        Dts.Variables("User::varFTPDestPath").Value.ToString + _
        Dts.Variables("User::varFTPFileName").Value.ToString

    vars.Unlock()
    Dts.TaskResult = Dts.Results.Success
End Sub
I would also like to be able to loop through the Dts.VariableDispenser to see the contents of the variables and their values. Something like:

For Each ??? In vars
    MsgBox(???.Value)
Next
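A sketch of one way to get both the assignment and the loop working, with assumptions flagged: the script above locks User::varSourcePathFileName while the variable described earlier is varFTPDestPathFileName, and that name mismatch alone would explain the "element cannot be found in a collection" error, so the sketch assumes the write target really is varFTPDestPathFileName. Once the dispenser has filled vars, read and write through vars rather than Dts.Variables, and the same collection can be enumerated with For Each.

Public Sub Main()
    Dim vars As Variables = Nothing

    ' Lock the variables, then fetch them into the local collection.
    Dts.VariableDispenser.LockForRead("User::varFTPDestPath")
    Dts.VariableDispenser.LockForRead("User::varFTPFileName")
    Dts.VariableDispenser.LockForWrite("User::varFTPDestPathFileName")
    Dts.VariableDispenser.GetVariables(vars)

    ' Read and write through the locked collection, not Dts.Variables.
    vars("User::varFTPDestPathFileName").Value = _
        vars("User::varFTPDestPath").Value.ToString() & _
        vars("User::varFTPFileName").Value.ToString()

    ' Enumerate the locked variables and show each qualified name and value.
    For Each v As Variable In vars
        MsgBox(v.QualifiedName & " = " & v.Value.ToString())
    Next

    vars.Unlock()
    Dts.TaskResult = Dts.Results.Success
End Sub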
One other question... Do we always have to prefix the variable with "User::" or "System::"? If so, can you explain why?
I've been looking around but haven't yet found the syntax for using global variables in a SQL Task.
I've set the global variable Id (see code below):
if (select field from table where id = @[User::id]) is null
    select top 1 1 as response from table
else
    select top 1 0 as response from table
My objective with it is to set another global variable (@isNull). Supposedly, when the selection returns null, I should set the variable to null. I did it by using the selects above and mapping the response to that variable (is there a better way to do so?).
When I try to execute this, it says the variable has not been defined. Here is the error:
Error: Must declare the variable '@'.
I've also tried it without the brackets and without the User:: prefix at the beginning (@id directly), and here is the response:
Error: Must declare the variable '@id'.
How should I access the global variables in the SQL code? (BTW, I've checked the value at execution time and it is set to 23, the correct Id, so the block that precedes this one is working properly.)
I have a package (Package1) that is run from another package (Package2) via an Execute Package Task. I set a Global Variable called sErrorMessage in Package1 and would like to access that Global Variable in an ActiveX Script Task in Package2. How can I do this?
The logic I am trying to recreate via SSIS is the following SQL statement:
insert into db3.dbo.targettable1   -- Target database table
  (SiteC, Objecte, Attrib1)
select distinct ?, ?, ?
from ?                             -- Source database table
join dbo.targettable2 c1           -- Target database table
  on c1.Alias = ?
 and c1.CSetID = ?
 and c1.FacID = (select f.PFacID from dbo.Fac f where f.FacID = ?)
where not exists (select * from dbo.targettable2 c   -- Target database table
                  where c.Alias = ? and c.FacID = ? and c.CSetID = ?)
I have an OLE DB Source that uses an expression to approximate the following portion of the above statement:

select ?, ? from ?   -- Source database table

The package has 2 global variables, User::CSetID and User::FacID, whose scope is global to the package and whose values are set within a Foreach Loop Container outside of the Data Flow Task.
I was trying to reference the 2 global variables within the Lookup Transformation to recreate the following portion of the SQL statement, but I encounter errors:
join dbo.targettable2 c1   -- Target database table
  on c1.Alias = ?
 and c1.CSetID = ?
 and c1.FacID = (select f.PFacID from dbo.Fac f where f.FacID = ?)
In the Advanced Editor window of the Lookup Transformation:
select * from (select * from [dbo].[targettable2]) as refTable
where [refTable].[Alias] = ? and [refTable].[FacID] = ? and [refTable].[CSetID] = ?
Is there a way to reference global variables in a Lookup Transformation when they are set outside the Data Flow Task?
Before, in DTS, I used ActiveX scripts to set up simple Global Variables that would later be used by the package, say a given state or a date. I'm completely lost on how to do the same in SSIS; can someone give me a hand? I'm not sure which component to use or how to read the variable.
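In SSIS the usual replacement for that ActiveX script is a Script Task written in VB.NET. A minimal sketch, assuming two package-scoped variables named State (String) and RunDate (DateTime) have been created in the Variables window and listed in the task's ReadWriteVariables property:

Public Sub Main()
    ' Write values into variables listed in ReadWriteVariables.
    Dts.Variables("User::State").Value = "CA"
    Dts.Variables("User::RunDate").Value = DateTime.Today

    ' Reading works the same way; other tasks (Execute SQL parameters,
    ' property expressions, Data Flow components) can reference the same variables.
    Dim state As String = CType(Dts.Variables("User::State").Value, String)

    Dts.TaskResult = Dts.Results.Success
End Sub

Variables can also be given values without any code at all, through the Variables window, package configurations, or property expressions; the Script Task is only needed when the value has to be computed.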
Below is a code snippet that is throwing the following error:

Error Source: Microsoft DTS Package
Error Description: Error Code: 0
Error Source = Microsoft VBScript runtime error
Error Description: Object required: 'Server'
Error on line 13

In my code, line 13 is the following:

Set Cnxn = Server.CreateObject("ADODB.connection")

I did set up a Microsoft OLE DB connection in my DTS package for this ActiveX Task. I'm new to VBScript and ActiveX. Any help would be appreciated.
Thanks,
-p

'*********************************************************************
' Visual Basic ActiveX Script
'*********************************************************************
Function Main()
    Main = DTSTaskExecResult_Success
    ' connection, command and recordset variables
    Dim Cnxn, strCnxn
    ' create and open connection
    Set Cnxn = Server.CreateObject("ADODB.connection")
    strCnxn = "data source=Pluto;initial catalog=Stats;User Id=sa;password=;"
    Cnxn.Open strCnxn
End Function
I have a DTS package which executes an ActiveX control written in VBScript. I need to move the ActiveX code out of the DTS environment. The code is too big to execute in a SQL Server job step of type "ActiveX Script." Do I have any other easy options?
Within a SQL 2000 DTS package I have an ActiveX Script that goes into my transform tasks and updates the queries by concatenating a WHERE clause with a date from a database table. This way I can keep track of when I last updated the table, so that I only bring down the rows added since the last run. How can this be done within SSIS? I've been looking and I'm getting confused. Any help would be greatly appreciated.
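One pattern that roughly reproduces the DTS trick, sketched with assumed names: keep the last-run date and the finished statement in package variables, build the full query in a Script Task, and point the OLE DB Source at the result with its "SQL command from variable" option. LastRunDate, SrcQuery, and dbo.SourceTable below are assumptions.

Public Sub Main()
    ' Assumes LastRunDate (DateTime) is in ReadOnlyVariables and
    ' SrcQuery (String) is in ReadWriteVariables.
    Dim lastRun As Date = CType(Dts.Variables("User::LastRunDate").Value, Date)

    ' Concatenate the WHERE clause onto the base query, as the old ActiveX script did.
    Dts.Variables("User::SrcQuery").Value = _
        "SELECT * FROM dbo.SourceTable " & _
        "WHERE ModifiedDate > '" & lastRun.ToString("yyyy-MM-dd HH:mm:ss") & "'"

    Dts.TaskResult = Dts.Results.Success
End Sub

An Execute SQL Task earlier in the control flow can load LastRunDate from the tracking table, and a property expression on the variable is an alternative to the Script Task if no real logic is needed.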
I'm new to SSIS, but have been programming in SQL and ASP.Net for several years. In Visual Studio 2005 Team Edition I've created an SSIS package that imports data from a flat file into the database. The original process worked, but did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global variable (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post. Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
SELECT @CreateDateInTable = FileCreateDate FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant global variables in the package are defined as follows:

Name           : Scope          : Data Type : Value
FileCreateDate : (Package Name) : DateTime  : 1/1/2000
IsNewFile      : (Package Name) : Boolean   : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General
  Name = Compare Last File Create Date
  Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
  TimeOut = 0
  CodePage = 1252
  ResultSet = None
  ConnectionType = OLE DB
  Connection = MyServerDataBase
  SQLSourceType = Direct input
  IsQueryStoredProcedure = False
  BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts, based on posts I was able to locate:

SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping
  Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
  Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty. Expressions is empty.
When I run this in debug mode with this SQL statement ... exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the following error message appears.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ... exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object. ". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
I'm trying this code in a Script Task inside a Foreach Loop that gets a file from a Foreach File enumerator and maps it to a variable FileName. In the Script Task I'm setting ReadOnlyVariables to @[User::FileName]:

Imports System.IO
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        Dim fname As String = CType(Dts.Variables("FileName").Value, String)
        File.Move(fname, fname + ".processed")
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class

But now I'm getting this error:

Error: Failed to lock variable "\xxxxDataf1.csv" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".
As you can see, the variable does appear to be resolving (its value shows up where the variable name should be).
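A guess based on that error text: the ReadOnlyVariables property appears to have received the expression @[User::FileName], so the property got the variable's value (a file path) instead of its name. Two possible fixes under that assumption: set ReadOnlyVariables to just FileName (no @[User::] wrapper), or skip the property and lock the variable in code, as sketched here (same Imports as above):

Public Sub Main()
    Dim vars As Variables = Nothing

    ' Lock the variable explicitly by name; ReadOnlyVariables can stay empty.
    Dts.VariableDispenser.LockOneForRead("User::FileName", vars)
    Dim fname As String = CType(vars("User::FileName").Value, String)
    vars.Unlock()

    File.Move(fname, fname & ".processed")
    Dts.TaskResult = Dts.Results.Success
End Sub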
I have built the following query in SSMS; when I add it to an Execute SQL Task in SSIS, I get this error:
"[Execute SQL Task] Error: Executing the query "SELECT @columnz = COALESCE(@columnz + ',[' + times..." failed with the following error:
"Must declare the scalar variable "@columnz".". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly."
Query:
use design
drop table tmpNCPCNCDownstreamMaxUtilization3wks

select node, max(utilization) as max_Utilization, DATE
into tmpNCPCNCDownstreamMaxUtilization3wks
from stage_ncpcncdownstream_temporal
WHERE Date BETWEEN DATEADD(day, -20, GETDATE()) AND GETDATE()
I have a SQL Task that updates running totals on a record inserted using a Data Flow Task. The package runs without error, but the actual row does not calculate the running totals. I suspect that the inserted record is not committed until the package completes and the SQL Task is seeing the previous record as the current. Here is the code in the SQL Task:
DECLARE @DV INT;
SET @DV = (SELECT MAX(DateValue) FROM tblTG);
DECLARE @PV INT;
SET @PV = @DV - 1;
I've not been successful in passing a SSIS global variable to a declared parameter, but is it possible to do this:
DECLARE @DV INT;
SET @DV = ?;
DECLARE @PV INT;
SET @PV = @DV - 1;
I have almost 50 references to these parameters in the query so a substitution would be helpful.
In my SSIS Data Flow Task, I have a query that retrieves data based on a couple of date parameters. Is there a way we can pass/use the Variables defined in the SSIS package in the query ?
(I am assigning values to those variables from C# code)
The query should look like this:
select ordernumber, customerid from salesorder
where statecode=3 and datefulfilled between @variable1 and @variable2
I've got a package in SSIS 2012 that has an Execute SQL task in the control flow level.
The SQL in question does an Upsert via the SQL merge statement. What I want to do, is return the count of records inserted and records updated (No deletes going on here to worry about). I'm using the output option to output the changed recs to a table variable.
I've tried returning the values as:
Select Count(*) as UpdateCount from @mergeOutput where Action = 'Update'

and

Select Count(*) as InsertCount from @mergeOutput where Action = 'Insert'
I've tried setting the ResultSet property to both Single row and Full result set, but I'm not seeing anything returned to the package variables I've set for them (intInsertcount and intUpdatecount).
I am using SSIS 2012 to dynamically back up stored procedures on a list of servers and databases. Here are the steps in my package:
1. Execute SQL Task: Captures a result set (configured to save the data set in an Object variable) with all the Servers and Databases on which stored procedures exist.
2. A Foreach Loop that is configured to get each row (server name @[User::Server_Name] and database name @[User::DataBase_Name]) from the object variable (@[User::Connection_Strings]) and pass it to a connection manager that has an expression for the server name and database name.
2a) Within the Foreach Loop, I have an Execute Process Task that is configured as:
i) Executable: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
ii) Arguments: Configured to fetch the value from an expression. The expression I am using is:
"'C:\batch - Copy\PowerShell Scripts to Backup Stored Procedures\ScriptOutSPs.ps1' -$Server_Name " + @[User::Server_Name] + " -$Database_Name " + @[User::DataBase_Name]
Note: @[User::Server_Name] is the server name from the object variable, and likewise @[User::DataBase_Name] is the database name. The Execute Process Task runs a command line that triggers a PowerShell script with parameters. Here is the PowerShell script that I am using:
When I execute the package and the parameters are passed through the arguments, it runs successfully but nothing happens. Am I passing the wrong arguments in the expression?
In the properties for a DTS package, I have a string variable. I need to execute an update statement based on this variable. I was going to use Execute SQL, but I didn't know how to get a value from the variables section.
I store the value there because I will be using dtsrun or the DTS library to execute the package, so I can change this value upon execution. How do I execute an update based on the value?
Has anyone encountered a problem with DTS global variables? What I am trying to do is keep a counter in a global variable. It works fine when I run the job manually, but if I schedule the job it doesn't increment the counter. Any ideas?
Can I define a global time in SQL Server 2000 which all tables in DIFFERENT databases can use? I need this in order to INNER JOIN all tables in SAME/DIFFERENT DBs on the basis of this datetime field.
Is there a way to declare a persistent global variable in SQL Server?
I'd like my stored procs to fetch data in a different source depending on a debug (or development) variable.
For example, I'd like to be able to set a variable to either 0 or 1 (true or false) and have a static SP defined as:
IF @MYVARIABLE = 1
    SELECT * FROM OPENQUERY(Server1, 'SELECT * FROM Table1')
ELSE
    SELECT * FROM OPENQUERY(Server2, 'SELECT * FROM Table2')
What do you think? Since these SPs will be called a lot, I don't want to store the info in a table; I want it as a global variable so it will be as fast as possible.
Using SQL 2000, I have a DTS package that takes data from MySQL to SQL Server; the catch is I want to specify a specific range of dates. How do I use a global variable? At the moment I manually change the dates, and the jobs run on a daily basis.

Sample SQL statement from the MySQL connection:

select *
from Table1
where date between '1/1/2004' and '6/30/2004'

CHANGE TO:

select *
from Table1
where date between @fromdate and @todate

TIA
Bob
Is there a way in SQL Server to set up Global Server Variables or Constants?
We are working with an off-the-shelf Constituent Management application based on SQL Server. We can only read from the app's DB, so we set up another DB to run SPs that access the data in the main DB. One problem we have is that there are codes the app DB uses that we need to reference as criteria in our SPs. Example: the code for a phone type of email is 731, so if we want to pull email addresses we need to select where PhoneType = 731. We found out that each time the main DB is rebuilt those codes change. That means finding every place we used that code and changing it.
It would be great to be able to set a global variable and use it anytime that code is needed.
A short explanation: I have 4 locations that will import from text files.
These text files will need to be 'stamped' with a location_id. I'd like to pass the location_id to the DTS package via a VB program. The reason for the VB program is that I'd like to keep users out of Enterprise Manager. Is this possible to accomplish? Thanks for your help.
I am executing:

DECLARE @x AS int

SET @lcQuery = 'UPDATE ... WHERE .... ' + ' SET @x = @@ROWCOUNT'
EXEC (@lcQuery)

PRINT @x

@x does NOT return a value, because it is "local" to the @lcQuery execution. As a matter of fact, to get it to execute at all, I have to write:

SET @lcQuery = 'DECLARE @x AS int ' + 'UPDATE ... WHERE .... ' + 'SET @x = @@ROWCOUNT'
EXEC (@lcQuery)

How can I pass the variable @x back to the program from within EXEC (@lcQuery)? How does EXEC (@lcQuery) execute? In a different space?
As a newcomer to SQL 2000, I need to create a DTS package with a Global Variable that will import a file daily, automatically. I've got the DTS package working great, but I have to manually execute it every day. The log file name, Accounting 02-22-2002.csv, changes with the date every day, so the next one would be Accounting 02-23-2002.csv and so on. Any ideas on how to handle this? I think some VBScript is necessary, from what I understand. I'm looking at the function DATEADD(d, 1, date).
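If the package ever moves to SSIS, a Script Task can build that dated name without any DATEADD work; this is just a sketch, and the variable name ImportFileName is an assumption:

Public Sub Main()
    ' Assumes a String package variable named ImportFileName, listed in ReadWriteVariables.
    ' Builds today's file name in the form "Accounting MM-dd-yyyy.csv".
    Dts.Variables("User::ImportFileName").Value = _
        "Accounting " & DateTime.Today.ToString("MM-dd-yyyy") & ".csv"

    Dts.TaskResult = Dts.Results.Success
End Sub

In DTS itself the same string can be built in an ActiveX Script step and pushed into the connection's DataSource property, or into a global variable consumed by a Dynamic Properties task.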
We're using SQL Server 2000. I have a table with a couple of fields I'd like to automatically update with a trigger. One of the fields needs to log the time the record was last modified. The other field needs to log the current user ID. Storing the current time is easy with UPDATE and GETDATE(), but obtaining the current user from my ASP.NET code isn't as pretty: I would have to pass the user ID from my back-end code into the server, perhaps as a stored procedure variable that manually updates the table with the new ID. I was hoping to set some global variable holding the user ID for the current connection to the database, so that when the trigger fires it reads this connection-level global variable to plug the needed user ID into the record.

So, for instance:

SET @@MyCurrentUserID = 4;
UPDATE MyTable SET myStuff = 'games';

The MyTable trigger would fire, and the LastModified field of MyTable would be automatically updated with the current datetime, but the LastModifiedUser field of MyTable would also be updated with MyCurrentUserID.

Is this possible with SQL Server 2000 for Windows 2K3? Tx