Package Ignores Environment Variable For ConnectionString
Mar 19, 2008
We're attempting to set the ConnectionString for our configuration database connection manager from an environment variable, but SSIS seems to ignore the environment variable value. Deployment process:
Create the Connection Manager
Create an Environment Variable type configuration with the Target Property: \Package.Connections[acConfigDBManager].Properties[ConnectionString]
Build the package
Copy the package and the manifest from the project's bin\Deployment folder to a folder on the server
From the SSIS server's console, Import the package from the File System.
Run the package, after first inspecting the ConnectionString in the Connection Managers collection
In all cases, the ConnectionString variable for our configuration database is the value in the package at build time.
So far we've tried the following variations:
confirmed that the spelling and case of the environment variable are identical on the XP development computer and the Win2003 server.
restarted SQL Server and SSIS
rebooted the Win2003 server.
built the package with a blank ConnectionString value in the connection manager
re-imported the package, overwriting the old one
deleted the old package before reimporting
renamed the package and imported
ran imports from SQL Server Management Studio on the server console
assigned the connection string from an arbitrary system environment variable
assigned the ServerName and used an expression to build the ConnectionString
What now? Has anyone been able to set a Connection Manager property from an environment variable?
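One thing worth double-checking in a situation like this (the variable name below is just a placeholder) is whether the process that actually loads the package can see the variable at all. Services such as SSIS and SQL Server capture their environment when they start, so a variable created afterwards is invisible to them until they are restarted. A quick check from the server, assuming xp_cmdshell is enabled:
-- Placeholder variable name; if this returns the literal text %MyConfigConnString%,
-- the SQL Server process cannot see the variable and the relevant service needs a restart.
EXEC master.dbo.xp_cmdshell 'echo %MyConfigConnString%';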
I have issues with the Connection Manager in the SSIS package when using package configurations through an environment variable.
Here goes..
SSIS package1:
Connections used: devcon1 and devcon2 for the Dev environment, and testcon1 and testcon2 for the Test environment. At the moment all four are in the package, but ideally only the devcons or the testcons should be present at a time.
Environment variable:
Pckg_config = <location of config file which has testcon1 and testcon2>
I need to use only devcon1 and devcon2 in the Dev environment, and only testcon1 and testcon2 in Test. Hence I set the values of the devcons in devEnv.dtsconfig and the testcons in testEnv.dtsconfig.
Now I remove both testcons from the SSIS package. If I try to run in the Test environment and the testcons that are referenced in testEnv.dtsconfig are not found as connections in the SSIS package, then SSIS gives an error asking for those connections.
SSIS maintains the connections in the Connection Managers collection per package, although internally it is a pool of connections.
Ideally I should be able to switch the connections around at run time. My package works now if it is deployed with all the devcons and testcons together, but ideally it should need only the devcons or only the testcons. I am trying to explain this as clearly as I can.
Am I doing something wrong? All your efforts in solving this puzzle will be greatly appreciated. Please participate.
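For reference, the shape of what goes in each .dtsconfig is roughly this (trimmed down; the connection manager name and connection string are placeholders). As far as I can tell, a configuration can only overwrite properties of connection managers that already exist in the package; it cannot add or remove them, which is why a package that no longer contains testcon1/testcon2 complains when the config file references them.
<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[testcon1].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=TestServer;Initial Catalog=TestDb;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>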
We are using package configurations with environment variables. The problem we are having is that if we try to open the project from another PC (PC 2), it gives the error:
Error 1 Error loading F0005.dtsx: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. z:\visual studio 2005\projects\sales data mart\extract to staging area\F0005.dtsx 1 1
We are using an environment variable named DWConfig and have configured the correct path on each PC. If we edit the package configuration on PC 2 and go through the same procedure without any amendments, the error is removed for that PC; if we then open the project on PC 1 again, it gives the same error, and going through the package configuration wizard again removes it.
Can anyone tell me whether there is a solution to that problem?
Note: Our project is saved on a server (neither PC 1 nor PC 2).
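In case it helps, this error usually points at the package ProtectionLevel: the default, EncryptSensitiveWithUserKey, ties the encrypted DTS:Password node to the user and machine that saved the package, so any other PC fails to decrypt it. Switching the packages to DontSaveSensitive or EncryptSensitiveWithPassword avoids the per-machine key. A rough sketch with dtutil (the password is a placeholder, and 2 should be the EncryptSensitiveWithPassword level, if I have the numbering right):
dtutil /FILE F0005.dtsx /ENCRYPT FILE;F0005.dtsx;2;MyPackagePassword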
I have a package that uses an environment variable package configuration of value X for a connection string. I close BIDS. I change the value of the environment variable to value Y. I open BIDS and the package, and the value of my connection string is Y. I save my package with the new configuration. if I look at the dtsx file, I see connection string with value Y. All as expected.
I move the package to my server (I've tried Import Package from SSMS, using the deployment manifest, and Save Copy As). On the server, the environment variable is set to value Y. However, if I run the package or export it, the value of my connection string is X!
Does anyone have any suggestions of things to try or some reason that this is not working?
Description: The package path referenced an object that cannot be found: "\Package.Variables[User::<variable_name>].Properties[Value]". This occurs when an attempt is made to resolve a package path to an object that cannot be found. End Warning. Could not load package "<package_name>" because of error 0xC0010014.
Basically, I create a package variable under the User namespace, and this variable tells which server the SSIS package is running on. We first create a system environment variable locally, and the SQL Server machine has an environment variable with exactly the same name, so that the server name is resolved through the package variable/package configurations when the SSIS package is executing from a SQL Server job.
This way we do not hard-code the server name... We have always succeeded in doing this with DTS as well as SSIS packages, but just now my package is running into this issue...
Since I did not change ANYTHING in the package, I am guessing this is not programming related and that something was changed in the server. However, the DBA was helpless over here and I have no clue of what this error means.
I have a parent package that contains two children... The second child depends on the success of the first child.
The first child generates a variable value and stores it in an environment variable (Visibility - All)... After the first succeeds, the second starts executing and picks up the variable value from the environment variable (through a package configuration setting)...
Unfortunately, this doesn't work... The second child picks up the stale value of the environment variable... Essentially the value is assigned not after the first child has finished, but right at the beginning of the parent's execution...
I tried executing both children out of process as well as in process... Same result.
Would anybody have an idea how to resolve this problem?
It would be really useful. It looks like the .dtsconfig file needs to be maintained on each install independently. This makes maintenance a nightmare. It would be a lot nicer if the .dtsconfig files were more like templates rather than hard-coded values pointing to specific system resources.
I'm trying to figure out the best way to write a script to deploy environment variables to different servers. To create the variable I'm currently using catalog.create_environment_variable, and I want to wrap that in an IF NOT EXISTS statement. I had thought about just blowing away all the variables and recreating them, but I thought that wouldn't go over well in prod; I wasn't sure if, by deleting a variable, references to the variable would be lost.
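In case it is useful, this is roughly the shape I had in mind (the folder, environment, and variable names and the value are placeholders):
-- Sketch only: SSIS 2012 catalog; names and value below are placeholders.
DECLARE @folder  NVARCHAR(128) = N'MyFolder',
        @env     NVARCHAR(128) = N'MyEnvironment',
        @varname NVARCHAR(128) = N'ServerName';

IF NOT EXISTS (
    SELECT 1
    FROM   SSISDB.catalog.environment_variables ev
    JOIN   SSISDB.catalog.environments e ON e.environment_id = ev.environment_id
    JOIN   SSISDB.catalog.folders f      ON f.folder_id = e.folder_id
    WHERE  f.name = @folder AND e.name = @env AND ev.name = @varname
)
BEGIN
    EXEC SSISDB.catalog.create_environment_variable
         @folder_name      = @folder,
         @environment_name = @env,
         @variable_name    = @varname,
         @data_type        = N'String',
         @sensitive        = 0,
         @value            = N'MyProdServer';   -- placeholder value
END;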
I've created an SSIS package and its connection is based on an environment variable (please see the procedure).
Now I'm trying to create a job that calls this package, and it seems that when you view the Data Sources, it is still pointing to the old server.
But when you open up the package through BIDS on the same server, it's using the new reference that I have specified in the environment variable (please refer to the first image).
I came across a blog post with the same issue as mine. He suggested restarting the SSIS service, which I already did, but nothing happened. I even restarted SQL Agent but still no luck.
I'm not sure what else is missing, except for restarting the machine, which is the last thing I want to do as this is a PRODUCTION server.
I have a master package that executes a number of child packages. In my SSIS package configurations:
1. I have an SSIS configuration table that holds the connection string.
2. An XML configuration file whose location is stored in an environment variable.
3. And finally, an environment variable holding the ProjectFolderAbsolutePath value, which is the full path of the project folder.
The project functions normally, but every time I open it I get the following error.
" Warning loading MasterPackage.dtsx: The configuration environment variable was not found. The environment variable was: "EnviorVariable". This occurs when a package specifies an environment variable for a configuration setting but it cannot be found. Check the configurations collection in the package and verify that the specified environment variable is available and valid."
I have a problem with SSIS reading an environment variable after deploying the packages to a server. Let me explain.
I have a parent package, ETL_MAIN_PACKAGE.dtsx, that reads the child packages from a recordset and loops over it to execute them with the Execute Package task. The first child package to be executed is called DIM_PERIODIC.dtsx.
On my local machine, the parent package is configured to read its database connections from an XML file, SSIS_configfile.config, located on my C: drive. The path to this file (C:\SSIS_configfile.config) is stored in the environment variable BI_ETL.
When I run the parent package inside SSIS on my local machine, the connections are read and the package executes perfectly. Now I want to deploy the packages to our server.
I copied the XML configuration file to the server's C: drive, created the same environment variable BI_ETL, set its value to C:\SSIS_configfile.config, and rebooted the machine (just in case).
The execution of the parent package is managed by a stored procedure. I use the xp_cmdshell command. The command line generated is:
This command generates an error saying that the environment variable is not found, and it throws this error:
Error : 2007-08-23 18:59:10.25 Code : 0x80019003 Source : The configuration environment variable was not found. The environment variable was: BI_ETL. This occurs when a package specifies an environment variable for a configuration setting but it cannot be found. Check the configurations collection in the package and verify that the specified environment variable is available and valid. End Error
Description: The file name "C:\ETL_Deployment" /SET \Package.Variables[P_LOOKUP_PATH].Value;C:\ETL_Deployment\ETL_LOGS\DIM_PERIODIC.dtsx" specified in the connection was not valid.
End Error
I run the package on the same server with a command line directly in a DOS window:
Description: The file name "C:\ETL_Deployment" /SET \Package.Variables[P_LOOKUP_PATH].Value;C:\ETL_Deployment\ETL_LOGS\DIM_PERIODIC.dtsx" specified in the connection was not valid.
End Error
I conclude that the environment variable is not read at all.
Does anybody have an idea how to solve this problem?
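In case it is useful to anyone else, one workaround I am considering is to stop relying on the environment variable on the server and pass the configuration file to dtexec explicitly (a rough sketch; the paths are placeholders based on the ones above, and xp_cmdshell must be enabled):
-- Pass the configuration file directly instead of resolving it via BI_ETL.
EXEC master.dbo.xp_cmdshell
    'dtexec /FILE "C:\ETL_Deployment\ETL_MAIN_PACKAGE.dtsx" /ConfigFile "C:\SSIS_configfile.config"';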
I'm doing a simple ETL that reads a database table and dumps the content to a text file. The text file will be named Employee.txt. This file name will remain the same across my environments, but I may want to vary the directory location to where I want this file dumped.
So, I defined an environment variable called "DataTargetDir" in all my environments. Now, I want to utilize this variable in the "File name:" box within the Flat File Connection Manager Editor. How do I do this? I'm thinking I can write something like "%DataTargetDir%Employee.txt" in the "File name:" box, but it's not working.
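As far as I know, the Flat File connection manager will not expand %DataTargetDir% on its own. The usual approach is to map the environment variable into a package variable (via an Environment Variable package configuration, e.g. onto a variable also called DataTargetDir) and then put an expression on the connection manager's ConnectionString property, along these lines (backslashes have to be escaped inside SSIS expression string literals):
@[User::DataTargetDir] + "\\Employee.txt"
The configuration fills User::DataTargetDir when the package loads, and the expression rebuilds the full path on each run.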
I am using package configurations to simplify the SSIS package deployment process. All the configuration information is stored in XML files. So far so good. However, I have many (20) packages, and for each package there is one configuration file. During the deployment process I have to modify the connection strings (server name, DB name) to new ones. That ends up being 20 or more modifications, and it's easy for me to make a mistake. Is there any workaround, such as setting up an environment variable, I guess, that would allow me to modify the value only once and apply it to all the packages?
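One approach that might cut the editing down to a single place (a sketch only; the table layout is the one the package configuration wizard generates, while the filter, connection manager name, and connection string are placeholders) is to point every package at one shared SQL Server configuration table, whose own connection is resolved once, e.g. via a single environment variable:
-- Standard layout of the wizard-generated configuration table.
CREATE TABLE dbo.[SSIS Configurations]
(
    ConfigurationFilter  NVARCHAR(255) NOT NULL,
    ConfiguredValue      NVARCHAR(255) NULL,
    PackagePath          NVARCHAR(255) NOT NULL,
    ConfiguredValueType  NVARCHAR(20)  NOT NULL
);

-- One row per configured property; every package using the same filter
-- picks this up, so the server/DB name is changed in exactly one place.
INSERT INTO dbo.[SSIS Configurations]
       (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES ('SharedConnections',
        'Data Source=NewServer;Initial Catalog=NewDb;Provider=SQLNCLI.1;Integrated Security=SSPI;',
        '\Package.Connections[MyConfigDB].Properties[ConnectionString]',
        'String');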
This is my problem. My package executes fine when I set the connection string to the same database where I execute the query. If I execute it with another database connection string it fails, because while executing the package it tries to access the same connection string that was set at design time.
When I try to execute it through the cmd prompt by setting conn <new database connection string>, it fails.
Is package configuration the only solution? How can I change the connection string depending on the server?
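Package configurations are not the only way; dtexec can override a connection manager's connection string at run time from the command line. A rough sketch (the package path, connection manager name, and connection string are placeholders):
dtexec /FILE "C:\Packages\MyPackage.dtsx" /CONNECTION "MyConnMgr";"Data Source=OtherServer;Initial Catalog=MyDb;Provider=SQLNCLI.1;Integrated Security=SSPI;"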
This has only started happening in the last two days.
When we invoke our SSIS package from a web service it used to work fine, but now it gives the following error. The DLL name changes all the time. Any help would be appreciated.
Code Snippet:
System.IO.FileNotFoundException: Could not find file 'C:\WINDOWS\TEMP\vrjeaanf.dll'.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
   at Microsoft.CSharp.CSharpCodeGenerator.FromFileBatch(CompilerParameters options, String[] fileNames)
   at Microsoft.CSharp.CSharpCodeGenerator.FromSourceBatch(CompilerParameters options, String[] sources)
   at Microsoft.CSharp.CSharpCodeGenerator.System.CodeDom.Compiler.ICodeCompiler.CompileAssemblyFromSourceBatch(CompilerParameters options, String[] sources)
   at System.CodeDom.Compiler.CodeDomProvider.CompileAssemblyFromSource(CompilerParameters options, String[] sources)
   at System.Xml.Serialization.Compiler.Compile(Assembly parent, String ns, XmlSerializerCompilerParameters xmlParameters, Evidence evidence)
   at System.Xml.Serialization.TempAssembly.GenerateAssembly(XmlMapping[] xmlMappings, Type[] types, String defaultNamespace, Evidence evidence, XmlSerializerCompilerParameters parameters, Assembly assembly, Hashtable assemblies)
   at System.Xml.Serialization.TempAssembly..ctor(XmlMapping[] xmlMappings, Type[] types, String defaultNamespace, String location, Evidence evidence)
   at System.Xml.Serialization.XmlSerializer.FromMappings(XmlMapping[] mappings, Evidence evidence)
   at System.Web.Services.Protocols.XmlReturn.GetInitializers(LogicalMethodInfo[] methodInfos)
   at System.Web.Services.Protocols.XmlReturnWriter.GetInitializers(LogicalMethodInfo[] methodInfos)
   at System.Web.Services.Protocols.MimeFormatter.GetInitializers(Type type, LogicalMethodInfo[] methodInfos)
   at System.Web.Services.Protocols.HttpServerType..ctor(Type type)
   at System.Web.Services.Protocols.HttpServerProtocol.Initialize()
   at System.Web.Services.Protocols.ServerProtocolFactory.Create(Type type, HttpContext context, HttpRequest request, HttpResponse response, Boolean& abortProcessing)
I have an SSIS package and want to execute it on a computer that only has SQL Express installed on it. Since SQL Express doesn't come with dtexec or dtexecui, what should I do?
I'm working on an SSIS package that uses a VB.NET script to grab some XML from a web service (I'd explain why I'm not using a Web Service task here, but I'd just get angry), and I wish to then assign the XML string to a package variable which then gets sent along to a Data Flow task that contains an XML Source pointing at said variable. When I copy the XML string into the variable value in the script, if I do a quick watch on the variable (as in Dts.Variables("MyXML").Value) it looks as though the new value has been copied to the variable, but when I step out of that task and look at the Package Explorer, the variable still has its original value.
I think the problem is that the dataflow XML source has a lock on the variable and so the script task isn't affecting it. Does anyone have any experience with this kind of problem, or know a workaround?
I have deployed a project with multiple packages to SSIS 2012 db. I am able to configure the project parameters fine. But, I am not able to replace the package variable values with the 'Environment' variables.
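As far as I know, in the 2012 catalog only parameters and connection manager properties can be bound to environment variables; plain package variables are not exposed to environments, so the usual route is to surface the value as a package parameter (or set the variable from a parameter via an expression) and reference the environment variable from that parameter. A sketch of the binding, with placeholder names:
-- Bind the package parameter "ServerName" in MyPackage.dtsx to the environment
-- variable of the same name; value_type 'R' means "referenced", i.e. take the
-- value from the environment variable rather than a literal.
EXEC SSISDB.catalog.set_object_parameter_value
     @object_type     = 30,                -- 30 = package-level parameter
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @object_name     = N'MyPackage.dtsx',
     @parameter_name  = N'ServerName',
     @parameter_value = N'ServerName',     -- name of the environment variable
     @value_type      = 'R';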
I have a Foreach Loop that loads a set of flat files into a SQL Server table. I run the flat file import via a DTS package embedded in an Execute DTS 2000 Task. I want to pass the source file name that is fetched by the Foreach Loop and assign it to a global variable in DTS. How can this be done?
I have created an SSIS package with a Foreach Loop containing a Data Flow task, which in turn includes a Row Count component that passes the row count value to a variable with package scope. The variable is used in an Execute SQL task following the Data Flow task.
The package executes successfully when executed on its own, but when executed as a child from a parent package (which only includes an Execute Package task), the variable from the Foreach Loop becomes NULL.
There are a lot of other variables in the package receiving values dynamically without any problem; the row count variable, however, is the only variable in the package that receives a value as part of a Data Flow (and is used in subsequent tasks within the Foreach Loop).
Why does the variable become Null? For your information, I am using a variable with package scope and no variables from the parent package are used or passed from the child package to the parent package.
(For your information, we are running the 64 bit version)
We have one main package from which 7 other child packages are called. We are using ParentPackage variables to assign values for variables and database connections.
While the values from the ParentPackage variables get assigned to some of the packages properly, for others the value doesn't get assigned.
For example: except for one of the packages, the database connection string gets assigned properly to all the other packages.
Similarly, in another package one of the variables doesn't get assigned. In other packages it is assigned properly.
We have checked all the other property values and they are exactly the same.
We cannot make head or tail of this erratic behavior.
I have packages stored in SQL store. I was letting users run the packages from a .net app that I made with
Microsoft.SqlServer.Dts.Runtime
Now I have noticed this causes the packages to run on the client PC's CPU, and the network traffic also goes through the client PC; in my particular case this is slow.
From the docs and this forum I have found that you can run a package on the server's CPU through SQL Agent, by letting the packages run in a SQL job; after that you can start a package from an application with sp_start_job.
But how do you set a User::variable in a package if you have to start the package from a SQL Agent job?
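For what it's worth, sp_start_job itself cannot pass values into the package; the usual trick is to put the value on the job step's dtexec command line with /SET (or update the step just before starting the job). A sketch with placeholder job, package, and variable names:
-- Bake the value into the job step command line, then start the job.
DECLARE @cmd NVARCHAR(4000);
SET @cmd = N'dtexec /SQL "\MyFolder\MyPackage" /SERVER "." '
         + N'/SET "\Package.Variables[User::MyVariable].Value";"SomeValue"';

EXEC msdb.dbo.sp_update_jobstep
     @job_name  = N'RunMyPackage',
     @step_id   = 1,
     @subsystem = N'CmdExec',
     @command   = @cmd;

EXEC msdb.dbo.sp_start_job @job_name = N'RunMyPackage';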
I have finished a change request from our client. I need to update the client's database with the one in development. Here are the changes I made to the database:
Added/changed some tables
Added/changed some stored procedures
Added data to some dictionary tables
The data in the client's current database MUST be kept. So how can I merge the changed information into the client's database?
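A sketch of the data side (the table and column names below are placeholders, and it assumes the development copy is reachable from the client server, e.g. restored alongside the client database); the schema changes themselves would go out as ALTER/CREATE scripts:
-- Add dictionary rows that exist in dev but not yet at the client.
INSERT INTO ClientDb.dbo.DictTable (DictKey, DictValue)
SELECT d.DictKey, d.DictValue
FROM   DevDb.dbo.DictTable d
WHERE  NOT EXISTS (SELECT 1 FROM ClientDb.dbo.DictTable c WHERE c.DictKey = d.DictKey);

-- Refresh values for rows that already exist, leaving client-only rows untouched.
UPDATE c
SET    c.DictValue = d.DictValue
FROM   ClientDb.dbo.DictTable c
JOIN   DevDb.dbo.DictTable d ON d.DictKey = c.DictKey;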
I've integrated my reports into an app using the ReportViewer control, and I've created custom parameter selection controls. I want the user to be able to enter dates in dd/mm/yy format into a textbox. I've set the language of my reports to en-gb.
The SetParameters method of the ReportViewer control copes with this fine, but when I pass a date string in dd/mm/yy format into GetReportParameters, I get a 'parameter value not valid for its type' error. It works for date strings in mm/dd/yy format, though, so it's as if GetReportParameters is ignoring my language setting.
Has anyone got any idea as to where I'm going wrong?
I have created a common table expression. The first query is meant to select a particular month's worth of data. The second query is meant to select the following month's data, but only for the subscribers (MSISDNs) that appeared in the first month.
The query returns results for month 1 but ignores the second month. If I remove the last inner join to the CTE (INNER JOIN Test_Table tt ON tt.MSISDN = B.SUBSCRIBER_NUMBER), then I get a full list for both months, but obviously this isn't limited to those MSISDNs in the first month.
The second issue I'm having is that I can't select a sample of, say, only 1000 MSISDNs from the first month...
WITH Test_Table (Report_Month, MSISDN, SUB_SEGMENT) AS ( Select R.Report_Month, R.MSISDN, DV.SUB_SEGMENT
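One shape that might work (a rough sketch; Traffic_Month1 and Traffic_Month2 and their columns are placeholders for the real month-1 and month-2 sources) is to take the month-1 sample in the CTE and then UNION it with the month-2 rows that join back to it, rather than joining the CTE into the month-1 query itself:
WITH Month1_Sample (Report_Month, MSISDN, SUB_SEGMENT) AS
(
    SELECT TOP (1000)                                -- limits month 1 to a 1000-subscriber sample
           t1.Report_Month, t1.MSISDN, t1.SUB_SEGMENT
    FROM   Traffic_Month1 t1
    ORDER  BY t1.MSISDN
)
SELECT m.Report_Month, m.MSISDN, m.SUB_SEGMENT       -- month 1 rows
FROM   Month1_Sample m
UNION ALL
SELECT t2.Report_Month, t2.SUBSCRIBER_NUMBER, t2.SUB_SEGMENT
FROM   Traffic_Month2 t2
JOIN   Month1_Sample m ON m.MSISDN = t2.SUBSCRIBER_NUMBER;   -- month 2, same subscribers only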
I am having a problem restricting write access to tables in my database. In my database I have a table called, for the sake of argument, 'TableX'. In my SQL Server logins, I have set up a login for 'Domain Users' using NT authentication, and a login called 'FullTableX' using SQL Server authentication. I have added two users to my database relating to the above logins. I have added a role to my database called 'ReadTableX', with 'Domain Users' as a member of this role. 'ReadTableX' has SELECT permission only on a restricted set of tables. The only other role that 'Domain Users' is a member of is 'public', and 'public' has no permissions on any of my tables. The user 'FullTableX' is a member of 'public', 'db_datareader' and 'db_datawriter'. With the above settings, I would expect user 'FullTableX' to have full access (Select, Insert, Update and Delete) on all my database tables (so far so good), but any user connecting to the database with NT authentication (via an ODBC System DSN) should only have read access to the limited set of tables. However, what is happening is that NT authenticated users also have full access to all of the tables. What am I missing here??
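The usual culprit when NT-authenticated users end up with more access than their role grants is that they reach the database through another path as well, for example membership of a Windows group whose login is in sysadmin (such as BUILTIN\Administrators) or a mapping to dbo. A few quick checks, run over the same ODBC connection as one of the affected NT users (the role name is the one from the post):
SELECT USER_NAME()                   AS db_user,       -- which database user the login maps to
       IS_SRVROLEMEMBER('sysadmin')  AS is_sysadmin,   -- 1 = all permission checks are bypassed
       IS_MEMBER('db_owner')         AS in_db_owner,
       IS_MEMBER('db_datawriter')    AS in_datawriter,
       IS_MEMBER('ReadTableX')       AS in_readtablex;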
All,
Just want to make sure that I understand what's going on here. I have a table with IGNORE_DUP_KEY set on a unique, multi-column index. What I'm seeing is this:
1) When performing a BULK INSERT, the unique index is not being respected and rows which violate the unique index are inserted.
2) When performing a regular INSERT, the unique index is being respected and rows which violate the unique index ARE NOT inserted.
Is this expected behavior?
Also, I have some questions, given the index described.
Q1) Will a regular INSERT that attempts to insert duplicate data get an error back or just a warning?
Q2) How can I set things up so that a BULK INSERT would NOT allow duplicates to be entered into the table?
Thanks,
Wes Gamble
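On Q1: as far as I recall, with IGNORE_DUP_KEY a regular INSERT that hits duplicates just gets the 'Duplicate key was ignored' warning and the statement still succeeds for the non-duplicate rows. On Q2, one common pattern (a sketch only, with placeholder table, column, and file names) is to bulk load into a keyless staging table and then insert only rows that are not already present:
-- Load the raw file into a staging table with no unique index.
BULK INSERT dbo.Stage_MyTable
FROM 'C:\load\mytable.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Move across only key values not already in the target, collapsing
-- duplicates within the file itself via GROUP BY.
INSERT INTO dbo.MyTable (KeyCol1, KeyCol2, SomeValue)
SELECT s.KeyCol1, s.KeyCol2, MAX(s.SomeValue)
FROM   dbo.Stage_MyTable s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.MyTable t
                   WHERE t.KeyCol1 = s.KeyCol1 AND t.KeyCol2 = s.KeyCol2)
GROUP  BY s.KeyCol1, s.KeyCol2;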
I'm using the SQL Native Client to connect my VB6 application to my SQL Server 2005 database. My SQL Server 2005 database has ANSI_NULLS turned off. I have a query embedded in my VB6 application that uses the syntax "fieldName = NULL" in the WHERE clause. When I execute the query via the SQL Native Client, the query returns zero rows. When I execute the same query via the old OLEDB driver, the query returns many rows. If I change my query to "fieldName IS NULL" syntax, the problem goes away. However, I am more interested in figuring out why ANSI_NULLS are turned on when using the SQL Native Client even though my database has them turned off. Is there a connection string property that I can use with the SQL Native Client to ensure that the query is executed with ANSI_NULLS off?
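As far as I know, ANSI_NULLS is effectively a per-session setting, and the ODBC and OLE DB based providers (SQL Native Client included) turn it ON when they connect, so the database-level default only applies to clients that don't set it themselves. Short of rewriting to IS NULL, the workaround is to turn it off on the session before running the query. A sketch with placeholder table and column names:
-- Session-level override so "fieldName = NULL" matches rows where fieldName is NULL.
SET ANSI_NULLS OFF;
SELECT *
FROM   dbo.MyTable
WHERE  fieldName = NULL;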
I have written a script source component and attached a flat file connection, the connection string of which is defined by an expression.
However, when I get the ConnectionString from the connection in the script, it has the default filename value of the flat file, not the value of the expression. This is proved by passing in the filename variable and comparing the two.
The flat file has an expression on the ConnectionString of @[User::filename]
We had a SQL 2000 database that had full-text search implemented on it. We upgraded that server to a 2005 server, and now some of the full-text searches that once worked don't. Most do work; just the ones with ';' or special characters in the query string don't.
Has anyone seen the same behavior, and how can I get it to work again?
I want to write a statement something like this: SELECT Add_Date, File_No FROM dbo.File_Storage WHERE (File_No = 11/11/1234/). But I want the search to ignore the first 2 digits, so that it will return e.g. 10/11/1234, 09/11/1234, matching only the last part. Any help would be greatly appreciated. Thanks
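One way to do that (a sketch, assuming File_No is stored as a character column in the 'xx/xx/xxxx' shape shown) is to match on a LIKE pattern where the first two characters are wildcards:
-- '_' matches any single character, so the first two digits are ignored.
SELECT Add_Date, File_No
FROM   dbo.File_Storage
WHERE  File_No LIKE '__/11/1234%';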