Hi, let's say I have a package that takes an "InvoiceID" parameter. I want to execute this package as a child of another package. The parent package gets the list of invoices to produce and calls the child package for each entry in the list.
How do I pass the InvoiceID to the child? I know I can use the parent's variables from the child, but I don't want the child package to be dependent on the parent package. For example, I might want to execute the "child" package as a stand-alone in development (providing it with a predefined InvoiceID). I might also want to call the same child package from another parent package with other variable names.
What I would like to do is "push" the value instead of "pulling" it. I know it's possible using the command line and the /SET option (e.g. /SET \Package.Variables[InvoiceID].Value;184084)... Is it possible using the Execute Package Task?
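For reference, one way to "push" the value is from a Script Task in the parent rather than the Execute Package Task. A minimal sketch, assuming the child is a .dtsx file at a hypothetical path and exposes a package-scoped variable named InvoiceID:

    // Inside a Script Task in the parent; Microsoft.SqlServer.Dts.Runtime is already referenced there.
    Application app = new Application();
    Package child = app.LoadPackage(@"C:\Packages\ProduceInvoice.dtsx", null);   // path is an assumption

    // Push the value into the child's own variable, so the child never has to reference the parent.
    child.Variables["InvoiceID"].Value = Dts.Variables["User::InvoiceID"].Value;

    // Run the child and report its outcome as this task's result
    // (ScriptResults is the enum generated in newer Script Task templates; adjust for your SSIS version).
    DTSExecResult result = child.Execute();
    Dts.TaskResult = (result == DTSExecResult.Success) ? (int)ScriptResults.Success : (int)ScriptResults.Failure;

The child still runs stand-alone, because it only ever reads its own InvoiceID variable.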
With respect to parent/child packages, can someone correct me if I'm wrong?
Even if connection managers are created in the parent package, the same connection managers need to be created in the child packages if they need to connect to a database. Alternatively, I can just put the connection strings in variables in the parent package (instead of in the connection managers themselves) and use a parent package variable configuration to set the child package's connection string from the variable value.
Sorry if I'm being confusing.
The same goes for variables: a parent variable needs to be mapped to a child variable (using a parent package variable configuration) to be used in the child package; it cannot be used as-is.
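For illustration, the variable-based variant described above usually comes down to two pieces; the names here are hypothetical, not from the original post:

    Parent package variable configuration:
        parent variable:            User::ParentSrcConnStr
        configured child property:  \Package.Variables[User::SrcConnStr].Properties[Value]

    Child connection manager, ConnectionString property expression:
        @[User::SrcConnStr]

That way the child's connection manager keeps a design-time default but picks up whatever connection string arrives from the parent at run time.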
I am making use of the DtUtil tool to deploy my package to SQL Server. My configuration: a 32-bit machine and a 32-bit named instance of Yukon (SQL Server 2005).
I have some package variables which need to be set in the code.
Previously I did it as follows:
Set the package variables in the code. For example:
Here the package was successfully deployed, and when I open those packages using BIDS, I can see that the variables are set to the values as done in the code.
Because of one problem I am not using the SaveToSQLServer method. So I switched to the DTUtil tool. Now I am doing it this way:
Set the package variables as before. Deploy the package to SQL Server using DTUtil tool.
Now here is the problem: the package is successfully deployed, but the variables are not set to the values that I specified in the code.
I also tried the DTExec utility to set the package variable. Even that doesn't work. Can anyone help me out? Is there any alternative method to set package variables?
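Two hedged notes that may be relevant here. First, dtexec /SET only overrides the value for that single execution; it does not change the copy of the package stored in SQL Server. Second, if the goal is for the deployed package itself to carry the new values, one approach is to set the variables in code, save the package back to a .dtsx file, and let DTUtil deploy that file. A rough sketch using the runtime API; the file path, variable name and instance name are assumptions:

    // Load the package, set the variable, and persist the change before deploying with DTUtil.
    Application app = new Application();
    Package pkg = app.LoadPackage(@"C:\Build\MyPackage.dtsx", null);      // path is an assumption
    pkg.Variables["User::MyVariable"].Value = "new value";                // variable name is an assumption

    // Save the modified package back to disk...
    app.SaveToXml(@"C:\Build\MyPackage.dtsx", pkg, null);

    // ...and then deploy that file, e.g.:
    //   dtutil /FILE C:\Build\MyPackage.dtsx /COPY SQL;MyPackage /DestServer "MACHINE\INSTANCE"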
I'd like to be able to call different packages from a control flow. These packages will have different requirements for parameters therefore I'd like to create them dynamically.
Is this possible? Can I do it using a script task?
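It does appear to be possible from a Script Task. A rough sketch where the package path and the per-package variable values come from parent variables; every name below is hypothetical:

    // Pick the child package and its "parameters" (variables) at run time.
    Application app = new Application();
    string packagePath = (string)Dts.Variables["User::ChildPackagePath"].Value;   // assumed variable
    Package child = app.LoadPackage(packagePath, null);

    // Only set the variables this particular child actually defines.
    if (child.Variables.Contains("InvoiceID"))
        child.Variables["InvoiceID"].Value = Dts.Variables["User::InvoiceID"].Value;
    if (child.Variables.Contains("RunDate"))
        child.Variables["RunDate"].Value = Dts.Variables["User::RunDate"].Value;

    child.Execute();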
I am trying to understand the relationship of setting package configurations and setting variable values during job scheduling. I understand that I can select variables that I want to manipulate at run time using package configurations. I understand that the configuration file is an xml file that the job can be told to access at run time. Here are my questions:
1. Once I create a configuration file, do I physically modify the file to change the variable that is input at runtime?
2. Do I have to select the config file and then change the value using the Set Values tab?
3. What is the relationship between the config file and the Set Values tab?
4. When creating a package configuration, when would you use the options other than an XML configuration file?
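To make question 1 concrete: the configuration file is just XML listing property paths and values, and you can either edit it by hand or override the values again at run time with the Set Values tab (or dtexec /SET). A trimmed-down sketch of a single entry for a variable; the variable name and value are hypothetical:

    <Configuration ConfiguredType="Property"
                   Path="\Package.Variables[User::BatchSize].Properties[Value]"
                   ValueType="Int32">
        <ConfiguredValue>500</ConfiguredValue>
    </Configuration>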
tblServer represents a server, tblInstance represents any SQL Server instances present on the server. They're related through srvID.
I'd like to write a query that gives me servers with all instances, on one row. This dataset will populate a List in an SSRS report.
I have this but it's clunky and does not provide for more than 2 instances on the server. I'd like this to account for any number of instances all on one row without having to hard code multiple joins:
with serverInfo as (
    select srv.srvType as Type
          ,srv.srvName as ServerName
          ,srv.srvIP as ServerIP
          ,ins.instNetName as InstanceNetName
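For what it's worth, a sketch of one way to avoid the hard-coded joins, concatenating any number of instances per server with FOR XML PATH; it assumes tblInstance.srvID is the foreign key to tblServer.srvID, and the comma separator is just a choice:

    -- One row per server, all instance names concatenated into a single column
    SELECT  srv.srvType AS [Type],
            srv.srvName AS ServerName,
            srv.srvIP   AS ServerIP,
            STUFF((SELECT ', ' + ins.instNetName
                   FROM tblInstance ins
                   WHERE ins.srvID = srv.srvID
                   FOR XML PATH('')), 1, 2, '') AS Instances
    FROM tblServer srv;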
I've been executing a package and passing a parent variable to the child using package configurations but I'd now like to do this using a script task. The script task would then programmatically load the package and execute it.
How do I do this and still use the parent variable?
I've found examples of how to load a package, but I haven't been able to find out how to load it specifying the parent variables.
I think it's possible. MSDN shows the available methods but the example is for the base method.
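Setting the MSDN Execute overloads aside, one approach that seems to work is simply to copy the relevant parent variable values into the child's variables after loading it and before calling Execute, which mimics what the parent-variable configuration does. A short sketch; the path and variable names are hypothetical:

    Application app = new Application();
    Package child = app.LoadPackage(@"C:\Packages\Child.dtsx", null);    // path is an assumption

    // Push the parent's values into the child's variables of the same name.
    foreach (string name in new[] { "InvoiceID", "BatchDate" })           // hypothetical names
        if (child.Variables.Contains(name))
            child.Variables[name].Value = Dts.Variables["User::" + name].Value;

    child.Execute();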
I've not really used SSIS for a while, and I'm now building some in 2012 and trying to utilise some of the features in the 2012 SSIS catalogue; however I've hit a bit of a stumbling block.
What I'm trying to do is have a master/child package relationship, with several child packages and where the child packages themselves are dynamically called (i.e. the master package may call a different child package based upon some value or state of data already processed.)
When I try to create an expression for the PackageNameFromProject property... well, that property doesn't appear to be settable dynamically. I know how to do this for old-style packages by creating expressions for the package name etc., but that way I can't use the package parameters I have in my master package.
I am not comfortable with DTS 2000, but I need to execute an encapsulated DTS 2000 package from an SSIS package. The real problem is when I need to pass SSIS variables to the DTS 2000 package. The DTS 2000 package has 3 global variables that I can identify on the "Execute DTS 2000 Package Task Editor - Inner Variables" page. I believe the SSIS variables must be mapped on the "Execute DTS 2000 Package Task Editor - Outer Variables" page. How can I associate the SSIS variables (Outer Variables) with the Inner Variables? How can I do it? Many thanks.
I created a package which passes some information (through parameters) to its child package.
I need to do some processing in the parent package based on the execution status of the child package, i.e.
if the child fails then do one operation, and if the child succeeds then do another.
To determine the execution status of the child package I am using two different precedence constraints: one constraint has the value "Success" and the other has the value "Failure".
My problem is that when the child package executes successfully, the constraint with value "Success" works properly, but when the child fails, the constraint with value "Failure" does not work.
I have created an SSIS package with a Foreach Loop containing a Data Flow Task, which in turn includes a Row Count component that passes the row count value to a variable with package scope. The variable is used in an Execute SQL Task following the Data Flow Task.
The package executes successfully when run on its own, but when executed as a child from a parent package (which only includes an Execute Package Task), the variable from the Foreach Loop becomes NULL.
There are a lot of other variables in the package receiving values dynamically without any problem; the row count variable, however, is the only variable in the package that receives its value as part of a Data Flow (and is used in subsequent tasks within the Foreach Loop).
Why does the variable become Null? For your information, I am using a variable with package scope and no variables from the parent package are used or passed from the child package to the parent package.
(For your information, we are running the 64 bit version)
The master package has a configuration file specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers, set up to use localhost. This is done deliberately to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, or from test to production. All differences are in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw).
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package fails again, this time with
Error - "timeout when connecting to database xxx"
Info - "package is preparing to get connection string from parent ..."
and then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
We have one main package from which 7 other child packages are called. We are using ParentPackage variables to assign values for variables and database connections.
While the values from the ParentPackage variables get assigned properly to some of the packages, for others the value is not assigned.
For example: except for one of the packages, the database connection string gets assigned properly to all packages.
Similarly, in another package one of the variables doesn't get assigned; in the other packages it is assigned properly.
We have checked all the other property values and they are exactly the same.
We cannot make head or tail of this erratic behavior.
I am interested in passing a value from a child package variable to the parent package that calls it in SSIS.
I am able to call the child package using the Execute Package Task and use configurations to pass values from the parent variable to the child, but I am not able to pass the value from the child to the parent.
I have a variable called datasetId in both the parent and the child. It gets computed in the child and needs to be passed to the parent...
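One pattern that is often suggested for this, hedged because it relies on the child running in-process (ExecuteOutOfProcess = False): give the parent variable a distinct name, e.g. ParentDatasetId, and have a Script Task in the child write to it through the variable dispenser; at run time the child executes in the parent's context, so the lock resolves to the parent's variable. A sketch, with the parent variable name being an assumption:

    // Script Task inside the child package, after datasetId has been computed
    // (User::datasetId is listed in the task's ReadOnlyVariables).
    Variables vars = null;
    Dts.VariableDispenser.LockOneForWrite("User::ParentDatasetId", ref vars);   // variable exists only in the parent
    vars["User::ParentDatasetId"].Value = Dts.Variables["User::datasetId"].Value;
    vars.Unlock();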
Here's the deal. I have a child package (say, pack01.dtsx) which uses a dtsConfig file for its connection string; it can be called from other packages, but it can also be run by itself.
However I also have another package (say, pack02.dtsx) which uses the same dtsconfig file for its connection string. It calls on pack01.dtsx.
When I use DTEXECUI and run pack01.dtsx, specifying the proper .dtsconfig file, it goes well. But when I try and run pack02.dtsx, an error occurs saying pack01.dtsx connection cannot be established.
How do I pass the connectionstring being used by pack02 to pack01, without having to remove the configuration file setting of pack01? Can a Parent Package configuration and a configuration file try and map to the same property?
I am creating the report and setting the data source (dataset) dynamically. I want to set the parameters dynamically as well. I am using the ReportViewer control to process the report.
When I set the parameters locally, I get the following error: "An error occurred during local report processing".
Here is my code:

da = new SqlDataAdapter(strqry, conn);
ds = new DataSet();
da.Fill(ds);

ReportDataSource ReportDataSourceX = new ReportDataSource();
ReportDataSourceX.Value = ds.Tables[0];

ReportParameter[] parm = new ReportParameter[2];
parm[0] = new ReportParameter("Business_Function", "SQMO");
parm[1] = new ReportParameter("Application", "ETMRS");
parm[0] = new ReportParameter("Owner", "Subbu");
//parm[0] = new ReportParameter("Business_Function", ds.Tables[0].Columns["Business_Function"].ToString());
//parm[1] = new ReportParameter("Application", ds.Tables[0].Columns["Application"].ToString());
//parm[0] = new ReportParameter("Owner", ds.Tables[0].Columns["Owner"].ToString());
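For what it's worth, one likely cause: the parm array is declared with length 2 but three parameters are intended, and parm[0] is assigned twice, so "Business_Function" is never passed. A minimal corrected sketch, assuming the report really defines the three parameters Business_Function, Application and Owner; the dataset name ("DataSet1") and the viewer variable name (reportViewer) are assumptions that must match your report and page:

    ReportParameter[] parm = new ReportParameter[3];
    parm[0] = new ReportParameter("Business_Function", "SQMO");
    parm[1] = new ReportParameter("Application", "ETMRS");
    parm[2] = new ReportParameter("Owner", "Subbu");

    // The data source name must match the dataset name in the report definition.
    ReportDataSource rds = new ReportDataSource("DataSet1", ds.Tables[0]);
    reportViewer.LocalReport.DataSources.Clear();
    reportViewer.LocalReport.DataSources.Add(rds);
    reportViewer.LocalReport.SetParameters(parm);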
I have posted about this before - I don't mean to double-post, but I have isolated the problem and have new information (and I'm desperate!). Please direct me to a DTS forum if this is not the place for this; I didn't see one.
I am trying to set the data connection object (ODBC to Sybase) twice in the same package, but the Data Pump Tasks that use this connection don't seem to pick up the new setting after the second time it is set, although at the end of the package I can see that the connection has been changed. The pumps just use the original connection a second time instead of the second connection.
Here is the progress:
1. SQL Task1: gets server information from a SQL Server table and sets it into a recordset global variable in the DTS.
2. ActiveX Task1: begins the loop, sets the connection to server2, and sets the SQL of the data pump tasks.
3. Several data pumps run.
4. ActiveX Task2: just loops around to ActiveX Task1.
5. ActiveX Task1: sets the connection object to server2 (a msgbox tells me this happens successfully).
6. The data pumps continue to use server2 (??).
7. The package finishes successfully.
8. I right-click on the first data pump and do an "execute task". It executes using server2, which for some reason it did not do during execution of the entire package.
The loop works fine (as I got it from you, Allan); it clearly goes through twice when I have specified two servers (I can see that the data in the resulting table is doubled from the source server, and I can see it in the progress window going through the data pump steps twice). Afterwards, if I run the data pump tasks by right-clicking and doing an "execute task", they do work against the server2 that was set.
I put a msgbox in the ActiveX Task1 to confirm the server is being set correctly.
Any ideas here? Is there some other property I need to set? Is it a setting with the Data Pump task? This all seems very weird-I feel like I must be missing something obvious.
Here is the part of the code of ActiveX Task 1 (only part) that sets the connection (let me know if you want to see the package):
Dim pkg
Dim SCCScon
Dim objRS
Dim SQLStatement

' Get a reference to the running package and to the connection that is being re-pointed
Set pkg = DTSGlobalVariables.Parent
Set SCCScon = pkg.Connections("Connection_Name")

' Steps used to control the loop
Set stpEnterLoop = pkg.Steps("DTSStep_DTSDataPumpTask_1")
Set stpFinished = pkg.Steps("DTSStep_DTSActiveScriptTask_5")

' Recordset of data sources, populated earlier by SQL Task1
Set objRS = DTSGlobalVariables("gvDataSources").Value
I am trying to dynamically set the name of a checkpoint file. I have used an expression on the package property, and the following expression evaluates correctly
I have an SSIS package with an "Execute Package Task" that calls a child package. I am trying to have the master/parent package complete its execution regardless of the outcome (failure or success) of the child package. The overall structure of the master package is:
1. Perform Pre-load tasks (stored procedure).
2. Execute Package Task (call child package)
3. Perform Post-load Tasks (stored procedure)
I have tried everything and cannot get the results that I want... I have tried combinations of "FailParentOnFailure", "ForceExecutionValue", etc. The master package stops at the child's failure. I would like it to resume to completion.
I have two SSIS packages in a project, one calling the other. The parent package works fine on my local machine. After they are deployed to production, I schedule jobs to run the packages on the SQL Server. The child package works fine if I run it alone, but the parent package cannot find its child package when I run the parent package. As far as I can tell, all XML config files and the connection string pointing to the child package are set correctly. It seems the parent package did not use the XML config file. Can someone help me? Thanks in advance.
Hi, in my application I have two packages: a parent package and a child package. The parent package executes the child package using an Execute Package Task. The "ExecuteOutOfProcess" property of the Execute Package Task is set to TRUE, meaning the child package is run in a separate process, not in the parent package's process. This was working fine, but at one particular client location it is failing with the error "not able to load child package". It seems to me that some setting on the server is restricting the creation of a separate process for the child package execution. When the "ExecuteOutOfProcess" property of the Execute Package Task is set to FALSE, it works fine.
Can anyone help with what could cause this failure when the property is set to TRUE?
Can you dynamically set the name of the table in the SelectCommand section of the SqlDataSource? (If it is relevant, I code in C#.) For example:
<asp:SqlDataSource runat="server" ID="SqlDataSource1" ConnectionString="<%$ ConnectionStrings:Test-MySQL %>" ProviderName="<%$ ConnectionStrings:Test-MySQL.ProviderName %>" SelectCommand="SELECT * FROM TestTable">
I would like to replace 'TestTable' with the name of a table that is extracted from a string array in the code-behind.
Appreciatively, Peter
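A minimal code-behind sketch of one way this could work, assuming the control name above; the tableNames array and the whitelist are hypothetical, and the whitelist matters because the value is concatenated straight into SQL:

    // Code-behind (C#): swap the table name before the data source executes.
    string table = tableNames[0];                                  // hypothetical source of the name
    if (table == "TestTable" || table == "OtherTable")             // whitelist check to avoid SQL injection
    {
        SqlDataSource1.SelectCommand = "SELECT * FROM [" + table + "]";
    }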
Does somebody have experience with dynamically setting or changing the default value of a report parameter?
Assumptions: report parameters p1, p2, p3, p4 have been set up (each with the default value 'all') when report1 was created; report browsing is through a ReportViewer control embedded in the web application; the data source is a data cube.
What I want to do: based on the logged-in user of my web application, set the default value of p1 to that user's username.
What I did is:
Microsoft.Reporting.WebForms.ReportParameter reportParam = new Microsoft.Reporting.WebForms.ReportParameter("P1","Mary");
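For completeness, a minimal sketch of handing that parameter to the embedded viewer before it renders; this assumes the report runs server-side through ServerReport (for a local .rdlc report, LocalReport.SetParameters is the equivalent call), and ReportViewer1 is a hypothetical control name:

    ReportViewer1.ServerReport.SetParameters(
        new Microsoft.Reporting.WebForms.ReportParameter[] { reportParam });
    ReportViewer1.ServerReport.Refresh();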
I would like to fetch the data flow component name while the package is executing, since the system variable [System::SourceName] only fetches the names of control flow tasks. Is there a way to capture component names?
Hi, I have just started with ASP, and have a problem. I have created a stored procedure which looks to a specific table name for information, based upon the user's choice from a dropdown list. The control works fine when executed from within Visual Web Developer and I manually enter the value that the variable expects. However, I cannot get the dropdown list value written to the SQL parameter. I have tried for days, trawled the net for answers, and borrowed 3 ft in height of SQL books, so either I am doing something fundamentally wrong, or I am missing something.

My SP is:

ALTER PROCEDURE GenericTableSelect
    @tablename VARCHAR(20)
AS
DECLARE @SQL VARCHAR(1000)
SELECT @SQL = 'SELECT [base model] FROM '
SELECT @SQL = @SQL + @tablename
EXEC (@SQL)

And from the page the command to call it is:

SelectCommand=generictableselect></asp:SqlDataSource>

But this fails to compile and comes back with "@tablename not defined". Any pointers in the right direction would help. The object of this is two dropdown boxes - the first is populated from one database of categories, the selection of which populates the second dropdown list with items from within that category.

Cheers, Richard
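For what it's worth, a sketch of how a dropdown value is usually wired to the stored procedure's @tablename parameter through a ControlParameter; the control IDs and connection string name below are assumptions, not from the original markup:

    <asp:SqlDataSource ID="SqlDataSource1" runat="server"
        ConnectionString="<%$ ConnectionStrings:MyConnectionString %>"
        SelectCommand="GenericTableSelect"
        SelectCommandType="StoredProcedure">
        <SelectParameters>
            <asp:ControlParameter Name="tablename" ControlID="DropDownList1"
                PropertyName="SelectedValue" Type="String" />
        </SelectParameters>
    </asp:SqlDataSource>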