SQL 2012 :: Set SSIS EvaluateAsExpression Property Of Variable To TRUE?
Jul 23, 2015
Where do you set the EvaluateAsExpression property of an SSIS variable to TRUE?
[URL]
I am trying to use the idea as mentioned by Jamie at:
http://blogs.conchango.com/jamiethomson/archive/2005/12/09/2480.aspx
which is to build dynamic SQL using a variable evaluated as an expression.
Set Expression="SELECT * FROM MyTable WHERE MyColumn = " + @[VariableContainingFilterValue]
Everything works fine. The entire package works.
My next step is to log the variable so that I know, after package execution, exactly what SQL statement the package is executing.
I tried to do it by a couple of ways in a Script task:
1) Dts.Events.FireInformation(0, String.Empty, String.Format("SQL: {0}", Dts.Variables("SourceSQL").Expression), String.Empty, 0, False)
This gives me just the expression text, without actually evaluating it.
2) Dts.Events.FireInformation(0, String.Empty, String.Format("SQL: {0}", Dts.Variables("SourceSQL").Value), String.Empty, 0, False)
Produces an error:
The expression for variable "SourceSQL" failed evaluation. There was an error in the expression.
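For what it's worth, the most robust logging pattern I know of is to lock the variable explicitly (a sketch; it assumes User::VariableContainingFilterValue has a non-empty design-time default, since a dependent variable that is still empty when the expression is evaluated is a common cause of the "failed evaluation" error):
Dim vars As Variables
' Locking for read forces a clean evaluation of the expression.
Dts.VariableDispenser.LockOneForRead("User::SourceSQL", vars)
Try
    Dts.Events.FireInformation(0, String.Empty, _
        String.Format("SQL: {0}", vars("User::SourceSQL").Value.ToString()), _
        String.Empty, 0, False)
Finally
    vars.Unlock()
End Try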
Regards,
Yitzhak
I have a main package calling another package through the Execute Package task.
The main package is passing the Job Instance id as a parameter to the other package.
When I execute the Execute Package task, the called package shows no execution progress. However, when I set the DelayValidation property to True, the package executed instantly and the desired result was obtained.
I am not sure why DelayValidation made the difference: it only defers validation of the task until it actually executes, and my package has no temp tables or other temporary objects that would normally require delayed validation.
I have a new SSIS package that I add a new string variable to with the EvaluateAsExpression property set to True. In the Properties window, I click inside the Expression property, but do not receive the Browse Button (...) to launch the expression builder window.
This appears to be specific to my machine, since this same package behaves as expected on other PCs. I have tried to build my expression in a text editor and paste it into the Expression property, but only the first line of the expression is pasted. Am I missing a configuration or install somewhere? Any thoughts on solutions or workarounds will be greatly appreciated.
Hello,
I am having a hard time setting the executable path for an Execute Process Task in SSIS. I have a variable, initialized at package startup, which holds the path to an executable in Windows. When I set the Executable property via an expression, I get a warning that the path for the executable is not set. One workaround was to initialize the variable with a bogus path in the hope that the correct value would be written at run time. NO LUCK. I still get the error, and I cannot run the package until I put in a static path.
Does anyone have a clue as to what is going on??
Mike
I have a table which contains a column ([Connection String]) that has every connection string for the SQL instances on our estate.
I have a package which will import data from a single instance, but how do I set the connection manager so that it uses a variable populated by selecting the connection strings from my central database?
Do I use a cursor to select the next connection string?
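No cursor should be needed; the usual pattern is a Foreach Loop over the result set. A sketch, with dbo.Instances standing in for whatever your central table is called:
-- Execute SQL Task: ResultSet = Full result set,
-- result mapped to an Object variable User::rsConnections.
SELECT [Connection String] FROM dbo.Instances;
Then add a Foreach Loop Container using the Foreach ADO enumerator over User::rsConnections, map column index 0 to a String variable User::ConnString, and give the connection manager the property expression ConnectionString = @[User::ConnString]. Setting DelayValidation = True on that connection manager avoids validation failures before the first iteration.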
How do you pass a value in a parameter to a variable ?
Is it possible to use a table variable in SSIS 2012, and if so, how?
I want to insert some results from an Execute SQL Task into this table variable and use it in an OLE DB Source task in the data flow, where it feeds a SQL query with an IN operator.
The table variable contains multiple values like '100','234','XYZ'. Is it possible to do this, or is there another solution to achieve it?
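A table variable cannot be shared between tasks, but the usual workaround is to flatten the values into a String variable and build the source query with an expression. A sketch (SQL Server 2012 has no STRING_AGG, hence the FOR XML trick; table and column names are placeholders):
-- Execute SQL Task: ResultSet = Single row,
-- result mapped to a String variable User::IdList.
SELECT STUFF((SELECT ',' + '''' + Code + ''''
              FROM dbo.MyCodes
              FOR XML PATH('')), 1, 1, '') AS IdList;
Then set the OLE DB Source to "SQL command from variable", pointing at a variable with EvaluateAsExpression = True and an expression such as:
"SELECT * FROM dbo.MyTable WHERE MyColumn IN (" + @[User::IdList] + ")"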
I need advice on updating a recordset contained in a System.Object variable during runtime.
I am trying to execute multiple file actions (plus parsing those files into a set of staging tables) at separate locations in parallel. I know I can do this in C# but I have a business requirement to use SSIS for all ETL operations.
Any one site can have zero to many files of 1 to 3 types. I would like to run multiple sites at the same time, so that when all files of all types are completed at a site, the process moves on to the next site in the list. I know I can handle a single site at a time in a Foreach loop, but if I can run, say, 3-5 sites concurrently, then I should be able to save execution time.
My thought is to have a recordset of the sites: when any one of the 3 (or more) control flows is open, update the recordset to mark that site as in progress, and when that site is complete, update the recordset to mark it as completed, and so on. Or am I heading in the wrong direction?
I am using VS 2010 and have an .xls file that I am trying to import into SQL Server 2012. I have most of it figured out, but there is a date field giving me problems, and what I would like to do is put that date in a variable so I can add it to every record in my SQL table.
I am using a SQL Task Editor with an Excel connection, and I have no problem getting other data from the Excel document into my variable; it's just the date that causes problems.
I have declared one variable in Project params with some value.
I want to edit that variable through a Script Task using C# / VB code.
I am looking for C#/VB code to use in a Script Task to edit a project-param-level variable (not a package-level variable).
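As far as I know, parameters are read-only while the package executes, so a Script Task can read a project parameter but not change it; the usual pattern is to copy it into a package variable and edit that instead. A VB sketch with placeholder names (add $Project::MyParam to the task's ReadOnlyVariables and User::MyVar to its ReadWriteVariables):
' Copy the project parameter into a writable package variable.
Dts.Variables("User::MyVar").Value = Dts.Variables("$Project::MyParam").Value
' Assigning to Dts.Variables("$Project::MyParam") instead would fail at
' run time, because parameter values are read-only during execution.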
As part of my package, I require a date (only a date, not a DateTime) which is 10 months before the current date. E.g., if the package executes today, I want 12/1/2014, which I will use in my package as a filter like where date='?', where ? is a parameter derived from the above logic.
So, I have a project parameter @ppdate with a value of -10. I create a variable of type DateTime (because there is NO Date type in SSIS) and give it the expression below:
DATEADD("Month", @ppdate, DATEADD("D", -(DAY(GETDATE())) + 1, GETDATE())). I am getting '7/1/2011 11:33:38 AM', which I don't want; I want only '12/1/2014'. How can I get it?
To get '12/01/2014', if I change the variable from DateTime to String, then I think I can't use the value in the filter condition where date='?', because that does not accept a string. Is this correct?
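One approach that may work: keep the variable as DateTime so the parameter mapping still functions, and strip the time portion with a DT_DBDATE round-trip, which leaves midnight on the first day of the month. Note that a project parameter is referenced as @[$Project::ppdate] inside an expression, and the variable needs EvaluateAsExpression = True (a stale value such as the '7/1/2011' above often means the expression is not actually being evaluated). A sketch:
(DT_DATE)(DT_DBDATE)DATEADD("Month", @[$Project::ppdate], DATEADD("D", -DAY(GETDATE()) + 1, GETDATE()))
Mapped as a DATE parameter, the midnight timestamp should compare cleanly in a where date = ? filter.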
I have a cube that has a dimension set up with several values, some of which are bools. While browsing in Excel or SSMS, two new values, when used as a filter, show (All) (Blank) and (True) for selections instead of (All) (True) and (False).
It is my understanding that when you have linked servers, the option "enable promotion of Distributed Transactions for RPC" should be set to TRUE, so we can roll back remote transactions if needed. At least, that's my understanding of that setting.
Having said that, the TRUE setting is affecting this particular TSQL code, inside an sproc, which I would prefer not to alter:
Insert into #TempTable
EXEC ServerB.MyDatabase.MyStoreProcedure
@param1= '',
@param2= ''
When the setting is TRUE (the current setting), I get this error:
OLE DB provider "SQLNCLI11" for linked server "ServerB" returned message "The partner transaction manager has disabled its support for remote/network transactions.".
Msg 7391, Level 16, State 2, Line 28
The operation could not be performed because OLE DB provider "SQLNCLI11" for linked server "ServerB" was unable to begin a distributed transaction.
When it is set to off, the error goes away.
How do you store a datareader property to a variable? Below is my current code. I have already declared my connectionString and sqlComm objects, as well as the userName and such. I need to store the values from the dr into the variables userName, userPass, and serverName. Thanks
Try
sqlCon.Open()
dr = sqlComm.ExecuteReader
While dr.Read
userName = dr("uName").ToString
LogInfo("userName = " & userName)
userPass = dr("uPass").ToString
LogInfo("userPass = " & userPass)
serverName = dr("sName").ToString
LogInfo("serverName = " & serverName)
End While
dr.Close()
Catch obug As Exception
LogEvent("Credentials Error: " & obug.Message)
Finally
sqlCon.Close()
End Try
I have a text file that is being inserted into a table in a remote db. I have a dev server named dbname_trunk and a production server named dbname. The dataflow task refers to
[dbname_trunk]..[temp_ActivitiesAccessControlListUsers]
On the production version I would want the above to be:
[dbname]..[temp_ActivitiesAccessControlListUsers]
Is it possible to put the following into the OpenRowset property of the DataFlowComponent's custom properties?
"[" + @[User:bName] + "]..[temp_ActivitiesAccessControlListUsers]"
Actually, I know that doesn't work. What I need to know is what would work to accomplish my purpose. Ultimately, I would like to put the value in the configuration file.
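What has worked for me (a sketch; the component name is an assumption): promote the property through the Data Flow Task rather than the component itself. On the Data Flow Task, open Properties > Expressions, pick [OLE DB Destination].[OpenRowset], and set its expression to:
"[" + @[User::DbName] + "]..[temp_ActivitiesAccessControlListUsers]"
with User::DbName fed from your configuration file. You will probably also need DelayValidation = True on the task (or ValidateExternalMetadata = False on the component) so validation does not fail before the variable is populated.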
Thanks
First and foremost, I'm officially and thoroughly confused between variables, expressions and settings. I need help clearing this up.
I'm going to use the Flat File Connection as an example.
I have:
> A global variable named FileToImportFullPath (string) that stores the full path of the file I want to import. Note: the file path will change as the package executes (the location of the file depends on a number of conditions, namely the date and time of the package execution)
> A Flat File Connection Manager used by a Flat File source in the package's main Data Flow
So ideally, I want the Flat File Connection Manager's ConnectionString to be set to whatever the value of the FileToImportFullPath variable is at the time. To accomplish this, I set the ConnectionString expression equal to FileToImportFullPath. First question: if I set the ConnectionString expression, is it okay for me to leave the ConnectionString property (i.e. in the Editor or the Property Editor) blank? Second, whenever I leave the ConnectionString property blank, I get a warning stating: "A valid file name must be selected". Since this is a warning, I ignored it, but during execution the value really is blank. Also, I'm 100% certain that the FileToImportFullPath variable is set correctly before the Data Flow step is executed.
All in all, I'm just confused about whether an object's property must be set if there is already an expression for it (e.g., in the file move task, the source and destination properties).
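For what it's worth, the pattern I have seen work (a sketch, not gospel): give ConnectionString a valid design-time value pointing at any existing sample file so validation passes, and let the property expression overwrite it at run time:
ConnectionString expression : @[User::FileToImportFullPath]
Alternatively, leave the property blank and set DelayValidation = True on the connection manager so SSIS does not validate it until the variable has been populated. As I understand it, an expression does not remove the need for the property to validate; it only replaces the value whenever it is evaluated.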
Thanks,
Jason
Hi,
A few Q's if I may:
When configuring a variable you have to set its Value, even if EvaluateAsExpression = TRUE and an expression is specified. Presumably the Value is a compile-time default which holds until runtime?
I have 2 Execute SQL tasks preceding a Data Flow Task. Precedence constraints require both Exec SQL Tasks to have completed successfully before the data flow task can be entered. Each Exec SQL Task sets a variable, and a 3rd variable, which is evaluated as an expression, is a function of the first 2. Does the expression get re-evaluated automatically and immediately, every time one or both of its dependent variables' values changes? Thus can it be GUARANTEED that the 3rd variable has indeed been re-evaluated BEFORE the data flow is entered?
Per 1000 runs I get a dozen or so of the following errors: 'expression for variable ... failed evaluation. There was an error in the expression.' Any ideas..?
I want to store data warehouse source tables and files in an Archive schema and then delete / drop them after a specified period of time.
Is there a table property that I can set (can't find one) or some other mechanism so that I can easily identify these tables with a script.
If there is no such property or feature within the database engine I will define a metadata table and record it there, but a property or similar that I can set at archive time would be very handy.
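Extended properties are a built-in mechanism that fits this (the property name here is my own invention):
EXEC sys.sp_addextendedproperty
     @name = N'ArchiveDate',                       -- hypothetical property name
     @value = N'2015-07-23',
     @level0type = N'SCHEMA', @level0name = N'Archive',
     @level1type = N'TABLE',  @level1name = N'MySourceTable';
-- Later, list archived tables by reading the property back:
SELECT s.name AS SchemaName, t.name AS TableName, ep.value AS ArchiveDate
FROM sys.extended_properties AS ep
JOIN sys.tables AS t ON ep.major_id = t.object_id
JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE ep.name = N'ArchiveDate' AND ep.minor_id = 0;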
Some tables have default (DF) constraints. I want the user to be unable to generate scripts for the tables that have default constraints, while still being able to script the remaining tables. If I want to DENY script generation selectively like this, what should I do?
I have a date held in a varchar field in a temporary SQL table, and I want to convert it into a SQL date, but it doesn't work. I can replicate this as below.
So I run
select
cast ('14/02/2014' as date)
and I get a conversion failed error.
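The string is in dd/mm/yyyy order, but CAST interprets it using the session's date format (typically mdy for US English), hence the failure. Telling SQL Server the order explicitly works:
SELECT CONVERT(date, '14/02/2014', 103);   -- style 103 = dd/mm/yyyy
-- or set the session's date order first:
SET DATEFORMAT dmy;
SELECT CAST('14/02/2014' AS date);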
I'm working on an SSIS package that uses a VB.NET script to grab some XML from a web service (I'd explain why I'm not using a Web Service Task here, but I'd just get angry), and I wish to then assign the XML string to a package variable, which then gets sent along to a Data Flow Task containing an XML Source that points at said variable. When I copy the XML string into the variable value in the script, a quickwatch on the variable (as in Dts.Variables("MyXML").Value) makes it look as though the new value has been copied to the variable, but when I step out of that task and look at the Package Explorer, the variable has its original value.
I think the problem is that the dataflow XML source has a lock on the variable and so the script task isn't affecting it. Does anyone have any experience with this kind of problem, or know a workaround?
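Two things worth checking, from experience rather than certainty: the variable must be listed in the Script Task's ReadWriteVariables, and it must not have EvaluateAsExpression = True, since an expression is re-evaluated on every read and will quietly override whatever value you assign. A sketch with hypothetical names:
' Script Task: ReadWriteVariables = "User::MyXML".
' User::MyXML must have EvaluateAsExpression = False, or the expression
' result will replace this assignment on the next read.
Dts.Variables("User::MyXML").Value = xmlString
where xmlString is the XML text the script fetched from the web service.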
I have a SQL Task that updates running totals on a record inserted using a Data Flow Task. The package runs without error, but the actual row does not calculate the running totals. I suspect that the inserted record is not committed until the package completes and the SQL Task is seeing the previous record as the current. Here is the code in the SQL Task:
DECLARE @DV INT;
SET @DV = (SELECT MAX(DateValue) FROM tblTG);
DECLARE @PV INT;
SET @PV = @DV - 1;
I've not been successful in passing an SSIS global variable to a declared parameter, but is it possible to do this:
DECLARE @DV INT;
SET @DV = ?;
DECLARE @PV INT;
SET @PV = @DV - 1;
I have almost 50 references to these parameters in the query so a substitution would be helpful.
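In my experience this pattern does work with an OLE DB connection, and it is exactly the trick for avoiding 50 repeated markers: assign the single ? to a local variable once and reference the local everywhere else.
-- Execute SQL Task, ConnectionType = OLE DB.
-- Parameter Mapping: one Input parameter, Parameter Name = 0,
-- bound to the SSIS variable (a hypothetical User::DateValue here).
DECLARE @DV INT;
SET @DV = ?;        -- the only parameter marker in the batch
DECLARE @PV INT;
SET @PV = @DV - 1;
-- @DV and @PV can now be referenced throughout the rest of the query.
As for the running-totals row not being visible: if the Data Flow and the SQL Task run under different transactions or connections, the inserted row may indeed not be committed yet; the package's TransactionOption and the connection manager's RetainSameConnection settings are worth a look.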
Dan
I'm new to SSIS, but have been programming in SQL and ASP.Net for several years. In Visual Studio 2005 Team Edition I've created an SSIS that imports data from a flat file into the database. The original process worked, but did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global value (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post.
Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
CREATE PROCEDURE [dbo].[p_CheckImportFileCreateDate]
/*
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
Example:
exec p_CheckImportFileCreateDate 'GL Account Import', '2/27/2008 9:24 AM', 0
*/
@ProcessName varchar(50)
, @FileCreateDate datetime
, @IsNewFile bit OUTPUT
AS
SET NOCOUNT ON
--DECLARE @IsNewFile bit
DECLARE @CreateDateInTable datetime
SELECT @CreateDateInTable = FileCreateDate FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant Global Variables in the package are defined as follows:
Name : Scope : Data Type : Value
FileCreateDate : (Package Name) : DateTime : 1/1/2000
IsNewFile : (Package Name) : Boolean : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General
Name = Compare Last File Create Date
Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
TimeOut = 0
CodePage = 1252
ResultSet = None
ConnectionType = OLE DB
Connection = MyServerDataBase
SQLSourceType = Direct input
IsQueryStoredProcedure = False
BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts based on posts I was able to locate.
SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping
Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty.
Expressions is empty.
When I run this in debug mode with this SQL statement ...
exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the following error message appears.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ...
exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
Thanks for your help.
Hi.
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set the parameter in the master package's ExecutePackageTask to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
Any help would be greatly appreciated.
Thanks
Geoff.
I have a pile of around 50 packages (*.dtsx files), and we have found that not all of these packages are set to have the critical bits fail the package when those critical bits fail. This gets to be slightly annoying when there is a job of some 20 or 30 packages running, and one of them fails while the rest of the job runs along as if nothing (bad) happened. This leads to having to hunt down what failed, diagnose why it failed, and then re-run the parts that failed in different and inventive ways after the unhandled exception.
So far, I have broken open a couple of the offending SSIS packages in notepad, and the string "FailPackageOnFailure" appears in a number of places. Unfortunately, due to the fact that some of the packages have Sequence Containers that hold some of the SQLTasks (all of the "critical bits" are SQL Tasks, I believe), and this changes the XML path to the parts that I want to scan. This leaves me with the prospect of opening each package, wait for the package to be validated, going to the SQL Task in the package, right clicking for the properties of said SQL Task, and finding the Fail Package on Failure bit (OK, more likely I will delegate that task to some poor schmuck, but I promise I will feel some regret over it).
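Until someone suggests something better, a blunt way to produce the hit list without opening each package: a .dtsx file is plain XML, so it can be scanned directly. A rough VB sketch; the string it searches for assumes the pre-2012 package format, where the flag appears as <DTS:Property DTS:Name="FailPackageOnFailure">0</DTS:Property>, so verify against one of your own files first:
Imports System.IO
Imports System.Text.RegularExpressions

Module ScanPackages
    Sub Main()
        ' Hypothetical folder; point this at the directory holding the packages.
        For Each f As String In Directory.GetFiles("C:\Packages", "*.dtsx")
            Dim xml As String = File.ReadAllText(f)
            ' Count places where FailPackageOnFailure is still 0 (False).
            Dim hits As Integer = Regex.Matches(xml, "FailPackageOnFailure"">0<").Count
            If hits > 0 Then
                Console.WriteLine("{0}: {1} item(s) with FailPackageOnFailure = False", f, hits)
            End If
        Next
    End Sub
End Module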
To make certain SSIS features work there are many properties that need to be set over and over for most containers in the package. For example with CheckpointRestart, you need to set (in most cases) all of the tasks' FailPackageOnFailure to True. If you miss one, the package may not restart properly and you might never know. There are other situations where as a development team we want certain properties to be usually set the same but differently from the SSIS default.
Is there a way to control the defaults that the SSIS IDE uses?
I remember back in classic VB that if you wanted to change the defaults of a bare form you could create a template form adjusted the way you like and put it in a templates folder. Then new forms added to a project would be based on the template form.
Every time I set this value to true, it saves; but when I close the package and reopen it, it returns to false. I am using an OLE DB connection... Help please.
Can the WaitForMe Task property be used for building eventing based tasks (that is, "tasks with no defined execution lifecycle"), as Haselden's IS book terms it?
The SSIS run-time would appear to never interrogate this property.
I tried setting a breakpoint in a custom task on the WaitForMe property. The breakpoint was never hit at run-time, design-time only. It seems that the run-time ignores this property.
The documentation states that "On completion of any task, the runtime examines tasks that are still running in the container, and if any of these tasks have WaitForMe set to false, those tasks will be canceled." Well, the tasks with WaitForMe set to false are in fact not cancelled (e.g. System::CancelEvent is not signalled).
The intention of the WaitForMe Task property is to allow for event based tasks to not prevent a package (or other container) from finishing if the event never happens.
Tasks like the WMI Event Watcher, Message QueueTask set to receive, and third party tasks like file watchers, or service broker queue receivers would seem to fall in this category.
SSIS data flow transformation - Lookup task - best practice concerning CacheType
I would like to know if there is any best practice concerning the CacheType property of the Lookup task. The default value is "Full", but if the SSIS package is working with a lot of data, i.e. 10+ million records coming from the OLE DB source through a varied number of data flow transformations, there must be an impact on memory usage if the lookup table is also large, i.e. 8+ million records. When should I consider setting the property value to "none"?
I see various references to the options for a package's protection level (including http://msdn2.microsoft.com/en-us/library/ms141747.aspx) but I can't seem to find anything telling me how to actually look at and/or change the protection level of a package - thus my question is "How do I change the protection level of an SSIS package?"
Thanks!
- Lance
Hi,
I want to transform an XML flow to an HTML flow. For this, I create an XmlDataDocument, add to it all that I want, and store it in a file (in a data flow task).
After that, in an xml task, I set the input properties like that:
Operation type: xslt
SourceType: File connection
Source: ras.xml
All works fine. But now, I want to use a variable to store my XML data. So, instead of storing my data in ras.xml, I store it in a variable like this:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the XML task, I set the configuration like this:
Operation type: xslt
SourceType: Variable
Source: user::RAS
But when I execute, I get the following errors:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.".
Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I debug it by replacing the XML task with a script task, I can see that my RAS variable holds the XML data, which is well formed.
<?xml version="1.0" encoding="UTF-8"?>
<root> …
</root>
(When I store it in a file, I can see it in IE)
Have you got an idea?
I am having problems exporting data into a flat file using a specific code page. My application has a variable "User::CodePage" that stores a code page value (936, 950, 1252, etc.) based on the data source. This variable is assigned to the CodePage property of the destination file connection using a property expression.
But when I execute the package, the CodePage property of the destination file connection defaults to the initial value that was set for the "User::CodePage" variable in design mode. I checked the value within the variable during runtime, and it changes correctly for each data source. But the property of the destination file connection doesn't change, and this results in an error.
[Flat File Destination [473]] Error: Data conversion failed. The data conversion for column "Column01" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
[DTS.Pipeline] Error: The ProcessInput method on component "Flat File Destination" (473) failed with error code 0xC02020A0. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
If I manually update the variable with the correct code page and re-run the ETL, everything works fine; it just doesn't work at run time.
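A couple of things to check, offered as assumptions rather than a verified fix:
Flat File Connection Manager:
DelayValidation = True
Expressions : CodePage = @[User::CodePage]
and make sure User::CodePage is assigned before the Data Flow Task starts. Be aware, though, that the data flow's column metadata also carries a code page fixed at design time, so changing the connection's CodePage at run time may not be enough on its own; if so, a separate connection manager (or package) per code page may be the more reliable route.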
Can someone please help me resolve this.
Thanks much.