Can I design data flows with an XML Source pointing to file-system XML files, but then run them against variables? I can see the properties at design time (AccessMode=0,1,2; XMLDataVar="User::XMLData"), but the XML Source object doesn't appear to have expressions enabled to change this at runtime, nor do these properties seem to be exposed to configurations.
The scenario:
I have a package that reads through (potentially thousands of) XML files and, having identified the embedded message "type", transforms them via XSLT to an easier XML format (the XML Source can't cope well with multiple namespaces etc.). It then sends the new file off to a type-specific data flow task to send the contents to the database.
For performance (and other) reasons I'd rather send the XSLT output to a single shared variable and use this variable as the XML source in the data flows (rather than writing out to the file system [via XSLT] and immediately reading it back in each time [via the data source]).
But designing the data flow task using a variable as the XML source is frustrating for a bunch of reasons, not least:
(a) The variable needs to be populated with sample XML data at design time, but this could only ever match one data flow at a time (fine at run time, but painful at design time, leading to validation errors popping up all over the place).
Yes, I could have a separate variable for each XML source, but that just makes things more complicated up front and also leaves me with 20+ large, type-specific XML variables floating around at runtime, of which only one is ever being used per import, which just doesn't feel right to me.
(b) Populating the variable with sample XML at design time is painful because it is a string and doesn't accept the hard returns etc. that are in the text files, design-time changes, and so on.
Using a file XML source at design time is infinitely easier: when I get a design-time change, I can just amend the XSLT and run the appropriate task to produce a sample XML file to design against.
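For reference, the bit that produces the XML is just a Script Task along these lines; the stylesheet path, source path, and the User::XMLData variable name are only illustrative, and the variable has to be listed in the task's ReadWriteVariables:

Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Run the type-specific XSLT over the current file and park the result in the shared variable.
        Dim xslt As New System.Xml.Xsl.XslCompiledTransform()
        xslt.Load("C:\xslt\MessageTypeA.xslt")                  ' placeholder stylesheet path
        Dim sw As New StringWriter()
        xslt.Transform("C:\inbox\current.xml", Nothing, sw)     ' placeholder source file path
        Dts.Variables("User::XMLData").Value = sw.ToString()    ' shared variable the XML Source would read
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class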
Hi, I am relatively new to SSIS. Please advise me on how to change the value of a variable during runtime, i.e. the user should be able to key in the value.
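A minimal sketch of one way this can be done, assuming a Script Task and a variable named User::MyValue (an invented name that would need to be in the task's ReadWriteVariables):

Public Sub Main()
    ' Prompt the user and store what they key in into a package variable.
    Dts.Variables("User::MyValue").Value = _
        Microsoft.VisualBasic.Interaction.InputBox("Enter a value for this run:")
    Dts.TaskResult = Dts.Results.Success
End Sub

The value can also be supplied without any prompting, e.g. from the dtexec command line with the /SET option or through a package configuration.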
All, I need to pass a variable to get source data from Oracle in an OLE DB Source. The variable is defined by an Execute SQL task (the previous step) on the control flow and it is a package-level variable. In the data flow, the OLE DB Source uses 'SQL command from variable'. The variable SQL command works when it is just: Select * from table. But I need the variable SQL command to be like this: Select * from table where last_modified_date > ? -- variable1, where variable1 is defined by another Execute SQL task at runtime.
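As far as I know the ? parameter syntax isn't honoured when the source is set to 'SQL command from variable', so one workaround is to build the whole statement into the variable in an earlier step, for example with a Script Task; a sketch, with User::SqlCommand and User::Variable1 as placeholder names:

Public Sub Main()
    ' Compose the full SELECT statement into the variable the OLE DB Source reads.
    Dim lastModified As Date = CType(Dts.Variables("User::Variable1").Value, Date)
    Dts.Variables("User::SqlCommand").Value = _
        "Select * from table where last_modified_date > '" & _
        lastModified.ToString("yyyy-MM-dd HH:mm:ss") & "'"
    Dts.TaskResult = Dts.Results.Success
End Sub

The same concatenation could instead be put on the User::SqlCommand variable itself as an expression, with EvaluateAsExpression set to True.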
I have a Data Flow task in a For Each loop which gets 100 source table names and 100 target table names...
I have a simple Data Flow task which transfers from an OLE DB Source to an OLE DB Destination. I am repeating the Data Flow task, which transfers from the source table name extracted by the For Each loop to a destination table variable.
The problem I am getting is that for the first table it is able to transfer correctly, because I did the mapping for those tables at design time... but for the next source table/destination table pair (which have a different number of columns and data types) it gives "Validation failed" and says it needs to refresh the metadata.
Is there any way to refresh the metadata of the Data Flow task at runtime? (I set the OLE DB Source's ValidateExternalMetadata property to false, but the same error still occurs.)
Hello! I'm using SQL Server 2000. I have a variable which contains the name of another variable scoped in my stored procedure. I need to get the value of that other variable, namely:
CREATE TABLE #myTable (value VARCHAR(20))
INSERT INTO #myTable VALUES ('@operation')

SELECT TOP 1 @parameterValue = value FROM #myTable
-- Now @parameterValue is assigned the string '@operation'
-- Here I need some way to retrieve the value of the @operation variable I declared before
-- (in fact another process writes into the table I retrieved the value from), in this case 'DIS'

DROP TABLE #myTable
I've tried several ways, but didn't succeed yet! Please tell me there's a way to solve my problem!!!
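For what it's worth, the only pattern I know of for getting at a variable by its name inside the same procedure is dynamic SQL with sp_executesql and an OUTPUT parameter, and it only helps if the candidate variable names are known up front so they can be passed in; a sketch using the names from the snippet above:

DECLARE @sql NVARCHAR(400), @result VARCHAR(20)
SET @sql = N'SELECT @val = ' + @parameterValue        -- becomes: SELECT @val = @operation
EXEC sp_executesql @sql,
     N'@operation VARCHAR(20), @val VARCHAR(20) OUTPUT',
     @operation = @operation,
     @val = @result OUTPUT
-- @result should now hold 'DIS'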
I have to create a script for changing the datasource at runtime.
Here is my scenario: during development I am using a data source called "DevDatasource1", and when I deploy to another environment the data source name will change, let us say to "QADatasource".
I have to create a script for changing the data source (i.e. DevDatasource1 to QADatasource). How can I achieve this using SetItemDataSources?
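A sketch of what this could look like as an rs.exe script against the ReportingService2005 endpoint; the report path and the shared data source path are made up:

' Repoint the report at the QA shared data source.
Public Sub Main()
    Dim ds As New DataSource()
    ds.Name = "DevDatasource1"                        ' data source name as the report references it
    Dim dsRef As New DataSourceReference()
    dsRef.Reference = "/Data Sources/QADatasource"    ' placeholder path to the shared data source
    ds.Item = dsRef
    rs.SetItemDataSources("/MyFolder/MyReport", New DataSource() {ds})
End Sub

Such a script would be run with something like: rs -i SwitchDataSource.rss -s http://myserver/reportserver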
How can i change the connection string at runtime?
The scenario is: I want my customer to enter the server name, database name, and some parameters at installation time; then I will use these entries to build the connection string and persist it in app.config.
Also not only at installation time but at runtime, while the program is running: I want to make these entries options which can be modified at any time.
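A rough sketch of the shape I have in mind, assuming System.Configuration is referenced and the connection string entry is named "MainDb" (all names invented):

Imports System.Configuration
Imports System.Data.SqlClient

Public Sub SaveConnectionString(ByVal server As String, ByVal database As String)
    ' Build the connection string from the user's entries.
    Dim builder As New SqlConnectionStringBuilder()
    builder.DataSource = server
    builder.InitialCatalog = database
    builder.IntegratedSecurity = True      ' or set UserID/Password from further entries

    ' Persist it into the application's config file and refresh the cached section.
    Dim config As Configuration = _
        ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None)
    If config.ConnectionStrings.ConnectionStrings("MainDb") IsNot Nothing Then
        config.ConnectionStrings.ConnectionStrings.Remove("MainDb")
    End If
    config.ConnectionStrings.ConnectionStrings.Add( _
        New ConnectionStringSettings("MainDb", builder.ConnectionString))
    config.Save(ConfigurationSaveMode.Modified)
    ConfigurationManager.RefreshSection("connectionStrings")
End Sub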
I'm trying to create a custom data flow transformation. I need to set the value of a user variable which will be used by a downstream component (for example, a data flow destination) at runtime. Is this possible? If possible, in which method should I do that? Thanks.
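If it helps frame the question, what I am picturing is writing the variable in the component's PostExecute, once the buffers are done, roughly like this ("RowCount" and the _rowsProcessed field are placeholders):

Private _rowsProcessed As Integer   ' accumulated in ProcessInput

Public Overrides Sub PostExecute()
    ' Lock the user variable for writing and commit the accumulated value.
    Dim vars As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90 = Nothing
    Me.VariableDispenser.LockOneForWrite("RowCount", vars)
    vars("RowCount").Value = _rowsProcessed
    vars.Unlock()
    MyBase.PostExecute()
End Sub

My understanding, though, is that the new value only becomes visible once the data flow finishes, so a destination running in the same data flow would not see it mid-stream.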
I have a defined data source to an Oracle server. I've already installed the Oracle client and set up my data source to save the user and password. I'm using the .NET Provider/OracleClient Data Provider connection. When I click on the "Test Connection" button, SSIS reports SUCCESS. In the Connection Manager tab I created one connection called "OracleServer" from my Oracle data source, described above.
In my package, I defined a DataReader Source task and specified "OracleServer" as its connection. I can preview data and view Oracle's column names..., so it makes me think that everything is fine. But when I execute the task it FAILS with a logon error saying the password can't be blank.
I have a flat file. I am trying to set the value of the property "HeaderRowsToSkip" at runtime. I have set an expression for this in my flat file connection manager, but this is not working: the connection manager does not pick up the value at runtime.
My expression is as follows:
DataRowsToSkip : @[user:: Var]
where "Var" is my variable which gets the value from the rowcount component and trying to set it back to the "HeaderRowsToskip" property.
I ve even tried setting the value to the "HeaderRowsToSkip" property in the expression builder.
Please help me figure out what is wrong with my code. The script is supposed to load a package (from file). The loaded package already has everything set up to run a query against a local server and output the results to an Excel file. The reason for the outer script is that I need to change the query based on a global variable. When the query changes, though, I think the existing data flow path is no longer valid, so I should remove it and re-create another one with the new input mappings. Here is my code, which runs and throws an exception at the AcquireConnections call.
The error is
Error: 0x2 at Script Task: The script threw an exception: Exception from HRESULT: 0xC020801B
I pieced together this code from the examples in the online books, but I am not sure what to do.
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Public Class ScriptMain
Public Sub Main()
'
Dim app As Microsoft.SqlServer.Dts.Runtime.Application = New Application()
Dim package As Microsoft.SqlServer.Dts.Runtime.Package = _
    app.LoadPackage("C:\MyPackage.dtsx", Nothing)   ' placeholder path to the saved package
Hi all, I have several reports using a single shared data source. I want to change, at runtime, the database that is used by that data source. Can this be achieved? If not, what are the other solutions? I guess that using a non-shared data source for each report may be the solution (is it?), but it is not the best solution for me. My goal is to allow users to run the same set of reports, viewed in the ReportViewer control, but using different databases (connection-string dependent).
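One avenue that might fit, at least for data sources embedded in each report rather than shared, is an expression-based connection string driven by a report parameter, along these lines (the DatabaseName parameter is invented):

="Data Source=MyServer;Initial Catalog=" & Parameters!DatabaseName.Value

As far as I know that only works on report-embedded data sources, not on a shared data source, which is why it may mean giving up the single shared data source.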
I've seen a number of posts similar to this but I still cannot figure out what I need to do to get it working. So here goes with a couple of newbie questions.
Question 1: Once created, how do I go about executing an SSIS package? I want to be able to call it from a C# application and pass in a couple of parameters.
Question 2: How do I go about setting the file path of my Excel source to a dynamic value passed at runtime? I want to be able to loop through a number of Excel files and do some processing on them. I've set up a variable (which I think I need to do), but after that I get stuck. Some other posts suggest package configurations but I cannot get my head around how they work.
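Something along these lines is what I'm picturing for both questions (shown in VB, but the same Microsoft.SqlServer.Dts.Runtime calls work from C#; the paths and the User::ExcelFilePath variable are made up):

Imports Microsoft.SqlServer.Dts.Runtime

Public Sub RunImport()
    ' Load the saved package, hand it the Excel file path through a variable, and run it.
    Dim app As New Application()
    Dim pkg As Package = app.LoadPackage("C:\packages\ImportExcel.dtsx", Nothing)   ' placeholder path
    pkg.Variables("User::ExcelFilePath").Value = "C:\data\Book1.xls"                ' placeholder path
    Dim result As DTSExecResult = pkg.Execute()
End Sub

This needs a reference to Microsoft.SqlServer.ManagedDTS.dll, and inside the package the Excel connection manager's ExcelFilePath (or ConnectionString) property would carry an expression based on User::ExcelFilePath so each loop iteration picks up a different file.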
Any help on this matter would be gratefully received.
1 20031012121212 200 (recordtype, CreationDateTime(YYYYMMDDHHMISS), Rec_Count) -- Header record
2 ABCD, XYZ, 9999, 999999999, 1234 -- Detail record
2 ABCD, XYZ, 9999, 999999999, 1234 -- Detail record
For the sample data given above I have two outputs at the Conditional Split (based on the record type). I want to store the first record's datetime value in a variable and then use that variable's value for the 2nd and 3rd rows.
Basically, the detail records would be stored in the database, not the header record. Is there any way I can use the variable while processing the 2nd and 3rd records?
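One shape this could take is a Script Component (transformation) placed before the Conditional Split that remembers the header value and stamps it onto the detail rows; the column names here are invented:

Private _headerDateTime As String   ' value captured from the header record

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    If Row.RecordType = "1" Then
        _headerDateTime = Row.CreationDateTime     ' header row: remember the datetime
    Else
        Row.HeaderDateTime = _headerDateTime       ' detail rows: write it to an added output column
    End If
End Sub

This leans on the rows arriving in file order, which they should when they come straight off a single Flat File Source.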
Hi, I want to transform an XML flow into an HTML flow. For this, I create an XmlDataDocument, add to it everything I want, and store it in a file (in a data flow task). After that, in an XML Task, I set the input properties like this: OperationType: XSLT; SourceType: File connection; Source: ras.xml.
All works fine. But now I want to use a variable to store my XML data. So, instead of storing my data in ras.xml, I store it in a variable like this:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the xml task, I set configuration like that:
But when I execute, I get the following exception:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.". Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I try to debug it by replacing the XML Task with a Script Task, I can see that my RAS variable contains the XML data, and it is well formed.
Hi -- I am fairly new to SSIS development, although I am starting to appreciate it more and more, especially since I have started getting into extending the object model. Here's my question:
I have a data flow that pulls data from any number of different delimited files with different numbers of columns. I have had no problem dealing with setting up run-time file locations and file names by using the expressions of a flat file data source, and I have been able to deal fairly easily with varying file delimiters by standardizing the files before they get into the data flow. However, I have not been able to come up with a solution that will allow my data source to discover its column info at run time and then pass that information on to the data flow task. All I really care about is being able to properly parse the individual rows into individual column data in the flat file data source, because the data flow itself is able to discover the actual data that the columns hold at run time.
I would very much appreciate any feedback from anyone on possible solutions for this.
I have a number of DTS packages I am trying to convert and am totally new to Integration Services.
I have a flat file that I reference to get a date that I use in some SQL statements in the package. It seems like the best way to do this would be to create a variable to hold the date and set it at run time from the flat file. I've been searching the documentation and the forum and can't figure out how to do this. This is so basic that I feel like an idiot putting it on the forum, but I need some help. This whole Integration Services thing is a big shift from DTS (which I eventually became pretty good at). Now I'm back to ground zero.
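In case it clarifies what I'm after, I imagine a Script Task at the start of the control flow doing something like this (the path and the User::RunDate variable name are invented, and the variable would need to be in ReadWriteVariables):

Public Sub Main()
    ' Read the date value out of the flat file and park it in a package variable
    ' for the SQL statements to use later.
    Dim dateText As String = System.IO.File.ReadAllText("C:\data\rundate.txt").Trim()
    Dts.Variables("User::RunDate").Value = dateText
    Dts.TaskResult = Dts.Results.Success
End Sub

The SQL statements could then pick the value up through parameter mapping on an Execute SQL Task or through a property expression.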
Any help would be appreciated. If you can direct me to any basic training resources that would be cool too.
In our company, after switching to Office 2007, Access 2007 runtime is installed on our general user machines. Everything is just fine and beautiful, except that there is one issue with runtime:
Several forms in my Access project use stored procedures in SQL Server 2000 as their record source. When I try to open the forms in Access 2007 Runtime on user machines, I receive an error that "The record source dbo.Myprocedure specified on this form or report does not exist". I use the dbo.ProcName format and the forms open correctly in my full version of Access. What could be the problem?
Hi, I have a GridView in which I show attendees to a course (per course). In the same GridView I want to be able (by clicking a button) to show all attendees to all courses, simply by leaving out a part of the SQL: WHERE Eventid = @eventid (resulting in all attendees no matter what course). In other words, how can I alter the SelectCommand of the SqlDataSource? Seems simple, but I can't hack it. Thanks in advance, Lx
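What I've been trying amounts to something like this in the button's click handler (control and table names are just examples):

Protected Sub btnShowAll_Click(ByVal sender As Object, ByVal e As EventArgs) Handles btnShowAll.Click
    ' Swap in a SELECT without the WHERE clause and rebind the grid.
    SqlDataSource1.SelectCommand = "SELECT * FROM Attendees"
    SqlDataSource1.SelectParameters.Clear()   ' the @eventid parameter is no longer needed
    GridView1.DataBind()
End Sub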
I am building a data warehouse for a customer who has systems located in two different countries.
I need to import data from four separate databases, which all share the same structure.
To do this I have created 20 packages to import the data from the source database. What I would like to do is, at run time, set which database the SSIS package should get its data from.
In SQL 2000 this was easy: a global variable was set, and then a Dynamic Properties task was used to set the data source.
How can I achieve the same result in SSIS? The data source is an ODBC connection, with the four ODBC connections having similar names, e.g. ABC_NZ, ABC_AU.
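The closest equivalents I can see are a property expression on each connection manager's ConnectionString, a package configuration, or a Script Task that rewrites the connection string from a variable; a sketch of the last option, with the connection and variable names invented:

Public Sub Main()
    ' Point the ODBC connection manager at whichever DSN the variable names.
    Dts.Connections("ABC_Source").ConnectionString = _
        "Dsn=" & Dts.Variables("User::SourceDsn").Value.ToString() & ";"
    Dts.TaskResult = Dts.Results.Success
End Sub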
We had existing cubes on our Analysis Server, and we were required to move them to another reporting server which would be using data replicated to it every night. The problem is that the source data is now divided across 2 reporting database servers. The table names/view names are the same in all the databases. I just want to change the data source that points to the existing database so it points to the new reporting server. Can you tell me how this can be achieved?
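If editing the data source directly in the Analysis Services database isn't an option, a sketch of doing it programmatically, assuming SSAS 2005 and AMO (server, database, and connection details are all placeholders):

Imports Microsoft.AnalysisServices

Public Sub RepointDataSource()
    ' Connect to the Analysis Services instance and rewrite the data source's connection string.
    Dim srv As New Server()
    srv.Connect("MyAnalysisServer")
    Dim db As Database = srv.Databases.FindByName("MyCubeDatabase")
    db.DataSources(0).ConnectionString = _
        "Provider=SQLOLEDB.1;Data Source=NewReportingServer;Initial Catalog=SourceDB"
    db.DataSources(0).Update()
    srv.Disconnect()
End Sub

The cubes would presumably need reprocessing afterwards so the new source is actually read.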
I'm importing some data from an excel file to sql server 2005.
I created an Excel data source inside my Data Flow task, but it is assuming that the source column's data type is double-precision float [DT_R8]. It isn't, even though some rows may contain a numeric string in the column's cell.
If I go to the data source's advanced editor and modify the data type property of the column, SSIS complains that the error output and the source output are not of the same data type. If I try to change the error output's data type in the advanced editor I get this error: "Property Value". The detailed error states:
Error at MOVIM 04 [MOVIM 04 [1]]: The data type for "output "Excel Source Error Output" (10)" cannot be modified in the error "output column "Agente Protector" (7662)".
Error at MOVIM 04 [MOVIM 04 [1]]: Failed to set property "DataType" on "output column "Agente Protector" (7662)".
If I let SSIS correct the error by itself, it changes the source column back to double-precision float [DT_R8].
Hi, I am trying to import data from Oracle RDB into SQL Server 2005 using SSIS. I created an ODBC data source to connect to Oracle and used a DataReader Source component with ADO.NET to connect to the ODBC data source.
Under the Component properties tab, the SQL Command looks something like this.
Select ID, ADDRESS, REVISED from ADDRESS
The data type for the source columns are Integer, Varchar(30) and DATE VMS.
Now when I look at the Input and Output properties window,
The external columns have the following data types:
ID - four-byte signed integer [DT_I4]
ADDRESS - Unicode string [DT_WSTR], length = 0
REVISED - database timestamp [DT_DBTIMESTAMP]
The output columns have the following data types:
ID - four-byte signed integer [DT_I4]
ADDRESS - Unicode string [DT_WSTR], length = 0
REVISED - database timestamp [DT_DBTIMESTAMP]
When I try to change the length of ADDRESS on the output column, I get the following error:
Error at Data Flow Task [DataReader Source [1]]: The data type of output columns on the component "DataReader Source" (1) cannot be changed.
Is this the default length for the Unicode string type? I was not able to load the ADDRESS column as it gets truncated before I load it into the destination. Even if I use a Derived Column or Data Conversion transformation, the ADDRESS is truncated before it reaches that transformation.
I have an Execute SQL Task that selects one column value from one row, so General > ResultSet = Single row, Result Set > Result Name = 0 (the first selected value), and Variable Name = User::objectTypeNbr. The task runs successfully, but after it runs the value of User::objectTypeNbr is not changed.
User::objectTypeNbr has Data Type = Int32. When I declared the variable, Value could not be empty, so I set it to 0 arbitrarily, assuming it would be overwritten when assigned a new value by the Execute SQL Task, but it remains 0 after the task runs. What am I missing here?
I have a data flow that reads from a flat file source, goes through one data transformation component to change from unicode to normal text and writes the data to a SQL Server table. This has been working fine throughout development using a specific source file as input. I have now manually changed the path and name of the input source file in the connection manager to point to a new file and the task continues to process the old text file.
I open the connection manager and check its properties and preview the data and it all looks fine - it is finding the new file. I also edit the flat file source component and preview the data and it shows the data from the new file. I run the data flow by right-clicking and selecting Execute Container and it continually reads the old file and processes it! (I do the right-click thing because this is just one small part of a larger package.)
This has got to be a bug, but just where I wonder. Anyone ever see this before? I'm going to try to run the entire package in debug mode, instead of right-clicking, next and see if that's any different. Anyone have any ideas on how to force a refresh of the necessary internal components to make it read the new file? All the external properties point to the new file, but it's not being read.
Joe
Update:
I ran the entire package and found no difference in execution - it's still reading the wrong input file. I then deleted and recreated the connection manager, again specifying the new file to read. It still continued to read the old file. I deleted the flat file source component and recreated it, specifying the latest connection manager (twice, since I must have pointed it at the wrong one the first time and it read a completely different file). I still have the same problem of it reading the wrong input file. I don't know what else to recreate that would have any effect. Does anyone have any ideas? I need to change the pointer to different files multiple times and have it read several different input files. This has got to work somehow. Any help is appreciated.