I have an SP that outputs a valid XML record set. Using an Execute SQL
Task, I can put the XML results into a string variable. When I try to
use the variable in an XML Source, nothing happens and I get no error.
If I access the same XML from a file, I get the desired results.
Hi, I want to transform an XML flow into an HTML flow. To do this, I create an XmlDataDocument, add to it everything I want, and store it in a file (in a Data Flow task). After that, in an XML Task, I set the input properties like this: OperationType: XSLT; SourceType: File connection; Source: ras.xml.
All works fine. But now I want to use a variable to store my XML data. So, instead of storing my data in ras.xml, I store it in a variable like this:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the XML Task, I set the configuration like this:
But when I execute, I get the following exception:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.". Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I try to debug it by replacing the XML Task with a Script Task, I can see that my RAS variable contains the XML data, and it is well formed.
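For the debugging step described, a minimal Script Task sketch along those lines (assuming the SSIS 2005 VB template, with RAS listed in the task's ReadOnlyVariables); it fails loudly if the variable's text really lacks a root element:

Imports System
Imports System.Xml
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' LoadXml throws "Root element is missing" on an empty or
        ' truncated string: the same complaint the XML Task raised.
        Dim doc As New XmlDocument()
        doc.LoadXml(CStr(Dts.Variables("RAS").Value))
        Dim fireAgain As Boolean = True
        Dts.Events.FireInformation(0, "RAS check", _
            "Root element: " & doc.DocumentElement.Name, String.Empty, 0, fireAgain)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class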
A little background first. I have a header table and a detail table in my staging area/ods. I need to join them together to flatten them out for load. The Detail Table is pretty deep - approx 100 million rows.
If I use the setting (table or view) and set the table name (say, the detail table), the package starts up nicely. But if I switch the OLE DB Source to using a SQL Statement and then join the tables in the SQL, then the Pre-Execute phase of the package takes a VERY long time. I have waited as long as 30 minutes for this phase to complete, but it never finished.
Another twist... If I take the join select statement out of the OLE DB Source and put it in a view on the server, then switch the OLE DB Source to look at the view using (table or view) mode, the package gets through the Pre-Execute phase just fine.
Can someone go into detail as to what the Pre-Execute phase does and why a deep table might make it take a long time? I know already that the pre-execute phase caches the lookups, but not much else.
I have multiple records numbered 1-4500 for which I must insert a picture into the database. I am trying to set up a variable to do this; if someone can show me my error I would appreciate it. This table has 4 columns: id# (float, FK), ImageName (nvarchar(10)), ImageFile (varbinary(max)), and rec# (int, PK, identity).
use consumer
declare @jpg as int
declare @sql as nvarchar(max)
set @jpg = 0
while @jpg < 4500
begin
    set @jpg = @jpg + 1
    -- OPENROWSET(BULK ...) accepts only a literal file path, and '@jpg.jpg'
    -- is just a literal, so the statement is built and run dynamically
    set @sql = N'insert photo ([id#], ImageName, ImageFile) '
             + N'select ' + cast(@jpg as nvarchar(10)) + N', '
             + N'''' + cast(@jpg as nvarchar(10)) + N'.jpg'', bulkcolumn '
             + N'from openrowset (bulk ''D:\DataPics\' + cast(@jpg as nvarchar(10)) + N'.jpg'', single_blob) as img'
    exec (@sql)
end
There are 3 views in the VS IDE: Designer, Code and Source. In Source view there is HTML markup; how do I add a variable to it? For example: <asp:SqlDataSource ID="SqlDataSource1" runat="server" ConnectionString="<%$ ConnectionStrings:JJConnectionString %>" SelectCommand="SELECT * FROM [Tbl]" ... How do I change the SelectCommand to something like: SelectCommand="SELECT * FROM [Tbl] WHERE [CREATEDBY] = '" & User.Identity.Name & "'"? User.Identity.Name is not valid in Source view, but I need it to work. Is there any way?
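One hedged way around this: leave the markup's SelectCommand alone and set it from the code-behind with a parameter, which also sidesteps the quoting and injection problems of concatenation. A sketch reusing the control and column names from the post:

Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
    ' Filter by the current user with a parameter instead of splicing
    ' the name into the SQL text.
    SqlDataSource1.SelectCommand = "SELECT * FROM [Tbl] WHERE [CREATEDBY] = @user"
    SqlDataSource1.SelectParameters.Clear()
    SqlDataSource1.SelectParameters.Add("user", User.Identity.Name)
End Sub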
I am trying to have a DataReader Source run the SQL statement that I have stored in a variable. For example, I have:
Variable #1
Variable name: tablename
Data Type: string
Value: "name_of_table"
Variable #2
Variable name: sql_stmt
Data Type: string
Value: "SELECT * FROM " + @tablename
I want to use a DataReader Source to run Variable #2 as the SqlCommand over an ODBC connection. If this is possible in any way, please let me know. Thanks in advance.
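This can be made to work, with two hedged caveats: the statement has to be assembled before the data flow runs, and the DataReader Source's SqlCommand has to be fed through a property expression on the containing Data Flow Task (Property = [DataReader Source].[SqlCommand], Expression = @[User::sql_stmt]). The assembly itself can be the expression "SELECT * FROM " + @[User::tablename] on sql_stmt (EvaluateAsExpression = True), or a Script Task such as this sketch (both variables listed in its ReadOnly/ReadWriteVariables):

Public Sub Main()
    ' Compose the statement from the tablename variable into sql_stmt.
    Dim tbl As String = CStr(Dts.Variables("tablename").Value)
    Dts.Variables("sql_stmt").Value = "SELECT * FROM " & tbl
    Dts.TaskResult = Dts.Results.Success
End Sub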
I am using a table variable as the source in my OLE DB Source task (I declare and populate the table variable in a T-SQL script) to load data into my destination table. However, the data gets lost after the task: I can preview the data from the OLE DB Source task, but when I get to the next task, a Multicast transformation, the data is gone. I have attached data viewers to see the data, but they show no rows. Any help is appreciated.
I read several posts on this topic and I would like to confirm my understanding.
This question has to do with parameterizing the select statement in the OLE DB Source editor. Initially, I thought I could use Data access mode: SQL command, and somehow use a user variable in there to build my Select statement.
But on researching further, it looks like I need to store the entire SQL command in one long string variable (let's call it A), where A is built off another variable that holds my parameter. I then use A with Data access mode: SQL command from variable.
Am I correct? And is there no way to accomplish the same thing with only one variable, the one holding your parameter value?
It seems a bit clumsy to store the entire SQL expression in a string variable.
I have a package-level variable [User::viewName], type = String, containing a view name. I want to set up an OLE DB source to use this variable's value as the source, so Data access mode = "Table name or view name variable". The Variable name dropdown contains [User::viewName], so I select it. When I click OK to leave the edit dialog I get the error:
The variable User::viewName is required to be of type "VT_BSTR".
Only variables of type String appear in the Variable name dropdown; if I change the variable to a type other than String, it no longer appears. What is VT_BSTR, and how can I change the variable type to it?
Using Jamie's, John's and Rafae's posts I have got this far. My scenario demands an incremental load, so I have followed your article and created one variable, 'vDate' (Data Flow Task scope, DateTime, 3/3/2008), and another variable, 'SourceSQL' (Data Flow Task scope, String). I have set SourceSQL's EvaluateAsExpression to True and added this expression: "select * from stagingperfdata where startdatetime = " + (DT_WSTR,10) @[vDate].
I want to use this within an OLE DB source, so I select the data access method 'SQL command from variable'. My variable name is visible, I am able to select it, and then save and debug. I also have a Derived Column transformation to increment the variable vDate by one day. The flow succeeds, but no records are written to the target. I tried the same query (available in the variable's Value column) in SSMS, and it retrieves 12 records. I went back to the OLE DB source and tried to preview the data, but I get a "Query timeout expired (Microsoft SQL Native Client)" error. Am I missing something fundamental?
We can pass XML to the XML Source in a variable, but I haven't seen anywhere how much data can be passed this way. Is there a limit beyond the limits of system memory?
Also, what data types are valid for the variable? Just String?
We are storing incoming flat files in a text field in a table and then processing that table on a regular basis. What I would like to do is take the flat file from the text field and feed a Flat File Source with it, but so far I have only managed that with XML files, as there is no option for doing it with the Flat File Source. Using the disk as temporary storage for the flat file is prohibited.
Does anyone have any suggestions on how to solve this?
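One pattern that may fit the no-disk constraint: a Script Component configured as a source can read the text straight from a variable and emit rows, standing in for the Flat File Source. A sketch, assuming a String variable User::FlatFileText (a hypothetical name) listed in ReadOnlyVariables and a single output column named Line:

Public Overrides Sub CreateNewOutputRows()
    ' Split the in-memory flat file into pipeline rows, one per line.
    Dim text As String = CStr(Me.Variables.FlatFileText)
    For Each line As String In text.Split( _
            New String() {vbCrLf}, StringSplitOptions.RemoveEmptyEntries)
        Output0Buffer.AddRow()
        Output0Buffer.Line = line
    Next
End Sub

Downstream transformations can then split Line into its individual fields.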
Hello. I'm new to SSIS. This forum and Kirk Haselden's book are my teachers. I'm having a hard time grasping something basic to get started defining a set of packages to automate the ETL process, however, and perhaps I'm simply misunderstanding the best practices of SSIS.
I have source data in two different transactional databases, and use OleDb connection managers (and OleDb Source components in the Data Flow) to extract the data. I use a Script Task and several Lookup widgets in the Data Flow to transform the data, and output each to two different package-scope DataSet variables.
How do I join these two datasets in a third Data Flow task for loading into my data warehouse? It seems I can iterate through them in the Control Flow, but I can't write a query against them in the Data Flow, since there is no connection manager that allows me to "connect" to a package-level variable. Should I instead be storing my extracted, transformed data in temporary database tables, and then joining these to do the final load?
Any advice greatly appreciated. Thanks in advance.
I have created a DTS package in Integration Services 2005. Within the package I declared a variable named xxx and gave it the value 1234.
In the control flow I dropped a Data Flow Task, and in its Property Expressions Editor I defined Property = [DataReader Source].[SqlCommand], Expression = the variable name.
Then, on the Data Flow canvas, I dropped a DataReader Source.
How can I pass the variable's value into SqlCommand = "Select * from table where name = <variable value>"?
Is there a way to configure the XML Source component to use a schema file whose filename is in a variable? Now, I know it doesn't make any sense to have an XML source whose schema can vary; however, I have a parent package with a ton of child packages, all of which process the same XML input using the same schema. What I would like is for the parent to load the filename of the XML input file into one variable, and the filename of the XML schema file into another. Each child would have its XML Source configured to pick up these variables from the parent, so that if the location or name of the schema changes, I don't have to edit each package one by one. However, it doesn't seem possible to specify anything but a filename for the schema. Any suggestions?
I am trying to make an SSIS package that will loop through all files in a directory and load information from them.
I can do this with Raw File Sources, since they allow me to use a variable as the file path, but I can't seem to do the same with Flat File Sources. Is there a way to change the connection of a Flat File Source on each iteration of a loop? Actually, if this is possible with all types of file sources (like Excel files), I would love to know about that too.
I was thinking about renaming the file through a Script Task, but that does not seem like the most elegant solution, so I decided to see if someone here knows a more proper way before I go in that direction.
Can I design data flows with an XML Source pointing to file-system XML files, but run them against variables? I can see the properties at design time (AccessMode=0,1,2; XMLDataVar="User::XMLData"), but the XML Source object doesn't appear to have expressions enabled to change this at runtime, nor do these properties seem to be exposed to configurations.
The scenario:
I have a package that reads through (potentially thousands of) XML files and, having identified the embedded message "type", transforms them via XSLT to an easier XML format (the XML data source can't cope well with multiple namespaces etc.). It then sends the new file off to a type-specific data flow task that writes the contents to the database.
For performance (and other) reasons I'd rather XSLT into a single shared variable and use that variable as the XML source in the data flows (rather than writing out to the filesystem [via XSLT] and immediately reading it back in each time [via the data source]).
But designing the data flow task with a variable as the XML source is frustrating for a bunch of reasons, not least these:
(a) The variable needs to be populated with sample XML data at design time, but any sample can only ever match one data flow at a time (fine at run time, but painful at design time, with validation errors popping up all over the place).
Yes, I could have a separate variable for each XML Source, but that just makes things more complicated up front, and it also leaves me with 20+ large, type-specific XML variables floating around at runtime, of which only one is ever used per import. That just doesn't feel right to me.
(b) Populating the variable with sample XML at design time is painful because it is a string, which doesn't accept the hard returns etc. that are in text files, design-time changes, and so on.
Using a file XML source at design time is infinitely easier: when I get a design-time change, I can just amend the XSLT and run the appropriate task to produce a sample XML file to design against.
We have a job running an SSIS package, and I am working with that package. The package has a Script Task, and in the Script Task they use a variable for the source path. I want to change the value of that variable (I need to modify the source path).
I modified the variable value in the package and in the config file, but I get this error message:
Exception has been thrown by the target of an invocation.
I want to capture the name of the file I am loading data from and use it in a SQL statement, inserting it along with the data that I load.
I am using a ForEach container to load all my .txt files, but I cannot figure out how to capture the name of each source file as the loop runs through the files, and then add it to the insert statement that populates my history table. I would think that the ForEach container has that information, but I do not see how to access it and assign it to a variable that I could use in my SQL statement.
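For reference, the ForEach File enumerator does expose this: on the loop's Variable Mappings tab, mapping index 0 to a String variable hands you the current file path (fully qualified or name-only, depending on the enumerator's Retrieve setting) on each iteration. A hedged sketch of folding that into the history insert from a Script Task inside the loop; User::CurrentFile, User::InsertSql and dbo.LoadHistory are all made-up names, and an Execute SQL Task would run User::InsertSql afterwards:

Public Sub Main()
    ' CurrentFile is filled by the loop's Variable Mappings (index 0).
    Dim f As String = CStr(Dts.Variables("CurrentFile").Value)
    Dts.Variables("InsertSql").Value = _
        "INSERT INTO dbo.LoadHistory (FileName, LoadedAt) " & _
        "VALUES ('" & f.Replace("'", "''") & "', GETDATE())"
    Dts.TaskResult = Dts.Results.Success
End Sub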
All, I need to pass a variable to get source data from Oracle in an OLE DB source. The variable is defined by an Execute SQL Task (the previous step) in the control flow, and it is a package-level variable. In the data flow, the OLE DB source uses 'SQL command from variable'. The variable SQL command works when it is just: Select * from table. But I need the variable SQL command to be like this: Select * from table where last_modified_date > ? (variable1), where variable1 is defined by another Execute SQL Task at runtime.
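Parameter markers (?) in an OLE DB source can be troublesome against non-SQL Server providers, so one hedged workaround is to assemble the entire statement into the 'SQL command from variable' variable just before the data flow runs, for example in a Script Task. A sketch with hypothetical variable and table names:

Public Sub Main()
    ' Fold the date captured by the earlier Execute SQL Task into the query;
    ' TO_DATE formats the literal for Oracle.
    Dim d As Date = CDate(Dts.Variables("LastModifiedDate").Value)
    Dts.Variables("SourceSQL").Value = _
        "SELECT * FROM my_table WHERE last_modified_date > " & _
        "TO_DATE('" & d.ToString("yyyy-MM-dd HH:mm:ss") & "', 'YYYY-MM-DD HH24:MI:SS')"
    Dts.TaskResult = Dts.Results.Success
End Sub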
I'm working on the viability of using SSIS to process data files that are (currently) sent by email or FTP and processed by hand. They are .csv or Excel and have some columns that are required and some that are included but whose data is discarded; they usually have some anomaly (like columns out of place, or columns missing) from source to source, and if they have headers, the actual headers are inconsistent from source to source (and sometimes from submission to submission). Some sources send only one file, others might send a dozen. Due to the nature of the sources - many are not very technically inclined, and sometimes other factors are involved in how the data is exported - there's not a lot I can do to leverage them into all using the exact same format. Creating an SSIS package for each source (over 100 in all) is not a viable solution.
The compromise I've come up with is a universal SSIS package that can:
Take a .csv file with headers (I think we can force the sources to include the right headers if nothing else) as input, with or without quote delimiters. Not be sensitive to the order of the columns. Not be sensitive to the number of columns. Loop through multiple files from one source. Place the output in a staging table for further processing.
Reading a .csv file, looping, and the staging table are not a problem. What I'm struggling with is the middle portion. I've tried making the column headers dynamic, but then I lose the ability to have quote delimiters. My next instinct was to take the source file(s) and create one interim file with a different delimiter (such as pipe or ## or something), but that doesn't work quite the way I want: since the source has comma delimiters, the output has both the pipe and the comma. Another way would be to replace the commas within quote delimiters with a blank space in the data flow, which would eliminate the quote delimiters in the output. That I haven't tried yet.
Any suggestions? If I could bring down the hammer and tell every source "You must send it this way" I would, but my hands are tied.
I am trying to set a variable to be passed to my ASP page, but the value is not carrying over. I don't even know if this is supposed to work or not: <td colspan="3" style="font-size: 10pt; vertical-align: top; color: <%=Div7fontcolor%>; background-color: <%=Div7bgcolor%>; text-align: center; width: 0px;" id="TopBox" runat="server">
The above Div7fontcolor and Div7bgcolor are variables that I set in the VB file as shown below:
Dim obConnection As SqlConnection = New SqlConnection("Data Source=localhost\sqlexpress;Initial Catalog=OrgBoard;Integrated Security=True")
Dim obCommand As SqlCommand = New SqlCommand("SELECT Divisions.*, Dept19.*, Dept20.*, Dept21.*, ConfigDisplay.* FROM Divisions CROSS JOIN Dept19 CROSS JOIN Dept20 CROSS JOIN Dept21 CROSS JOIN ConfigDisplay", obConnection)
obConnection.Open()
Dim dr As SqlDataReader = obCommand.ExecuteReader()
dr.Read()
Dim Div7bgcolor As String = dr("Div7color").ToString().Trim()
Dim Div7fontcolor As String = dr("Div7textcolor").ToString().Trim()
dr.Close()
obConnection.Close()
The web page runs fine with no errors, but the value does not carry over and change the property. I am able to set the bgColor value of TopBox from the VB file if I use "TopBox.BgColor = Div7bgcolor", but I can't set other values on the page, like the font color. If I can do this from the VB file then please correct me. Using variables within the page would allow me to set almost any value, but I don't even know if what I want is possible. Any help is greatly appreciated. Thank you,
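Two hedged observations on this one: <%= ... %> blocks are not evaluated inside the attributes of an element marked runat="server" (as TopBox is), and variables declared with Dim inside a Sub are locals that inline expressions cannot see anyway. The sketch below (class name hypothetical) addresses the second point by promoting the colors to page-level fields:

Partial Class OrgBoard
    Inherits System.Web.UI.Page

    ' Page-level fields are visible to <%= ... %> in the markup;
    ' locals declared inside a Sub are not.
    Protected Div7bgcolor As String
    Protected Div7fontcolor As String

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
        ' ...open the connection and read Div7color / Div7textcolor as in
        ' the post, assigning to the fields instead of new local variables...
        Div7bgcolor = "#FFFFFF"    ' placeholder; the real value comes from the reader
        Div7fontcolor = "#000000"
    End Sub
End Class

For the server-side cell itself, setting the style from code-behind, as with TopBox.BgColor, remains the dependable route.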
I am building an SSIS package that loops through a table in SQL Server and dynamically builds a select statement that I would like to use as an OLE DB source. I have been having a difficult time with this, as the select statement I am generating is over 200,000 characters long, so using a SQL variable is out of the question.
I ended up placing the select statement in a table where each row represents a piece of the select. I then use an Execute SQL Task that selects the entire rowset from this table into an Object variable. I then use a ForEach loop to shred the variable and concatenate it into one big string variable, called User::sql_statement, that is my select.
After setting up the loop and testing that the User::sql_statement variable populates correctly, I added a data flow with an OLE DB source and destination. I then went into the Advanced Editor for the source and set it to accept a SQL statement from a variable, using my User::sql_statement variable. I was forced to set the ValidateExternalMetadata option to False to avoid an error, since there is no way to validate the columns until the ForEach loop runs at run time.
Now, that's all fine and good, but here is my problem: at run time, when the package gets to the data flow task, the select statement doesn't seem to populate the columns of the data source. I have searched to no avail for a way to tell the data source to update its columns, but every time it gets there, the package bombs out, telling me the OLE DB source has no available output columns.
Specifically, the error I get is: [DTS.Pipeline] Error: "output "OLE DB Source Output" (6616)" contains no output columns. An asynchronous output must contain output columns.
Does anyone know how I can use a user variable in a SqlCommand in a DataReader Source with an ODBC connection as the source? I am storing a date value in a user variable (Date) that I fill with a SQL Task, and I then want to use that value in the SqlCommand of the DataReader Source. It won't let me use @variablename in the SQL command. Can anyone offer advice on how to make this work? I appreciate any help I can get. Thank you.
Hello, I have a ForEach Loop Container that does a select from a table, and I use Variable Mappings to map each row-set result to package variables in SSIS. This works fine. However, in the data flow I have a Script Component in which I need to access the values of the variables that were set in the ForEach Loop Container. How do I do this?
Perhaps I'm on the wrong track, but I guess I need to access the ADO object source variable set in the ForEach Loop Container? Through my own experimentation, I know that accessing the VariableDispenser collection only returns whatever default values happen to be assigned to the variables in the SSIS GUI (not the values assigned during each iteration of the ForEach Loop).
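A hedged pointer on the Script Component side: variables listed in the component's ReadOnlyVariables property surface as typed members of Me.Variables, and they are locked when the data flow starts executing, i.e. once per loop iteration, so they reflect the current iteration rather than the GUI defaults seen through the VariableDispenser. A sketch with a hypothetical variable name:

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Me.Variables holds the values locked for this execution of the
    ' data flow, i.e. the current ForEach iteration.
    Dim currentKey As Integer = CInt(Me.Variables.LoopKey)
    ' ...use currentKey against the row...
End Sub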
I am able to use a custom Script Task to receive an MSMQ package and save the package contents to a flat file.
I can also use the bulk load task to push the flat file contents into a SQL table.
However, I would like to save the package contents to a variable (done, it works) and then pass that string variable to a data flow task for SQL upload. In other words, I don't see any reason to persist the MSMQ package contents to disk.
My question is: which data flow source can I use that will accept a string variable? The string variable will then need to be processed with a bulk load or an Execute SQL task.
Btw, the content of the string variable is a CSV-style string.
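A Script Component source is one candidate: it reads whatever the script gives it, so the CSV-style string can come straight from the variable with no disk hop. A sketch with hypothetical names (User::MsmqBody listed in ReadOnlyVariables; Id and Name as output columns):

Public Overrides Sub CreateNewOutputRows()
    ' Turn the in-memory CSV text into typed pipeline rows.
    For Each line As String In CStr(Me.Variables.MsmqBody).Split( _
            New String() {vbCrLf}, StringSplitOptions.RemoveEmptyEntries)
        Dim fields() As String = line.Split(","c)
        Output0Buffer.AddRow()
        Output0Buffer.Id = CInt(fields(0))
        Output0Buffer.Name = fields(1)
    Next
End Sub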
I have to read data from an MDB file into SQL Server. Simple, but I need to place a literal into every row. In my OLE DB Source I would use:
SELECT '001' AS col1, foo, bar FROM Table1
But the problem is that I pass my value for col1 into the package from C# using Managed DTS.
So I set up my OLE DB Source using "SQL command from variable," and I have two variables: one to receive the value for col1 (col1), and one that is an evaluated expression holding a dynamic SQL statement (dynSql).
col1 is a String, as is dynSql. dynSql looks like this:
"SELECT " + @[col1] + " as col1, foo, bar FROM Table1"
Now, as long as I set the value of col1 to '001', that should work, right? Well, not exactly. The destination for col1 in SQL Server is a char(3), but the package makes the col1 column from the source a DT_WSTR, which causes the famed "Text was truncated or one or more characters had no match in the target code page." error.
My task is to read the file name from a database table and transfer the flat file's data into a table. In SSIS, I am able to fetch the file name using a DataReader Source, but how do I pass this file-name parameter to the Flat File Source?
In DTS I used an ActiveX script to pass the file name variable as flatfilecon.Source. Any help?
I am writing a package that will process delimited flat files that come in one of a few different versions. Within each flat file the number of delimited columns is the same, but each version of the file has a different number of columns. I have tried configuring the flat file data source to expect the version with the largest number of columns, but it then throws away rows that have fewer than this number of columns (warning: "There is a partial row at the end of the file").
Is it possible to use a single flat file data source that will work with all of the different width files?
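One approach that can absorb the varying widths, offered as a sketch: define the flat file connection with a single wide text column and no column delimiter, so every version parses row by row, then split the line in a synchronous Script Component (WholeLine, Col1 and Col2 are hypothetical column names):

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' WholeLine is the single input column; Col1/Col2/... are new columns
    ' added to the synchronous output. Assign only as many as the row has.
    Dim parts() As String = Row.WholeLine.Split(","c)
    Row.Col1 = parts(0)
    If parts.Length > 1 Then Row.Col2 = parts(1)
End Sub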