Urgent: How To Get Source Data With Runtime Variable
Feb 15, 2006
All,
I need to pass a variable to get source data from Oracle through an OLE DB source. The variable is defined by an Execute SQL task (a previous step) on the control flow, and it is a package-level variable.
In the data flow, the OLE DB source uses 'SQL command from variable'. The variable SQL command works when it is just: Select * from table
But I need the variable SQL command like this:
Select * from table where last_modified_date > ? -- variable1
where variable1 is defined by another Execute SQL task at runtime.
Can someone shed some light on this?
Many thanks
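For what it's worth, the 'SQL command from variable' access mode does not accept ? parameter markers, so a common workaround is to build the complete statement into the source variable before the data flow runs, either with a property expression on that variable or with a small Script Task. A minimal sketch of the Script Task route, where the variable names (User::LastModifiedDate and User::SourceQuery) and the table name are only placeholders:
Public Sub Main()
    ' Assumed variables: User::LastModifiedDate was filled by the earlier
    ' Execute SQL Task; User::SourceQuery is the variable the OLE DB Source reads.
    Dim lastModified As String = Dts.Variables("User::LastModifiedDate").Value.ToString()
    ' Concatenate the filter into the full statement (Oracle TO_DATE shown as an example).
    Dts.Variables("User::SourceQuery").Value = _
        "SELECT * FROM my_table " & _
        "WHERE last_modified_date > TO_DATE('" & lastModified & "', 'YYYY-MM-DD HH24:MI:SS')"
    Dts.TaskResult = Dts.Results.Success
End Sub
The same concatenation can also be done with an expression on User::SourceQuery so that the statement is rebuilt automatically whenever the date variable changes.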
View 1 Replies
Apr 12, 2007
Hiya guys,
The question:
Can I design data flows with an XML Source pointing to file-system XML files, but run them against variables? I can see the properties at design time (Accessmode=0,1,2; XMLDataVar="User::XMLData"), but the XML Source object doesn't appear to have expressions enabled to change this at runtime, nor do these properties seem to be exposed to configurations.
The scenario:
I have a package that reads through (potentially thousands of) XML files and, having identified the embedded message "type", transforms them via XSLT to an easier XML format (the XML Source can't cope well with multiple namespaces etc). It then sends the new file off to a type-specific data flow task to load its contents into the database.
For performance (and other) reasons I'd rather XSLT to a single shared variable and use this variable as the XML source in the data flows (rather than writing out to the file system [via XSLT] and immediately reading it back in each time [via the data source]).
But designing the data flow task using a variable as xml source is frustrating for a bunch of reasons - not the least being:
(a) The variable needs to be populated with sample XML data at design time, but this could only ever match one data flow at a time (fine at run time, but painful at design time, leading to validation errors popping up all over the place).
Yes, I could have a separate variable for each XML source, but that just makes things more complicated up front and also leaves me with 20+ large, type-specific XML variables floating around at runtime, of which only one is ever being used per import, which just doesn't feel right to me.
(b) Populating the variable with sample XML at design time is painful due to it being a string, which doesn't accept the hard returns etc. that are in the text files, design-time changes and so on.
Using a file XML source at design time is infinitely easier: when I get a design-time change, I can just amend the XSLT and run the appropriate task to produce a sample XML file to design against.
Any ideas?
Ta.
Gus.
View 2 Replies
View Related
Nov 29, 2006
Hello
I have a defined data source to an Oracle server. I've already installed the Oracle client and set up my data source to save the user and password. I'm using the .NET Provider/OracleClient Data Provider connection. When I click the "Test Connection" button, SSIS reports SUCCESS. On the Connection Manager tab I created one connection called "OracleServer" from my Oracle data source, described above.
In my package, I defined a DataReader Source task and specified "OracleServer" as its connection. I can preview data and view Oracle's column names..., so it makes me think that everything is fine. But when it executes the task it FAILS and reports a logon error saying that the password can't be blank.
Please help, I need to read from the Oracle server!
Thank you.
View 2 Replies
View Related
Sep 13, 2007
Please help figure out what is wrong with my code. The script is supposed to load a package (from file). The loaded package already has everything set up to run a query against a local server and output the results to an Excel file. The reason for the outer script is because I need to change the query based on a global variable. When the query changes, though, I think the existing dataflow Path is no longer valid, so I should remove it and re-create another one with the new input mappings. Here is my code, which runs and throws an exception at the AcquireConnections call.
The error is
Error: 0x2 at Script Task: The script threw an exception: Exception from HRESULT: 0xC020801B
I pieced together this code from the examples in the online books, but I am not sure what to do.
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Public Class ScriptMain
    Public Sub Main()
        '
        Dim app As Microsoft.SqlServer.Dts.Runtime.Application = New Application()
        Dim package As Microsoft.SqlServer.Dts.Runtime.Package = _
            app.LoadPackage("c:systimeExcelOutExcelOutExcelOutDo.dtsx", Nothing)
        Dim pkgVars As Variables = package.Variables
        Dim gsVar As Variable = pkgVars("User::gsExcelFile")
        Dim currVars As Variables = Dts.Variables
        Console.WriteLine(Dts.Variables("User::gsExcelFile").Value)
        gsVar.Value = Dts.Variables("User::gsExcelFile").Value
        pkgVars("User::gsQuery").Value = Dts.Variables("User::gsQuery").Value
        pkgVars("User::gsCreateTable").Value = Dts.Variables("User::gsCreateTable").Value
        Dim e As Executable = package.Executables("ExcelOutTask")
        Dim thMainPipe As Microsoft.SqlServer.Dts.Runtime.TaskHost = _
            CType(e, Microsoft.SqlServer.Dts.Runtime.TaskHost)
        Dim dataFlowTask As MainPipe = CType(thMainPipe.InnerObject, MainPipe)
        ' Get the source component.
        Dim SourceComponent As IDTSComponentMetaData90 = _
            dataFlowTask.ComponentMetaDataCollection("Local Source")
        Dim srcDesignTime As CManagedComponentWrapper = SourceComponent.Instantiate()
        srcDesignTime.ProvideComponentProperties()
        ' Reinitialize the metadata.
        srcDesignTime.AcquireConnections(vbNull)
        srcDesignTime.ReinitializeMetaData()
        srcDesignTime.ReleaseConnections()
        ' Get the destination component.
        Dim destination As IDTSComponentMetaData90 = _
            dataFlowTask.ComponentMetaDataCollection("Excel Destination")
        Dim destDesignTime As CManagedComponentWrapper = destination.Instantiate()
        destDesignTime.ProvideComponentProperties()
        ' Create the path.
        dataFlowTask.PathCollection.RemoveAll()
        Dim path As IDTSPath90 = dataFlowTask.PathCollection.New()
        path.AttachPathAndPropagateNotifications(SourceComponent.OutputCollection(0), _
            destination.InputCollection(0))
        'Console.WriteLine(dataFlowTask.PathCollection.Count)
        Dim ret As DTSExecResult
        ret = package.Execute()
        Console.WriteLine(ret.ToString)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class
View 3 Replies
View Related
Mar 20, 2007
Hi all,
I have several reports using a single shared data source. I want to change at runtime the database that is used by that data source. Can this be achieved? If not, what are the other solutions - I guess that using a non-shared data source for each report may be the solution (is it?), but it is not the best solution for me. My goal is to allow users to run the same set of reports, viewed in the ReportViewer control, but using different databases (depending on the connection string).
Thanks in advance for any suggestions
View 4 Replies
View Related
May 10, 2006
Hi -- I am fairly new to SSIS development, although I am starting to appreciate it more and more, especially since I have started getting into extending the object model. Here's my question:
I have a data flow that pulls data from any number of different delimited files with different numbers of columns. I have had no problem dealing with setting up run-time file locations and file names by using the expressions of a flat file data source, and I have been able to pretty easily deal with varying file delimiters by standardizing the files before they get into the data flow. However, I have not been able to come up with a solution that will allow my data source to discover its column info at run time and then pass that information on to the data flow task. All I really care about is having the flat file data source properly parse the individual rows into individual column data, because the data flow itself is able to discover the actual data that the columns hold at run time.
I would very much appreciate any feedback from anyone on possible solutions for this.
thanks!
View 1 Replies
View Related
Apr 18, 2008
Hi,
I have a Data Flow task in a Foreach loop which gets 100 source table names and 100 target table names...
I have a simple Data Flow task which transfers data from an OLE DB Source to an OLE DB Destination.
I am repeating the Data Flow task, which transfers from the source table name extracted by the loop to a destination table variable.
The problem I am getting is that for the first table it is able to transfer correctly, because I did the mapping for those tables at design time... but for the next source/destination table pair (which has a different number of columns and different data types) it gives "Validation failed"... and... needs to refresh metadata...
Is there any way to refresh the metadata of the Data Flow task at run time? (I set the OLE DB Source's ValidateExternalMetadata property to false, but the same error still comes up.)
Thanks
Radhika
View 4 Replies
View Related
Feb 23, 2008
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML data source to be an expression, as I will be looping through a directory of XML files.
I don't see the expression property or the connection property??
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
Thanks for any help or information.
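In case it helps others hitting this error: the XML data property holds the XML text itself, not a file name, which is likely why pointing it at @[User::filename] fails with "unable to read the XML data". Two common routes are (a) switch the source's access mode to "XML file from variable" and let the filename variable carry the path, or (b) keep "XML data from variable" and load the file contents into a string variable inside the Foreach loop with a Script Task. A minimal sketch of (b), where User::XMLData is an assumed variable name:
Public Sub Main()
    ' Assumed variables: User::filename comes from the Foreach loop,
    ' User::XMLData is the string variable the XML Source reads its data from.
    Dim path As String = Dts.Variables("User::filename").Value.ToString()
    Dts.Variables("User::XMLData").Value = System.IO.File.ReadAllText(path)
    Dts.TaskResult = Dts.Results.Success
End Sub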
View 3 Replies
View Related
Oct 18, 2007
Hello!
I'm using SQL Server 2000.
I have a variable which contains the name of another variable scoped in my stored procedure. I need to get the value of that other variable, namely:
DECLARE @operation VARCHAR(3)
DECLARE @parameterValue VARCHAR(50)
SELECT @operation='DIS'
CREATE table #myTable(value VARCHAR(20))
INSERT into #myTable values('@operation')
SELECT top 1 @parameterValue = value from #myTable
-- Now @parameterValue is assigned the string '@operation'
-- Here I need some way to retrieve the value of the @operation variable I declared before (in fact
-- another process writes into the table I retrieved the value from), in this case 'DIS'
DROP TABLE #myTable
I've tried several ways, but haven't succeeded yet!
Please tell me there's a way to solve my problem!!!
Thank you very much in advance!
View 7 Replies
View Related
Aug 16, 2007
We can pass XML to the XML Source in a variable, but I haven't seen anywhere how much data can be passed this way? Is there a limit beyond the limits of system memory?
Also, what data types are valid for the variable? Just String?
View 4 Replies
View Related
Nov 17, 2006
Hi!
This problem doesn't look like a problem, but I am not able to solve it...
I have a stored procedure (in an Execute SQL Task) that returns a few rows. These rows are written to an object variable. No problem.
Now I want to use this result set as a source in a data flow. How can I manage this?
1. Use a script task that returns the whole result set from the object variable? How?
2. Is there any other possibility to use the results of a stored procedure in the data flow?
All ideas are urgently welcome...
Torsten
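One approach that is often suggested for this: use a Script Component configured as a source, list the object variable under ReadOnlyVariables, and shred the ADO recordset into the output buffer with an OleDbDataAdapter. A rough sketch, where the variable name (ResultSet) and the output column names are only placeholders:
' Script Component (Source) - assumed ReadOnlyVariables: User::ResultSet
Imports System.Data
Imports System.Data.OleDb

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Dim table As New DataTable()
        Dim adapter As New OleDbDataAdapter()

        ' Fill a DataTable from the ADO recordset stored in the object variable.
        adapter.Fill(table, Me.Variables.ResultSet)

        For Each row As DataRow In table.Rows
            Output0Buffer.AddRow()
            ' Column names are placeholders for whatever the procedure returns.
            Output0Buffer.CustomerId = CInt(row("customer_id"))
            Output0Buffer.CustomerName = row("customer_name").ToString()
        Next
    End Sub
End Class
The output columns still have to be declared on the component at design time; only the rows themselves come from the variable.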
View 6 Replies
View Related
Apr 11, 2007
I'm working on the viability of using SSIS to process data files that are (currently) sent by email or FTP and processed by hand. They are .csv or Excel and have some columns that are required, some that are included but whose data is discarded, usually have some anomaly (like columns out of place, or columns missing) from source to source, and if they have headers, the actual headers are inconsistent from source to source (and sometimes from submission to submission). Some sources send only one file, others might send a dozen. Due to the nature of the sources - many are not very technically inclined and sometimes there are other factors involved in how the data is exported - there's not a lot I can do to leverage them into all using the exact same format. Creating an SSIS package for each source (over 100 in all) is not a viable solution.
The compromise that I've come up with is I would like to come up with a universal SSIS package that can:
Take a .csv file with headers (I think we can force the sources to include the right headers if nothing else) as input, with or without quote delimiters.
Not be sensitive to the order of the columns.
Not be sensitive to the number of columns.
Loop through multiple files from one source.
Place the output in a staging table for further processing.
Reading a .csv file, looping, and the staging table are not a problem. What I'm struggling with is the middle portion. I've tried to make the column headers dynamic, but then I lose the ability to have quote delimiters. My next instinct was to take the source file(s) and create one interim file with a different delimiter (such as pipe or ## or something), but that doesn't seem to work quite the way I want - since the source has comma delimiters, the output has both the pipe and the comma. Another way I could do it is to replace the commas within quote delimiters with a blank space in the data flow, which would eliminate the quote delimiters in the output. That I haven't tried yet.
Any suggestions? If I could bring down the hammer and tell every source "You must send it this way" I would, but my hands are tied.
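Not a full answer, but one route that side-steps the Flat File Source entirely is a Script Task that parses each file with Microsoft.VisualBasic.FileIO.TextFieldParser (which understands quoted, comma-delimited fields) and bulk-loads the rows into the staging table, mapping columns by header name so column order and count stop mattering. Connection string, table, and column handling below are placeholders, not a finished design:
Imports System.Data
Imports System.Data.SqlClient
Imports Microsoft.VisualBasic.FileIO

Public Sub LoadCsv(ByVal csvPath As String, ByVal connectionString As String)
    Dim table As New DataTable()

    Using parser As New TextFieldParser(csvPath)
        parser.TextFieldType = FieldType.Delimited
        parser.SetDelimiters(",")
        parser.HasFieldsEnclosedInQuotes = True

        ' First row: headers define the columns, whatever order or count they arrive in.
        For Each header As String In parser.ReadFields()
            table.Columns.Add(header.Trim())
        Next

        ' Remaining rows: load as-is; missing trailing fields stay null.
        While Not parser.EndOfData
            Dim fields As String() = parser.ReadFields()
            Dim row As DataRow = table.NewRow()
            For i As Integer = 0 To Math.Min(fields.Length, table.Columns.Count) - 1
                row(i) = fields(i)
            Next
            table.Rows.Add(row)
        End While
    End Using

    ' Push into the staging table, matching on column (header) names.
    Using bulk As New SqlBulkCopy(connectionString)
        bulk.DestinationTableName = "dbo.StagingTable"
        For Each col As DataColumn In table.Columns
            bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)
        Next
        bulk.WriteToServer(table)
    End Using
End Sub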
View 4 Replies
View Related
Dec 1, 2007
I am trying to set a variable to be passed to my asp page but the value is not carrying over. I don't even know if this is supposed to work or not below:
<td colspan="3" style="font-size: 10pt; vertical-align: top; color: <%=Div7fontcolor%>; background-color: <%=Div7bgcolor%>; text-align: center; width: 0px;" id="TopBox" runat="server">
The above Div7fontcolor and Div7bgcolor are variables that I set in the VB file as shown below:
Dim obConnection As SqlConnection = New SqlConnection("Data Source=localhost\sqlexpress;Initial Catalog=OrgBoard;Integrated Security=True")
Dim obCommand As SqlCommand = New SqlCommand("SELECT Divisions.*, Dept19.*, Dept20.*, Dept21.*, ConfigDisplay.* FROM Divisions CROSS JOIN Dept19 CROSS JOIN Dept20 CROSS JOIN Dept21 CROSS JOIN ConfigDisplay", obConnection)
obConnection.Open()
Dim dr As SqlDataReader = obCommand.ExecuteReader()
dr.Read()
Dim Div7bgcolor As String = dr("Div7color").ToString().Trim()
Dim Div7fontcolor As String = dr("Div7textcolor").ToString().Trim()
dr.Close()
obConnection.Close()
The web page runs fine with no errors but the value does not carry over and change the property correctly.
I am able to set the bgColor value of the TopBox within the VB file if I use "TopBox.BgColor = Div7bgcolor", but I can't set other values on my web page like the font color. If I can do this from the VB file then please correct me. Using variables within the page would allow me to set almost any value, but I don't even know if what I want is possible. Any help is greatly appreciated.
Thank you,
Kris
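For reference, one likely reason this pattern doesn't work: a cell marked runat="server" is compiled as a server control, and <%= %> render expressions are not evaluated inside a server control's attributes; on top of that, variables declared locally inside a method in the code-behind are not visible to the page at render time. A hedged alternative sketch is to set the styles from the code-behind directly, right after dr.Read(), using the TopBox id already present above (the column names are the ones from the reader):
' After dr.Read(), set the cell's styles directly instead of relying on <%= %>.
TopBox.Style("background-color") = dr("Div7color").ToString().Trim()
TopBox.Style("color") = dr("Div7textcolor").ToString().Trim()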
View 1 Replies
View Related
Dec 21, 2006
I am able to use a custom script task to receive a MSMQ package and save the package contents to a flat file.
I can also use the bulk load task to push the flat file contents into a SQL table.
However, I would like to save the package contents to a variable (done, it works), and then pass that string variable to a data flow task for SQL upload. In other words, I don't see any reason to persist the msmq package contents to disk.
My question is: Which data flow source can I use that will accept a string variable? The string variable will then need to be processed with bulk load or an execute sql task.
Btw, the content of the string variable is a csv style string:
"01001","11/21/2006",15
"01001","11/21/2006",1
"01001","11/21/2006",25
"01001","11/21/2006",3
Thanks,
Trey
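A sketch of one way to do this without touching disk: a Script Component source that splits the string variable into rows and fields and writes them to the output buffer. The variable name (MessageBody) and the three output columns are assumptions for illustration:
' Script Component (Source) - assumed ReadOnlyVariables: User::MessageBody
Public Overrides Sub CreateNewOutputRows()
    Dim lines As String() = Me.Variables.MessageBody.ToString().Split( _
        New String() {vbCrLf, vbLf}, StringSplitOptions.RemoveEmptyEntries)

    For Each line As String In lines
        ' Strip the quote delimiters and split on the commas.
        Dim fields As String() = line.Replace("""", "").Split(","c)
        Output0Buffer.AddRow()
        Output0Buffer.StoreCode = fields(0)
        Output0Buffer.TradeDate = DateTime.Parse(fields(1))
        Output0Buffer.Quantity = CInt(fields(2))
    Next
End Sub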
View 3 Replies
View Related
Feb 1, 2006
I do not know the Excel file name to load at design time.
I would like to pass the value to a variable in the package at run time.
How do I do this?
Thanks,
Guangming
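One lightweight sketch: keep the file name in a package variable (set it with /SET on dtexec, a configuration, or a Foreach loop) and point the Excel connection manager at it, either through a ConnectionString property expression or from a Script Task as below. The connection manager name, variable name, and provider string are assumptions:
Public Sub Main()
    ' Assumed names: connection manager "Excel Connection Manager", variable User::ExcelFile.
    Dim excelFile As String = Dts.Variables("User::ExcelFile").Value.ToString()
    Dts.Connections("Excel Connection Manager").ConnectionString = _
        "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & excelFile & _
        ";Extended Properties=""Excel 8.0;HDR=YES"";"
    Dts.TaskResult = Dts.Results.Success
End Sub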
View 3 Replies
View Related
Feb 26, 2007
I am writing a package that will process delimited flat files that will come in one of a few different versions. Within each flat file, the number of delimited columns will be the same, but each version of the file has a different number of columns. I have tried configuring the flat file data source to expect the version with the largest number of columns, but it will then throw away rows that have less than this number of columns (warning: There is a partial row at the end of the file).
Is it possible to use a single flat file data source that will work with all of the different width files?
View 1 Replies
View Related
May 3, 2007
In SSIS, in the SQL task we have the option to pass a parameter or variable. But in the Data Flow Task, when we use the DataReader Source with an ADO.NET connection, there is no option to pass a parameter or variable, or to receive one.
I have a query where I need to pass a parameter in the SQL task... and the DataReader Source has to receive this parameter from the SQL task.
The SQL task finds the value of the parameter and passes it to the DataReader Source in the Data Flow Task...
Please can anyone help me solve this problem of receiving a parameter or variable in the DataReader Source using an ADO.NET connection in the Data Flow Task? Thank you, dilsa
View 3 Replies
View Related
Jan 12, 2007
Hello,
Please can anyone tell me if or how I can cross-reference a user-defined variable from a script task in the SQL of a Data Reader Source?
The script task creates a variable for the last Sales Ledger Session. The SQL in the Data Reader Source updates the DW sales invoice lines file based upon those invoice lines where the session number is greater than the value from the script task.
The SQL command in the custom properties doesn't cross reference back to the variable.
If anyone can help, or needs further detail in order to help, please post a response.
Thanks,
Marcus Simpson
View 4 Replies
View Related
Apr 3, 2007
Thanks to anyone who can give me some help.
I am trying to transfer some table data from one database server to another database server. I created a package in SSIS, and I use a variable to pass each table name. In the data flow, I use an OLE DB Source, but I cannot set the Data access mode to "Table name or view name variable". Every time, I get the following error: "===================================
Error at Data Flow Task [OLE DB Source [31]]: A destination table name has not been provided.
(Microsoft Visual Studio)
===================================
Exception from HRESULT: 0xC0202042 (Microsoft.SqlServer.DTSPipelineWrap)
------------------------------
Program Location:
at Microsoft.SqlServer.Dts.Pipeline.Wrapper.CManagedComponentWrapperClass.ReinitializeMetaData()
at Microsoft.DataTransformationServices.DataFlowUI.DataFlowComponentUI.ReinitializeMetadata()
at Microsoft.DataTransformationServices.DataFlowUI.DataFlowAdapterUI.connectionPage_SaveConnectionAttributes(Object sender, ConnectionAttributesEventArgs args)".
Can someone tell me what the reason is, or give me some examples?
Thanks in advance.
View 7 Replies
View Related
May 20, 2007
I'm trying to create a custom data flow transformation. I need to set the value of a user variable which will be used by a downstream component (for example, a data flow destination) at runtime. Is this possible? If possible, in which method should I do that? Thanks.
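As far as I know this is possible, with the caveat that data flow components can only safely write variables back in PostExecute, so the value is not visible to other components until the data flow finishes. A rough sketch of the pattern inside a component, assuming an integer package variable named RowCount and a counter accumulated while rows are processed:
' Assumed: the package has an Int32 variable User::RowCount.
Private _rowCount As Integer = 0

Public Overrides Sub PostExecute()
    Dim vars As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
    Me.VariableDispenser.LockOneForWrite("RowCount", vars)
    vars("RowCount").Value = _rowCount
    vars.Unlock()
End Sub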
View 4 Replies
View Related
Nov 12, 2007
I am using an Execute SQL task to run a stored procedure in an Oracle database which returns a result set. This works. Now I need to send the output to a destination table in a SQL database. Should I use a Foreach loop to pick up the result set and insert it into the destination row by row (which I don't think is a great idea), or is there a better way to accomplish this task (in a data flow task)?
When I use a data flow task instead of the Execute SQL task, the main issue is that I am not able to see the output columns when I execute an Oracle stored procedure, although the preview does show the result set. But I can see the output columns for a SQL Server stored procedure.
View 9 Replies
View Related
Aug 23, 2006
Hi,
I've seen a number of posts similar to this but I still cannot figure out what I need to do to get it working. So here goes with a couple of newbie questions.
Question 1:
Once created, how do I go about executing an SSIS package? I want to be able to call it from a C# application from which I pass in a couple of parameters.
Question 2:
How do I go about setting the file path of my Excel source to a dynamic value passed at runtime? I want to be able to loop through a number of Excel files and do some processing on them. I've set up a variable (which I think I need to do), but after that I get stuck. Some other posts suggest package configurations, but I cannot get my head around how they work.
Any help on this matter would be gratefully received.
Thanks in advance,
Grant
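On question 1, a bare-bones sketch of running a package from managed code and passing values in through package variables (shown in VB to match the script earlier in the thread; the same Microsoft.SqlServer.Dts.Runtime calls work from C#). The path and variable names are invented for the example; question 2 can then be handled by an expression on the Excel connection manager that reads the same variable:
Imports Microsoft.SqlServer.Dts.Runtime

Public Sub RunPackage()
    ' Requires a reference to Microsoft.SqlServer.ManagedDTS.dll.
    Dim app As New Application()
    Dim pkg As Package = app.LoadPackage("C:\Packages\ProcessExcel.dtsx", Nothing)

    ' Pass "parameters" in by setting package variables before executing.
    pkg.Variables("User::ExcelFile").Value = "C:\Data\January.xls"
    pkg.Variables("User::BatchId").Value = 42

    Dim result As DTSExecResult = pkg.Execute()
    Console.WriteLine(result.ToString())
End Sub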
View 5 Replies
View Related
Oct 19, 2007
Hi,
I am relatively new to SSIS.
Please advise me on how to change the value of a variable during runtime,
i.e. the user should be able to key in the value.
Regards,
Jiju
View 3 Replies
View Related
Mar 20, 2008
Hi,
1 20031012121212 200 (recordtype, CreationDateTime(YYYYMMDDHHMISS), Rec_Count) -- Header records
2 ABCD, XYZ, 9999, 999999999, 1234 ---- Detailed Record
2 ABCD, XYZ, 9999, 999999999, 1234 ---- Detailed Record
For the above sample data I have two outputs at the Conditional Split (based on the record type). I want to store the 1st record's datetime value in a variable and then use that variable value for the 2nd and 3rd rows.
Basically, the detailed records would be stored in the database, not the header record. Is there any way I can use the variable while processing the 2nd and 3rd records?
Please suggest me.
Regards,
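Writing a package variable mid-data-flow and reading it back for later rows in the same flow doesn't generally work (the value only surfaces after the data flow completes), so one workaround people use is a Script Component transformation that remembers the header value in a member field and stamps it onto every row as an extra column. A sketch under the assumption that rows arrive in order (header first), with an input column RecordType and an added output column BatchDateTime; all names here are placeholders:
' Script Component (Transformation) - column names below are assumptions.
Private _creationDateTime As String = String.Empty

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    If Row.RecordType = "1" Then
        ' Header row: remember the creation timestamp for the rows that follow.
        _creationDateTime = Row.CreationDateTime
    End If
    ' Every row carries the last header timestamp; a downstream Conditional Split
    ' can still route only the detail rows (RecordType = "2") to the destination.
    Row.BatchDateTime = _creationDateTime
End Sub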
View 10 Replies
View Related
Feb 28, 2007
Greetings all,
Apologies if this question has been asked in the past, but how do I display the value of a variable during runtime?
Thanks for your help in advance.
View 4 Replies
View Related
Dec 20, 2005
Hi,
I want to transform an XML flow to an HTML flow. For this, I create an XmlDataDocument, add to it all that I want, and store it in a file (in a data flow task).
After that, in an xml task, I set the input properties like that:
Operation type: xslt
SourceType: File connection
Source: ras.xml
All works fine. But now, I want to use a variable to store my xml data. So, instead of storing my data in ras.xml, I store it in a variable like that:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the xml task, I set configuration like that:
Operation type: xslt
SourceType: Variable
Source: user::RAS
But when I execute, I got this following exception:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.".
Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I debug it by replacing the XML task with a script task, I can see that my RAS variable contains the XML data, which is well formed.
<?xml version="1.0" encoding="UTF-8"?>
<root> …
</root>
(When I store it in a file, I can see it in IE)
Have you got an idea ?
View 3 Replies
View Related
Nov 14, 2007
Is there a quick and easy way, other than scripting a MsgBox(), to show the value of a package variable while the package is running?
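Apart from breakpoints and the Locals/Watch windows in BIDS, one scripted alternative to MsgBox() is to raise an information event, which shows up in the Progress tab and in any configured log without blocking execution. A small sketch (the variable name is just an example):
Dim fireAgain As Boolean = True
Dts.Events.FireInformation(0, "Debug", _
    "User::MyVariable = " & Dts.Variables("User::MyVariable").Value.ToString(), _
    String.Empty, 0, fireAgain)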
View 3 Replies
View Related
Jan 10, 2007
I have a number of DTS packages I am trying to convert and am totally new to Integration Services.
I have a flat file that I reference to get a date that I use in some SQL statements in the package. It seems like the best way to do this would be to create a variable to hold the date and set it at run time from the flat file. I've been searching the documentation and the forum and can't figure out how to do this. This is so basic that I feel like an idiot putting it on the forum, but I need some help. This whole Integration Services thing is a big shift from DTS (which I eventually became pretty good at). Now I'm back to ground zero.
Any help would be appreciated. If you can direct me to any basic training resources that would be cool too.
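If it helps, one simple way to do that step is a Script Task at the start of the control flow that reads the date out of the flat file and drops it into a package variable, which the later SQL statements can then reference. The file path, the assumption that the date is on the first line, and the variable name are only placeholders:
Public Sub Main()
    ' Assumed: the first line of the file holds the date, and User::RunDate is a String variable.
    Dim firstLine As String = System.IO.File.ReadAllLines("C:\imports\rundate.txt")(0)
    Dts.Variables("User::RunDate").Value = firstLine.Trim()
    Dts.TaskResult = Dts.Results.Success
End Sub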
View 7 Replies
View Related
Feb 15, 2008
Hello
In our company, after switching to Office 2007, Access 2007 runtime is installed on our general user machines.
Everything is just fine and beautiful, except that there is one issue with runtime:
Several forms in my Access project use stored procedures in SQL Server 2000 as their record source. When I try to open the forms in Access 2007 Runtime on user machines, I receive an error that "The record source dbo.Myprocedure specified on this form or report does not exist".
I use the dbo.ProcName format, and the forms open correctly in my full version of Access.
What could be the problem?
Thanks!
Elman
View 10 Replies
View Related
Mar 13, 2008
Hi,
I am trying to build a drill-through report (RDLC).
I have written the following code in the Drillthrough event of the ReportViewer; whenever I click on the first report I get an error like:
An error has occurred during report processing.
A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".
The code is:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;
public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = new DataSet();
        ds = obj.get_order();
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues =
            e.Report.GetParameters();
        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }
        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = new DataSet();
        ds = obj.get_orderdetail(order_id);
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}
The code in method get_orderdetail(order_id) is:
public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    DataSet ds = new DataSet();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return (ds);
}
Please help me.
View 1 Replies
View Related
Feb 2, 2008
Hi
In this code, how can I create a new data source, data source view, mining model, and mining structure so that it runs dynamically?
In this code I have a lot of errors; they are about the server and database, which are not defined in the current code.
In this code, should I first define the server or not?
Database dbNew = new Database (databaseName,
Utils.GetSyntacticallyValidID(databaseName, typeof(Database)));
srv.Databases.Add(dbNew);
dbNew.Update(true);
***********************************************************
How can I create the data source, data source view, mining model, and mining structure?
Please show the code for that and guide me.
databaseName and srv are unknown.
Do I need to add another reference for Analysis Services?
Please explain these code fragments:
************************************************************************
1)
RelationalDataSource dsNew = new RelationalDataSource(
datasourceName,
Utils.GetSyntacticallyValidID(
datasourceName,
typeof(RelationalDataSource)));
db.DataSources.Add(dsNew);
dsNew.ConnectionString = connectionString;
dsNew.Update();
2)
RelationalDataSourceView rdsv;
rdsv = db.DataSourceViews.Add(
datasourceviewName,
Utils.GetSyntacticallyValidID(
datasourceviewName,
typeof(RelationalDataSourceView)));
rdsv.DataSourceID = ds.ID;
***************************************************************
3)
OleDbConnection cn = new OleDbConnection(ds.ConnectionString);
OleDbCommand cmd = new OleDbCommand(
"SELECT * FROM [" + tableName + "] WHERE 0=1", cn);
OleDbDataAdapter ad = new OleDbDataAdapter(cmd);
DataSet dss = new DataSet();
ad.FillSchema(dss, SchemaType.Source);
*************************************************************
4)
// Make sure we have the name we thought
dss.Tables[0].TableName = tableName;
// Clone here - the original DataTable already belongs to a DataSet
rdsv.Schema.Tables.Add(dss.Tables[tableName].Clone());
rdsv.Update();
5)
MiningStructure ms = db.MiningStructures.Add(miningstructureName, Utils.GetSyntacticallyValidID(miningstructureName,
typeof(MiningStructure)));
ms.Source = new DataSourceViewBinding(dsv.ID);
ms.CaseTableName = "Customer";
Add columns:
ScalarMiningStructureColumn smsc;
// From table "Customer" we will add a couple of columns
// CustomerID - key
smsc = new ScalarMiningStructureColumn("Customer ID",
Utils.GetSyntacticallyValidID("Customer ID", typeof(ScalarMiningStructureColumn)));
smsc.IsKey = true;
smsc.Content = "Key";
smsc.KeyColumns.Add("Customer", "customer_id", OleDbType.Integer);
ms.Columns.Add(smsc);
*******************************************
6)
MiningModel mm = ms.MiningModels.Add(miningmodelName,
Utils.GetSyntacticallyValidID(miningmodelName,
typeof(MiningModel)));
mm.Algorithm = "Microsoft_Decision_Trees";
mm.Parameters.Add("COMPLEXITY_PENALTY", 0.3);
MiningModelColumn mc = new MiningModelColumn("Customer ID",
Utils.GetSyntacticallyValidID("CustomerID",
typeof(MiningModelColumn)));
mc.SourceColumnID = ms.Columns["Customer ID"].ID;
mc.Usage = "Key";
mm.Columns.Add(mc);
mm.Update();
Please say exactly what I need.
Thanks a lot for your answer.
Please don't move this question because I don't know where I should post it.
View 1 Replies
View Related
Mar 2, 2007
I have set up a new connection as a connection from data source, but I cannot see how to use this connection to create my Data Flow Source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14 - 15 minutes. The same process in Access using SQL on a linked table via DSN takes 45 seconds.
Have I missed something in my set up of the OLE DB source / connection? Will a DSN source be faster?
Thanks in advance
ADG
View 2 Replies
View Related
Jul 31, 2015
I have a report in which I have a combination of matrix and table data regions.
The problem I am facing is that the data tables don't remain fixed in their position and tend to move down.
E.g., table1 and table2 are on the same page at design time, side by side (right and left); however, during runtime table1 is pushed down while table2 stays in its position.
Now how can I keep them all fixed in the same position? Most of the tables have fixed-size rows, and the ones with taller rows have been placed at the end. What settings can we set?
View 6 Replies
View Related