Peculiar Behavior In The Act Of Setting RS Parameter Values...
Jan 21, 2008
We're trying to get a better understanding of how RS behaves when parameters are being set. We see quirky behavior that is a little difficult to describe. Right now we assume that if the revolving green circle (with the phrase "Report is being generated" beneath it) doesn't appear, the report really wasn't rendered properly, even if the report region changes.
One peculiarity that seems pretty consistent is on reports we've prototyped with "from" and "to" date parameters. When we set one date (it doesn't matter which is first), things progress normally, i.e. no "report clearing event" occurs as a result of setting cursor focus in the calendar control and changing its value; the report region doesn't change from what was shown previously. But trying to set focus on the second date (it doesn't matter whether it's the "from" date or the "to" date, just that it's the second date being set) always seems to trigger some kind of event that 1) doesn't allow focus to be on that text box, and 2) blanks out the report region, including headings. Only after this "event" occurs can we set focus on the second date, change the value, and click the "View Report" button for re-rendering.
We see similar types of behavior with other kinds of parameters, including multi-value dropdowns and booleans. The toughest part of this is trying to explain it to our users. On some parameters, the event occurs every time they are changed; on other parameters, it appears to occur only if another parameter was changed beforehand.
I believe we've even seen headings rendered with no data, leading us to think temporarily that no rows were returned, only to find out by clicking the "View Report" button that there really was data to be reported based on the current filters. Unfortunately, I can't reproduce this scenario when I want to.
Rather than the real code, here's a sample we came up with.
Here's the C# code:

public class sptest : System.Web.UI.Page
{
    protected System.Web.UI.WebControls.Label Label1;
    private DataSet dtsData;

    private void Page_Load(object sender, System.EventArgs e)
    {
        // Put user code to initialize the page here
        string strSP = "sp_testOutput";
        SqlParameter[] Params = new SqlParameter[2];
        Params[0] = new SqlParameter("@Input", "Pudding");
        Params[1] = new SqlParameter("@Error_Text", "");
        Params[1].Direction = ParameterDirection.Output;
        try
        {
            this.dtsData = SqlHelper.ExecuteDataset(ConfigurationSettings.AppSettings["SIM_DSN"], CommandType.StoredProcedure, strSP, Params);
            Label1.Text = Params[0].Value.ToString() + "--Returned Val is" + Params[1].Value.ToString();
        }
        //catch (System.Data.SqlClient.SqlException ex)
        catch (Exception ex)
        {
            Label1.Text = ex.ToString();
        }
    }
}
Here is the stored procedure:
CREATE PROCEDURE [user1122500].[sp_testOutput]
    (@Input nvarchar(76), @Error_Text nvarchar(10) OUTPUT)
AS
SET @Error_Text = 'Test'
GO

When I run this, it prints the input variable, but not the output variable.
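For what it's worth, a common cause of an empty output value here is that ADO.NET wants an explicit type and size on variable-length output parameters; a minimal sketch of that change, keeping the names from the post and assuming the same SqlHelper call:

// declare the output parameter with its SqlDbType and size so the value can be marshalled back
Params[1] = new SqlParameter("@Error_Text", SqlDbType.NVarChar, 10);
Params[1].Direction = ParameterDirection.Output;

// after SqlHelper.ExecuteDataset returns, read the value the procedure assigned
string errorText = Params[1].Value == null ? "" : Params[1].Value.ToString();
Label1.Text = Params[0].Value.ToString() + " -- Returned Val is " + errorText;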
I have a report with a subscription enabled, and the default values that are selected for the report frequently change. I have our report server locked down so that the users can't change the defaults, but I now want to empower them to maintain this on their own. Here is my dilemma: when you have the available parameters set up to pull from a query, the defaults on the report server have to be keyed in manually, which is not an option. The only way to get a check box there is to explicitly specify the available values. I need my available values to be database driven, and I need to be able to select my defaults on the report server using check boxes.
I have a dimension with a hierarchy. In this hierarchy it is possible for a non-leaf member to get loaded with a fact measure.
Using a sales organization as an example, let's say (1) Bill and Ted are sales reps, and they both report to John. (2) John, not only manages Bill and Ted's sales, but he also makes sales of his own. (3) Bill sells 10 units, Ted sells 8 units, and John sells 5 units.
In this example, John is the non-leaf member because he is above Bill and Ted in the hierarchy. When this data is aggregated in the hierarchy, we want to make sure that 10 for Bill plus 8 for Ted do not overwrite the 5 for John. Instead, we want to include the 5 for John with Bill and Ted's total of 18, thus getting a total of 23 for John.
Here is my question: what is the default aggregation behavior? The behavior I get is that the non-leaf data is overwritten by the sum of the leaf data, so in the example above I get 18 for John, not 23.
Now, from my research, I think the MembersWithData setting on a dimension attribute has something to do with this, but I can't nail it down.
I would like to know what happens when a very large reference data set for a Lookup transform with full caching enabled is being loaded during package execution and the computer's memory runs out or is very low. Does SSIS a) give an out-of-memory error of some sort, b) resort to a no-caching or partial-caching mode, or c) maintain full caching but switch to using the paging file (virtual memory)?
I think it will resort to using the page file, in which case the benefits of in-memory lookups are lost and performance would suffer. If I cannot upgrade the memory or shrink the reference set somehow, I should switch that Lookup to partial caching or no caching with an indexed lookup table. Would this make sense?
I've seen many entries about trailing spaces but have not found one like this.
In the Control Flow I am using an "Execute SQL Task" to populate some SSIS local variables (type string) by: (1) executing a SQL stored proc with output variables (type varchar(100)) to (2) be mapped to the local variable name (the parameter mapping Data Type is VARCHAR).
One of these mapped outputs is used as a path for a subsequent operation in the Control Flow. At execution the sproc fires, populating the local variable with the path, but with trailing spaces out to 255 characters. Later, in the Script Task where that path is used, I receive an error telling me that the path is too long, and something about 260 or 246 characters.
Here's the oddity. I have two desktop environments running XP and a server environment (Server 2003). This package runs just fine on the server: no trailing-space issue, no need to trim. But on both my desktops I get the errors. By adding Trim statements I can get back the correct path, but varchars should not include trailing spaces, and the sproc return variable is a varchar(100).
I know this sounds like numerous other posts which indicate the solution is to trim, but the question I am asking is why does it work on the server but not the desktop? Is the SSIS string variable type experiencing a bug on different OSs? Not to further complicate the issue, but it used to work on my laptop too; through a horrible sequence of events I had to reload the studio, at which point the error started to happen there as well.
I ran into a variety of problems trying to set a script task breakpoint in a package containing multiple script tasks. The debugger apparently treats the breakpoint as if it were set in ALL tasks in the package, not just the one in which it is actually set.
At best this results in hitting breakpoints in scripts where they are not set and at worst the debugger brings up the "Send error report" dialog and quits (while the package continues to run). The latter seems to happen most often when the script with the breakpoint has more lines than an earlier script and the breakpoint is set at a line number that exceeds the number of lines in the earlier script--it bombs when the earlier, shorter script starts.
To get the debugger to work under these circumstances I had to add some nonsense code like

While False
    Dim i As Integer = 0
End While

to every script, at the same line number near the beginning of the script (line 40, for example). I then set a breakpoint on the middle statement in one of the scripts (it doesn't matter which) to cause the debugger to open at runtime. It doesn't hit the breakpoint because the line is never executed. If the breakpoint is set on a line that can be executed in any script in the package, bad things tend to happen. I then add a "Stop" statement to the script that I want to debug. This only works if the debugger is already open, hence the dummy breakpoint above. This workaround is usable, but I am debugging a package that has quite a few scripts, and having to insert dummy code in all of them at a fixed line number is rather inconvenient. I would really like to see breakpoints work the way one would expect: only in the scripts where they are set. Is there some other, easier way around this problem? Is there at least an easier way to get the debugger to open so that "Stop" will work?
I have parameters in my report. The user can choose the year, month and date (3 parameters). Now I want to set default values for the parameters, so that the user sees the report, for example, for the current day without selecting the parameters. I tried setting the type of the parameters to DateTime and the default value, for example for the year, to "=Today().Year". But when I execute the report an error occurs, something like: no valid value for this parameter.
My attributes for the year, month and date come from an Analysis Services cube, from a Server Time dimension. Does somebody know how to make it possible to set default values for these parameters?
Another question:
Does somebody know how I can reduce the values available for a parameter? For example, I have a parameter "year" from a server time dimension of a cube. The values which are available are "Year 2004", "Year 2005", "Year 2006", "Year 2007", but I want the user to be able to choose only "Year 2006" or "Year 2007", and not every year or "All". Another example: the user should only be able to choose a date that is in the past or today, but not a date in the future.
Is it possible to fill a parameter list with values based on another parameter value? Here's what I have so far (which hasn't worked). I'd like to generate a report listing information for a student. The report viewer would first select a school from the first drop-down menu, and then the second drop-down menu would populate with the list of students at that school. I have a dataset that calls a sp which returns a list of schools (SchoolID and SchoolName fields from the database table). I have another dataset that calls a sp (with SchoolID as the parameter) which returns a list of students for that school. Both datasets return the appropriate data when tested individually, but when I set up the Report Parameters and build the report, these errors come up: "The value expression for the query parameter '@SchoolID' refers to a non-existing report parameter 'SchoolID'." and "The report parameter 'Student' has a DefaultValue or a ValidValue that depends on the report parameter 'SchoolID'. Forward dependencies are not valid." Is it possible for the report to generate a list of available parameter values based on the value selected for another parameter? Any help you can give me would be great! Thank you.
I have my stored procedure set to Territory_code IN (@Territory). Now, how do I enter more than one value? When I select the multi-value check box, it gives me more spaces, but then it doesn't recognize the values when I put in more than one. Am I doing something wrong?
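For what it's worth, a stored procedure can't take a multi-value selection directly in IN (@Territory); the usual workaround is to pass the selections as one delimited string (for example with =Join(Parameters!Territory.Value, ",") on the report side) and split it again inside the procedure. A sketch, with a made-up split function:

CREATE FUNCTION dbo.SplitList (@list varchar(max), @delim char(1))
RETURNS @items TABLE (value varchar(100))
AS
BEGIN
    DECLARE @pos int
    WHILE LEN(@list) > 0
    BEGIN
        SET @pos = CHARINDEX(@delim, @list)
        IF @pos = 0
        BEGIN
            INSERT INTO @items (value) VALUES (LTRIM(RTRIM(@list)))
            SET @list = ''
        END
        ELSE
        BEGIN
            INSERT INTO @items (value) VALUES (LTRIM(RTRIM(LEFT(@list, @pos - 1))))
            SET @list = SUBSTRING(@list, @pos + 1, LEN(@list))
        END
    END
    RETURN
END
GO

-- then, inside the stored procedure:
-- WHERE Territory_code IN (SELECT value FROM dbo.SplitList(@Territory, ','))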
I have a parameter (hidden) that gets its value using an expression based on another parameter. In the designer, the first time the designer loads I can select the parameter that controls the child parameter (the expression lies in the default value section), and the value changes.
When I change the parent parameter again, the value of the child parameter does not seem to change.
How can I make this parameter change automatically when the parent is changed?
If I have a SELECT statement like this in my C# code: SELECT * FROM foods WHERE foodgroup IN (@foodgroup), and I want @foodgroup to have the values "meat", "dairy", "fruit", what is the correct way to add the parameter? I tried meat, dairy, fruit and 'meat', 'dairy', 'fruit', but neither worked. Is this possible?
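For reference, a single parameter can't hold a comma-separated list for IN; the usual approach is one parameter per value, with the IN list built from the generated parameter names. A sketch (the @fg0, @fg1, ... names are just generated here; attach a SqlConnection and execute as usual):

// assumes: using System.Data.SqlClient; using System.Collections.Generic;
string[] groups = new string[] { "meat", "dairy", "fruit" };
List<string> names = new List<string>();
SqlCommand cmd = new SqlCommand();
for (int i = 0; i < groups.Length; i++)
{
    string name = "@fg" + i;                      // @fg0, @fg1, @fg2 ...
    names.Add(name);
    cmd.Parameters.AddWithValue(name, groups[i]); // one parameter per value
}
cmd.CommandText = "SELECT * FROM foods WHERE foodgroup IN ("
    + string.Join(", ", names.ToArray()) + ")";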
Pardon the newbie question, but I'm trying to load a dimension table in a small data mart that has columns in it that are unique to the dimension and not sourced from any source table. Two of those columns are date columns that I want to default to the system date, and the other column I want to load with a default value. I can't figure out how to do this within a data flow task. The source columns flow from the input DB source into an SCD transform, but I can't seem to edit the columns in the target table if they don't actually come from a data source. There doesn't seem to be a data transform object to handle this.
I am trying to use the code below to set a FormView datasource parameter in the Page_Load of a user control (ascx file):

public void Page_Load(object sender, EventArgs e)
{
    formview_datasource.SelectParameters.Add("@department_id", "e62bbc7d623f44a68e101cba90e839s3");
}

However, I am getting the following error: Exception Details: System.Data.SqlClient.SqlException: Must declare the variable '@department_id'. So it would seem that Page_Load in my user control isn't being called? I'm not sure why, or how to work around it. Has anyone else experienced anything like this, or can you give me some pointers on where I am going wrong? Thanks, Brad
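For what it's worth, one thing to check: SqlDataSource-style parameters are normally named without the '@' prefix, so adding "@department_id" creates a parameter that never lines up with the @department_id placeholder in the SelectCommand, which can produce exactly this "Must declare the variable" error. A sketch of the change (the GUID value is just the one from the post):

protected void Page_Load(object sender, EventArgs e)
{
    // name the parameter without the leading '@'
    formview_datasource.SelectParameters.Add("department_id", "e62bbc7d623f44a68e101cba90e839s3");
}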
I have a question regarding the Execute SQL Task. I want to combine one statement like SELECT COUNT(*) FROM the destination table with INSERT INTO the error table (source count, error count, destination count) VALUES (?,?,?).
If I use two Execute SQL Tasks it works, but I was wondering whether it is possible to combine the SELECT and the INSERT into one Execute SQL Task direct input. How would I approach this?
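If it helps, the two can usually be folded into a single INSERT ... SELECT for the task's direct input, something like this (table and column names are placeholders; the two ?s would still be mapped to the source-count and error-count variables in Parameter Mapping):

INSERT INTO dbo.ErrorLog (SourceCount, ErrorCount, DestinationCount)
SELECT ?, ?, COUNT(*)
FROM dbo.DestinationTable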
We have an issue with performance in a SQL Server database.
Scenario & issue: we have delivered a .NET application to our client. This application is installed on a newly built Windows 2003 server.
The client is facing performance issues with the application. Compared with the performance on the development server, the performance of the production server is very poor.
Even when we execute the stored procedures directly in the back end, the performance is poor on the production server.
Example: a stored procedure that takes 16 seconds on the development server takes 17 minutes for the same parameters. The time remains the same even for a hot (cached) execution.
System Info:
Database Version - SQL Server 2005
Database Size - 120 plus GB
OS Platform - Windows 2003
Database Load - 50 users
CPUs - 4
RAM - 8 GB
The OS is clustered (failover clustering).
Points to Note:
1. There is a huge table with 250 million rows (this table alone takes up to 60 GB).
2. The huge table is partitioned (SQL Server 2005 table partitioning) and placed in 20 different filegroups (.mdf files).
3. The .mdf files are placed on a SAN and the .ldf is on a local hard disk.
4. Dynamic queries are used in a few instances for performance benefits.
Questions:
1. Any thoughts on why this kind of performance issue arises?
2. The client DBA wants us to clear the data and procedure cache before executing the stored procedure and then test the performance.
Would this be the case in a production scenario?
3. Will the performance change based on the input parameters?
4. The client DBA has also stated that a report server that pings the production database server is the cause of frequent clearing of the SQL Server cache.
When does SQL Server actually clear its cache memory? Is there any way to control it?
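For reference, when a DBA asks for that kind of cold-cache test (question 2), the commands usually involved are the ones below; they flush the caches server-wide, so they are really only appropriate on a test system:

CHECKPOINT                  -- write dirty pages to disk so the buffer pool can be emptied
DBCC DROPCLEANBUFFERS       -- empty the data (buffer) cache
DBCC FREEPROCCACHE          -- empty the plan / procedure cache

Whether that reflects production depends on how often the cache is actually being evicted there; comparing the execution plans of the 16-second and 17-minute runs is usually more telling.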
I am tackling a unique request. I have to download ".zip" files from an https site and uncompress them. The fetch process works fine when I am downloading the files, using .NET and the System.IO namespaces.
Similarly, after downloading I have to uncompress the files, as these files are treated as compressed folders by Windows XP. I know it can be achieved using the System.IO.Compression classes, but the only trouble is that System.IO.Compression supports ".gz" via GZipStream, and I wonder how I would get ".zip" files handled, since there is no equivalent "ZipStream" to uncompress with.
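For what it's worth, GZipStream only handles the gzip stream format, not .zip archives, so there is no built-in "ZipStream" route on the .NET 2.0/3.5 framework; at the time this usually meant a third-party library such as SharpZipLib, or shelling out to an archiver. On .NET Framework 4.5 or later the framework can do it directly, roughly like this (the paths are made up):

using System.IO.Compression;   // add a reference to System.IO.Compression.FileSystem (4.5+)

class UnzipExample
{
    static void Main()
    {
        // extracts every entry of the archive into the target folder
        ZipFile.ExtractToDirectory(@"C:\downloads\data.zip", @"C:\downloads\extracted");
    }
}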
Thanks a million for all your help and advice. I also appreciate your time.
Hello. Is it possible to set a comparison operator using a parameter value? The code below shows what I'm after:

declare @co char(1)
declare @date datetime
set @co = '<'
set @date = '02/02/2002'
select * from recipe where date @co @date

I would use an IF statement to perform two separate statements depending on the value of @co, but this is only one of 13 statements where I need different combinations of comparison operators. Thanks.
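For reference, T-SQL won't substitute an operator from a variable, so the usual workaround is dynamic SQL with the operator validated against a short whitelist first; a sketch using the names from the post (the whitelist itself is an assumption):

declare @co varchar(2), @date datetime, @sql nvarchar(200)
set @co = '<'
set @date = '02/02/2002'

if @co not in ('<', '<=', '=', '>=', '>', '<>')
    raiserror('Invalid comparison operator', 16, 1)
else
begin
    -- only the operator is concatenated; the date stays a real parameter
    set @sql = N'select * from recipe where [date] ' + @co + N' @date'
    exec sp_executesql @sql, N'@date datetime', @date = @date
end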
I am on the verge of being able to do exactly what I want, but just can't seem to find the right combination of things to do it. I'm sure all of you wonderful folks will be able to point it out to me immediately, but I've been looking at it too long or something....
I have a record of individual sales with the state, and quarter of the sale.
sale_id   state   quarter
001       NY      2005Q1
003       WI      2006Q2
etc.
I create a report with a matrix to show count(sale_id) with Quarter as the column group and State as the row group. This works fine.
Now what I want to do is to get percentages based on quarterly sales. In other words, what percent of sales for 2005Q1 in NY vs. all sales in 2005Q1. So I create a second dataset (called total) with an SQL query like so:
SELECT count(sale_id) FROM data_table WHERE quarter = @QueryQuarter
Now, back in the matrix I want to use the column that we're in (2005Q1, 2005Q2, etc.) as the value that is passed to this query.
This is a simple concept, but I can't seem to figure out the correct call to pass the column group to the query as the parameter.
Thank you for any pointers you might be able to give. As I said, I'm right on the verge and just can't quite get it.
I want to set defaults for my multi-valued report parameter MONTH so that when the report starts, it automatically selects all the months prior to the current month (effectively creating a YTD report). However, using RS2005, I can't seem to figure out how to do this. I can create an IIF expression in 12 different value entries in the report parameters that returns the month based on the system date, but the first time I pass blanks, null, or anything except a valid parameter value, it clears the entire parameter list when the report displays.
Does anyone have any suggestions for auto-populating multiple values in a parameter at runtime where one or more of the parameter values may be empty? Checking "Allow Null" or "Allow Blank" doesn't fix this problem.
I tried to pass all the values in a single value entry on the report parameters page, but can't find the syntax that will allow this. I'm not sure if it will let you do that anyway...
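One approach that may avoid the 12 IIF entries: in RS2005 a multi-value parameter's defaults can also come from a query (the "From query" option in the Default values section), and every row that dataset returns becomes a pre-selected value. A sketch of such a dataset, assuming the parameter values are month numbers (adjust to whatever your MONTH values actually are; note it returns nothing in January):

WITH Months (MonthNumber) AS
(
    SELECT 1
    UNION ALL
    SELECT MonthNumber + 1 FROM Months WHERE MonthNumber < 12
)
SELECT MonthNumber
FROM Months
WHERE MonthNumber < MONTH(GETDATE())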
Does somebody have experience with dynamically setting or changing the default value of a report parameter?
Assumptions: report parameters p1, p2, p3 and p4 have been set up (with a default value of 'all') when report1 was created; report browsing is through a ReportViewer control embedded in the web application; the data source is a data cube.
What I want to do: based on the logged-in user of my web application, set the default value of p1 to that user's username.
What I did is:
Microsoft.Reporting.WebForms.ReportParameter reportParam = new Microsoft.Reporting.WebForms.ReportParameter("P1","Mary");
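If it helps, the parameter object still has to be handed to the viewer before rendering; a minimal sketch, assuming server-side report processing and a viewer control named ReportViewer1:

// pass the parameter (e.g. on the initial page load) so P1 arrives pre-set for this user
ReportViewer1.ServerReport.SetParameters(
    new Microsoft.Reporting.WebForms.ReportParameter[] { reportParam });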
When I execute a SELECT query on a table with around 25 million records, I get a performance difference based on the passed parameter value.
The queries below return their output in 1 second.
SELECT TOP 10000 * FROM TestTable WHERE Column = 1
SELECT TOP 10000 * FROM TestTable WHERE Column = 2
SELECT TOP 10000 * FROM TestTable WHERE Column = 3
The query below alone takes 18 seconds to return its output.
SELECT TOP 10000 * FROM TestTable WHERE Column = 4
(FYI: the count of records for column value 4 is lower than for the other column values.)
Could anyone please let me know why this happens and how to resolve it?
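One possible explanation, offered as an assumption rather than a diagnosis: with TOP 10000 and no useful index, SQL Server scans until it has found 10000 matching rows, so a value whose rows are rarer or more scattered simply forces a much longer scan. If that is what the execution plan shows, an index on the filter column lets all four values seek instead (the index name is made up; Column needs brackets because it is a reserved word):

CREATE NONCLUSTERED INDEX IX_TestTable_Column
    ON dbo.TestTable ([Column]);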
SQL Express installation is bombing out with the "A beta version of the .NET Framework 2.0 or SQL Server was detected on the computer...".
From the log file:
Running: PerformSCCAction2 at: 2007/6/26 8:17:27
Loaded DLL: C:WINsystem32msi.dll Version: 3.1.4000.2435
Product "{7131646D-CD3C-40F4-97B9-CD9E4E6262EF}" versioned 2.0 is not compatible with current builds of SQL Server. Expected at least version: 2.0.50727.42. The Product Name is ".NET Framework".
Now what is peculiar to me is that 1) this machine should never have had the beta software on it; 2) I have seen people with similar issues, except their version info always said "versioned 2.0.50727" and the product GUID was different, while this one only says "2.0"; and 3) I thought that product GUID was for the RTM 2.0.50727.42 release!
What is going on here and where is it getting this version information?
Is there a setting in SQL Server that ensures a column is not allowed to have the same value more than once? Or must this be set up in the insert statement itself? Or how about a business rule?
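For reference, this is normally enforced in the table definition rather than in the INSERT; a sketch with made-up table and column names:

-- rejects any second row with the same Email value
ALTER TABLE dbo.Customers
    ADD CONSTRAINT UQ_Customers_Email UNIQUE (Email);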
In a stored procedure that I'm fixing, there is a problem with assigning variable values inside a loop. The proc is using dynamic SQL and if statements to build all these statements, but I'm having to add a new variable value to it that is throwing it out of whack.
This is the current structure:
SET @MktNbr = 10
WHILE @MktNbr < 90
BEGIN
DECLARE @sqlstmt varchar(1000)
SET @Market = '0' + CONVERT(char(2),@MktNbr)
SET @sqlstmt = 'SELECT (columns) INTO dbo.table' + @Market + ' FROM #table WHERE marketcode = ''' + @Market + ''''
IF @MktNbr = 50 BEGIN SET @MktNbr = 51 END
ELSE IF @MktNbr = 51 BEGIN SET @MktNbr = 52 END
ELSE IF @MktNbr = 52 BEGIN SET @MktNbr = 55 END
ELSE IF @MktNbr = 55 BEGIN SET @MktNbr = 60 END
ELSE BEGIN SET @MktNbr = @MktNbr + 10 END
EXEC (@sqlstmt)
END
I'm probably having a blonde moment, but I'm trying to replace the if statements with this:
SET @MktNbr = CASE
    WHEN @MktNbr = 10 THEN 20
    WHEN @MktNbr = 20 THEN 30
    WHEN @MktNbr = 30 THEN 40
    WHEN @MktNbr = 40 THEN 50
    WHEN @MktNbr = 50 THEN 51
    WHEN @MktNbr = 51 THEN 52
    WHEN @MktNbr = 52 THEN 55
    WHEN @MktNbr = 55 THEN 60
    WHEN @MktNbr = 60 THEN 70
    WHEN @MktNbr = 70 THEN 80
    WHEN @MktNbr = 80 THEN 81
    ELSE @MktNbr
END
Clearly it's wrong because the proc bombs every time with a duplicate table error.
It has been suggested to me that I should hold these market values in an external table. This sounds reasonable but I'm ashamed to admit that I don't know how I'd implement that. Can someone maybe give me a nudge in the right direction?
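For what it's worth, here is a sketch of the table-driven version. The MarketList table name is invented; the SELECT ... INTO string is kept from the original proc:

CREATE TABLE dbo.MarketList (MktNbr int PRIMARY KEY)

INSERT INTO dbo.MarketList (MktNbr)
SELECT 10 UNION ALL SELECT 20 UNION ALL SELECT 30 UNION ALL SELECT 40 UNION ALL
SELECT 50 UNION ALL SELECT 51 UNION ALL SELECT 52 UNION ALL SELECT 55 UNION ALL
SELECT 60 UNION ALL SELECT 70 UNION ALL SELECT 80 UNION ALL SELECT 81

DECLARE @MktNbr int, @Market char(3), @sqlstmt varchar(1000)
SELECT @MktNbr = MIN(MktNbr) FROM dbo.MarketList

WHILE @MktNbr IS NOT NULL
BEGIN
    SET @Market = '0' + CONVERT(char(2), @MktNbr)
    SET @sqlstmt = 'SELECT (columns) INTO dbo.table' + @Market + ' FROM #table WHERE marketcode = ''' + @Market + ''''
    EXEC (@sqlstmt)

    -- advance to the next market in the list; no hard-coded sequence needed
    SELECT @MktNbr = MIN(MktNbr) FROM dbo.MarketList WHERE MktNbr > @MktNbr
END

Adding or retiring a market then becomes an INSERT or DELETE on MarketList rather than an edit to the procedure.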
Hello there: I am running a data flow within a ForEach loop, wherein I am computing a value called QuotaGap. When it is 0, I do not want any further execution of the loop. I am using a conditional transform within this data flow that writes a record to a table only when QuotaGap is not 0. However, I am unable to terminate the execution of the loop, as I am still within the data flow.
Now, the computation of the gap requires a value from another variable called NetPurchases. I tried using an Execute SQL Task in the control flow, but could not figure out how to pass the value of the NetPurchases variable into the SELECT statement to compute the gap. For example, the SELECT statement would read:
select (QuotaUpperLimit - ?) As QuotaGap from <<tablename>>
I tried setting the parameter as an input as well as an output and it did not work.
Then I tried passing the entire SQL statement as a string within a variable. This does not work either, because in order to compute QuotaUpperLimit - NetPurchases both variables need to be integers, but then you cannot concatenate integers together, which is what we need to do to create the SQL.
The other reason I am going through these hoops, I guess, is that I have not figured out a way to set the value of a variable within a data flow. I compute the value for QuotaGap within the data flow in a ForEach loop, but I have no way to pass this result to a variable called QuotaGap without using an Execute SQL Task or another ForEach loop.
I have spent hours on this simple issue, so I have given up and am looking to the good friends in this forum for help.
If what I have stated is not clear please let me know and I will try to clarify things a bit.
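In case it helps with the concatenation problem described above: one option is to build the statement in a Script Task and hand it to the Execute SQL Task as a variable (SQLSourceType = Variable). This sketch assumes a C#-capable Script Task (SSIS 2008 and later; 2005 scripts are VB.NET) and made-up variable and table names:

// Script Task settings (assumed): ReadOnlyVariables = User::NetPurchases,
//                                 ReadWriteVariables = User::QuotaSql
int netPurchases = Convert.ToInt32(Dts.Variables["User::NetPurchases"].Value);

// the numeric value is converted to text here, so no integer concatenation is needed in SSIS expressions
Dts.Variables["User::QuotaSql"].Value =
    "SELECT (QuotaUpperLimit - " + netPurchases.ToString() +
    ") AS QuotaGap FROM dbo.SalesQuota";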
Hoping someone may be able to help with a problem I'm having with SSRS parameters....
My report has 2 parameters: the user ID used to log in to the application, and the Department(s) within the organisation. Based upon the user's role, the user may have access to data for one or many departments.
Thus, the first parameter needs to be set in code based upon the user's login; however, the range of the second parameter (i.e. the range of departments that the user can access) is controlled by the value of the first parameter.
The second parameter is to appear as a drop-down of Departments to which the User has access.
The report is to be produced for the selected Department.
Are you able to advise how to restrict the range of values for the second parameter based upon the value of the first parameter?
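The usual pattern is to drive the Department parameter's available values from a dataset that takes the first parameter, something like the query below (table and column names are invented); with the user ID parameter hidden and set from code, the drop-down then only offers the departments this query returns:

-- available values for the Department parameter, filtered by the logged-in user
SELECT DepartmentId, DepartmentName
FROM dbo.UserDepartment
WHERE UserId = @UserId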
Hi everybody, is there a way to set a SelectParameter for a SqlDataSource in the ASPX file using System.Configuration.ConfigurationManager.AppSettings["SiteID"]? Thanks a lot in advance.
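One option, assuming the SelectCommand references @SiteID and an <asp:Parameter Name="SiteID"> is declared in the markup, is to fill the value from code-behind rather than in the ASPX itself (the control and key names here are just placeholders):

protected void Page_Load(object sender, EventArgs e)
{
    // read the value from <appSettings> and hand it to the declared parameter
    SqlDataSource1.SelectParameters["SiteID"].DefaultValue =
        System.Configuration.ConfigurationManager.AppSettings["SiteID"];
}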