I'm trying to understand the cases where it's better to use a snapshot and when it's better to use cached instances.
If I have 100 users trying to reach a report, is it better to use a snapshot or a cached instance? In both cases, the 100 users will get the same report result. And what about performance: are they similar?
I would like to know: what is the difference between a snapshot and a cached instance in SSRS?
Which one has the best performance, and which one is best for multiple users and for reports containing parameters (the parameters are then passed in the WHERE clause of the SQL code, e.g. WHERE IN (@param1))?
Hello, this is my first post, and I'm hoping you all can help.
Using Reporting Services 2005, I have several reports that use embedded images. All images render fine when the report execution is set to: 1) Always run this report with the most recent data, and 1a) Do not cache temporary copies of this report.
However, when I change the execution to either a Cache or a Snapshot, the images and some charts render as red "X" placeholders. This is sometimes remedied when the user clicks the page refresh button, but not always.
Of course, I could just have all the concurrent users use the uncached report that hits the OLAP server, but that would be highly inefficient, and just plain slow.
I ran the below 2 select statements and ended up seeing multiple cached instances of the same stored procedure. The majority have only one cached instance, but more than a handful have multiple cached instances. When there are multiple cached instances of the same sproc, which one will SQL Server reuse when the sproc is called?
SELECT o.name, o.object_id,
       ps.last_execution_time,
       ps.last_elapsed_time * 0.000001 AS last_elapsed_timeINSeconds,
       ps.min_elapsed_time * 0.000001 AS min_elapsed_timeINSeconds,
       ps.max_elapsed_time * 0.000001 AS max_elapsed_timeINSeconds
-- the post was cut off here; a FROM clause consistent with the aliases would be:
FROM sys.dm_exec_procedure_stats AS ps
JOIN sys.objects AS o ON o.object_id = ps.object_id;
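On the reuse question: the plan cache can hold one copy of a proc's plan per distinct set of plan attributes (SET options, language, and so on), and SQL Server reuses the copy whose attributes match the calling session. A small sketch for inspecting this with the standard DMVs (SQL 2005 and later):
-- one row per cached copy of each proc, with the set_options value that distinguishes them
SELECT p.plan_handle, p.usecounts, a.value AS set_options
FROM sys.dm_exec_cached_plans AS p
CROSS APPLY sys.dm_exec_plan_attributes(p.plan_handle) AS a
WHERE p.objtype = 'Proc'
  AND a.attribute = 'set_options';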
When I am in Visual Studio 2005, and I try to add an SQL database, I get the following error "generating user instances in sql server is disabled. use sp_configure user instances enabled to generate user instances." I am currently using SQL server 2005 Express. What do I need to do, to create an SQL database? Thanks in advance.
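The error text is pointing at a server-level option; a sketch of the fix, run against the parent Express instance (from SSMS Express or sqlcmd, not from the user instance):
-- enable user instances on SQL Server 2005 Express
EXEC sp_configure 'show advanced options', 1;  -- may be needed before the option is listed
RECONFIGURE;
EXEC sp_configure 'user instances enabled', 1;
RECONFIGURE;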
I seem to have a strange problem when applying a snapshot when the tables in the publication have been updated while the snapshot was being generated.
Say for example there is a table called RMAReplacedItem in the publication. When the snapshot starts being applied to the subscriber, a stored procedure called sp_MSins_RMAReplacedItem_msrepl_css gets created that handles an insert if the row already exists (ie it updates the row rather than inserting it). However, after all the data has been loaded into the tables, instead of calling this procedure, it tries to call one called sp_MSins_RMAReplacedIte_msrepl_cssm - it takes the last letter of the table name and adds it to the end of the procedure name.
The worst part is that this causes the application of the snapshot to fail, but it doesn't report what the error is; instead it just tries applying the snapshot again. The only way I have managed to find which call is failing is to run Profiler against the subscriber while the snapshot is being applied and see what errors come up.
I have run sp_browsereplcmds and the data in there is what is applied to the subscriber, i.e. the wrong procedure name.
All the servers involved are running sql 2005 service pack 2. The publisher and subscriber were both upgraded from sql 2000, but the distribution server is a fresh install of sql 2005.
I had a server with SQL Server 7.0. I installed a named instance of SQL Server 2000 and then moved all my databases from the 7.0 instance to the 2000 instance. Then I removed the 7.0 instance, which was the default instance. So at the moment only the 2000 version is left, but it isn't the default instance. Can the 2000 instance become the default instance (so that clients can connect to it simply through the computer name, without creating an alias)?
Hello all, I have a report with a table and a chart. It uses dataset1 as the data source, and all works fine. I create a new dataset called dataset2. The queries are exactly the same; the only differences between the 2 datasets are the database server and the fact that one of the columns is a smallint (in dataset2) versus an int (in dataset1). I change the DataSetName property of both the table and the chart to use dataset2. When I run the report I get a conversion error stating that there was an overflow of int2 while using dataset1. I have verified the report is not using dataset1 anywhere. If I delete dataset1 and run the report, the error goes away. If I add it back, I get the error again. Why is the report looking at dataset1 if it is not referenced at all in the report? Does SQL RS cache the datasets and verify each when it compiles?
I am using SQL Server 2005. I have a VIEW that joins several tables. One of the tables' columns can be added dynamically by the user from a GUI interface. However, after a column is added, it does not show up in the VIEW immediately. It takes a while (I haven't figured out exactly how long) before the extra column shows up in the execution result of the VIEW. So it seems like SQL Server is caching that VIEW's schema. Is there any way I can make this view always come back with the latest schema? Thanks a lot! Penn
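If the view was created with SELECT * (or its column list otherwise went stale), SQL Server keeps the view's metadata from creation time until it is refreshed; sp_refreshview rebuilds it on demand. A one-line sketch with a placeholder view name:
-- refresh the view's cached metadata after the underlying table gains a column
EXEC sp_refreshview N'dbo.MyJoinedView';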
I want to check the performance of my query, and I just want to remove cached query results. Is there any suggestion how I can do this? I just want to check, after each modification, how much the performance improved.
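The usual approach for timing tests is to flush the plan cache and the buffer pool between runs, so every attempt starts cold. A minimal sketch; these flush server-wide, so use a test server only:
CHECKPOINT;               -- write dirty pages so the next command can drop everything
DBCC DROPCLEANBUFFERS;    -- empty the data cache (cold reads on the next run)
DBCC FREEPROCCACHE;       -- discard cached query plans (forces recompilation)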
Phil, great links, really helpful and appreciated.
I just need to verify one thing on the lookup method: one of the lookup methods people were discussing is the non-cached lookup, which seems to have been evaluated as the fastest. Is non-cached the default for the Lookup transformation? And when I want the lookup method to be cached, I need to go into the Advanced tab and set the cache percentage there, right? Thanks.
I was wondering if anyone had any concrete information about whether there is a problem with having too many stored procedures or plans in the cache. Obviously there is an impact on memory, but if we can ignore that for the time being, does SQL perform just as well with 100 query plans as it does with tens of millions of plans?
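For context, the size and makeup of the plan cache can be checked directly; a small sketch against the standard DMV (SQL 2005 and later):
-- count cached plans and total cache size by object type
SELECT objtype,
       COUNT(*) AS plans,
       SUM(size_in_bytes) / 1048576.0 AS size_mb
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY plans DESC;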
Is it possible to keep a cached Lookup in memory when executing multiple Data Flows? Executing DFTs in parallel will cache and use the same LOOKUP statement. But what if I'm executing the DFTs sequentially: can I keep the LOOKUP from the first DFT in memory for the second DFT? For example, in my case, I'm caching a lookup against the Customer dimension for invoices. The second DFT then processes credits and again does a lookup against the Customer dimension. I want to use the cached Customer records from the first DFT.
Parameterized queries are only allowed on partial or no-cache lookup transforms, not 'full' ones. Is there some "trick" to parameterizing a full-cache lookup, or should the join simply be done at the source, obviating the need for a full-cache lookup at all (other suggestions certainly welcome)?
More particularly, I'd like to use the lookup transform in a surrogate key pipeline. However, the dimension is large (900 million rows), so it would be useful to restrict the lookup transform's cache by a join to the source.
For example:
Source query is: select a,b,c from t where z=@filter (20,000 rows)
Lookup transform query: select surrogate_key,business_key from dimension (900 M rows, not tenable)
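One way to express "join at the source" for the example above is to return the surrogate key from the source query itself, so no full-cache lookup is needed. A sketch that assumes b is the business key column (the post doesn't say which column is):
-- the 20,000 filtered source rows drive the join, not the 900M-row dimension cache
SELECT s.a, s.b, s.c, d.surrogate_key
FROM t AS s
JOIN dimension AS d
  ON d.business_key = s.b   -- assumed join column
WHERE s.z = @filter;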
I have a procedure that generates dynamic sql and then executes via the execute(strSQL) syntax. BOL states that if I use sp_executesql with hard-typed parameters passed in variables, the query optimizer will 'probably' match the sql statement with the cached execution path, thus avoiding recompilation and speeding up the results for heavily run procedures.
Can anyone tell me if this is also true if the sql references an object on a linked sql server 2000 database? Technically, the sql is exactly the same, but I'm unsure if there is some exception due to the way linked objects are processed.
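For reference, the parameterized shape BOL describes, pointed at a linked-server object; the server, database, and column names here are made up, and whether the remote side also reuses its plan is the open question:
DECLARE @sql nvarchar(500);
SET @sql = N'SELECT col1, col2
             FROM LinkedSrv.RemoteDb.dbo.SomeTable  -- four-part linked-server name
             WHERE id = @id';
-- hard-typed parameter passed in a variable, so the local plan can be matched and reused
EXEC sp_executesql @sql, N'@id int', @id = 42;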
Is there any way to invalidate cached query plans? I would rather target a specific query instead of invalidating all of them. Also, is there any SQL Server setting that will cause cached query plans to be invalidated even though only one character in the query has changed?
exec sp_executesql N'select cast(5 as int) as DisplaySequence, mt.Description + '' '' + ct.Description as Source, c.FirstName + '' '' + c.LastName as Name, cus.CustomerNumber Code, c.companyname as "Company Name", a.Address1, a.Address2,
[code]....
In this query we have seen (on some databases) that simply changing '@CustomerId int', @CustomerId=1065 to '@customerId int', @customerId=1065 fixed a speed problem… just changing the case on the CustomerId bind parameter. On other servers this has no effect. I'm thinking the server is using an old cached query plan, but I don't know for sure.
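This matches how ad hoc plan matching works: the cache lookup is on the exact batch text, so even a case change in the parameter definition maps to a different cache entry and gets a fresh plan. For targeting one plan instead of flushing everything, a hedged sketch (object names are placeholders):
-- mark one procedure's plans for recompilation (any version)
EXEC sp_recompile N'dbo.usp_SomeProc';

-- find the plan_handle of a specific cached query by its text
SELECT qs.plan_handle, st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%DisplaySequence%';

-- SQL Server 2008 and later: evict just that plan
-- DBCC FREEPROCCACHE (0x0500...);   -- paste the plan_handle found above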
I am frequently going back and forth between making changes to my reports and previewing those changes, all from within Visual Studio, without publishing the reports each time.
The problem is that frequently the changes I make are not reflected in the preview window unless I either close out of Visual Studio completely or wait some length of time (I'm not sure how long exactly, but half an hour always seems to do the trick).
Is there a way to clear any cache and force visual studio to completely reprocess a report?
At least when it is a formatting change I can identify whether the change has stuck; but when I'm fixing bugs in the code, I can't tell if I didn't fix it or if the change just hasn't taken effect.
I'm working on a reporting tool that could bring back hundreds of thousands of results at once. I need some way to run the actual query only once a day, and then the reporting tool would just pull back these cached results. To be short, I need to figure out how to do this using a minimum amount of resources. Would a DataView work with something like this? How would I have it update only once a day? I appreciate any advice!
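One common pattern is a cache table refreshed once a day by a scheduled job (SQL Server Agent, for instance), with the reporting tool reading only from the cache. A sketch; the table, column, and view names are hypothetical:
-- refresh procedure, scheduled to run once a day
CREATE PROCEDURE dbo.usp_RefreshReportCache
AS
BEGIN
    TRUNCATE TABLE dbo.ReportCache;           -- hypothetical cache table
    INSERT INTO dbo.ReportCache (Col1, Col2)  -- hypothetical columns
    SELECT Col1, Col2
    FROM dbo.ExpensiveReportView;             -- the slow query lives only here
END
-- the reporting tool then just does: SELECT Col1, Col2 FROM dbo.ReportCache;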
I have a DataGridView bound to a table that is part of an Entity Framework model. A user can edit data in the DataGridView and save the changes back to SQL. But there is a stored procedure that can also change the data, in SQL, not in the DataGridView. When I try to "refresh" the DataGridView, the LINQ query always returns the older cached data. Here's the code that I have tried using to force EF to retrieve new data:
// now refresh the maintenance datagridview data source
using (var context = new spdwEntities())
{
    // query the context created here (the original queried a different, stale
    // context named spdwContext) and call ToList() so the rows are fetched
    // from SQL before the context is disposed
    var maintData = (from o in context.MR_EquipmentCheck
                     where o.ProdDate == editDate
                     orderby o.Caster, o.Strand
                     select o).ToList();
    mnt_DGV.DataSource = maintData;
}
When I debug, I can see that the SQL table has the updated data in it, but when this snippet of code runs, maintData has the old data in it.
With SQL 2005 SP2, we are seeing that when auto stats run on one or more indexes of a large table (1.5M rows), then immediately the stored proc using that table starts acting as if the query plan is no longer any good. This causes a drastic slowdown in response time and a corresponding increase of table reads to get the data. E.g., the next execution of the procedure after the auto stats kick in goes from 355 reads to 755,000 reads (as depicted by Profiler). Generally, there are about 25 people using the DB at any one time. They connect through a mid-tier VB component.
I tried adding WITH RECOMPILE to the stored proc in question, but that caused almost all executions to run at the higher number. I thought that the WITH RECOMPILE hint would create a new query plan for each execution of the procedure and that plan would be the latest and greatest. Perhaps it did, but most users got stuck with the higher number of reads anyway. After taking the hint out, everyone went back to getting the 335 number and quick response times.
What we are wrestling with is that when those auto stats hit, it really messes up everyone until we manually recompile the procedure. Daily we delete all records in the table that are over 45 days old, so the table stays pretty much the same size. We also set the recompile flag to cause a new plan to be generated that will reflect the smaller amount of data. Should we also run a stats update before recompiling the procedure? Profiler has been very helpful in capturing what is going on, so I think I have a good handle on that. However, I don't understand why WITH RECOMPILE produced a messed up plan for everyone. The compile itself seems to take only 1 ms when done from the query screen.
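A sketch of the manual fix described above, run after the daily purge (names are placeholders); updating statistics first means the recompiled plan sees the current data distribution:
-- refresh stats on the purged table, then force a fresh plan on the next call
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
EXEC sp_recompile N'dbo.usp_ReportProc';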
Is there a known problem with BIS deploy not clearing the cache? I repeatedly (but not always) have an issue where I will do a BIS deploy of an RDL, but when I run it, the old report runs. I can't clear this out until I explicitly log into Report Manager, delete the RDL, and re-deploy from the BIS.
Is there a workaround, fix, process change, configuration change I can do to keep this from happening?
I am trying to set up cached reports so that one of my larger reports doesn't have to be re-run every time someone wants to view it (the data source only updates every 24 hours).
Anyway I made a new data source, set the report to use that, and in that data source I said to use "SQLexampleUserName" as the stored credentials.
Now when I go to run the report it says: Login failed for user 'username'. The user is not associated with a trusted SQL Server connection.
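That error usually means the stored credentials are a SQL login while the server only accepts Windows authentication, or the login simply doesn't exist. A sketch that assumes mixed-mode authentication has been enabled on the server (Server Properties > Security) and uses a placeholder database name:
-- create the SQL login named in the data source's stored credentials
CREATE LOGIN SQLexampleUserName WITH PASSWORD = 'use-a-strong-password-here';
-- map it to a user in the report's source database
USE ReportSourceDb;  -- placeholder
CREATE USER SQLexampleUserName FOR LOGIN SQLexampleUserName;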
In my VB 2005 Express, I created a project "KimmelCallNWspWithAdoNet" that had the following code:
--Form_Kimmel.vb--
Imports System.Data
Imports System.Data.SqlClient
Imports System.Data.SqlTypes

Public Class Form_Kimmel
    Public Sub InsertCustomer()
        ' "Integrated Security-SSPI" needs an equals sign, not a hyphen; the rest of the
        ' string was truncated in the post, so the data source below is an assumption.
        Dim connectionString As String = "Integrated Security=SSPI;Persist Security Info=False;" + _
            "Data Source=.\SQLEXPRESS;Initial Catalog=Northwind"
        Using connection As New SqlConnection(connectionString)
            Dim command As New SqlCommand( _
                "INSERT INTO Customers (CustomerID, CompanyName) VALUES ('KIMCO', 'Kimmel Co.')", connection)
            connection.Open()
            ' without ExecuteNonQuery the INSERT never reaches the server
            command.ExecuteNonQuery()
        End Using
    End Sub
End Class
I executed Form_Kimmel.vb and I got no errors, but I did not get the new values inserted into the "Customers" table of the Northwind database. Please help and tell me what I did wrong and how to correct this problem.
We have a debate in our team about embedded SQL vs. Stored Procs.
The argument is why use SP's if you can embed the SQL in the code and SQL2K will cache it on the fly?
I can't find any definitive information on pros and cons between the two methods.
If there are no major performance issues, or gotchas, I guess it comes down to developer preference.
SP pros:
- Great SQL support in VS.NET (dev, debug, integration)
- Separation of database-specific code from the middle tier
- Fewer lines of code in the middle tier
- VS.NET support for .xsd dataset definitions
- Logic closer to the data for more demanding processes
Embedded SQL pros:
- Fewer artifacts for version control
- Better encapsulation of logic
I have a report viewer in a Windows Form that behaves very strangely. When I open the form and run the report, it shows up nicely in the report viewer. If I print the report, it only prints one page. When I print the report a second time, the whole report is printed. Next I'll change the report parameters and run the report; it shows up nicely in the viewer, but when I print the report, the first report is printed.
Can anybody tell me which command I should use to get the list of all the server names, all the SQL Server instances on each server, and all the databases in each SQL Server instance?
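A partial sketch, since there is no single command that covers all three levels; sqlcmd -L runs from a command prompt, not from T-SQL:
-- databases on the instance you are connected to (SQL 2005 and later)
SELECT name FROM sys.databases;

-- instances visible on the network: run from a command prompt instead
--   sqlcmd -L
--   (older client tools: osql -L)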
I have a server running one instance of SQL Server Enterprise Edition. I want to install an additional application that requires SQL Enterprise Edition. Can I install SQL Enterprise and create another instance for the new app? I know that I would have to give it a unique instance name.
I am working on 2 instances of SQL. I want to do queries and selects with these 2 instances in a single page. How could I do that? For example, assuming the names of the instances are A and B... (see the linked-server sketch after this post)
thanks
RON ________________________________________________________________________________________________ "I won't last a day without SQL"
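For the two-instance question above, the standard route is a linked server plus four-part names. A sketch, assuming you run it on instance A and B is a named instance (server, database, and table names are placeholders):
-- on instance A: register instance B as a linked server
EXEC sp_addlinkedserver @server = N'SERVERNAME\B', @srvproduct = N'SQL Server';

-- then one statement can join across both instances
SELECT l.id, r.col1
FROM dbo.LocalTable AS l
JOIN [SERVERNAME\B].SomeDb.dbo.RemoteTable AS r
  ON r.id = l.id;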