Snapshots vs. Cached Instances
Jun 19, 2007
Hello,
I would like to know what is the difference between a snapshot and a cached instance in SSRS?
Which one performs best, and which one is best for multiple users and for reports containing parameters (the parameters are passed into the WHERE clause of the SQL code, e.g. WHERE IN (@param1))?
Thanks for your answers.
Zoz
View 4 Replies
Oct 23, 2007
Hi,
I'm trying to understand the cases where it's better to use a snapshot and where it's better to use cached instances.
If I have 100 users trying to reach a report, is it better to use a snapshot or a cached instance? In both cases, the 100 users will get the same report result. And what about performance, is it similar?
Thanks for your time and response,
See you,
Have a nice day!
regards,
sandy
View 1 Replies
Nov 24, 2014
I ran the two SELECT statements below and ended up seeing multiple cached instances of the same stored procedure. The majority have only one cached instance, but more than a handful have multiple cached instances. When there are multiple cached instances of the same sproc, which one will SQL Server reuse when the sproc is called?
SELECT o.name, o.object_id,
ps.last_execution_time ,
ps.last_elapsed_time * 0.000001 as last_elapsed_timeINSeconds,
ps.min_elapsed_time * 0.000001 as min_elapsed_timeINSeconds,
ps.max_elapsed_time * 0.000001 as max_elapsed_timeINSeconds
[code]...
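One common reason for multiple cached instances of the same procedure is that the sessions calling it ran with different SET options; each distinct combination gets its own plan, and a session reuses only the entry whose cache-key attributes match its own settings. A hedged sketch for checking this (the procedure name is a placeholder):

```sql
-- List the cache-key attributes of each cached plan for one procedure.
-- Entries whose cache-key attributes (e.g. set_options) differ are
-- separate plans; a calling session reuses the matching entry.
SELECT ps.plan_handle,
       pa.attribute,
       pa.value
FROM sys.dm_exec_procedure_stats AS ps
CROSS APPLY sys.dm_exec_plan_attributes(ps.plan_handle) AS pa
WHERE ps.object_id = OBJECT_ID('dbo.MyProc')   -- hypothetical proc name
  AND pa.is_cache_key = 1;
```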
View 4 Replies
Sep 28, 2007
When I am in Visual Studio 2005 and I try to add a SQL database, I get the following error: "Generating user instances in SQL Server is disabled. Use sp_configure 'user instances enabled' to generate user instances." I am currently using SQL Server 2005 Express. What do I need to do to create a SQL database? Thanks in advance.
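The error message itself names the fix: the 'user instances enabled' option is off. A minimal sketch, assuming you can run it as an administrator against the Express instance:

```sql
-- Turn on user instances for SQL Server 2005 Express, then apply it.
EXEC sp_configure 'user instances enabled', 1;
RECONFIGURE;
```

After this, retry adding the database from Visual Studio.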
View 4 Replies
Jul 18, 2001
I had a server with SQL Server 7.0.
I installed a named instance of SQL Server 2000 and then moved all my databases from the 7.0 instance to the 2000 instance.
Then I removed the 7.0 instance, which was the default instance.
So at the moment only the 2000 version is there, but it isn't the default instance.
Can the 2000 instance become the default instance? (So that clients can connect to it simply through the computer name, without creating an alias.)
thanks
Fede
View 1 Replies
Aug 8, 2007
Hello all,
I have a report with a table and a chart. It uses dataset1 as the data source.
All works fine.
I create a new dataset called dataset2.
The queries are exactly the same. The only differences between the two datasets are the database server and the fact that one of the columns is a smallint (in dataset2) and an int (in dataset1).
I change the datasetName property of both the table and the chart to use dataset2.
When I run the report I get a conversion error stating that there was an overflow of int2 while using dataset1. I have verified that the report does not use dataset1 anywhere. If I delete dataset1 and run the report, the error goes away. If I add it back, I get the error again. Why is the report looking at dataset1 if it is not referenced anywhere in the report? Does SQL RS cache the datasets and verify each one when it compiles?
regards,
Bill
View 9 Replies
Aug 1, 2007
I am using SQL Server 2005. I have a VIEW that joins several tables. Columns can be added to one of the tables dynamically by the user from a GUI interface. However, after a column is added, it does not show up in the VIEW immediately. It takes a while (I haven't figured out exactly how long) before the extra column shows up in the execution result of the VIEW.
So it seems like SQL Server is caching that VIEW's schema. Is there any way I can make this view always come back with the latest schema?
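For a non-schema-bound view, SQL Server persists the view's column metadata at creation time, so a column added to a base table is not reflected until that metadata is refreshed. sp_refreshview forces the refresh on demand; a sketch (the view name is a placeholder):

```sql
-- Refresh the persisted metadata of a non-schema-bound view so it
-- reflects columns recently added to its base tables.
EXEC sp_refreshview 'dbo.MyView';   -- hypothetical view name
```

Calling this from the same GUI action that adds the column would make the change visible immediately. Note this only helps if the view uses SELECT *; a view listing explicit columns must be ALTERed instead.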
Thanks a lot!
Penn
View 1 Replies
Mar 11, 2008
Hi, I have a search and I want to create a hyperlinked list of the top 5 search terms below it, what's the most efficient way to go about this?
View 20 Replies
Sep 20, 2007
I want to check the performance of my query, and I want to remove cached query results first. Are there any suggestions on how I can do this?
I just want to check, after each modification, how much the performance improves.
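On a test server you can clear both the plan cache and the data cache between runs, so each timing starts cold; a hedged sketch:

```sql
-- Reset caches before re-timing a query. These are instance-wide:
-- run them only on a test server, never in production.
CHECKPOINT;              -- write dirty pages to disk first
DBCC DROPCLEANBUFFERS;   -- drop clean data pages (cold buffer cache)
DBCC FREEPROCCACHE;      -- discard all cached query plans
```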
View 1 Replies
May 28, 2008
Phil, great links, really helpful and appreciated.
I just need to verify one thing on the lookup method:
--One of the lookup methods people were discussing is the non-cached lookup, which was evaluated to be the fastest. Is non-cached the default for the Lookup transformation? And when I want the lookup to be cached, I need to go into the Advanced tab and set the cache size to whatever percentage I want, right? Thanks.
View 3 Replies
Mar 26, 2007
I have 3 reports that run using a shared schedule to generate historic snapshots. This has been working for months and months, but out of nowhere it changed.
So now whenever the schedule runs, it will generate the historic snapshot for 1 report but NOT the 2 remaining. As a strange twist, the report that succeeds changes from time to time...
SQL Server 2000, RS+SP2.
Any ideas?
Thanks
Regards
Jonas
View 2 Replies
Aug 16, 2005
I was wondering if anyone had any concrete information about whether there is a problem with having too many stored procedures or plans in the cache. Obviously there is an impact on memory, but if we can ignore that for the time being, does SQL perform just as well with 100 query plans as it does with tens of millions of plans?
View 9 Replies
Dec 10, 2007
Is it possible to keep a cached Lookup in memory when executing multiple Data Flows? Executing DFTs in parallel will cache and use the same LOOKUP statement. But what if I'm executing the DFTs sequentially; can I keep the LOOKUP from the first DFT in memory for the second DFT? For example, in my case, I'm caching a lookup against the Customer dimension for invoices. The second DFT then processes credits and again does a lookup against the Customer dimension. I want to use the cached Customer records from the first DFT.
View 1 Replies
May 27, 2005
Is there a way to programatically determine if a report should be generated from the cache or run against real-time data?
View 7 Replies
Jul 8, 2006
Parameterized queries are only allowed on partial-cache or no-cache lookup transforms, not 'full' ones. Is there some trick to parameterizing a full-cache lookup, or should the join simply be done at the source, obviating the need for a full-cache lookup at all? (Other suggestions certainly welcome.)
More particularly, I'd like to use the Lookup transform in a surrogate key pipeline. However, the dimension is large (900 million rows), so it would be useful to restrict the Lookup transform's cache by a join to the source.
For example:
Source query is: select a,b,c from t where z=@filter (20,000 rows)
Lookup transform query: select surrogate_key,business_key from dimension (900 M rows, not tenable)
Ideal Lookup transform query:
select distinct surrogate_key
,business_key
from dimension d inner join
t on d.business_key = t.c
where t.z = @filter
View 7 Replies
Jul 13, 2005
We will soon be going to a new ERP system using MS SQL 2000. I'm looking into possible backup strategies. The database size will be about 50 Gigs.
We have a SAN, which the new ERP system will be backed up on. I know a little about SANs, but I need to make decisions about the types of backups to take. We can have 10 active snapshots, so I could use snapshots during the day as point-in-time backups. I could also use a SAN snapshot instead of a full backup. Would it be safe to use the SAN snapshots instead of the normal SQL Server backups? I'm not sure how long a full SQL Server backup will take because the server is still in testing mode. I'm not getting the big picture about when to use snapshots and when to use regular SQL Server backups.
Is it possible in Windows 2000 Server to stop the transactions and flush the buffers for a few seconds to do a snapshot? From what I understand, the database could be left in a suspect state if you use snapshots to restore a database.
Thanks in advance for any help!
View 5 Replies
May 16, 2008
I think I've read some conflicting advice in BOL. Maybe someone can clarify it for me.
Under "Database Mirroring and Database Snapshots" it says:
"You can take advantage of a mirror database that you are maintaining for availability purposes to offload reporting. To use a mirror database for reporting, you can create a database snapshot on the mirror database and direct client connection requests to the most recent snapshot."
To my mind, to enjoy a noteworthy performance gain, that mirror database would need to be on another server.
But then you read under "Database Snapshots":
"Multiple snapshots can exist on a source database and always reside on the same server instance as the database."
So how do you get the snapshot to the mirror?
View 2 Replies
Nov 9, 2007
Hi
What would be the quickest way to create a backup-and-revert program on an SQL (2000) database?
- Can you create a transaction on a database, regardless of the connections, and then roll it all back via an external program?
- Could you monitor the changes with Profiler and then reverse those?
- If desperate, could you back up the db from a tool and then restore it? (too slow to be practical?)
Not sure how to do this, so any offers would be appreciated.
ta
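On SQL Server 2000 the straightforward backup-and-revert approach is a full backup up front and RESTORE ... WITH REPLACE afterwards; rolling back other connections' already-committed work from an external program is not possible. A sketch with placeholder names and paths:

```sql
-- Save the current state.
BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb_before.bak' WITH INIT;

-- ...run the changes you may want to undo...

-- Revert: disconnect other users, then restore over the database.
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb_before.bak' WITH REPLACE;
ALTER DATABASE MyDb SET MULTI_USER;
```

Restore time grows roughly with database size, so for a large database this may indeed be too slow to be practical.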
View 3 Replies
May 2, 2006
If multiple snapshots for the same report attempt to generate at the same second, is there something in place to prevent them from conflicting with each other?
Our SSRS 2005 application will use a console app that we are writing to change the default parameters on each report and call the CreateReportHistorySnapshot method. This application will be multi-threaded, so there is a possibility that multiple versions of the same report with different parameters will attempt to run at the same time. We need to be sure that these snapshots will not conflict and return the appropriate SnapshotHistoryID for that report run.
From looking at the History table in the ReportServer database it looks like there are uniqueIDs and other keys that would prevent duplicates, but what if the requests come in at the same time? Is SSRS 2005 smart enough to delay for a second so that there is not a duplicate? Since the way you reference these snapshots in your URL is with the Snapshot Time in UTC format, that only goes out to full seconds (2006-05-02T00:00:00).
Sorry for the length, but I could not think of how to condense this.
Thanks,
Steve
View 4 Replies
May 13, 2008
Hopefully not too much of a stupid question.
At the moment we are using SSRS to report off an Oracle database. If we take a snapshot of the data on the Oracle database and put it within SQL Server, can we expect an increase in performance/response?
And is it a good idea to keep the snapshot in the same place where SSRS is installed?
Thanks in advance.
View 1 Replies
Mar 12, 2008
I want to share one of my doubts about SSRS and the report server.
If I use snapshots or caching for my reports on the report server, is it going to increase performance?
If it is going to increase performance, is the SQL Server database going to carry any extra burden; that is, is it going to lower the performance of the database?
Please give me a response to my questions.
View 3 Replies
Jun 15, 2007
Hi
I am creating a full Merge replication setup between two servers
Is there any way that I can set it up so that I do not have to create a snapshot?
I.e., I would guarantee that the data and structure are EXACTLY the same on both the subscriber and the publisher.
The problem is that my database is over 4 GB and it is quicker for me to get the data from one server to the other by doing a backup and restore.
I am using SQL 2005 with the latest SP
Cheers
Ants
View 4 Replies
May 1, 2006
I have a procedure that generates dynamic SQL and then executes it via the execute(strSQL) syntax. BOL states that if I use sp_executesql with hard-typed parameters passed in variables, the query optimizer will 'probably' match the SQL statement with a cached execution plan, thus avoiding recompilation and speeding up the results for heavily run procedures.
Can anyone tell me if this is also true if the sql references an object on a linked sql server 2000 database? Technically, the sql is exactly the same, but I'm unsure if there is some exception due to the way linked objects are processed.
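The pattern BOL describes looks roughly like this; the linked-server, table, and parameter names below are made up for illustration:

```sql
-- Dynamic SQL with a typed parameter list: as long as the statement
-- text and parameter definitions stay identical, repeated calls can
-- reuse one cached plan instead of recompiling each time.
DECLARE @sql nvarchar(500);
SET @sql = N'SELECT OrderId, Total
             FROM LinkedSrv.Sales.dbo.Orders
             WHERE CustomerId = @cid';
EXEC sp_executesql @sql, N'@cid int', @cid = 1065;
```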
Thanks!
View 1 Replies
Apr 29, 2014
Is there any way to invalidate cached query plans? I would rather target a specific query than invalidate all of them. Also, is there any SQL Server setting that will cause a cached query plan to be invalidated when even one character in the query has changed?
exec sp_executesql N'select
cast(5 as int) as DisplaySequence,
mt.Description + '' '' + ct.Description as Source,
c.FirstName + '' '' + c.LastName as Name,
cus.CustomerNumber Code,
c.companyname as "Company Name",
a.Address1,
a.Address2,
[code]....
In this query we have seen (on some databases) that simply changing '@CustomerId int',@CustomerId=1065 to '@customerId int',@customerId=1065 fixed a speed problem; we just changed the case of the customer bind parameter. On other servers this has no effect. We suspect the server is using an old cached query plan, but don't know for sure.
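On SQL Server 2008 and later you can evict a single plan by its plan_handle instead of flushing the whole cache; a hedged sketch (the LIKE filter and the handle below are placeholders):

```sql
-- Locate the cached plan for the statement in question...
SELECT qs.plan_handle, st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%CustomerNumber%';   -- narrow to the query of interest

-- ...then evict just that plan, forcing a fresh compile on the next run:
-- DBCC FREEPROCCACHE (0x060005...);     -- paste the handle found above
```

The case-change effect is consistent with how the plan cache keys entries: sp_executesql caches by the exact statement and parameter-definition text, so changing @CustomerId to @customerId creates a different cache entry and therefore a freshly compiled plan.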
View 9 Replies
Apr 11, 2008
Hello,
This is my first post, and I'm hoping you all can help.
Using Reporting Services 2005, I have several reports that use embedded images.
All images render fine when the report execution is set to:
1) Always run this report with the most recent data
1a) Do not cache temporary copies of this report
However, when I change the execution to either a Cache or a Snapshot, the images and some charts render as red "X" placeholders. This is sometimes remedied when the user clicks the page refresh button, but not always.
Of course, I could just have all the concurrent users use the uncached report that hits the OLAP server, but that would be highly inefficient, and just plain slow.
Thanks for any help on this subject.
-michael
View 2 Replies
Jun 12, 2007
I frequently go back and forth between making changes to my reports and previewing those changes, all from within Visual Studio, without publishing the reports each time.
The problem is that changes I make are frequently not reflected in the preview window unless I either close Visual Studio completely or wait some length of time (I'm not sure how long exactly, but half an hour always seems to do the trick).
Is there a way to clear any cache and force visual studio to completely reprocess a report?
At least when it is formatting changes I can identify whether the change has stuck, but when I'm fixing bugs in the code, I can't tell if I didn't fix it or if the change just hasn't taken effect.
View 1 Replies
Aug 28, 2007
I have a huge DB of 80 GB and am trying to configure merge replication. The initial snapshot application is failing due to some schema issues. I have made the necessary changes to avoid them. If I start the merge agent, it is going to re-initialize the snapshot, and reapplying the whole snapshot will take at least 10-15 hrs.
Is there any way to resume the snapshot from the point of failure, avoiding reapplying all the data which has already been transferred to the subscriber?
View 1 Replies
Apr 30, 2007
Does someone know whether doing a reindex on a clustered or non-clustered index causes the snapshot file to grow? In other words, is the data that makes up the snapshot copied from the source to the snapshot database? If a normal reindex is done on the underlying database, will it block users from accessing the snapshot? Any help would be appreciated.
View 1 Replies
Aug 7, 2006
Hi
I have set up my web synchronisation with very few problems until now. Initially I set the snapshots to be generated by the subscriber when it first synchronises; this worked perfectly every time until a bottleneck occurred after I reinitialised all the subscribers. When more than one subscriber then initialises the snapshot agent, the snapshot jobs fail because there is already a job running for user distributor_admin. The agent then retries, etc., until all retries have failed. Ultimately only one of the snapshots is successfully created, and only after all other jobs have failed and run out of retries is it left alone to carry out its job.
Following this I made the decision to run with pre-generated snapshots instead, so that I can schedule the server load and avoid these bottlenecks. Using the same publication, I unchecked the option "Automatically define a partition and generate..." in the publication properties and proceeded to add each of the partitions, specifying the correct "Host_Name()" value etc. After this I selected one partition and clicked "Generate the selected snapshots now". The snapshot was duly created, but when synchronising, the snapshot is not collected and used. Weirdly, the merge process begins enumerating changes at the publisher, followed by downloading 3100 chunks of data, before receiving an error concerning the format of the message from the distributor.
If I check the option to allow auto generation, the synchronisation begins absolutely perfectly but only once it has generate a snapshot (Ignoring the one I created previously).
To add further weirdness, I created a small app which generates the dynamic partitions using RMO. I used the sample code straight from BOL and it duly generated the dynamic snapshot job. I can then create a new Job class, passing it the JobServer and JobName; it gives me the correct job, but I cannot then call the Start() method because it keeps telling me the Job object is not created, even though it is clearly enabled as a job in SSMS and can be right-clicked and started from there.
Even partition jobs created from within the SSMS appear as state "Creating" when I make an instance of it using the Job class.
What am I missing? Are my pre-generated snapshots not available to the subscriber because they seem to still be in the creating state? Is this a SQL Server configuration issue or have I setup the publication incorrectly?
Any help would be much appreciated.
Many thanks
Rab
P.S. Code snippet below
Imports Microsoft.SqlServer.Replication
Imports Microsoft.SqlServer.Management.smo
Imports Microsoft.SqlServer.Management.smo.agent
Imports Microsoft.SqlServer.Management.Common
Public Class frmSnapshot
Private Sub btnGenerate_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnGenerate.Click
Dim hostName As String = txtSiteID.Text.Trim
' Define the server, database, and publication names
Dim publisherName As String = "WDU340"
Dim publicationName As String = txtOrgID.Text.Trim
Dim publicationDbName As String = publicationName
Dim distributorName As String = publisherName
Dim publication As MergePublication
Dim partition As MergePartition
Dim snapshotAgentJob As MergeDynamicSnapshotJob = Nothing
Dim schedule As ReplicationAgentSchedule
' Create a connection to the Distributor to start the Snapshot Agent.
Dim distributorConn As ServerConnection = New ServerConnection(distributorName)
Try
' Connect to the Publisher.
distributorConn.Connect()
' Set the required properties for the publication.
publication = New MergePublication()
publication.ConnectionContext = distributorConn
publication.Name = publicationName
publication.DatabaseName = publicationDbName
' If we can't get the properties for this merge publication,
' then throw an application exception.
If publication.LoadProperties() AndAlso publication.SnapshotAvailable Then
' Set a weekly schedule for the filtered data snapshot.
If RunJob(publication, snapshotAgentJob, distributorConn, hostName) = False Then
schedule = New ReplicationAgentSchedule()
schedule.FrequencyType = ScheduleFrequencyType.OnDemand
'schedule.FrequencyRecurrenceFactor = 1
'schedule.FrequencyInterval = Convert.ToInt32("0x001", 16)
' Set the value of Hostname that defines the data partition.
partition = New MergePartition()
partition.DynamicFilterHostName = hostName
snapshotAgentJob = New MergeDynamicSnapshotJob()
snapshotAgentJob.DynamicFilterHostName = hostName
' Create the partition for the publication with the defined schedule.
publication.AddMergePartition(partition)
publication.AddMergeDynamicSnapshotJobForLateBoundComClients(snapshotAgentJob, schedule)
RunJob(publication, snapshotAgentJob, distributorConn, hostName)
End If
Else
Throw New ApplicationException(String.Format( _
"Settings could not be retrieved for the publication, " + _
" or the initial snapshot has not been generated. " + _
"Ensure that the publication {0} exists on {1} and " + _
"that the Snapshot Agent has run successfully.", _
publicationName, publisherName))
End If
Catch ex As Exception
' Do error handling here.
MessageBox.Show(String.Format( _
"The partition for '{0}' in the {1} publication could not be created.", _
hostName, publicationName) & ": " & ex.Message)
Finally
If distributorConn.IsOpen Then
distributorConn.Disconnect()
End If
End Try
End Sub
Private Function RunJob(ByVal publication As MergePublication, ByVal snapshotAgentJob As MergeDynamicSnapshotJob, ByVal distributorConn As ServerConnection, ByVal hostName As String) As Boolean
Dim jobs As ArrayList
Dim iJob As Integer
Dim bExists As Boolean = False
jobs = publication.EnumMergeDynamicSnapshotJobs()
For iJob = 0 To jobs.Count - 1
snapshotAgentJob = DirectCast(jobs(iJob), MergeDynamicSnapshotJob)
If snapshotAgentJob.DynamicFilterHostName = hostName Then
'Run the Job
bExists = True
Dim server As New Server(distributorConn)
Dim job As New Job(server.JobServer, snapshotAgentJob.Name)
MessageBox.Show(String.Format("About to run the dynamic snapshot job: {0} with status: {1}", job.Name, job.State.ToString))
Try
job.Start()
Catch ex As Exception
MessageBox.Show(ex.ToString)
Throw
End Try
Exit For
End If
Next
Return bExists
End Function
End Class
View 7 Replies
Mar 28, 2007
I posted last year about getting an Expand All/Collapse All link working in reports, and it does work perfectly for on-demand reporting. I was able to get these working using a boolean report parameter and the Jump to Report option going back to the same report while flipping that parameter.
The issue I have just discovered is that when a report with these Expand All/Collapse All links is in a report snapshot, clicking the links causes the report to be re-rendered from the datasource at the time the link is clicked. This is a problem for one of the datasources we are using because the data is always changing, so when the report re-renders, the counts and data returned are completely different from what they were when the snapshot was first created. As time goes on, the data will no longer even be in the datasource, since it only retains 4 months of data.
Is there another way to get all detail groups to expand/collapse at once that does not require the report to be completely re-rendered from the datasource?
View 1 Replies
Feb 2, 2007
I am using SQL2005 merge replication and have pull subscribers on a low bandwidth link
I am compressing the snapshot into an alternate folder. Files are not put into the default folder
When I start a synchronization, I would expect the cab file to be copied to the subscriber and then the files to be extracted locally at the subscriber in order to apply the snapshot
However, what appears to be happening is that the files are being extracted from the cab file on the publisher (in a UNC specified directory) and then copied in their uncompressed form to the subscriber - resulting in an extremely slow snapshot application.
Any ideas what I am doing wrong? I have read about the options for using FTP to transfer snapshot files, but I am not clear whether I have to use FTP in order to transmit a compressed snapshot. I don't want to use FTP unless I need to.
Any suggestions would be much appreciated
aero1
View 8 Replies
Sep 27, 2006
Hi everybody, I'm quite new to SQL 2005 and I'm trying to understand some key concepts regarding replicas. I need to develop an application with characteristics similar to the Sales Order Sample for Merge Replication; on the client side it should run with the Express edition of SQL Server, and synchronization should work only via the web. I tried to run the sample but got an exception in the CreateSubscription method on invoking publisherConn.Connect();
TITLE: Microsoft.SqlServer.ConnectionInfo
------------------------------
Failed to connect to server XXX.
------------------------------
ADDITIONAL INFORMATION:
Login failed for user ''. The user is not associated with a trusted SQL Server connection. (Microsoft SQL Server, Error: 18452)
I can't understand how this connection could work if it isn't aware of the fact that it should use HTTPS to connect to the server.
Can someone point me in the right direction?
Thanks
Angelo
View 6 Replies
Feb 12, 2007
Hi,
I'm developing a data mart and I'm experiencing a performance gap between my fact table and its snapshot.
I create the snapshot with the statement:
CREATE DATABASE DB_SNAP ON
( NAME = DB_SNAP_Data, FILENAME =
'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\DB_SNAP_Data.ss' )
AS SNAPSHOT OF DB;
And it works. But queries executed on the snapshot run very slowly.
Can anyone tell me why?
Thanks.
F.
View 4 Replies