Can someone share how to set up an event-driven cache refresh? The options provided in the SSRS interface are based on a fixed schedule. If I would like to set up a dependency that determines when to refresh report caching, based on the successful refresh of the underlying SSAS cube, how can I achieve it?
Is there a utility/API provided by SSRS? Any sample scripts are very much appreciated.
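One approach people use (it is not an officially documented API, so treat this only as a sketch) is to create a subscription that refreshes the cached copy of the report, and then fire that subscription's event from the job that processes the cube by calling the AddEvent procedure in the ReportServer database. The report name below is a placeholder:

DECLARE @EventData nvarchar(260)

-- Look up the subscription attached to the report whose cache should be refreshed
SELECT @EventData = CONVERT(nvarchar(36), s.SubscriptionID)
FROM ReportServer.dbo.Subscriptions s
JOIN ReportServer.dbo.[Catalog] c ON c.ItemID = s.Report_OID
WHERE c.[Name] = 'MyCubeReport'   -- placeholder report name

-- Queue the event so Reporting Services runs the subscription immediately
EXEC ReportServer.dbo.AddEvent
     @EventType = 'TimedSubscription',
     @EventData = @EventData

Running this as the last step of the cube-processing job gives the event-driven behaviour that the schedule-based UI does not offer.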
Has anyone explored a process whereby the transaction log would be backed up based upon a defined threshold, i.e. 75% full? All the research against newsgroup posts and SQL 2000 literature seems to point to scheduling a log backup job on a periodic basis. My workflow isn't consistent around the clock, and even adjusting the interval during the day isn't a good answer when the multiple databases on a server fill their logs at different rates.
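A sketch of one way to do this with a SQL Agent job on SQL 2000: poll DBCC SQLPERF(LOGSPACE) and only back up the logs that have crossed the threshold. The database name and backup path are placeholders:

CREATE TABLE #logspace
(
    DatabaseName    sysname,
    LogSizeMB       float,
    LogSpaceUsedPct float,
    Status          int
)

-- DBCC SQLPERF(LOGSPACE) reports the percentage of each database's log that is in use
INSERT #logspace EXEC ('DBCC SQLPERF(LOGSPACE)')

IF EXISTS (SELECT 1 FROM #logspace
           WHERE DatabaseName = 'MyDatabase' AND LogSpaceUsedPct >= 75)
BEGIN
    BACKUP LOG MyDatabase TO DISK = 'E:\Backups\MyDatabase_log.trn'
END

DROP TABLE #logspace

Another option on SQL 2000 is a SQL Agent performance-condition alert on the SQLServer:Databases "Percent Log Used" counter, set to run the log backup job for that database when it rises above 75.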
I need to send some reports only when there is data in the relevant tables. This is totally random and could be at any time during a given day, on any day of the week. This involves multiple reports, and the user is adamant that she does not have time to initiate the reports herself. Any ideas how this can be accomplished?
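One possibility, sketched here with placeholder names: create an ordinary subscription for each report, then have a SQL Agent job check the relevant table and fire the subscription's event (via the undocumented AddEvent procedure in the ReportServer database) only when rows exist:

IF EXISTS (SELECT 1 FROM dbo.RelevantTable)   -- placeholder table
BEGIN
    DECLARE @EventData nvarchar(260)

    SELECT @EventData = CONVERT(nvarchar(36), s.SubscriptionID)
    FROM ReportServer.dbo.Subscriptions s
    JOIN ReportServer.dbo.[Catalog] c ON c.ItemID = s.Report_OID
    WHERE c.[Name] = 'MyReport'               -- placeholder report name

    EXEC ReportServer.dbo.AddEvent
         @EventType = 'TimedSubscription',
         @EventData = @EventData
END

On Enterprise Edition, data-driven subscriptions can achieve the same thing without the Agent job, but the check above works on Standard as well.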
I'm getting this error when trying to set up a cache dependency... are there any special permissions etc.?
From CS:
SqlCacheDependency dep = new SqlCacheDependency("MySite-Cache", "Products");
Cache.Insert("Products", de.GetAllProductsList(), dep);
From connectionStrings.config:
<add name="SiteDB" connectionString="Data Source=localhost,[port]SQLEXPRESS;Integrated Security=true;User Instance=true; AttachDBFileName=|DataDirectory|ASPNETDB.MDF" providerName="System.Data.SqlClient" />
Also tried this using my machine name:
<add name="SiteDB" connectionString="Data Source=<machinename>,[port]SQLEXPRESS;Integrated Security=true;User Instance=true; AttachDBFileName=|DataDirectory|ASPNETDB.MDF" providerName="System.Data.SqlClient" />
From web.config:
<caching>
  <sqlCacheDependency enabled="true" pollTime="10000">
    <databases>
      <add name="MySite-Cache" connectionStringName="SiteDB" pollTime="2000"/>
    </databases>
  </sqlCacheDependency>
</caching>
EDIT: So, making progress, but I can't seem to get the table registered for cache dependency. The sample I have says
aspnet_regsql.exe -E -S .\SqlExpress -d aspnetdb -t Customers -et
and the command-line response is "Enabling the table for SQL cache dependency... An error has happened. Details of the exception: The table 'Customers' cannot be found in the database."
Where does this "Customers" table come from? There is obviously not an application-specific "Customers" table in aspnetdb. I'm confused, probably more by the example than anything....
Is there a way to drop clean buffers at the database level instead of the server/instance level, like the undocumented "DBCC FLUSHPROCINDB (@dbid)"? Is there a workaround for "dbo" to be able to flush the procedure and data cache without being elevated to the "sysadmin" server role?
PS: I am aware of the sp_recompile option that can be used to invalidate cached execution plans. Thx.
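For what it's worth, a minimal sketch of the sp_recompile workaround mentioned above; as far as I know it does not require sysadmin (on 2005 it needs ALTER permission on the object), but it only invalidates the plans for one object at a time rather than for the whole database:

-- Marks the cached plans that reference this object for recompilation
-- (object name is a placeholder)
EXEC sp_recompile N'dbo.MyProcedure'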
I am looking at the plan cache/cached pages from the perspective of sys.dm_os_memory_cache_counters and the SQL Server:Plan Cache - Cache Pages perfmon counter.
For the first one I am using
select (sum(single_pages_kb) + sum(multi_pages_kb)) from sys.dm_os_memory_cache_counters where type = 'CACHESTORE_SQLCP' or type = 'CACHESTORE_OBJCP'
(a slight change from a query in http://blogs.msdn.com/sqlprogrammability/)
For the second just perfmon.
The first one gives me a count of about 670,000 pages just for the object and query cache, and the second one gives me a total of about 100,000 pages for five types of caches, including object and query.
If I am using the query from http://blogs.msdn.com/sqlprogrammability/ to determine the plan cache size
select (sum(single_pages_kb) + sum(multi_pages_kb) ) * 8 / (1024.0 * 1024.0) as plan_cache_in_GB from sys.dm_os_memory_cache_counters where type = 'CACHESTORE_SQLCP' or type = 'CACHESTORE_OBJCP'
it gives me about 5 GB, when in fact my SQL Server can access at most 2 GB, with Total and Target Server Memory at about 1.5 GB.
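For comparison, a variant I would try, on the assumption that single_pages_kb and multi_pages_kb are already kilobytes (as the column names suggest); in that case dividing by 8 gives pages and dividing by 1024 * 1024 gives GB, instead of multiplying by 8:

SELECT (SUM(single_pages_kb) + SUM(multi_pages_kb)) / 8.0               AS plan_cache_pages,
       (SUM(single_pages_kb) + SUM(multi_pages_kb)) / (1024.0 * 1024.0) AS plan_cache_in_GB
FROM sys.dm_os_memory_cache_counters
WHERE type = 'CACHESTORE_SQLCP' OR type = 'CACHESTORE_OBJCP'

If that assumption holds, the page count from the DMV and the perfmon Cache Pages counter should land much closer together.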
My SQL Server 2005 SP4 on Windows 2008 R2 is flooded with the errors below:
Date: 10/25/2011 10:55:46 AM
Log: SQL Server (Current - 10/25/2011 10:55:00 AM)
Source: spid
Message: Event Tracing for Windows failed to send an event. Send failures with the same error code may not be reported in the future. Error ID: 0, Event class ID: 54, Cause: (null).
Is there a way I can trace where this is coming from? When I check the input buffer for these SPIDs, it looks like it is tracing everything. All the general application DMLs are coming in on these SPIDs.
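As a starting point (only a guess), it may help to list the traces currently defined on the instance, since an ETW send failure usually points at one of them:

-- Shows every server-side trace on the instance, including the default trace
-- and any rowset/ETW consumers
SELECT id, status, path, is_rowset, is_default,
       start_time, event_count, dropped_event_count
FROM sys.traces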
I have been testing with the WMI Event Watcher Task, so that I can identify a change to a file. The WQL is thus:
SELECT * FROM __InstanceModificationEvent within 30 WHERE targetinstance isa 'CIM_DataFile' AND targetinstance.name = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\AdventureWorks.bak'
This polls every 30 seconds, and in the SSIS event (ActionAtEvent in the WMI task is set to fire the SSIS event) I have a simple script task that shows a message box.
My understanding is that the event polls every 30 seconds and, if there is a change to the AdventureWorks.bak file, the event is triggered and the script task runs, producing the message. However, when I run the package the message appears every 30 seconds, meaning the event is continually firing even though there has been NO change to the AdventureWorks.bak file.
Am I correct in my understanding of how this should work, and if so, why is the event firing when it should not?
Server 2003 SE SP1 5.2.3790, SQL Server 2000 SP4, 8.00.2187 (latest hotfix rollup). We fixed one issue, but it brought up another. The fix we applied stopped the ServicesActive access failure, but now we have a failure on MSSEARCH. The users this is affecting do NOT have admin rights on the machine; they are SQL developers. We were having:
Event Type: Failure Audit
Event Source: Security
Event Category: Object Access
Event ID: 560
Date: 5/23/2007
Time: 6:27:15 AM
User: domain\user
Computer: MACHINENAME
Description: Object Open:
Object Server: SC Manager
Object Type: SC_MANAGER OBJECT
Object Name: ServicesActive
Handle ID: -
Operation ID: {0,1623975729}
Process ID: 840
Image File Name: C:\WINDOWS\system32\services.exe
Primary User Name: MACHINE$
Primary Domain: Domain
Primary Logon ID: (0x0,0x3E7)
Client User Name: User
Client Domain: Domain
Client Logon ID: (0x0,0x6097C608)
Accesses: READ_CONTROL, Connect to service controller, Enumerate services, Query service database lock state
We recently upgraded to SQL 2005 from SQL 2000. We have most of our issues ironed out however about every 1 minute there is a message in the Application Event log and the SQL log that states:
EVENT ID 18456: Login failed for user DOMAIN\ACCOUNT [CLIENT: <local machine>]
This is a state 16 message which I thought meant that the account does not have access to the default database. The account is actually the account that the SQL services run under.
Any ideas? We can't seem to figure this one out. We actually upgraded to 2005 from 2000 and had an error appear after every reboot that prevented the SQL Agent from running ("This application has failed to start because GAPI32.dll was not found. Re-installing the application may fix this problem."). We did a full uninstall of SQL, reinstalled fresh, restored the databases from .bak files, and that is when the EVENT ID 18456 started occurring every minute.
We don't have any SQL heavy hitters here, so please be detailed with any possible solutions. Thank you very much for any help you can provide!
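If the state 16 diagnosis above is right, one simple thing to try is pointing the service account's login at a database that definitely exists after the rebuild. A minimal sketch, using the login name from the error message:

-- Point the login at master so it always has a reachable default database
ALTER LOGIN [DOMAIN\ACCOUNT] WITH DEFAULT_DATABASE = master

-- Older syntax that does the same thing:
-- EXEC sp_defaultdb N'DOMAIN\ACCOUNT', N'master'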
Hi. I have a data source that depends on parameter A. Parameter A gets its values from a query and has a default value. Parameter B gets its default value from a query that depends on parameter A.
Now, when I run the report, parameter A gets a value, then parameter B gets its value, and the data source runs OK.
But the problem is that when I change the A value (from the value list), the data source runs fine but parameter B keeps the old value, without any change.
Does anyone know how to solve this? I'll be happy to give more explanation if it isn't clear enough.
I am wondering if it is possible to have a report generated by RS refresh periodically and automatically. This could be realized by inserting a few lines of JavaScript into the report, including the reload() function, but I do not know if there is any way to do such a thing.
Hi, I'm new to SQL Express, but I have created a table and a stored proc to populate it. I have dragged the table into a form so I can view the data in a GridView. I have added a button to add new rows to the table. All of the above works fine, except that when I hit add, the data gets added but the GridView doesn't update and show the new data. Is there some code I can add to the add button that also refreshes the GridView? Thanks, Mike
I had an interesting problem come up today. I have a report that, when you preview it in VS or click View Report on the report server, continually refreshes itself. I cannot see that I have written anything different in this report than in any other report.
I saw one other post here on a continually refreshing report, but that was linked to a document map, and this report has no document map.
Has anyone seen this problem of a continually refreshing report?
It's not a big deal as I figure I will just have to write it from scratch again, but I am interested to see if I can stop it before I re-write it.
and every time I restart and put it back up, it rewinds the database.
The only way I can prevent this is if I restart the server,
wait a few hours, then put it back up, "which makes it the same way it was when I logged off".
I was told that SQL database servers refresh every hour.
Is there a way I can make it refresh before I restart it, so that I can put the server straight back up?
If I restart the server back up straight away, it does a rewind for some reason.
tech ::
The server does a rewind because it wouldn't have saved the game, so to speak - as in, updated everyone's characters. So yes, waiting is the only way, as far as I know, to save the game.
I wrote a stored procedure for SQL Server 2000 and I am using it for paging purposes. The procedure is as follows:
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER PROCEDURE [dbo].[OLAP_PagedRows]
(
@SelectFields nVarchar(2000) =NULL,
@GroupByFields nvarchar(1000) =NULL,
@BaseTable varchar(100),
@KeyColumn nvarchar(200)=NULL ,
@JoinTables varchar(500) =NULL,
@ConditionalClause varchar(1000) =NULL,
@Pagesize int = 10,
@PageNumber int =1,
@SortExpression nvarchar(200)=NULL,
@SearchText nvarchar(200)=NULL
)
AS
BEGIN
DECLARE @SQLSTMT NVarchar(4000)
DECLARE @SQLSTMT1 NVarchar(4000)
SET @SQLSTMT1 = ''
--check whether page size is given null or not, if so set to default value
IF @Pagesize IS NULL OR @Pagesize = ''
BEGIN
SET @Pagesize =10
END
--check whether page number is given null or not, if so set to default value
IF @PageNumber IS NULL OR @PageNumber = ''
BEGIN
SET @PageNumber =1
END
--Start constructing the query --
SET @SQLSTMT = 'SELECT '
SET @SQLSTMT1 = 'DECLARE @CountValue INT SELECT @CountValue = count(*) From '+@BaseTable
SET @SQLSTMT = @SQLSTMT + @SelectFields + ' FROM '+@BaseTable
If @JoinTables Is Not Null
BEGIN
SET @SQLSTMT = @SQLSTMT + ' ' +@JoinTables
SET @SQLSTMT1 = @SQLSTMT1 + ' ' +@JoinTables
END
DECLARE @StmtWhereClause nvarchar(500)
SET @StmtWhereClause =''
--------------------- Get where conditional clause
If (@SearchText Is Not Null AND RTRIM(LTRIM(@SearchText))<>'')
BEGIN
SET @StmtWhereClause = @StmtWhereClause + ' WHERE ' + @SearchText
END
If @ConditionalClause Is Not Null AND RTRIM(LTRIM(@ConditionalClause))<>''
BEGIN
IF (@StmtWhereClause <> '')
BEGIN
SET @StmtWhereClause = @StmtWhereClause + ' AND ' + @ConditionalClause -- leading space so AND does not run into the search text
END
ELSE
BEGIN
SET @StmtWhereClause = @StmtWhereClause + ' WHERE ' + @ConditionalClause
END
END
SET @SQLSTMT = @SQLSTMT + @StmtWhereClause
SET @SQLSTMT1 = @SQLSTMT1 + @StmtWhereClause
If @GroupByFields Is Not Null And RTRIM(LTRIM(@GroupByFields))<>''
BEGIN
SET @SQLSTMT = @SQLSTMT + ' Group By ' +@GroupByFields
SET @SQLSTMT1 = @SQLSTMT1 + ' Group By ' +@GroupByFields
END
IF @SortExpression Is Not Null AND RTRIM(LTRIM(@SortExpression))<>''
BEGIN
SET @SortExpression = LTRIM(RTRIM(' Order By '+ @SortExpression))
SET @SQLSTMT = @SQLSTMT +' '+ @SortExpression
SET @SQLSTMT1 = @SQLSTMT1 +' '+ @SortExpression
END
SET @SQLSTMT1= @SQLSTMT1+' SELECT @CountValue As MyRows '
--SELECT @SQLSTMT1
--SELECT @SQLSTMT
DECLARE @StartRow INT
SET @SQLSTMT = ' DECLARE temp_Cursor CURSOR SCROLL FOR '+@SQLSTMT
EXECUTE SP_EXECUTESQL @SQLSTMT
Open temp_Cursor
DECLARE @RowCount INT
SET @RowCount = 1
SET @startRow = (@PageSize * (@PageNumber-1))+@RowCount
--SELECT @startRow as 'Current Row'
WHILE @RowCount <= @PageSize
BEGIN
--Select @StartRow 'as @StartRow'
FETCH ABSOLUTE @startRow From temp_Cursor
SET @RowCount= @RowCount+1
SET @StartRow = @startRow + 1
END
deallocate temp_Cursor
EXECUTE SP_EXECUTESQL @SQLSTMT1
END
It is working fine, but I have a problem with this kind of paging. I need to load all the data into the cursor and then fetch records from it. The problem is that my tables contain more than half a million records. If I have to load this cursor every time, it will be a very big problem on the server side.
It is probably not the best solution, but SQL Server 2000 cannot provide much more help than this. If I use a sub-query for this, like using TOP <Number>, it adversely affects the nature of the data retrieval.
One solution that I am thinking of is to load the cursor once and have any updates performed on the tables the cursor reads from automatically reflected in it.
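For comparison, a cursor-free pattern that is often used for paging on SQL Server 2000 is a pair of nested TOP queries keyed on a unique, indexed column. A minimal sketch with placeholder table and column names, fetching page 3 with a page size of 10:

-- Skip the first 20 rows, then take the next 10
SELECT TOP 10 *
FROM dbo.Orders                        -- placeholder table
WHERE OrderID NOT IN (SELECT TOP 20 OrderID
                      FROM dbo.Orders
                      ORDER BY OrderID)
ORDER BY OrderID

The count query stays the same; only the page fetch changes, and no cursor has to be kept open or rebuilt between calls. Because SQL 2000 does not allow a variable with TOP, the statement still has to be built dynamically, much as the procedure above already does.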
I have a report in RS that uses a cube as a data source. I made some changes to a cube in AS 2005, and I am not sure how I can refresh my existing datasets without deleting everything.
I have a report. When I click on one item in the report, it navigates to a second report. Here the second report acts as a child report. When I click on the refresh button it goes back to the parent report. Sometimes I get this error:
" Failed to load viewstate. The control tree into which viewstate is being loaded must match the control tree that was used to save viewstate during the previous request. For example, when adding controls dynamically, the controls added during a post-back must match the type and position of the controls added during the initial request. "
Hi, I'm having serious issues trying to refresh a schema in a SqlDataSource. It is hooked to a stored procedure that takes two varchar(39) parameters. The default parameters in this case are '%'. Note I am working in the designer. If I set it up as a stored proc, I can't even get the 'test query' to run in the builder. If I set it up as a select statement, a la 'exec <procname> @p1, @p2', the 'test query' will run. In neither case will hitting refresh schema work. It returns 'Invalid length parameter passed to the substring function'. The stored proc is nothing special, simply returning a select based upon the parameters. Thanks, Nick H
Can someone tell me how you can automate the refresh of OLAP cubes? I just inherited a data warehouse running on SQL 7, and the idea of having to go in and manually refresh the cubes every day is, well... stupid. I can't believe that they've been doing this for a year. Unfortunately, I'm not familiar enough with MSOLAP yet and I can't figure this out. Any tips?
I'm getting a connection and then losing the connection upon refreshing the browser with this script connecting to MSSQL using PHP, when trying the following:
PHP Code:
$connection = mssql_connect("127.0.0.1","test","") or die("Could not connect mssql db on " .$config['dbhost']);
mssql_select_db("dbName") or die("Could not select database " ."dbName");
Are there other ways to get more error-handling detail when connecting to MSSQL?
I have created an SSRS report, but the data does not refresh. I am using SSRS Report Builder to create reports, and when I run the query in the dataset, the results return as expected. However, when I run the report itself, the dataset appears old.
Hi, I have many buttons in an Access ADP that trigger feeding a table with different data. Let's call the table tempTable1. I have 600 buttons that feed the make-table with all kinds of data:
SELECT * INTO tempTable1 FROM AnyDataSourceSPViewTable
There is another menu bar button that only opens tempTable1. The result in tempTable1 is always correct when you use Query Analyzer. However, Access ADP only recognizes the change in the table structure when the connection is refreshed. Is there any way that we can automate the connection refresh procedure, or an easier way to get the desired results?
I am using Access 2003 connected to SQL 2005 and triggering the events via VBA.
Recently we upgraded the SQL server from a 32-bit box to 64-bit, and we encountered some weird results, such as data not refreshing unless we do a manual refresh.
We just ran some migration scripts from another server to the new 64-bit server, and the script is correct; the only problem is with the data refreshing.
Please send me the solution for the problem, and let me know: is there any script we can use to refresh the database with a T-SQL command?
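If by "refresh" you mean reloading the 64-bit server's copy from a backup of the source database, a minimal sketch (database name and path are placeholders); WITH REPLACE overwrites the target, so only use it when the existing copy can be thrown away:

RESTORE DATABASE MyDatabase
FROM DISK = 'E:\Backups\MyDatabase.bak'   -- placeholder path
WITH REPLACE, RECOVERY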
We have a scenario where we need to create a package for refreshing the cubes. Let me explain it briefly.
We are doing the ETL process in a different ETL tool (not SSIS), and once the process is done, we insert a row into a table with columns such as the date and 'Completed C' as the status. Once we get this information, i.e. the 'Completed C' status and date, we need to refresh the cubes as follows:
1. The previous-day cube needs to be refreshed once daily.
2. The previous-week cube needs to be refreshed once weekly.
3. The previous-month cube needs to be refreshed once monthly.
4. The historical cube needs to be refreshed once daily.
We have decided all of the above operations need to be done in SSIS. In this case, what are the things to be done while creating the package?
1. What are the control flow items to build this in SSIS?
2. Is there any way to poll, say every 10 minutes, to check whether the table has the 'Completed C' status, and stop executing this package otherwise? (One polling sketch is shown below.)
3. Is there any way to check the date for the previous-day cube, previous-week cube, previous-month cube, historical cube, etc.?
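On the polling question, one approach is an Execute SQL Task (or a SQL Agent job step) at the start of the package that simply waits until the status row appears. A minimal sketch with placeholder table and column names:

-- Wait, in 10-minute intervals, until today's ETL row shows 'Completed C'
WHILE NOT EXISTS (SELECT 1
                  FROM dbo.EtlStatus                     -- placeholder table
                  WHERE LoadDate = CONVERT(char(8), GETDATE(), 112)
                    AND Status = 'Completed C')
BEGIN
    WAITFOR DELAY '00:10:00'
END

The daily/weekly/monthly branching after that can be handled with expressions on precedence constraints, for example DATEPART checks against the current date.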
I have been posting to the Data Presentation Controls forum for about a month regarding a problem I've been dealing with.
http://forums.asp.net/thread/1223055.aspx
What it boils down to is that on a button click event, I was updating some records, then re-executing a SELECT statement to get the records back out and rebind my DataGrids. This was happening too quickly and the data was not being updated in time before the SELECT was executed. So my grids would still display "old" data.
How do I get SQL Server to commit the UPDATE before my C# code continues?
I have an SSRS 2000 report where, when I view the report, the data does not refresh until I press the Refresh Data button in the report. Clearly this can't be right, and expecting users to press the refresh button every single time is also ridiculous.
Has anyone had this problem before, and does anyone know what to do?