Cache Entire Table Then Make Filtered Queries Against It
Feb 11, 2008
Hello,
I have a table that I want to cache. So, if "SELECT * FROM myTable" is the query I create the cache from, how can I then filter the data?
I really need to cache the whole table, since there are a myriad of different statements being executed against the table, so just caching a specific query won't do.
I've found the best way to ask a question is to write it in code, so how can I do the following:
SELECT ColNames FROM MyTableWhichShouldNowBeCached WHERE whereColumns=someParams.
ColNames and someParams vary across roughly 120 different queries. I use SQL Server 2005 Express Edition with Advanced Services.
I have an SQL Server 2000 database. There is a table for User Logins. I want the query to be executed as case-sensitive. I mean, if the database has a username like "USER1" and the user enters his username as "user1", then the login should fail.
So, is there some way I can make SQL Server compare strings case-sensitively? Please help.
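One option is to force a case-sensitive collation on the comparison itself. A minimal sketch, assuming a UserLogins table with UserName and UserPassword columns (these names are placeholders, not your schema):

SELECT UserID
FROM UserLogins
WHERE UserName = @UserName COLLATE SQL_Latin1_General_CP1_CS_AS
  AND UserPassword = @UserPassword COLLATE SQL_Latin1_General_CP1_CS_AS
-- the _CS_ collation makes the comparison case-sensitive; the default _CI_ collations are case-insensitive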
The Microsoft documentation on Buffer Pool Extensions with SSDs in SQL Server 2014 says that only clean pages are written to disk... Does this mean data pages that have not been modified yet? Or also data pages that have already been modified, where the log has finished writing and the transaction has been marked as committed?
And why are clean data pages being written to the L2 cache to make space for other unmodified pages? Shouldn't they be modified first, before letting other unmodified data pages into the cache? They still have to be modified at some point, so it makes no sense to me to page them out and page them in again just to make room for other data pages...
Hi! I have some data in a table and I want to add these records to another table. I already tried SELECT ... INTO ..., but that way I lose all the data already in the other one. I need to ADD the data from one table to the other. Any suggestions? Thx
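A minimal sketch of appending rows instead of using SELECT ... INTO (which always creates a brand-new table); the table and column names are placeholders:

INSERT INTO TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM SourceTable
-- appends the rows to TargetTable and leaves its existing data in place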
Hi, I need to lock a table so that inserts are prevented, as well as deletes and updates. At present I'm thinking this might do it:
SELECT * FROM myTable WITH (UPDLOCK)
but then again I'm not sure whether this will cover the insert case.
Thanks, Robin
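For what it's worth, UPDLOCK on its own does not block new INSERTs. A sketch of one way to block inserts, updates and deletes for the duration of a transaction, using an exclusive table lock:

BEGIN TRAN
SELECT * FROM myTable WITH (TABLOCKX, HOLDLOCK)
-- other work while the table stays exclusively locked...
COMMIT TRAN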
How can I make a table show all of its rows on one page? And if the table cannot show all of its rows because it reaches the end of the page, how can I make it continue on the next page?
I am issuing the following statement outside the BEGIN TRAN block in a stored procedure:
SELECT * FROM table1 WITH (HOLDLOCK) WHERE column1 = @colValue
I have the following two questions regarding this situation:
Being outside the BEGIN TRAN block, will the SELECT statement cause a lock to be held, or does it have to be within the BEGIN TRAN block for the lock to be held? Will the SELECT statement hold a lock only on the row or rows matching the WHERE clause, or might it lock a page or even the entire table in this situation? Thanks
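A small sketch of the difference: under the default autocommit mode, locks taken by a standalone SELECT are released as soon as the statement completes, so HOLDLOCK only changes things inside an explicit transaction:

BEGIN TRAN
SELECT * FROM table1 WITH (HOLDLOCK) WHERE column1 = @colValue
-- the shared locks are now held until COMMIT/ROLLBACK instead of being released at statement end
COMMIT TRAN

Whether those shared locks sit at row, page or table level is up to the engine; it depends on how many rows qualify and whether lock escalation kicks in.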
I'm a newbie to script writing. I'm trying to write a script to copy all data from a table to the same table in a second database. Both databases are on the same server and are identical in design. I can do this with DTS but wanted a script I could email to a user to run in Query Analyzer.
Example: copy the entire table called 'Customers' in the 'Data01' database to the table 'Customers' in the 'Data02' database. I want to overwrite all data in the destination table. Thanks
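A minimal sketch that could go in Query Analyzer, assuming the two tables really are identical and the destination should be overwritten (DELETE instead of TRUNCATE if foreign keys reference the table, and SET IDENTITY_INSERT handling if Customers has an identity column):

USE Data02
TRUNCATE TABLE dbo.Customers
INSERT INTO Data02.dbo.Customers
SELECT * FROM Data01.dbo.Customers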
The problem that I am having is that I am not getting the results from the entire table, just a small subset. The SQL command is "Select * from login", which has always implied to me: get everything, to the end of the table, or until I tell you to stop. I am getting exactly 10 items back, and then it stops. I am including the code below in the hope that someone can spot what it is that I am doing wrong.
The purpose of the routine is to convert a user login database into a new, expanded format.
TIA, Tom
Imports Common.Utility.Utils
Partial Class UserDatabaseConversion
Inherits System.Web.UI.Page
Dim sqlConn As SqlConnection = getSQLConnection(ConfigurationManager.AppSettings("utilityDbName"))
Dim sqlConnN As SqlConnection = getSQLConnection(ConfigurationManager.AppSettings("utilityDbName"))
Dim sqlStringOld As String = "Select * from login"
Dim sqlStringNew As String = "usp_UT_AddNewUser"
Dim buildCnt As Integer = 0
Dim oldUserCnt As Integer = 0
Dim newUserCnt As Integer = 0
Dim oLocation As String = Nothing
Dim errors As Integer = 0
Dim mesg As String = Nothing
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
I'm getting this error when trying to set up a cache dependency... are there any special permissions etc.?
From CS:
SqlCacheDependency dep = new SqlCacheDependency("MySite-Cache", "Products");
Cache.Insert("Products", de.GetAllProductsList(), dep);
From connectionStrings.config:
<add name="SiteDB" connectionString="Data Source=localhost,[port]SQLEXPRESS;Integrated Security=true;User Instance=true; AttachDBFileName=|DataDirectory|ASPNETDB.MDF" providerName="System.Data.SqlClient" />
Also tried this using my machine name:
<add name="SiteDB" connectionString="Data Source=<machinename>,[port]SQLEXPRESS;Integrated Security=true;User Instance=true; AttachDBFileName=|DataDirectory|ASPNETDB.MDF" providerName="System.Data.SqlClient" />
From web.config:
<caching>
  <sqlCacheDependency enabled="true" pollTime="10000">
    <databases>
      <add name="MySite-Cache" connectionStringName="SiteDB" pollTime="2000"/>
    </databases>
  </sqlCacheDependency>
</caching>
EDIT: So, making progress, but I can't seem to get the table registered for cache dependency. The sample I have says
aspnet_regsql.exe -E -S .SqlExpress -d aspnetdb -t Customers -et
and the command-line response is "Enabling the table for SQL cache dependency... An error has happened. Details of the exception: The table 'Customers' cannot be found in the database."
Where does this "Customers" table come from? There is obviously not an application-specific "Customers" table in aspnetdb. I'm confused, probably more by the example than anything...
Is there a way to drop clean buffers at the database level instead of the server/instance level, like the undocumented "DBCC FLUSHPROCINDB (@dbid)"? Is there a workaround for "dbo" to be able to flush procedure and data cache without being elevated to the "sysadmin" server role?
PS: I am aware of the sp_recompile option that can be used to invalidate cached execution plans. Thx.
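A sketch of the undocumented per-database plan flush mentioned above (it clears cached plans for one database only; I'm not aware of a per-database equivalent of DBCC DROPCLEANBUFFERS):

DECLARE @dbid int
SET @dbid = DB_ID()          -- current database
DBCC FLUSHPROCINDB(@dbid)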
This SQL statement, though carefully written to delete only selected rows, deletes the entire A_Shift_Times table:
DELETE FROM A_Shift_Times
WHERE EXISTS (
    SELECT 1
    FROM Users
    WHERE (A_Shift_Times.time_in >= CONVERT(DATETIME, '2000-05-29 00:00:00', 102))
      AND ((Users.User_Password LIKE N'mrr%') OR (Users.User_Password LIKE N'work%'))
)
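The EXISTS subquery is not correlated to a particular user per shift row, so as soon as any Users row matches one of the password patterns, the EXISTS is true and every A_Shift_Times row that passes the date test is deleted. A sketch of a correlated form, assuming (hypothetically) that A_Shift_Times carries a User_ID column matching Users.User_ID:

DELETE FROM A_Shift_Times
WHERE A_Shift_Times.time_in >= CONVERT(DATETIME, '2000-05-29 00:00:00', 102)
  AND EXISTS (
      SELECT 1
      FROM Users
      WHERE Users.User_ID = A_Shift_Times.User_ID   -- User_ID is an assumed linking column
        AND (Users.User_Password LIKE N'mrr%' OR Users.User_Password LIKE N'work%')
  )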
We are using SQL 2000 SP2, clustered on two Compaq servers. We have to modify our database schema every month. We are using sp_repladdcolumn and sp_repldropcolumn to modify tables without a snapshot.
However, I would like to add or drop an entire table in the published schema without using a snapshot. Is that possible?
I'm trying to create a query that searches an entire database for keywords inside the columns of each table within the database. For instance, my tables have two columns, one named ID and the other Permission. I'd like it to be able to return all the rows that are associated with that keyword. So if I search for "Schedule", it returns all the rows containing that word anywhere within that database.
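A rough sketch of one approach: build a LIKE query for every character column from INFORMATION_SCHEMA.COLUMNS and run the lot with dynamic SQL. The @Keyword variable and the output shape are my own choices:

DECLARE @Keyword nvarchar(100)
SET @Keyword = N'Schedule'

DECLARE @sql nvarchar(max)
SET @sql = N''

SELECT @sql = @sql
    + N'SELECT ''' + TABLE_SCHEMA + N'.' + TABLE_NAME + N''' AS TableName, * FROM '
    + QUOTENAME(TABLE_SCHEMA) + N'.' + QUOTENAME(TABLE_NAME)
    + N' WHERE ' + QUOTENAME(COLUMN_NAME) + N' LIKE N''%' + @Keyword + N'%'';' + CHAR(13)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar')

EXEC sp_executesql @sql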
I usually go for the largest data file and then query in there... But now I need to automate it for several instances... I need to be able, with one script, to quickly retrieve the top 5 largest tables for the entire instance, not per database...
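A sketch using the undocumented sp_MSforeachdb to visit every database and then taking the TOP 5 across the whole instance; the temp-table layout is my own:

CREATE TABLE #TableSizes (DatabaseName sysname, TableName sysname, ReservedPages bigint)

EXEC sp_MSforeachdb N'
    USE [?];
    INSERT INTO #TableSizes (DatabaseName, TableName, ReservedPages)
    SELECT DB_NAME(), OBJECT_NAME(object_id), SUM(reserved_page_count)
    FROM sys.dm_db_partition_stats
    WHERE OBJECTPROPERTY(object_id, ''IsUserTable'') = 1
    GROUP BY object_id;'

SELECT TOP 5 DatabaseName, TableName, ReservedPages * 8 / 1024 AS ReservedMB
FROM #TableSizes
ORDER BY ReservedPages DESC

DROP TABLE #TableSizes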
I am looking at the plan caches/cached pages from the perspective of sys.dm_os_memory_cache_counters and the SQL Server:Plan Cache - Cache Pages performance counter.
For the first one I am using
SELECT (SUM(single_pages_kb) + SUM(multi_pages_kb))
FROM sys.dm_os_memory_cache_counters
WHERE type = 'CACHESTORE_SQLCP' OR type = 'CACHESTORE_OBJCP'
which is a slight change from a query in http://blogs.msdn.com/sqlprogrammability/
For the second just perfmon.
The first one gives me a count of about 670,000 pages just for the object and query cache, and the second one gives me a total of about 100,000 pages for five types of caches, including object and query.
If I am using the query from http://blogs.msdn.com/sqlprogrammability/ to determine the plan cache size,
SELECT (SUM(single_pages_kb) + SUM(multi_pages_kb)) * 8 / (1024.0 * 1024.0) AS plan_cache_in_GB
FROM sys.dm_os_memory_cache_counters
WHERE type = 'CACHESTORE_SQLCP' OR type = 'CACHESTORE_OBJCP'
it gives me about 5 GB, when in fact my SQL Server can access only a max of 2 GB, with Total and Target Server Memory at about 1.5 GB.
Using SQL Server 2000 SP1 with 4 GB max memory allocated to the instance. The problem is that one large table is hogging cache and it's dragging down overall query performance. I realise it's in cache because it's getting queried regularly. However, I need to know what options exist to get around this problem - to free up some cache for other tables and indexes. Of course, there is the option of archiving off some of the data in the table to reduce its size, and we will look at doing this, although it will not be as easy as it sounds.
I can imagine that there must be many databases that have at least one large table that is getting hit regularly and is left in cache more or less permanently. Therefore, I can't believe I have an unusual problem.
My source has 2.2 million records and I'm performing an incremental load. In the Lookup transformation I used the destination table as the reference, using Full cache mode. The first time, the package executed successfully, but when I executed the package a second time it suddenly hung while running. Then I truncated the data from the destination table and restarted the SQL Server services. After doing all this I executed the package again and it worked, but when I executed the package a second time, it hung again. I have a laptop with 8 GB RAM and an i5 2.5 GHz processor.
Hi All, Any assistance would be greatly appreciated.
I have a current table which I create on a regular basis from a text file with a layout similar to this:
Type  Policy #  Amount  Rider 1 Amt  Rider 2 Amt
B     1112H     24.34   12           12.34
This text file is brought into a staging table with each field (even the amount field) as a varchar (12). I then assign types in a later step in my DTS package.
What I need to do is stack the riders under each policy so for each policy where there is a rider, there is a new row for every rider. So in the example I've given, there would be 2 additional rows for the original first row since there are two riders:
Type  Policy #  Amount
B     1112H     24.34
R1    1112H     12
R2    1112H     12.34
I plan on doing this by first creating a table with just the Type, Policy #, and Amt fields, and then using a series of insert queries where I take the rider (if there is one) and append it onto the table.
However, I'm getting the following error message when I try: Server: Msg 213, Level 16, State 4, Line 1 Insert Error: Column name or number of supplied values does not match table definition.
Basically, it wouldn't let me put an 'R1' in the Type column. How can I get this to work!?!?
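A sketch of one of the append steps, with hypothetical staging and target names; listing the target columns explicitly is what avoids Msg 213 (the supplied values have to line up with the table definition, so 'R1' lands in the Type column):

INSERT INTO PolicyAmounts (Type, PolicyNum, Amt)
SELECT 'R1', PolicyNum, Rider1Amt
FROM StagingTable
WHERE Rider1Amt IS NOT NULL AND LTRIM(RTRIM(Rider1Amt)) <> ''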
I want to use a Make Table Query to generate a new read-only Table. This I can do, but my problem is that I want to be able to name the new Table containing only the records from a calendar year, e.g. Rents2001, and the next year another new table would be created and called Rents2002, and the next year Rents2003...
I require the Table to be generated yearly. I know I could do this in other ways but I really require the Table as once I have it I will be doing other things with it to give the final report.
Any suggestions how I can generate the Table with the YEAR being in the Table Name as part of running the Make Table Query? Thanks
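If this is running against SQL Server, a dynamic SQL sketch is one way to get the year into the table name; the Rents table and RentDate column are assumptions:

DECLARE @year char(4), @sql nvarchar(500)
SET @year = CAST(YEAR(GETDATE()) AS char(4))
SET @sql = N'SELECT * INTO Rents' + @year
         + N' FROM Rents WHERE YEAR(RentDate) = ' + @year
EXEC (@sql)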
What is the best way (short of backing up the entire DB) to make a copy of a Table so that it can be easily restored? We have a table that we want to make some serious changes to, but I want to make sure I can restore it if I need to (if the changes don't work).
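A quick sketch (note that SELECT ... INTO copies the data but not indexes, constraints or triggers); the backup table name is arbitrary:

SELECT * INTO MyTable_Backup FROM MyTable
-- to roll the data back later:
-- TRUNCATE TABLE MyTable
-- INSERT INTO MyTable SELECT * FROM MyTable_Backup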
In my application I need to access multiple tables. I'm writing a stored procedure in which I need to access tables such as TB01, TB02, ... basically TBFY, where FY is a parameter.
I tried OPENQUERY, but that requires me to add a linked server, which I don't think is a good idea.
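A sketch of building the table name inside the procedure with dynamic SQL instead of OPENQUERY; the procedure name and the SELECT list are placeholders:

CREATE PROCEDURE dbo.GetFYData
    @FY char(2)
AS
BEGIN
    DECLARE @sql nvarchar(200)
    SET @sql = N'SELECT * FROM dbo.' + QUOTENAME(N'TB' + @FY)
    EXEC sp_executesql @sql
END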
Link (GroupID int, MemberID int)
Member (MemberID int, MemberName varchar(50), GroupID varchar(255))
The Link table contains the records showing which Member is in which Group. One particular Member can be in multiple Groups and also a particular Group may have multiple Members.
The Member table contains the Member's ID, Member's Name, and a GroupID field (that will contain comma-separated Group IDs, showing which Groups the particular Member is in).
We have the Link table ready, and the Member table with the first two fields is also ready. What we have to do now is fill the GroupID field of the Member table from the Link table.
For instance,
Read all the GroupID values from the Link table for a MemberID, make a comma-separated string of the GroupIDs, then update the GroupID field of the corresponding Member in the Member table.
Please help me with a sql query or procedures that will do this job. I am using SQL SERVER 2000.
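A sketch that works on SQL Server 2000: a scalar function that concatenates the GroupIDs for one member, then a single UPDATE; the function name is my own:

CREATE FUNCTION dbo.GetGroupList (@MemberID int)
RETURNS varchar(255)
AS
BEGIN
    DECLARE @list varchar(255)
    SELECT @list = ISNULL(@list + ',', '') + CAST(GroupID AS varchar(10))
    FROM Link
    WHERE MemberID = @MemberID
    RETURN @list
END
GO

UPDATE Member
SET GroupID = dbo.GetGroupList(MemberID)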
I am doing a shopping basket type demo. I am having difficulty going from a table with fields like (Name, ProductCode) to a table (Name, ProductCode, NumberOfItems). One way could be to add just the Name and ProductCode fields and let NumberOfItems come automatically; it could have some common value like 1. How should I do this with VB?
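The question is about the VB side, but as a sketch of what the table logic could look like (the Basket table name and the parameters are assumptions): give NumberOfItems a starting value of 1 and increment it when the same product is added again.

IF EXISTS (SELECT 1 FROM Basket WHERE Name = @Name AND ProductCode = @ProductCode)
    UPDATE Basket
    SET NumberOfItems = NumberOfItems + 1
    WHERE Name = @Name AND ProductCode = @ProductCode
ELSE
    INSERT INTO Basket (Name, ProductCode, NumberOfItems)
    VALUES (@Name, @ProductCode, 1)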
I want to take a table (SQL Server) and show all its data in the form of INSERT ... statements (a text file), so that I can show it on a web page, paste the INSERT text into a MyTableData.sql file, and then just run this script in Query Analyzer to fill my table with data.
I need it so I can back up my DATA in both English and other languages... (a replacement for the DTS packages, which give me a hard time with the LOCALE / UNICODE translation).
Has anyone already made such a program? Where can I find something like this?
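Until a ready-made tool turns up, a hand-rolled sketch for a single, known table; MyTable and its two columns are made up, and a real script would read the column list from INFORMATION_SCHEMA.COLUMNS. The N prefix in the generated literals keeps the Unicode data intact:

SELECT 'INSERT INTO MyTable (ID, Name) VALUES ('
       + CAST(ID AS varchar(10)) + ', N'''
       + REPLACE(Name, '''', '''''') + ''')'
FROM MyTable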