SQL Server 2008 :: Temp DB Size Values From Different Tables
May 25, 2015
To find the size values of the temp database:

select * from sys.master_files
-- the size column value here is 1024 for the .mdf; the size for the .ldf is 64

select * from tempdb.sys.database_files
-- the size column value here is 3576 for the .mdf; the size for the .ldf is 224

Why is there a difference rather than the same value? Do the size columns in these two views represent different things for tempdb?
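Both views report size in 8 KB pages, not MB, and sys.master_files holds the startup size recorded in master (tempdb is rebuilt to that size at each restart), while tempdb.sys.database_files shows the current size after any autogrowth since startup. A minimal sketch to compare the two in MB:

-- size is in 8 KB pages, so size * 8 / 1024 = MB
SELECT name, size, size * 8 / 1024 AS size_mb
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');

SELECT name, size, size * 8 / 1024 AS size_mb
FROM tempdb.sys.database_files;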
Let's say that I have a stored proc that is assigned to a service broker queue and is constantly running while it waits for messages in said queue. When a message comes in on the queue, the stored proc creates a table variable based on the contents of the message and performs various operations with the data. Since the stored proc is constantly running, do the contents of this table variable ever truly get emptied? Should I be deleting the contents of the table variable at the end of the operation to ensure that stale data doesn't persist?
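For what it's worth, a table variable lives for the duration of the batch or procedure that declared it, so in a long-running activation proc the rows do persist across loop iterations unless you remove them. A hedged sketch of the pattern (names hypothetical):

DECLARE @work TABLE (id INT, payload NVARCHAR(200));

WHILE 1 = 1
BEGIN
    -- RECEIVE the next message and shred its contents into @work here ...

    -- ... perform the various operations with the data ...

    -- The table variable survives the iteration, so clear it explicitly
    -- to keep stale rows from leaking into the next message:
    DELETE FROM @work;
END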
I've recently started a new position, and our production box contains a procedure that uses 30+ temp tables. I'm currently not in a position to change this, as it's production and I would have to be granted a window to redesign it.

However, the tempDB is showing some strange activity.

Say a table #CarrierService (CarrierServiceID, DeliveryZoneID, CollectionZoneID) is created, for example.

Once the procedure is called, it will appear in the tempDB with the session info appended, as expected.

However, once the session has ended, the above table gets dropped and a new one created, e.g. #2C78E45A. I now have 7000 of these different tables in the tempDB.
When I interrogate this using

SELECT o.name, o.create_date, o.modify_date, c.name, c.column_id
FROM tempdb.sys.objects o
INNER JOIN tempdb.sys.columns c ON o.object_id = c.object_id
WHERE o.type = 'U'

I get the following results:

name       create_date       modify_date       object_id  column name
#2C78E45A  26/04/2015 18:09  30/04/2015 14:55  746120282  CarrierServiceId
#2C78E45A  26/04/2015 18:09  30/04/2015 14:55  746120282  CollectionZoneId
#2C78E45A  26/04/2015 18:09  30/04/2015 14:55  746120282  DeliveryZoneId
1) "Deferred compile" recompile event occurs because of deferred name resolution. In other words, an object referred to in the statement does not exist at compile time. Later, when the object does exist, it requires a recompile of the statement so that it can create an optimal execution plan. One example of when a deferred compile will occur is if a temporary table is used in a batch and does not exist when the first statements in the batch are compiled.
Use Database
Go
Create PROCEDURE AEM.TempTable
AS
BEGIN
    Select * into #emptemp From Database.Employees
End
Go
Exec AEM.TempTable
Select * From #emptemp

Is something like this possible? I can get the EXEC to run the "Select * into #emptemp From Database.Employees" statement, but when I try to use the temp table it doesn't see it.
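A local temp table created inside a proc is dropped when that proc returns, which would explain the behavior. Two hedged workarounds, assuming the proc can be adjusted (the column list is hypothetical):

-- Option 1: create the table in the caller's scope; the child proc can see it
-- and fill it with INSERT INTO ... SELECT instead of SELECT ... INTO.
CREATE TABLE #emptemp (EmployeeID INT, EmployeeName NVARCHAR(100));
EXEC AEM.TempTable;       -- proc body: INSERT INTO #emptemp SELECT ... FROM Database.Employees
SELECT * FROM #emptemp;

-- Option 2: capture the proc's result set directly
-- (proc body ends with SELECT * FROM Database.Employees).
INSERT INTO #emptemp
EXEC AEM.TempTable;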
Do you have a general rule of thumb for breaking a complex query into temp tables? For someone who is not a SQL specialist, a query with more than a few table joins can be complex, and one with 10+ joins can be overwhelming.

One strategy is to break the problem into pieces by grouping closely related tables into temp tables and then joining those temp tables together. This simplifies complex SQL, and although it is not as performant as one big query, it is much easier to understand. So do you have a rule of thumb for the number of joins to include in a query before you break it into temp tables?
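There is no hard threshold, but as a hedged illustration of the staging pattern (tables hypothetical):

-- Stage each cluster of closely related joins...
SELECT o.OrderID, o.CustomerID, l.ProductID, l.Qty
INTO #OrderLines
FROM dbo.Orders o
INNER JOIN dbo.OrderLines l ON l.OrderID = o.OrderID;

SELECT c.CustomerID, c.CustomerName, r.RegionName
INTO #CustomerRegions
FROM dbo.Customers c
INNER JOIN dbo.Regions r ON r.RegionID = c.RegionID;

-- ...then join the stages, keeping each step readable.
SELECT cr.CustomerName, cr.RegionName, ol.ProductID, ol.Qty
FROM #OrderLines ol
INNER JOIN #CustomerRegions cr ON cr.CustomerID = ol.CustomerID;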
I am new to SQL and I need to do a query where I sum values from 2 tables; when I do, the SUM values are not correct. This is my query:

SELECT D.Line AS Line,
       D.ProductionLine AS ProductionLine,
       D.Shift AS Shift,
       SUM(CAST(D.DownTime AS INT)) AS DownTime,
       R.Category,
       SUM(CAST(R.Downtime AS INT)) AS AssignedDowntime,
       CONVERT(VARCHAR(10), D.DatePacked, 101) AS DatePacked
FROM Production.DownTimeReason R
LEFT JOIN Production.DownTimeHistory D
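When each table has several rows per key, joining before aggregating multiplies rows and inflates the sums. A hedged sketch of aggregating each side first and then joining (the join columns are assumptions, since the original join condition is not shown):

SELECT d.Line, d.Shift, d.DatePacked, d.DownTime, r.AssignedDowntime
FROM (
    SELECT Line, Shift, CONVERT(VARCHAR(10), DatePacked, 101) AS DatePacked,
           SUM(CAST(DownTime AS INT)) AS DownTime
    FROM Production.DownTimeHistory
    GROUP BY Line, Shift, CONVERT(VARCHAR(10), DatePacked, 101)
) d
LEFT JOIN (
    SELECT Line, Shift, SUM(CAST(Downtime AS INT)) AS AssignedDowntime
    FROM Production.DownTimeReason
    GROUP BY Line, Shift
) r ON r.Line = d.Line AND r.Shift = d.Shift;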
I have a table of raw data with supplier names, and I need to join it to our supplier database and pull the supplier numbers.

The issue is that the raw data does not match our database entries for these suppliers; sometimes there are extra periods, commas, or abbreviations (e.g. FedEx, FederalExpress, FedEx, Inc.). I'm trying to create a query that will search for entries that are similar.

I tried setting a variable equal to the raw data field and then using LIKE '%@Variable%' to try to return anything that would contain it, but it didn't return any rows.
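Inside a quoted string, @Variable is literal text rather than the variable's value, so LIKE '%@Variable%' matches only rows that literally contain '@Variable'. A minimal sketch of the concatenated form (table and column names hypothetical):

DECLARE @Supplier VARCHAR(100);
SET @Supplier = 'FedEx';

SELECT SupplierNumber, SupplierName
FROM dbo.Suppliers
WHERE SupplierName LIKE '%' + @Supplier + '%';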
I have seen a bunch of ways to get the size of all the tables within a database posted on this board. I decided to modify an older one I found here (http://www.sqlteam.com/item.asp?ItemID=282). I set it up so there are no cursors or temp tables - pretty much just one select statement to return all the info you would need. It seems to be faster than anything I have seen so far. Take it for what it's worth. Thanks to the original creator.
/* Original by: Bill Graziano (SQLTeam.com)
   Modified by: Eric Stephani (www.mio.uwosh.edu/stephe40) */

declare @low int

select @low = low
from master.dbo.spt_values
where number = 1 and type = 'E'

select o.id, o.name, ro.rowcnt,
       (r.reserved * @low) / 1024 as reserved,
       (d.data * @low) / 1024 as data,
       ((i.used - d.data) * @low) / 1024 as indexp,
       ((r.reserved - d.data - (i.used - d.data)) * @low) / 1024 as unused
from sysobjects o
inner join (
    select distinct id, rowcnt
    from sysindexes
    where keys is not null and first != 0
) ro on o.id = ro.id
inner join (
    select id, sum(reserved) reserved
    from sysindexes
    where indid in (0, 1, 255)
    group by id
) r on o.id = r.id
inner join (
    select c.id, dpages + isnull(used, 0) data
    from (
        select id, sum(dpages) dpages
        from sysindexes
        where indid < 2
        group by id
    ) c
    full outer join (
        select id, isnull(sum(used), 0) used
        from sysindexes
        where indid = 255
        group by id
    ) t on c.id = t.id
) d on r.id = d.id
inner join (
    select id, sum(used) used
    from sysindexes
    where indid in (0, 1, 255)
    group by id
) i on d.id = i.id
How best to approach a problem involving two tables across two different servers?

Table 1: contains IP addresses along with assessment findings. Let's say the fields are IPADDRESSSTR and FINDING.

Table 2: contains subnet information stored in integer format. The fields are SITE_ID, LOW, and HIGH.
What I'd like to do is load the IP range information into memory and then return the findings from table 1 where the IPADDRESSSTR is between the LOW and HIGH integer value.
1) Is there a way to load all of the ranges from table 2 into an array and then compare all the IP addresses (IPADDRESSSTR) from table 1?
2) How do I convert IPADDRESSSTR (a string) to an integer to perform the comparison?
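For IPv4 addresses, one hedged approach is PARSENAME, which splits a dotted four-part string; multiplying each octet by the appropriate power of 256 yields a comparable integer (assumes well-formed addresses):

-- 'a.b.c.d' -> a*256^3 + b*256^2 + c*256 + d
SELECT f.IPADDRESSSTR, f.FINDING, s.SITE_ID
FROM Table1 f
INNER JOIN Table2 s
    ON CAST(PARSENAME(f.IPADDRESSSTR, 4) AS BIGINT) * 16777216
     + CAST(PARSENAME(f.IPADDRESSSTR, 3) AS BIGINT) * 65536
     + CAST(PARSENAME(f.IPADDRESSSTR, 2) AS BIGINT) * 256
     + CAST(PARSENAME(f.IPADDRESSSTR, 1) AS BIGINT)
       BETWEEN s.LOW AND s.HIGH;

Rather than loading the ranges into an in-memory array, a set-based join like this (pulling one table across into a temp table first, since the tables live on different servers) is the usual SQL approach.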
Looking at BOL for temp tables help, I discover that a local temp table (which I want to live only within my stored proc) SHOULD be visible to all (child) stored procs called by the papa stored proc.
However, the following code works just peachy when I use a GLOBAL temp table (i.e., ##MyTempTbl) but fails when I use a local temp table (i.e., #MyTempTable). Through trial and error, and careful weeding efforts, I know that the error I get on the local version is coming from the xp_sendmail call. The error I get is: ODBC error 208 (42S02) Invalid object name '#MyTempTbl'.
Here is the code that works:

SET NOCOUNT ON

CREATE TABLE ##MyTempTbl (SeqNo int identity, MyWords varchar(1000))
INSERT ##MyTempTbl values ('Put your long message here.')
INSERT ##MyTempTbl values ('Put your second long message here.')
INSERT ##MyTempTbl values ('put your really, really LONG message (yeah, every guy says his message is the longest...whatever!')
DECLARE @cmd varchar(256)
DECLARE @LargestEventSize int
DECLARE @Width int, @Msg varchar(128)
SELECT @LargestEventSize = Max(Len(MyWords)) FROM ##MyTempTbl

SET @cmd = 'SELECT Cast(MyWords AS varchar(' + CONVERT(varchar(5), @LargestEventSize) + ')) FROM ##MyTempTbl order by SeqNo'
SET @Width = @LargestEventSize + 1
SET @Msg = 'Here is the junk you asked about' + CHAR(13) + '----------------------------'
EXECUTE Master.dbo.xp_sendmail 'YoMama@WhoKnows.com',
    @query = @cmd,
    @no_header = 'TRUE',
    @width = @Width,
    @dbuse = 'MyDB',
    @subject = 'none of your darn business',
    @message = @Msg
DROP TABLE ##MyTempTbl
The only thing I change to make it fail is the table name, change it from ##MyTempTbl to #MyTempTbl, and it dashes the email hopes of the stored procedure upon the jagged rocks of electronic despair.
Any insight anyone? Or is BOL just full of...well..."stuff"?
I have dynamic SQL that uses PIVOT and returns a technically variable number of columns.

Is there a way to store the dynamic SQL's output in a temp table? I don't want to create a temp table with the structure of the output and limit the number of columns, since that would mean changing the SP every time a new pivot column appears!
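One hedged workaround is to let the dynamic SQL create the table itself with SELECT ... INTO, so the column list is never declared by hand. A local #table created inside sp_executesql vanishes when that inner scope ends, hence the global ##table in this sketch (@pivotQuery is a placeholder for your pivot statement):

DECLARE @sql NVARCHAR(MAX);
SET @sql = N'SELECT * INTO ##PivotOut FROM (' + @pivotQuery + N') AS p;';
EXEC sp_executesql @sql;

SELECT * FROM ##PivotOut;   -- whatever columns the pivot produced

Since global temp tables are visible to every session, add a unique suffix or guard against concurrent callers.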
What are some common techniques for ensuring an isolated temp table scope? For example, what if 2 different sprocs happen to CRUD a temp table with the same name? I'm guessing that big SQL shops establish a standard for this early on to avoid conflicts between sprocs.
There are some more columns with 'nvarchar(max)' and other INT data types. Anyway, I know a page is 8K in size. How do I find out how much space A ROW takes with the above data types? If users add 5000 rows per day, how do I figure out how much the table will grow in size?
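A hedged sketch for measuring this from existing data rather than hand-summing the declared types (table and column names hypothetical):

-- Reserved size and row count; dividing one by the other gives average bytes per row.
EXEC sp_spaceused 'dbo.MyTable';

-- Or measure the actual stored bytes of the variable-length columns:
SELECT AVG(DATALENGTH(Col1) + DATALENGTH(Col2) + DATALENGTH(Col3)) AS avg_row_bytes
FROM dbo.MyTable;

-- Projection: 5000 rows/day * avg_row_bytes; e.g. at 200 bytes/row that is
-- roughly 1,000,000 bytes (about 1 MB) per day, plus index and page overhead.
-- Note that nvarchar(max) values over ~8000 bytes are pushed off-row to LOB pages.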
My prod server (only a default instance) has TempDB configured with 1024 MB data and a 200 MB log. When I run DBCC SQLPERF(LOGSPACE) it shows around 45% 'log space used' most of the time. There was nothing going on in the instance when I ran 'whoisactive' and select * from sys.sysprocesses where dbid = 2!!!

So my questions are: is it normal to see log space used around 45%? How do I find what CAUSED the tempdb log space to grow to 45%? Is there something to do about it?
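To see which open transactions are consuming tempdb log, a hedged sketch joining the transaction DMVs (available in 2005+):

SELECT st.session_id,
       dt.database_transaction_log_bytes_used,
       dt.database_transaction_begin_time
FROM sys.dm_tran_database_transactions dt
INNER JOIN sys.dm_tran_session_transactions st
    ON st.transaction_id = dt.transaction_id
WHERE dt.database_id = 2   -- tempdb
ORDER BY dt.database_transaction_log_bytes_used DESC;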
I have a stored proc that creates a global temp table. How can I have multiple users select records from and insert records in that table without overwriting and/or deleting data?
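One hedged approach is to key every row by session so each caller works only on its own slice (columns hypothetical):

-- The proc creates ##SharedWork once, with a SessionId column.
-- Each caller tags, reads, and cleans up only its own rows:
INSERT INTO ##SharedWork (SessionId, Payload)
VALUES (@@SPID, N'some data');

SELECT Payload FROM ##SharedWork WHERE SessionId = @@SPID;

DELETE FROM ##SharedWork WHERE SessionId = @@SPID;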
I've never worked with the XML data type in SQL Server, although I know it's been there for a few iterations of SQL Server. Now I've got a situation in which we might store some configuration data as XML, since that's the way it comes. (We had thought about storing the data in a VARCHAR(MAX) field.)
The first question is does the XML data type have a size limitation? For example do you do something like:
ConfigFile XML(1000) NULL
Or is it just something like this:
ConfigFile XML NULL
The second question is about persisting the data to a file. As the name I chose for the variable suggests, we want to save the data from a configuration file into a SQL Server database. How do we go about doing that? We'll be developing a C# application; it will read and write the data both from the SQL table and the user's local HD.
I've got two databases on the same server and replicate some tables from one database to another. The replication is configured so as not to drop the table if it exists, but to delete the data based on the filter if one exists. There are two tables on the subscriber that have some extra columns.
I get "field size too large" error when trying to replicate them. Is there a workaround without having to make the publisher and the subscriber tables identical by schema?
We have installed a SQL Server 2008 R2 SP1 instance and it hosts SharePoint 2010 databases.

We have 2 dedicated drives for tempdb on a SAN with 50 GB of space. Both the tempdb data & log files were created with the default size. I would like to presize them.

What are the best values to start with?

U: -> tempdb data, containing tempdb.mdf
V: -> tempdb log, containing templog.ldf
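As a hedged starting point only: common guidance is to presize tempdb to fill most of the dedicated drives (leaving headroom), split the data into several equally sized files (often one per core, up to 8), and use a fixed growth increment. The logical file names below are the defaults and the sizes are illustrative, not prescriptive:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 10GB, FILEGROWTH = 1GB);
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 10GB, FILEGROWTH = 1GB);
-- Extra equally sized data files can be added with:
-- ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = N'U:\tempdb2.ndf', SIZE = 10GB, FILEGROWTH = 1GB);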
My backups sometimes fail. My db size is around 400 GB and we back up to a remote server. The free space shown on the disk is 600 GB, but my full database backup runs for more than 10 hrs and then fails. The failure reason given is that there is not enough space on the disk.

What could be the possible failure reasons? The backup server has more free space than the database size, so why does the job fail with that message? I have noticed the same thing a couple of times in the past.

Is there any way to find out how big the backup file will be before we run the backup job?

i.e. if we run a full backup of the test1 database now, it will generate a ....bak file for that test1 db.

Currently we are using a maintenance plan with compression (2008 R2), and the database has TDE enabled.
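An uncompressed full backup is roughly the size of the allocated pages, so a hedged estimate is the used space in the database. Note also that TDE makes backup compression largely ineffective before SQL Server 2016, so a compressed backup of a TDE-enabled database may come out nearly full size:

-- Run in the target database: used data pages converted to MB.
SELECT SUM(FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024 AS used_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';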
Any good starting point for understanding, for a specific db, how many VLFs at most are good to have so that the count does not cause long startup or backup times?

Also, I need some calculation so that I can identify the best growth parameter to set up for each database.

I'm seeing the below msg in the errorlog and am curious what changes (right-sizing/growth) should be made. As of now a 100 MB log file growth value is set (refer: [URL] ....)
Database BizTalkMsgBoxDb has more than 1000 virtual log files which is excessive. Too many virtual log files can cause long startup and backup times. Consider shrinking the log and using a different growth increment to reduce the number of virtual log files.
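A hedged sketch of the usual remediation: count the VLFs, shrink the log, then regrow it in one large step so the log is rebuilt from fewer, bigger VLFs (the logical log file name and the sizes are assumptions):

DBCC LOGINFO('BizTalkMsgBoxDb');   -- returns one row per VLF; the row count is the VLF count

USE BizTalkMsgBoxDb;
DBCC SHRINKFILE (BizTalkMsgBoxDb_log, 1);
ALTER DATABASE BizTalkMsgBoxDb
    MODIFY FILE (NAME = BizTalkMsgBoxDb_log, SIZE = 4GB, FILEGROWTH = 512MB);

As a loose rule of thumb, keeping a database under a few hundred VLFs avoids the warning above; a fixed growth increment in the hundreds of MB (rather than a small or percentage-based one) keeps the count down.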
I have an application that I am working on that uses some small temp tables. I am considering moving them to table variables - would this be a performance enhancement?

Some background information: the system I am working on has numerous tables, but for this exercise there are only three that really matter: Claim, Transaction and Parties.

A Claim can have 0 or more transactions.
A Claim can have 1 or more parties.
A Transaction can have 1 or more parties.
A party can have 1 or more claims.
A party can have 1 or more transactions.

Parties are really many-to-many back to the Claim and Transaction tables.

I have three stored procs: insertClaim, insertTransaction, insertParties.

From an XML point of view the data looks like this:

<claim><parties><info />

insertClaim takes 3 sets of parameters: all the claim-level information (as individual parameters), all the parties on a claim (as one XML parameter), and all the transactions on a claim (as one XML parameter, with parties as part of the XML).

insertClaim calls insertParties and passes in the parties XML - insertParties returns a recordset of the newly inserted records. insertClaim then uses that table to join the claim to the parties. It then calls insertTransaction and passes the transaction XML into that sproc. insertTransaction then inserts the transactions in the XML, and also calls insertParties, passing in the XML snippet.
I could not figure out if this is a feature. I set TempDB to autogrow and it accepts it, but when I go back it shows fixed growth, or a max size of 2 GB. Has anybody encountered that?
I am trying to create a SQL data adapter via the wizard; however, I get the error "Invalid object name '#ords'" because the stored procedure uses a temp table. Any way around this? Thanks.
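Design-time wizards of that era fetch result metadata with SET FMTONLY ON, and temp tables are not actually created during that metadata-only pass, hence the error. One hedged workaround is to switch the proc to a table variable, which survives the metadata pass (columns hypothetical):

-- Instead of SELECT ... INTO #ords ...
DECLARE @ords TABLE (OrderID INT, Amount MONEY);

INSERT INTO @ords (OrderID, Amount)
SELECT OrderID, Amount FROM dbo.Orders;

SELECT OrderID, Amount FROM @ords;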
I am just learning about temp tables using iteration. How do I clear the contents of the #temp table? When I re-run the query I get the following error:
Msg 2714, Level 16, State 6, Line 6
There is already an object named '#mytemp2' in the database.
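The standard guard is to drop the table if it already exists before creating it again; a minimal sketch:

IF OBJECT_ID('tempdb..#mytemp2') IS NOT NULL
    DROP TABLE #mytemp2;

-- CREATE TABLE #mytemp2 ... and the rest of the query follow as before.
-- Alternatively, keep the table and clear its contents between runs:
-- TRUNCATE TABLE #mytemp2;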
How can we monitor all tables in all databases and send notifications to the team? Is there a way to check the number of rows and the size of a table last month and find out the growth % now?
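A hedged sketch of the usual approach: snapshot per-table row counts and sizes into a history table on a schedule (e.g. a SQL Agent job run against each database), then compare captures a month apart (the history table is hypothetical):

INSERT INTO dbo.TableSizeHistory (capture_date, db_name, table_name, row_count, reserved_kb)
SELECT GETDATE(),
       DB_NAME(),
       OBJECT_NAME(ps.object_id),
       SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END),
       SUM(ps.reserved_page_count) * 8
FROM sys.dm_db_partition_stats ps
INNER JOIN sys.tables t ON t.object_id = ps.object_id
GROUP BY ps.object_id;

-- Growth % = (current - previous) / previous * 100, joining two captures on table_name.
-- A follow-up query over the history table can feed sp_send_dbmail for the notifications.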