Computing Full Record Row Size
Dec 19, 2001
Is there a quick and easy way to calculate the row size for a record in a table that has data in each column?
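One quick sketch for a single row is to add up DATALENGTH over each column; the table, key, and column names below are placeholders:

-- DATALENGTH returns the bytes actually stored in each value for that row.
-- Note: this excludes per-row overhead (row header, null bitmap, offset array).
SELECT DATALENGTH(col1)
     + DATALENGTH(col2)
     + DATALENGTH(col3) AS approx_row_bytes
FROM dbo.MyTable
WHERE key_col = 1;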
I have a client program that writes 10 records per second to a SQL Server database. I want to compute the CPU usage and memory usage for the whole program, or the CPU and memory usage for the insert statement in the program.
Can anybody help me with this?
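Assuming the server is SQL Server 2005 or later, a sketch that pulls rough CPU and I/O figures for the insert statement from the plan cache (the LIKE filter is a placeholder; match it to your actual statement text):

-- Cumulative CPU (microseconds) and I/O per cached statement (SQL 2005+).
SELECT TOP 10
       st.text              AS statement_text,
       qs.execution_count,
       qs.total_worker_time AS total_cpu_microseconds,
       qs.total_logical_reads,
       qs.total_logical_writes
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%INSERT INTO MyTable%'
ORDER BY qs.total_worker_time DESC;

For whole-program CPU and memory, Performance Monitor counters on the client process (or Profiler on the server side) are the usual route; the DMV above only covers the server-side cost of the statement.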
Hi,
Does anyone have experience with full-text catalogs on large tables? We have a full-text catalog on a table with about 30 million rows. According to BOL, once you get over 1 million rows you'll need to make some adjustments. Our system works, but we randomly get the following errors with ad hoc queries. The server has 8 GB of RAM and four 3 GHz processors. Does anyone have experience working with a table this big? Any suggestions as to what could cause these errors? The only thing I could find was BUG#: 469483 on MS's site, but we're not using any OR clauses.
Thanks
Here are the errors:
(query)
SELECT * FROM acc_results ar
WHERE CONTAINS(finding_text, 'cell')
#1
Server: Msg 7619, Level 16, State 1, Line 1
Execution of a full-text operation failed. Not enough storage is available to process this command.
#2
Server: Msg 7342, Level 16, State 1, Line 1
Unexpected NULL value returned for column '[FULLTEXT:acc_results].KEY' from the OLE DB provider 'Full-text Search Engine'. This column cannot be NULL.
OLE DB error trace [Non-interface error: Unexpected NULL value returned for the column: ProviderName='Full-text Search Engine', TableName='[FULLTEXT:acc_results]', ColumnName='KEY'].
#3
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'Full-text Search Engine' reported an error.
[OLE/DB provider returned message: Not enough storage is available to process this command.]
OLE DB error trace [OLE/DB Provider 'Full-text Search Engine' IRowset::GetNextRows returned 0x80004005: ].
The Upgrade Advisor noted that SQL Server 2005 changed the way full-text catalogs are utilized. It says that you must ensure that the filegroup associated with the base table has enough space to accommodate the additional space requirements for full-text indexes, using the formula (2*FTK + 34 bytes) * RC, where FTK = full-text key size and RC = row count of the table. I know which table, but if I do sp_help on the table, it doesn't tell me anything about the size of the full-text key. I've tried searching TechNet and I'm either not finding the answer or I don't know enough to know if it's there. Can anybody help me?
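A sketch of one way to find the full-text key size in SQL Server 2005: OBJECTPROPERTY exposes the full-text key column ID, and sys.columns gives its declared length (the table name is taken from the query above):

-- Full-text key column, its declared size, and the row count for the
-- (2 * FTK + 34 bytes) * RC estimate.
DECLARE @obj int;
SET @obj = OBJECT_ID('dbo.acc_results');

SELECT c.name       AS fulltext_key_column,
       t.name       AS type_name,
       c.max_length AS key_size_bytes,
       (SELECT SUM(p.rows) FROM sys.partitions AS p
        WHERE p.object_id = @obj AND p.index_id IN (0, 1)) AS row_count
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = @obj
  AND c.column_id = OBJECTPROPERTY(@obj, 'TableFulltextKeyColumn');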
In what scenario would differential and full backups have the same size? My guess is excessive fragmentation; any thoughts?
------------------------
I think, therefore I am - Rene Descartes
I'm new to the DBA world, and have no one else in the company to look up to. Does anyone know what I might need to check out or do when the Data File Size is 204% full? Or is this not necessarily a bad thing?
I'm getting this from a Diagnostic tool I have.
The number of tables is 148
Data file size 35,941 MB
Data Size 26,549.92 MB
Index Size 177,130.02 MB
Log File Size 5.05 MB
Thanks,
Has anyone come across a more accurate method than sp_spaceused to estimate the size of a full database backup for SQL Server 2000?
I have found this to have too great a variance (even after running updateusage) to rely on any accuracy for it. I have also looked at perhaps using the ALLOCATED Pages indicated in the GAM pages but this also seems to be pretty inaccurate.
I have a number of servers where space can be limited and backups using Maintenance Plans have occasionally failed because they delete the old backups AFTER they do the latest one. I am writing a script which can check the space remaining and adjust the backup accordingly but the variance I have observed so far with sp_spaceused is too great.
Any ideas welcomed.
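One rough alternative, a sketch for SQL Server 2000 (run DBCC UPDATEUSAGE first, since sysindexes counts can drift): total the reserved pages, as a full backup copies allocated extents rather than the whole file:

-- Approximate full-backup size: reserved pages * 8 KB, expressed in MB.
SELECT SUM(reserved) * 8 / 1024.0 AS approx_backup_size_mb
FROM sysindexes
WHERE indid IN (0, 1, 255);

This is still only an estimate, so leave some headroom when comparing it against free disk space.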
What can be the possible maximum size of one record in a SQL 2005 Enterprise Edition server?
I have a question regarding full and differential backups.
When we take a full or differential backup, does it create a lot of log? I.e., does a full or differential backup have any impact on log size?
Thanks
Is there a T-SQL shortcut or quick way to return the full record holding the maximum value?
So basically if I had:
prog_area glh description
HS 200 health and social care
EN 300 engineering
HS 400 health care
EN 250 engineering and construction
so grouping by prog_area and taking max glh would return appropriate descriptions like:
HS 400 health care
EN 300 engineering
This is quite a simple example and I would be using it on much larger datasets. I know how to do it with a join to itself on the max glh, but it seems to me there should be an easier way to return the appropriate full record set, hopefully something fast in T-SQL.
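A sketch of two common approaches, assuming the data lives in a table called dbo.progs (a hypothetical name): a correlated subquery that works on SQL 2000, and ROW_NUMBER() on SQL 2005.

-- Option 1: correlated subquery (SQL 2000 and later).
SELECT p.prog_area, p.glh, p.description
FROM dbo.progs AS p
WHERE p.glh = (SELECT MAX(p2.glh)
               FROM dbo.progs AS p2
               WHERE p2.prog_area = p.prog_area);

-- Option 2: ROW_NUMBER() (SQL 2005 and later); returns exactly one row per group
-- even when two rows tie on glh.
SELECT prog_area, glh, description
FROM (SELECT prog_area, glh, description,
             ROW_NUMBER() OVER (PARTITION BY prog_area ORDER BY glh DESC) AS rn
      FROM dbo.progs) AS ranked
WHERE rn = 1;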
How do I calculate the record size, i.e. the row size, for each table?
I heard that there are formulas to do that, and that the calculation is different for fixed-length columns and variable-length columns.
I need to calculate the record size for nearly 500 tables. This will help me decide on the disk capacities.
Please suggest any scripts or calculations that are already available.
Thanks in advance.
Samson
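A sketch that reports per-table record sizes without hand calculation, assuming SQL Server 2000: DBCC SHOWCONTIG with TABLERESULTS includes minimum, maximum, and average record size columns.

-- Record sizes for one table (omit the FAST option so MinimumRecordSize,
-- MaximumRecordSize and AverageRecordSize are populated).
DBCC SHOWCONTIG ('dbo.MyTable') WITH TABLERESULTS;

-- To cover all 500 tables, the undocumented sp_MSforeachtable helper can loop:
EXEC sp_MSforeachtable 'DBCC SHOWCONTIG (''?'') WITH TABLERESULTS, NO_INFOMSGS';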
Hi,
In MS SQL Server, while creating a table, I am getting a warning message saying "maximum row size can exceed allowed maximum size of 8060 bytes."
Is there any way in SQL Server to increase this allowed maximum row size?
The setting "set ANSI_WARNINGS OFF" is not suitable in my case. I need to create a table with around 11000 bytes per record (the sum of the precisions of all the columns). Moreover, I do not want to lose any data to truncation on insert just to keep the whole record size within 8060 bytes.
Thanks a lot in advance.
Ramakrishna.
Hi,
I need a query to calculate the size actually used by a record in each column of a table. I tried the following query, but it only shows me the maximum length of the data type:
SELECT c.name AS column_name
,c.column_id
,t.name AS type_name
,c.max_length
,c.precision
,c.scale
FROM sys.columns AS c
JOIN sys.types AS t ON c.user_type_id=t.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.bloqueos_2005')
ORDER BY c.column_id;
GO
I have tried the system views that contain column info, but none of them gives me the correct information.
If anyone has such a query or knows some way to do this, please help me.
Thanks,
Diego Jaramillo Celis
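For the actual storage used per record (rather than declared maximum lengths), a minimal sketch using DATALENGTH against the same table; the key and data column names are placeholders:

-- Actual bytes stored per column for each row (DATALENGTH of NULL is NULL, hence ISNULL).
SELECT key_col,
       DATALENGTH(columna1)            AS columna1_bytes,
       DATALENGTH(columna2)            AS columna2_bytes,
       ISNULL(DATALENGTH(columna1), 0)
     + ISNULL(DATALENGTH(columna2), 0) AS total_data_bytes
FROM dbo.bloqueos_2005;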
In my environment, the DBA just migrated SQL Server 2000 databases from Windows 2000 to Windows 2003 Enterprise Edition, on the same SQL version. The problem I need to analyse:
1. Why is the full backup only 70 GB when the DB size is 120 GB?
Here is the situation
Full Backup is taken once every day ---- 70gb
Diff Backup Taken every 3 hrs till 5 PM ----- size is 50GB
Transaction log backed up every 10 min uptill 8 PM ----- not a big size
I am really confused as to why the full DB backup is only 70 GB. Can someone please shed some light on how SQL Server 2000 backups work?
I want to calculate the average row size of a record. Based on this, I want to add some more columns to an existing table. Here is my table structure:

CREATE TABLE patient_procedure (
    proc_id int IDENTITY(1,1) CONSTRAINT proc_id_pri_key PRIMARY KEY,
    patient_id int NULL,
    surgeon_name varchar(40) NOT NULL,
    proc_name varchar(20),
    part_name varchar(30),
    wth_contrast int,
    wthout_contrast int,
    wth_wthout_contrast int,
    xray_part varchar(60),
    arth_area varchar(30),
    others varchar(30),
    cpt varchar(20),
    procedure_date smalldatetime NOT NULL,
    mraloperrun varchar(20),
    CONSTRAINT patientid_foreign_key FOREIGN KEY (patient_id)
        REFERENCES dbo.patient_information (Patient_id)
)

Now I have a requirement to add two more procedures with different columns; their columns total 195 bytes. I could place those two procedures in separate tables, but I don't want to because of front-end requirements. The problem is that when the user enters information for these two procedures, the remaining fields will store NULL values. I know that when we store NULL values, the corresponding columns still occupy a minimum of 1 byte. Please suggest whether I should include these columns in the above table, and whether adding them will decrease performance. Waiting for valuable suggestions.
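A quick way to see the current average row size (a sketch; accuracy depends on up-to-date usage counts) is sp_spaceused, dividing the data size by the row count:

-- Reports rows, reserved, data, index_size and unused for the table.
-- Average row size is roughly (data KB * 1024) / rows.
EXEC sp_spaceused 'dbo.patient_procedure';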
Let's imagine a situation in MS SQL 2005:
Table t_sample has a single column SampleString of type varchar(200). If a value containing 4 characters is inserted into this column, will the real record (not column) size in the database be 4 bytes or 200? In other words: do MS SQL 2005 records always have a constant size (the total of all columns' maximum sizes), or does the real size of a record depend on the actual size of the inserted data?
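A tiny demonstration sketch of the variable-length behaviour, using the table from the question:

-- varchar is variable-length: only the inserted bytes are stored,
-- plus per-row overhead (row header, null bitmap, variable-length offsets).
CREATE TABLE t_sample (SampleString varchar(200));
INSERT INTO t_sample (SampleString) VALUES ('abcd');

SELECT DATALENGTH(SampleString) AS bytes_stored   -- returns 4, not 200
FROM t_sample;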
Hi all,
I am dealing with a very large database, and as soon as a record is submitted I need to run a full-text query against it. I believe it might take a while before the record is fully indexed and therefore would not return a result.
How can I check whether the record in question is already indexed, if at all?
This is MS SQL 2005.
Thanks in advance.
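There is no per-row "indexed" flag exposed, but two rough checks are sketched below (the catalog, table, and column names are placeholders): the catalog's population status, and simply probing the specific row with CONTAINS.

-- Population status of the catalog (0 = idle, i.e. population finished).
SELECT FULLTEXTCATALOGPROPERTY('MyCatalog', 'PopulateStatus') AS populate_status,
       FULLTEXTCATALOGPROPERTY('MyCatalog', 'ItemCount')      AS item_count;

-- Probe one specific row: if it has not been indexed yet, CONTAINS won't match it.
SELECT COUNT(*) AS matched
FROM dbo.MyTable
WHERE pk_col = 12345
  AND CONTAINS(search_col, '"someWordKnownToBeInThatRow"');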
Hi,
I'm having a DB designed for me, and I'm inspecting it and wondering what in general is the better way to do this.
We have a product for which we are counting "product views". The DB designer has created columns called "view_today" and "views_alltime".
I specified that I wanted a normalized database; I'm thinking this is technically not normalized? Am I correct?
Wouldn't it be better to have a query that counted the views off the logging table? I can't see any advantage to doing it the way it's been designed, except to save time.
Thanks for any input !
Mike123
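For comparison, counting views straight off a logging table would look roughly like this (the log table and its columns are hypothetical):

-- All-time and today's view counts derived from a raw view-log table.
SELECT product_id,
       COUNT(*) AS views_alltime,
       SUM(CASE WHEN DATEDIFF(day, view_date, GETDATE()) = 0 THEN 1 ELSE 0 END) AS views_today
FROM dbo.product_view_log
GROUP BY product_id;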
Has anyone here heard of or come across an article or write-up about grid computing in SQL Server 2000?
Bharathi
http://www.vkinfotek.com
hi, please help me. I have a table queried using this SQL:

select name,
       (select count(*) from myTable a where a.name = r.name) as Total,
       (select count(*) from myTable b where b.name = r.name and dnum > '1') as Used,
       (select count(*) from myTable c where c.name = r.name and dnum < '1') as Remaining
from myTable r
group by name

but I need one more column, so that the result looks like this:

name   Total   Used   Remaining   Percentage
A      12      6      6           50%
B      20      2      18          10%
C      15      0      15          0%

This adds the Percentage field to the above table, but my problem is that the computation is "Used / Total = Percentage%". So how can I do this? Please help me, thanks.
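A sketch of one way to add the percentage, computing the existing columns in a derived table so each subquery is written only once (100.0 forces decimal rather than integer division, and NULLIF guards against a zero Total):

SELECT name, Total, Used, Remaining,
       CAST(Used * 100.0 / NULLIF(Total, 0) AS decimal(5, 1)) AS Percentage
FROM (
    SELECT name,
           (SELECT COUNT(*) FROM myTable a WHERE a.name = r.name)                AS Total,
           (SELECT COUNT(*) FROM myTable b WHERE b.name = r.name AND dnum > '1') AS Used,
           (SELECT COUNT(*) FROM myTable c WHERE c.name = r.name AND dnum < '1') AS Remaining
    FROM myTable r
    GROUP BY name
) AS t;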
hi everybody,
I'm trying to calculate the SUM of time spent in hours and minutes. How can I do this using SQL Server?
What I mean is, I have a column 'TIME_SPENT' with the 'datetime' data type. This column stores the time spent on an activity in 'hh:mm' format. Suppose a user spends 45 minutes on activity 'A' and 1 hour 25 minutes on activity 'B'; I then want to calculate the SUM of 'TIME_SPENT' for the user, which should appear as 'Total time spent = 2:10'.
Can somebody please help me with this?
Thanks in advance.
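A sketch of one approach: convert each value's time-of-day portion to minutes, sum, then format back to hours:minutes. The table name and the user_id grouping column are assumptions:

-- Sum TIME_SPENT as minutes per user, then format the total as h:mm.
SELECT user_id,
       CAST(SUM(mins) / 60 AS varchar(10)) + ':'
     + RIGHT('0' + CAST(SUM(mins) % 60 AS varchar(2)), 2) AS total_time_spent
FROM (
    SELECT user_id,
           DATEPART(hour, TIME_SPENT) * 60 + DATEPART(minute, TIME_SPENT) AS mins
    FROM dbo.activity_log
) AS t
GROUP BY user_id;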
I am attempting to compute Service Levels for an interaction based upon business hours. For example, an email arrives at 4pm and is handled the following day at 10am. Call Center Hours are 8-5.
Essentially I have a number of different alternatives, and have found some potential solutions, including:
www.dbforums.com/arch/7/2003/9/914261
However, my situation has a couple of additional twists to the standard 8hrs of business M-F. The call center is open different hours depending upon the day of the week. For example, 8-5 M, 10-7 T, 8-5 W Th F, 10-2 Sat, 10-12 Sun
Additionally, I would like to remove holidays from the service level calculation as well.
I have explored a number of different table DDLs, but none seem to be a perfect fit for determining the number of "open" hours between when an interaction arrived and when it was handled.
The DDL I have for the Holidays table is as follows:
CREATE Table Holidays (HolidayDate DateTime)
GO
Insert Into Holidays (HolidayDate) Values ('12-25-2004')
Please let me know what you feel would be the DDL for storing the business hours, and also the query for extracting the number of open hours between two dates.
Thank you in advance
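A minimal sketch of one possible shape for the business-hours table (all names and the day-numbering convention are assumptions). Days the center is closed simply get no row; the open-hours query would then walk day by day, clipping each day's window and skipping any date found in Holidays:

-- One row per day of week the call center is open.
CREATE TABLE BusinessHours (
    DayOfWeek tinyint  NOT NULL PRIMARY KEY,  -- 1 = Sunday ... 7 = Saturday, assuming default DATEFIRST
    OpenTime  datetime NOT NULL,              -- only the time-of-day portion is used
    CloseTime datetime NOT NULL
)
GO
INSERT INTO BusinessHours VALUES (1, '10:00', '12:00')  -- Sun 10-12
INSERT INTO BusinessHours VALUES (2, '08:00', '17:00')  -- Mon 8-5
INSERT INTO BusinessHours VALUES (3, '10:00', '19:00')  -- Tue 10-7
INSERT INTO BusinessHours VALUES (4, '08:00', '17:00')  -- Wed 8-5
INSERT INTO BusinessHours VALUES (5, '08:00', '17:00')  -- Thu 8-5
INSERT INTO BusinessHours VALUES (6, '08:00', '17:00')  -- Fri 8-5
INSERT INTO BusinessHours VALUES (7, '10:00', '14:00')  -- Sat 10-2
GO

A numbers or calendar table joined against BusinessHours, with Holidays excluded, is a common way to do the day-by-day clipping between the arrival and handled timestamps.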
I encountered a tricky problem. The original data, say, table_o, is shown below:
Ids   Status   Date
ID1   4        02-May-13
ID1   3        10-May-13
ID1   2        16-May-13
ID1   1        20-May-13
ID2   3        08-May-13
ID2   2        10-May-13
ID2   1        19-May-13
The final resulting table, e.g., table_f, is:
Ids   4->3   3->2   2->1
ID1   8      6      4
ID2   NULL   2      9
The values in the final table are the days taken by each ID to move from status i to status i-1. E.g., ID1 takes 8 days (10-May-13 minus 02-May-13) to go from status 4 to status 3.
It is hard for me to come up with a table like the final one, although I know that the difference between two adjacent rows can be computed using a self-join and DATEDIFF().
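A sketch of one way in T-SQL, using DATEDIFF and a conditional pivot, assuming the Date column is (or can be converted to) a datetime:

-- Days between each status and the next lower status, pivoted into one row per Id.
SELECT o.Ids,
       MAX(CASE WHEN o.Status = 4 THEN DATEDIFF(day, o.[Date], n.[Date]) END) AS [4->3],
       MAX(CASE WHEN o.Status = 3 THEN DATEDIFF(day, o.[Date], n.[Date]) END) AS [3->2],
       MAX(CASE WHEN o.Status = 2 THEN DATEDIFF(day, o.[Date], n.[Date]) END) AS [2->1]
FROM table_o AS o
JOIN table_o AS n
  ON n.Ids = o.Ids
 AND n.Status = o.Status - 1
GROUP BY o.Ids;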
Dear All.
I'm a fairly new SQL programmer so apologies if this is a silly question.
I'm trying to create a new column/variable from 3 other variables where the new column = column 1 unless column 1 is blank, then = column 2, unless column 2 is blank, then = column 3.
But I don't know where in my query to begin building this. Should I build it in a subquery? Thanks in advance for any replies.
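This can usually be done directly in the SELECT list, no subquery needed. A sketch using COALESCE (with NULLIF, in case "blank" means an empty string rather than NULL); the table and column names are placeholders:

-- Falls back from column1 to column2 to column3, treating both NULL and '' as blank.
SELECT COALESCE(NULLIF(column1, ''),
                NULLIF(column2, ''),
                column3) AS new_column
FROM dbo.MyTable;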
Hi,
I want to create a view where I can calculate the sum of a couple of bit value columns,
as well as keeping track of the total number of bits set to true.
Here is an example:
dbo.Band
BandID int
Name nvarchar(50)
Country nvarchar(50)
dbo.Record
ID int
Name nvarchar(50)
BandID int
Label nvarchar(50)
InProduction bit
InSkodne bit
From these tables I created this view:
dbo.TestView
SELECT dbo.Band.Name, dbo.Band.Country, dbo.Record.Name AS Recordname, dbo.Record.Label, CONVERT(int, dbo.Record.InProduction) AS InProduction,
CONVERT(int, dbo.Record.InSkodne) AS InSkodne, CONVERT(int, dbo.Record.InProduction) + CONVERT(int, dbo.Record.InSkodne) AS Total
FROM dbo.Band INNER JOIN
dbo.Record ON dbo.Band.BandID = dbo.Record.BandID
I use the convert function to be able to use SUM() across my bit columns, which works fine. The problem is I'm not sure that the way I'm creating the Total column is the best way to go. Any other ideas?
I'm having some problems using this view, and the Total column in particular, when referencing this view from applications outside SQL Server...
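One possible alternative sketch: do the CONVERTs once in a derived table and compute Total in the outer query, so the Total expression doesn't repeat the conversions:

-- Same result as the original view, but each bit column is converted only once.
SELECT Name, Country, Recordname, Label,
       InProduction, InSkodne,
       InProduction + InSkodne AS Total
FROM (
    SELECT b.Name, b.Country, r.Name AS Recordname, r.Label,
           CONVERT(int, r.InProduction) AS InProduction,
           CONVERT(int, r.InSkodne)     AS InSkodne
    FROM dbo.Band AS b
    INNER JOIN dbo.Record AS r ON b.BandID = r.BandID
) AS x;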
Hello all!
I'm currently developing an application where users can register errors related to received purchase orders.
I store these values in a table where the purchase order id is the PK, and the possible errors that can exist are stored as bit columns.
Now I want to be able to put a price on these errors.
I'm thinking about adding another table, containing all possible errors as columns, and then storing the cost of each error as an integer, and probably also a datetime for keeping track of when the costs were last updated.
I'm pretty sure this problem has been solved a lot of times before, so I don't want to do something stupid here :-)
I'm also wondering how it would be best to show the computed values.
Should I use a view for this?
For example:
SELECT (dbo.Orders.QuantityError * dbo.Costs.QuantityError )
FROM dbo.Costs CROSS JOIN
dbo.Orders
assuming now that the Costs table only contains one row.
Is this the right way to go, or can you guys give me hints to a better solution?
Regards
Daniel
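A sketch of the approach as described, with hypothetical names: a one-row Costs table mirroring the error columns plus an update timestamp, and a view that multiplies each order's bit flags by the matching cost.

-- One-row table holding the current cost of each error type
-- (DamageError and dbo.Orders.OrderID are hypothetical names).
CREATE TABLE Costs (
    QuantityError int      NOT NULL,
    DamageError   int      NOT NULL,
    CostsUpdated  datetime NOT NULL DEFAULT (GETDATE())
)
GO
-- Cost per order: each bit flag, converted to int, times its cost.
CREATE VIEW OrderErrorCost AS
SELECT o.OrderID,
       CONVERT(int, o.QuantityError) * c.QuantityError
     + CONVERT(int, o.DamageError)   * c.DamageError AS TotalErrorCost
FROM dbo.Orders AS o
CROSS JOIN dbo.Costs AS c
GO

A more normalized alternative is one row per error type (name, cost, updated date), which avoids adding a column for every new error, but the cross-join view above matches the single-row Costs table described.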
hi !
I've just started playing around with SQL 2000 and I created a sample table called 'actor' which has 4 columns:
1. actorID
(formula= LEFT(NEWID(), 3)+ LEFT([actorFirst], 2) +
LEFT([actorLast], 2) + RIGHT(NEWID(), 3))
2. actorFirst
varchar(20) NOT NULL
3. actorLast
varchar(20) NOT NULL
4. actorName
(formula = [actorFirst] + ' ' + [actorLast])
Now my problem is that I want to set a primary key constraint on actorID, but it doesn't let me because the NULL check mark is automatically checked and I cannot uncheck it... and I can't set a primary key on something which is allowed to be NULL.
I don't understand why the 'actorName' column, which is also calculated, doesn't have that NULL box checked and locked.
What am I doing wrong? Please help.
Hello all, I know this is a SQL Server forum, but some people may have worked with Oracle and can help me.
Microsoft's article "http://www.microsoft.com/sql/prodinfo/compare/oracle/mythandreality.mspx :"
reads as follows:
"Oracle's purported Grid enablement in 10g is based on its Oracle Real Application Clusters (RAC) technology that is no more than a local cluster. RAC is a local cluster of computers with no geographic distribution capabilities. This marketing campaign relabeled existing features to exploit current industry trends. "
My question is: how can I support the above paragraph? I would like to know more reasons why Oracle grid is a local cluster rather than grid-computing oriented.
Thanks in advance.
Hernán.
I am trying to resize a database's initial log file from 500 MB to 2 MB. I'm using:
ALTER DATABASE <DBNAME> MODIFY FILE ( NAME = <DBLOGFILENAME>, SIZE = 2 )
And I'm getting "MODIFY FILE failed. Specified size is less than current size." I tried going into the database properties and setting the log file to 2 MB, but it doesn't keep the changes.
Any help with this process?
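ALTER DATABASE ... MODIFY FILE can only grow a file, which is why that error appears; shrinking is done with DBCC SHRINKFILE. A sketch (the database and logical log file names are placeholders, and the log may need to be backed up first so enough of it is inactive to release the space):

-- Shrink the log file to 2 MB; the target size is specified in MB.
USE MyDatabase
GO
DBCC SHRINKFILE (MyDatabase_Log, 2)
GO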
I have the following computed column:
(isnull(TotalProductSaleCost,0) * 7) / 100
I would like the output to be formatted as decimal(12, 2); I'm not sure how to achieve this.
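A sketch: wrap the expression in CAST to decimal(12, 2), and divide by 100.0 so the arithmetic is done in decimal rather than integer math. The table and column names in the ALTER TABLE line are assumptions:

-- Adding a computed column with the rounded expression
-- (dbo.ProductSales and SaleFee are hypothetical names).
ALTER TABLE dbo.ProductSales
    ADD SaleFee AS CAST(ISNULL(TotalProductSaleCost, 0) * 7 / 100.0 AS decimal(12, 2));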
I am new to SSRS, so perhaps this is a trivial question. I was wondering: since all controls in a report have names, is it possible to programmatically access the values of different textboxes, do some computation, and then assign the result to another textbox? I know how to do it using the aggregate functions and operators, but I'm not sure if I can access values from textboxes within two different tables and assign the computed value to a third textbox on the page (not belonging to any table or other control).
Something like: txtTotal.Value = FormatCurrency(txtSalesTotal.Value) - txtDiscount.Value));
Any ideas??
DNG.
Hello,
I am in the process of making a very simple stats page that will show us how many tasks we've completed. Here is what I have so far, and here is the SQL that makes it work:

SelectCommand="SELECT Count(TicketID), Category FROM Tickets GROUP BY Category ORDER BY Count(TicketID) DESC">

My problem is that the totals seem to go on forever. Instead of being in proportion to each other as a percentage of the total number of tickets, they just increase in size with each additional entry. Can someone help me restructure this so that I can calculate the totals individually and as a whole, and then apply the totals to create a proportional bar graph?
Thank you greatly for your help,
Mark
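A sketch of computing each category's share of the total in the query itself, so the bar widths are percentages rather than raw counts (100.0 avoids integer division):

-- Ticket count per category plus its percentage of all tickets.
SELECT Category,
       COUNT(TicketID) AS TicketCount,
       CAST(COUNT(TicketID) * 100.0
            / (SELECT COUNT(TicketID) FROM Tickets) AS decimal(5, 1)) AS PercentOfTotal
FROM Tickets
GROUP BY Category
ORDER BY COUNT(TicketID) DESC;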
View 2 Replies View RelatedI need to write a process to get file size in kb and record count in a file. I was planning on writing a c# console app that takes the file path and name as a param however should i use a CLR?
I cant put a script in the ssis when it's bringing the file down because it has been deemed that we only use ssis for file consumption.