SQL 2012 :: Check Size Of Database After Implementing Data Compression Across All Tables
Dec 4, 2014
I found this pretty interesting. I checked the size of a database before implementing data compression across all the user tables, and I checked it again after the compression was in place.
I did not find any difference. If I expand a table and check Properties -> Storage, I can see that PAGE compression is implemented across all the tables, yet there is no reduction in the size of the database. It stays the same.
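A note on why the number does not move: PAGE compression frees pages inside the data files, but the files keep their allocated size until they are shrunk, so the database size shown in Properties stays the same. A minimal sketch to see both figures (the table name is hypothetical):
-- Database level: database_size stays the same; unallocated space grows after compression
EXEC sp_spaceused;
-- Table level: reserved/data figures do drop once PAGE compression is applied
EXEC sp_spaceused N'dbo.MyLargeTable';  -- hypothetical table name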
View 6 Replies
Jun 26, 2007
Hi
I am trying to check the size of each table in my database, something like:
SELECT <TableName> , 'Size in bytes/megabytes' FROM DATABASE
I can't for the life of me figure out how this is done.
Any help would be greatly appreciated
Kind Regards
Carel Greaves
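A minimal sketch of one way to do this, assuming SQL Server 2005 or later and VIEW DATABASE STATE permission:
-- Rows and reserved/used space per user table (page counts are 8 KB pages, so * 8 gives KB)
SELECT OBJECT_SCHEMA_NAME(ps.object_id) + '.' + OBJECT_NAME(ps.object_id) AS table_name,
       SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS table_rows,
       SUM(ps.reserved_page_count) * 8 AS reserved_kb,
       SUM(ps.used_page_count) * 8 AS used_kb
FROM sys.dm_db_partition_stats AS ps
WHERE OBJECTPROPERTY(ps.object_id, 'IsUserTable') = 1
GROUP BY ps.object_id
ORDER BY reserved_kb DESC;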
View 10 Replies
View Related
Aug 30, 2005
Has anyone implemented splitting an application's data between two databases because the data size is extremely large? If so, could you please point me to relevant information? In this split-data scenario, a table automatically carries over to another database whenever the size limit for the current database is reached. The challenge is for the DAL (data access layer) to automatically look in the appropriate database when the next row of data lives in another database. Or perhaps there is another solution to this terabyte-scale data problem. Any help on this would be greatly appreciated.
View 8 Replies
View Related
Apr 1, 2014
Where can I get detailed notes on data compression in SQL Server? It would be useful if the notes were accompanied by examples.
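A minimal sketch of the basic statements, using a hypothetical table name:
-- Apply PAGE compression to the table itself (heap or clustered index)
ALTER TABLE dbo.MyTable REBUILD WITH (DATA_COMPRESSION = PAGE);  -- hypothetical table name
-- Nonclustered indexes are compressed separately
ALTER INDEX ALL ON dbo.MyTable REBUILD WITH (DATA_COMPRESSION = PAGE);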
View 4 Replies
View Related
Mar 24, 2015
How do I estimate the size of the data file when creating a database?
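A rough rule is rows per table times average row size, plus index space and free-space headroom; the estimate then becomes the initial SIZE of the data file. A minimal sketch with hypothetical names, paths, and numbers:
CREATE DATABASE SalesDB  -- hypothetical database name
ON PRIMARY
(
    NAME = SalesDB_data,
    FILENAME = 'D:\Data\SalesDB.mdf',  -- hypothetical path
    SIZE = 500MB,       -- initial size from the estimate
    FILEGROWTH = 100MB  -- headroom so autogrowth stays rare
)
LOG ON
(
    NAME = SalesDB_log,
    FILENAME = 'L:\Log\SalesDB_log.ldf',  -- hypothetical path
    SIZE = 100MB,
    FILEGROWTH = 50MB
);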
View 2 Replies
View Related
Apr 10, 2014
I need to find out the number of columns in a flat file before I process that particular file.
I have the file name in the @filename variable and the file path in the @filepath variable.
But I do not know how to check the column names before I process the file.
@filePath = C:DatabaseSourceFilesCAHCVSSourceFiles
I am using a Foreach Loop container to read the files one by one and put the file name in the @filename variable.
My file names look like:
Product_20120607060930.txt
Product_20130708060930.txt
My file structure is:
ID,Name,City,Country,Phone
1,Riya,Pune,India,454564
2,Jiya,New Jersey,India,454564
3,Riya,St Louis,USA,454564
4,Riya,Belleville,USA,454564
5,Riya,Miami,USA,454564
Now what I have to do is make sure that ID,Name,City,Country,Phone is present in the flat file. If it is not there, I have to send a mail to the client saying that the file is not valid.
Let me know how I can do this. I also need to calculate the size of the flat file.
View 0 Replies
View Related
May 21, 2014
We're switching from NetBackup to DPM but are not sure whether the "backup compression default" option should be ON or OFF, or whether it matters at all.
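For reference, the server-wide default can be checked and changed with sp_configure; individual BACKUP statements can still override it with COMPRESSION or NO_COMPRESSION. A minimal sketch:
-- Check the current setting (0 = off, 1 = on)
EXEC sp_configure 'backup compression default';
-- Turn it on server-wide
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;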
View 3 Replies
View Related
Feb 11, 2015
I have a huge table, and I often need to query the table as a whole; i.e., data may be required from any part of the table. So I am thinking of compressing the table with the aim of reducing I/O.
1. Is compression really suitable for this requirement? If so, which compression is better (row or page)?
2. Or is there any other alternative to reduce the I/O (except indexing)?
3. What are the pros and cons?
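One way to answer question 1 is to let the engine estimate the savings for both options before rebuilding anything; a minimal sketch with a hypothetical table name:
-- Compare estimated sizes under ROW and PAGE compression
EXEC sp_estimate_data_compression_savings
     @schema_name = N'dbo', @object_name = N'BigTable',  -- hypothetical table
     @index_id = NULL, @partition_number = NULL,
     @data_compression = N'ROW';
EXEC sp_estimate_data_compression_savings
     @schema_name = N'dbo', @object_name = N'BigTable',
     @index_id = NULL, @partition_number = NULL,
     @data_compression = N'PAGE';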
View 2 Replies
View Related
Jul 31, 2015
I have a report in which I have a combination of matrix and table data regions.
The problem I am facing is that the data tables don't stay fixed in their positions; they tend to move down.
E.g., table 1 and table 2 are on the same page at design time, side by side (right and left); however, at runtime table 1 is pushed down while table 2 stays in its position.
How can I keep them all fixed in the same positions? Most of the tables have fixed-size rows, and the ones with tall rows have been put at the end. What settings can I set?
View 6 Replies
View Related
Jan 21, 2015
ALTER TABLE [dbo].[TableNameExample] REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = PAGE)
The table is 110 GB, so it will take time to compress, but it is only one table in a database with 60 tables. Why would executing this statement cause lock timeouts in Object Explorer in SQL Server Management Studio? Users cannot drill down into objects in this database without getting a lock timeout.
When I cancel the compression script, users of SSMS can access objects in this database again from the GUI. Why does compressing a specific table affect access to the metadata of all tables? I cannot find anything on the internet, but I am sure this has happened to other people.
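A likely explanation is that an offline rebuild holds a schema-modification lock on the table for its full duration, and Object Explorer's metadata queries end up waiting behind it. If the instance is Enterprise Edition, an online rebuild is one way around this; a minimal sketch:
ALTER TABLE [dbo].[TableNameExample] REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = PAGE, ONLINE = ON);  -- ONLINE = ON requires Enterprise Edition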
View 3 Replies
View Related
Sep 17, 2015
I am working on SQL Server 2012 and have multiple databases there. I want to move one of those databases to another SQL Server 2012 instance, and for that I was trying to get the approximate size of the database on the current server. As I don't have admin rights, I can't get it that way. Can I get an approximate size by right-clicking the database and using the Size property under the Database category?
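If the Properties dialog is blocked by permissions, a query run inside the database itself may still work; a minimal sketch (size is stored in 8 KB pages):
SELECT SUM(size) * 8 / 1024 AS total_size_mb,
       SUM(CASE WHEN type_desc = 'ROWS' THEN size ELSE 0 END) * 8 / 1024 AS data_mb,
       SUM(CASE WHEN type_desc = 'LOG' THEN size ELSE 0 END) * 8 / 1024 AS log_mb
FROM sys.database_files;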
View 4 Replies
View Related
Dec 15, 2014
How do I check for errors when a database is configured for AlwaysOn in SQL 2012?
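Beyond the AlwaysOn dashboard and the AlwaysOn_health extended events session, the DMVs show per-database synchronization state; a minimal sketch:
SELECT DB_NAME(drs.database_id) AS database_name,
       drs.synchronization_state_desc,
       drs.synchronization_health_desc,
       drs.log_send_queue_size,
       drs.redo_queue_size
FROM sys.dm_hadr_database_replica_states AS drs;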
View 1 Replies
View Related
Apr 21, 2014
I am trying to find the database size from a backup file. I have a SQL 2012 backup file; is there any way to find the estimated database size from the backup? I tried restoring and got an error saying "no space, need additional xxx bytes" ... does this error give the exact space needed to restore?
One more question: one of the backup files is 7.2 GB, and when I try to restore it, it throws an error saying it needs 292 GB of extra space while only 100 GB is available. How does a 392 GB database become a 7.2 GB .bak file?
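A backup contains only used pages (and may be compressed), while a restore re-creates the data and log files at their full recorded sizes, which is why a 7.2 GB .bak can demand hundreds of GB of disk. The file sizes a restore will need can be read from the backup without restoring it; a minimal sketch with a hypothetical path:
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\MyDb.bak';  -- hypothetical path
-- The Size column is in bytes, per data/log file inside the backup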
View 5 Replies
View Related
Jun 9, 2015
I have a 50 GB database with 3 files in the primary filegroup, each around 16 GB. I truncated 2 tables, releasing 33 GB, so the database should have around 17 GB of data now, but when I check the properties it says that the files don't have any empty space.
This is on MSSQL 2012 SP2 CU1.
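Truncating tables frees pages inside the files, but the files themselves only get smaller if they are shrunk, and the Properties dialog may not show the freed space as expected. A minimal sketch to compare allocated size with pages actually in use (the logical file name and target size are hypothetical):
SELECT name,
       size * 8 / 1024 AS allocated_mb,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024 AS used_mb
FROM sys.database_files;
-- Only if the smaller files are really needed (shrinking fragments indexes):
-- DBCC SHRINKFILE (N'MyDataFile1', 6000);  -- hypothetical logical name, target in MB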
View 6 Replies
View Related
Nov 4, 2003
The following procedure will display the size of all the user tables in a database.
CREATE proc sp_tablesize
as
begin
set nocount on

CREATE TABLE #SpaceUsed
(
TableName sysname,
TableRows int,
TableSize varchar(18),
DataSpaceUsed varchar(18),
IndexSpaceUsed varchar(18),
UnusedSpace varchar(18)
)

declare @tablename sysname
declare @cmd nvarchar(300)

declare c1 cursor for select name from sysobjects where xtype = 'u'
open c1
fetch next from c1 into @tablename
while @@fetch_status = 0
begin
-- sp_spaceused returns one row per table; collect them all in the temp table
set @cmd = N'exec sp_spaceused ' + quotename(@tablename)
insert into #SpaceUsed exec sp_executesql @cmd
fetch next from c1 into @tablename
end
close c1
deallocate c1

select * from #SpaceUsed
drop table #SpaceUsed
end
View 20 Replies
View Related
Jun 10, 2014
I need a script in SSIS to check data connections/sources and send an email if validation fails. I found this script online, but I'm not very good with scripting, and only VB scripting at that. How can I modify this script, or is there another one that will do what I need? The data connection in question links to Salesforce and will fail validation if the password/security token is invalid.
The reason I need this script is that we have no control over password changes and need to know if the Salesforce team changes the password without informing us.
/* The Script Task allows you to perform virtually any operation that can be accomplished in
* a .Net application within the context of an Integration Services control flow.
*
* Expand the other regions which have "Help" prefixes for examples of specific ways to use
* Integration Services features within this script task. */
#endregion
#region Namespaces
using System;
[code]...
View 9 Replies
View Related
Apr 14, 2015
I need to increase the file size for a mirrored database. I am new to using mirroring for replication. Will increasing the file size break the mirror?
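For what it's worth, growing an existing file is a logged operation that is replayed on the mirror, so it should not break the session as long as the mirror's disk has room for the growth; a minimal sketch with hypothetical names:
ALTER DATABASE MyMirroredDb  -- hypothetical database name
MODIFY FILE (NAME = MyMirroredDb_data, SIZE = 20480MB);  -- hypothetical logical file name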
View 2 Replies
View Related
Nov 17, 2014
I have 57 tables, 7 views and 1 stored procedure. I just wanted to know how I can find the size of the database based on these requirements, even though the DB contains lots of tables, views and procedures. I am moving these objects to a new DB server, so I need to put together the right requirements.
View 2 Replies
View Related
Jul 12, 2007
I have a DB that is 1.7 GB. The table data takes approximately 200 MB, and the transaction logs were truncated. Where else can this large size be coming from, and how can I confirm it?
DB is generally small. ~25 tables, 100 SPs, 10 views, etc.
Note:
I have 4 queues using SQL notifications, but selecting from them returns no data.
Thanks
Scott
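One way to confirm where the space is going is to list the largest objects by reserved space, including internal tables (Service Broker queues store their rows in internal tables); a minimal sketch, assuming SQL Server 2005 or later:
SELECT o.name, o.type_desc,
       SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.objects AS o ON o.object_id = ps.object_id
GROUP BY o.name, o.type_desc
ORDER BY reserved_mb DESC;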
View 1 Replies
View Related
Nov 16, 2015
I need to build a query to calculate the size of all tables in a database ...
View 5 Replies
View Related
Sep 9, 2014
Does SQL Server 2012 support the varbinary data type for replication (merge or transactional)?
And if so, is there a limit on the data size?
View 1 Replies
View Related
Jan 17, 2001
Does DTS do any data compression when transferring data from a table on Server A to another table on server B?
View 1 Replies
View Related
Feb 28, 2001
Can anyone tell me whether there is any data compression in SQL 6.5? I have concerns about network traffic and was wondering if data compression is a function of SQL 6.5, or if the compression would have to be coded into the actual database?
View 1 Replies
View Related
Jan 22, 2004
I have seen a bunch of ways to get the size of all the tables within a database posted on this board. I decided to modify an older one I found here (http://www.sqlteam.com/item.asp?ItemID=282). I set it up so there are no cursors or temp tables; it's pretty much just one SELECT statement that returns all the info you would need. It seems to be faster than anything I have seen so far. Take it for what it's worth. Thanks to the original creator.
/*
Original by: Bill Graziano (SQLTeam.com)
Modified by: Eric Stephani (www.mio.uwosh.edu/stephe40)
*/
declare @low int
select @low = low from
master.dbo.spt_values
where number = 1 and type = 'E'
select o.id, o.name, ro.rowcnt, (r.reserved * @low)/1024 as reserved,
(d.data * @low)/1024 as data, ((i.used-d.data) * @low)/1024 as indexp,
((r.reserved-d.data-(i.used-d.data)) * @low)/1024 as unused
from
sysobjects o
inner join
(select distinct id, rowcnt
from sysindexes
where keys is not null and first != 0) ro on o.id = ro.id
inner join
(select id, sum(reserved) reserved
from sysindexes
where indid in (0, 1, 255)
group by id) r on o.id = r.id
inner join
(select c.id, dpages+isnull(used, 0) data from
(select id, sum(dpages) dpages
from sysindexes
where indid < 2
group by id) c full outer join
(select id, isnull(sum(used), 0) used
from sysindexes
where indid = 255
group by id) t on c.id = t.id) d on r.id = d.id
inner join
(select id, sum(used) used
from sysindexes
where indid in (0, 1, 255)
group by id) i on d.id = i.id
where o.xtype = 'U'
order by reserved desc
View 3 Replies
View Related
Jun 21, 2007
Hello All,
When creating my database, I have modeled some of the tables after the AdventureWorks sample database.
There are some fields or entire tables in AdventureWorks that I do not see an immediate use for; however, I would hate to omit them only to find out later that they would have been beneficial (e.g., the territory table).
In general terms, what would the impact be on the size and performance of a database that contains tables or fields holding no data?
Thanks for your help!
View 1 Replies
View Related
Oct 15, 2015
I have a database consisting of two main tables and 12 sub tables.
This was leading to an increase in database size, so we thought of storing the sub-table data in the main tables as XML in a column of type varchar(2000).
We created a new database that only had the 2 main tables, and we stored the data of the sub tables in the new column of the main table.
Surprisingly, we saw that the database size increased rather than decreasing.
View 9 Replies
View Related
Jun 19, 2007
Hi All
I have a database which is 72GB, which is backed up every night as part of the maintenance plan. I have plenty of storage space, and the server that runs the database is fairly powerful (quad-processor 3.2ghz, 64bit, 48GB RAM) and is part of an active-passive cluster. The database backup is also copied to a SAN location.
My issue is with the size of the backup file. As part of the Disaster Recovery plan, I need to copy this database backup file across the network to a remote site, so that in the event of a disaster at the site, business can continue at the remote site after restoring the database backup file. However, my database backup file is so big that I cannot copy it across the network in time for the next morning. I have tried using WinRAR and have managed to produce a file about 20% of its original size, but it takes 2 hours to produce it.
Is there any recommended reading for this type of issue? Log shipping / mirroring has been investigated and will be part of the DR model, but the 'powers that be' insist on having a full copy performed to the remote site.
Any suggestions? Thanks in advance guys n gals :-)
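If or once the instance is SQL Server 2008 or later, native backup compression would replace the WinRAR step entirely; a minimal sketch with hypothetical names and paths:
BACKUP DATABASE MyBigDb  -- hypothetical database name
TO DISK = N'E:\Backups\MyBigDb_full.bak'  -- hypothetical path
WITH COMPRESSION, INIT, STATS = 10;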
View 4 Replies
View Related
Apr 30, 2007
I am replicating an 80 GB database between NY and CT and would like to know why table sizes are different between the two. Here is an example from sp_spaceused:
NY: IOI_2007_04_23 rows(279,664) reserved(464,832) data(439,960) index_size(24,624)
CT: IOI_2007_04_23 rows(279,666) reserved(542,232) data(493,232) index_size(48,784)
Thanks,
View 1 Replies
View Related
Sep 21, 2015
I have deleted nearly 30 million rows from a table. However, when I use the sp_spaceused command to calculate the space occupied by the table, I don't see any difference in the data size. In fact, the data size has increased by a few MB after the deletion, though not by much.
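Two things worth trying: refresh the space figures, since stale allocation metadata is a common cause, and rebuild the indexes, since deletes often leave partially empty pages behind (especially on heaps). A minimal sketch with a hypothetical table name:
EXEC sp_spaceused N'dbo.MyTable', @updateusage = N'true';  -- hypothetical table name
ALTER INDEX ALL ON dbo.MyTable REBUILD;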
View 8 Replies
View Related
Feb 28, 2005
Does anyone know a way to compress data between two database connections over "low bandwidth" lines in order to speed up data transfer?
Used connections are Oracle<->SQL2000 and SQL2000<->SQL2000.
View 3 Replies
View Related
Apr 27, 2008
Hi,
I'm trying to write a script that checks my database file and log sizes (in MB) and inserts them into a table. I need the following columns:
dbid, dbname, compatibility_level, recovery_model, db_size_in_MB, log_size_in_MB.
I tried to write this and got stuck:
select sysdb.database_id,sysdb.name,sysdb.compatibility_level,
sysdb.recovery_model_desc,sysmaster.size from sys.databases sysdb,sys.master_files sysmaster
where sysdb.database_id = sysmaster.database_id
can anyone help me with this script?
THX
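A minimal sketch of one way to finish it, pivoting data and log files into separate columns (size is stored in 8 KB pages); the results could then be inserted into a logging table:
SELECT d.database_id AS dbid,
       d.name AS dbname,
       d.compatibility_level,
       d.recovery_model_desc,
       SUM(CASE WHEN mf.type_desc = 'ROWS' THEN mf.size ELSE 0 END) * 8 / 1024 AS db_size_in_MB,
       SUM(CASE WHEN mf.type_desc = 'LOG' THEN mf.size ELSE 0 END) * 8 / 1024 AS log_size_in_MB
FROM sys.databases AS d
JOIN sys.master_files AS mf ON mf.database_id = d.database_id
GROUP BY d.database_id, d.name, d.compatibility_level, d.recovery_model_desc;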
View 13 Replies
View Related
Dec 27, 2006
Hi, I back up SQL Server 2000 and SQL server 2005 databases to hard disk using the SQL Server Backup Wizard and maintenance plans. Then, I copy the resulting backups to tape using third party tape backup software and compression by the backup software and hardware. I do not use the SQL Server Agent available for the third party backup software. Is this acceptable, or does the compression performed by the third party backup system introduce opportunities for database corruption or other negative effects?
Thanks
View 1 Replies
View Related
Jun 16, 2015
I tried to disable the data compression for granular backup purposes on SQL 2014 with the following query: EXEC [dbo].[prc_EnablePrefixCompression] @online = 0, @disable = 1
I received the following error message: Msg 2812, Level 16, State 62, Line 1: Could not find stored procedure 'dbo.prc_EnablePrefixCompression'.
View 7 Replies
View Related