How can I get the table sizes for data and transaction logs just like we had in SQL Server 7.0 on the first screen of Enterprise Manager?
I remember having a bar showing used space in blue and unused in magenta. I bet there are a couple of functions that can be added to a script that will retrieve this info.
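There is no single function that reproduces that taskpad view, but the underlying numbers are easy to get; a small sketch of the two commands usually combined for it (run sp_spaceused in each database of interest):

-- Data size and unallocated space for the current database.
EXEC sp_spaceused;

-- Log size and percentage used for every database on the instance.
DBCC SQLPERF (LOGSPACE);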
The query below works perfectly fine, except that it produces many separate outputs instead of one continuous table that can be easily converted to XML / CSV by copying it from the "Results" window.
What I need is a query that will produce a single result set for all the tables in all the databases on the server.
The query:
DECLARE @begin INT = 1, @end INT, @sql NVARCHAR(MAX)
DECLARE @CREATE_TEMPLATE VARCHAR(MAX);
DECLARE @DBNAME VARCHAR(255);
DECLARE @SQL_SCRIPT VARCHAR(MAX);
SELECT @end = COUNT(name) FROM sys.databases
SET @CREATE_TEMPLATE = '
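If the goal is one continuous result set rather than one grid per database, a common alternative is to collect each database's figures into a temp table and select from it once at the end. The sketch below relies on the undocumented sp_MSforeachdb procedure and the older sysobjects/sysindexes tables (available as compatibility views on current versions); the #TableSizes table and its columns are illustrative, not part of the original query:

-- Gather per-table space usage from every database into one temp table.
CREATE TABLE #TableSizes
(
    database_name SYSNAME,
    table_name    SYSNAME,
    row_count     BIGINT,
    reserved_kb   BIGINT,
    data_kb       BIGINT
);

EXEC sp_MSforeachdb '
USE [?];
INSERT INTO #TableSizes (database_name, table_name, row_count, reserved_kb, data_kb)
SELECT  DB_NAME(),
        o.name,
        SUM(CASE WHEN i.indid < 2 THEN i.rows ELSE 0 END),
        SUM(i.reserved) * 8,   -- pages to KB
        SUM(i.dpages)   * 8
FROM    sysobjects o
JOIN    sysindexes i ON i.id = o.id
WHERE   o.xtype = ''U''
  AND   i.indid IN (0, 1, 255)
GROUP BY o.name;';

-- One continuous result set, easy to copy out as CSV or XML.
SELECT * FROM #TableSizes ORDER BY database_name, reserved_kb DESC;

DROP TABLE #TableSizes;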
Hi, I am looking to run a query to get the sizes of the tables in my SQL 7 DB. I know I can access the info in Enterprise Manager, under "Tables & Indexes", but I need to get this info via a query. I need rows and size. I figured out how to get the rows through the system tables:

select sysobjects.name, sysindexes.rows
from sysobjects, sysindexes
where sysobjects.id = sysindexes.id
and sysindexes.indid < 2
and xtype = 'U'

Is the size of each table stored in a system table as well? I can't find it.
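The page counts in sysindexes can be turned into sizes directly. A rough sketch (the arithmetic assumes the 8 KB page size used by SQL Server 7.0 and later):

-- Rows plus approximate reserved and data size per user table.
SELECT  o.name          AS table_name,
        i.rows          AS row_count,
        i.reserved * 8  AS reserved_kb,   -- pages * 8 KB
        i.dpages   * 8  AS data_kb
FROM    sysobjects o
JOIN    sysindexes i ON i.id = o.id
WHERE   o.xtype = 'U'
  AND   i.indid < 2                       -- heap (0) or clustered index (1)
ORDER BY i.reserved DESC;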
Is there a query I can run to retrieve a list of all tables and their sizes in a database? I want something like the feature in Enterprise Manager where you click on a database and then the 'Tables & Indexes' link: it lists the tables and their respective sizes. I want to push this into a spreadsheet.
The reason I am doing this is to compare data between 2 different databases. Since I cannot find a tool that will compare the data, the closest I can get (without bcp-ing out all the data and comparing) is to look at the size of each table.
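One way to get a spreadsheet-friendly list is to run sp_spaceused for every user table and capture the output in a single result set. The sketch below relies on the undocumented sp_MSforeachtable procedure; the #SpaceUsed table name is illustrative:

-- Run sp_spaceused for every user table and return one result set.
CREATE TABLE #SpaceUsed
(
    name       SYSNAME,
    rows       VARCHAR(20),
    reserved   VARCHAR(20),
    data       VARCHAR(20),
    index_size VARCHAR(20),
    unused     VARCHAR(20)
);

EXEC sp_MSforeachtable 'INSERT INTO #SpaceUsed EXEC sp_spaceused ''?'';';

SELECT * FROM #SpaceUsed ORDER BY name;

DROP TABLE #SpaceUsed;

Running the same script against both databases gives two lists that can be pasted side by side for comparison.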
I have a table in my database that has almost 45 columns, and the row size is 10,468 bytes. Most of the columns are varchar, and I think that, because of poor knowledge of the data, most of the varchar columns were given more length than they need. Now I want to decrease the size of those columns so that the row size comes down to around 8 KB. If I do this now, does it affect the table performance much? In fact, can I do this at all, given that there is a lot of data (almost 2 million rows) in the table? If it is possible, is there anything to take care of before changing the column lengths?
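Shrinking a varchar column is done with ALTER TABLE ... ALTER COLUMN. A minimal sketch (the table and column names are illustrative, and it assumes you have already confirmed that no existing value is longer than the new limit):

-- Check the longest existing value first, then shrink the column.
SELECT MAX(LEN(CustomerNotes)) FROM dbo.Orders;    -- must not exceed the new length

ALTER TABLE dbo.Orders
    ALTER COLUMN CustomerNotes VARCHAR(500) NULL;  -- restate the original NULL/NOT NULL setting

Note that shrinking the declared length does not reclaim space by itself; varchar values only ever use the bytes they need, so the main benefits are tighter validation and more accurate row-size estimates.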
-- #Databases must exist and match the result set of sp_databases.
CREATE TABLE #Databases
(
    DATABASE_NAME SYSNAME,
    DATABASE_SIZE INT,          -- size in KB
    REMARKS       VARCHAR(254)
);

INSERT #Databases EXEC ('EXEC sp_databases');

SELECT  @@SERVERNAME                            AS SERVER_NAME,
        DATABASE_NAME,
        DATABASE_SIZE                           AS 'KB',
        ROUND(DATABASE_SIZE / 1024.0, 2)        AS 'MB',
        ROUND(DATABASE_SIZE / 1024.0 / 1024, 2) AS 'GB',
        CONVERT(date, GETDATE())                AS Date
FROM    #Databases;
I have a few databases that are using Partitioned Views in order to manage the table sizes, and they all work well for our purposes. Recently I noticed a table that had grown to 400+ million rows and wanted to partition it as well, so I went about creating new base tables based on the initial table's structure, just adding a column to both the table and the primary key so that I could build a Partitioned View on them. The first time around, on a test system, everything worked flawlessly, but when I put the same structure in place on the production system I get the dreaded error: "UNION ALL view 'DBName.dbo.RptReportData' is not updatable because the primary key of table '[DBName].[dbo].[RptReportData_201405]' is not included in the union result. [SQLSTATE 42000] (Error 4444)".
I have searched high and low, and everything I see points to a few requirements for a UNION ALL view to be updatable:
- Need a partitioning column that is part of the primary key
- Need a CHECK constraint that makes the base tables exclusive, i.e. data cannot belong to more than one table
- Cannot have IDENTITY or computed columns in the base tables
- The INSERT statement needs to specify all columns with actual values, i.e. not DEFAULT
As far as I can tell, my structure fulfills these conditions, but the INSERT fails anyway. The CREATE scripts below were scripted from SQL Server; I only modified them to be on a single row, since it is easier to verify that they are identical in a text editor that way.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
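For reference, a minimal shape that SQL Server accepts as an updatable UNION ALL view looks roughly like the sketch below (table and column names are illustrative, not the poster's actual schema). The two details that most often trip Error 4444 are that the partitioning column must be part of each primary key and that the view must expose every primary key column of every base table:

-- Two monthly base tables with the partitioning column in the primary key.
CREATE TABLE dbo.Rpt_201405
(
    ReportMonth CHAR(6)       NOT NULL CHECK (ReportMonth = '201405'),
    ReportId    INT           NOT NULL,
    Amount      DECIMAL(18,2) NOT NULL,
    CONSTRAINT PK_Rpt_201405 PRIMARY KEY (ReportMonth, ReportId)
);

CREATE TABLE dbo.Rpt_201406
(
    ReportMonth CHAR(6)       NOT NULL CHECK (ReportMonth = '201406'),
    ReportId    INT           NOT NULL,
    Amount      DECIMAL(18,2) NOT NULL,
    CONSTRAINT PK_Rpt_201406 PRIMARY KEY (ReportMonth, ReportId)
);
GO

-- The view must list every primary key column of every member table.
CREATE VIEW dbo.RptReport
AS
SELECT ReportMonth, ReportId, Amount FROM dbo.Rpt_201405
UNION ALL
SELECT ReportMonth, ReportId, Amount FROM dbo.Rpt_201406;
GO

-- INSERTs through the view are routed to the right table by the CHECK constraints.
INSERT INTO dbo.RptReport (ReportMonth, ReportId, Amount)
VALUES ('201406', 1, 100.00);

Comparing this shape column by column with the production scripts (in particular, checking that no member table's primary key column is missing from the view's select list) is usually the quickest way to find what Error 4444 is complaining about.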
I have one .mdf and two .ndf files on the same drive. The .mdf file is 275 GB, one .ndf file is 300 GB and the other .ndf file is 135 GB. Is it normal to have 3 different file sizes? If not, what can I do to fix it? I don't have the option of making all files' initial size equal to the 300 GB .ndf. If I have to add an .ndf file (in case the above drive runs out of space), what initial size should I set for the new file on the new drive? And how does data get distributed across all 4 files (including the new .ndf on the different drive)?
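For what it's worth, different file sizes within a filegroup are normal: SQL Server uses proportional fill, writing more to the files with more free space, so sizes drift apart as files grow at different times. Adding a data file on another drive is an ALTER DATABASE operation; a minimal sketch (database, file and path names are illustrative, and the GB size suffix assumes SQL Server 2005 or later):

-- Add a fourth data file on a new drive to the default filegroup.
ALTER DATABASE MyDatabase
ADD FILE
(
    NAME = MyDatabase_Data4,                       -- logical name
    FILENAME = 'E:\SQLData\MyDatabase_Data4.ndf',
    SIZE = 100GB,                                  -- pick an initial size with growth headroom
    FILEGROWTH = 1GB
)
TO FILEGROUP [PRIMARY];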
I'm using SQL Server 2000 and need to restore a large database onto a different node. The problem is the original database has a 74 GB first data file, and the node where I need to restore it doesn't have a single drive that big. I'm trying to use a backup of the original database, restore it into an existing database on another node, and use the MOVE options to put the files in the right places.
Is there a way to run the restore so it splits the 74 GB data file across drives on my target node?
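For context, the MOVE clause can only relocate each logical file; a restore cannot split one data file into several smaller ones. A sketch of the syntax in question (backup, logical file and path names are illustrative):

-- Relocate files during restore; each logical file still restores as one physical file.
RESTORE DATABASE BigDB
FROM DISK = 'D:\Backups\BigDB.bak'
WITH MOVE 'BigDB_Data' TO 'E:\SQLData\BigDB_Data.mdf',
     MOVE 'BigDB_Log'  TO 'F:\SQLLogs\BigDB_Log.ldf',
     REPLACE;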
I have an instance with 4 data files for tempdb, each set at an initial size of 4 GB and a growth rate of 100 MB. After some time the initial file sizes seem to have changed automatically; they now read 3962, 100, 3688 and 2847 (MB) respectively. Is this something done by SQL Server itself? I cannot imagine that it was done manually.
I don't think there was a restart after the initial sizes of 4 GB were set; could this be related to the problem?
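To see exactly what the instance currently has on record for the tempdb files, the catalog views can be compared directly; a small sketch (assumes SQL Server 2005 or later, and that file growth is set in MB rather than percent):

-- Size recorded in master vs. current size inside tempdb, in MB (size columns are in 8 KB pages).
SELECT  mf.name,
        mf.size / 128   AS size_in_master_mb,
        df.size / 128   AS current_size_mb,
        mf.growth / 128 AS growth_mb
FROM    master.sys.master_files   AS mf
JOIN    tempdb.sys.database_files AS df
        ON df.file_id = mf.file_id
WHERE   mf.database_id = DB_ID('tempdb');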
I'm having a problem to which I'm sure the answer is simple...
All I want is a list of the databases on my server with their allocated size and the free space within each. Something similar to the first table that sp_spaceused gives you, but on a server-wide scale.
As I say, I'm sure there's a simple solution out there, but alas Google has failed me.
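One way to build that list is to run the same per-database query everywhere and collect the results. The sketch below uses the undocumented sp_MSforeachdb procedure, the per-database sysfiles table and FILEPROPERTY for the used space; the #DbSpace table name is illustrative, and log files are excluded so the figures reflect data-file space:

-- Allocated size and free space of the data files in every database.
CREATE TABLE #DbSpace
(
    database_name SYSNAME,
    file_name     SYSNAME,
    size_mb       DECIMAL(18,2),
    used_mb       DECIMAL(18,2),
    free_mb       DECIMAL(18,2)
);

EXEC sp_MSforeachdb '
USE [?];
INSERT INTO #DbSpace
SELECT  DB_NAME(),
        name,
        size / 128.0,
        FILEPROPERTY(name, ''SpaceUsed'') / 128.0,
        (size - FILEPROPERTY(name, ''SpaceUsed'')) / 128.0
FROM    sysfiles
WHERE   (status & 0x40) = 0;   -- 0x40 marks a log file
';

SELECT  database_name,
        SUM(size_mb) AS allocated_mb,
        SUM(free_mb) AS free_mb
FROM    #DbSpace
GROUP BY database_name
ORDER BY database_name;

DROP TABLE #DbSpace;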
Hi all, I have a problem updating one table's data from another table's data using SQL Server 2000. I have 2 tables named TableA (PID, SID, MinForms) and TableB (PID, SID, MinForms). I need to update TableA with TableB's data using a single query that I am including in a stored procedure.
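The usual SQL Server 2000 pattern for this is an UPDATE with a FROM/JOIN clause. A sketch that assumes PID and SID together identify the matching rows:

-- Copy MinForms from TableB onto the matching TableA rows.
UPDATE a
SET    a.MinForms = b.MinForms
FROM   dbo.TableA AS a
JOIN   dbo.TableB AS b
       ON  b.PID = a.PID
       AND b.SID = a.SID;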
An SSIS package that transfers data from a DB instance on SQL Server 2005 to SQL Server 2000 is extremely slow. The package uses an OLE DB Source and an OLE DB Destination for the transfer, which is basically one table from SQL Server 2005 to SQL Server 2000. The job takes 5 minutes to transfer about 400 rows at night, when there is very little activity on the server. During the day the job almost always times out.
On the SQL Server 2000 instances the job ran in minutes as the old 2000 package.
Is there an alternative to this? The Transfer Objects task does not work, as there is apparently a defect according to Microsoft. Please let me know if there is any option other than using an Execute DTS 2000 Package task or an ActiveX Script to read records from the source and insert them into the destination, since I am not certain how long that might take or how viable it would be.
SQL Server 2000. I am currently maintaining a table that contains 30 million+ records, 30 columns, and 11 indexes, and it will double within the next six months. The application that accesses this table, mainly for read-only purposes, runs without any problems. We have begun using Crystal Reports and are now having problems: when we create reports that access the large table, our server takes a significant performance dip. The application begins to time out and the reports take a very long time, even with simple selects on an indexed field.

I have begun looking into partitioning the large table on its key field and creating a partitioned view. But from what I have read, this will only help if we key on the partitioned field, and all other searches will actually take a little longer. Archiving old data is not an option; all the data is being used.

Any suggestions will be appreciated. Thanks in advance.
Rick
Unfortunately a table was deleted by me from my database. How can I recover it now? I have neither the data nor the structure of that table any more. It was a very important table, so please help.
CREATE TABLE T2
(
    I    INT UNIQUE,
    name VARCHAR(10),
    CHECK (name NOT IN ('US')),
    PRIMARY KEY (I, name)
)

Then I created the partitioned view using the script below:

CREATE VIEW V AS
SELECT * FROM T1
UNION ALL
SELECT * FROM T2

I am able to insert US records using the view: insert into V values (1, 'us')
Problem: I am not able to insert the non-US records like UK and JN... It throws an exception along the lines of "partitioning column not found", e.g.: insert into V values (1, 'uk')
Please help me insert the non-US records through the view.
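For comparison, here is a constraint form that the partitioned-view logic can use to route inserts (a sketch only; T1 was not shown in the post, so both tables and the country lists are illustrative). The key point is that each base table's CHECK on the partitioning column uses only the operators allowed for partitioned views (=, <, <=, >, >=, BETWEEN, AND, OR) rather than NOT IN, the column is NOT NULL, and it is part of the primary key:

-- Mutually exclusive CHECK constraints written with allowed operators.
CREATE TABLE T1
(
    I    INT         NOT NULL,
    name VARCHAR(10) NOT NULL CHECK (name = 'US'),
    PRIMARY KEY (I, name)
);

CREATE TABLE T2
(
    I    INT         NOT NULL,
    name VARCHAR(10) NOT NULL CHECK (name = 'UK' OR name = 'JN'),
    PRIMARY KEY (I, name)
);
GO

CREATE VIEW V AS
SELECT I, name FROM T1
UNION ALL
SELECT I, name FROM T2;
GO

INSERT INTO V (I, name) VALUES (1, 'US');   -- routed to T1
INSERT INTO V (I, name) VALUES (1, 'UK');   -- routed to T2

The trade-off is that every value must fall inside exactly one table's constraint, so an open-ended "everything except US" bucket has to be rewritten as an explicit list or range.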
Does anyone here have a ready-to-go SQL script that lists all databases, files, sizes, owner etc.? I guess it's a combination of sp_databases, sp_helpdb and sp_helpdb [db].
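A starting point that skips the stored procedures entirely, as a sketch against the SQL Server 2000-era system tables (on 2005 and later the same names exist as compatibility views):

-- One row per database file: database name, owner, logical/physical file name and size.
SELECT  d.name             AS database_name,
        SUSER_SNAME(d.sid) AS owner,
        f.name             AS logical_file_name,
        f.filename         AS physical_file_name,
        f.size / 128       AS size_mb          -- size is stored in 8 KB pages
FROM    master.dbo.sysdatabases AS d
JOIN    master.dbo.sysaltfiles  AS f
        ON f.dbid = d.dbid
ORDER BY d.name, f.fileid;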
Hi, anyone, please help! I have created a database-driven web application with ASP.NET and SQL Server 2000. Now I want to keep track of three operations (insert, update and delete) made on tables in the SQL Server 2000 database. What I did is:
1. create an audit table with columns: auditTable, actions, actionUser, actionTime
2. create three triggers (insert, update and delete respectively) for every table
My problem is that I cannot get the right user name. I use forms authentication and I store the user login information in the database. Every time, no matter who is logged in to the web application, the action user is always SA. I use the user_name() function to get the user name (actionUser). Please, can anyone help me get the current login user name, or tell me the best way to track operations on a table? Thanks, jili
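Because the web application always connects with the same SQL login, anything the trigger asks SQL Server (user_name(), suser_sname(), etc.) will reflect that login, not the forms-authentication user. One workaround, sketched below with illustrative table and column names, is to have the application write the web user into a column on the audited table and let the trigger copy it from the inserted rows:

-- The application sets LastModifiedBy on every insert/update;
-- the trigger records that value instead of the SQL login.
CREATE TRIGGER trg_Orders_Audit
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.AuditLog (auditTable, actions, actionUser, actionTime)
    SELECT  'Orders',
            CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE' ELSE 'INSERT' END,
            i.LastModifiedBy,      -- application user name supplied by the web app
            GETDATE()
    FROM    inserted AS i;
END

Deletes need the user name passed some other way, for example by routing them through a stored procedure that takes the application user as a parameter and writes the audit row itself.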
I have looked on Google and haven't found a query (as of yet) to perform this function.
Essentially I am using VB.NET with Excel and have a mapping between a worksheet and a table in my database. I wrote an import function to pull the data out of Excel and put it into SQL Server, but I want to try catching errors before I do that.
What is the SQL query to get the column sizes of a table? Meaning, in a table I have column1 that is allowed a size of 5 characters. How do I retrieve that information with a query as opposed to just looking at it in SQL Server Enterprise Manager?
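The declared sizes live in the catalog; a sketch using INFORMATION_SCHEMA, which is available in SQL Server 2000 and later (the table name is illustrative):

-- Data type, maximum character length and numeric precision for every column of one table.
SELECT  COLUMN_NAME,
        DATA_TYPE,
        CHARACTER_MAXIMUM_LENGTH,   -- NULL for non-character types
        NUMERIC_PRECISION,
        NUMERIC_SCALE,
        IS_NULLABLE
FROM    INFORMATION_SCHEMA.COLUMNS
WHERE   TABLE_NAME = 'MyImportTable'
ORDER BY ORDINAL_POSITION;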
Hi, can anybody give me some information on the table data type in SQL Server 2000? How and where can it be used? Does it make queries faster in cases where there are many concurrent users? Could anybody please give me these details, so that I can decide whether to use it or not?
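For reference, a table variable is declared and used like the sketch below; it holds intermediate results within a batch or stored procedure, much like a temp table (the Orders and Customers tables here are illustrative):

-- Declare a table variable, fill it, and join against it.
DECLARE @RecentOrders TABLE
(
    OrderId    INT      PRIMARY KEY,
    CustomerId INT      NOT NULL,
    OrderDate  DATETIME NOT NULL
);

INSERT INTO @RecentOrders (OrderId, CustomerId, OrderDate)
SELECT OrderId, CustomerId, OrderDate
FROM   dbo.Orders
WHERE  OrderDate >= DATEADD(day, -7, GETDATE());

SELECT c.CustomerName, r.OrderId, r.OrderDate
FROM   @RecentOrders AS r
JOIN   dbo.Customers AS c ON c.CustomerId = r.CustomerId;

It is not inherently faster than a temporary table; SQL Server keeps no statistics on table variables, so they tend to work best for small intermediate result sets.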
Venkatesh writes: "In an MS Access table a column is specified with the AutoNumber property. I want to migrate this table into SQL Server.
We can create a new column that has the identity property in SQL Server 2000, which simulates the AutoNumber property of MS Access.
My Access table contains 700 records and I need to set the column (ListID) as AutoNumber, i.e. I am going to modify this column to have the identity property.
But I can't set the identity property on the existing column.
Can you please send me the SQL query that modifies the existing column (ListID) to have the identity property?"
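The identity property cannot be added to an existing column with ALTER TABLE; the usual workaround is to rebuild the table with ListID declared as an identity column and copy the existing values across with SET IDENTITY_INSERT. A sketch (only ListID comes from the question; the other table and column names are illustrative):

-- Rebuild the table so ListID becomes an identity column, keeping the 700 existing values.
CREATE TABLE dbo.MyTable_New
(
    ListID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    -- ... remaining columns, defined exactly as in the original table ...
    Name   VARCHAR(100) NULL
);

SET IDENTITY_INSERT dbo.MyTable_New ON;

INSERT INTO dbo.MyTable_New (ListID, Name)
SELECT ListID, Name
FROM   dbo.MyTable;

SET IDENTITY_INSERT dbo.MyTable_New OFF;

-- Make sure new rows continue after the highest existing value, then swap the tables.
DBCC CHECKIDENT ('dbo.MyTable_New', RESEED);

EXEC sp_rename 'dbo.MyTable', 'MyTable_Old';
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';

Another option, if preserving the exact ListID values is not required, is simply to add a new identity column with ALTER TABLE ... ADD NewListID INT IDENTITY(1,1).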