I can run this command to make the change for one DB:

USE [master]
GO
ALTER DATABASE [DBName] MODIFY FILE ( NAME = N'Name_log', FILEGROWTH = 10000KB )
GO
Is it possible to create a script that changes the log file growth to, let's say, 100 MB for all DBs on all servers, instead of logging into each server and running the command above? We have about 200 servers and close to 3,000 DBs.
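One way to handle the per-server part is to generate the ALTER DATABASE statements from sys.master_files and execute them, then push that script to every server through a Central Management Server / registered server group or a sqlcmd loop over your server list. A minimal sketch, assuming it is the growth increment you want set to 100 MB (adjust the WHERE clause if system databases should be included):

DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql
    + N'ALTER DATABASE ' + QUOTENAME(DB_NAME(mf.database_id))
    + N' MODIFY FILE (NAME = N''' + mf.name + N''', FILEGROWTH = 100MB);'
    + CHAR(13) + CHAR(10)
FROM sys.master_files AS mf
WHERE mf.type_desc = 'LOG'
  AND mf.database_id > 4;          -- skip the system databases

PRINT @sql;                        -- review the generated statements first
-- EXEC sys.sp_executesql @sql;    -- uncomment to apply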
We are running into the following error while changing a column's data type from nvarchar(1200) to varchar(8000):

Msg 1105, Level 17, State 2, Line 1
Could not allocate space for object 'dbo.TBL1'.'PK_CL_ID' in database 'Client01' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
The statement has been terminated.
We then tried to change the log file's maximum size to unlimited:
ALTER DATABASE Client01 MODIFY FILE ( NAME = Client01_log, MAXSIZE = unlimited);
The query executes without error, but I do not see the autogrowth as unrestricted; it still shows 2 GB.
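Two things may be going on here. Msg 1105 is complaining about the PRIMARY filegroup, i.e. a data file, so it is the data file (or the disk under it) that needs space rather than the log. And for log files, UNLIMITED is internally capped at 2 TB, so the GUI keeps reporting a restricted maximum of 2,097,152 MB instead of "Unrestricted" even though the command succeeded, which may be what you are seeing. A quick way to check what is actually stored, sketched against the database named in the error:

USE [Client01];
SELECT  name,
        type_desc,
        size * 8 / 1024 AS size_mb,
        CASE max_size WHEN -1 THEN 'unlimited'
             ELSE CAST(max_size * 8 / 1024 AS VARCHAR(20)) + ' MB' END AS max_size,
        CASE WHEN is_percent_growth = 1 THEN CAST(growth AS VARCHAR(10)) + ' %'
             ELSE CAST(growth * 8 / 1024 AS VARCHAR(20)) + ' MB' END   AS growth
FROM sys.database_files;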
From BOL, I see these remarks with respect to the MODIFY FILE subcommand (my underline added):
Initializing Files
By default, data and log files are initialized by filling the files with zeros when you perform one of the following operations:
Create a database
Add files to an existing database
Increase the size of an existing file
Restore a database or filegroup
Which leads me to believe that expanding the size of a data file will also wipe out (my definition of 'initialize') any existing data within that file.
I may be misunderstanding 'initialize', because when I tested it out, I found this wasn't the case: my table data written to the file was still there after a resize.
I need to clarify to what degree I'd be taking a risk by increasing the file size on a data file which already has data in it.
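For what it's worth, 'initialize' here refers only to the space being added: the newly allocated region is zero-filled (or skipped, if instant file initialization is enabled for data files), while pages that already hold data are not touched, which matches what the test showed. A tiny repro sketch, with hypothetical database, file and table names:

USE [TestDb];
SELECT COUNT(*) AS rows_before FROM dbo.SomeTable;

ALTER DATABASE [TestDb]
    MODIFY FILE (NAME = N'TestDb_data', SIZE = 10240MB);   -- grow the data file

SELECT COUNT(*) AS rows_after FROM dbo.SomeTable;           -- same count as before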
Hi guys, if I have a temporary table called #CTE with the columns [Account], [Name], [RowID Table Level] and [RowID Data Level], and I need to change the column type of [RowID Table Level] and [RowID Data Level] to integer, and set the column [RowID Table Level] as an identity (starting from 1, incrementing by 1 each time), what would be the right syntax in SQL Server 2000?
I am trying to solve the question in the link below: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2093921&SiteID=1
Thanks in advance, Aldo.
I have tried the code below, but I am getting a syntax error...
ALTER TABLE #CTE ALTER COLUMN [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
I have also tried:
ALTER TABLE #CTE MODIFY [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
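Neither statement can work: ALTER COLUMN handles one column per statement, there is no MODIFY keyword in T-SQL, and an existing column cannot be turned into an IDENTITY. A common SQL Server 2000 workaround, sketched with the column names from the post, is to rebuild the temp table with SELECT ... INTO and the IDENTITY() function:

SELECT  [Account],
        [Name],
        IDENTITY(INT, 1, 1)             AS [RowID Table Level],
        CAST([RowID Data Level] AS INT) AS [RowID Data Level]
INTO    #CTE2
FROM    #CTE;

DROP TABLE #CTE;
-- continue working with #CTE2 (temp tables cannot easily be renamed back)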
SQL Server: by mistake I updated the values of a column in a database hosted online. Is there any way to undo the transaction? I didn't create any backup of the database. I read that the data can still be recovered through the .ldf (log) file, but I am unable to access it. Is there any way to get access to the log file, or any other way to recover the data?
My aim is to modify two fields to change their data type, but when I try to run the command in Query Analyzer, it says "Incorrect syntax near '('".
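The command itself isn't shown, but this error usually comes from wrapping the column definition in parentheses or trying to alter both columns in one statement. ALTER COLUMN takes one column at a time and no parentheses; a sketch with placeholder table and column names:

ALTER TABLE dbo.MyTable ALTER COLUMN Field1 VARCHAR(100) NULL;
ALTER TABLE dbo.MyTable ALTER COLUMN Field2 INT NOT NULL;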
Hello, I have gone to work for a new company and I have seen that this company is running out of drive space on a daily basis. The DBA before me created huge databases. For example, we have a database whose created size is 103 GB when only about 52 GB is ever needed. I know you can resize tempdb, but how about user databases? I have never tried this. Can this be done for a user database?
For tempdb: start SQL Server from the command line with sqlservr -c -f -s %instance_name%, then run ALTER DATABASE tempdb MODIFY FILE ...
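For a user database no restart is needed: a data file can be shrunk online with DBCC SHRINKFILE. A sketch with placeholder names and sizes; keep in mind that shrinking fragments indexes, so plan a rebuild/reorganize afterwards and leave sensible free space in the file:

USE [BigUserDb];
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files;                    -- find the logical file name

DBCC SHRINKFILE (N'BigUserDb_data', 60000); -- target size in MB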
Hi all, I was wondering how to do an ALTER command on a table without specifying column names, instead attempting to overwrite the table itself with the new fields specified. For instance, if I have Table_1 consisting of the following fields:
ID
FirstName
Surname
Then use the following ALTER command:
Code Snippet:

ALTER TABLE Table_1
(
    ID INT,
    FirstName VARCHAR(50)
)

This would then drop Surname from the table and leave only ID and FirstName in it. Is this possible? I have been searching Google but can't seem to find what I am looking for.
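There is no "overwrite" form of ALTER TABLE; you either drop the unwanted columns explicitly or rebuild the table to the new definition. A sketch using the names from the post:

ALTER TABLE dbo.Table_1 DROP COLUMN Surname;

-- or, to rebuild to an exact new definition:
SELECT ID, FirstName INTO dbo.Table_1_new FROM dbo.Table_1;
DROP TABLE dbo.Table_1;
EXEC sp_rename 'dbo.Table_1_new', 'Table_1';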
Is there a way to (automatically) remove/disable the leading statements like SET ANSI_NULLS ON and SET QUOTED_IDENTIFIER ON which are generated when you script a stored procedure as Modify via the SSMS 2014 interface?
-- SET ANSI_NULLS ON
-- SET QUOTED_IDENTIFIER ON
ALTER PROCEDURE [dbo].[sp_SendMail]
    @test INT = 0
AS
BEGIN
    --blabla
END
We have a proc that adds some fields to a few tables of ours, and normally there are no issues. For one of our client databases this process is taking anywhere from 5 to 10 minutes to add the fields, which causes the app to time out while waiting. After poking around, looking at the proc, and trying different things, I found it only happens for this one database and ONLY when there is data in the table. If I truncate the table and run the same procedure, everything is fine. The tables all have the same index on 4 columns, and the columns being added are not indexed because of the stupid hoops we have to jump through to pre-pivot data for our reporting package.
Database File Placement Layout? We are planning to implement a new SQL Server 2014 OLTP Database with a 1 TB Data file and 1 TB Log File. I am looking at the possible layout of the database files and trying to determine the best possible configuration. My knowledge/research tells me that items which need separate storage due to constant simultaneous access are:
Data files – should go on the fastest reading storage.
Log files – should go on the fastest writing storage.
TempDb – involves a lot of writing at the same time the data files are being read.
Indexes (including full-text indexes) – involve a lot of writing at the same time the data files are being read.
Also, is there any benefit to having multiple OLTP database log files? Because SQL Server writes to the log file sequentially, I do not see any advantage to having multiple database log files. In a SQL Server 2012 class I took last summer, under “Determining File Placement and Number of Files”, it states “Use a single log file in most situations as log files are written sequentially.”
I have installed SQL Server 2014 (Evaluation Edition) on a testing machine. We want to import some Excel files into a database. I manually created one Test database and am now trying to import an Excel file. The import completed successfully, but I am not able to see any table created as a result of the import. I tried it 3-4 times and even restarted the SQL services, but no luck.
I installed SQL Server 2005 Enterprise Edition (64-bit) and then applied SQL Server SP2. When I launch "SQL Server 2005 Surface Area Configuration" and choose "Surface Area Configuration for Features", I try to enable xp_cmdshell.
After ticking the checkbox, and clicking Apply, I receive the error below. Anyone have any ideas on how to fix this?
===================================
Alter failed. (Microsoft.SqlServer.Smo)
------------------------------ For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=9.00.3042.00&EvtSrc=Microsoft.SqlServer.Management.Smo.ExceptionTemplates.FailedOperationExceptionText&EvtID=Alter+Configuration&LinkId=20476
------------------------------ Program Location:
at Microsoft.SqlServer.Management.Smo.ConfigurationBase.Alter(Boolean overrideValueChecking)
at Microsoft.SqlSac.Public.Smo.SetSetting(Credentials credentials, DatabaseFeature feature, Int32 value)
at Microsoft.SqlSac.MainPanel.UserControlSSxp_cmdshell.ProcessOK(HashKey givenKey, String machineName)
at Microsoft.SqlSac.MainPanel.FormFeatures.commitDBChanges()
at Microsoft.SqlSac.MainPanel.FormFeatures.commitChanges()
===================================
Could not load file or assembly 'Microsoft.SqlServer.BatchParser, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. (Microsoft.SqlServer.ConnectionInfo)
------------------------------ Program Location:
at Microsoft.SqlServer.Management.Common.ServerConnection.GetStatements(String query, ExecutionTypes executionType, Int32& statementsToReverse)
at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
at Microsoft.SqlServer.Management.Smo.ExecutionManager.ExecuteNonQuery(String cmd)
at Microsoft.SqlServer.Management.Smo.ConfigurationBase.DoAlter(Boolean overrideValueChecking)
at Microsoft.SqlServer.Management.Smo.ConfigurationBase.Alter(Boolean overrideValueChecking)
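The underlying failure is the missing Microsoft.SqlServer.BatchParser assembly, which the Surface Area Configuration tool relies on; repairing the shared components/client tools installation may restore it. In the meantime, xp_cmdshell can be enabled without the GUI at all, using sp_configure (requires sysadmin or ALTER SETTINGS):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;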
I'm trying to change the file extension of one of the database files of a big database that contains 10 years of EPOS data. When this file was created, at the beginning of 2008, with the following script:
USE [Live]
ALTER DATABASE [Live] ADD FILE (
    NAME = 'Live_2008',
    FILENAME = 'E:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Live_2008'
) TO FILEGROUP [2008]

the .ndf file extension was not added to the filename, so the file now has a file type of "File" instead of "SQL Server Secondary Data File". I tried, in a test environment, to take the database offline, rename the file to "Live_2008.ndf" and bring the database back online, but it gives me an error saying it "cannot open the Live_2008 file", I'm guessing because it's looking for "Live_2008" and finds "Live_2008.ndf" instead. This file is quite big (5 GB) and I cannot afford to risk losing any data. The situation is complicated even further by the fact that it seems that a partition for 2008 does not exist. How can I find out what data this file contains? Is it going to be an issue if I leave the file like that? Thanks
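Renaming the file on disk is not enough because SQL Server still has the old physical name in its catalog. The usual approach, sketched below, is to update the catalog with MODIFY FILE while the database is offline, rename the operating-system file to Live_2008.ndf, and then bring the database back online; the last query is one way to see which tables/indexes sit on the [2008] filegroup:

USE [master];
ALTER DATABASE [Live] SET OFFLINE;

ALTER DATABASE [Live]
    MODIFY FILE (NAME = N'Live_2008',
                 FILENAME = N'E:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Live_2008.ndf');

-- rename the file to Live_2008.ndf in Windows, then:
ALTER DATABASE [Live] SET ONLINE;

-- what lives on the [2008] filegroup?
USE [Live];
SELECT o.name AS object_name, i.name AS index_name
FROM sys.indexes AS i
JOIN sys.objects AS o     ON o.object_id = i.object_id
JOIN sys.filegroups AS fg ON fg.data_space_id = i.data_space_id
WHERE fg.name = '2008';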
I'm attempting to alter a database by removing one of its two log files. I have truncated both log files successfully and the database has no active processes, but I get an error stating the file cannot be removed because it is not empty. Help!
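A log file can only be dropped once none of its virtual log files are still active, so truncating alone may not be enough. A sketch of the usual sequence, with placeholder names; it sometimes has to be repeated after another log backup before the REMOVE FILE succeeds:

USE [MyDb];
BACKUP LOG [MyDb] TO DISK = N'X:\Backups\MyDb_log.trn';  -- if the database is in FULL recovery
DBCC SHRINKFILE (N'MyDb_log2');
ALTER DATABASE [MyDb] REMOVE FILE [MyDb_log2];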
I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check on membership in the securityadmin role.
I have tested it and can see you can grant execute on xp_readerrorlog but the SSMS GUI uses sp_readerrorlog.
I thought I could create a user/certificate and add the signature to sp_readerrorlog but it's not permitted (likely because it's not a normal database object).
So the other solution is to add the users to the securityadmin role but then explicitly deny ALTER ANY LOGIN (best done with a custom server role in 2012+, otherwise just manually in 2008). I tested this out and it works: I'm not able to alter any logins or increase my own permissions. I also checked what's reported by fn_my_permissions(null, null), and it shows minimal permissions, as I'd expect.
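A sketch of the 2012+ variant described above, with placeholder role and login names; the DENY on the custom role trumps the permissions its members inherit from securityadmin:

USE [master];
CREATE SERVER ROLE [ErrorLogReaders];
ALTER SERVER ROLE [securityadmin] ADD MEMBER [ErrorLogReaders];
DENY ALTER ANY LOGIN TO [ErrorLogReaders];

ALTER SERVER ROLE [ErrorLogReaders] ADD MEMBER [DOMAIN\SomeUser];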
For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.
Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
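Pre-allocating the file itself is cheap; the main thing to keep in mind is proportional fill, which means a new, mostly empty file will take the bulk of new writes until the files even out. A sketch with placeholder names, sizes and paths:

ALTER DATABASE [MyDb]
ADD FILE (
    NAME = N'MyDb_data5',
    FILENAME = N'E:\SQLData\MyDb_data5.ndf',
    SIZE = 70GB,
    FILEGROWTH = 1GB
) TO FILEGROUP [MyFileGroup];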
On one server we had file growth, and then we had to add a new hard drive and a new file on it. Now we have a new server with a huge hard drive, but all the files remain. Can I reduce these files down to one data file or not?
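Yes, as long as the file you are removing is not the primary (.mdf) file: empty each extra file into the remaining files of the filegroup and then drop it. A sketch with placeholder names:

USE [MyDb];
DBCC SHRINKFILE (N'MyDb_data2', EMPTYFILE);     -- migrate its data to the other files
ALTER DATABASE [MyDb] REMOVE FILE [MyDb_data2];
-- repeat for each additional data file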
Hello! I have an MS SQL Server with a database that runs replication. In this database there is a table with a column I want to extend: varchar(50) -> varchar(60). But I get this error (using the design window of Enterprise Manager): "Cannot drop the table 'MytableName' because it is being used for replication." Thanks for the help, Bjoern
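The table designer in Enterprise Manager rebuilds the table by dropping and recreating it, which replication blocks. Altering the column in place with T-SQL avoids the drop; a sketch with a hypothetical column name (check how your replication topology handles schema changes, and keep the column's existing nullability):

ALTER TABLE dbo.MytableName ALTER COLUMN MyColumn VARCHAR(60) NULL;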
1) We are providing an e-governance solution for an organization, where we are providing a centralized database. The client has provided 5 database servers for this. How should we position the database servers? There are 5,000 concurrent users and 25,000 users in total, SAN storage of approx. 60 TB, a database size of 2 TB, and growth of 1 TB every year.
2) How many instances can we have for the above case?
My SQL databases in SQL Server 2014 have the status "suspect", as I saw in SQL Server Management Studio. I can't restore the databases to a serviceable condition through the standard procedures. I need to recover from the .mdf file.
When I run this, the table is loaded with data, but not in the intended way. This is what I have in the table:
If the 1st line in the text file has 35 columns and the row ends there, the 1st row in the table has correct info up to the 35th column, but instead of moving to the next table row for the next line in the file, it continues filling the next 5 columns of the same row before it goes to the next one. I think it's not picking up the row delimiter.
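The original command isn't shown, but this symptom almost always means the row terminator in the BULK INSERT (or bcp format file) doesn't match the line endings in the text file, so the loader just keeps reading fields into the remaining table columns. A sketch with placeholder table and path names; try '\r\n' versus '\n' (or the hex form '0x0a') depending on how the file was produced:

BULK INSERT dbo.TargetTable
FROM 'C:\Data\input.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\r\n',   -- or '\n' / '0x0a'
    FIRSTROW        = 1
);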
I have a database which gets refreshed daily by restoring one transaction log file. At the moment this is done manually, but I'm trying to create an automation job which would pick up the transaction log file and restore it.
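The core statement a SQL Server Agent job step would run is sketched below, with placeholder names and paths; the database has to remain in a restoring (or standby) state between restores, so finish with NORECOVERY, or with STANDBY if it needs to be readable in between. Built-in log shipping does essentially this, including picking up the newest file automatically.

RESTORE LOG [MyDb]
FROM DISK = N'X:\LogShip\MyDb_20240101.trn'
WITH NORECOVERY;
-- or: WITH STANDBY = N'X:\LogShip\MyDb_undo.dat'   -- readable between restores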
The log file of a production database has grown from a 4 GB to a 150 GB initial size. Now I want to find out when it grew, by how much, and which transactions were responsible.
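Recent auto-growth events (with their timestamps, sizes and the application/login that triggered them) are recorded in the default trace, so they can be listed as in the sketch below; note the default trace keeps only a limited rolling history, so very old growths may already have aged out:

DECLARE @trace NVARCHAR(260);
SELECT @trace = path FROM sys.traces WHERE is_default = 1;

SELECT  te.name                   AS event_name,
        t.DatabaseName,
        t.FileName,
        t.StartTime,
        t.IntegerData * 8 / 1024  AS growth_mb,
        t.ApplicationName,
        t.LoginName
FROM sys.fn_trace_gettable(@trace, DEFAULT) AS t
JOIN sys.trace_events AS te ON te.trace_event_id = t.EventClass
WHERE te.name = 'Log File Auto Grow'
ORDER BY t.StartTime DESC;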
1. I have SQL Server 2008 R2 running on my localhost.
2. Created database [Customer].
3. Created linked server [CUSTOMERLINK] using Microsoft Jet 4.0 to link to drive F:\Data, which has DBF files in it.
4. Created the dbo.Customer_Upload table.
5. INSERT INTO [Customer].[dbo].[Customer_UpLoad] ([Name],[Email]) SELECT NAME, EMAIL FROM [CUSTOMERLINK]...[CUS]
All this works fine. I can even put it in to an After Insert Trigger on another table and it works.
My problem is that I need this to work in a scheduled job.
F:\Data is just a folder with files in it.
This info is from a Restaurant POS system and I need to update it every night.
I have tried every which way to set up the security, as there isn't any login security on the folder and SQL Server Agent wants security.
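When the Agent job runs, the linked-server access no longer happens under your interactive Windows login but under the job owner / service account context, which is usually where this breaks. Two things worth checking, sketched with assumed names: the linked-server login mapping (the Jet provider expects its built-in 'Admin' user with a blank password), and NTFS read permission on F:\Data for the SQL Server and SQL Server Agent service accounts.

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'CUSTOMERLINK',
    @useself     = N'False',
    @locallogin  = NULL,      -- apply to all local logins
    @rmtuser     = N'Admin',  -- Jet's built-in default user
    @rmtpassword = N'';
-- and grant the SQL Server / SQL Server Agent service accounts read access to F:\Data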
I tried encryption on server "A" for database "AdventureWorks2012", then tried to restore it to server "B". There was the certificate issue, and I thought "of course: it's encrypted! Let's deactivate it." So off I went with ALTER DATABASE AdventureWorks2012 SET ENCRYPTION OFF. I looked at sys.databases: not encrypted. I backed up using no encryption and verified via msdb.dbo.backupset: not encrypted.
I moved my backup to my other server, where encryption was never configured (so no certificate, nothing...), and I get the error:

Msg 33111, Level 16, State 3, Line 1
Cannot find server certificate with thumbprint '0xFA130E58C999C4919B8975999C83A75A403B11D8'.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
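Turning encryption off is only the first step: the database encryption key, which is protected by the server certificate, still exists until it is dropped, and that keeps backups tied to the certificate. A sketch of the usual cleanup on server "A" before taking the backup to move (the alternative is simply to back up the certificate on "A" and create it on "B"):

USE [AdventureWorks2012];

-- wait for decryption to finish: encryption_state = 1 means 'unencrypted'
SELECT encryption_state
FROM sys.dm_database_encryption_keys
WHERE database_id = DB_ID();

DROP DATABASE ENCRYPTION KEY;

-- then take a fresh full backup and restore that one on server "B"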
I want to restore a database (from an encrypted .bak file), but *not* over the live original, if you take my meaning. The encryption is the standard AES-256 that comes with SQL Server 2014, by the way. I don't want the original touched/altered in any way, and I would like to capture a success message if possible. I can extract the physical device name of the backup in question using the following code:
SELECT physical_device_name, *
FROM msdb.dbo.backupmediafamily
WHERE media_set_id = (SELECT TOP 1 media_set_id
                      FROM msdb.dbo.backupset
                      WHERE database_name = 'MyDatabase' AND type = 'D'
                      ORDER BY backup_start_date DESC)
I would like the newly restored database to be renamed to something different from 'MyDatabase' (as shown above) and to use different data/log files than the original, if possible, and to capture a success message when it has been restored.
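Restoring under a new database name with WITH MOVE leaves the original completely untouched. A sketch with placeholder logical file names and paths (get the real logical names from RESTORE FILELISTONLY); the certificate that encrypted the backup must already exist on the restoring instance, and the TRY/CATCH provides the success or failure message:

DECLARE @backup NVARCHAR(260) = N'X:\Backups\MyDatabase.bak';

RESTORE FILELISTONLY FROM DISK = @backup;   -- shows the logical names to MOVE

BEGIN TRY
    RESTORE DATABASE [MyDatabase_Copy]
    FROM DISK = @backup
    WITH MOVE N'MyDatabase_Data' TO N'D:\SQLData\MyDatabase_Copy.mdf',
         MOVE N'MyDatabase_Log'  TO N'L:\SQLLogs\MyDatabase_Copy.ldf',
         RECOVERY, STATS = 5;
    PRINT 'Restore of MyDatabase_Copy completed successfully.';
END TRY
BEGIN CATCH
    PRINT 'Restore failed: ' + ERROR_MESSAGE();
END CATCH;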