SHRINKDATABASE And LOG File

Oct 22, 2004

Hi all, after using the "DBCC SHRINKDATABASE (MyDB, TRUNCATEONLY)" command my log file stopped growing. It still remains at 1024 KB after 4-5 days of activity. What might be the problem?
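
A likely explanation is the recovery model rather than the shrink itself: in SIMPLE recovery the log is truncated at every checkpoint and rarely grows on a lightly used database. A minimal check, using the MyDB name from the post:

-- Returns FULL, SIMPLE or BULK_LOGGED; SIMPLE would explain a log that never grows.
SELECT DATABASEPROPERTYEX('MyDB', 'Recovery') AS RecoveryModel

-- Shows log size and percent used for every database on the instance.
DBCC SQLPERF(LOGSPACE)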

View 1 Replies



SHRINKDATABASE.....

Nov 27, 2007

I want to shrink the log file in SQL Server 2005 (SP2).
What is the best way to do this in 2005? Are there any enhancements in SQL Server 2005?
Thank you,
Smith
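
For reference, a minimal sketch of the usual approach in SQL Server 2005: back up the log (if the database is in FULL recovery), then shrink the log file directly with DBCC SHRINKFILE. The logical file name MyDB_Log, the backup path and the 512 MB target are placeholders.

USE MyDB
GO
-- Back up the log so the inactive portion can be released (FULL recovery model).
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_log.trn'
GO
-- Shrink the log file itself to roughly 512 MB.
DBCC SHRINKFILE (N'MyDB_Log', 512)
GO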

View 4 Replies View Related

Shrinkdatabase Questions

Feb 26, 2002

I just inherited a database nightmare - and I've got a few questions I hope you folks can help with.

I ran a shrinkdatabase command from the enterprise manager (I should have used DBCC I realize now), and then my session died. The process is still going - but this is a huge database, and I'd like to know how long it will run. Is there any way to translate physical io (or anything else) to % complete?

This is being run to open up space on a disk array that is completely full. However, since kicking that off, additional storage has mysteriously been made available. <sigh> I assume that killing the shrink wouldn't involve rollbacks - but might leave the database in an unpredictable state. Is this correct?

Any other recommendations to speed this along? And, yes, I've got all the users off the system.
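
SQL Server 2000 does not expose a percent-complete figure for a shrink, but polling sysprocesses at least shows whether the session is still doing I/O or is blocked; a rough monitoring sketch (the spid value is a placeholder):

-- Run this repeatedly: if physical_io and cpu keep climbing and blocked = 0,
-- the shrink is still making progress rather than sitting on a lock.
SELECT spid, status, blocked, physical_io, cpu, lastwaittype
FROM master.dbo.sysprocesses
WHERE spid = 55   -- replace 55 with the spid running the shrink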

Thanks in advance for your insights...

Bart

View 1 Replies View Related

DBCC Shrinkdatabase

May 1, 2000

Does anyone have any idea how long DBCC SHRINKDATABASE takes to run? I have a 20GB database with 10.5 GB used. I know hardware is part of the equation, but I don't know the specifics of the machine this particular DB is on, other than it is an 8-CPU Compaq.

View 1 Replies View Related

Shrinkfile Vs Shrinkdatabase

Nov 27, 2001

Hi:

I need to shrink a few large databases. Has anyone had more success using DBCC SHRINKFILE compared to DBCC SHRINKDATABASE?
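
For comparison, a minimal sketch of the per-file approach, which targets one file at a time instead of every file in the database (the database name, logical file name and 10240 MB target are placeholders):

USE MyLargeDB
GO
-- List logical file names and current sizes first.
EXEC sp_helpfile
GO
-- Shrink a single data file toward a 10240 MB target.
DBCC SHRINKFILE (N'MyLargeDB_Data', 10240)
GO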

Thanks,

View 1 Replies View Related

Dbcc Shrinkdatabase

Sep 10, 2003

When using the above command on our database, I receive the error:

Transaction (Process ID 81) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Is that because we have users in the database while I'm trying to shrink it?

I re-ran the query and received the same messages.

thank you

View 1 Replies View Related

Dbcc Shrinkdatabase

Sep 28, 2006

I've been testing this in our test environment. Our database has grown to 18GB so I want to make it smaller.

I used

DBCC SHRINKDATABASE (dbname, TRUNCATEONLY)

because I want to free up the space that the file takes up on the server. Am I misunderstanding this? Should I not expect my .mdf to shrink?
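
TRUNCATEONLY only releases free space at the physical end of each file and moves no pages, so an .mdf whose free space sits in the middle will not get smaller. A sketch of the two forms for comparison (the 20 percent target is a placeholder):

-- Releases only unused space at the end of the files; no pages are moved.
DBCC SHRINKDATABASE (dbname, TRUNCATEONLY)

-- Moves pages toward the front of the files and then releases space,
-- aiming to leave about 20 percent free space in the database.
DBCC SHRINKDATABASE (dbname, 20)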

Thanks.

View 6 Replies View Related

DBCC SHRINKDATABASE

May 12, 2008

Hi,

Should I use the Shrink Database maintenance plan task once a month on all user DBs?
If so, what values should I put in the "Shrink database when it grows beyond XXX MB" and "Amount of free space to remain after shrink" settings?

THX

View 2 Replies View Related

DBCC SHRINKDATABASE

Feb 15, 2006

Hi,

I have a strange problem:
1) When I run "DBCC SHRINKDATABASE" (from my application) on a computer with SQL Server 2000, it works fine.
2) When I run "DBCC SHRINKDATABASE" (from my application) on a computer with MSDE that is connected to the network with another computer running SQL Server 2000, it works fine.
3) When I run "DBCC SHRINKDATABASE" (from my application) on a computer with MSDE that isn't connected to the network, it stops responding. I have waited for 15 minutes but it still doesn't finish, and the database isn't shrunk either. Why?
4) Likewise, when I run "DBCC SHRINKDATABASE" (from my application) on a computer with MSDE that is connected to the network but without another computer running SQL Server 2000, it stops responding, with the same result. Why?

Maybe "DBCC SHRINKDATABASE" needs some DLLs that are installed only with SQL Server 2000 and aren't delivered with MSDE? Perhaps SQL-DMO and/or SQL-NS?

My application is written in C# in VS.NET 2003. I am using OLE DB.

Thanks,
Alexei

View 1 Replies View Related

Shrinkdatabase Progess

Jul 28, 2007

Hi there,

How can I see the progress of DBCC SHRINKDATABASE on SQL Server 2000?

Thanks in advance.

Regards,
MKarumuru

View 5 Replies View Related

DBCC SHRINKDATABASE

Jun 15, 2006

Has anyone used
"DBCC SHRINKDATABASE"
in a VB.NET project to shrink a DB?
If yes,
please, if you don't mind, paste the code in this forum.

Tnx

View 4 Replies View Related

DBCC SHRINKDATABASE Duration

Nov 8, 2006

Is there any way to tell how long this will run for -- or how far it has got? I have a large database that has just had most of the data removed. The command has been running for 8 hours and I have just stopped it to let something else run quickly. Any way of telling how much longer it will take?
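
If this is SQL Server 2005 or later, sys.dm_exec_requests reports a percent_complete value for shrink operations; a minimal monitoring sketch (it does not apply to SQL Server 2000):

-- percent_complete and estimated_completion_time (ms) are populated
-- for DBCC SHRINKDATABASE / SHRINKFILE requests.
SELECT session_id, command, percent_complete,
       estimated_completion_time / 60000 AS est_minutes_left
FROM sys.dm_exec_requests
WHERE command LIKE 'Dbcc%'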

View 7 Replies View Related

Shrinkdatabase In Sql 2000 Vs Sql 2005

May 17, 2007

What is the difference in SHRINKDATABASE between SQL 2000 and SQL 2005? I am just doing a plain shrink, no reorg. It runs very fast in 2000 and runs forever in 2005.

View 1 Replies View Related

Shrinkdatabase Taking Forever...

Mar 17, 2008

Hi all,

2 weeks ago I deleted about 200GB of data from a 300GB+ database. It's a custom DB we want to use to test a few things. We wanted a smaller DB for our testing, and since we didn't have one, we grabbed a production backup, removed sensitive data and ran a large archiving script on it... Anyway, so far so good, but our data file was still the same size as before.

So we started a shrinkdatabase... it has been running for 2 weeks now! After about 1 week I interrupted the shrinkdatabase process and ran a
dbcc shrinkdatabase('DB', truncateonly)
just to see if the data file would get reduced a bit or not. It did get reduced by about 20GB. I assume that
dbcc shrinkdatabase('DB', 0)
had freed up enough pages at the end of the data file so a truncateonly was able to release some space... Anyway, after this we started the
dbcc shrinkdatabase('DB', truncateonly)
again... still running...

The database was never shrunk before and every index is highly fragmented... Is that why it's taking so long? Am I actually going to have to wait for another few weeks before that thing finishes??

Does anyone have experience running shrinks on large DBs?
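
One commonly suggested alternative for files this size, sketched here with placeholder names and targets: shrink the data file in smaller steps with DBCC SHRINKFILE (work already done is kept if a step is cancelled), then rebuild the heavily fragmented indexes afterwards, since shrinking fragments them further.

USE DB
GO
-- Step the file down a few GB at a time; repeat with lower targets.
DBCC SHRINKFILE (N'DB_Data', 120000)   -- target size in MB (placeholder)
GO
-- Shrinking shuffles pages, so rebuild the important indexes afterwards
-- (placeholder table name).
DBCC DBREINDEX ('dbo.BigTable')
GO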


thanks!

View 14 Replies View Related

Advice On DBCC SHRINKDATABASE

Mar 3, 2008



Hi everyone:

I have a database that appears to be much bigger than it should be. Looking at its properties in Enterprise Manager, it's 13GB, but it says 7GB is available as free space. It's like it grew, data was deleted, and it was never shrunk back down.

So I'm considering running a DBCC SHRINKDATABASE but am worried about the ramifications. Here are my concerns:

- Is the data in the database in any danger? Is the SHRINKDATABASE function safe?

- Can SHRINKDATABASE be run with the DB still in use? I'll run it after hours, but want to know if I need to put it in admin mode

- Will SHRINKDATABASE even do what I want?

- I've read that SHRINKDATABASE can fragment indexes. Is this true? If so, how do I avoid it?
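
As a rough sketch of how this is often approached: check how much of each file is actually unused, shrink with a modest free-space target, then measure fragmentation and rebuild whatever needs it. Names and targets below are placeholders; the shrink moves data pages but does not delete data.

-- Unused space per file (sizes are in 8 KB pages; divide by 128 for MB).
SELECT name,
       size / 128 AS size_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sysfiles

-- Shrink, leaving roughly 10 percent free space.
DBCC SHRINKDATABASE (MyDB, 10)

-- Check fragmentation on the large tables afterwards and rebuild as needed.
DBCC SHOWCONTIG ('dbo.SomeLargeTable')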

Thanks!
Norm

View 6 Replies View Related

HELP: Shrinkdatabase Not Freeing Unused Space

Feb 7, 2002

I am working with a large database that has its tables stored on a secondary filegroup. I'm trying to shrink the size of the files but I can't seem to get the system to free up the unused space. I've tried shrinkdatabase and shrinkfile both with and without the truncateonly option. Has anyone else had this problem? Is there a workaround? Any help would be greatly appreciated.
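
One way to see where the unused space actually sits is to list free space per file and per filegroup, then shrink the specific secondary file; a sketch with placeholder names and targets:

-- Free space per data file, with its filegroup (sizes are in 8 KB pages).
SELECT f.name, g.groupname,
       f.size / 128 AS size_mb,
       (f.size - FILEPROPERTY(f.name, 'SpaceUsed')) / 128 AS free_mb
FROM sysfiles f
JOIN sysfilegroups g ON f.groupid = g.groupid

-- Shrink the specific file on the secondary filegroup.
DBCC SHRINKFILE (N'MyDB_Secondary1', 4096)   -- logical name and MB target are placeholders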

Thank you.
Cathy

View 2 Replies View Related

Tran Log Backup Conflicts With Shrinkdatabase?

Apr 2, 2008

Hi:

I have a maintenance plan on DBABC with a backup-log-to-.trn job that runs every 90 minutes (daily).

In order to keep the log file small, I also set up a T-SQL job to run at 4:15 AM that does BACKUP LOG ABC WITH TRUNCATE_ONLY, then runs DBCC SHRINKDATABASE (DBABC, 10).

It looks like "BACKUP LOG ABC WITH TRUNCATE_ONLY" conflicts with the every-90-minutes transaction log backup.

Question: could I keep the transaction log backup every 90 minutes but still shrink the log file? The log file is growing very fast.

Or do I have to use differential backups instead of transaction log backups?
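
Since BACKUP LOG ... WITH TRUNCATE_ONLY breaks the log backup chain that the 90-minute job depends on, a commonly suggested alternative is to drop the TRUNCATE_ONLY step and, when the log needs trimming, shrink the log file right after one of the regular log backups; a sketch with placeholder path, file name and target:

-- A normal log backup (the 90-minute chain stays intact).
BACKUP LOG DBABC TO DISK = N'D:\Backups\DBABC_log.trn'
GO
USE DBABC
GO
-- Then shrink just the log file.
DBCC SHRINKFILE (N'DBABC_Log', 1024)   -- logical log name and MB target are placeholders
GO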

thanks
David

View 4 Replies View Related

How To Capture Results Of A DBCC SHRINKDATABASE?

Feb 28, 2008

So, basically I'm trying to do an INSERT INTO ... EXEC('DBCC SHRINKDATABASE'):


Code Snippet

DECLARE
    @SQL VARCHAR(1024)
    ,@DBName VARCHAR(512)

SET @DBName = 'admin'

IF OBJECT_ID('tempdb.dbo.#ShrinkDB') IS NOT NULL
    DROP TABLE #ShrinkDB

CREATE TABLE #ShrinkDB
(
    DbId INT
    ,FileID INT
    ,CurrentSize BIGINT
    ,MinimumSize BIGINT
    ,UsedPages BIGINT
    ,EstimatedPages BIGINT
)

SET @SQL =
'
INSERT INTO #ShrinkDB
    (DbId, FileId, CurrentSize, MinimumSize, UsedPages, EstimatedPages)
EXEC(''DBCC SHRINKDATABASE(' + @DBName + ')'')
'

EXEC(@SQL)

SELECT * FROM #ShrinkDB


and receive the following:

Cannot perform a shrinkdatabase operation inside a user transaction. Terminate the transaction and reissue the statement.

I've tried adding a BEGIN TRAN and COMMIT TRAN around it, but it doesn't help...

Is there any way around this? Is there any other way to capture the output of a shrink database from a procedure perspective?
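
INSERT ... EXEC runs the inner batch inside an implicit user transaction, which is exactly what the error complains about. One hedged workaround is to skip the DBCC result set entirely and record file sizes before and after the shrink instead (this sketch assumes SQL Server 2005 or later for sys.master_files; on 2000, master.dbo.sysaltfiles plays the same role):

-- Capture sizes before (size is in 8 KB pages).
SELECT name, size / 128 AS size_mb_before
INTO #SizesBefore
FROM sys.master_files
WHERE database_id = DB_ID('admin')

DBCC SHRINKDATABASE (admin)

-- Compare after the shrink.
SELECT b.name, b.size_mb_before, a.size / 128 AS size_mb_after
FROM #SizesBefore b
JOIN sys.master_files a
  ON a.name = b.name AND a.database_id = DB_ID('admin')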

Thanks

View 4 Replies View Related

Dbbs Shrinkdatabase Execution Times?

Oct 26, 2007

Hi guys,

I identified a problem today with one of our development DBs where temporary tables were not being cleaned up by some stored procedures, and this led to a large tempdb size (about 2 gigs of data and a translog of 25 gigs!).

I'm currently running a dbcc shrinkdatabase on it, but it's been running for over an hour now. In my experience with smaller DBs, this process normally takes a few minutes. Can anyone give me a ballpark figure as to how long I can expect this to run for? Are we talking an hour, a few hours, a day, a few days?

Thanks in advance

Matt

View 4 Replies View Related

Before Full Backup Is It Ok Running DBCC SHRINKDATABASE

Apr 16, 2008



Hi

I have a full backup job in SQL Server 2000 (designed by somebody else).

Step 1: DBCC SHRINKDATABASE
Step 2: DBCC SHRINKFILE --> it shrinks the log file
and then
BACKUP DATABASE()

Is it OK to shrink the database and log files before a full backup?

Your advice is appreciated.



View 10 Replies View Related

Reporting Services :: Parsing SSRS Config File And Dynamically Changing File Path Of Config File In Code

Sep 2, 2015

Currently I have a single hard-coded file path to the SSRS config file; the query parses the file and provides the Reporting Services web service URL. My question is: how would I run this same query against hundreds of servers that may or may not share the same file path as the hard-coded one?

Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.

I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically I need to inventory all of my Reporting Services URLs.

View 2 Replies View Related

File Will Not Attach To SQL Reason File Is Not Primary File

May 10, 2007

I have a customer running RAID 5 on a Windows 2000 server, and one of the drives went bad. The customer replaced the drive and the RAID rebuilt it; everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15GB so I know there is data in it; the error states that the file is not a primary file. Any clue on how to fix this?

mdf file size 5,738,944 KB
ldf file size 10,176 KB
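
That message usually appears when the file listed first in the attach is not the .mdf. A minimal sketch of a SQL Server 2000-style attach with the primary data file listed first (paths are placeholders):

EXEC sp_attach_db @dbname = N'MyDB',
    @filename1 = N'D:\Data\MyDB.mdf',    -- primary data file must come first
    @filename2 = N'D:\Data\MyDB_log.ldf'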

View 4 Replies View Related

An Attempt To Attach An Auto-named Database For File (file Location).../Database.mdf Failed. A Database With The Same Name Exists, Or Specified File Cannot Be Opened, Or It Is Located On UNC Share.

Sep 2, 2007

Greetings, I have just arrived back in the country (NZ) and back into ASP.NET.
I am having trouble with the following: An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since it is only temporary, I wanted a permanent shortcut on my desktop to link to my intranet page.
Does anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer.
Cheers ~ J
 

View 3 Replies View Related

Attach Mdf File And Data File Without Log File

Mar 1, 2006

How can I attach an MDF file with a corrupted/deleted log file?
My log file has been deleted; how can I attach the MDF file?
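
If the database was cleanly detached, the server can usually build a new log at attach time; a minimal sketch (paths are placeholders, and the second form needs SQL Server 2005 or later):

-- SQL Server 2000 / 2005: attach just the data file and let a new log be created.
EXEC sp_attach_single_file_db @dbname = N'MyDB',
    @physname = N'D:\Data\MyDB.mdf'

-- SQL Server 2005+ equivalent.
CREATE DATABASE MyDB
    ON (FILENAME = N'D:\Data\MyDB.mdf')
    FOR ATTACH_REBUILD_LOG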

View 1 Replies View Related

Maintenance Commands Affect On Log File / Log File Maintenance Without Log File Backups?

Jun 18, 2015

I am testing some maintenance task SQL commands, such as index rebuild, index reorg, update statistics and DB integrity check, on a SQL Server 2014 database. This is a new non-production vendor database (DB size 500 GB, log size 25 GB) which eventually will be created in production. Currently it is in the full recovery model and without log backups. The database has a whole lot of indexes. I am just trying to rebuild and reorganize all the indexes (that need it), in addition to trying to get an idea of how long these maintenance tasks will take and the space needed in the log file to complete these tasks/commands. I would like to execute these tasks manually (the first time) to gather the duration and space-required information. Eventually, I would probably schedule a weekly job to perform this maintenance.

I ran the index rebuild task on the database and noticed that the log file grew by over 50 GBs. I killed the process and truncated and shrunk the log file back down.

1. Do the index rebuild, index reorg, update statistics and DB integrity check commands all use the log file?

2. Does an index reorg have less impact on the log file than an index rebuild?

3. Should a truncate log and shrink log file be performed after these maintenance commands?

4. Should a full database backup be performed after these maintenance commands? Or before the maintenance commands?

I have read and understand that shrinking is not good for the database (could lead to more fragmentation and more data file growth when data is added) and I know about rebuilding indexes when fragmentation is GT 30% and reorganizing indexes when fragmentation is GT 5% and LE 30%.

Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.

5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
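
A minimal sketch of the simple-recovery test run described above, with the database name as a placeholder; the log-usage queries can be run while the index maintenance is executing to capture the figures being asked about:

-- Switch the test copy to SIMPLE so the log truncates at checkpoints
-- and index rebuilds can be minimally logged.
ALTER DATABASE VendorDB SET RECOVERY SIMPLE

-- Log size and percent used, instance-wide.
DBCC SQLPERF(LOGSPACE)

-- Per-database log usage (available on SQL Server 2012 and later).
SELECT total_log_size_in_bytes / 1048576 AS log_size_mb,
       used_log_space_in_bytes / 1048576 AS used_log_mb,
       used_log_space_in_percent
FROM sys.dm_db_log_space_usage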

View 3 Replies View Related

Help Please (Check File Exists/ Archive File/ Check If File Empty)

Mar 10, 2008



Hello World,

I'm new to SSIS and would like a little assistance getting started, if possible...


Here is what I want to do:


Check if file exists (C:\DTS Upgrade\Filexxx.txt) --->

Archive file (C:\DTS Upgrade\Archive) --->

Check if file has data (true or false)


AND/OR

If there are any good websites that have good directions, let me know.


Thanks in advance for your help!!!

View 5 Replies View Related

SQL Server 2012 :: Write A Process To Get File Size In Kb And Record Count In A File?

Jul 31, 2014

I need to write a process to get the file size in KB and the record count in a file. I was planning on writing a C# console app that takes the file path and name as a param; however, should I use a CLR?

I can't put a script in the SSIS package when it's bringing the file down, because it has been deemed that we only use SSIS for file consumption.

View 1 Replies View Related

SQL Server Admin 2014 :: Creating Additional Data File For A Particular File Group?

Jul 6, 2015

For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.

Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
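
Adding a pre-sized file is largely a one-off cost (creating and initializing the file); afterwards new allocations are spread across the files by proportional fill, so existing data is not rearranged. A minimal sketch with placeholder names, path and sizes:

ALTER DATABASE MyDB
ADD FILE
(
    NAME = N'MyDB_Data5',
    FILENAME = N'E:\Data\MyDB_Data5.ndf',
    SIZE = 70GB,
    FILEGROWTH = 1GB
)
TO FILEGROUP [DataFG]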

View 5 Replies View Related

Integration Services :: Flat File Error File Being Created In-spite Of No Errors

Jun 23, 2015

I have a package in which there is only one Data Flow Task with only three components: 1) a source, which is a SQL DB, 2) a destination, and 3) an OLE DB Destination error output written to a flat file. I want the error file to be created ONLY if there is an error while dumping the data into the destination DB. But the issue is, the error flat file is being created in spite of there being no error while dumping the data from source to destination.

View 5 Replies View Related

Integration Services :: Renaming / Replacing Part Of A File In File System Task

May 14, 2015

I'm copying files to a folder with the naming convention as follows in the source folder:

CM_ABC_MY_TEST.txt

In the destination folder, this filename needs to appear as:

CM_XYZ_MY_TEST.txt

In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc. But am having a hard time nailing down the exact syntax.

View 10 Replies View Related

Integration Services :: Get FileName Fo Each File Created Via Dynamic Flat File Destination

Jul 24, 2015

I need to know how I can get the dynamic filename created by the Flat File destination so I can insert it into a package audit table.

Scenario: I have created a package that successfully outputs dynamically named flat files { Format: C:\Test\'Comms_File_' + 'User::FileNumber' + '_' + Date + '.txt'

E.g.: Comms_File_1_20150724.txt, Comms_File_2_20150724.txt  etc} using Foreach Loop Container  :

* Enumerator set to: "Foreach ADO Enumerator" with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, thus 4 files to be created

* Variable Mappings: added User::FileNumber, which indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4

For the Data Flow task I have an OLE DB Source and a Flat File Destination, where the Flat File ConnectionString is set up as:

@[User::Output_Path] + "Comms_File" + @[User::FileNumber] + "_" + replace((DT_WSTR, 10)(DT_DBDATE)GETDATE(), "-", "") + ".txt"

All this successfully creates these 4 files:

Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, Comms_File_3_20150724.txt, Comms_File_4_20150724.txt

Now the QUESTION is how do I get these filenames, as I need to insert them into a DB audit table. The audit table looks like this:

CREATE TABLE dbo.MMMAudit
(
    AuditID         INT IDENTITY(1, 1) NOT NULL,
    PackageName     VARCHAR(100) NULL,
    FileName        VARCHAR(100) NULL,
    LoadTime        DATETIME NULL,
    NumberofRecords INT NULL
)

To save the filename and the number of records in each file in our audit table, I am using an Execute SQL Task and configuring it like this:

Execute SQL Task

Parameter mapping - mapped the user variable (RecordsInserted) and the system variable (PackageName) to the insert statement as shown below

SQLStatement: INSERT INTO [dbo].[MMMAudit]
(PackageName, NumberofRecords, LoadTime)
VALUES (?, ?, GETDATE())

Again, this all works terrifically and populates the dbo.MMMAudit table as shown below, BUT I also need to insert the respective file name - how do I do that?

AuditID  PackageName  FileName  NumberOfRecords
1        MMM          NULL      12
2        MMM          NULL      23
3        MMM          NULL      14
4        MMM          NULL      1
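
One hedged way to get the file name in: build the same expression used for the flat-file connection string into a package string variable (User::CurrentFileName below is a hypothetical name), map it as an extra parameter in the Execute SQL Task, and extend the insert, roughly like this:

INSERT INTO dbo.MMMAudit (PackageName, FileName, NumberofRecords, LoadTime)
VALUES (?, ?, ?, GETDATE())
-- Parameter mapping, in order: System::PackageName, User::CurrentFileName
-- (hypothetical variable holding the current file name), User::RecordsInserted.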

View 2 Replies View Related

How To Create, Install And Deploy Dts File For Sqlserver 2000 And Dtsx File For 2005 ?

Apr 24, 2007

Hi,



I am looking for tutorials about how to create dts and dtsx files.

Thanks for your help.



Arioule.

View 1 Replies View Related

Script Task To Compare File From FTP With File On Local Archive Behaved Strangely!

May 9, 2008

In my script task I have the following code. The task I'm trying to accomplish is:
If the filename on FTP can be found in the local archive folder on the e: drive then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on FTP cannot be found in the local archive folder on the e: drive then transfer the file to the local package folder on the d: drive.

While the script task is executing I was watching it closely, but the problem I saw is that:
If some files on FTP are already in the local archive folder and some are not, then the files which are already in the archive folder are dumped to the package folder; then after that, the files which are not in the archive folder are dumped to the package folder. But I only want the new files on FTP to be transferred to the package folder for further processing.

Then after this finished, I saw all the files in the package folder being refreshed one after another; after the first round of refreshes the second round started, and after the second round finished it stopped. I could see it refreshing itself because the 'Date Modified' of the files changed. And I saw the script task turn green.

I don't see how the code below produced this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving the way it is now? And how do I change the code to accomplish what I want? Thanks a lot!!

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
        cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
        cm.Properties("ServerUserName").SetValue(cm, "username")
        cm.Properties("ServerPassword").SetValue(cm, "password")
        cm.Properties("ServerPort").SetValue(cm, "21")
        cm.Properties("Timeout").SetValue(cm, "0")
        cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
        cm.Properties("Retries").SetValue(cm, "1")
        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/directory")
        Dim fileNames() As String
        Dim folderNames() As String
        ftp.GetListing(folderNames, fileNames)
        If fileNames Is Nothing Then
            MsgBox("NoFileOnFTP")
        Else
            Dim fileName As String
            For Each fileName In fileNames
                If File.Exists("c:\temp\" + fileName) Then
                    MsgBox("FileAlreadyThere")
                Else
                    ftp.ReceiveFiles(fileNames, "c:\temp", True, True)
                End If
            Next
        End If

        ftp.Close()
    End Sub
End Class

View 3 Replies View Related






