Delete Huge Log File

Feb 15, 2006

How do I delete a very large log file to free up some hard disk space?
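One commonly cited approach for SQL Server 2000/2005 -- a minimal sketch, with MyDatabase and MyDatabase_Log as placeholder names. The log file itself cannot simply be deleted while the database is using it, but it can be truncated and shrunk:

BACKUP LOG MyDatabase WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyDatabase_Log, 100)  -- target size in MB
GO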

View 3 Replies



Does DB Size Decrease When I Delete A Huge Table ??

Jan 15, 2004

Hi,
My DB size (Right click on DB Name, Data Files tab, Space Allocated field) was 10914 MB.

I deleted a huge table (1.2 million records * 15 columns).
I checked the DB size again. It didn't change.
Shouldn't it decrease, because I deleted a huge table?
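Deleting rows frees space inside the data files, but the files themselves keep their allocated size. A minimal sketch to verify and then release the space (the database name is a placeholder):

EXEC sp_spaceused             -- shows reserved vs. unallocated space inside the files
GO
DBCC SHRINKDATABASE (MyDB)    -- returns unused pages to the operating system
GO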

View 14 Replies View Related

Delete Duplicate Records From Huge Tables

Jul 20, 2005

Hi All,

I've the following table with a PK defined on an IDENTITY column (INSERT_SEQ):

CREATE TABLE MYDATA (
    MID NUMERIC(19,0) NOT NULL,
    MYVALUE FLOAT NOT NULL,
    TIMEKEY INTEGER NOT NULL,
    TIMEKEY_DTTM DATETIME NULL,
    IID NUMERIC(19,0) NOT NULL,
    EID NUMERIC(19,0) NOT NULL,
    INSERT_SEQ NUMERIC(19,0) IDENTITY(1,1) NOT NULL
)
GO
ALTER TABLE MYDATA
ADD CONSTRAINT PK_MYDATA PRIMARY KEY (INSERT_SEQ)
GO

The TIMEKEY_DTTM field is generated, from the value actually inserted into the TIMEKEY field, by the following trigger:

CREATE TRIGGER TIMEKEY1
ON MYDATA
FOR INSERT AS
BEGIN
    DECLARE @M_TIMEKEY_DTTM DATETIME
    SELECT @M_TIMEKEY_DTTM = DATEADD(SECOND, INS.TIMEKEY + EP.GMT_OFFSET * 0, '1970-01-01 00:00:00.000')
    FROM INSERTED INS, LOCATIONINFO EP
    WHERE INS.EID = EP.EID
    UPDATE MYDATA
    SET TIMEKEY_DTTM = @M_TIMEKEY_DTTM
    FROM INSERTED INS, MYDATA MD
    WHERE MD.INSERT_SEQ = INS.INSERT_SEQ
END
GO

There is also a composite, non-unique index defined on the tuple (MID, IID, TIMEKEY, EID):

CREATE INDEX IX_METDATA ON MYDATA (MID, IID, TIMEKEY, EID)
GO

As a consequence of an application design change, I would like to change this index to be UNIQUE, but when I try to drop and create it I get an error, because the table stores some duplicated rows.

In order to successfully upgrade the index definition, I wrote some DML statements to look up and remove the duplicated rows, keeping only the first record inserted, i.e. the one with the lowest INSERT_SEQ:

--
-- This table stores the number of duplicated records eventually discovered
-- in the MYDATA table; the initial value for the NUM_DUPLICATES field is
-- 0 (no duplicated records)
--
DROP TABLE DUPLICATES
GO
CREATE TABLE DUPLICATES (
    TABLENAME VARCHAR(17),
    NUM_DUPLICATES NUMERIC(19,0)
)
GO
INSERT INTO DUPLICATES VALUES ('MYDATA', 0)
GO
INSERT INTO DUPLICATES VALUES ('CATEGORIESDATA', 0)
GO

--
-- ///////// CLEAN UP OF MYDATA TABLE
--
DROP TABLE TMP_MYDATA
GO
CREATE TABLE TMP_MYDATA (
    MID NUMERIC(19,0) NOT NULL,
    TIMEKEY INTEGER NOT NULL,
    IID NUMERIC(19,0) NOT NULL,
    EID NUMERIC(19,0) NOT NULL,
    INSERT_SEQ NUMERIC(19,0)
)
GO

--
-- Insert into the TMP_MYDATA table all the duplicated records for
-- the tuple (MID, IID, TIMEKEY, EID) and NULL for the INSERT_SEQ field
--
INSERT INTO TMP_MYDATA (MID, IID, TIMEKEY, EID)
SELECT MID, IID, TIMEKEY, EID
FROM MYDATA
GROUP BY MID, IID, TIMEKEY, EID
HAVING COUNT(*) > 1
GO

--
-- Update the INSERT_SEQ field to the lowest value in the group
-- of duplicated records
--
UPDATE TMP_MYDATA
SET TMP_MYDATA.INSERT_SEQ = (
    SELECT MIN(INSERT_SEQ)
    FROM MYDATA
    WHERE TMP_MYDATA.MID = MYDATA.MID AND
          TMP_MYDATA.IID = MYDATA.IID AND
          TMP_MYDATA.TIMEKEY = MYDATA.TIMEKEY AND
          TMP_MYDATA.EID = MYDATA.EID )
GO

--
-- Update the value of NUM_DUPLICATES for the MYDATA table
--
UPDATE DUPLICATES
SET NUM_DUPLICATES = (SELECT COUNT(*) FROM TMP_MYDATA)
WHERE TABLENAME = 'MYDATA'
GO

--
-- Delete from the MYDATA table all the duplicated records,
-- keeping only the row with the lowest INSERT_SEQ.
-- The delete is performed only if there are duplicated records;
-- this is achieved using a "short circuit" AND on the number of records
-- stored in the NUM_DUPLICATES field of the DUPLICATES table for
-- the MYDATA table.
--
DELETE FROM MYDATA
WHERE ( SELECT NUM_DUPLICATES FROM DUPLICATES WHERE TABLENAME = 'MYDATA' ) > 0 AND
EXISTS ( SELECT 1
         FROM TMP_MYDATA
         WHERE MYDATA.MID = TMP_MYDATA.MID AND
               MYDATA.IID = TMP_MYDATA.IID AND
               MYDATA.TIMEKEY = TMP_MYDATA.TIMEKEY AND
               MYDATA.EID = TMP_MYDATA.EID AND
               MYDATA.INSERT_SEQ > TMP_MYDATA.INSERT_SEQ )
GO

This technique works fine on a normal table (1M recs) but is not very performant on huge tables (>10M records)!

Do you know a better way to remove all the duplicate records, preserving the lowest INSERT_SEQ among the duplicates and also preserving the sequence seed, so that a new record inserted at time t1 > t0 is enumerated with an INSERT_SEQ|t1 > max(INSERT_SEQ)|t0?

Thanks a lot for your help!
Patrizio

PS. Sorry for such a large post!
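One set-based alternative, assuming SQL Server 2005 or later is available (ROW_NUMBER does not exist in SQL Server 2000): number the rows within each duplicate group and delete everything but the first. DELETE preserves the IDENTITY seed, so new INSERT_SEQ values stay above the old maximum. A sketch using the table from the post:

WITH NUMBERED AS (
    SELECT INSERT_SEQ,
           ROW_NUMBER() OVER (
               PARTITION BY MID, IID, TIMEKEY, EID
               ORDER BY INSERT_SEQ) AS RN
    FROM MYDATA
)
DELETE FROM NUMBERED WHERE RN > 1

On tables this size it may still be worth running the delete in batches to keep each transaction, and the log, small.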

View 1 Replies View Related

The Best Way To Delete A Huge Number Of Records In Table

May 30, 2008

Hi Everyone,



We have a large test database with millions of records for more than one company site code. Sometimes we want to refresh the data of that database for one or more site codes.

In order to do that, I have to delete all records for the site code we want to refresh from the test database first, then copy a new set of data over from the production database. Since we refresh data based on the site code, I have to use the DELETE command instead of TRUNCATE.

Since this is a huge database with thousands of tables and millions of records per table, I have performance issues with the DELETE command. So what would be the best way to delete a large number of records without writing much information to the database log file?



FYI: The Recovery model of this database is Simple
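DELETE is always fully logged, so the logging cannot be turned off entirely; but under the SIMPLE recovery model, deleting in batches lets the log space be reused at each checkpoint instead of growing. A sketch, assuming SQL Server 2005 and placeholder table/column names:

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.SomeTable
    WHERE SiteCode = 'ABC'        -- the site code being refreshed
    IF @@ROWCOUNT = 0 BREAK
    CHECKPOINT                    -- lets the log truncate between batches
END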


Regards,



Jdang

View 9 Replies View Related

What Should I Do With The Huge Log File

Oct 11, 2004

Hi all,
I found that my database log file is 26 GB while the database file is just about 280 MB. We do a full backup every day. However, my SQL Server seems to be running very slowly now, so please advise:

1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be a reason my SQL Server is slowing down?
3. Could anyone point me to more information about the transaction log?
Thank you!
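Full database backups never remove log records; in the FULL recovery model only log backups do. A minimal sketch with placeholder names -- schedule the log backup to run regularly (e.g., hourly), then shrink the file once:

BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'
GO
DBCC SHRINKFILE (MyDatabase_Log, 500)   -- target size in MB
GO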

View 4 Replies View Related

Log File HUGE!!

Oct 30, 2007

Hi guys, it's my first post! It's also about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005; one of our log files is 255 GB and needs to be made smaller very fast!! We are almost out of disk space and the log is growing fast.

I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things; I am just worried and don't want to mess anything up. I know this is probably a simple task, but like I said, with the truncate command I was reading about, I don't even know where to go to type it in! If someone could please help, it would be much appreciated. Thanks so much.
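Commands like these are typed into a new query window in SQL Server Management Studio (connect to the server, then click New Query). A hedged sketch for SQL Server 2005, assuming the content database does not need point-in-time recovery and using placeholder names -- take a full backup first, since SIMPLE recovery gives up point-in-time restores:

ALTER DATABASE WSS_Content SET RECOVERY SIMPLE
GO
DBCC SHRINKFILE (WSS_Content_log, 1024)   -- shrink to roughly 1 GB
GO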

View 14 Replies View Related

Huge .BAK FILE

Jan 28, 2008

I have a .bak file of 72 GB, but my database size is only 32 GB (a value I got from sp_spaceused).
Does anyone know why the .bak file is so big? Is it possible to reduce the size, and if so, how?
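A common cause is that every backup is being appended to the same file as a new backup set. A diagnostic-plus-fix sketch, assuming placeholder names and paths:

RESTORE HEADERONLY FROM DISK = 'D:\Backups\MyDatabase.bak'   -- lists every backup set in the file
GO
BACKUP DATABASE MyDatabase TO DISK = 'D:\Backups\MyDatabase.bak'
WITH INIT   -- overwrites existing backup sets instead of appending
GO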



http://www.sqlserverstudy.com

View 9 Replies View Related

Huge LDF File

Nov 13, 2007



Hi
this is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file is growing very fast: 40 GB, 50 GB, and now 57 GB, while the MDF file is around 15 MB. We created a backup and tried to restore it to another system, but the restore asks for 57 GB of free space. How do we proceed with the file recovery? We have backups, but restoring asks for more space for the log file. How can we retrieve the data?
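The restore needs 57 GB because the backup records the log file's size and recreates it. One hedged workaround, assuming placeholder names: shrink the log on the source server first, then take a fresh backup to restore elsewhere (SQL Server 2000 syntax):

BACKUP LOG MyDatabase WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyDatabase_Log, 100)
GO
BACKUP DATABASE MyDatabase TO DISK = 'D:\Backups\MyDatabase_small.bak' WITH INIT
GO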
rgds
Pramod

View 3 Replies View Related

SQL Backup File Is HUGE!!!

Sep 25, 2004

One of my databases is approximately 1.5 GB, but when I run a backup the backup file explodes to 38 GB.

What is the proper usage of Shrink Database? Is it safe, or is there another method to reduce the size?

View 2 Replies View Related

The Log File For My Database Is Huge

May 20, 2006

The log file for my database in SQL Server 2005 is huge. How do I empty it, or in effect shrink it or start it over? Thanks

View 1 Replies View Related

Huge Backup File

May 12, 2000

I am using SQL Server 7 and have about 5 databases. One of them has a data file of about 10 MB, and most of the others are larger. I do a nightly backup to both a local and a mapped drive. On both, the size of the backup file for this database is more than 500 MB, but the rest appear to be an appropriate size. Does anyone know why this would be happening? The database works fine; it does not get a lot of insert/delete activity, and I run DBCC every weekend. If anyone has any ideas, I would sure like to hear them.

View 1 Replies View Related

Huge Log File On SQL Server 7

Feb 21, 2001

I have a huge log file (285 MB) on SQL Server 7.
The database itself is about 10 MB.
How can I reduce the log file?
Is it possible to rebuild it from scratch?

I tried Truncate Transaction Log, but it didn't help.

View 2 Replies View Related

What Is The Best Way To Upload Huge File Into SQL 2K

Mar 13, 2002

I have a 2 GB text file (semicolon-delimited) which I need to
pump into SQL 2K. What is the best way to achieve this in the shortest time?

I tried DTS BCP; it took 1 minute 14 seconds to transfer 13,457 records. This is only 1/2000 of the records I need to transfer. Please help.
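One approach that is usually much faster than a row-by-row DTS pump is BULK INSERT with a large batch size and a table lock. A sketch, assuming a placeholder table, path, and newline row terminator:

BULK INSERT dbo.TargetTable
FROM 'D:\Data\bigfile.txt'
WITH (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    TABLOCK,
    BATCHSIZE = 100000
)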

Thanks.

regards,
Terry

View 2 Replies View Related

Import Of Huge XML File

May 22, 2013

I have an XML file of 44 GB (not MB, really GB), delivered by the Danish customs authorities.

My problem is simple: how do I import such a beast? I have seen a limit of 2.1 GB everywhere.

View 9 Replies View Related

'transaction Log Backup' File Is Huge

May 19, 2007

Hello all,

I have inherited a SQL 2000 server, and am therefore an absolute beginner with SQL 2000.

I know this has been covered before, but I don't know how to apply the KB article, as I don't know how to run the commands/scripts:
http://support.microsoft.com/kb/272318/

I can no longer back up the SQL database because the 'transaction log backup' file is about 17 GB, while the SQL database itself is only about 2 GB! The partition that it is backed up onto fills up every day.

Can anyone help me?
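On SQL Server 2000 these commands are run from Query Analyzer (Start > Programs > Microsoft SQL Server > Query Analyzer). A sketch of the approach in the KB article cited above, with placeholder names -- note that truncating discards log records that have not been backed up, so a full database backup should follow:

BACKUP LOG MyDatabase WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyDatabase_Log, 100)
GO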

Thanks
Puskas

View 4 Replies View Related

Turning Off Logging / Avoiding Huge Log File

Nov 15, 1999

Help,

We have a user who needs to make many data changes to a huge table with over 2 million rows.
He hopes to do this by running a series of queries.

Last time, our log grew gargantuan. I know 7.0 is wonderful and dynamic, but after
an hour of rambling through Books Online I'm more confused than ever.

Is there a way we can code hard checkpoints or log backups into his query at intervals? Do we need to DBCC SHRINKFILE the log file?
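One era-appropriate pattern (SQL Server 7.0 has no UPDATE TOP) is SET ROWCOUNT to cap each statement, with a log backup between batches so the log space can be reused. A hedged sketch with placeholder names; the WHERE clause must exclude rows already changed or the loop never ends:

SET ROWCOUNT 50000
WHILE 1 = 1
BEGIN
    UPDATE dbo.BigTable
    SET Status = 'DONE'
    WHERE Status <> 'DONE'
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'   -- or CHECKPOINT if "trunc. log on chkpt." is set
END
SET ROWCOUNT 0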

Thanks,

Mark Blackburn
mark@mbari.org

View 1 Replies View Related

Insert Huge Text File Into SQL Server

Dec 16, 2004

I am writing a .NET service, and I need to insert files into a SQL DB for temporary storage.

I have never inserted a file into a SQL database before. I am thinking about using the image field type. Has anyone done this before? How did you do it?
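A sketch of the table side, assuming placeholder names (the image type was the standard choice before varbinary(max) arrived in SQL Server 2005); from the .NET service, pass the file's bytes as a parameter to the INSERT:

CREATE TABLE dbo.FileStore (
    FileID INT IDENTITY(1,1) PRIMARY KEY,
    FileName NVARCHAR(260) NOT NULL,
    FileData IMAGE NOT NULL,
    UploadedAt DATETIME NOT NULL DEFAULT GETDATE()
)
GO
-- From .NET: INSERT INTO dbo.FileStore (FileName, FileData) VALUES (@name, @bytes)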

View 9 Replies View Related

Database Script Creates Huge Log File

Mar 20, 2008

I am creating a database for an application through script. After the tables, views, and sp's are created, the database is populated with data. After all of this (and before the application is even run), the log file is about 700MB. If I shrink the database, it takes the log down to 1MB. The mdf file is about 165 MB before and after it has been shrunk.

I have two questions:
1. Is there something I should look for in my database scripts, or is there a setting that could prevent the log from growing so large?

2. Is there a script I can run after the database has been created and populated to shrink it?
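On both points, one hedged option (assuming SQL Server 2005 and placeholder names) is to keep the database in SIMPLE recovery while the script populates it, then shrink the log at the end of the script:

ALTER DATABASE MyAppDb SET RECOVERY SIMPLE
GO
-- ... table creation and data population ...
CHECKPOINT
GO
DBCC SHRINKFILE (MyAppDb_log, 1)
GO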

Thank you for your help

View 2 Replies View Related

Move Current Huge Data File To New GPT Drive?

Oct 31, 2015

I have a database data file at almost 2 TB, maxing out a Windows drive; only 16 GB is left. Should I just add another data file on another Windows drive for growth, or move the current huge data file to a new GPT drive? Or do both: add another data file and move the existing one to its own new GPT drive?

Primary objective is to make do for now.
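Two hedged sketches, assuming placeholder names and paths. Adding a file on the new drive is the lower-risk option, since new growth simply lands there:

ALTER DATABASE BigDb
ADD FILE (NAME = BigDb_Data2,
          FILENAME = 'G:\Data\BigDb_Data2.ndf',
          SIZE = 100GB, FILEGROWTH = 10GB)

Moving the existing file instead requires repointing it, taking the database offline, and copying the physical file:

ALTER DATABASE BigDb MODIFY FILE (NAME = BigDb_Data, FILENAME = 'G:\Data\BigDb.mdf')
ALTER DATABASE BigDb SET OFFLINE
-- copy the .mdf to G:\Data\ while the database is offline, then:
ALTER DATABASE BigDb SET ONLINE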

View 1 Replies View Related

Question About Log File Size When Alter Huge Table

Oct 31, 2005

I have the next question and would like to hear what you think about it, and whether there is a better solution for my problem. I have a huge table with 60 GB of data (image files). The problem always happens when I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4); SQL Server then performs the ALTER TABLE command, which must be something similar to "insert into a new table + drop the actual table", and for that I need about 60 GB of space for my LOG file, and it takes hours to complete the operation.

Is this the only way to alter a single field in my table? I would like to hear your opinions. Thanks.

Alberto
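One hedged workaround, with placeholder names, that avoids rewriting every row inside a single transaction: add the wider column, copy values over in small batches (so each transaction, and the log, stays small), then swap the names:

ALTER TABLE dbo.BigImages ADD Code4 CHAR(4) NULL
GO
-- copy in batches, e.g. with SET ROWCOUNT and WHERE Code4 IS NULL, then:
EXEC sp_rename 'dbo.BigImages.Code', 'Code_old', 'COLUMN'
EXEC sp_rename 'dbo.BigImages.Code4', 'Code', 'COLUMN'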

View 2 Replies View Related

Simple Recovery Model Database - Huge Log File

Nov 3, 2015

I have a database that's in "Simple" recovery mode whose .ldf has grown to 270 GB. This database is a data warehouse, so "full" is not required. I put it in simple mode a month ago and shrank the log down, and now it has filled up the disk again.

What steps can I take to mitigate this in the future? I've read that this is caused by long-running transactions, which fill the log for recovery purposes. Should I put the database back into full mode and backup/truncate daily?

The auto-growth is set to 128 MB, which is very low.
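A diagnostic sketch to run before changing anything (SQL Server 2005 and later): check how full each log actually is and what is preventing reuse -- even in SIMPLE mode, a single long-running transaction can pin the entire log:

DBCC SQLPERF (LOGSPACE)
GO
SELECT name, log_reuse_wait_desc FROM sys.databases
GO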

View 3 Replies View Related

SQL Server 2008 :: Huge Volume Of Records To Copy To Excel File Through SSIS

Oct 22, 2015

I am copying data from a database to an Excel file through SSIS. The database is MS SQL 2005 and BIDS is also 2005. However, the job doing this task fails every time. As per our investigation, the result of the query is more than 100,000 rows, and we know that Excel has a limit of around 65,000 rows per sheet. Is there a way of raising the limit, or a better approach, so that everything will be copied to the Excel file successfully?
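One workaround is to page the source query so each data flow writes at most ~65,000 rows to a separate worksheet (the pre-2007 .xls format caps out at 65,536 rows). A sketch, assuming SQL Server 2005 and placeholder table/column names:

SELECT *
FROM (SELECT *, ROW_NUMBER() OVER (ORDER BY ID) AS RN
      FROM dbo.SourceTable) AS T
WHERE RN BETWEEN 1 AND 65000      -- next sheet: BETWEEN 65001 AND 130000, and so on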

The PrimeOutput method on component "Source - Query" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error Error: 2015-10-22 04:34:58.05 Code: 0xC0047021

Source: Data Flow Task
Description: SSIS Error Code DTS_E_THREADFAILED.

Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 4:30:00 AM Finished: 4:35:05 AM Elapsed: 304.844 seconds. The package execution failed. The step failed. "

View 1 Replies View Related

HELP: How To Avoid Ridiculous System Resource Issues When Connecting To A Huge Excel File

Sep 20, 2007

Hi all,

I have a 400MB Excel file that I consume from another automated process (don't ask). I copy this file down locally to my server, and I am attempting to create an SSIS package that points to this file via a connection manager. My computer starts gobbling up massive amounts of memory (devenv.exe gets up to about 800MB or so, then drops back down to 100MB) even when I attempt to rename the connection in the connection managers tab.

I have set all BypassPrepare to TRUE and ValidateExternalMetadata properties to FALSE, and still it can take up to 3 to 6 minutes for BI Dev Studio to respond. My specs:

Intel Centrino Duo 2.00 GHz
2GB RAM
XP Pro SP2

There MUST be a way for me to work effectively on a file of this size. Please help! Thanks much for any assistance.

Sincerely,

Brian Pulliam

View 4 Replies View Related

Huge Deletes In A Huge Table

Apr 3, 2000

SQL 7 SP1 NT4 SP5

I have a TRANSACTION table with 150 million rows.

I have a USER table.

Each user has about 600 records in the TRANSACTION table.

The TRANSACTION table's clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.

The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.

I want to go through the USER table and delete all users who have not visited me in a while.

I want to do this without substantially hindering performance in a production environment. I can spread this over a week or two if needed.

The best way I thought of doing this was to grab a batch of users in a cursor and loop through, deleting their corresponding TRANSACTION records.

Does anyone have ideas on a better way? What is going to happen to my indexes during this time?
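One alternative to the cursor, sketched in SQL Server 7.0 syntax with placeholder names: cap each DELETE with SET ROWCOUNT so every transaction stays small, and pause between batches so production work keeps flowing. The indexes are maintained on every delete, which is the main cost:

SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE T
    FROM [TRANSACTION] T
    JOIN dbo.USERS U ON U.USERID = T.USERID
    WHERE U.LASTVISIT < '19991001'
    IF @@ROWCOUNT = 0 BREAK
    WAITFOR DELAY '00:00:05'   -- give production work room to breathe
END
SET ROWCOUNT 0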

Thanks !!!

View 3 Replies View Related

SQL Server Admin 2014 :: Can Delete A Data-file Or File-group

Apr 27, 2015

On one server we had file growth, and we had to add a new hard drive and a new file on it. Now we have a new server with a huge hard drive, but all the old files remain. Can I consolidate these files into one data file or not?
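A hedged sketch, assuming placeholder logical file names: EMPTYFILE migrates a file's contents to the other files in the same filegroup, after which the file can be removed (the primary .mdf itself cannot be removed):

DBCC SHRINKFILE (MyDb_Data2, EMPTYFILE)
GO
ALTER DATABASE MyDb REMOVE FILE MyDb_Data2
GO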

View 3 Replies View Related

Integration Services :: File System Task - Set Source Variable And Pickup BAK File In Directory To Delete

Nov 9, 2015

I have created a File System Task which is contained in a Foreach Loop Container. I have .bak files populating a directory from a maintenance backup plan.

At some point I need to delete the .bak files after I've zipped them all up.

How do I set the SourceVariable to read through the directory and pick up just the .bak files to delete?

View 3 Replies View Related

Cannot Delete A File

Feb 22, 2001

Hi,
I have by mistake created a file named abc.ndf in the PRIMARY filegroup of a database.
Now, when I try to delete it, it says it can't be deleted because the file is not empty.
I want to get rid of this file.
Any suggestions?

View 2 Replies View Related

Delete Log File

Dec 9, 2004

Our server is getting pretty full. The data has already been backed up, but that did not create additional room on the server. Can I delete the log file as long as I have backed up the items I need? We are running Microsoft SQL 2000.

Thanks for the help!

View 4 Replies View Related

Delete Flat File

Dec 7, 2001

Can anyone point me to an example, or tell me how to run a job that deletes a file from a directory on a server? I have an FTP process that sends a large flat file to my SQL Server every hour, which after about 4 hours will fill my server. I need to delete this file from its D: drive location.
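A minimal sketch for a SQL Server Agent job step (the path is a placeholder; xp_cmdshell runs an operating-system command from T-SQL):

EXEC master..xp_cmdshell 'del D:\ftp\bigfile.txt'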

Thanks Reggie

View 4 Replies View Related

Delete 2nd Transaction Log File

Feb 9, 2005

Hello
We have a SQL Server 2000 database with 2 transaction log files.
The 2nd file was created when we were running out of disk space, and the person creating it was not familiar with the DBCC shrink command.

I now want to get rid of the 2nd log file. I ran the following steps with no success:

DBCC SHRINKFILE ('Log_file', EMPTYFILE )
--Message: Cannot shrink log file 3 (log_file) because all logical log files are in use.

ALTER DATABASE db1 REMOVE FILE 'Log_file'
--Message: The file 'Log_file' cannot be removed because it is not empty.


There are no users or open transactions in the database. I have also tried sp_detach_db and sp_attach_single_file_db, but that does not work either, as the database attaches both transaction logs back.
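One thing that often unblocks this, sketched with the names from the post: the active portion of the log may currently sit in the second file; backing up the log moves it, after which the EMPTYFILE and REMOVE FILE usually succeed on a retry (the backup path is a placeholder):

BACKUP LOG db1 TO DISK = 'D:\Backups\db1_log.trn'
GO
DBCC SHRINKFILE ('Log_file', EMPTYFILE)
GO
ALTER DATABASE db1 REMOVE FILE Log_file
GO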

Please advise.

Thanks

Nina

View 2 Replies View Related

Delete Backup File

Nov 15, 2005

Hi,
I have a database whose size is 64 GB, and I back it up every day with 1 day of retention.
But the maintenance plan does not delete the backup files older than 1 day.
This works fine for the other databases.
Do you know what the problem is?
Regards.

View 1 Replies View Related

Delete A File From The Server

Jul 20, 2005

I have a table in my database on SQL Server which holds a file name that refers to a file stored on the server. I would like to create a trigger to delete this file from the server if the row in the table is deleted. I have been trying to use this command in a trigger (<filename> is the name and path of the file):

xp_cmdshell "delete <filename>"

If someone could please help, I would appreciate it very much. I would love a code sample if you have one. Thank you so very much.

From,
Ryan
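A hedged sketch of such a trigger, assuming a placeholder table dbo.Files with a FilePath column. Note it handles one deleted row at a time; a multi-row DELETE would need a loop or cursor over the DELETED pseudo-table:

CREATE TRIGGER trg_Files_Delete ON dbo.Files
FOR DELETE AS
BEGIN
    DECLARE @path VARCHAR(260), @cmd VARCHAR(300)
    SELECT @path = FilePath FROM DELETED   -- single-row assumption
    SET @cmd = 'del "' + @path + '"'
    EXEC master..xp_cmdshell @cmd
END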

View 2 Replies View Related

How To Delete Text File

May 3, 2007

hi...
how do I delete text files from the C: drive (ex: c:sale300407.txt) using MS SQL Express Edition 2005?
Please help me.

View 13 Replies View Related






