Purge .ldf File

Jul 23, 2004

I'm assuming that the .ldf file houses all the transaction logs. Mine is getting quite large (about six times the size of my .mdf), and I'm wondering if there is a way to purge logs back to a specific date. Can anyone give me some direction? Thanks
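The log can't really be purged back to a specific date, but it can be truncated and then shrunk. A minimal sketch, assuming SQL Server 2000-era syntax and a hypothetical database/logical log name of MyDb / MyDb_Log:

BACKUP LOG MyDb WITH TRUNCATE_ONLY   -- or back the log up to disk if point-in-time recovery matters
DBCC SHRINKFILE (MyDb_Log, 500)      -- shrink the log file to roughly 500 MB

Regular log backups (or switching to the simple recovery model) keep the file from growing this large again.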

View 2 Replies



Purge A Log File

Aug 9, 2006

Hello,



I would like to use the built in logging feature to log to a text file.



Is there a way to purge the log periodically (only keep entries for the last 30 days, etc.)?



Thanks,



Michael
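Purging a plain text log usually means rotating or rewriting the file outside of SQL Server. If the log provider were pointed at a SQL Server table instead (sysdtslog90 is the default SSIS log table; that it applies to your setup is an assumption), a scheduled cleanup like this sketch would do it:

DELETE FROM dbo.sysdtslog90
WHERE starttime < DATEADD(day, -30, GETDATE())   -- keep only the last 30 days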

View 1 Replies View Related

DB Backup File Purge

Apr 25, 2002

Anyone have any hints on getting the DELBKUP option to work? I have a maintenance job that backs up the database and is also set to delete backups older than 1 day. This doesn't seem to be working.

Any thoughts?

Thanking you in advance.

View 1 Replies View Related

Purge

Aug 29, 2007



Hi Everyone.

I have a big database, and each table has an Active field (Y/N). I want to delete all records where Active is 'N'. How can I do that without writing a delete for each table and manually checking the detail tables first? Our database has complex relations.
Please, any solution?

Thanks a lot.
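There is no built-in cascade for this, so one common pattern is to delete from the detail (child) tables first and the parent tables last, driving everything off the parent's Active flag. A minimal sketch with hypothetical table and column names:

-- children first, joining back to the inactive parents
DELETE d
FROM dbo.OrderDetails AS d
JOIN dbo.Orders AS o ON o.OrderID = d.OrderID
WHERE o.Active = 'N'

-- then the parents themselves
DELETE FROM dbo.Orders
WHERE Active = 'N'

With complex relations you would repeat the child deletes for each dependent table, working from the leaves of the FK graph upward.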

View 4 Replies View Related

Purge Time

Oct 26, 2003

All, I have records with two columns: a date column and a time column.

I am trying to write a SQL statement to extract the rows that fall within a range of time.
Eg current date/time + 10 hours and
current date/time - 10 hours.

Kindly advise how I can do that.
Thanks.
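Assuming both columns are datetime (one holding only the date part, the other only the time part), they can be added together and compared against a +/- 10 hour window. A sketch with hypothetical names:

SELECT *
FROM dbo.MyTable
WHERE DateCol + TimeCol BETWEEN DATEADD(hour, -10, GETDATE())
                            AND DATEADD(hour,  10, GETDATE())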

View 4 Replies View Related

Purge Database

Feb 8, 2005

Is there any way to purge/truncate a database when the database is full or does not have free space?
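If every table can simply be emptied, one quick (if blunt) sketch uses the undocumented sp_MSforeachtable procedure. TRUNCATE reclaims space without logging each row, but it fails on tables referenced by foreign keys, which then need plain DELETEs:

EXEC sp_MSforeachtable 'TRUNCATE TABLE ?'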

View 4 Replies View Related

Purge A Big Table

May 14, 2007

Hi all, I have a table with about 67 million records
that are marked for deletion.

I know that I can

DELETE from table
WHERE ToBeDeleted='t'

But this may be too big a task for the server, considering the amount of data to delete at once. And if it runs out of resources and errors out, then nothing gets deleted...

Is there a way to segment or loop so I can delete about 100k records at a time?

Many thanks in advance
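A batched loop keeps each transaction small, so the log isn't overwhelmed and an aborted run still leaves the earlier batches deleted. A sketch assuming SQL Server 2005 or later and a hypothetical table name (on SQL 2000, SET ROWCOUNT 100000 before the DELETE does the same job):

WHILE 1 = 1
BEGIN
    DELETE TOP (100000) FROM dbo.MyBigTable
    WHERE ToBeDeleted = 't'

    IF @@ROWCOUNT = 0 BREAK   -- nothing left to delete
END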

View 8 Replies View Related

Archive And Purge

Dec 13, 2007

A client of mine just asked about archiving and purging their data (based on different time constraints). For instance, certain data would get archived to a different database after 3 years, and other data purged altogether. My thought off the bat is stored procedures run on a job schedule.

Any other thoughts or ideas?
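Stored procedures on a SQL Agent schedule are the usual approach. A minimal sketch of one archive step, with hypothetical database, table and column names:

BEGIN TRANSACTION

INSERT INTO ArchiveDB.dbo.Orders
SELECT * FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -3, GETDATE())

DELETE FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -3, GETDATE())

COMMIT TRANSACTION

For large volumes the insert/delete pair is usually batched rather than run in a single transaction.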

View 3 Replies View Related

SQL Server 2005 Purge?

Jun 9, 2006

Greetings,
I have a SQL Server 2005 database which is populated with test data. I need to copy this database to a new instance and then purge the copied database of its contents for the next round of testing.
I know there is a Copy Database Wizard within SQL Server Management Studio; I'm assuming I need to perform this first, then purge the existing data within the copy. The copy operation looks pretty straightforward, but I haven't a clue how to perform the purge.
Can someone help?
Regards,
Loopsludge
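After the copy, one way to empty the new database is to disable the foreign key constraints, delete everything, and re-enable them. A sketch using the undocumented sp_MSforeachtable procedure (identity columns may also need a DBCC CHECKIDENT reseed afterwards):

EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
EXEC sp_MSforeachtable 'DELETE FROM ?'
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'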

View 1 Replies View Related

Purge Log Dump Device

Feb 22, 1999

Suppose I am appending to my transaction log dump device every half hour, thus adding 48 log dumps per day. How can I purge my transaction dump device to keep only the last week's worth of these logs? I do not want to issue an INIT in the command, because this will wipe it completely. I noticed an EXPIREDATE and RETAINDAYS parameter for the DUMP command. Can I use these to selectively purge the backup device, or will these allow the device to be wiped clean as well?

Thanks in advance,
Ed Molinari
Emerald Solutions

View 2 Replies View Related

Why Won't My Transaction Log Backups Purge

Oct 5, 2004

Hi

We are running SQL Server 2000 and making full database backups using a maintenance plan. The transaction logs are being backed up via a separate plan.

However, the transaction log backups aren't purging but the database ones are. They are both run under the same id and both write to the same directory. Old text reports also get purged fine. I've got 'Delete older than 1 week' ticked for everything.

Can anyone think of anything else to check?
Thanks

View 7 Replies View Related

Looking For Simple Purge Of Records From A Table

May 21, 2007

Greets all, I've got a table with batches of records. Each group of records has a batch ID as part of the PK in the form BTCXXXX, where XXXX is an auto-incremented number. So let's say I have 100 batches of 20k records per batch in the table, and the distinct batch IDs are BTC0200 (oldest batch) through BTC0300 (newest batch).
I only want to keep the 90 most recent batches in the table at any given time.
Is it OK to just subtract 90 from the last batch ID and do something like:

DECLARE @batch_id char(10)
DECLARE @last_num int

-- derive the latest batch number, subtract 90, and pad back to 4 digits
SELECT @last_num = MAX(CAST(SUBSTRING(BATCH_ID, 4, 4) AS int)) FROM ITEM_BATCH
SET @batch_id = 'BTC' + RIGHT('0000' + CAST(@last_num - 90 AS varchar(10)), 4)

DELETE FROM ITEM_BATCH
WHERE BATCH_ID < @batch_id

I want to cover both the case where the table has more than 90 batches and the case where it has fewer than 90. Is this a feasible approach?

View 4 Replies View Related

DB Mirroring Monitor Job - Purge Old Data?

Dec 14, 2006

Hi,

Not sure if this question makes sense, but is it necessary to purge old data in msdb tables used by the db mirroring monitor job?

I'm just wondering if an insert into the data table every minute of the day would still be needed a month from now. I'm thinking this data would be useful for the purpose of "alerts" and to have access to its recent history, but other than that, is it recommended (or necessary)? Would these records keep accumulating until manually purged?

TIA.

View 7 Replies View Related

Transact SQL :: How To Purge Data From A Table

Nov 4, 2015

We have a staging table into which data is dumped from files. The staging table is truncated for every load. In order to retain the data from the staging table, we are creating a staging_purge table that holds the staging data. What is the fastest way to copy data from the staging table to the purge table without impacting the load process?
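One fast option is a single set-based insert with a table lock, which can be minimally logged when the database is in the simple or bulk-logged recovery model and the target is a heap. A sketch, assuming the purge table has the same structure as the staging table:

INSERT INTO dbo.Staging_Purge WITH (TABLOCK)
SELECT * FROM dbo.Staging

Running this immediately before the TRUNCATE step of the load keeps the copy out of the load's critical path.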

View 13 Replies View Related

Automatic Purge Of Transaction Log Data?

Jan 25, 2007

Hello,

I have a database that I am setting up in SQL Server 2005. Initially, I am doing very large imports of data. Every time I run an import, I am having to increase the size of my transaction logs, and now they are approaching 2 GB. Should these be purging themselves? I have to keep increasing the max size of the log so that I can get my data in. While this will work for now, it is not a long-term solution, because I can see the log size growing quite large, and the amount of space on the server obviously isn't infinite. Is there a setting that I can change so they will automatically purge? If not, how do I purge this information myself?

Thanks so much!

Christine
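In the full recovery model the log does not clear itself unless log backups are taken. For large one-off imports, a common sketch is to switch to the simple (or bulk-logged) recovery model for the duration and shrink the file afterwards; the database and logical log file names here are hypothetical:

ALTER DATABASE MyDb SET RECOVERY SIMPLE
-- run the imports
DBCC SHRINKFILE (MyDb_Log, 1024)          -- shrink the log back to ~1 GB
ALTER DATABASE MyDb SET RECOVERY FULL     -- if full recovery is needed, follow with a full backup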

View 4 Replies View Related

Purge Records From Table In A Weekly Schedule

Jul 23, 2002

Hello all,

I hope someone can help me with a big problem... I'm using Citrix Resource Management Services with a SQL 2000 database. There are 15 Citrix servers which are all reporting to the SQL database.

The database is expanding very quickly and is becoming slower and slower.

My question is: I want to schedule a purge of old records on a friday afternoon, like this:

WEEK 1 - MON / FRI
WEEK 2 - MON / FRI (Friday's purge records week1)
WEEK 3 - MON / FRI (Friday's purge records week2)
etc...

Is this possible? If yes, how do I do it?

Thank you very much for any info!!

Daan
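This is typically done with a SQL Agent job scheduled for Friday afternoon that runs a date-based delete. A sketch with hypothetical table and column names (the actual RMS tables would be substituted in):

DELETE FROM dbo.RMSRecords
WHERE RecordDate < DATEADD(day, -7, GETDATE())   -- purge everything older than the previous week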

View 1 Replies View Related

SQL 2012 :: How To Purge Old Stored Procedures From Clients

Jul 11, 2014

I have very old versions of duplicate stored procedures on my databases that I would like to drop. I know there is no "safe" way to identify unused procedures using DMVs alone, so I am planning to combine that with a trace. But I would like to get others' opinions about that.

Here's the DMV I am planning to use:

SELECT
CASE WHEN database_id = 32767 then 'Resource' ELSE DB_NAME(database_id)END AS DBName
,OBJECT_SCHEMA_NAME(object_id,database_id) AS [SCHEMA_NAME]
,OBJECT_NAME(object_id,database_id)AS [OBJECT_NAME]
,cached_time
,last_execution_time
,execution_count

[Code] ....

I will save the results to a local table and run it every 5 minutes, maybe? Or at an interval equal to or lower than the PLE?
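For reference, a sketch of how the full query presumably looks, assuming the elided portion reads from sys.dm_exec_procedure_stats (which only covers procedures currently in the plan cache, hence the need to combine it with a trace):

SELECT
    CASE WHEN database_id = 32767 THEN 'Resource' ELSE DB_NAME(database_id) END AS DBName,
    OBJECT_SCHEMA_NAME(object_id, database_id) AS [SCHEMA_NAME],
    OBJECT_NAME(object_id, database_id) AS [OBJECT_NAME],
    cached_time,
    last_execution_time,
    execution_count
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID()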

View 5 Replies View Related

Baffled. How To Purge Old Backups With Maint Plan

Apr 13, 2006

I'm missing something basic...

Trying to create a new job from SQL Server 2005 Management Studio. The nifty wizard under Management provides options to create backups and clean up history, but I can't seem to find out how to create a job to purge old backups.

Looking at the commnand line Sql Server Agent Job shows:

/SQL "Maintenance PlansFull Backups" /SERVER "JFSLIB-DEV" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF

But as you can see, that doesn't help a novice like myself, as it appears to be just the command options to run the "integration" package.

Any help is appreciated. Thanks!
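In the 2005 maintenance plan designer this is the Maintenance Cleanup Task, which has to be added to the plan separately from the backup task. Under the covers it calls the undocumented xp_delete_file procedure; a hedged sketch of that call (path, extension and cutoff date are placeholders):

EXEC master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', N'2006-04-06T00:00:00', 1
-- arguments: 0 = backup files, folder, extension, delete files older than this date, 1 = include subfolders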

View 3 Replies View Related

SQL Server 2012 :: Purge Process On A Large Table

Jan 9, 2014

I am attempting to do a rather simple purge task on a very large table. This task will need to take place daily and delete records older than 6 months from the database. On the first pass this will delete well over 130 million rows. I thought the best way to handle this is to create a proc and call the proc from a SQL Agent job that runs nightly. Here is an example of the script:

CREATE PROCEDURE usp_Purge_WCFLogger
AS
SET NOCOUNT ON
EXEC sp_rename 'dbo.logs', 'logs_work'
GO
SELECT * INTO dbo.Logs_Backup FROM dbo.Logs_Work WHERE TIMESTAMP < DATEADD(month, -6, GETDATE())

[Code] .....
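As an alternative to the rename/swap approach, a batched delete inside the procedure keeps the table online and the transactions small. A sketch (the _Batched name is made up; the dbo.Logs table and TIMESTAMP column come from the post above):

CREATE PROCEDURE dbo.usp_Purge_WCFLogger_Batched
AS
SET NOCOUNT ON

WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dbo.Logs
    WHERE [TIMESTAMP] < DATEADD(month, -6, GETDATE())

    IF @@ROWCOUNT = 0 BREAK
END
GO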

View 3 Replies View Related

Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS Sql Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up the query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also a publisher in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly so I'm only allowed about a 3-5 hour period to take an outage a week.

So far I've tested several delete methods following all best practices (batch deletes, using indexes in the delete's WHERE clause), and have come up with deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on an 8 GB RAM, 4-processor machine that is not currently used or replicated.

I'm at a loss for a way to pare down the data with a delete, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.

Any one have experience with purging such a massive amount of data? Any help would be greatly appreciated.
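For what it's worth, a sketch of the copy-and-swap idea mentioned above, with hypothetical names (replication makes this harder in practice, since the article generally has to be dropped and re-added around the swap):

SELECT *
INTO dbo.BigTable_Keep
FROM dbo.BigTable
WHERE SomeDate >= DATEADD(month, -12, GETDATE())   -- the ~40 million rows to keep

-- rebuild indexes/constraints on dbo.BigTable_Keep here, then swap the names
EXEC sp_rename 'dbo.BigTable', 'BigTable_Old'
EXEC sp_rename 'dbo.BigTable_Keep', 'BigTable'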

View 6 Replies View Related

Integration Services :: Purge Data In Transaction Table Or Delete Some Data And Store In Separate Table

Aug 18, 2015

How do we purge data in a transaction table, or delete some data and store it in a separate table in the data warehouse?
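One option on SQL Server 2005 and later is to archive and delete in a single statement with an OUTPUT clause; the archive table must already exist with matching columns. Table and column names below are hypothetical:

DELETE FROM dbo.Transactions
OUTPUT DELETED.* INTO dbo.Transactions_Archive
WHERE TranDate < DATEADD(year, -1, GETDATE())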

View 7 Replies View Related

Reporting Services :: Parsing SSRS Config File And Dynamically Changing File Path Of Config File In Code

Sep 2, 2015

I currently have a single hard-coded file path to the SSRS config file; the query parses the file and provides the Reporting Services web service URL. My question is, how would I run this same query against hundreds of servers that may or may not share the same file path as the hard-coded one?

Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.

I know I can string together the address followed by "reports" and named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically, I need to inventory all of my Reporting Services URLs.

View 2 Replies View Related

File Will Not Attach To SQL Reason File Is Not Primary File

May 10, 2007

I have a customer running RAID 5 on a Windows 2000 server, and one of the drives went bad. The customer replaced the drive and the RAID rebuilt it; everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15 GB, so I know there is information in it. The error states that the file is not a primary file. Any clue on how to fix this?

mdf file size 5,738,944 KB
ldf file size 10,176 KB

View 4 Replies View Related

An Attempt To Attach An Auto-named Database For File (file Location).../Database.mdf Failed. A Database With The Same Name Exists, Or Specified File Cannot Be Opened, Or It Is Located On UNC Share.

Sep 2, 2007

Greetings, I have just arrived back into the country (NZ) and back into ASP.NET.
I am having trouble with the following: An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since it is only temporary, I wanted a permanent shortcut on my desktop to link to my intranet page.
Anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer.
Cheers ~ J
 

View 3 Replies View Related

Attach Mdf File And Data File Without Log File

Mar 1, 2006

How can I attach the .mdf file when its log file is corrupted or deleted?
My log file was deleted; how can I attach the .mdf file?
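On SQL Server 2005 and later, a cleanly shut-down data file can be attached with the log rebuilt automatically; a sketch with a hypothetical name and path (if the database was not shut down cleanly, this can fail and recovery gets more involved):

CREATE DATABASE MyDb
ON (FILENAME = N'C:\Data\MyDb.mdf')
FOR ATTACH_REBUILD_LOG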

View 1 Replies View Related

Maintenance Commands Affect On Log File / Log File Maintenance Without Log File Backups?

Jun 18, 2015

I am testing some maintenance task SQL commands, such as index rebuild, index reorg, update statistics and DB integrity check, on a SQL Server 2014 database. This is a new, non-production vendor database (DB size 500 GB, log size 25 GB) which will eventually be created in production. Currently, it is in the full recovery model and without log backups. The database has a whole lot of indexes. I am just trying to rebuild and reorganize all the indexes (that need it), in addition to trying to get an idea of how long these maintenance tasks will take and the space needed in the log file to complete them. I would like to execute these tasks manually (the first time) to gather the duration and space-required information. Eventually, I would probably schedule a weekly job to perform this maintenance.

I ran the index rebuild task on the database and noticed that the log file grew by over 50 GBs. I killed the process and truncated and shrunk the log file back down.

1. Do the index rebuild, index reorg, update statistics and DB integrity check commands all use the log file?

2. Does an index reorg have less impact on the log file than an index rebuild?

3. Should a truncate log and shrink log file be performed after these maintenance commands?

4. Should a full database backup be performed after these maintenance commands? Or before the maintenance commands?

I have read and understand that shrinking is not good for the database (could lead to more fragmentation and more data file growth when data is added) and I know about rebuilding indexes when fragmentation is GT 30% and reorganizing indexes when fragmentation is GT 5% and LE 30%.

Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.

5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
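A sketch of the simple-recovery approach described above, with a hypothetical database name and backup path (if the vendor ends up requiring full recovery, a full backup after switching back restarts the log chain):

ALTER DATABASE [VendorDb] SET RECOVERY SIMPLE
-- run index rebuilds/reorgs, UPDATE STATISTICS, DBCC CHECKDB here
ALTER DATABASE [VendorDb] SET RECOVERY FULL
BACKUP DATABASE [VendorDb] TO DISK = N'D:\Backups\VendorDb.bak'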

View 3 Replies View Related

Help Please (Check File Exists/ Archive File/ Check If File Empty)

Mar 10, 2008



Hello World,

I'm new to SSIS and would like a little assistance getting started, if possible...


Here is what I want to do:


Check if file exists (C:\DTS Upgrade\Filexxx.txt) --->

Archive file (C:\DTS Upgrade\Archive) --->

Check if file has data (true or false)


AND/OR

If you know of any good websites that have good directions, let me know.


Thanks in advance for your help!!!

View 5 Replies View Related

SQL Server 2012 :: Write A Process To Get File Size In Kb And Record Count In A File?

Jul 31, 2014

I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as parameters; however, should I use a CLR?

I can't put a script in the SSIS package that brings the file down, because it has been deemed that we only use SSIS for file consumption.

View 1 Replies View Related

SQL Server Admin 2014 :: Creating Additional Data File For A Particular File Group?

Jul 6, 2015

For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.

Do I come across any performance issues if I create/pre-allocate an additional data file in the same file group so that the existing files don't grow too much?
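Pre-allocating the file up front generally avoids problems rather than causing them; the one-time cost is the allocation itself (instant file initialization makes that fast for data files). A sketch with hypothetical names, sizes and paths:

ALTER DATABASE [MyDb]
ADD FILE (
    NAME = N'MyDb_Data5',
    FILENAME = N'E:\SQLData\MyDb_Data5.ndf',
    SIZE = 50GB,
    FILEGROWTH = 5GB
) TO FILEGROUP [DataFG]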

View 5 Replies View Related

Integration Services :: Flat File Error File Being Created In-spite Of No Errors

Jun 23, 2015

I have a package in which there is only one Data Flow Task, and it has only three components: 1) a source, which is a SQL DB, 2) a destination, and 3) an OLE DB Destination error-output flat file. I want the error file to be created ONLY if there is an error while dumping the data into the destination DB. But the issue is that the error flat file is being created in spite of there being no error while dumping the data from source to destination.

View 5 Replies View Related

Integration Services :: Renaming / Replacing Part Of A File In File System Task

May 14, 2015

I'm copying files to a folder with the naming convention as follows in the source folder:

CM_ABC_MY_TEST.txt

In the destination folder, this filename needs to appear as:

CM_XYZ_MY_TEST.txt

In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc. But am having a hard time nailing down the exact syntax.
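Assuming the source file name and destination folder are held in variables (say User::SourceFileName and User::DestinationFolder, both hypothetical) that feed the destination path, an SSIS expression along these lines may be all that's needed:

@[User::DestinationFolder] + REPLACE(@[User::SourceFileName], "_ABC_", "_XYZ_")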

View 10 Replies View Related

Integration Services :: Get FileName Fo Each File Created Via Dynamic Flat File Destination

Jul 24, 2015

I need to know how I can get the dynamically created filename from the flat file destination, for insert into a package audit table.

Scenario: I have created a package that successfully outputs dynamically named flat files { Format: C:\Test\'Comms_File_' + 'User::FileNumber' + '_' + Date + '.txt'

E.g.: Comms_File_1_20150724.txt, Comms_File_2_20150724.txt  etc} using Foreach Loop Container  :

* Enumerator Set to: “Foreach ADO Enumerator” with the ADO object source variable selected to identify how many total loop iterations there are i.e. Let’s say 4 thus 4 files to be created

*Variable Mappings : added the User::FileNumber – indicates which file number current loop iteration is i.e. 1,2,3,4

For the DataFlow task have a OLDBSource and a FlatFile Destination where Flat File ConnectionString is set up as:

@[User::Output_Path] + "Comms_File"+ @[User:: FileNumber] +"_" + replace((DT_WSTR, 10) (DT_DBDATE) GETDATE(),"-","")+ ".txt"

All this successfully creates these 4 files:

Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, Comms_File_3_20150724.txt, Comms_File_4_20150724.txt

Now the QUESTION is how do I get these filenames as I need to insert them into a DB Audittable. The audit table looks like this:

CREATE TABLE dbo.MMMAudit
(
    AuditID         INT IDENTITY(1, 1) NOT NULL,
    PackageName     VARCHAR(100) NULL,
    FileName        VARCHAR(100) NULL,
    LoadTime        DATETIME NULL,
    NumberofRecords INT NULL
)

To save the filename and how many records are in each file in our audit table, I am using an Execute SQL Task configured like this:

Execute SQL Task

Parameter mapping - mapped the user variable (RecordsInserted) and system variable (PackageName) to the insert statement as shown below

SQLStatement: INSERT INTO [dbo].[MMMAudit]
(PackageName, NumberofRecords, LoadTime)
VALUES (?, ?, GETDATE())

Again, this all works terrifically and populates the dbo.MMMAudit table as shown below, BUT I also need to insert the respective file name. How do I do that?

AuditID  PackageName  FileName  NumberOfRecords
1        MMM          NULL      12
2        MMM          NULL      23
3        MMM          NULL      14
4        MMM          NULL      1
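One way, hedged as an assumption about the package design: build the file name into its own string variable (e.g. User::CurrentFileName, hypothetical) using the same expression that drives the flat file connection string, map that variable as a third parameter in the Execute SQL Task, and extend the insert to:

INSERT INTO [dbo].[MMMAudit] (PackageName, FileName, NumberofRecords, LoadTime)
VALUES (?, ?, ?, GETDATE())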

View 2 Replies View Related

How To Create, Install And Deploy Dts File For Sqlserver 2000 And Dtsx File For 2005 ?

Apr 24, 2007

Hi,



I am looking for tutorials about how to create .dts and .dtsx files.

Thanks for your help.



Arioule.

View 1 Replies View Related






