Purge Records From A Table On A Weekly Schedule
Jul 23, 2002
Hello all,
I hope someone can help me with a big problem... I'm using Citrix Resource Management Services with a SQL 2000 database. There are 15 Citrix servers which all report to the SQL database.
The database is expanding very quickly and is becoming slower and slower.
My question is: I want to schedule a purge of old records on Friday afternoons, like this:
WEEK 1 - MON / FRI
WEEK 2 - MON / FRI (Friday purges week 1's records)
WEEK 3 - MON / FRI (Friday purges week 2's records)
etc...
Is this possible? If yes, how do I do it?
Thank you very much for any info!!
Daan
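A minimal sketch of one way to do this, assuming the RMS data lands in a table such as dbo.ServerStats with a datetime column named RecordDate (both names are placeholders for the real schema): schedule a SQL Server Agent job to run weekly on Friday afternoons and have its single T-SQL step delete everything older than seven days.

-- Hedged sketch: table and column names are assumed; adjust to the RMS schema.
-- Run this as the T-SQL step of a SQL Server Agent job scheduled weekly on Fridays.
DELETE FROM dbo.ServerStats
WHERE RecordDate < DATEADD(day, -7, GETDATE())

Because the job runs on Friday of week N, everything written during week N-1 falls outside the seven-day window and is removed, which matches the pattern described above.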
Feb 18, 2014
I have a query that generates records monthly, based on the number of months I calculate between two date fields for a given requestid. How can I use the same query to generate records weekly and bi-weekly, based on the receiveddate field that I use in the subtraction when calculating the number of months?
Also, when inserting, I have been adding a month to every record because I was generating monthly; now I would have to add one week or two weeks to the receiveddate instead.
SET NOCOUNT ON
GO
declare @num_of_times int
declare @count int
declare @frequency varchar(10)
declare @num_of_times1 int
[Code] ....
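One possible shape for this, sketched under assumptions: only the period count and the increment need to change, and both can be driven from the @frequency and @num_of_times variables declared above. The sample dates and the 'Monthly'/'BiWeekly'/'Weekly' labels are illustrative; in the real query the dates come from the request row and the SELECT would be replaced by the existing INSERT.

DECLARE @receiveddate datetime, @enddate datetime, @i int
SET @receiveddate = '20140115'   -- sample values only
SET @enddate      = '20140715'
SET @frequency    = 'Weekly'     -- 'Monthly', 'BiWeekly' or 'Weekly'

SET @num_of_times =
    CASE @frequency
        WHEN 'Monthly'  THEN DATEDIFF(month, @receiveddate, @enddate)
        WHEN 'BiWeekly' THEN DATEDIFF(day, @receiveddate, @enddate) / 14
        ELSE                 DATEDIFF(day, @receiveddate, @enddate) / 7
    END

SET @i = 1
WHILE @i <= @num_of_times
BEGIN
    -- replace this SELECT with the INSERT from the original query
    SELECT CASE @frequency
               WHEN 'Monthly'  THEN DATEADD(month, @i, @receiveddate)
               WHEN 'BiWeekly' THEN DATEADD(week, 2 * @i, @receiveddate)
               ELSE                 DATEADD(week, @i, @receiveddate)
           END AS ScheduledDate
    SET @i = @i + 1
END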
May 21, 2007
Greets all, I've got a table with batches of records. Each group of records has a batch ID as part of the PK, in the form BTCXXXX where XXXX is an auto-incremented number. So let's say I have 100 batches of 20k records per batch in the table, and the distinct batch IDs run from BTC0200 (oldest batch) through BTC0300 (newest batch).
I only want to keep the 90 most recent batches in the table at any given time.
Is it OK to just subtract 90 from the last batch number and do something like:
DECLARE @batch_id char(10)
-- derive the cutoff from the numeric part of the newest batch id
SELECT @batch_id = 'BTC' + RIGHT('0000' + CAST(MAX(CAST(RIGHT(RTRIM(BATCH_ID), 4) AS int)) - 90 AS varchar(10)), 4)
FROM ITEM_BATCH
DELETE FROM ITEM_BATCH
WHERE BATCH_ID < @batch_id
I want to cover both the case where the table has more than 90 batches and the case where it has fewer than 90. Is this a feasible approach?
May 14, 2007
Hi all, I have a table with about 67 million records that are marked for deletion.
I know that I can
DELETE from table
WHERE ToBeDeleted='t'
But this may be too big a task for the server, considering the amount of data to delete at once. And if it runs out of resources and errors, then nothing gets deleted...
Is there a way to segment or loop so I can delete around 100k records at a time?
Many thanks in advance
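A minimal sketch of a chunked delete, assuming SQL Server 2005 or later (for DELETE TOP) and the ToBeDeleted flag from the post; the table name is a placeholder. Each pass removes at most 100k rows and commits separately, so the server never has to hold all 67 million deletes in one transaction.

-- Hedged sketch: table name dbo.MyTable is a placeholder.
-- On SQL Server 2000, SET ROWCOUNT 100000 before a plain DELETE achieves the same effect.
WHILE 1 = 1
BEGIN
    DELETE TOP (100000) FROM dbo.MyTable
    WHERE ToBeDeleted = 't'

    IF @@ROWCOUNT = 0 BREAK   -- nothing left to delete
END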
Nov 4, 2015
We have a staging table into which data is loaded from files. The staging table is truncated for every load. In order to retain the data from the staging table, we are creating a staging_purge table to hold the staging data. What is the fastest way to copy data from the staging table to the purge table without impacting the load process?
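One hedged option, assuming the staging and purge tables have identical structure (names below are placeholders): a minimally logged INSERT ... SELECT with a table lock, run as the last step of the load just before the truncate.

-- Hedged sketch: minimal logging also depends on the recovery model and target table layout.
INSERT INTO dbo.staging_purge WITH (TABLOCK)
SELECT *
FROM dbo.staging;

If the staging table is partitioned, ALTER TABLE ... SWITCH can move the data as a metadata-only operation, which is effectively instantaneous.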
Jan 9, 2014
I am attempting to do a rather simple purge task on a very large table. This task will need to take place daily and delete records older than 6 months from the database. On the first pass this will delete well over 130 million rows. I thought the best way to handle this is to create a proc and call the proc from a SQL Agent Job that runs nightly. Here is an example of the script:
CREATE PROCEDURE usp_Purge_WCFLogger
AS
SET NOCOUNT ON
EXEC sp_rename 'dbo.logs', 'logs_work'
GO
SELECT * INTO dbo.Logs_Backup FROM dbo.Logs_Work WHERE TIMESTAMP < DATEADD(month, -6, GETDATE())
[Code] .....
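For the scheduling half, a hedged sketch of the Agent job setup in T-SQL (the job, step and schedule names are made up and the database name is a placeholder); the same thing can of course be done through the SQL Server Agent UI.

USE msdb;
EXEC sp_add_job         @job_name = N'Purge_WCFLogger';
EXEC sp_add_jobstep     @job_name = N'Purge_WCFLogger',
                        @step_name = N'Run purge',
                        @subsystem = N'TSQL',
                        @database_name = N'MyLoggingDb',   -- placeholder
                        @command = N'EXEC dbo.usp_Purge_WCFLogger;';
EXEC sp_add_jobschedule @job_name = N'Purge_WCFLogger',
                        @name = N'Nightly',
                        @freq_type = 4,              -- daily
                        @freq_interval = 1,
                        @active_start_time = 10000;  -- 01:00:00 (HHMMSS)
EXEC sp_add_jobserver   @job_name = N'Purge_WCFLogger', @server_name = N'(LOCAL)';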
Aug 18, 2015
What is a good way to purge data from a transaction table? Or can we delete some of the data and store it in a separate table in the data warehouse?
Nov 5, 2005
DELETING 100 million rows from a table weekly (SQL Server 2000)
Hi All,
We have a table in SQL Server 2000 which has about 250 million records, and this will be growing by 100 million every week. At any time the table should contain just 13 weeks of data; when the 14th week's data needs to be loaded, the first week's data has to be deleted. That means deleting 100 million rows every week, and since the delete takes a lot of transaction log space, the job is not successful.
Can you please help with what approaches we can take to fix this problem? Performance and the transaction log are the issues we are facing. We tried deleting in steps too, but that also takes time. What are the different ways we can address this quickly?
Please reply at the earliest.
Thanks
Harish
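On SQL Server 2000 the realistic options are batched deletes with frequent log backups, or a partitioned view with one physical table per week so an old week can be dropped with TRUNCATE. On SQL Server 2005 and later, a sliding-window partition scheme makes the weekly purge a metadata-only operation; a rough sketch, with all object names and the boundary date purely illustrative:

-- Hedged sketch, SQL Server 2005+ only: switch the oldest partition out, truncate it,
-- then merge away the now-empty boundary.
ALTER TABLE dbo.WeeklyFacts SWITCH PARTITION 1 TO dbo.WeeklyFacts_Staging;
TRUNCATE TABLE dbo.WeeklyFacts_Staging;                  -- minimally logged
ALTER PARTITION FUNCTION pf_WeeklyFacts() MERGE RANGE ('20050801');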
Mar 29, 2004
hi,
I would like to create a DTS package to retrieve records from the database. The records come from the error log table (ERROR_LOG_TB); the scheduler will run at 9 AM daily and will retrieve the records if there are any errors, and the error information will be encapsulated and sent through email.
Can I find out how to do this graphically in DTS? I am running SQL Server 2000.
Sep 13, 2006
Ok, I'm really new at this, but I am looking for a way to automatically insert new records into tables. I have one primary table with a primary key ID that is automatically generated on insert, and 3 other tables that have foreign keys pointing to that primary key. Is there a way to automatically create new records in the foreign-key tables that will carry the new ID? Would this be a job for a trigger or a stored procedure? I admit I haven't studied up on those yet; I am learning things as I need them. Thanks.
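An AFTER INSERT trigger is the usual fit. A minimal sketch, with every table and column name assumed (Parent, Child1/2/3, ID, ParentID): the inserted pseudo-table carries the freshly generated identity values, so the trigger can fan them out to the child tables in one pass.

CREATE TRIGGER trg_Parent_AfterInsert ON dbo.Parent
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- one child row per new parent row, keyed by the new identity value
    INSERT INTO dbo.Child1 (ParentID) SELECT ID FROM inserted;
    INSERT INTO dbo.Child2 (ParentID) SELECT ID FROM inserted;
    INSERT INTO dbo.Child3 (ParentID) SELECT ID FROM inserted;
END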
Aug 11, 2015
Table1 contains the fields Groupid, UserName, Category, Dimension.
Table2 contains the fields Group, Name, Category, Dimension (Group and Name are not in Table1).
So basically I need to read the records in Table1 using Groupid, and each time there is a Groupid, select the records from Table2 where Table2.Category in (Select Category from Table1)
and Table2.Dimension in (Select Dimension from Table1).
In Table1 there might be 10 Groupid records, all of which are different.
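A sketch of one way to express that per Groupid, assuming the Category/Dimension pairs from Table1 should drive the match (if they should instead be treated as two independent IN lists, two EXISTS subqueries correlated on Groupid work too):

SELECT t1.Groupid, t2.[Group], t2.Name, t2.Category, t2.Dimension
FROM (SELECT DISTINCT Groupid, Category, Dimension FROM Table1) AS t1
JOIN Table2 AS t2
  ON t2.Category  = t1.Category
 AND t2.Dimension = t1.Dimension
ORDER BY t1.Groupid;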
Oct 21, 2015
I am trying to write a query that will retrieve all students of a particular class and also any rows in HomeworkLogLine if they exist (but return null if there is no row). I thought this should be a relatively simple LEFT join but I've tried every possible combination of joins but it's not working.
SELECT
Student.StudentSurname + ', ' + Student.StudentForename AS Fullname,
HomeworkLogLine.HomeworkLogLineTimestamp,
HomeworkLog.HomeworkLogDescription,
ROW_NUMBER() OVER (PARTITION BY HomeworkLogLine.HomeworkLogLineStudentID ORDER BY
[Code] ...
It's only returning two rows (the students who have a row in the HomeworkLogLine table).
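The usual cause is a filter on the right-hand (homework) tables sitting in the WHERE clause, which silently turns the LEFT JOIN back into an inner join. A hedged sketch of the shape that keeps every student, with the join keys and the class filter column assumed since they aren't shown in the excerpt:

SELECT s.StudentSurname + ', ' + s.StudentForename AS Fullname,
       hll.HomeworkLogLineTimestamp,
       hl.HomeworkLogDescription
FROM Student AS s
LEFT JOIN HomeworkLogLine AS hll
       ON hll.HomeworkLogLineStudentID = s.StudentID              -- key name assumed
LEFT JOIN HomeworkLog AS hl
       ON hl.HomeworkLogID = hll.HomeworkLogLineHomeworkLogID     -- key name assumed
WHERE s.ClassID = 7;   -- class filter (column assumed); filter only on the preserved (left) table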
Oct 10, 2006
Where can I find an English translation of the values in the fields RecurrenceType, State, Daysofweek, and Type in the Schedule table of the ReportServer database?
For example, what does it mean when RecurrenceType is equal to 2, or equal to 4?
Thanks
Jun 9, 2008
I have an application that has a dropdown in which I want a user's most popular choices (from history) to be on top. This is easily done via a query, however the query is too slow, even after being optimized. I'd like to create a table that contains each user's most popular selections, and then just update it nightly. How can I schedule some SQL code to run nightly?
Dec 17, 2014
I have a table right now that records when an event was done, as a datetime (e.g. event 1 done Jan 1, 2014; event 2 done Jan 2, 2014).
What is a feasible way to make a second table to join on this datetime which provides the type of worker (e.g. part time, full time, volunteer, etc.) without making a table with every day, hour and minute as a separate row?
For example, a test was done on Jan 1, 2014 at 1pm, in table A. I was thinking my second table would list every date and time and fill in what type of worker was on at that minute. Then I would left outer join with my first table, so the end result would be: Event A was done Jan 1, 1pm, therefore it was a Volunteer worker; Event B was done Jan 1, 2pm, therefore it was a Full time worker.
But I realize that it is not feasible to write a table with every date/time available to accommodate all the different types of shifts.
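A hedged sketch of the range-based alternative: store each shift as a start/end pair and join on the range rather than on an exact minute. All names below are made up for illustration.

CREATE TABLE dbo.WorkerShift (
    WorkerType varchar(20) NOT NULL,   -- 'Volunteer', 'Full time', 'Part time', ...
    ShiftStart datetime    NOT NULL,
    ShiftEnd   datetime    NOT NULL
);

SELECT e.EventName, e.EventTime, ws.WorkerType
FROM dbo.EventLog AS e                -- your table A; names assumed
LEFT JOIN dbo.WorkerShift AS ws
       ON e.EventTime >= ws.ShiftStart
      AND e.EventTime <  ws.ShiftEnd;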
Apr 4, 2008
hi friends,
I have SSIS packages and I want to execute them based on table values...
table ETLJOBS
PackageID (1, 2, 3)
Name
FrequencyOfRun (30, 60, 90 minutes)
StartTime
EndTime
.....
....
How can I use the table values to execute the packages?
My package frequency can change in the future, so I want to make my code generic.
thx for reading my question.
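One hedged approach (a sketch, not the only design): keep a last-run timestamp per package and let a driver query decide which packages are due; a master SSIS package with a Foreach Loop and an Execute Package task, or a SQL Agent job, can then run whatever the query returns. LastRunTime is an assumed extra column, since only StartTime/EndTime are listed above.

-- Hedged sketch: returns the packages whose frequency interval has elapsed.
SELECT PackageID, Name
FROM dbo.ETLJOBS
WHERE DATEADD(minute, FrequencyOfRun, ISNULL(LastRunTime, '19000101')) <= GETDATE();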
Aug 29, 2007
Hi Everyone.
I have a big database, and each table has an Active field (Y/N). I want to delete all records where Active is N. How can I do that without deleting from each table individually and checking the detail tables beforehand? Our database has complex relations.
Any solution is appreciated.
Thanks a lot.
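There's no single statement that does this safely across complex foreign keys; rows have to go child-first (or the FKs need ON DELETE CASCADE). A minimal sketch of the child-first pattern for one parent/child pair, with made-up table names, which would be repeated or scripted out per relationship:

-- Hedged sketch: remove detail rows whose parent is inactive, then the parents themselves.
DELETE d
FROM dbo.OrderDetails AS d
JOIN dbo.Orders AS o ON o.OrderID = d.OrderID
WHERE o.Active = 'N';

DELETE FROM dbo.Orders
WHERE Active = 'N';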
Jul 20, 2005
I can't get my head around this: I want to select all IDs from table A that do not have a related record in table B according to some condition.
Table A contains, say, Parents and table B contains Children. I want to select all Parents that have no children called "Sally" (this is a noddy example, reminds me of being at Uni again :) ).
Any ideas? Thanks
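A hedged sketch with NOT EXISTS, which is the usual shape for "has no related row matching a condition"; table and column names are assumed.

SELECT p.ParentID
FROM Parents AS p
WHERE NOT EXISTS (
    SELECT 1
    FROM Children AS c
    WHERE c.ParentID = p.ParentID
      AND c.Name = 'Sally'
);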
Mar 18, 2014
I have a situation where deleting old records is blocking updates to the latest records on a highly transactional table, and I am getting timeout errors from the application.
In detail: I have one table called Tran_table1 in an OLTP database. Tran_table1 is a highly transactional table; it receives inserts/updates continuously.
While archiving 2-year-old records from Tran_table1 into Tran_table1_archive in batches (using the DELETE ... OUTPUT INTO clause), any UPDATEs on Tran_table1 get blocked, and the result is timeout errors in the application.
Are there any SQL Server hints to avoid the blocking?
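A hedged sketch of the usual mitigations combined: keep each archive batch small, ask for row locks, use READPAST so the delete skips rows currently locked by updates (they get picked up on a later pass), and pause between batches. The date column name and batch size are assumptions.

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.Tran_table1 WITH (ROWLOCK, READPAST)
    OUTPUT deleted.* INTO dbo.Tran_table1_archive
    WHERE CreatedDate < DATEADD(year, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;
    WAITFOR DELAY '00:00:02';   -- let waiting UPDATEs through between batches
END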
Dec 3, 2014
I have a table with about half a million records, each representing a patient in my county.
Each record has a field (RRank) which basically sorts the patients as to how "unwell" they are according to a previously-applied algorithm. The most unwell patient has an RRank of 1, the next-most unwell has RRank=2 etc.
I have just deleted several hundred records (which relate to patients now deceased) from the table, thereby leaving gaps in the RRank sequence. I want to renumber the remaining recs to get rid of the gaps.
I can see what I want to accomplish by using ROW_NUMBER, thus:
SELECT ROW_NUMBER() Over (ORDER BY RRank) as RecNumber, RRank
FROM RPL
ORDER BY RRank
I see the numbers in the RecNumber column falling behind the RRank as I scan down the results
My question is: how do I convert this into an UPDATE statement? I had hoped that I could do something like:
UPDATE RISC_PatientList_TEMP
SET RRank = ROW_NUMBER() Over (ORDER BY RRank);
but the system informs me that windowed functions can only appear in the SELECT or ORDER BY clauses, so they can't be used directly in the SET of an UPDATE.
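One standard workaround, assuming RISC_PatientList_TEMP is the table being renumbered: wrap the ROW_NUMBER in an updatable CTE and update through it.

;WITH Renumbered AS (
    SELECT RRank,
           ROW_NUMBER() OVER (ORDER BY RRank) AS RecNumber
    FROM RISC_PatientList_TEMP
)
UPDATE Renumbered
SET RRank = RecNumber;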
Oct 26, 2003
All, I have records with two columns, a date column and a time column.
I am trying to write a SQL statement to extract the rows that fall within a range of time.
Eg current date/time + 10 hours and
current date/time - 10 hours.
Kindly advise how can I do that.
Thanks.
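A hedged sketch, assuming both columns are stored as datetime (one holding only the date part, the other only the time part) and that the table and column names below are placeholders: build a combined datetime and compare it against a ±10-hour window.

SELECT *
FROM dbo.MyTable
WHERE CAST(CONVERT(varchar(10), DateCol, 112) + ' '
         + CONVERT(varchar(8),  TimeCol, 108) AS datetime)
      BETWEEN DATEADD(hour, -10, GETDATE())
          AND DATEADD(hour,  10, GETDATE());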
Jul 23, 2004
I'm assuming that the .ldf file houses all the transaction logs; well, mine is getting quite large (about 6 times the size of my .mdf) and I'm wondering if there is a way to purge the log back to a specific date. Can anyone give me some direction? Thanks
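For what it's worth, the log can't be trimmed "back to a date"; the usual approach is to back the log up (or switch to the SIMPLE recovery model if point-in-time restores aren't needed) and then shrink the file. A hedged sketch, with the database name, logical log file name, and backup path all assumed:

BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn';    -- clears reusable log space
DBCC SHRINKFILE (MyDb_log, 1024);                       -- shrink to ~1 GB; logical name assumed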
Feb 8, 2005
Is there any way to purge/truncate a database where the database is full or does not have free space?
Dec 13, 2007
A client of mine just asked about archiving and purging their data (based on different time constraints). For instance, certain data would get archived after 3 years over to a different database, other data purged altogether. My thought off the bat is stored procedures run on a job schedule.
Any other thoughts or ideas?
Aug 9, 2006
Hello,
I would like to use the built in logging feature to log to a text file.
Is there a way to purge the log periodically (only keep entries for the last 30 days, etc.)?
Thanks,
Michael
Jun 9, 2006
Greetings,
I have a SQL Server 2005 database which is populated with test data. I need to copy this database to a new instance and then purge the copied database of its contents for the next round of testing.
I know there is a Copy Database Wizard within SQL Server Management Studio; I'm assuming I need to run this first, then purge the existing data within the copy. The copy operation looks pretty straightforward, but I haven't a clue how to perform the purge.
Can someone help?
Regards,
Loopsludge
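A hedged sketch for the purge half, using the undocumented but widely used sp_MSforeachtable helper to empty every user table in the copied database; constraints are disabled first so foreign-key ordering doesn't matter, then re-enabled with a full check. The database name is a placeholder.

USE CopiedTestDb;   -- database name assumed
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';
EXEC sp_MSforeachtable 'DELETE FROM ?';
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';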
Feb 22, 1999
Suppose I am appending to my transaction log dump device every half hour, thus adding 48 log dumps per day. How can I purge my transaction dump device to keep only the last week's worth of these logs? I do not want to issue an INIT in the command, because this will wipe it completely. I noticed the EXPIREDATE and RETAINDAYS parameters for the DUMP command. Can I use these to selectively purge the backup device, or will they also allow the device to be wiped clean?
Thanks in advance,
Ed Molinari
Emerald Solutions
Apr 25, 2002
Anyone have any hints on getting the DELBKUP option to work? I have a maintenance job that backs up the database, and it is also set to delete backups older than 1 day. This doesn't seem to be working.
Any thoughts?
Thanking you in advance.
Oct 5, 2004
Hi
We are running SQL Server 2000 and making full database backups using a maintenance plan. The transaction logs are being backed up via a separate plan.
However, the transaction log backups aren't purging but the database ones are. They are both run under the same id and both write to the same directory. Old text reports also get purged fine. I've got 'Delete older than 1 week' ticked for everything.
Can anyone think of anything else to check ?
Thanks
Dec 14, 2006
Hi,
Not sure if this question makes sense, but is it necessary to purge old data in the msdb tables used by the database mirroring monitor job?
I'm just wondering if an insert into the data table every minute of the day would still be needed a month from now. I'm thinking this data would be useful for the purpose of "alerts" and to have access to its recent history, but other than that, is it recommended (or necessary)? Would these records keep accumulating until manually purged?
TIA.
Jan 25, 2007
Hello,
I have a database that I am setting up in SQL Server 2005. Initially, I am doing very large imports of data. Every time I run an import, I have had to increase the size of my transaction logs, and now they are approaching 2 GB. Should these be purging themselves? I have to keep increasing the max size of the log so that I can get my data in. While this works for now, it is not a long-term solution, because I can see the log size growing quite large, and the amount of space on the server obviously isn't infinite. Is there a setting that I can change so they will automatically purge? If not, how do I purge this information myself?
Thanks so much!
Christine
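A hedged note with a small sketch: in the FULL recovery model the log is only reused after a log backup, so it will keep growing during big imports unless the log is backed up regularly or the database is switched to SIMPLE (or BULK_LOGGED during the load) when point-in-time restores aren't required. The database name and path are placeholders.

-- Option 1: if point-in-time restore is not needed, SIMPLE recovery lets the log self-truncate.
ALTER DATABASE MyImportDb SET RECOVERY SIMPLE;

-- Option 2: stay in FULL recovery and back the log up regularly so the space is reused.
BACKUP LOG MyImportDb TO DISK = 'D:\Backups\MyImportDb_log.trn';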
Jun 7, 2007
The only way the job succeeds is if I select the 'append rows' option, which copies all rows, even the ones that are already there. I tried the 'drop and recreate the table' option, but it just doesn't run that way. Help please...
Jul 11, 2014
I have some very old versions of duplicate stored procedures on my databases (which I'd like to clean up). I know there is no "safe" way to do this using DMVs alone, so I am planning to combine them with a trace. But I would like to get others' opinions about that.
Here's the DMV I am planning to use:
SELECT
CASE WHEN database_id = 32767 then 'Resource' ELSE DB_NAME(database_id)END AS DBName
,OBJECT_SCHEMA_NAME(object_id,database_id) AS [SCHEMA_NAME]
,OBJECT_NAME(object_id,database_id)AS [OBJECT_NAME]
,cached_time
,last_execution_time
,execution_count
[Code] ....
I will save that to a local table and run it every 5 minutes maybe? Or at an interval equal to or lower than the PLE?
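A hedged sketch of the collection step, assuming a snapshot table you create yourself (dbo.ProcUsageSnapshot is a made-up name): append the DMV output on whatever schedule you choose, then look for procedures that never show up across the whole collection window.

INSERT INTO dbo.ProcUsageSnapshot
        (DBName, SchemaName, ObjectName, last_execution_time, execution_count, CollectedAt)
SELECT  DB_NAME(database_id),
        OBJECT_SCHEMA_NAME(object_id, database_id),
        OBJECT_NAME(object_id, database_id),
        last_execution_time,
        execution_count,
        GETDATE()
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID();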