Hi all, I have a table with about 67 million records that are marked for deletion.
I know that I can
DELETE from table
WHERE ToBeDeleted='t'
But this may be too big a task for the server considering the amount of data to delete at once. And if it runs out of resources and errors out, then nothing gets deleted....
Is there a way to segment or loop so I can delete, say, 100k records at a time?
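If it helps, a minimal sketch of a chunked delete (assuming the table is called MyTable and you're on SQL Server 2005 or later, where DELETE TOP is available):

SET NOCOUNT ON

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    -- Each iteration is its own small transaction, so a failure
    -- only loses the current chunk, not the whole purge
    DELETE TOP (100000) FROM MyTable
    WHERE ToBeDeleted = 't'

    SET @rows = @@ROWCOUNT
END

On SQL Server 2000 you can get the same effect with SET ROWCOUNT 100000 before the DELETE (and SET ROWCOUNT 0 afterwards).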
Greets all, I've got a table with batches of records. Each group of records has a batch ID as part of the PK, in the form BTCXXXX, where XXXX is an auto-incremented number. So let's say I have 100 batches of 20k records per batch in the table, and the distinct batch IDs are BTC0200 (oldest batch) through BTC0300 (newest batch). I only want to keep the 90 most recent batches in the table at any given time. Is it OK to just subtract 90 from the last batch ID and do something like:
DECLARE @batch_id char(10)
SET @batch_id = 'BTC' + RIGHT('0000' + CAST(@batch_num - 90 AS varchar(4)), 4)
DELETE FROM ITEM_BATCH WHERE BATCH_ID < @batch_id
I want to cover both the case where the table has more than 90 batches and the case where it has fewer than 90. Is this a feasible approach?
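A fuller sketch of that idea (assuming the numeric part is always four digits, zero-padded, and never rolls over):

DECLARE @max_num int, @cutoff char(10)

-- Highest batch number currently in the table
SELECT @max_num = MAX(CAST(SUBSTRING(BATCH_ID, 4, 4) AS int))
FROM ITEM_BATCH

IF @max_num - 89 > 0
BEGIN
    -- Keep batches (@max_num - 89) through @max_num: exactly the 90 newest
    SET @cutoff = 'BTC' + RIGHT('0000' + CAST(@max_num - 89 AS varchar(4)), 4)
    DELETE FROM ITEM_BATCH WHERE BATCH_ID < @cutoff
END

Note that subtracting 89 (not 90) keeps exactly 90 batches. If fewer than 90 batches exist, the cutoff falls below the oldest BATCH_ID and the DELETE simply removes nothing, so both cases are covered.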
We have a staging table into which data is dumped from files. The staging table is truncated for every load. In order to retain data from the staging table, we are creating a staging_purge table which holds the staging data. What is the fastest way to copy data from the staging table to the purge table without impacting the load process?
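One low-impact option (a sketch; assumes the two tables have identical columns): make the copy a step of the load job itself, after the previous load finishes and just before the TRUNCATE:

-- TABLOCK on the target can make this minimally logged
-- (SQL Server 2008+, SIMPLE or BULK_LOGGED recovery)
INSERT INTO staging_purge WITH (TABLOCK)
SELECT *
FROM staging

Running it inside the load sequence means it never competes with the load for the staging table.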
I hope someone can help me with a big problem... I'm using Citrix Resource Management Services with a SQL 2000 database. There are 15 Citrix servers which are all reporting to the SQL database.
The database is expanding very quickly and is becoming slower and slower.
My question is: how can I schedule a purge of old records for a Friday afternoon?
I am attempting to do a rather simple purge task on a very large table. This task will need to take place daily and delete records older than 6 months from the database. On the first pass this will delete well over 130 million rows. I thought the best way to handle this is to create a proc and call the proc from a SQL Agent Job that runs nightly. Here is an example of the script:
CREATE PROCEDURE usp_Purge_WCFLogger AS
SET NOCOUNT ON
EXEC sp_rename 'dbo.logs', 'logs_work'
GO

SELECT *
INTO dbo.Logs_Backup
FROM dbo.Logs_Work
WHERE TIMESTAMP < DATEADD(month, -6, GETDATE())
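For the nightly scheduling piece, a sketch of building the SQL Agent job in T-SQL (the job and database names here are just examples):

USE msdb
GO
EXEC dbo.sp_add_job @job_name = N'Purge WCF logs'
EXEC dbo.sp_add_jobstep
    @job_name = N'Purge WCF logs',
    @step_name = N'Run purge proc',
    @subsystem = N'TSQL',
    @database_name = N'MyAppDB',   -- assumption: your database name
    @command = N'EXEC dbo.usp_Purge_WCFLogger'
EXEC dbo.sp_add_jobschedule
    @job_name = N'Purge WCF logs',
    @name = N'Nightly at 2am',
    @freq_type = 4,                -- daily
    @freq_interval = 1,
    @active_start_time = 20000     -- 02:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Purge WCF logs', @server_name = N'(local)'

One thing to watch in the proc as written: the GO after sp_rename ends the batch, so the procedure body stops there and the SELECT INTO runs only once, at CREATE time, rather than every night.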
I have a big database, and each table has an Active field (N/Y). I want to delete all records where Active is N. How can I do that without writing a DELETE for each table and working out the detail tables beforehand? Our database has complex relations. Any solution, please?
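If you're on SQL Server 2005 or later, one starting point (a sketch; it only generates the statements, so you can reorder them to respect the FK dependencies before running anything):

-- Emit a DELETE for every table that has an Active column
SELECT 'DELETE FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
       + ' WHERE Active = ''N'';'
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE EXISTS (SELECT 1 FROM sys.columns c
              WHERE c.object_id = t.object_id AND c.name = 'Active')

Alternatively, if the foreign keys were defined with ON DELETE CASCADE, deleting from the parent tables would remove the detail rows automatically.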
All, I have records with two columns, a date column and a time column.
I am trying to write a SQL statement to extract the rows that fall within a range of time, e.g. between current date/time - 10 hours and current date/time + 10 hours.
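A sketch, assuming both columns are datetime (DateCol holding the date at midnight, TimeCol holding just the time portion), in which case the two can simply be added:

SELECT *
FROM MyTable   -- assumption: your table name
WHERE DateCol + TimeCol BETWEEN DATEADD(hour, -10, GETDATE())
                            AND DATEADD(hour, 10, GETDATE())

If the columns are stored as strings instead, convert them to datetime before comparing.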
I'm assuming that the .ldf file houses all the transaction logs. Well, mine is getting quite large (like 6 times the size of my .mdf), and I'm wondering if there is a way to purge logs back to a specific date. Can anyone give me some direction? Thanks
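The log can't be trimmed back to a date, but backing it up frees the inactive portion for reuse, after which the file itself can be shrunk. A sketch (the database and logical file names are assumptions; the logical name is visible via sp_helpfile):

BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.bak'
GO
-- Shrink the log file toward a 500 MB target
DBCC SHRINKFILE (MyDatabase_log, 500)

If the log keeps regrowing, scheduling regular log backups (or switching to SIMPLE recovery, if point-in-time restore isn't needed) keeps it in check.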
A client of mine just asked about archiving and purging their data (based on different time constraints). For instance, certain data would get archived after 3 years over to a different database, other data purged altogether. My thoughts off the bat are stored procedures run on a job schedule.
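A rough shape for the archive step, run from a scheduled job (a sketch; ArchiveDB, dbo.Orders, and OrderDate are placeholders for the client's names):

BEGIN TRAN

-- Copy rows older than 3 years into the archive database
INSERT INTO ArchiveDB.dbo.Orders
SELECT *
FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -3, GETDATE())

-- Then remove them from the live table
DELETE FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -3, GETDATE())

COMMIT

A pure purge is the same thing minus the INSERT, usually done in batches if the row counts are large.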
Greetings, I have a SQL Server 2005 database which is populated with test data. I need to copy this database to a new instance and then purge the copied database of its contents for the next round of testing. I know there exists a Copy Database Wizard within SQL Server Management Studio; I'm assuming I need to perform this first, then purge the existing data within the copy. The Copy operation looks pretty straightforward, but I haven't a clue how to perform the purge. Can someone help? Regards, Loopsludge
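One way to empty every table in the copy (a sketch using the undocumented but long-standing sp_MSforeachtable; try it on a disposable copy first):

USE CopiedTestDB   -- assumption: the name of the copied database
GO
-- Disable FK checks, empty every table, then re-enable the checks
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
EXEC sp_MSforeachtable 'DELETE FROM ?'
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'

TRUNCATE TABLE would be faster, but it fails on tables referenced by foreign keys, which is why DELETE is used here.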
Suppose I am appending to my transaction log dump device every half hour, thus adding 48 log dumps per day. How can I purge my transaction dump device to keep only the last week's worth of these logs? I do not want to issue an INIT in the command, because this will wipe it completely. I noticed an EXPIREDATE and RETAINDAYS parameter for the DUMP command. Can I use these to selectively purge the backup device, or will these allow the device to be wiped clean as a whole?
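As far as I know, EXPIREDATE and RETAINDAYS don't selectively remove older sets from a device; they only stop a later backup WITH INIT from overwriting the device until every set on it has expired (and WITH SKIP bypasses even that check). A sketch of stamping each append with a retention period (BACKUP is the newer spelling of DUMP):

BACKUP LOG MyDatabase
TO MyDumpDevice
WITH NOINIT, RETAINDAYS = 7

To keep a rolling week, the usual approach is to rotate devices (say, one per weekday, each re-initialized when its day comes around again) rather than purging inside a single device.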
Anyone have any hints on getting the DELBKUP option to work? I have a maintenance job that backs up the database, also set to delete backups older than 1 day. This doesn't seem to be working.
We are running SQL Server 2000 and making full database backups using a maintenance plan. The transaction logs are being backed up via a separate plan.
However, the transaction log backups aren't purging, but the database ones are. They are both run under the same ID and both write to the same directory. Old text reports also get purged fine. I've got 'Delete older than 1 week' ticked for everything.
Can anyone think of anything else to check? Thanks
Not sure if this question makes sense, but is it necessary to purge old data in msdb tables used by the db mirroring monitor job?
I'm just wondering if an insert into the data table every minute of the day would still be needed a month from now. I'm thinking this data would be useful for the purpose of "alerts" and to have access to its recent history, but other than that, is it recommended (or necessary)? Would these records keep accumulating until manually purged?
I have a database that I am setting up in SQL Server 2005. Initially, I am doing very large imports of data. Every time I run an import, I am having to increase the size of my transaction logs, and now they are approaching 2 GB. Should these be purging themselves? I have to keep increasing the max size of the log so that I can get my data in. While this will work for now, it is not a long-term solution, because I can see the log growing quite large, and the amount of space on the server obviously isn't infinite. Is there a setting I can change so they will automatically purge? If not, how do I purge this information myself?
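The log never purges itself while the database is in FULL recovery and no log backups are taken. For heavy one-off imports, a sketch of the two usual options (MyImportDB is a placeholder):

-- Option 1: no point-in-time recovery needed during the imports
ALTER DATABASE MyImportDB SET RECOVERY SIMPLE
-- ... run the imports; log space is reused after each checkpoint ...

-- Option 2: stay in FULL recovery, but back the log up regularly
BACKUP LOG MyImportDB TO DISK = N'D:\Backups\MyImportDB_log.bak'

-- Either way, an already-oversized log file can then be shrunk
DBCC SHRINKFILE (MyImportDB_log, 500)

BULK_LOGGED recovery is another middle ground if the imports use bulk operations.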
I have very old versions of duplicate stored procedures in my databases. I know there is no "safe" way to identify unused procedures using DMVs alone, so I am planning to combine that with a trace. But I would like to get others' opinions on that.
Here's the DMV I am planning to use:
SELECT CASE WHEN database_id = 32767 THEN 'Resource'
            ELSE DB_NAME(database_id)
       END AS DBName,
       OBJECT_SCHEMA_NAME(object_id, database_id) AS [SCHEMA_NAME],
       OBJECT_NAME(object_id, database_id) AS [OBJECT_NAME],
       cached_time,
       last_execution_time,
       execution_count
[Code] ....
I will save that into a local table and run it every 5 minutes, maybe? Or at an interval equal to or lower than the PLE?
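A sketch of the snapshot step (assuming the query above is completed with FROM sys.dm_exec_procedure_stats, where those columns live; the history table name is made up):

CREATE TABLE dbo.ProcUsageHistory (
    capture_time datetime NOT NULL DEFAULT GETDATE(),
    DBName sysname NULL,
    SchemaName sysname NULL,
    ObjectName sysname NULL,
    cached_time datetime NULL,
    last_execution_time datetime NULL,
    execution_count bigint NULL
)
GO
-- Run on a schedule (e.g. a SQL Agent job every 5 minutes)
INSERT INTO dbo.ProcUsageHistory
    (DBName, SchemaName, ObjectName, cached_time, last_execution_time, execution_count)
SELECT CASE WHEN database_id = 32767 THEN 'Resource' ELSE DB_NAME(database_id) END,
       OBJECT_SCHEMA_NAME(object_id, database_id),
       OBJECT_NAME(object_id, database_id),
       cached_time, last_execution_time, execution_count
FROM sys.dm_exec_procedure_stats

Sampling no slower than your PLE makes sense, since rows disappear from that DMV as soon as their plans leave the cache.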
Trying to create a new job from SQL Server 2005 Management Studio. The nifty wizard under Management provides options to create backups and clean history, but I can't seem to find out how to create a job to purge old backups.
Looking at the command line, the SQL Server Agent job shows:
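For what it's worth, the Maintenance Cleanup Task in 2005 ends up calling the undocumented xp_delete_file; a sketch of what such a job step looks like (the path and cutoff date are placeholders):

-- Delete .bak files under the folder older than the cutoff date
EXECUTE master.dbo.xp_delete_file
    0,                          -- 0 = backup files, 1 = report files
    N'D:\Backups\',
    N'bak',
    N'2007-01-01T00:00:00',
    0                           -- 1 would include first-level subfolders

Being a backup-aware routine, it only deletes files SQL Server recognizes as backups, so stray files in the folder are left alone.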
I've got a large MS SQL Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database, we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also a publisher in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed about a 3-5 hour outage window per week.
So far I've tested several methods of deleting following all best practices (batch deletes, using indexes in the DELETE's WHERE clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on an 8 GB RAM, 4 processor machine that is not currently used or replicated.
I'm at a loss for a way to pare down the data with a delete, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.
Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.
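For the replica-and-rename option mentioned above, a sketch (table and column names are placeholders; replication would need to be stopped and re-synced around the swap):

DECLARE @PurgeCutoff datetime
SET @PurgeCutoff = DATEADD(month, -6, GETDATE())   -- assumption: whatever defines a keeper

-- Copy only the ~40 million keeper rows into a new table
SELECT *
INTO dbo.BigTable_New
FROM dbo.BigTable
WHERE CreateDate >= @PurgeCutoff

-- Rebuild indexes/constraints on dbo.BigTable_New here, then swap
EXEC sp_rename 'dbo.BigTable', 'BigTable_Old'
EXEC sp_rename 'dbo.BigTable_New', 'BigTable'

-- Once verified: DROP TABLE dbo.BigTable_Old

Copying 40 million rows and building indexes once is usually far cheaper than logging 140 million deletes, which is why this approach tends to fit a 3-5 hour window better.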
I am using SQL Server 2005 and trying to create a linked server to Oracle 10. I used the commands below:

EXEC sp_addlinkedserver @server = 'test1', @srvproduct = 'Oracle',
     @provider = 'MSDAORA', @datasrc = 'testsource'
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'test1', @useself = 'false',
     @rmtuser = 'sp', @rmtpassword = 'sp'
When I execute SELECT * FROM test1...COUNTRY I get the error: "The OLE DB provider "MSDAORA" for linked server "...." does not contain the table "COUNTRY". The table either does not exist or the current user does not have permissions on that table." The 'sp' user I am connecting as is the owner of the table. What could be the problem? Thanks a lot.
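One thing worth checking (a sketch): Oracle folds unquoted identifiers to uppercase, and the MSDAORA provider is case-sensitive about them, so try the four-part name with the schema spelled in uppercase:

SELECT * FROM test1..SP.COUNTRY

If the table was created with quoted lowercase identifiers on the Oracle side, the name would have to match that exact case instead.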
I have created a table called Table, with name as varchar and id as int. I started inserting rows like: insert into Table values ('arun', 20). Yes, that row inserts fine. Now I have got the values "arun's", 50. insert into Table values('arun's', 20) gives me an error in SQL Server instead of inserting the row. How would you solve this problem?
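The quote inside the string closes the literal early. In T-SQL you escape a single quote inside a string by doubling it ([Table] is bracketed here only because Table is an awkward name):

INSERT INTO [Table] VALUES ('arun''s', 50)

If the values come from client code, passing them as parameters avoids the escaping problem (and SQL injection) entirely.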
I have a table called status; one field in that table is currentstatus. I want to move the rows whose currentstatus is "ticket closed" into another table called repository, which has the same structure as the status table. I can do it programmatically, but is there any way for the system to check every 3 months and do this action (moving rows to the repository table) automatically?
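The move itself can be one statement on SQL Server 2005 and later, which a SQL Agent job scheduled quarterly could run (a sketch; assumes repository really has the identical column layout and no conflicting triggers):

-- Delete closed tickets and route the removed rows into repository
DELETE FROM status
OUTPUT DELETED.* INTO repository
WHERE currentstatus = 'ticket closed'

On earlier versions, an INSERT ... SELECT followed by a DELETE inside one transaction does the same job.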
I'm inserting from TempAccrual into VacationAccrual. It works nicely; however, if I run this script again it will insert the same values again into VacationAccrual. How do I block that? If there is a small change in one of the columns in TempAccrual, then allow the insert. Here is my query:
INSERT INTO vacationaccrual (empno, accrued_vacation, accrued_sick_effective_date, accrued_sick, import_date)
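One way to block the duplicates (a sketch, assuming empno identifies the employee and the remaining columns are what might change):

INSERT INTO vacationaccrual
    (empno, accrued_vacation, accrued_sick_effective_date, accrued_sick, import_date)
SELECT t.empno, t.accrued_vacation, t.accrued_sick_effective_date,
       t.accrued_sick, t.import_date
FROM TempAccrual t
WHERE NOT EXISTS (SELECT 1
                  FROM vacationaccrual v
                  WHERE v.empno = t.empno
                    AND v.accrued_vacation = t.accrued_vacation
                    AND v.accrued_sick_effective_date = t.accrued_sick_effective_date
                    AND v.accrued_sick = t.accrued_sick)

Rows matching on every compared column are skipped; new or changed rows still insert.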
For reasons that are not relevant (though I explain them below *), I want, for all my users whatever their privilege level, an SP which creates and inserts into a temporary table, and then another SP which reads and drops the same temporary table.

My users are not able to create dbo tables (e.g. dbo.tblTest), but are permitted to create tables under their own user (e.g. MyUser.tblTest). I have found that I can achieve my aim by using code like this:

SET @SQL = 'CREATE TABLE ' + @MyUserName + '.' + 'tblTest(tstID DATETIME)'
EXEC (@SQL)
SET @SQL = 'INSERT INTO ' + @MyUserName + '.' + 'tblTest(tstID) VALUES(GETDATE())'
EXEC (@SQL)

This becomes exceptionally cumbersome for the complex INSERT & SELECT code. I'm looking for a simpler way. Simplified down, I am looking for something like this:

CREATE PROCEDURE dbo.TestInsert AS
CREATE TABLE tblTest(tstID DATETIME)
INSERT INTO tblTest(tstID) VALUES(GETDATE())
GO

CREATE PROCEDURE dbo.TestSelect AS
SELECT * FROM tblTest
DROP TABLE tblTest

In the above example, if the SPs are owned by dbo (as above), CREATE TABLE & DROP TABLE use MyUser.tblTest while INSERT & SELECT use dbo.tblTest. If the SPs are owned by the user (e.g. MyUser.TestInsert), it works correctly (MyUser.tblTest is used throughout), but I would have to have a pair of SPs for each user.

* I have an MS Access ADP front end linked to a SQL Server database. For reports with complex datasets, it times out. Therefore it suits my purposes to create a temporary table first and then to open the report based on that temporary table.
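One possible simplification (a sketch): a #temp table created on a connection is visible inside any procedure that connection executes, whoever owns the proc, so a single shared pair of dbo SPs can serve every user:

CREATE PROCEDURE dbo.TestInsert AS
INSERT INTO #tblTest(tstID) VALUES (GETDATE())
GO
CREATE PROCEDURE dbo.TestSelect AS
SELECT * FROM #tblTest
GO

-- Each caller, on a single connection:
CREATE TABLE #tblTest (tstID DATETIME)
EXEC dbo.TestInsert
EXEC dbo.TestSelect
DROP TABLE #tblTest

The catch is that #temp tables are per-session, so the CREATE, the two EXECs, and the report's SELECT would all have to happen on the same connection.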
The following dbo.Tables of Northwind.mdf in my .SQLEXPRESS (SQL Server Management Studio Express) are missing: dbo.Categories dbo.CustomerCustomerDemo dbo.CustomerDemographics dbo.Customers dbo.Employees dbo.EmployeeTerritories dbo.Order Details dbo.Orders dbo.Products dbo.Regions dbo.Shippers dbo.Suppliers dbo.Territories.
But I have these dbo.Tables in a different database, "xyzDatabase". How can I copy each of these dbo.Tables into the corresponding blank dbo.Table of the Northwind database?
I right-clicked on dbo.Categories and saw the following menu: New Table..., Modify, Open Table, and Script Table as |> CREATE To / DROP To / SELECT To / INSERT To / UPDATE To / DELETE To |> New Query Editor Window / File... / Clipboard. From this observation, I think it is possible to copy a dbo.Table from one database to the Northwind database that needs to be repaired. Please help and advise me how to do this task, or tell me where I can find the Microsoft document that gives the details of this copy operation.
Thanks in advance, Scott Chang
P.S. I am using VB 2005 Express to create a project to learn "Calling Stored Procedures with ADO.NET" (see Paul Kimmel's article at http://www.developer.com/db/article.php/3438221), which needs the dbo.Tables of the Northwind database; my Northwind database has been screwed up for quite a while and needs a big repair.
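If both databases sit on the same instance, a sketch of copying one table across (repeat per table, parents before children so the FKs can be satisfied):

USE Northwind
GO
SELECT *
INTO dbo.Categories
FROM xyzDatabase.dbo.Categories

-- Or, if the empty table already exists in Northwind:
-- SET IDENTITY_INSERT dbo.Categories ON   -- needed because CategoryID is an identity
-- INSERT INTO dbo.Categories (CategoryID, CategoryName, Description, Picture)
--     SELECT CategoryID, CategoryName, Description, Picture
--     FROM xyzDatabase.dbo.Categories
-- SET IDENTITY_INSERT dbo.Categories OFF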
--Table 1 "Employee" CREATE TABLE [MyCompany].[Employee]( [EmployeeGID] [int] IDENTITY(1,1) NOT NULL, [BranchFID] [int] NOT NULL, [FirstName] [varchar](50) NOT NULL, [MiddleName] [varchar](50) NOT NULL, [LastName] [varchar](50) NOT NULL, CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED ( [EmployeeGID] ) GO ALTER TABLE [MyCompany].[Employee] WITH CHECK ADD CONSTRAINT [FK_Employee_BranchFID] FOREIGN KEY([BranchFID]) REFERENCES [myCompany].[Branch] ([BranchGID]) GO ALTER TABLE [MyCompany].[Employee] CHECK CONSTRAINT [FK_Employee_BranchFID]
-- Table 2 "Branch" CREATE TABLE [Mycompany].[Branch]( [BranchGID] [int] IDENTITY(1,1) NOT NULL, [BranchName] [varchar](50) NOT NULL, [City] [varchar](50) NOT NULL, [ManagerFID] [int] NOT NULL, CONSTRAINT [PK_Branch] PRIMARY KEY CLUSTERED ( [BranchGID] ) GO ALTER TABLE [MyCompany].[Branch] WITH CHECK ADD CONSTRAINT [FK_Branch_ManagerFID] FOREIGN KEY([ManagerFID]) REFERENCES [MyCompany].[Employee] ([EmployeeGID]) GO ALTER TABLE [MyCompany].[Branch] CHECK CONSTRAINT [FK_Branch_ManagerFID]
-- Foreign IDs = FID, generated IDs = GID

Then I try a simple single-row DELETE:
DELETE FROM MyCompany.Employee WHERE EmployeeGID= 39
Well, this might look like a very basic error: I get this error after trying to delete something from the "Employee" table:
The DELETE statement conflicted with the REFERENCE constraint "FK_Branch_ManagerFID". The conflict occurred in database "MyDatabase", table "myCompany.Branch", column 'ManagerFID'.
Yes, what I've been doing is deactivating the foreign key constraint in both tables when performing these kinds of operations; the same thing happens if I try to delete a "Branch" entry. Basically each entry in "Branch" and "Employee" is a child of the other, which makes things more complicated.
My question is: is there a simple way to overcome this obstacle without having to deactivate the foreign key constraints every time, or a good way to prevent this from happening in the first place? Is this where I have to use "ON DELETE CASCADE" or something?
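ON DELETE CASCADE would likely be rejected here (SQL Server disallows cascade paths that form cycles), and cascade-deleting branches whenever their manager goes away probably isn't what you want anyway. One common pattern (a sketch) is to let ManagerFID be nullable and detach the manager before the delete:

-- One-time change: drop the FK, make the column nullable, re-add the FK
ALTER TABLE [MyCompany].[Branch] DROP CONSTRAINT [FK_Branch_ManagerFID]
ALTER TABLE [MyCompany].[Branch] ALTER COLUMN [ManagerFID] [int] NULL
ALTER TABLE [MyCompany].[Branch] ADD CONSTRAINT [FK_Branch_ManagerFID]
    FOREIGN KEY([ManagerFID]) REFERENCES [MyCompany].[Employee] ([EmployeeGID])

-- Per delete: clear references to the employee, then remove the row
UPDATE [MyCompany].[Branch]
SET [ManagerFID] = NULL
WHERE [ManagerFID] = 39

DELETE FROM MyCompany.Employee WHERE EmployeeGID = 39

The same two statements wrapped in a transaction keep the data consistent without ever disabling the constraints.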