SQL Help With Purging Old Data
Feb 28, 2007
Does anyone know how to purge old data in a MS SQL Server 2000 database?
How do I purge data from an MSDE database? I only want to keep 6 months of data in the database, but right now I have data going back to 2004. About every 10 seconds I get the error "Primary File Group is Full".
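For context, MSDE 2000 caps each database at roughly 2 GB of data, which is usually what is behind the "Primary File Group is Full" error. A minimal sketch of one approach, assuming a hypothetical table dbo.EventLog with a datetime column EventDate; deleting in small batches keeps the transaction log from filling during one huge transaction:

-- Keep only the last 6 months of data; table and column names are placeholders.
SET ROWCOUNT 10000                      -- delete in batches of 10,000 rows (SQL 2000 / MSDE syntax)
DECLARE @cutoff datetime
SET @cutoff = DATEADD(month, -6, GETDATE())

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.EventLog WHERE EventDate < @cutoff
    IF @@ROWCOUNT = 0 BREAK             -- nothing older than the cutoff is left
END

SET ROWCOUNT 0                          -- reset the row limit
-- The data file may still be full of empty pages afterwards; shrinking can reclaim the space:
-- DBCC SHRINKDATABASE (MyDatabase)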
Regarding SQL Server data, I am looking to implement the best data-archive and purge policy. Normally we do SQL backups and keep the history for some period, for example 8 weeks, so we can go back and restore to any point in time up to 8 weeks in the past, and we also do tape backups.
The question is: where can I get a good article or documentation on how to best design such a policy, so that I am covered both for point-in-time recovery of the database (the SQL backups) and for point-in-time recovery in the far past, say 3 years ago, using tape backups, and so that I don't repeat the same efforts?
Any advice or suggestions on this topic are welcome.
Thanks,
Is it possible to purge all records in the database while retaining the table structures? Better yet, could I do it on a table-by-table basis? If I simply delete all the records, the identities for the tables do not revert back to 1.
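A minimal per-table sketch (dbo.MyTable is a placeholder name): TRUNCATE TABLE removes all rows and resets the identity seed, but it fails if the table is referenced by a foreign key; in that case DELETE plus DBCC CHECKIDENT gives the same result.

-- Fastest option: removes all rows and resets the identity counter.
TRUNCATE TABLE dbo.MyTable

-- Alternative when TRUNCATE is not allowed (foreign key references):
DELETE FROM dbo.MyTable
DBCC CHECKIDENT ('dbo.MyTable', RESEED, 0)   -- the next inserted identity value will be 1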
I have 6 tables with very high row counts, and records older than 8 days need to be deleted.
A little info: every day, 300 million records are inserted into the tables below. We should maintain only 8 days' worth of data in these tables. How can I implement a purge script that deletes records from all the tables at the same time, with optimized parallelism?
Master table which has [ID],[Timestamp]
Table Name: Sample - 2,578,106
Child tables: the foreign key [ID] is common to all the tables. There is no timestamp column in the child tables, so their records need to be deleted based on Min(ID) from Sample (see the sketch after the table list below).
dbo.ConnectionDB - 1,147,578,048
dbo.ConnectionSS - 876,458,321
dbo.ConnectionRT - 118,133,857
dbo.ConnectionSample - 100,038,535
dbo.Command - 100,032,235
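A hedged sketch of one way to drive the purge from the master table (table and column names are taken from the list above; the batch size, the loop, and the assumption that [ID] increases with [Timestamp] are mine; DELETE TOP requires SQL Server 2005 or later). The cutoff is computed once from Sample, then each child table is deleted in chunks so locks stay short; running one such loop per table in separate SQL Agent jobs is one way to get the parallelism asked about.

DECLARE @minKeepID bigint;

-- Lowest ID that must be kept: the oldest row in Sample that is still within 8 days.
SELECT @minKeepID = MIN([ID])
FROM dbo.Sample
WHERE [Timestamp] >= DATEADD(day, -8, GETDATE());

-- Delete in batches; repeat the same loop for each child table
-- (dbo.ConnectionDB, dbo.ConnectionSS, dbo.ConnectionRT, dbo.ConnectionSample, dbo.Command),
-- then purge dbo.Sample itself last using its [Timestamp] column.
WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dbo.ConnectionDB
    WHERE [ID] < @minKeepID;

    IF @@ROWCOUNT = 0 BREAK;
END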
How do I implement optimal archival and purging in MS SQL Server databases?
hi !
I have a job that creates tranlog backups every 15 mins and makes a full database backup at midnight.
I want to purge the old logs after the full backup. How can I do that?
thanks
Sami
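One hedged option (the folder path is a placeholder, and xp_delete_file is an undocumented extended procedure, the same one maintenance plan cleanup tasks use): run a step like this right after the midnight full backup to remove .trn files older than a cutoff.

-- Remove transaction log backup files older than 24 hours.
-- Parameters: 0 = backup files, folder, extension (no dot), cutoff date, 0 = do not recurse subfolders.
DECLARE @cutoff datetime
SET @cutoff = DATEADD(day, -1, GETDATE())

EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups\MyDb\', N'trn', @cutoff, 0

-- Optionally trim old backup history in msdb as well:
DECLARE @oldest datetime
SET @oldest = DATEADD(day, -30, GETDATE())
EXECUTE msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest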
Hi DB Experts
I have a question on shrinking tempDB. Currently I am using MS SQL Server 2005. I have software that uses a SQL database, but I created a separate database to store my data; tempDB is not used directly at all.
Problem - Whenever I export data into my created database on the MS SQL server, tempDB also grows. I noticed that the files grew so large that it crashed the server. I have 50 GB of free space on the drive where MS SQL Server is installed.
Question - Is there any solution to shrink tempDB or cap the growth of its size?
Best regards
TEWCT
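For background, tempDB is used behind the scenes for sorts, hash joins, and row versioning even if you never create objects in it, which is why large exports make it grow. A minimal sketch (tempdev is the default logical file name; verify yours first, and note that tempDB is recreated at its configured initial size on every service restart):

USE tempdb;
GO
EXEC tempdb.dbo.sp_helpfile;            -- confirm the logical file names
GO
-- Shrink the data file back down to a target size in MB; only reclaims space that is currently free.
DBCC SHRINKFILE (tempdev, 1024);
GO
-- Capping growth is possible but risky: queries fail with an out-of-space error once the cap is hit.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, MAXSIZE = 20480MB);
GO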
Hi,
In one of our DR servers we have configured custom log shipping. In the folder where the .TRN files are being copied there is a script to purge the files that are older than one day. The code is below.
@Echo Off
REM Remove any leftover working files from the previous run.
if exist filelist del filelist
REM Capture today's date (the second token of "date /t" output) into the rundate variable.
date /t > rundate
for /f "tokens=2* delims= " %%i in (rundate) do set rundate=%%i
REM Write each .trn file name followed by its file date/time into filelist.
for %%i IN (*.trn) do echo >> filelist %%i %%~ti
REM Delete every file whose date is not today's date, i.e. keep only today's log backups.
for /f "tokens=1,2* delims= " %%i in (filelist) do if not %rundate% equ %%j del %%i
:pause
:Exit
Instead of removing the files older than 1 day, I need to keep 3 days of transaction logs.
Being a novice, I don't have much idea how to accomplish this. Can anybody help me with it?
Many thanks in advance,
Sandhya
My transmission queue has lots of messages that will never, ever be delivered because the transmission_status = "The session keys for this conversation could not be created or accessed. The database master key is required for this operation."
How can I purge the transmission queue to get rid of this junk?
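A hedged sketch of one way to clear them: messages sit in sys.transmission_queue until their conversations are ended, and END CONVERSATION ... WITH CLEANUP drops a conversation's messages without notifying the other side. (Restoring or recreating the database master key is the real fix if those conversations should actually be delivered.)

-- Forcefully end every conversation that still has messages stuck in the transmission queue.
DECLARE @handle uniqueidentifier;

DECLARE stuck CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT conversation_handle
    FROM sys.transmission_queue;

OPEN stuck;
FETCH NEXT FROM stuck INTO @handle;
WHILE @@FETCH_STATUS = 0
BEGIN
    END CONVERSATION @handle WITH CLEANUP;   -- removes the messages without a normal shutdown handshake
    FETCH NEXT FROM stuck INTO @handle;
END
CLOSE stuck;
DEALLOCATE stuck;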
When connecting to a SQL Server (v7 in this case), the log file shows that old and deleted databases are still being opened as part of the process. Further, these db names are now reserved and cannot be reused, even though I've deleted them.
Is there any way I can purge all traces of these deleted db names?
You might have guessed, I'm a newbie, which is why there are a few dozen deleted DBs.
Thanks for any help
Max
atomax@gmx.net
Hi,
I'm using SQL2K with SP4 2187. I have created a DB maintenance plan with the wizard, with purging of files older than 1 day set.
However, this feature does not seem to be working, even though I tried two things: 1) deleting the scheduled job and recreating it (not successful), and 2) deleting the maintenance plan and recreating it (still not successful).
Is this a bug, or am I missing something here?
Hi, we have a Java application that runs against a variety of backends including Oracle and MS SQL 2000 and 2005. Our application has a handful of active tables that are constantly being inserted into, some of which are also selected from on a regular basis.
We have a purge job that removes unneeded records every hour: "DELETE FROM <table> WHERE <datefield> < <sometimestamp>". We purge this way because we need 100% uptime, so we do it every hour. For some tables the timestamp is 2 weeks ago, for others it is 2 hours ago. The date field is indexed in some cases, in others it is not. The DELETE is always done off of a transaction (autoCommit on), but experimentation has shown doing it in one doesn't help much.
This task normally functions fine, but every once in a while the inserts or counts on the table fail with deadlocks during the purge job. I'm looking for thoughts on what we could do differently, or other experience doing this type of thing. Some possibilities include:
- doing a select first, then deleting one by one. This is a possibility, but it's slow and may take over an hour, so we'd be constantly churning, deleting single records from the db.
- freezing access to these tables during the purge job. Our app cannot really afford to do this, but perhaps this is the only option.
- doing an update of an "OBSOLETE" flag on the record, then deleting by that flag. I'm not sure we'd avoid issues doing this, but it's an option.
The failures happen very infrequently on SQL 2005 and much more frequently on SQL 2000. Any help or guidance would be most appreciated, thanks!
Bob
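One pattern that has helped in similar setups (a sketch, not from the original thread; the table and column names are placeholders): break the hourly purge into many small, short-lived transactions, so each one holds row or page locks only briefly and is much less likely to escalate or collide with the inserts. Indexing the date column where it is not indexed also shortens the range the DELETE has to lock.

-- SQL Server 2005 version: delete in small chunks until nothing is left.
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(hour, -2, GETDATE());   -- whatever this table's retention is

WHILE 1 = 1
BEGIN
    DELETE TOP (1000) FROM dbo.MyActiveTable   -- hypothetical table name
    WHERE DateField < @cutoff;

    IF @@ROWCOUNT = 0 BREAK;

    WAITFOR DELAY '00:00:01';   -- brief pause lets waiting inserts through
END

-- SQL Server 2000 version: use SET ROWCOUNT 1000 before the DELETE inside the loop,
-- then SET ROWCOUNT 0 afterwards, since DELETE TOP is not available there.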
hi All,
I have to remove non-useful and duplicate records containing NULLs, blanks, and extra spaces (on either the left or right side of the column values) from all the tables in my server's database XYZ, weekly, through a scheduled job that calls a stored procedure; I guess that is called purging the DB. Please help with how I can do this in T-SQL.
Also, I have to find and remove all the duplicate DB objects (tables) in the DB, e.g. a table named TABLE_TEST or TABLE_DEBUG existing alongside an original table TABLE, while making sure none of the base tables are dropped.
Please help me regarding these two problems.
Thanks in advance for the quick replies.
Mohit
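A hedged sketch covering both asks, using placeholder names (dbo.SomeTable with columns Col1 and Col2); a real procedure would loop the same pattern over each table in the database.

-- 1) Trim leading/trailing spaces and turn empty strings into NULL.
UPDATE dbo.SomeTable
SET Col1 = NULLIF(LTRIM(RTRIM(Col1)), ''),
    Col2 = NULLIF(LTRIM(RTRIM(Col2)), '');

-- Remove rows where every column ended up NULL.
DELETE FROM dbo.SomeTable
WHERE Col1 IS NULL AND Col2 IS NULL;

-- 2) List candidate leftover copies such as TABLE_TEST or TABLE_DEBUG,
--    but only where a base table with the same name minus the suffix exists.
SELECT t.TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES AS t
WHERE t.TABLE_TYPE = 'BASE TABLE'
  AND (t.TABLE_NAME LIKE '%[_]TEST' OR t.TABLE_NAME LIKE '%[_]DEBUG')
  AND EXISTS (SELECT 1
              FROM INFORMATION_SCHEMA.TABLES AS b
              WHERE b.TABLE_TYPE = 'BASE TABLE'
                AND b.TABLE_NAME = LEFT(t.TABLE_NAME, LEN(t.TABLE_NAME) - CHARINDEX('_', REVERSE(t.TABLE_NAME))));
-- Review the list, then DROP TABLE each copy by hand rather than automatically.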
Hi,
We need an SSIS package that would perform routine purging of the growing data in some of the tables used to support Notification Services. While running SQL in a regular job would seem to suffice, an SSIS package would be more in line with the other processes for the alerts in terms of manageability. The SSIS package should follow these guidelines: it deletes records from a given number of tables, based on a specified date column and a specified number of days for each table, or on other conditions.
- SystemAlertQueue: 30 days old, based on the SubmitTimestamp column.
- SystemAlertChron: 30 days old, based on the EventTimestamp column.
- SystemAlertNotifChron: 30 days old, based on the NotifTimestamp column.
- WMICheck and related tables (based on WMICheckID; see the WMIAlerts database diagram in SQL Server): 15 days old, based on the BeginCheckDate column.
Each table's deletion routine should be distinguishable in the package. Does anybody know how to do this? Please help me.
Regards,
Deepu M.I
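A hedged sketch of the SQL each Execute SQL Task in such a package might run (the column names are the ones listed above; the dbo schema is an assumption, and the WMICheck cleanup would also need deletes against its related tables, joined on WMICheckID, before the statement shown here):

-- 30-day retention tables.
DELETE FROM dbo.SystemAlertQueue
WHERE SubmitTimestamp < DATEADD(day, -30, GETDATE());

DELETE FROM dbo.SystemAlertChron
WHERE EventTimestamp < DATEADD(day, -30, GETDATE());

DELETE FROM dbo.SystemAlertNotifChron
WHERE NotifTimestamp < DATEADD(day, -30, GETDATE());

-- 15-day retention: delete from the related tables first (join on WMICheckID), then:
DELETE FROM dbo.WMICheck
WHERE BeginCheckDate < DATEADD(day, -15, GETDATE());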
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set that feed directly into my data mining models, without saving them somewhere, as they occupy too much space? I really need guidance on that.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
After testing the application I wrote on my local PC, I deployed it to the web server to test it there, and I get this error:
System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.
Note: all pages that have this error have either a repeater or a datagrid that loads data during page load.
At first I thought the problem was with the date, but I can see that some other pages that have a datagrid (with a date field) work just fine.
Has anyone had this problem before? Hopefully you guys can help.
Thanks,
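For what it's worth, this kind of out-of-range error often comes from dates being sent as text and interpreted with the server's or login's date format, which can differ between a local PC and the web server. A small illustration of the dependency (the safest fix is to pass dates as typed parameters rather than strings):

-- Interpreted as month/day/year: fails, because there is no 28th month.
SET DATEFORMAT mdy;
SELECT CONVERT(datetime, '28/02/2007');        -- out-of-range datetime error

-- Same text, interpreted as day/month/year: succeeds.
SET DATEFORMAT dmy;
SELECT CONVERT(datetime, '28/02/2007');

-- An explicit style avoids the ambiguity entirely (103 = dd/mm/yyyy).
SELECT CONVERT(datetime, '28/02/2007', 103);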
I have used both data readers and data adapters (with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.
As the name might suggest, it seems like a datareader is for only reading data. I have read that the data adapter and dataset are for a disconnected architecture. Or, that they can be used for this type of set up. I have been using the data adapter and datasets when writing to a database and the datareader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?
Thanks in advance.
We have already integrated data from different clients into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is this possible using MDS? Is there any background sync process that automatically syncs data to and from the subscriber and MDS?
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application), I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
I have a requirement to implement CDC for 50+ tables to capture incremental data changes for warehouse/reporting, rather than exporting the whole table data. The largest table has more than half a billion rows.
The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on the OLTP db?
Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?
What is the best way to implement CDC to take incremental changes for reporting?
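For reference, enabling CDC itself is two calls (a sketch; the database, schema, and table names are placeholders). CDC reads the transaction log asynchronously, so the direct overhead on the OLTP workload is usually modest, but the capture and cleanup jobs do add log-reader and storage load that is worth testing at this volume.

USE MyOltpDb;
GO
-- Enable CDC at the database level (requires sysadmin).
EXEC sys.sp_cdc_enable_db;
GO
-- Enable CDC for one table; repeat per table (50+ in this case).
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'MyLargeTable',
    @role_name     = NULL,            -- no gating role
    @supports_net_changes = 1;        -- net-change queries; requires a primary key or @index_name
GO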
Hi,
This is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data) FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help.
SD
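One approach that works on SQL Server 2000 (a sketch; it assumes the stored bytes are single-byte text, which is what WRITETEXT with a plain string literal writes, and it is limited to 8000 bytes per row): cast the image column to varbinary and then to varchar.

-- Cast image -> varbinary -> varchar to get readable text back.
SELECT CONVERT(varchar(8000), CONVERT(varbinary(8000), BITS_data)) AS NotesText
FROM mytable_BINARY
WHERE ID = 'RB215'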
I'm using a Script Component to load data into an Oracle DB because of a poor performance issue. Now I have found that it is missing some data during the transmission. Please see the screenshots below:
SQL Server:
Oracle:
DDL:
create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);
Result:
I followed this article: [URL] ....
VB Script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
[Code] ..........
Hi all, I got this error:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to Data Flow -> Excel Connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file?
And also, Fiscal Year and Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?
Thanks
When I execute the stored procedure below, I get the error "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.
(1 row(s) affected)
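For context, this error means some expression in the procedure (often a SUM, a COUNT over a very large set, or a multiplication of int values) exceeds the int range of 2,147,483,647; widening the type fixes it. A tiny illustration, unrelated to the hidden procedure body:

-- Fails with "Arithmetic overflow error converting expression to data type int":
SELECT 2000000000 + 2000000000;

-- Works: force the arithmetic into bigint first
-- (in aggregates, use COUNT_BIG or SUM(CAST(col AS bigint)) the same way).
SELECT CAST(2000000000 AS bigint) + 2000000000;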
Is there a step-by-step paper on getting there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company", and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I created a username in Windows, then added the Windows user in SQL Server Management Studio under Security, granted "DB Read" and "DB Write", and did the same again under the database's Security tab. Still, authentication from the web fails. I must be missing a step or two?
I expect to set up a username for each database as I set up new customers.
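One hedged checklist in T-SQL (the names are placeholders). Note that with Windows authentication, an ASP.NET site usually connects as the application pool / worker process account (for example NT AUTHORITY\NETWORK SERVICE), not as the browsing user, unless impersonation or delegation is configured, so that account is the one that needs the login and database user:

-- Server-level login for the Windows account the web application actually runs as.
CREATE LOGIN [MYDOMAIN\WebAppAccount] FROM WINDOWS;
GO
USE CustomerDb1;
GO
-- Database user mapped to that login, with read/write access.
CREATE USER [MYDOMAIN\WebAppAccount] FOR LOGIN [MYDOMAIN\WebAppAccount];
EXEC sp_addrolemember N'db_datareader', N'MYDOMAIN\WebAppAccount';
EXEC sp_addrolemember N'db_datawriter', N'MYDOMAIN\WebAppAccount';
GO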
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML data source to be an expression, as I will be looping through a directory of XML files.
I don't see the expression property or the connection property?
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
Thanks for any help or information.
I set up this package to import data from a SharePoint list into a SQL Server table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
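One hedged pattern (table and column names are placeholders): land the SharePoint rows in a staging table first, then insert only Titles that are not already in the destination, so a duplicate in the list no longer fails the package.

-- Assumes the data flow writes to dbo.StagingList instead of the final table.
INSERT INTO dbo.TargetTable (Title, OtherColumn)
SELECT s.Title, s.OtherColumn
FROM dbo.StagingList AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable AS t
                  WHERE t.Title = s.Title);

-- If duplicates can also occur within a single extract, keep one row per Title first,
-- e.g. GROUP BY Title or ROW_NUMBER() OVER (PARTITION BY Title ORDER BY ...) = 1.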
Hello,
I am wondering what conversion rules apply when a string that contains a number is saved to a SQL Server 2005 database into a column of type decimal.
This is the code I'm using (C++):
CString cValue = "0.75";
_variant_t vtFieldValue;
vtFieldValue = _variant_t(cValue);
pRecordSet->Fields->Item["MyColumn"]->Value = vtFieldValue;
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is, if the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example in standard French regional settings the "." would not be recognized as decimal separator.
I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.
So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?
Thank you for your help.
Regards,
Volker
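As a small data point (a sketch, not a full answer): once the value reaches the server as a string, T-SQL's string-to-decimal conversion always expects a period as the decimal separator, regardless of the server language; the place where regional settings usually matter is the client-side variant conversion in ADO before the value is sent.

SET LANGUAGE French;
SELECT CONVERT(decimal(19,10), '0.75');   -- works: the period is always the separator
SELECT CONVERT(decimal(19,10), '0,75');   -- fails: error converting varchar to numeric
SET LANGUAGE us_english;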
I have a question about how to handle structural data model changes in a data source for PowerPivot. Suppose I'm developing a star model in SQL Server, and sometimes a data type changes or the name of a field in a table changes. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services multidimensional does (mostly). I received an error because of a wrong field name, and even no error when a data type changed in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes, the PowerPivot model should be recreated? Or am I missing the clue here?
Hi everyone,
I have to extract, daily, a list of contacts from an Exchange server into a table in our EDW on SQL Server 2005. Is it possible to get the information directly from a data flow, or will I have to develop a script task?
Need help desperately!!!
I am getting error code 8 while loading the data from stage to model. I checked my error view; it states that "Member Code is Inactive".
Initially I loaded the same set of data into the model from the MDS stage table, but then deleted it with ImportType = 5, which removed all the data from the MDM model.
Now I want to load it back, but it is giving error code 8. Before loading the same data, I changed the stage table's ImportType to 2 and ImportStatusId to 0.
Hello,
I have noticed that for one of my data flows, the process takes a really long time during the phase "the final commit data insertion has started".
To be precise, the process is fast until it reaches this phase. It happens often when I load millions of rows.
The extraction is done from a database SQL Server 2005 to a database SQL Server 2005, on the same server (with the SQL Server native provider).
I used a SQL Server destination but I have tried with an OLE DB destination and it is the same situation.
Why could the process take so long during this phase?
Is there a way to optimise my package to avoid this?
Any idea is welcome.
Thanks.
Guillaume