SQL 2012 :: Strategies To Archive Old Database Data?
Oct 28, 2014
I have a requirement to implement archiving of some of a database's old data.
The database is not used to store files, just tables with text/numeric data and some data-modification logs.
Which strategies do you usually use to archive old records from a database?
Do you move old/unused data to another similar database in another server?
The data needs to be accessible for read access, even when archived.
View 3 Replies
Apr 22, 2015
I am using SQL 2012 SE. I have 2 databases, say A and B, with the same structure and relationships. There are 65 tables in each database. A is already replicating data to database C for 35 tables. Now, every day, I need to move data newer than getdate()-1 from A to B for all the tables, and once the move is done I need to delete that data from A. Then the same thing the next day and every day after. Since this involves 65 tables, it's challenging to identify the insert order. Once the insert order is identified, the delete order will be the reverse of it.
Is there a tool or any SP that could generate the insert-order script? The Generate Scripts wizard (data only) scripts out the entire data set, and these databases are almost 400GB. Some tables have 200Mil+ rows, so it takes forever.
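One way to derive an insert order is to walk the foreign keys in the system catalog; a minimal sketch (it assumes no circular foreign-key chains, which would need manual handling):
;WITH deps AS (
    -- child table -> table it references
    SELECT child_id  = fk.parent_object_id,
           parent_id = fk.referenced_object_id
    FROM   sys.foreign_keys AS fk
    WHERE  fk.parent_object_id <> fk.referenced_object_id      -- ignore self-references
),
lvl AS (
    -- level 0: tables that reference nothing, so they can be inserted first
    SELECT t.object_id, level = 0
    FROM   sys.tables AS t
    WHERE  NOT EXISTS (SELECT 1 FROM deps AS d WHERE d.child_id = t.object_id)
    UNION ALL
    -- each child sits one level below the tables it references
    SELECT d.child_id, l.level + 1
    FROM   deps AS d
    JOIN   lvl  AS l ON l.object_id = d.parent_id
)
SELECT insert_order = MAX(level),
       table_name   = OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id)
FROM   lvl
GROUP BY object_id
ORDER BY insert_order;      -- run the deletes in the reverse of this order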
View 1 Replies
View Related
Jul 31, 2015
I wrote code to archive data. I created a table that stores the maximum number of records to archive and delete, and I loop until finished or until a flag in that table tells the process to stop archiving. This lets me limit the number of records per commit.
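A minimal sketch of that kind of batched loop, assuming hypothetical dbo.SourceTable, dbo.ArchiveTable, and dbo.ArchiveControl objects:
DECLARE @BatchSize int, @StopFlag bit, @Rows int;
SELECT @StopFlag = 0, @Rows = 1;
SELECT @BatchSize = MaxRowsPerBatch FROM dbo.ArchiveControl;

WHILE @Rows > 0 AND @StopFlag = 0
BEGIN
    BEGIN TRAN;

    -- move one batch: DELETE ... OUTPUT writes the removed rows into the archive table
    DELETE TOP (@BatchSize)
    FROM dbo.SourceTable
    OUTPUT deleted.* INTO dbo.ArchiveTable
    WHERE CreatedDate < DATEADD(YEAR, -3, GETDATE());

    SET @Rows = @@ROWCOUNT;

    COMMIT TRAN;    -- small transactions keep log growth and blocking in check

    SELECT @StopFlag = StopArchiving FROM dbo.ArchiveControl;   -- honour the stop flag
END;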
View 9 Replies
View Related
Jul 10, 2015
I'm trying to flesh out a good queue table design with our dev team. So here is a general overview of the scenario. First, an application will hit a Web API, grab any updates to Content, and store those IDs in SQL (a queue table). Next is the fun part: different multi-threaded apps will process IDs from the queue. One app will make updates to the data in a different SQL DB while the other will update an index (likely Elastic).
Obviously, we don't want multiple threads working on the same items. One strategy could be to use UPDLOCK & READPAST query hints. However, I'm not sure about the reliability or performance of this solution. I just started looking into setting up a service broker but that would be completely unfamiliar territory for me. Also I can see how a broker might work well within the instance but how would that work with the application making updates to Elastic?
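The UPDLOCK/READPAST combination is a common pattern for exactly this. A minimal sketch of a claim-style dequeue, where dbo.ContentQueue and its columns are hypothetical names:
DECLARE @Claimed TABLE (ContentId int);

-- READPAST skips rows another worker has locked, UPDLOCK/ROWLOCK stop two workers
-- from claiming the same row, and OUTPUT returns the claimed ids to the caller.
WITH next_batch AS (
    SELECT TOP (10) *
    FROM   dbo.ContentQueue WITH (ROWLOCK, UPDLOCK, READPAST)
    WHERE  Status = 'Pending'
    ORDER BY QueuedAt
)
UPDATE next_batch
SET    Status    = 'Processing',
       ClaimedAt = SYSUTCDATETIME()
OUTPUT inserted.ContentId INTO @Claimed;

SELECT ContentId FROM @Claimed;     -- process these, then mark them Done (or delete them)
Each consumer (the SQL updater and the Elastic indexer) can run its own copy of this against its own status column or its own queue table, which sidesteps the question of getting Service Broker to talk to Elastic.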
View 5 Replies
View Related
Mar 29, 2007
There is a great book on database refactoring that contains a comprehensive set of recipes on how to revise databases that are supposed to be always online and may have various clients that can't all be upgraded at the same time. I guess this is a typical case with large databases, and I would be surprised if Amazon stopped their servers just to move a column from one table to another. The book describes the necessary steps for such changes. Basically it's all about creating intermediate database schemas that are used during the transition period.
For example, if we need to move a column from one table to another:
Version 1.
Table A columns: Name, Price
Table B columns: Quantity, Date
Let's say we move Price to table B:
Version 2.
Table A columns: Name
Table B columns: Quantity, Date, Price
The book suggests an intermediate version:
Version 1_2.
Table A columns: Name, Price
Table B columns: Quantity, Date, Price
Additional trigger that will synchronize "Price" columns between A and B.
Version 1_2 can be used by both clients written for version 1 and 2. Software developers don't need to rush their upgrades, transition can last months and include several changes.
This technique requires accuracy in version control management, but it looks like a very good way to implement a non-interrupting database schema upgrade. I wonder if this is the only option available for schema upgrades with no downtime. I can't think of anything else - is this how large data warehouses update their databases?
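For reference, a minimal sketch of the version 1_2 synchronization trigger; it assumes the two tables share a key column (ProductId here), which the example above leaves implicit:
CREATE TRIGGER trg_A_SyncPrice ON dbo.A
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE b
    SET    b.Price = i.Price
    FROM   dbo.B AS b
    JOIN   inserted AS i ON i.ProductId = b.ProductId
    WHERE  ISNULL(b.Price, -1) <> ISNULL(i.Price, -1);   -- skip no-op writes
END;
-- A mirror trigger on dbo.B pushes Price changes the other way; the WHERE clause
-- above (or a TRIGGER_NESTLEVEL() check) keeps the two triggers from ping-ponging.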
View 1 Replies
View Related
Jun 22, 2015
I have some tables that need to be partitioned, and I need to archive one of the partitions.
I did this in Oracle several years ago but not in SQL Server.
I'm looking for a basic example on how to do this.
I know the basic steps but the examples that I found on the Web were not quite what I'm looking for.
[url]Partition an existing SQL Server Table[/url]
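A minimal sketch of the usual shape of this in SQL Server - partition function, partition scheme, then a metadata-only SWITCH of the oldest partition into an archive table; names and boundary dates are illustrative:
CREATE PARTITION FUNCTION pf_Monthly (datetime)
    AS RANGE RIGHT FOR VALUES ('2015-01-01', '2015-02-01', '2015-03-01');

CREATE PARTITION SCHEME ps_Monthly
    AS PARTITION pf_Monthly ALL TO ([PRIMARY]);

-- the table (its clustered index) is created or rebuilt on ps_Monthly(SomeDateColumn);
-- the archive table must have an identical structure and sit on the same filegroup.

ALTER TABLE dbo.BigTable
    SWITCH PARTITION 1 TO dbo.BigTable_Archive;    -- metadata-only, near-instant

-- optionally retire the emptied boundary afterwards:
ALTER PARTITION FUNCTION pf_Monthly() MERGE RANGE ('2015-01-01');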
View 9 Replies
View Related
Jan 18, 2005
Hi Everybody,
I'm working on a new database of 25GB in size with an expected 25% growth per year and an estimated 1000 TPS. Since I need to retain the old data for at least 7 years, I would like to know whether I should archive the database or whether there is another way of storing the data, which will be used just to generate reports. Also, please let me know the advantages/disadvantages.
Thanks
Kishore
View 1 Replies
View Related
Mar 25, 2012
I have been assigned a project called 'Archive off old data'. As a SQL DBA, the best way to archive a large table is data partitioning. But this table is not big enough to justify partitioning, and it also has 32 dependencies (tables, views, stored procedures). The data from that table needs to be archived off every week to a different database on a different server. My questions are:
1) What is the best way to archive the data?
2) What steps do I need to follow? Do I need to remove the dependencies first and then take out the data?
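A minimal sketch of the weekly move, assuming a hypothetical linked server named ARCHSRV and placeholder table/column names:
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(WEEK, -1, GETDATE());

BEGIN TRAN;     -- with a linked server this becomes a distributed transaction;
                -- alternatively copy first, verify the counts, then delete

INSERT INTO ARCHSRV.ArchiveDB.dbo.BigTable_Archive (Col1, Col2, CreatedDate)
SELECT Col1, Col2, CreatedDate
FROM   dbo.BigTable
WHERE  CreatedDate < @cutoff;

DELETE FROM dbo.BigTable
WHERE  CreatedDate < @cutoff;

COMMIT TRAN;
The dependent views and procedures keep working because the table itself stays in place; only rows move. It is foreign keys pointing at the archived rows that would force a particular order or block the delete.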
View 2 Replies
View Related
Feb 4, 2004
Hello, everyone:
What does "data archive" mean in SQL Server? Is it same thing as archiving in Oracle?
Thanks.
ZYT
View 1 Replies
View Related
Mar 14, 2008
We are running SQL Server 2005 express on Windows 2003. The database server gets significant amounts of data.
Because of the 4GB data limit we have a daily cron task which goes through and deletes data older than 90 days.
We would like a way to archive this data instead of deleting it. Is there any way to take the data, compress it, and store it in a different way, so that if needed, customers can query the compressed data directly? Clearly querying compressed data would be slower, but that is ok.
Any other solutions that would allow us to archive data instead of deleting it? Thanks.
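For what it's worth, a hedged sketch: native data compression only arrived in SQL Server 2008 (Enterprise edition, and all editions from 2016 SP1), so it is not an option on 2005 Express; on a later edition, though, the archive can simply be a page-compressed table in a separate database that customers query directly. All names below are placeholders:
CREATE TABLE ArchiveDB.dbo.Readings_Archive (
    ReadingId   int           NOT NULL,
    ReadingTime datetime      NOT NULL,
    Value       decimal(18,4) NOT NULL
) WITH (DATA_COMPRESSION = PAGE);       -- rows stay queryable with plain SELECTs

INSERT INTO ArchiveDB.dbo.Readings_Archive (ReadingId, ReadingTime, Value)
SELECT ReadingId, ReadingTime, Value
FROM   dbo.Readings
WHERE  ReadingTime < DATEADD(DAY, -90, GETDATE());
-- the existing 90-day purge can then delete the same rows from dbo.Readings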
View 10 Replies
View Related
Nov 27, 2005
I want to replicate to an archive database. This means that the subscriber will have data that has been removed from the publisher. In my reading, I haven't seen any discussion of this specific scenario.
Here's what I imagine the solution might be:
EXEC sp_addpublication_snapshot
@publication = N'My_Publication',
@frequency_type = 1 -- only create the snapshot once
GO
...
EXEC sp_addarticle
@publication = N'My_Publication',
@article = N'My_Table',
@source_owner = N'dbo',
@source_object = N'My_Table',
@del_cmd = N'NONE'
GO
I set the publication snapshot to execute only once - during the maintenance window when it is initially installed. Then, on the tables that will contain archived data, I specify that deletes aren't replicated.
Here's my concern: aren't there times when you need to resync?
If you could push a new snapshot that dropped the tables on the subscriber and built the thing up from scratch, then things would sync-up just fine. But in this scenario if you drop the subscriber tables then you've just lost your archive.
What's the best way to handle this?
Thanks,
-=michael=-
View 6 Replies
View Related
Jun 5, 2002
Hi Folks,
We have transactional replication setup to replicate data from production across to a reporting server.
We want to ARCHIVE production, but don't want the ARCHIVE duplicated on the reporting server.
Does anyone know of a way that the reporting server can be stopped from replicating these changes, and continue to hold the FULL history of the database?
Cheers,
David
View 2 Replies
View Related
Jul 20, 2005
I have a table that contains a huge number of rows, and a performance issue has been raised. I am thinking of archiving some data so that the table will not be that big. The most convenient way is to move it to another table. The problem is: will this solve my performance problem, or do I need to move it to another database to reduce the database size?
Regards,
TrueNo
View 2 Replies
View Related
Oct 31, 1999
Hello:
The purchased application (MSSQL 6.5, SP4) that I am working on has one large table with 13 million rows. It is the largest table, considering the next-largest table is only 1.75 million rows.
The vendor has recommended a change to this largest table: changing a data type from char to varchar. To make this change easier to do,
I want to "archive" older data not necessary for the current year or current processing to another table.
What is the best way to do this archiving?
Any information you can provide will be greatly appreciated. Thanks.
David Spaisman
View 4 Replies
View Related
Mar 7, 2007
Hi
For strange reasons, I am now unable to back up (archive) any of the databases in Analysis Services. :eek:
When I try to archive, I get a dialog box with the following error:
"[MicroSoft][ODBC Driver Manager] Data source name not found and no default driver specified"
I am able to process my cubes and dimensions and am totally lost as to why I'm getting this problem :S
Will appreciate your help.
Thanks.
View 2 Replies
View Related
Mar 14, 2008
I have two tables, say A and Archive. After a certain period of time some records are to be sent to the Archive table. To copy records to the Archive table I am using SqlBulkCopy operations. Then I have to delete those records from table A. I was thinking of sending a comma-separated list of the IDs of the rows to be deleted to a stored procedure. Are there any better techniques to move data to the archive table and remove it from the main table? Thanks.
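One possibility, sketched under the assumption that you are on SQL Server 2008 or later (type, table, and column names are hypothetical): pass the ids as a table-valued parameter instead of a CSV string, and let a single DELETE ... OUTPUT both copy the rows into Archive and remove them from A, which makes the SqlBulkCopy step unnecessary:
CREATE TYPE dbo.IdList AS TABLE (Id int PRIMARY KEY);
GO
CREATE PROCEDURE dbo.ArchiveRows
    @Ids dbo.IdList READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- one statement copies the rows to Archive and deletes them from A
    -- (OUTPUT ... INTO requires dbo.Archive to have no triggers or foreign keys)
    DELETE a
    OUTPUT deleted.* INTO dbo.Archive
    FROM dbo.A AS a
    JOIN @Ids AS i ON i.Id = a.Id;
END;
From ADO.NET the parameter is sent as SqlDbType.Structured, with a DataTable (or IEnumerable of SqlDataRecord) as its value.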
View 9 Replies
View Related
May 5, 2015
I'm looking for a process to archive data through replication. I have a nightly job that purges records in a few source tables (publisher), retaining only 3 years of data. I have an archive database (subscriber) that contains the prior 3 years of data plus the current 3 years of data.
Before the nightly job DELETEs records in the source tables, I want to STOP replication so that the deletes are not replicated to the archive database. After the job completes I would like to TURN replication back ON, so that new inserts and updates in the source are still replicated to the archive database.
My DBA tested this, but after the last step of turning replication back ON, the archive database gets re-synced with the source tables, so the purged rows disappear from the archive too.
There are around 70 tables where 30 of them are transactional tables that needs record purge. Developing ETL process is possible but tedious.
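Rather than pausing and resuming replication, one option (a hedged sketch; publication and article names are placeholders) is to configure the purged articles so DELETE commands are simply not propagated to the subscriber:
EXEC sp_changearticle
    @publication = N'MyPublication',
    @article     = N'MyTable',
    @property    = N'del_cmd',
    @value       = N'NONE',
    @force_invalidate_snapshot = 1,     -- changing the command may invalidate the snapshot
    @force_reinit_subscription = 0;
The trade-off is that genuine deletes, not just the purge, also stop flowing to the archive, and a re-initialization from a fresh snapshot would overwrite the archive, so snapshots need to be handled carefully.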
View 5 Replies
View Related
Jun 21, 2006
Apparently the Query Analyzer of SQL Server does not recognise a database with a point in it, like mail.archive.mdf. I receive the following error when I use it:
Server: Msg 911, Level 16, State 1, Line 1
Could not locate entry in sysdatabases for database 'mail'. No entry found with that name. Make sure that the name is entered correctly.
Please help, thanks
Prem
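The parser is reading 'mail' as the database name and 'archive' as an object, so the dotted name has to be delimited; a small sketch (SomeTable is a placeholder):
-- square brackets (or double quotes with QUOTED_IDENTIFIER ON) keep the dot
-- from being treated as a name separator
USE [mail.archive];
SELECT TOP 10 * FROM [mail.archive].dbo.SomeTable;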
View 2 Replies
View Related
May 16, 2006
Dear all,
I'd like to explore more about indexing strategies and would appreciate a reference to some good resources in this regard, including book names.
Best regards
View 1 Replies
View Related
Sep 28, 2007
Hello, what are the strategies when designing tables that need paging? In the past I used to use: select top 200 * from table where id not in (select top 100 id from table). With SQL 2005, would you guys recommend using a CTE and/or ROW_NUMBER? Or any other advice? Thanks
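ROW_NUMBER() in a CTE is the usual SQL 2005 answer; a minimal sketch with placeholder names:
DECLARE @PageNumber int, @PageSize int;
SELECT @PageNumber = 3, @PageSize = 100;

WITH numbered AS (
    SELECT t.*,
           rn = ROW_NUMBER() OVER (ORDER BY t.id)   -- order by a deterministic key so pages never overlap
    FROM   dbo.MyTable AS t
)
SELECT *
FROM   numbered
WHERE  rn BETWEEN (@PageNumber - 1) * @PageSize + 1 AND @PageNumber * @PageSize
ORDER BY rn;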
View 4 Replies
View Related
Aug 23, 2007
I have been somewhat dismayed by the lack of information available relative to testing SSIS solutions. At a time when both managed code and db code are getting tooling (through VSTS) to help drive an automated testing mentality into developers everywhere - there seems to be little out there for those using SSIS. I have seen people talk about having SSIS packages that test the actual packages - there is always the manual way to write the tests in something like C# and then programmatically invoke the package and manually validate the results. Anyone out there have approaches that are working for them?
View 1 Replies
View Related
Oct 2, 2007
I'm about to start a project that will be hosted by a third-party web host. What is a common way to back up your database and have the backup saved? The data may end up being several 100 MB of user settings, text, etc. (blog type stuff). If the DB gets to be several 100MB, does making a backup and FTPing it offsite sound reasonable? Does FTP bandwidth usually count against your overall bandwidth usage?
View 2 Replies
View Related
Jul 20, 2005
I am looking for some published papers on database performance tuning strategies. This is for academic purposes, so it need not be specific to any commercial database. It would be even better if the paper has some kind of method to quantify/measure performance. Has anyone come across any interesting paper about this?
Thanks,
ewong
View 2 Replies
View Related
Mar 30, 2001
I need a script that can be scheduled on a SQL 7 server to archive a table's data for every day except the current one, then pull the last 7 days' worth of data back into the table. Does anyone have anything like this? My database is a load-totaling database and the size is getting out of control.
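If the end state is "live table holds only the last 7 days, archive table holds everything older", one reading of that job is a nightly move like the sketch below (table and column names are placeholders, and the syntax sticks to what SQL 7 supports):
BEGIN TRAN

-- append everything older than 7 whole days to the archive ...
INSERT INTO dbo.LoadTotals_Archive (LoadDate, TotalAmount)
SELECT LoadDate, TotalAmount
FROM   dbo.LoadTotals
WHERE  LoadDate < DATEADD(day, -7, CONVERT(char(8), GETDATE(), 112))

-- ... then remove the same rows from the live table
DELETE FROM dbo.LoadTotals
WHERE  LoadDate < DATEADD(day, -7, CONVERT(char(8), GETDATE(), 112))

COMMIT TRAN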
View 4 Replies
View Related
May 16, 2014
How do I script out a database without its data in SQL Server 2012?
View 1 Replies
View Related
Apr 17, 2014
I enabled Data Collection on one of the servers and planned to make it the centralised Management Data Warehouse. I configured data collection on it and can view reports. Next, I went to the other server and configured "Set up data collection" to use my first instance as the centralised database. But the issue is that I can only see reports for the first server. Am I missing something here?
I did exactly as explained in this video [URL] .....
View 9 Replies
View Related
Jul 21, 2014
I've stepped into a new environment and have never dealt with multiple data files on user databases, only with tempdb. What would be the best way to get all my data files in sync? I have only done this on databases that aren't that big, or that aren't off in size by a lot. Here is what I have:
Mdf -- 69 GB
ndf -- 3 GB
ndf -- 3 GB
ndf -- 3 GB
ndf -- 3 GB
ndf -- 4 GB
ndf -- 4 GB
ndf -- 2 GB
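As a first step, a quick look at how uneven the files actually are (a minimal sketch against the current database):
SELECT name,
       physical_name,
       size_mb = size / 128.0,                                      -- size is stored in 8 KB pages
       used_mb = FILEPROPERTY(name, 'SpaceUsed') / 128.0,
       free_mb = (size - FILEPROPERTY(name, 'SpaceUsed')) / 128.0
FROM   sys.database_files
WHERE  type_desc = 'ROWS';
Proportional fill only evens the files out over time when they have the same size and growth settings, so the usual approaches are to grow the small files to match the others, or to empty the odd ones out with DBCC SHRINKFILE (file_name, EMPTYFILE) and remove them.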
View 7 Replies
View Related
Feb 13, 2015
I am aware that TDE protects data at rest and not data in motion (UNLESS you use encrypted communication channels, e.g. SSL certificates). Hence I am thinking of doing a data export from a TDE-encrypted database to a database on an instance where TDE is not enabled or supported. I believe it works, and I need to take care of the relationships between the tables. The target database is hosted on SQL 2012 Standard edition, on which TDE is not supported.
View 4 Replies
View Related
Dec 22, 2014
(SQL 2012 Standard)
I have a database with one 320GB .mdf file and one 200GB .ndf file.
I would like to reconfigure my database to have four 200GB .mdf files. How do I get from here to there?
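A hedged sketch of the mechanics, with placeholder file names and paths; note that a database has exactly one primary .mdf, so "four 200GB files" would in practice be the primary .mdf plus three .ndf files:
ALTER DATABASE MyDb ADD FILE
    (NAME = MyDb_Data2, FILENAME = 'E:\Data\MyDb_Data2.ndf', SIZE = 200GB, FILEGROWTH = 1GB),
    (NAME = MyDb_Data3, FILENAME = 'E:\Data\MyDb_Data3.ndf', SIZE = 200GB, FILEGROWTH = 1GB),
    (NAME = MyDb_Data4, FILENAME = 'E:\Data\MyDb_Data4.ndf', SIZE = 200GB, FILEGROWTH = 1GB)
TO FILEGROUP [PRIMARY];

-- existing pages do not spread out on their own: rebuilding the clustered indexes,
-- or pushing data out of the overfull files with DBCC SHRINKFILE (MyDb_Data1, EMPTYFILE),
-- is what actually redistributes the data across the files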
View 5 Replies
View Related
Jul 30, 2014
How do I export data from an MDS 2012 entity to a SQL Server 2012 user database table?
View 6 Replies
View Related
Mar 24, 2015
How do I estimate the size of the data files when creating a database?
View 2 Replies
View Related