I have a SQL 2000 server with 128m rows of data. I want to delete about 65m of them. So far I have bcp'ed the relevant data out and put it into another SQL database.
We have a small amount of space for our transaction log, so I cannot delete all 65m rows in one go. So far I have been doing them in 0.5m chunks, but it is extremely slow.
Would a faster way be to bcp out the data I want to keep, truncate the table and bulk import it again?
What happens to the log size while the bulk import is happening, and is there another way of doing this?
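For what it's worth, the bcp / truncate / reload route can be minimally logged if the database is in bulk-logged or simple recovery and the reload uses TABLOCK; under full recovery the bulk import is fully logged. A rough sketch on SQL 2000, with server, path, table and column names as placeholders:

-- 1. export only the rows you want to keep (run from a command prompt)
--    bcp "SELECT * FROM MyDB.dbo.BigTable WHERE KeepFlag = 1" queryout C:\keep.dat -n -T -S MYSERVER
-- 2. empty the table; TRUNCATE is minimally logged and frees the pages immediately
TRUNCATE TABLE dbo.BigTable
-- 3. reload; TABLOCK lets the load be minimally logged in bulk-logged or simple recovery
BULK INSERT dbo.BigTable
FROM 'C:\keep.dat'
WITH (DATAFILETYPE = 'native', TABLOCK)
-- dropping nonclustered indexes before the load and recreating them afterwards also helps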
I have a table that has 110 million records and I will be deleting over 60 million of them, but I don't have enough space to hold all the deletes in the transaction log file. Is there a way to delete 1 million records, free the transaction log, then run another delete, all in one script?
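One common shape for this is a loop that deletes a batch and then clears the log before the next batch. A sketch, assuming SQL 2005 or later for DELETE TOP (on SQL 2000 use SET ROWCOUNT instead) and placeholder table, column and path names:

WHILE 1 = 1
BEGIN
    DELETE TOP (1000000) FROM dbo.BigTable
    WHERE SomeDate < '20100101'          -- whatever identifies the rows to remove

    IF @@ROWCOUNT = 0 BREAK

    BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'
    -- under the SIMPLE recovery model a CHECKPOINT is enough instead of the log backup
END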
I wrote an ActiveX script to delete records from several tables, each holding more than 100,000 records. When I test it by deleting 1,000 records it is fine, but once the volume increases to 100,000 it gives me an error message saying the operation timed out.
I have a table with about 466 million rows. In this table there is an int column called WeeksToRetain as well as an EventDate column containing the date the row was inserted. I am trying to delete all the rows that should be deleted according to WeeksToRetain. For example, if the EventDate is 5/07/15 with a 1 in the WeeksToRetain column, the row should be removed by 5/14/15. I am not sure which days SQL considers the beginning and end of the week. However, the core issue I am having is the sheer volume of deletions I must do and the log growth.
So I am trying to do the delete in batches. More specifically, I want to load a temporary table with a million rows, then use it to load a second temporary table with 100,000 rows and join that to the table I want to delete from, looping through 10 times to get the million. The Logging.EvenLog table, which is the table I'm trying to purge, has a clustered index on EventDate (ASC). I would like to run this in a scheduled job with enough time between executions for log backups to run.
DECLARE @i int
DECLARE @RowCount int
DECLARE @NextBatchDate datetime

CREATE TABLE #BatchProcess
(
    EventDate datetime,
    ApplicationID int,
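A simpler shape may do the same job without the nested temp tables. A rough sketch using the names from the post (DELETE TOP needs SQL 2005 or later); the WAITFOR leaves room for the scheduled log backups between batches:

DECLARE @RowsDeleted int
SET @RowsDeleted = 1

WHILE @RowsDeleted > 0
BEGIN
    -- delete the next 100,000 rows whose retention window has passed
    DELETE TOP (100000)
    FROM Logging.EvenLog
    WHERE DATEADD(WEEK, WeeksToRetain, EventDate) < GETDATE()

    SET @RowsDeleted = @@ROWCOUNT

    WAITFOR DELAY '00:00:30'   -- give the log backups room between batches
END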
New monthly data is being loaded, checked and finally approved after 6 or 7 iterations. Because of this iteration the monthly data set is added, deleted, then added and deleted again a few times. Because the table is big this process takes time; any thoughts on how to make the delete/insert process faster? Keep in mind I cannot do much, because it is a production table and is being accessed by other users for other analysis.
The delete is done based on trx_date, which is a year/month combo like 201508.
The table holds monthly sales aggregated by customer.
The table structure is:
CREATE TABLE [dbo].[Sales](
    [batch_key] [int] NOT NULL,
    [Company_key] [int] NOT NULL,
    [customer_key] [char](22) NOT NULL,
    [Trx_Date] [int] NOT NULL,
    [account] [nvarchar](35) NOT NULL,
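One way to keep the production Sales table out of the 6-7 correction iterations (a sketch, with the staging table name as an assumption) is to do all of the loading and re-loading in a staging copy and only touch dbo.Sales once, in a short transaction:

-- one-time: an empty copy of the table to stage the month being approved
SELECT * INTO dbo.Sales_Staging FROM dbo.Sales WHERE 1 = 0

-- load, check and correct the new month in dbo.Sales_Staging as many times as needed, then:
BEGIN TRAN
    DELETE FROM dbo.Sales WHERE Trx_Date = 201508   -- an index on Trx_Date makes this much cheaper
    INSERT INTO dbo.Sales SELECT * FROM dbo.Sales_Staging
COMMIT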
Currently we have a database about 300GB in size. Because our backup system failed some time ago we were left with a transaction log file which grew to about 160GB. However, our backups are working again and everything is fine. My understanding is that the transaction log file is now practically empty, but the file itself remains at 160GB.
When you delete records, the deleted transactions get logged to the transaction log file. My understanding is that when a log backup runs, the space those transactions used is freed for reuse.
Could I make use of this relatively large transaction log file and start deleting records without actually adding to the transaction log file's size?
The plan is to delete records from logging tables that are not referenced by any other table, without this increasing the transaction log file. For example, over a period of a few weeks we could delete a chunk of records from a table, then after a backup has completed delete another chunk, and so on until we have got the table down to the records we actually need. Will this work?
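That generally works, as long as the log is backed up between chunks so the space inside the 160GB file is reused instead of the file growing. A sketch of one cycle, with placeholder names and assuming SQL 2005 or later for DELETE TOP:

DBCC SQLPERF(LOGSPACE)   -- shows Log Size (MB) and Log Space Used (%) for each database

DELETE TOP (500000) FROM dbo.LoggingTable
WHERE LogDate < '20140101'

BACKUP LOG MyDatabase TO DISK = 'E:\Backups\MyDatabase_log.trn'   -- frees the used portion of the log for the next chunk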
I have been using Master Data Services for a couple of months now. I can load, update, merge and soft delete data in MDS. Occasionally we even have to hard delete data from MDS. If we keep soft deleting records in an MDS table, eventually there will be a huge number of soft-deleted records. Is there an easy way to hard delete all the soft-deleted records from all MDS tables in a specific Model?
I am studying indexes and keys. I have a table whose first column is loaded with fixed-width data, which is later parsed in a view according to the data types in the fixed-width specification.
Example: column A is one large varchar string (name, phone, house, cost of house, zipcode, county, state, country) that a view will later split apart; column B is the source filename of the data load (varchar(256)).
a. Would there be a benefit to adding a clustered or nonclustered index (if so, which, and a pointer on why)?
b. Is there a benefit to making one of these two columns a primary key (millions of records), or to adding a third new column as the PK?
c. The view parses the data in column A so it ends up looking more like "name phone house cost of house zipcode county state country", each having its own column.
- Any pros/cons of adding indexes (if so, which) to the view instead of the table, or to both, once the data is parsed?
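As a rough illustration of options a and b, with placeholder names (a narrow surrogate key is usually a better clustered key than the wide varchar column):

ALTER TABLE dbo.RawLoad ADD LoadID int IDENTITY(1,1) NOT NULL

ALTER TABLE dbo.RawLoad
    ADD CONSTRAINT PK_RawLoad PRIMARY KEY CLUSTERED (LoadID)

-- column B (the source filename) is a common filter, so a nonclustered index there can help
CREATE NONCLUSTERED INDEX IX_RawLoad_SourceFile ON dbo.RawLoad (SourceFileName)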
Hello guys, here is my problem: I am developing an ASP.NET web app in .NET 2.0. I have some sensitive data in my database which is encrypted using DES (with a key known only to the top-level authorities). Now there is an option to change the secret key. On changing the key, the sensitive data has to be decrypted using the old key and then re-encrypted using the new key. If the number of records increases, I am afraid this might take a long time and the application might look as if it has hung. Guys, I have no clue how to do this. If you have any idea on how to implement this, please let me know. Any help would be appreciated. Vignesh
Hi, I'm currently trying to retrieve results from a large dataset; there are over 45,000 records and I need to use them all to perform counts etc. I have set up views, but my page is still returned slowly. Is there anything I can do to speed this up? Thanks, Gemma
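If the counts are always grouped the same way, one option is an indexed view so the totals are maintained ahead of time rather than recounted per page view. A sketch with assumed table and column names; the view must be schema-bound and include COUNT_BIG(*):

CREATE VIEW dbo.vw_RecordCounts
WITH SCHEMABINDING
AS
SELECT CategoryID, COUNT_BIG(*) AS RecordCount
FROM dbo.Records
GROUP BY CategoryID
GO
CREATE UNIQUE CLUSTERED INDEX IX_vw_RecordCounts ON dbo.vw_RecordCounts (CategoryID)
GO
-- on non-Enterprise editions, query the view WITH (NOEXPAND) so the index is used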
I am trying to import data into SQL Server 7. The table will be 700-800 columns, and the data will be about 150,000 records at a time. The data source is a flat file.
First I create the table using a database schema, and secondly I would like to populate it. The problem is that most of the data is numeric, to be used for statistical analysis.
So far I have tried BULK INSERT, bcp, and DTS. DTS is the only method that has worked in any way, shape or form, but it requires importing each column as a varchar. Importing into my pre-created table doesn't work, because it interprets some of the source columns as character data and refuses to insert them into an int field. BULK INSERT and bcp both give error messages, and I am wondering if that is because of the size of the insert statement required to handle so many fields.
For the moment I am just trying to import the data in any way, but eventually, it will have to be run as an automated process, with the table structure probably needing to be altered as well.
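One pattern that usually gets around the type-mismatch problem is to land everything into an all-varchar staging table and convert in a second step. A sketch; the file path, delimiters and column names are assumptions:

CREATE TABLE dbo.Staging_Flat (
    Col001 varchar(50),
    Col002 varchar(50)
    -- ... one varchar column per field in the flat file
)

BULK INSERT dbo.Staging_Flat
FROM 'C:\loads\statdata.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

INSERT INTO dbo.StatData (Col001, Col002)
SELECT CAST(NULLIF(LTRIM(RTRIM(Col001)), '') AS int),
       CAST(NULLIF(LTRIM(RTRIM(Col002)), '') AS float)
       -- ... one CAST per column; bad values surface here instead of failing the whole load
FROM dbo.Staging_Flat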
Any help/suggestions would be very gratefully received.
I have a web site that allows user to enter large strings into a database (comments, etc). What is the best way to do that? Right now I have them limited to 25 characters and the data type is varchar. Is there a better way?
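If the server is SQL 2005 or later, a varchar(max) column removes the 8000-character ceiling without the drawbacks of the old text type. A minimal sketch with assumed names:

CREATE TABLE dbo.Comments (
    CommentID   int IDENTITY(1,1) PRIMARY KEY,
    CommentText varchar(max) NOT NULL   -- use nvarchar(max) if Unicode input is possible
)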
When should you use a table variable versus a temp table? I told the interviewer that when the number of rows is small, like hundreds or a thousand, use a table variable, otherwise use a temp table. He then asked what I meant by "less data": even a small number of rows may involve many columns and make a huge data set.
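A quick illustration of the practical difference beyond row counts: temp tables can take additional indexes and have statistics, while table variables cannot (no CREATE INDEX, and no inline index syntax before SQL 2014), so the optimizer tends to treat them as tiny:

CREATE TABLE #Orders (OrderID int PRIMARY KEY, CustomerID int)
CREATE INDEX IX_Orders_Customer ON #Orders (CustomerID)   -- allowed on a temp table

DECLARE @Orders TABLE (OrderID int PRIMARY KEY, CustomerID int)
-- no CREATE INDEX and no statistics here, so large row counts tend to get poor plans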
I want to build a system that will have about 1 million rows in a table in a SQL Server database. I am using this for a web application and accessing it via a JDBC type 4 driver, but displaying only 20 records at a time using pagination (as in Google). What will be the best way to go about this?
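On SQL Server 2005 or later, ROW_NUMBER() gives a server-side page so only 20 rows cross the wire per request. A sketch; @PageNumber and @PageSize would come in as JDBC parameters, and the table name and ORDER BY column are assumptions:

DECLARE @PageNumber int, @PageSize int
SELECT @PageNumber = 1, @PageSize = 20

SELECT *
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY ID) AS rn, *
    FROM dbo.MyTable
) AS paged
WHERE rn BETWEEN (@PageNumber - 1) * @PageSize + 1
             AND @PageNumber * @PageSize
ORDER BY rn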
I have a database that is 70GB. One of the tables has over 350 million rows of data, and I need to delete about 1/3 of the data in that one table. I was going to use a simple delete command to remove the unnecessary data, something like: Delete from tablename where columname > '100'. However, it takes SO LONG to run, over 8 hours, if not longer. What is the best way to delete this data? One idea was to create a temp table, move the needed data to the temp table, truncate the table, and move that data back. I'm not very good with the SQL query language, so can someone give me an example of how to do this? Or if you have a different idea that would be faster, please let me know. Thanks, Sam
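A hedged sketch of the copy-and-swap idea, using the names from the post and assuming the rows to keep are the ones where columname <= '100'; copying the rows you keep avoids logging every deleted row individually:

-- copy the rows you want to KEEP into a new table (minimally logged in simple or bulk-logged recovery)
SELECT *
INTO dbo.tablename_keep
FROM dbo.tablename
WHERE columname <= '100'

-- verify the row count, then swap the tables
DROP TABLE dbo.tablename            -- or TRUNCATE TABLE dbo.tablename and insert the rows back
EXEC sp_rename 'dbo.tablename_keep', 'tablename'

-- recreate indexes, constraints and permissions on the new table afterwards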
I want to store some binary things (pictures and so on), so I created a table which contains a varbinary column.
But: 1. I used OPENROWSET to insert the large file into this table. 2. I used master..xp_cmdshell to retrieve the data back out as a file. One strange thing happened: the sizes of the input and output are different (the output is 1KB bigger than the input file),
and it seems that the file is broken, with a different file format...
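For reference, the insert side looks like this (a sketch; the path and names are placeholders). As for the extra bytes on the way out, one possible cause is the length-prefix/format bytes that bcp adds when exporting in native mode; exporting with a format file that sets the prefix length to 0 for the varbinary column is worth trying:

INSERT INTO dbo.BlobTable (Data)
SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\temp\input.jpg', SINGLE_BLOB) AS f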
Hi all, how do I input a large text page (from Notepad) into a SQL column, or assign a pointer to the data? I've tried to use BOL (WRITETEXT) to no avail; I guess I'm missing something. I'm just using EM and Query Analyzer. I thought this should be easy. Image data should work the same way.
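On SQL 2000-era text/image columns, the pointer comes from TEXTPTR() and WRITETEXT then writes through it. A minimal sketch with placeholder names; the column has to be initialised (non-NULL) before TEXTPTR returns a usable pointer:

UPDATE dbo.Docs SET LargeTextCol = '' WHERE DocID = 1   -- initialise so a text pointer exists

DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(LargeTextCol) FROM dbo.Docs WHERE DocID = 1

-- WITH LOG avoids needing the 'select into/bulkcopy' database option
WRITETEXT dbo.Docs.LargeTextCol @ptr WITH LOG 'contents of the notepad file go here'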
The purchased application on MSSQL 6.5, SP4 that I am working on has one large table of 13 million rows. It is the largest table, considering the next largest table is only 1.75 million rows.
The vendor has recommended a change to this largest table: changing a data type from char to varchar. To make this change easier to do, I want to "archive" older data not necessary for the current year or current processing into another table.
What is the best way to do this archiving?
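The usual shape is an insert into an archive table followed by the matching delete (a sketch; table, column and cut-off values are placeholders); on 6.5, SET ROWCOUNT can be used to slice the delete into smaller pieces if the log is tight:

INSERT INTO dbo.BigTable_Archive
SELECT * FROM dbo.BigTable
WHERE TranDate < '19990101'

DELETE dbo.BigTable
WHERE TranDate < '19990101'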
Any information you can provide will be greatly appreciated. Thanks.
I have a table which basically stores multiple users' responses to a questionnaire. I want to calculate certain statistics on this data (for example: how many users selected a specific answer to a question). If there are many questions and possible answers, then this can get really inefficient. I was wondering what would be the best way to go about doing this.
Currently, I was thinking of using what I believe are called crosstabs:
Code:
SELECT
    (SELECT COUNT(*) FROM tableName WHERE Q1answer = 'value1'),
    (SELECT COUNT(*) FROM tableName WHERE Q1answer = 'value2'),
    (SELECT COUNT(*) FROM tableName WHERE Q2answer = 'value1'),
    etc...
Is this the best way to go about this or is this really inefficient?
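A single pass with conditional aggregation usually scales better than one subquery per answer, since the table is only scanned once (same tableName and column names as above):

SELECT
    SUM(CASE WHEN Q1answer = 'value1' THEN 1 ELSE 0 END) AS Q1_value1,
    SUM(CASE WHEN Q1answer = 'value2' THEN 1 ELSE 0 END) AS Q1_value2,
    SUM(CASE WHEN Q2answer = 'value1' THEN 1 ELSE 0 END) AS Q2_value1
FROM tableName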
I have run a select query which returned one row. There is one column in it which holds a large amount of data. I want to copy the complete content of that column (varchar(max)), but I am unable to do it. It's not XML data, and I don't want to do any conversions.
I have just created a test database and now need to insert a large number of records into one of the tables. We were thinking of about 1 million records. Has anyone got a SQL script that I could use to create these records?
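One quick way (a sketch, assuming SQL 2005 or later and a hypothetical target table) is to generate a million sequential numbers by cross-joining catalog views and insert from that:

SELECT TOP (1000000) IDENTITY(int, 1, 1) AS n
INTO #Numbers
FROM sys.all_objects a
CROSS JOIN sys.all_objects b

INSERT INTO dbo.TestTable (SomeInt, SomeText)
SELECT n, 'test row ' + CAST(n AS varchar(10))
FROM #Numbers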
Hi, I want to delete all the records from a large database with 200-300 tables, because I want to make some changes and start from scratch, but keep the structure of the database (keys, indexes, etc.). I tried to generate a script, but when I run it I get too many errors. Please help, thanks.
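One commonly used shortcut is the undocumented sp_MSforeachtable procedure, which runs a statement against every user table; it is unsupported, so it is worth testing on a restored copy first:

EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'   -- disable FKs so delete order doesn't matter
EXEC sp_MSforeachtable 'DELETE FROM ?'
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'
-- TRUNCATE TABLE is faster and logs less, but fails on tables referenced by foreign keys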
I was wondering if anyone could help me. I need to store large amounts of data in my database; at present I have the column set to nvarchar(8000). I've looked around and noticed you can use text, which stores far more, but is slow when displaying the information.
Any ideas or points in the right directions would be great.
Does anyone have ideas on the best way to move large amounts of data between tables? I am doing several simple insert/select statements from a staging table to several holding tables, but because of the volume it is taking an extraordinary amount of time. I considered using cursors but have read that may not be the best thing for this situation. Any thoughts?
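Cursors would generally make this slower, not faster; what usually helps is splitting the insert/select into keyed ranges so each transaction stays small. A sketch with placeholder table, column and batch-size values:

DECLARE @MinID int, @MaxID int, @BatchSize int
SELECT @MinID = MIN(ID), @MaxID = MAX(ID) FROM dbo.Staging
SET @BatchSize = 50000

WHILE @MinID <= @MaxID
BEGIN
    INSERT INTO dbo.Holding (ID, Col1, Col2)
    SELECT ID, Col1, Col2
    FROM dbo.Staging
    WHERE ID >= @MinID AND ID < @MinID + @BatchSize

    SET @MinID = @MinID + @BatchSize
END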
This is a general question on the best way to import a large amount of data into a MS SQL database. I can have the data in just about any format I need; I just don't know how to import it. I have some experience with SQL, but not much. There are about 1,500 to 2,000 lines of data, and I am looking for the best way to get this amount of data in on a monthly basis. Any help is greatly appreciated! Mike Charney
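For a file of a couple of thousand rows a month, BULK INSERT from a delimited file is usually the simplest route. A sketch; the path, table name and delimiters are assumptions:

BULK INSERT dbo.MonthlyData
FROM 'C:\loads\monthly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)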
Brief background: We are using SQL Server 2000, and one of the tables stores user session details (each time a user logs into our system we insert a new record in the session table, and each time a user logs out we insert another record in the same table). SESSION_ID is the primary key and it is the clustered index. The system produces 5 million session records per day. The problem: Each day we transfer the session data (delta only) to another machine and then want to delete the bulk of ~5 million sessions. This should happen without interfering with our customers' activity (at the same time we must not block the table - new sessions should still be created). What is the best way to perform such a task?
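Since SESSION_ID is the clustered key and new sessions get higher values, one sketch (SQL 2000 syntax, placeholder names and values) is to delete everything up to the highest ID already transferred, in small batches, so locks stay short and the new inserts at the top of the index are never touched:

DECLARE @MaxTransferredID int
SET @MaxTransferredID = 123456789        -- highest SESSION_ID already copied to the other machine

SET ROWCOUNT 10000                       -- small batches keep locks short and avoid table-level escalation
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.Sessions
    WHERE SESSION_ID <= @MaxTransferredID

    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0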
We are looking to store a large amount of user data that will be changed and accessed daily by a large number of people. We expect around 6-8 million subscribers to our service, with each record being approximately 2000-2500 bytes. The system needs to be running 24/7 and therefore cannot be shut down. What is the best way to implement this? We were thinking of setting up a cluster of servers to hold the information and another cluster to back it up. Is this practical? Also, what software is available that can distribute query calls across different servers and manage large numbers of query requests? Thank you in advance. Ben
Basically we have two large databases, one of which is updated from the other monthly.
For explanation purposes:
DB1 = Source DB
DB2 = Destination DB
The problem I require a solution to is: how do I insert rows from a table in DB1 into DB2 and recover and store the identity of each new row against the ID of the existing row? This is so that I can then maintain constraints when it comes to inserting rows into the next table, and the next, and so on.
This process of storing the IDs as lookups will need to be done for almost every table, of which there are 20.
The best idea we have at the minute is to create a table with two columns for each table (dropping and recreating it after each table has been exported) that contains the two IDs, new and old.
This will require using a cursor over each row in the existing table, inserting it into the new table, then using SCOPE_IDENTITY() to get the new ID and inserting the two values into the temp table.
This feels to me like it will be very slow, particularly when I bear in mind how much data we have.
Does anyone have any better ideas? (Sorry if the explanation isn't great; it's difficult to get across.)
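If both databases are on SQL 2008 or later, a set-based alternative to the cursor is MERGE with an OUTPUT clause, which can capture the source row's old ID next to the newly generated identity in a single pass. A sketch with made-up table and column names:

DECLARE @IDMap TABLE (OldID int, NewID int)

MERGE INTO DB2.dbo.Customers AS target
USING DB1.dbo.Customers AS source
    ON 1 = 0                                    -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (CustomerName, Address)
    VALUES (source.CustomerName, source.Address)
OUTPUT source.CustomerID, inserted.CustomerID   -- old ID alongside the new identity
INTO @IDMap (OldID, NewID);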
We can pass XML to the XML Source in a variable, but I haven't seen anywhere how much data can be passed this way? Is there a limit beyond the limits of system memory?
Also, what data types are valid for the variable? Just String?
I have a dataset with 300,000 records and I'm getting the following error with MS Reporting Services: "An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.
I need to be able to graph roughly 150 employees per supervisor and their monthly cell phone usage in minutes. I understand that I will need to group this, say one graph for every ten employees, so it doesn't look messy and cluttered. I have read some threads here but they don't seem to work for me.
So again, each supervisor has 100+ subordinates and I need to graph their phone usage by month.
p.s. My email was incorrect in the last mail. Hi all, is there a SQL 2K thread? I am interested in finding out the largest database size of a SQL Server database people have worked with. We have a 1.2 terabyte DB with about 150-200 million new rows being processed every day. I would like to share some thoughts on this with other people who are working with this much data and what they are doing with it.
bhala ---------------------------------------- Please check us out at: http://www.bivision.org/bivision