Problem On Huge Table.

Mar 14, 2001

We have a huge table with 12 million records, and when I run the following script it takes 50 hours. Can anyone help? Thanks.

update c
set [In] = e.[In], EA = e.ea, We = e.we
from TableA c, TableB e
where c.code = e.code

TableA 12,000,000 records.
TableB 750,000 records.
Both tables have a clustered index on their code field.
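
For what it's worth, a common suggestion for an update of this size is to target the alias in the UPDATE and batch the work by ranges of the clustered key, so each transaction stays small and the log can be reused between batches. A rough sketch only, assuming code is a numeric key (adjust the stride and data type to your data):

DECLARE @lo int, @hi int, @step int
SELECT @lo = MIN(code), @hi = MAX(code) FROM TableA
SET @step = 50000   -- hypothetical batch size

WHILE @lo <= @hi
BEGIN
    UPDATE c
    SET    [In] = e.[In], EA = e.ea, We = e.we
    FROM   TableA c
    JOIN   TableB e ON c.code = e.code
    WHERE  c.code >= @lo AND c.code < @lo + @step   -- range seek on both clustered indexes

    SET @lo = @lo + @step
END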

View 6 Replies


Huge Deletes In A Huge Table

Apr 3, 2000

SQL 7 SP1 NT4 SP5

I have a TRANSACTION table with 150 million rows.

I have a USER table.

Each user has about 600 records in the TRANSACTION table.

The TRANSACTION clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.

The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.

I want to go through the USER table and delete all users who have not visited me in a while.

I want to do this without substantially hindering performance in the production environment. I can spread the work over a week or two if needed.

The best way I thought of doing this was to grab x amount of users in a cursor and loop through deleting their corresponding TRANSACTION records.

Does anyone have any ideas on a better way? What is going to happen to my indices during this time?

Thanks !!!
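
A minimal sketch of the cursor-plus-batch idea described above, assuming a hypothetical LastVisit column for the inactivity rule (everything except the TRANSACTION/USER/USERID names is made up):

DECLARE @userid int

DECLARE inactive_users CURSOR LOCAL FAST_FORWARD FOR
    SELECT USERID FROM [USER]
    WHERE  LastVisit < DATEADD(month, -6, GETDATE())   -- hypothetical "not visited in a while" test

OPEN inactive_users
FETCH NEXT FROM inactive_users INTO @userid

WHILE @@FETCH_STATUS = 0
BEGIN
    -- one short transaction per user keeps locks and log usage small
    BEGIN TRAN
        DELETE FROM [TRANSACTION] WHERE USERID = @userid   -- range seek on the clustered index
        DELETE FROM [USER] WHERE USERID = @userid
    COMMIT TRAN

    FETCH NEXT FROM inactive_users INTO @userid
END

CLOSE inactive_users
DEALLOCATE inactive_users

Because the clustered index leads on USERID, each delete is a narrow range operation; both indexes are maintained row by row as rows are removed, so expect extra log and I/O but no rebuild.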

View 3 Replies View Related

Alter A Huge Table

May 18, 2001

I need to alter a table (expand a column from varchar(10) to varchar(255)) and the table has 200 million rows.
Please suggest the best and fastest method to achieve this. The database is on SQL 7.0.


Thanks
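
For reference, widening a variable-length column is normally a metadata-only change, because existing varchar(10) values already fit in varchar(255), so the rows are not rewritten. A sketch, with hypothetical table/column names and assuming the column currently allows NULLs:

ALTER TABLE dbo.MyTable
    ALTER COLUMN MyCol varchar(255) NULL

The statement still needs a brief schema-modification lock, so run it in a quiet window; shrinking a column or changing its data type would be a much more expensive operation.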

View 1 Replies View Related

Huge Table Backup

May 13, 2008

Hello
I want to ask about backup time for a huge table (a table with terabytes of records). Can anyone help me estimate roughly how long the backup will take?
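
Backup time is mostly a question of how many allocated pages have to be read and how fast the destination can be written, so the usual advice is to time a backup and read the throughput from its completion message. A sketch with hypothetical names:

BACKUP DATABASE BigDb
TO DISK = N'E:\Backups\BigDb.bak'
WITH STATS = 10   -- prints progress at every 10 percent

The completion message reports MB/sec; dividing total database size by that figure gives a rough estimate for the full run.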

View 2 Replies View Related

Horizontal Partition Of A Huge Table

Jul 1, 2005

Hi,

I have a table with 52 million rows which resides on the Primary filegroup in my database. Because of the huge number of rows, performance has degraded badly, and I would like to break the table into parts.

Can anyone suggest the steps for doing this, and how many parts should be made? The table is named Account_Transactions and contains information about policies in an insurance database.

Rajat
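
If this is SQL Server 2005 Enterprise, table partitioning by a date or policy key is the usual route; on SQL 2000 or Standard edition, a partitioned view over several smaller tables achieves something similar. A very rough sketch of the 2005 approach, with hypothetical names, yearly boundaries, and an assumed TransactionDate column:

-- 1) define the range boundaries
CREATE PARTITION FUNCTION pf_AcctTran_Year (datetime)
AS RANGE RIGHT FOR VALUES ('2002-01-01', '2003-01-01', '2004-01-01', '2005-01-01')

-- 2) map the ranges to filegroups (PRIMARY used here only to keep the sketch short)
CREATE PARTITION SCHEME ps_AcctTran_Year
AS PARTITION pf_AcctTran_Year ALL TO ([PRIMARY])

-- 3) move the table onto the scheme by building its clustered index there
--    (if a clustered index already exists, recreate it with DROP_EXISTING = ON)
CREATE CLUSTERED INDEX cix_AcctTran_TransactionDate
ON dbo.Account_Transactions (TransactionDate)
ON ps_AcctTran_Year (TransactionDate)

Before partitioning, it is worth confirming that the slowdown is not simply a missing index, since partitioning mostly helps maintenance and partition elimination rather than point queries.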

View 2 Replies View Related

Frequency Counts From Huge Table

Mar 12, 2008

Hey guys,

I have a table with about 80 columns and 400 million records. Each column has different responses that I need frequencies for. I need counts of each response from all the columns... I have a query that does it, but it will run forever... what is the best way to do this?

My starting query:

select res, sum(cnt) from
(
select col1 res, count(*) as cnt from table1 with (nolock)
group by col1
union all
select col2 res, count(*) as cnt from table1 with (nolock)
group by col2

........................

select col80 res, count(*) as cnt from table1 with (nolock)
group by col80
)a group by res
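
One alternative sometimes suggested is to unpivot the columns so the table is scanned once instead of 80 times; a sketch, assuming SQL Server 2005 or later and that the 80 columns share a compatible data type (cast them in a derived table first if they do not):

select res, count(*) as cnt
from table1
unpivot (res for colname in (col1, col2, col3 /* , ... , col80 */)) as u
group by res

Note that UNPIVOT drops NULL values, so NULL counts per column would need separate handling if you need them.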

View 1 Replies View Related

Does DB Size Decrease When I Delete A Huge Table ??

Jan 15, 2004

Hi,
My DB size (Right click on DB Name, Data Files tab, Space Allocated field) was 10914 MB.

I deleted a huge table (1.2 million records * 15 columns).
I checked the DB size again. It didn't change.
Shouldn't it decrease because I deleted a huge table?
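
Deleting or dropping a table frees pages inside the data file, but SQL Server does not hand that space back to the operating system on its own; the file stays the same size until it is shrunk. A sketch of how to check and reclaim it, with hypothetical names:

-- how much of the file is actually used vs. reserved
EXEC sp_spaceused

-- release unused space at the end of the data file (target size in MB)
DBCC SHRINKFILE (MyDb_Data, 9000)

Shrinking fragments indexes and the freed space is often just reused by the database later, so it is usually only worth doing after a one-off purge.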

View 14 Replies View Related

Urgent! Please Help! Deleting Data From A Huge Table

Mar 16, 2004

I have a huge table with a primary key on 4 columns. I need to delete data from this table (approx. 5.6 million records to be deleted). It takes a very long time to delete with a normal query.
Can someone please suggest a better way?
Any help will be appreciated.
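
When the rows being removed are a large share of the table and a short outage is possible, one technique often suggested is to copy the rows you want to keep into a new table and swap it in, since SELECT INTO can be minimally logged under the simple or bulk-logged recovery model. A sketch only, with hypothetical names and a placeholder condition; constraints, indexes and permissions have to be re-created afterwards:

-- 1) copy only the rows to keep
SELECT *
INTO   dbo.BigTable_keep
FROM   dbo.BigTable
WHERE  KeepFlag = 1          -- placeholder for "rows NOT being deleted"

-- 2) swap the tables
DROP TABLE dbo.BigTable
EXEC sp_rename 'dbo.BigTable_keep', 'BigTable'

-- 3) re-create the primary key and other indexes on the new table

If an outage is not possible, the usual fallback is deleting in small batches inside a loop (SET ROWCOUNT on SQL 2000, DELETE TOP (n) on 2005 and later) so each transaction stays short.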

View 14 Replies View Related

How To Select Specific 2 Rows Out Of A Huge Table

Jul 24, 2013

I have a very large table, and from that table I need just 2 records: one with column1 = 'A' and one with column1 = 'B'.

I don't think I can simply use OR, IN, or CASE operators here, because I need exactly 2 records and no more.
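
If "exactly 2 records" means one row for 'A' and one row for 'B', a pair of TOP (1) lookups stops reading as soon as each value is found; a sketch with a hypothetical table name (an index on column1 is what actually makes it fast, and an ORDER BY inside each derived table pins down which row you get):

select * from (select top (1) * from BigTable where column1 = 'A') as a
union all
select * from (select top (1) * from BigTable where column1 = 'B') as b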

View 6 Replies View Related

Fastest Way To Move Huge Table Across Servers

Mar 23, 2008

Hi Guys,

What is the fastest way to move a huge table (77 million records, 25 columns) across servers? The servers are not linked, though.

Thanks for the help.
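
With no linked server, the usual fastest path is native-format bcp out on the source, copy the file across, then bulk load it on the destination with no indexes in place and the bulk-logged or simple recovery model. A sketch of the destination side, with hypothetical names (the export would be something like: bcp SourceDb.dbo.BigTable out BigTable.dat -n -S SourceServer -T):

BULK INSERT TargetDb.dbo.BigTable
FROM 'C:\temp\BigTable.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK)

Build the indexes on the target after the load rather than before it.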

View 3 Replies View Related

The Best Way To Delete A Huge Number Of Records In Table

May 30, 2008

Hi Everyone,



We have a large test database with millions of records for more than one company site code. Sometimes we want to refresh the data in that database for one or more site codes.

In order to do that, I have to delete all records for the site code we want to refresh in the test database first, then copy a new set of data over from the production database. Since we refresh data based on the site code, I have to use the DELETE command instead of TRUNCATE.

Since this is a huge database with thousands of tables and millions of records per table, I have performance issues with the DELETE command. So what would be the best way to delete a large number of records without writing all that information to the database log file?



FYI: The Recovery model of this database is Simple


Regards,



Jdang
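
Since the recovery model is SIMPLE, the log only has to hold the currently open transaction, so the standard pattern is to delete in small batches and let the log space be reused at each checkpoint; there is no fully non-logged DELETE. A per-table sketch, assuming SQL 2005 or later TOP syntax and hypothetical names:

WHILE 1 = 1
BEGIN
    DELETE TOP (50000)
    FROM dbo.SomeBigTable
    WHERE SiteCode = 'XYZ'        -- the site code being refreshed

    IF @@ROWCOUNT = 0 BREAK

    CHECKPOINT                    -- lets SIMPLE recovery reuse the log space
END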

View 9 Replies View Related

Truncate A Huge Table Having 21 Million Rows

Nov 19, 2015

SQL Server: 2008 R2

Question A: I need to truncate a table; it has 21 million rows and a size of 14 GB.

1- How do I find out whether this table is being referenced by a FOREIGN KEY?
2- Does it participate in an indexed view?
3- Is it being published using transactional replication or merge replication?

Question B: How do I safely truncate that table?
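
A sketch of the checks for Question A, assuming SQL Server 2008 R2 catalog views and a hypothetical table name dbo.BigTable:

-- 1) foreign keys in other tables that reference this one (TRUNCATE is refused if any exist)
SELECT name AS referencing_fk
FROM   sys.foreign_keys
WHERE  referenced_object_id = OBJECT_ID('dbo.BigTable')

-- 2) objects (for example indexed views) that reference the table
SELECT referencing_schema_name, referencing_entity_name
FROM   sys.dm_sql_referencing_entities('dbo.BigTable', 'OBJECT')

-- 3) replication flags on the table
SELECT name, is_published, is_merge_published, is_schema_published
FROM   sys.tables
WHERE  object_id = OBJECT_ID('dbo.BigTable')

For Question B: if all three checks come back clean, TRUNCATE TABLE can be run inside an explicit transaction (it logs the page deallocations and can be rolled back), and it releases the 14 GB almost immediately.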

View 8 Replies View Related

PULL Huge Table From SQL Server Problem

Jul 12, 2007



Hey Guys

I have met the same problem, too.

I created a table with 1,380,000 rows of data; the real DB size is about 114 MB. The primary key is nchar(6).

When I use RDA pull, I found that the primary key on the PDA disappears, so it takes a long time to get a query response.

But when I delete some rows, down to 680,000 rows of data, and pull again, the primary key can be pulled from SQL Server.

PS: I didn't change any code, just deleted some rows.

Is that a SQL Mobile bug?

PS: 1. Database and Temp Database limitations are both 384 MB.
2. If I use Query Analyzer to add the primary key, it works! So strange!!
3. The Pull process returns "S_OK".
4. After the Pull process finishes, the DB connection is still alive, so it does not seem like a timeout problem.
5. Local connection string: "Data Source='%s\%s';SSCE:Database Password='%s';SSCE:Encrypt Database='true';SSCE:Max Database Size=384;SSCE:Temp File Max size=384;SSCE:Temp File Directory=%s"



View 1 Replies View Related

Question About Log File Size When Alter Huge Table

Oct 31, 2005

I have the next question, and I would like to hear what you think about it, and whether there is a better solution for "my problem". Here is the question: I have a huge table with 60GB of data (image files). The problem always happens when I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4)... SQL Server then performs the "alter table" command... that must be something similar to "insert into a new table + drop the actual table", and for that I need about 60GB of space for my LOG file, and it takes hours to complete the operation. Is this the only way to alter a single field in my table? I would like to hear your opinions... Thanks.. Alberto

View 2 Replies View Related

Database Table Design For Huge Number Of Columns

Jan 24, 2008

Hi

I have a table (SQL Server 2000) which has 14 cost columns for each record. Now, due to a new requirement, I have 2 taxes which need to be applied using two more fields called Share1 and Share2, e.g.:
Sales tax = 10%
Use Tax = 10%
Share1 = 60%
Share2 = 40%

So Sales Tax Amt (A) = Cost1 * Share1 * Sales Tax
So Use Tax Amt (B) = Cost1 * Share2 * Use Tax

The same calculation applies to all the costs; then total cost with Sales Tax = Cost1 + A, Cost2 + A, and so on,
and total cost with Use Tax = Cost1 + B, Cost2 + B, etc.

So around 14 new fields are required to save the Sales Tax amount for each cost, another 14 to store Cost with Sales Tax, and another 14 for Cost with Use Tax. That increases the table size.
Some of these fields might be used for making reports.

I was wondering which is the better approach out of the 4 below:
1) Calculate these fields dynamically while displaying them in the user interface and do not save them in the DB (when making reports, again calculate these fields dynamically and show them), or
2) Add new formula-field columns to the database table to save each value, which would make the table bigger, but reporting becomes easier.
3) Add only those columns on which reports need to be made, and calculate the rest of the fields dynamically on screen.

4) Create a view just for reports, and calculate values dynamically in the UI, without adding any computed values to the table.



Your help is greatly appreciated.
Thanks
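
A sketch of option 4, a reporting view that also gives the computed values from options 2 and 3 without storing anything extra. All names are made up, the rates are hard-coded only to keep the sketch short (they would normally come from a rates table), and Cost2 through Cost14 would follow the same pattern:

CREATE VIEW dbo.vCostsWithTax
AS
SELECT  CostId,
        Cost1,
        Cost1 * 0.60 * 0.10           AS SalesTaxAmt1,      -- Cost1 * Share1 * Sales Tax
        Cost1 * 0.40 * 0.10           AS UseTaxAmt1,        -- Cost1 * Share2 * Use Tax
        Cost1 + (Cost1 * 0.60 * 0.10) AS Cost1WithSalesTax,
        Cost1 + (Cost1 * 0.40 * 0.10) AS Cost1WithUseTax
        -- ... repeat for Cost2 through Cost14
FROM    dbo.Costs

Reports can query the view, the base table keeps its current width, and if a calculation rule changes only the view definition has to be altered.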

View 4 Replies View Related

Transact SQL :: Append A Column To The Huge Transaction Table

Oct 18, 2015

I want to append a column to the transaction table (60 million records in it).

Our transaction table is being used in production, but I have only a very small window of time.

Rather than a long ALTER TABLE operation (if we take a backup of the table and reprocess it, it will take even more time), is there any way to append the column to the transaction table quickly?
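
For what it's worth, adding a column that allows NULLs and has no DEFAULT is a metadata-only change in SQL Server, so it completes almost instantly even on 60 million rows; it still needs a brief schema-modification lock, so run it at a quiet moment. A sketch with a hypothetical column:

ALTER TABLE dbo.TransactionTable
    ADD NewFlag tinyint NULL    -- nullable, no default: existing rows are not touched

Backfilling values into the new column can then be done separately in small batches.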

View 8 Replies View Related

SQL Server 2014 :: Huge Number Of Rows In Table Spool

Jul 6, 2015

I have a CTE query against a table with 32K rows that runs fine in 2008R2. I am running it in 2014 Std Ed. against the same data and it runs very slowly. Looking at the execution plan I think I see what's contributing to the slowness.

Note that the "actual number of rows" is some 351M...how is this possible?

the query:

declare @amts table (claim int,allowed decimal(12,2),copay decimal(12,2),deductible decimal(12,2),coins decimal(12,2));
;with unpaid (claimID) as (select claimID from claim where amt+copay + disct+mm + ded=0)
insert @amts
select lineID, sum(rc), sum(copay), sum(deduct),
case when sum(mm)>0 and (sum(mm)<sum(mmamt)) then sum(mm) else 0 end
from claimln
where status is null
and lineID not in (select claimID from unpaid)
group by lineID

it's like there's some massively recursive process going on?
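
One thing worth trying: the NOT IN against the CTE forces the unpaid set to be built and re-checked for every row, which can show up as a large table spool, and NOT IN also behaves badly if claimID can be NULL. A NOT EXISTS rewrite of the same query, as a sketch:

declare @amts table (claim int, allowed decimal(12,2), copay decimal(12,2),
                     deductible decimal(12,2), coins decimal(12,2));
insert @amts
select l.lineID, sum(l.rc), sum(l.copay), sum(l.deduct),
       case when sum(l.mm) > 0 and (sum(l.mm) < sum(l.mmamt)) then sum(l.mm) else 0 end
from claimln l
where l.status is null
  and not exists (select 1 from claim c
                  where c.claimID = l.lineID
                    and c.amt + c.copay + c.disct + c.mm + c.ded = 0)
group by l.lineID

It is also worth checking whether the regression comes from the new cardinality estimator in 2014; running the database at compatibility level 100, or the query with OPTION (QUERYTRACEON 9481), uses the old estimator and makes a useful comparison.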

View 5 Replies View Related

SQL 2012 :: Apply Primary Key On A Huge Table So Downtime Is Minimal?

Jul 30, 2015

I have a table (named table1) with 20 million rows. It takes around 11 minutes to apply the primary key to this table. There are some tables with over 100 million rows, so based on that timing, if my calculations are correct, it will take close to an hour to apply the primary key to a table with around 100 million rows.

My current solution is to create another table (named table2) with no indexes or primary keys, pump over only about 5 days' worth of data, then apply the primary key. Then a script will gradually populate table2 with the rest of the data; by gradually I mean inserting something like 100k rows per hour. Keep in mind that table2 is heavily updated with new records.
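
If this is Enterprise edition, the index behind the primary key can be built online, so the table stays readable and writable while the hour-long build runs in the background; a sketch with hypothetical table and column names:

ALTER TABLE dbo.table1
    ADD CONSTRAINT PK_table1 PRIMARY KEY CLUSTERED (id)
    WITH (ONLINE = ON, SORT_IN_TEMPDB = ON)

It does not make the build much faster, but it turns downtime into background work, which may remove the need for the staged table2 approach.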

View 2 Replies View Related

Transact SQL :: Table Locking Happening When Removing Huge Data

May 14, 2015

declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM  STG_BCDR;
while @rowcount > 0
begin
 BEGIN TRAN Deletion

[code]....

In the above code I try to delete records batch by batch to avoid table locking on the BCDR table. The total in this BCDR table is 40,000 records. However, when I look at the execution plan, the BCDR table still gets a clustered index scan, which means the locking still happens.

If I change the DELETE TOP (5000)... to DELETE TOP (5)..., then there is a clustered index seek, which is good. The problem is that each pass then only deletes the top 5 records, which means it will take a very long time to remove the data.

How do I handle this situation so that I can delete this huge amount of data without table locking happening? If table locking happens, other users will not be able to access the table at the same time.
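
One way to keep a seek while still using a useful batch size is to drive each batch by a range of the clustered key instead of TOP, so only that slice is touched and locked; a sketch, assuming the clustered key of BCDR is an increasing integer column (the column name is hypothetical):

declare @minId int, @maxId int, @batch int
select @minId = MIN(BCDR_ID), @maxId = MAX(BCDR_ID) from dbo.BCDR
set @batch = 5000

while @minId <= @maxId
begin
    begin tran
        delete from dbo.BCDR
        where BCDR_ID >= @minId
          and BCDR_ID <  @minId + @batch     -- clustered index seek on the key range
    commit tran

    set @minId = @minId + @batch
end

Also note that blocking usually comes from lock escalation (roughly 5,000 locks taken by one statement triggers an attempt to lock the whole table) rather than from the scan itself, which is why TOP (5000) sits right on the edge and smaller batches behave better.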

View 6 Replies View Related

How To Work With Huge Amount Of Records In A Table Using MSSQL Server 2000?

Dec 21, 2005

In one of our forthcoming projects, with ASP.NET/C#/MS SQL Server, we have to deal with a business table having about 15 million records. We want to know which methodologies we should adopt, from both the front-end and back-end perspectives, so the site gives optimised performance. Also, in place of a dedicated server, the hosting company provides MSDE (which comes with .NET). Will this create any problem for this project, which has such a huge table? Should we go for some advanced database technique, such as clustering, splitting tables, etc.?

Following are the fields that the business table contains:

ID, Category ID (which comes from a Category table; each business is under a category), BusinessName, SignupDate, Address1, Address2, Phone Number, Hours Of Operation, Years in Business, LicenseNumber, DiscountCoupon, Website

View 3 Replies View Related

Huge Log

May 27, 2007

Dear all,



I have problems with the log.

SQL Server has written a 4 GB log, so how can I eliminate its huge size?

View 1 Replies View Related

Huge Log Files

May 9, 2008

 I have SQL Server 2005 Express Edition with Advanced Services running on a small web server. It all runs fine, but every now and then the log files grow and grow and eventually use up all the disk space of 30GB. As a quick fix, restarting SQL a couple of times clears out the logs and everything is up and running again. Any ideas on how to stop this happening?

View 5 Replies View Related

Huge Transaction Log - Do I Need This?

Nov 19, 2004

I just peeked at my DNN setup and I found that I have a transaction log about 98 gigs large, compared to a DNN database that is only about 250 megs. Crazy, huh?

Do you happen to know what I need that transaction log for? Can I just delete it or will it break my SQL db? Is there a way that I can keep only maybe a week of transactions in it so it doesn't grow so dang large?

Thanks in advance for your response!!

Tim x 4
(always learning!)

View 2 Replies View Related

Huge Transaction Log

Aug 10, 2000

One of my production databases is currently 51 mb. The transaction log is well over 5 gig. I have tried truncating and then shrinking the log through the use of SQL utilities. This does not work! How can I quickly resolve this problem without tampering with the production environment?

Thanks in advance.
Ray Reinders

View 1 Replies View Related

Huge Transaction Log

Sep 30, 2002

I am not a DBA and I run a personal web site that has gotten pretty large. I have never done anything to maintain my SQL Server, and now my transaction log is 10 GB while my data is only about 300 MB. I am starting to get a memory leak with the SQL service. What should I do? Is it bad to have a huge transaction log? I am not familiar with any of this stuff, so someone please point me in the right direction.

View 3 Replies View Related

What Should I Do With The Huge Log File

Oct 11, 2004

Hi all,
I found my database log file is 26GB and the database file is just about 280MB. We are doing full backups every day. However, my SQL Server seems to be running very slowly now, so please advise:

1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be the reason my SQL Server is slowing down?
3. Can anyone give me pointers to learn more about the transaction log?
Thank you, much appreciated!
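
A 26 GB log next to a 280 MB data file almost always means the database is in FULL recovery but the log itself is never backed up, so it can never be truncated; full database backups alone do not truncate it. A sketch of the usual fix, with hypothetical names:

-- either start backing up the log regularly (keeps point-in-time recovery) ...
BACKUP LOG MyDb TO DISK = N'E:\Backups\MyDb_log.trn'

-- ... or, if point-in-time recovery is not needed, switch to SIMPLE recovery
ALTER DATABASE MyDb SET RECOVERY SIMPLE

-- then shrink the physical log file back to a sensible size (in MB)
DBCC SHRINKFILE (MyDb_log, 500)

The log by itself does not slow queries down, but repeated autogrow of a huge log file can cause noticeable stalls.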

View 4 Replies View Related

Log File HUGE!!

Oct 30, 2007

Hi guys, it's my first post! It's also about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005, and one of our log files is 255 GB and needs to be made smaller very fast!! We are almost out of disk space and the log is growing fast.

I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things; I am just worried and don't want to mess anything up. I know this is probably a simple task, but like I said, with the truncate command I was reading about, I don't even know where to go to type it in!!! If someone could please help, it would be much appreciated. Thanks so much.

View 14 Replies View Related

Huge Dataset, Only Need A Little

Dec 7, 2007

Hello, newbie here...

I have a huge dataset in MS SQL Server, over 11 million records of machine data taken every four seconds. I really don't need samples of this data at that interval. I'd like to run a query to retrieve my data at every 30-minute interval. I know enough SQL and DTS to SELECT what fields I want, and how to direct the output to a CSV file, but I'm not sure how to manipulate the TimeStamp field in the query to only pull data every 450 records.

TIA
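
If the goal is one value per 30-minute interval rather than literally every 450th row, grouping on the timestamp rounded down to the half hour is a common approach; a sketch, assuming a datetime column named ReadingTime and a numeric column named Value (both hypothetical):

SELECT DATEADD(minute, (DATEDIFF(minute, 0, ReadingTime) / 30) * 30, 0) AS interval_start,
       AVG(Value) AS avg_value,        -- or MIN/MAX, or whichever sample you prefer
       COUNT(*)   AS samples_in_interval
FROM   dbo.MachineData
GROUP BY DATEADD(minute, (DATEDIFF(minute, 0, ReadingTime) / 30) * 30, 0)
ORDER BY interval_start

The result can be exported to CSV with DTS or bcp queryout like any other query.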

View 20 Replies View Related

Huge Issue - Well For Me Anyway

Dec 26, 2007

Hey everyone.

I have a problem and I am sure someone here can help. Last night my DB was working fine, as it has been. This morning I got to the office and now everything has gone to hell in a handbag.

I can no longer connect to my SQL 2005 DB. I get this error when trying to place an order on our order page.

"There was a problem with the website:An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it.)
The error has been logged."

I also notice the SQL Agent will not start. Also, the former owner of the company had an eval version of SQL Management Studio that must have just expired (could this be causing it?).

Any help anyone can offer would be great.

View 12 Replies View Related

Huge .BAK FILE

Jan 28, 2008

I have a .bak file of 72 GB, but my database size is only 32 GB (I got this value from sp_spaceused).
Does anyone know why the .bak file is so big? Is it possible to reduce the size? How could I reduce it?
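
The usual reason a .bak file is larger than the database is that BACKUP appends to an existing file by default (NOINIT), so the file accumulates one backup set per run. A sketch of how to check and how to overwrite instead, with hypothetical names:

-- list every backup set stored inside the file
RESTORE HEADERONLY FROM DISK = N'E:\Backups\MyDb.bak'

-- overwrite the file with a single fresh backup instead of appending
BACKUP DATABASE MyDb TO DISK = N'E:\Backups\MyDb.bak' WITH INIT

A single backup set normally comes out smaller than the database, since only used pages plus a little log are written.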



http://www.sqlserverstudy.com

View 9 Replies View Related

Huge T-SQL - Out Of Memory

Mar 25, 2008

Hello,
I have a very big T-SQL script (~24mb) and I need the SQL Server on my hosted site to execute it.

Right now, I have a web page that uses Sql.Connection.ExecuteNonQuery () to do it, but the .NET process runs out of memory when I load the file into a C# string.

How would I go about executing this T-SQL script on the server?
Is there such a command:
EXECUTE SCRIPT "myscript.sql" FROM DISC
?

thanks.
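
Rather than loading the whole 24 MB file into a string, the usual approach is to stream it to the server with the sqlcmd (or osql) command-line tool, which executes it batch by batch; a sketch with hypothetical connection values (use -E instead of -U/-P for Windows authentication):

sqlcmd -S myserver.example.com -d MyDatabase -U myuser -P mypassword -i myscript.sql

If there is no command-line access on the hosted site, the same idea applies in code: read the file in chunks, split on the GO batch separators, and execute each batch separately instead of sending one huge command.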

View 1 Replies View Related

Huge LDF File

Nov 13, 2007



Hi
This is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file is increasing very quickly: 40 GB, 50 GB, and now 57 GB. The MDF file is around 15 MB. We created a backup and tried to restore it to another system; it asks for 57 GB of free space. How do we proceed with the file recovery? We have backups, but restoring asks for more space for the log file. How do we retrieve the data?
rgds
Pramod

View 3 Replies View Related






