Alter A Huge Table
May 18, 2001
I need to alter a table (expand a column from varchar(10) to varchar(255)) and the table has 200 million rows.
Please suggest the best and fastest method to achieve this. The database is on SQL Server 7.0.
Thanks
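A minimal sketch of the statement itself, with a hypothetical table and column name. On later versions of SQL Server, widening a varchar column is generally a metadata-only change (existing rows are not rewritten); whether 7.0 behaves the same way is worth verifying on a restored copy before touching the 200-million-row table.
ALTER TABLE dbo.BigTable
ALTER COLUMN SomeCol varchar(255) NOT NULL
-- repeat the column's current NULL / NOT NULL setting here; if it is omitted,
-- ALTER COLUMN makes the column nullable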
View 1 Replies
Oct 31, 2005
I have the following question, and I would like to hear what you think about it, and whether there is a better solution for "my problem". Here is the question: I have a huge table with 60 GB of data (image files). The problem happens whenever I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4); SQL Server then performs the ALTER TABLE as something that must be similar to "insert into the new table + drop the actual table", and for that I need about 60 GB of space for my LOG file, and it takes hours to complete the operation. Is this the only way to alter a single field in my table?? I would like to hear your opinions... Thanks, Alberto
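One commonly suggested alternative, sketched here with hypothetical column names, is to avoid the single giant ALTER by adding the new column yourself, copying values across in small batches so no single transaction has to log anywhere near 60 GB, and then dropping and renaming:
ALTER TABLE dbo.Images ADD NewCode char(4) NULL

-- copy values over in batches (assumes OldCode is NOT NULL;
-- otherwise track progress by key range instead)
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    UPDATE dbo.Images SET NewCode = OldCode WHERE NewCode IS NULL
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0

ALTER TABLE dbo.Images DROP COLUMN OldCode
EXEC sp_rename 'dbo.Images.NewCode', 'OldCode', 'COLUMN'
The log can be backed up between batches, which keeps its size bounded.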
View 2 Replies
Apr 3, 2000
SQL 7 SP1 NT4 SP5
I have a TRANSACTION table with 150 million rows.
I have a USER table.
Each user has about 600 records in the TRANSACTION table.
The TRANSACTION table's clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.
The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.
I want to go through the USER table and delete all users who have not visited me in a while.
I want to do this without substantially hindering performance in a production environment. I can perform this over a week period or two if needed.
The best way I thought of doing this was to grab x amount of users in a cursor and loop through deleting their corresponding TRANSACTION records.
Does anyone have any ideas on a better way? What is going to happen to my indexes during this time?
Thanks !!!
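One sketch of that approach, SQL 7.0/2000 style. The LastVisit column and the 18-month cutoff are assumptions standing in for "has not visited in a while"; USER and TRANSACTION are bracketed because they are reserved words. SET ROWCOUNT caps each delete at a small batch, so each transaction stays short and log backups can run between batches:
DECLARE @cutoff datetime
SELECT @cutoff = DATEADD(month, -18, GETDATE())

SET ROWCOUNT 5000
WHILE 1 = 1
BEGIN
    DELETE t
    FROM [TRANSACTION] AS t
    INNER JOIN [USER] AS u ON u.USERID = t.USERID
    WHERE u.LastVisit < @cutoff
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0

DELETE [USER] WHERE LastVisit < @cutoff
Because the clustered index leads on USERID, each batch removes contiguous ranges of the TRANSACTION table, which keeps the index maintenance localized.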
View 3 Replies
Jul 23, 2005
Hi people, I'm trying to alter an integer field to a decimal(12,4) field in MS Access 2000.
Example:
table: item_nota_fiscal_forn_setor_publico
field: qtd_mercadoria integer NOT NULL
ALTER TABLE item_nota_fiscal_forn_setor_publico
ALTER COLUMN qtd_mercadoria decimal(12,4) NOT NULL
But it doesn't work; a syntax error is raised. I need to change that field from a Visual Basic application, dynamically. How can I do it? How can I create a decimal(12,4) field via script in MS Access?
Thanks, Euler Almeida
Message posted via http://www.sqlmonster.com
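For what it's worth, the Jet/Access SQL dialect only accepts the DECIMAL type when the statement is executed through ADO (the Jet 4.0 / ANSI-92 path); sent through DAO or typed into the Access query designer, the same statement raises exactly this kind of syntax error. A sketch, assuming the statement is executed from VB via an ADO connection to the database (for example CurrentProject.Connection.Execute in Access VBA):
ALTER TABLE item_nota_fiscal_forn_setor_publico
ALTER COLUMN qtd_mercadoria DECIMAL(12,4) NOT NULL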
View 1 Replies
Jul 20, 2005
I would like to add an identity to an existing column in a table using a stored procedure, then add records to the table, and then remove the identity after the records have been added, or something similar. Here is a rough idea of what the stored procedure should do. (I do not know the syntax to accomplish this; can anyone help or explain this?) Thanks much, CBL
CREATE proc dbo.pts_ImportJobs
as
/* add identity to [BarCode Part#] */
alter table dbo.ItemTest
alter column [BarCode Part#] [int] IDENTITY(1, 1) NOT NULL
/* add records from text file here */
/* remove identity from [BarCode Part#] */
alter table dbo.ItemTest
alter column [BarCode Part#] [int] NOT NULL
return
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
Here is the original table:
CREATE TABLE [ItemTest] (
[BarCode Part#] [int] NOT NULL ,
[File Number] [nvarchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_File Number] DEFAULT (''),
[Item Number] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_Item Number] DEFAULT (''),
[Description] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_Description] DEFAULT (''),
[Room Number] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_Room Number] DEFAULT (''),
[Quantity] [int] NULL CONSTRAINT [DF_ItemTest_Quantity] DEFAULT (0),
[Label Printed Cnt] [int] NULL CONSTRAINT [DF_ItemTest_Label Printed Cnt] DEFAULT (0),
[Rework] [bit] NULL CONSTRAINT [DF_ItemTest_Rework] DEFAULT (0),
[Rework Cnt] [int] NULL CONSTRAINT [DF_ItemTest_Rework Cnt] DEFAULT (0),
[Assembly Scan Cnt] [int] NULL CONSTRAINT [DF_ItemTest_Assembly Scan Cnt] DEFAULT (0),
[BarCode Crate#] [int] NULL CONSTRAINT [DF_ItemTest_BarCode Crate#] DEFAULT (0),
[Assembly Group#] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_Assembly Group#] DEFAULT (''),
[Assembly Name] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL CONSTRAINT [DF_ItemTest_Assembly Name] DEFAULT (''),
[Import Date] [datetime] NULL CONSTRAINT [DF_ItemTest_Import Date] DEFAULT (getdate()),
CONSTRAINT [IX_ItemTest] UNIQUE NONCLUSTERED ([BarCode Part#]) ON [PRIMARY]
) ON [PRIMARY]
GO
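As a side note, SQL Server cannot turn the IDENTITY property on or off with ALTER COLUMN, which is why the syntax above has no direct equivalent. If the goal is simply to import rows that carry their own [BarCode Part#] values, a sketch of the usual workaround is to define the column as an identity once, when the table is created, and wrap the import in IDENTITY_INSERT:
SET IDENTITY_INSERT dbo.ItemTest ON

-- import the rows here, listing the columns explicitly and supplying
-- the [BarCode Part#] values from the text file

SET IDENTITY_INSERT dbo.ItemTest OFF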
View 2 Replies
Oct 8, 2007
I am using SQL Server CE. I change my tables sometimes, and I want to know how to use 'ALTER TABLE ... ALTER COLUMN ...'. For example: I have a table 'customers', and I want to delete the column 'name' and add a column 'age'. At the moment I drop the table 'customers' and create it again, but I read something about 'ALTER TABLE ... ALTER COLUMN ...'. I used this command but it does not work; I think the syntax I am using is wrong. Please help me.
my code:
Alter table customers alter column age
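For the example described (drop 'name', add 'age'), ALTER COLUMN is not the right verb: it changes an existing column's definition and always needs a data type, and in SQL Server CE it is limited to things like changing a column's default or identity settings. A sketch of what should work instead, with int chosen for 'age' purely as an assumption:
ALTER TABLE customers DROP COLUMN name
ALTER TABLE customers ADD age int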
View 7 Replies
Mar 14, 2001
We have a huge table with 12 million records, and when I ran the following script, it took 50 hours. Is there anyone who can help? Thanks.
update TableA
set In=e.In, EA=e.ea, We=e.we
from TableA c, TableB e
where c.code=e.code
TableA 12,000,000 records.
TableB 750,000 records.
Each table has a clustered index on its code field.
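If the update really has to touch all 12 million rows, one thing that often helps (a sketch only; the inequality test assumes the three columns are NOT NULL) is batching it so each transaction, and the log it generates, stays small:
SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    UPDATE c
    SET [In] = e.[In], EA = e.ea, We = e.we
    FROM TableA c
    INNER JOIN TableB e ON c.code = e.code
    WHERE c.[In] <> e.[In] OR c.EA <> e.ea OR c.We <> e.we   -- only rows that still differ
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0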
View 6 Replies
May 13, 2008
Hello
I want to ask about the backup time for a huge table (a table with many tera-records). Can anyone help me estimate, even roughly, how long it would take?
View 2 Replies
Sep 7, 2007
Hi guys,
If I have a temporary table called #CTE
With the columns
[Account]
[Name]
[RowID Table Level]
[RowID Data Level]
and I need to change the column type for the columns:
[RowID Table Level]
[RowID Data Level]
to integer, and set the column [RowID Table Level] as Identity (index) starting from 1, incrementing 1 each time.
What will be the right syntax using SQL SERVER 2000?
I am trying to solve the question in the link below:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2093921&SiteID=1
Thanks in advance,
Aldo.
I have tried the code below, but getting syntax error...
ALTER TABLE #CTE
ALTER COLUMN
[RowID Table Level] INT IDENTITY(1,1),
[RowID Data Level] INT;
I have also tried:
ALTER TABLE #CTE
MODIFY
[RowID Table Level] INT IDENTITY(1,1),
[RowID Data Level] INT;
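SQL Server 2000 cannot add the IDENTITY property to an existing column with ALTER COLUMN, and it has no MODIFY clause, which is why both attempts fail. A sketch of the usual workaround for a temp table: rebuild it with SELECT ... INTO and the IDENTITY() function, which converts the types and numbers the rows from 1 in a single pass:
SELECT [Account],
       [Name],
       IDENTITY(int, 1, 1)             AS [RowID Table Level],
       CAST([RowID Data Level] AS int) AS [RowID Data Level]
INTO   #CTE2
FROM   #CTE

DROP TABLE #CTE
-- carry on working with #CTE2; note the identity values are assigned in no
-- guaranteed order unless the rows are loaded in a defined order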
View 18 Replies
Jul 1, 2005
Hi,
I have a table with 52 million rows which resides on the PRIMARY filegroup in my database. Because of the huge number of rows, performance has degraded badly and I would like to break the table into parts.
Can anyone suggest the steps for doing this and the number of parts that should be made? The table is named Account_Transactions and contains information about policies in an insurance database.
Rajat
View 2 Replies
Mar 12, 2008
Hey guys,
I have a table with about 80 columns and 400 million records. Each column has different responses that I need frequencies for; I need counts of each response from all the columns. I have a query that does it, but it will run forever. What is the best way to do this?
My starting query:
select res, sum(cnt) from
(
select col1 res, count(*) as cnt from table1 with (nolock)
group by col1
union all
select col2 res, count(*) as cnt from table1 with (nolock)
group by col2
........................
select col80 res, count(*) as cnt from table1 with (nolock)
group by col80
)a group by res
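If this is SQL Server 2005 or later, one alternative worth trying is UNPIVOT, sketched below with only three of the 80 columns listed (extend the IN list the same way). It reads the table once instead of eighty times; the caveats are that all listed columns must share a data type (or be cast first in a derived table) and that UNPIVOT drops NULLs, whereas the original query counts them as their own group:
select res, count(*) as cnt
from table1 with (nolock)
unpivot (res for colname in (col1, col2, col80)) as u
group by res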
View 1 Replies
Jan 15, 2004
Hi,
My DB size (Right click on DB Name, Data Files tab, Space Allocated field) was 10914 MB.
I deleted a huge table (1.2 million records × 15 columns).
I checked the db size again. It didn't change.
Shouldn't it decrease because I deleted a huge table?
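Deleting or dropping a table frees space inside the data files, but the files themselves keep their size, and the Space Allocated figure reports file size. If the files actually need to shrink, a sketch (database and logical file names are placeholders):
DBCC SHRINKDATABASE (MyDatabase, 10)        -- shrink, leaving 10% free space
-- or target one file:
-- DBCC SHRINKFILE (MyDatabase_Data, 9000)  -- target size in MB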
View 14 Replies
Mar 16, 2004
I have a huge table with a primary key made up of 4 columns. I need to delete data from this table (approx. 5.6 million records to be deleted), and deleting it with a normal query takes a very long time.
Can someone please suggest me a better way?
Any help will be appreciated.
View 14 Replies
Jul 24, 2013
I have a very large table, and from that table I need just 2 records: one with column1 = 'A' and one with column1 = 'B'.
I don't think I can simply use OR, IN, or CASE operators here, because I need exactly 2 records, not more.
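One way to get exactly one row per value is a pair of TOP (1) queries combined with UNION ALL; the ORDER BY column below is a hypothetical tiebreaker that decides which row wins for each value:
SELECT * FROM (SELECT TOP (1) * FROM dbo.BigTable WHERE column1 = 'A' ORDER BY id) AS a
UNION ALL
SELECT * FROM (SELECT TOP (1) * FROM dbo.BigTable WHERE column1 = 'B' ORDER BY id) AS b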
View 6 Replies
Mar 23, 2008
Hi Guys,
What is the fastest way to move a huge table (77 million records, 25 columns) across servers? The servers are not linked, though.
Thanks for the help.
View 3 Replies
May 30, 2008
Hi Everyone,
We have a large test database with millions of records for more than one company site code. Sometimes we want to refresh the data of that database for one or more site codes.
In order to do that, I have to delete all records for the site code we want to refresh in the test database first, and then copy a new set of data over from the production database. Since we refresh data based on the site code, I have to use the DELETE command instead of TRUNCATE.
Since this is a huge database with thousands of tables and millions of records per table, I have performance issues with the DELETE command. So what would be the best way to delete a large number of records without writing any information to the database log file?
FYI: The Recovery model of this database is Simple
Regards,
Jdang
View 9 Replies
Nov 19, 2015
SQL Server: 2008 R2
Question A: I need to truncate a table; it has 21 million rows and a size of 14 GB.
1- How do I find out if this table is not being referenced by a FOREIGN KEY?
2- Does it participate in an indexed view?
3- Is it being published using transactional or merge replication?
Question B: How do I safely truncate that table?
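A sketch of the checks for Question A on 2008 R2, with the table name as a placeholder:
-- 1. Any foreign keys referencing the table? (TRUNCATE TABLE is refused if so)
SELECT fk.name, OBJECT_NAME(fk.parent_object_id) AS referencing_table
FROM sys.foreign_keys AS fk
WHERE fk.referenced_object_id = OBJECT_ID('dbo.MyBigTable');

-- 2. Objects that reference the table; any schema-bound (indexed) view will show up here
SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities('dbo.MyBigTable', 'OBJECT');

-- 3. Is the table published for replication?
SELECT name, is_published, is_merge_published
FROM sys.tables
WHERE name = 'MyBigTable';
If all three come back clean, nothing blocks TRUNCATE TABLE dbo.MyBigTable; it is still worth taking a backup first, since it deallocates all 21 million rows at once.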
View 8 Replies
Jul 12, 2007
Hey Guys
I have met the same problem, too.
I created a table with 1,380,000 rows of data; the real database size is about 114 MB. The primary key is nchar(6).
When I use RDA pull, I found that the primary key disappears on the PDA, so it took a long time to get a query response.
But when I deleted some rows, down to 680,000 rows of data, and pulled again, the primary key was pulled from SQL Server.
PS: I didn't change any code, just deleted some rows.
Is that a SQL Mobile bug??
PS: 1. Database and Temp Database limits are both 384 MB.
2. If I use Query Analyzer to add the primary key, it works! So strange!!
3. The pull process returns "S_OK".
4. After the pull process finishes, the db connection is still alive, so it does not seem to be a timeout problem.
5. Local connection string: "Data Source='%s\%s';SSCE:Database Password='%s';SSCE:Encrypt Database='true';SSCE:Max Database Size=384;SSCE:Temp File Max size=384;SSCE:Temp File Directory=%s"
View 1 Replies
Jan 24, 2008
Hi
I have a table (SQL Server 2000) which has 14 cost columns for each record, and now, due to a new requirement, I have 2 taxes which need to be applied using two more fields called Share1 and Share2,
e.g
Sales tax = 10%
Use Tax = 10%
Share1 = 60%
Share2 = 40%
So Sales tax Amt (A) = Cost1 * Share1 * Sales Tax
So Use tax Amt (B) = cost1 * share2 * Use tax
same calculation for all the costs and then total cost with Sales tax = Cost 1 + A , Cost 2 + A and so on..
and total cost with Use tax = Cost1 +B, Cost 2 +B etc.
So there are around 14 new fields required to save Sales Tax amt for each cost, another 14 new fields to store Cost with Sales Tax, Cost with Use tax. So that increases the table size.
Some of these fields might be used for making reports.
I was wondering which is a better approach out of the below 3:
1) To calculate these fields dynamically while displaying them on the User interface and not save in DB (while making reports, again calculate these fields dynamically and show), or
2) Add new formula field columns in database table to save each field, which would make the table size bigger, but reporting becomes easier.
3) Add only those columns in database on which reports needs to be made, calculate rest of the fields dynamically on screen.
Your help is greatly appreciated.
Thanks
View 3 Replies
Jan 24, 2008
Hi
I have a table (SQL Server 2000) which has 14 cost columns for each record, and now, due to a new requirement, I have 2 taxes which need to be applied using two more fields called Share1 and Share2,
e.g
Sales tax = 10%
Use Tax = 10%
Share1 = 60%
Share2 = 40%
So Sales tax Amt (A) = Cost1 * Share1 * Sales Tax
So Use tax Amt (B) = cost1 * share2 * Use tax
same calculation for all the costs and then total cost with Sales tax = Cost 1 + A , Cost 2 + A and so on..
and total cost with Use tax = Cost1 +B, Cost 2 +B etc.
So there are around 14 new fields required to save Sales Tax amt for each cost, another 14 new fields to store Cost with Sales Tax, Cost with Use tax. So that increases the table size.
Some of these fields might be used for making reports.
I was wondering which is a better approach out of the below 4:
1) To calculate these fields dynamically while displaying them on the User interface and not save in DB (while making reports, again calculate these fields dynamically and show), or
2) Add new formula field columns in database table to save each field, which would make the table size bigger, but reporting becomes easier.
3) Add only those columns in database on which reports needs to be made, calculate rest of the fields dynamically on screen.
4) Create a view just for reports, and calculate values dynamically in UI and not adding any computed values in table.
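For option 4, a minimal sketch of what such a view could look like; the table name is a placeholder, the shares and rates are hard-coded from the example above, and only the first cost column is expanded:
CREATE VIEW dbo.vCostsWithTax
AS
SELECT c.*,
       c.Cost1 * 0.60 * 0.10           AS SalesTaxAmt1,        -- Cost1 * Share1 * Sales tax
       c.Cost1 * 0.40 * 0.10           AS UseTaxAmt1,          -- Cost1 * Share2 * Use tax
       c.Cost1 + c.Cost1 * 0.60 * 0.10 AS Cost1WithSalesTax,
       c.Cost1 + c.Cost1 * 0.40 * 0.10 AS Cost1WithUseTax
FROM dbo.Costs AS c
Reports query the view, the base table keeps its current width, and nothing is stored twice.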
Your help is greatly appreciated.
Thanks
View 4 Replies
Oct 18, 2015
I want to append a column to the transaction table (60 million records in it).
Our transaction table is being used in production, and I have only a very small window of time.
Instead of ALTER TABLE (if we use ALTER, take a backup of the table, and do the processing, it will take more time), is there any other way to append the column to the transaction table?
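For what it's worth, adding a NULLable column (or, on SQL Server 2012+ Enterprise Edition, a NOT NULL column with a default) is a metadata-only change: the 60 million existing rows are not rewritten, and the statement needs only a brief schema-modification lock. A sketch with placeholder names:
ALTER TABLE dbo.TransactionTable ADD NewColumn int NULL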
View 8 Replies
Jul 6, 2015
I have a CTE query against a table with 32K rows that runs fine in 2008R2. I am running it in 2014 Std Ed. against the same data and it runs very slowly. Looking at the execution plan I think I see what's contributing to the slowness.
Note that the "actual number of rows" is some 351M...how is this possible?
the query:
declare @amts table (claim int,allowed decimal(12,2),copay decimal(12,2),deductible decimal(12,2),coins decimal(12,2));
;with unpaid (claimID) as (select claimID from claim where amt+copay + disct+mm + ded=0)
insert @amts
select lineID, sum(rc), sum(copay), sum(deduct),
case when sum(mm)>0 and (sum(mm)<sum(mmamt)) then sum(mm) else 0 end
from claimln
where status is null
and lineID not in (select claimID from unpaid)
group by lineID
it's like there's some massively recursive process going on?
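One thing worth trying is replacing the NOT IN against the CTE with NOT EXISTS, which usually gives the optimizer a straightforward anti-join instead of re-evaluating the unpaid set per row. A sketch using only the tables and columns from the query above:
declare @amts table (claim int, allowed decimal(12,2), copay decimal(12,2), deductible decimal(12,2), coins decimal(12,2));

insert @amts
select c.lineID, sum(c.rc), sum(c.copay), sum(c.deduct),
       case when sum(c.mm) > 0 and sum(c.mm) < sum(c.mmamt) then sum(c.mm) else 0 end
from claimln as c
where c.status is null
  and not exists (select 1
                  from claim as u
                  where u.claimID = c.lineID
                    and u.amt + u.copay + u.disct + u.mm + u.ded = 0)
group by c.lineID;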
View 5 Replies
Jul 30, 2015
I have a table (named table1) with 20 million rows. It takes around 11 minutes to apply the primary key to this table. There are some tables with over 100 million rows, so based on the previous time, if my calculations are correct, it will take close to an hour to apply a primary key to a table with around 100 million rows.
My current solution is to create another table (named table2) with no indexs or primary keys. Pump over only like 5 days worth of data, then apply the primary key. Then have a script that will eventually populate table2 with the rest of the data gradually. When I say gradually I mean like insert like every 100k per hour or something. Keep in mind this table2 is heavily updated with new records.
View 2 Replies
May 14, 2015
declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM STG_BCDR;
while @rowcount > 0
begin
BEGIN TRAN Deletion
[code]....
In the code above I try to delete records batch by batch to avoid table locking on the BCDR table. The total number of records in this BCDR table is 40,000. However, when I run the code and look at the execution plan, the BCDR table still gets a clustered index scan, which means the locking still happens.
If I change the DELETE TOP (5000) to DELETE TOP (5), then there is a clustered index seek, which is good. The problem is that deleting only the top 5 records each time means it will take a very long time to remove the data.
How can I handle this situation so I can delete this huge amount of data without table locking? If table locking happens, other users will not be able to access the table at the same time.
View 6 Replies
Dec 21, 2005
In one of our forthcoming projects, with ASP.NET/C#/MS SQL Server, we have to deal with a business table having about 15 million records. We want to know which methodologies we should adopt, both from the front-end and the back-end perspective, so the site gives optimised performance. Also, in place of a dedicated server, the hosting company provides MSDE (that comes with .NET). Will this create any problem for this project, which has such a huge table? Should we go for some advanced database technique, such as clustering, splitting tables, etc.?
The following are the fields that the business table contains:
ID, Category ID (which comes from a Category table; each business is under a category), BusinessName, SignupDate, Address1, Address2, Phone Number, Hours Of Operation, Years in Business, LicenseNumber, DiscountCoupon, Website
View 3 Replies
Nov 20, 2013
I have created a table as shown below. Now I want to alter the ID column to be identity(1,1) without dropping the table or losing the data.
create table dbo.IdentityTest
(
id int not null,
descript varchar(255) null,
T_date datetime not null
)
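SQL Server has no ALTER COLUMN option that turns an existing column into an identity, so the data has to be copied once. A sketch of one common pattern (the _new name is a placeholder): build a copy of the table whose id column is an identity, insert the existing rows with IDENTITY_INSERT on so the current values are preserved, then swap the names and reseed:
create table dbo.IdentityTest_new
(
    id int identity(1,1) not null,
    descript varchar(255) null,
    T_date datetime not null
);

set identity_insert dbo.IdentityTest_new on;
insert dbo.IdentityTest_new (id, descript, T_date)
select id, descript, T_date from dbo.IdentityTest;
set identity_insert dbo.IdentityTest_new off;

drop table dbo.IdentityTest;
exec sp_rename 'dbo.IdentityTest_new', 'IdentityTest';
dbcc checkident ('dbo.IdentityTest', reseed);   -- make sure numbering continues after the highest existing id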
View 7 Replies
Oct 25, 2015
We have a table in our ERP database, and we copy data from this table into another "stage" table on a nightly basis. Is there a way to dynamically alter the schema of the stage table when the source table's structure changes? In other words, if a new column is added to the source table, I would like to add that column to the stage table during the nightly refresh.
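A sketch of one way to do the nightly sync, assuming hypothetical table names dbo.SourceTable and dbo.StageTable: compare INFORMATION_SCHEMA.COLUMNS on both sides, build ALTER TABLE ... ADD statements for anything missing from the stage table, and execute them. (This simplified version ignores precision/scale and collation; a production version would carry those across too.)
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + N'ALTER TABLE dbo.StageTable ADD ' + QUOTENAME(c.COLUMN_NAME) + N' ' + c.DATA_TYPE
    + CASE WHEN c.CHARACTER_MAXIMUM_LENGTH IS NULL THEN N''
           WHEN c.CHARACTER_MAXIMUM_LENGTH = -1 THEN N'(max)'
           ELSE N'(' + CAST(c.CHARACTER_MAXIMUM_LENGTH AS nvarchar(10)) + N')' END
    + N' NULL; '
FROM INFORMATION_SCHEMA.COLUMNS AS c
WHERE c.TABLE_SCHEMA = 'dbo' AND c.TABLE_NAME = 'SourceTable'
  AND NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS AS s
                  WHERE s.TABLE_SCHEMA = 'dbo' AND s.TABLE_NAME = 'StageTable'
                    AND s.COLUMN_NAME = c.COLUMN_NAME);

EXEC sys.sp_executesql @sql;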
View 4 Replies
Jul 26, 2004
I have a contract table which has built-in foreign key constraints. How can I alter this table for delete/update cascade without recreating the table, so that whenever a studentId/contactId is modified, the change is propagated to the contract table?
Thanks
************************************************** ******
Contract table DDL is
create table contract(
contractNum int identity(1,1) primary key,
contractDate smalldatetime not null,
tuition money not null,
studentId char(4) not null foreign key references student (studentId),
contactId int not null foreign key references contact (contactId)
);
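The constraints above were declared inline without names, so SQL Server generated names for them; a sketch of the change, once those names have been looked up (EXEC sp_helpconstraint 'contract' will list them — the FK_ names below are hypothetical):
ALTER TABLE contract DROP CONSTRAINT FK_contract_student;
ALTER TABLE contract ADD CONSTRAINT FK_contract_student
    FOREIGN KEY (studentId) REFERENCES student (studentId)
    ON UPDATE CASCADE ON DELETE CASCADE;

ALTER TABLE contract DROP CONSTRAINT FK_contract_contact;
ALTER TABLE contract ADD CONSTRAINT FK_contract_contact
    FOREIGN KEY (contactId) REFERENCES contact (contactId)
    ON UPDATE CASCADE ON DELETE CASCADE;
Dropping and re-adding a foreign key does not touch the table's data, so the table itself is not recreated.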
View 3 Replies
Aug 13, 2006
In SQL Server 2005, here are two tables, created by the following SQL statements:
CREATE TABLE student(
ID CHAR(6) PRIMARY KEY,
NAME VARCHAR(10),
AGE INT
);
CREATE TABLE score(
ID CHAR(6) PRIMARY KEY,
SCORE INT,
FOREIGN KEY(ID) REFERENCES student(ID)
);
Because the length of column ID is not enough, I want to alter its length. The ALTER statement is:
ALTER TABLE student ALTER COLUMN ID CHAR(20)
Because the table student is referenced by the table score, this statement cannot alter the column of the student table, and SQL Server raises errors.
But I can manually alter the length of the column ID in SQL Server Management Studio. How can I alter the column length of the master table (student) along with the slave table (score)?
Thanks!
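The manual equivalent of what the designer lets you do is roughly: drop the dependent constraints, widen both columns, then recreate the constraints. A sketch with hypothetical constraint names; the inline PRIMARY KEY and FOREIGN KEY definitions produced system-generated names, which sp_helpconstraint will show:
ALTER TABLE score   DROP CONSTRAINT FK_score_student;   -- the FK from score.ID to student.ID
ALTER TABLE score   DROP CONSTRAINT PK_score;
ALTER TABLE student DROP CONSTRAINT PK_student;

ALTER TABLE student ALTER COLUMN ID CHAR(20) NOT NULL;
ALTER TABLE score   ALTER COLUMN ID CHAR(20) NOT NULL;

ALTER TABLE student ADD CONSTRAINT PK_student PRIMARY KEY (ID);
ALTER TABLE score   ADD CONSTRAINT PK_score   PRIMARY KEY (ID);
ALTER TABLE score   ADD CONSTRAINT FK_score_student
    FOREIGN KEY (ID) REFERENCES student (ID);
NOT NULL is repeated in the ALTER COLUMN statements because the columns must stay non-nullable to be re-used as primary keys.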
View 2 Replies
May 26, 2015
Which ALTER TABLE / ALTER COLUMN statements result in the table being rebuilt?
View 2 Replies
Jul 20, 2005
Hello! I have an MS SQL Server with a database that is used in replication. In this database there is a table with a column I want to extend: varchar(50) -> varchar(60). But I get this error (using the design window of Enterprise Manager): "Cannot drop the table 'MytableName' because it is being used for replication." Thanks for the help, Bjoern
View 1 Replies
Apr 10, 2007
A customer wants to implement table partitioning on a replicated table.
They want to hold 13 months of data in the table and roll off the earliest/oldest month to an identical archive table. The table has a date field and partitioning by month makes sense all around.
So SWITCH PARTITION is the obvious solution to this, except for the fact that the table is replicated (transactional w/no subscriber updates).
What are the architectural or practical solutions for using table partitioning together with replication?
thx
View 5 Replies
Feb 15, 2006
Is there a way to add a column to an existing table and do it all in C#? If my query string is as follows, how do I execute the query? ALTER TABLE interests ADD COLUMN Swim VARCHAR(1) NOT NULL DEFAULT('n') Thanks, MoonWa
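A note on the statement itself: T-SQL's ALTER TABLE ... ADD does not take the COLUMN keyword, and naming the default constraint makes it easier to drop later. A sketch of the statement as SQL Server would accept it (table and column names taken from the question); from C# it would typically be sent through a SqlCommand with ExecuteNonQuery:
ALTER TABLE interests ADD Swim VARCHAR(1) NOT NULL CONSTRAINT DF_interests_Swim DEFAULT ('n')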
View 2 Replies