Truncate A Huge Table With 21 Million Rows
Nov 19, 2015
SQL Server: 2008 R2
Question A: I need to truncate a table; it has 21 million rows and is 14 GB in size.
1- How do I find out whether this table is referenced by a FOREIGN KEY?
2- Does it participate in an indexed view?
3- Is it being published using transactional replication or merge replication?
Question B: How do I safely truncate that table?
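A starting point for the checks in Question A, as a sketch only (dbo.BigTable is a placeholder for the real table name):
-- 1) Foreign keys that reference the table (TRUNCATE is refused if any exist)
SELECT fk.name AS fk_name, OBJECT_NAME(fk.parent_object_id) AS referencing_table
FROM sys.foreign_keys AS fk
WHERE fk.referenced_object_id = OBJECT_ID(N'dbo.BigTable');

-- 2) Schema-bound views over the table that carry an index (i.e. indexed views)
SELECT v.name AS view_name
FROM sys.sql_expression_dependencies AS d
JOIN sys.views AS v ON v.object_id = d.referencing_id
WHERE d.referenced_id = OBJECT_ID(N'dbo.BigTable')
  AND EXISTS (SELECT 1 FROM sys.indexes AS i WHERE i.object_id = v.object_id);

-- 3) Replication flags on the table
SELECT name, is_published, is_merge_published, is_schema_published
FROM sys.tables
WHERE object_id = OBJECT_ID(N'dbo.BigTable');
If all three checks come back empty, TRUNCATE TABLE dbo.BigTable should be safe (and is minimally logged); otherwise you would have to drop the constraint/view/article first or fall back to batched DELETEs.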
View 8 Replies
Aug 10, 2006
Hi,
I have a DataTable in memory and I want to write a C# code to dump the data into a SQL database. Is there a faster way of dumping millions of rows into a SQL table besides running INSERT INTO row by row?
Thank you,
Jina
View 3 Replies
Sep 21, 2015
I have deleted nearly 30 million rows from a table. However, when I use the sp_spaceused command to check the space occupied by the table, I don't see any difference in its data size. In fact, the data size has increased by a few MB since the deletion, but not by much.
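One thing worth trying, sketched here with dbo.BigTable standing in for the actual table: sp_spaceused works from cached allocation counters, so ask it to refresh them, and remember that space left behind by deletes is only fully reclaimed after ghost cleanup or a rebuild.
EXEC sp_spaceused @objname = N'dbo.BigTable', @updateusage = N'TRUE';

-- Reclaim the space the deletes left behind (can also be done per index)
ALTER INDEX ALL ON dbo.BigTable REBUILD;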
View 8 Replies
Jul 24, 2013
I have a very large table, and from that table I need just 2 records: one with column1 = 'A' and one with column1 = 'B'.
I don't think I can simply use OR, IN, or CASE operators here, because I need exactly 2 records and no more.
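One sketch that returns at most one row per value without OR/IN over the whole set (dbo.VeryLargeTable is a placeholder; add an ORDER BY inside each TOP for a deterministic pick):
SELECT * FROM (SELECT TOP (1) * FROM dbo.VeryLargeTable WHERE column1 = 'A') AS a
UNION ALL
SELECT * FROM (SELECT TOP (1) * FROM dbo.VeryLargeTable WHERE column1 = 'B') AS b;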
View 6 Replies
Jul 6, 2015
I have a CTE query against a table with 32K rows that runs fine in 2008R2. I am running it in 2014 Std Ed. against the same data and it runs very slowly. Looking at the execution plan I think I see what's contributing to the slowness.
Note that the "actual number of rows" is some 351M...how is this possible?
the query:
declare @amts table (claim int,allowed decimal(12,2),copay decimal(12,2),deductible decimal(12,2),coins decimal(12,2));
;with unpaid (claimID) as (select claimID from claim where amt+copay + disct+mm + ded=0)
insert @amts
select lineID, sum(rc), sum(copay), sum(deduct),
case when sum(mm)>0 and (sum(mm)<sum(mmamt)) then sum(mm) else 0 end
from claimln
where status is null
and lineID not in (select claimID from unpaid)
group by lineID
It's as if there's some massively recursive process going on?
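A common cause of that kind of row explosion is the NOT IN against the CTE being re-evaluated per outer row. One rewrite to try, only a sketch reusing the table and column names from the post, is NOT EXISTS (which also behaves more predictably if claimID can be NULL):
declare @amts table (claim int, allowed decimal(12,2), copay decimal(12,2), deductible decimal(12,2), coins decimal(12,2));

with unpaid (claimID) as
    (select claimID from claim where amt + copay + disct + mm + ded = 0)
insert @amts
select c.lineID, sum(c.rc), sum(c.copay), sum(c.deduct),
       case when sum(c.mm) > 0 and sum(c.mm) < sum(c.mmamt) then sum(c.mm) else 0 end
from claimln as c
where c.status is null
  and not exists (select 1 from unpaid as u where u.claimID = c.lineID)
group by c.lineID;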
View 5 Replies
May 8, 2014
In my ETL job I would like to truncate the staging table, but before truncating it I want to make sure that all the records have been inserted into the data model. A sample is below:
create table #stg (
CreateID int,
Name nvarchar(10)
)
insert into #stg
select 1, 'a' union all
select 2, 'b' union all
[Code] ....
How can I compare these tables and make sure that all values have been loaded into the data model, so that the staging table can be truncated?
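One way to gate the truncate, sketched with dbo.FactTable standing in for the data-model table: use EXCEPT to find staging rows that have not made it across (EXCEPT also treats NULLs as equal, unlike a plain join).
IF NOT EXISTS (
    SELECT CreateID, Name FROM #stg
    EXCEPT
    SELECT CreateID, Name FROM dbo.FactTable
)
BEGIN
    TRUNCATE TABLE #stg;   -- every staging row exists in the data model
END
ELSE
BEGIN
    RAISERROR('Staging rows missing from the data model - truncate skipped.', 16, 1);
END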
View 2 Replies
Oct 22, 2015
I have a job which executes hourly.
Essentially:
Begin
Truncate table A
Insert into A
(Col1,
Col2,
Col3...
)
Select Value1,
Value2,
Value3...
From Table B
End
The insert query takes approximately 3.5 minutes to execute. What happens is that the table is truncated immediately, so there are no rows in it for those 3.5 minutes.
How can I avoid this gap, where there are no rows in the table for that period during the job execution? The table could be locked, but that doesn't seem like the best solution.
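One pattern to avoid the empty window, sketched with a hypothetical shadow table dbo.A_Staging: load the new rows off to the side, then swap the two tables in a very short transaction, so readers always see either the old or the new data.
TRUNCATE TABLE dbo.A_Staging;

INSERT INTO dbo.A_Staging (Col1, Col2, Col3)
SELECT Value1, Value2, Value3
FROM dbo.B;

BEGIN TRAN;
    EXEC sp_rename 'dbo.A', 'A_Old';
    EXEC sp_rename 'dbo.A_Staging', 'A';
    EXEC sp_rename 'dbo.A_Old', 'A_Staging';
COMMIT;
The swap only needs a brief schema lock, instead of leaving the table empty (or locked) for the full 3.5 minutes.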
View 8 Replies
May 26, 2004
hi everybody,
I have an application that generates a lot of rows, from 1 million to 2 million rows,
and I want to insert these records into MS SQL Server in a fast way.
Currently I loop through the records while they are loaded in a DataSet,
building a command text that generates an INSERT query for each row
and running it against SQL Server,
but it takes a lot of time to finish.
Is there a way to bulk insert this data?
Thanks for your help.
Bolos
View 7 Replies
Feb 21, 2008
Hi everyone - I have an ETL package which loads about 10 million rows from SQL 2005 staging tables to new, empty tables (no indexes or constraints) in another SQL 2005 DB, to be SWITCHED into the main partitioned data tables.
Both databases reside on the same SQL Server instance - it is a dev server, so the disks aren't super fast/SAN speed, but it has plenty of RAM/CPU and SCSI disks.
The insert takes about 45 minutes - can I get this working any faster, or is this typical for 10 million rows? I've messed about with the data flow a few times but I can't seem to get any significant improvements.
Any tips, anyone?
I perform several lookups on dimensions - these are not cached.
I do query the source table concurrently with different WHERE clauses & run two pipelines processing the data into 2 destination tables.
Would it be better to query the base table once and use a conditional split instead of the two separate queries?
I also multicast from each pipeline and use a UNION ALL to log some of the rows from each pipeline to another destination table.
Hope this makes sense? Any ideas or tips on how I can speed up this kind of transform would be appreciated.
I'm using OLE DB connections.
Hope this makes some kind of sense! Thanks for any advice!
Sinister Pengiun
View 5 Replies
Dec 12, 2014
I ran the following statement and it will not update beyond 7 million or so rows, and I have about 38 million to complete. I keep checking the updated row counts, and after half a day they're still the same, so I know something is wrong, because it was rolling through with no problem when I initiated it. I need to complete this ASAP, so that's adding to my frustration. The 'Acct_Num_CH' field is an encrypted field (FYI).
SET rowcount 10000
UPDATE [dbo].[CC_Info_T]
SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
WHERE [Acct_Num_CH] IS NOT NULL
WHILE @@ROWCOUNT > 0
BEGIN
SET rowcount 10000
UPDATE [dbo].[CC_Info_T]
SET [Acct_Num_CH] = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v'
WHERE [Acct_Num_CH] IS NOT NULL
END
SET rowcount 0
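One thing that stands out: the WHERE clause never excludes rows that already hold the new value, so with SET ROWCOUNT the loop can keep revisiting rows it has already touched and never make progress. A sketch of a batch loop that only touches remaining rows (UPDATE TOP is available from SQL 2005 on; the varchar(64) type is a guess at the column definition):
DECLARE @newval varchar(64) = 'ayIWt6C8sgimC6t61EJ9d8BB3+bfIZ8v';

WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) [dbo].[CC_Info_T]
    SET [Acct_Num_CH] = @newval
    WHERE [Acct_Num_CH] IS NOT NULL
      AND [Acct_Num_CH] <> @newval;    -- skip rows already converted

    IF @@ROWCOUNT = 0 BREAK;
END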
View 5 Replies
Oct 11, 2007
Hello. Currently we are in the process of implementing a SQL Server database where a couple of tables will have millions of rows (about 98 million, and growing) and a web site that will retrieve and sort the data (read only). How will an ASP.NET GridView and SqlDataReader behave in a situation like that? Will the response be very slow? Is there any alternative? Is there any example on the net?
Assuming tables are well tuned and well indexed.
Thank you in advance.
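Whatever the grid does, streaming 98 million rows to the page will be slow; the usual alternative is server-side paging so each request fetches only one page. A sketch in SQL 2005 syntax (dbo.BigTable and SomeSortColumn are placeholders):
DECLARE @PageNumber int, @PageSize int;
SET @PageNumber = 1;
SET @PageSize = 50;

WITH ordered AS (
    SELECT t.*, ROW_NUMBER() OVER (ORDER BY t.SomeSortColumn) AS rn
    FROM dbo.BigTable AS t
)
SELECT *
FROM ordered
WHERE rn BETWEEN (@PageNumber - 1) * @PageSize + 1 AND @PageNumber * @PageSize
ORDER BY rn;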
View 4 Replies
Oct 11, 2007
Hello,
Currently we are in the process of implementing a SQL Server database where a couple of tables will have millions of rows (about 98 million, and growing) and an ASP.NET site that will retrieve and sort the data.
What is the best practice for setting up the database in a situation like this one?
Do we need a cluster server? Does indexing need to be done in a special way?
Thanks in advance.
View 5 Replies
Aug 26, 2015
I have a view which is based on a couple of tables; here is its definition. What options can I use to optimize the view for better performance? This is one of the views causing issues on the database.
CREATE VIEW [dbo].[V_Reqs]
WITH SCHEMABINDING
AS
SELECT purchase.Req.RequisitionID, purchase.Req.StatusCode AS Expr2, purchase.Req.CollectionDateTime,
purchase.Req.ReportDateTime, purchase.Req.ReceivedDateTime, purchase.Req.PatientName, purchase.Req.AddressOne,
purchase.Req.AddressTwo, purchase.Req.City, purchase.Req.PostalCode, purchase.Req.PhoneNumber,
[code]....
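If the intent is for this schema-bound view to behave as an indexed (materialized) view, it is only materialized once it has a unique clustered index. A sketch, assuming RequisitionID uniquely identifies a row and the full (truncated) definition meets the indexed-view restrictions (no outer joins, etc.):
CREATE UNIQUE CLUSTERED INDEX IX_V_Reqs_RequisitionID
ON [dbo].[V_Reqs] (RequisitionID);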
View 3 Replies
Apr 22, 2006
I have a query along the same lines.
In my stored procedure I am fetching data from a table containing up to 5 or 6 million rows. I have made use of indexes in my database, but even so I can't optimize the execution time of that stored procedure.
Please help me out with this problem.
View 1 Replies
Apr 3, 2000
SQL 7 SP1 NT4 SP5
I have a TRANSACTION table with 150 million rows.
I have a USER table.
Each user has about 600 records in the TRANSACTION table.
The TRANSACTION cluster index is on USERID + RECID . The second index is on USERID + Fieldx + Fieldy.
The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.
I want to go through the USER table and delete all users who have not visited me in a while.
I want to do this without substantially hindering performance in a production environment. I can perform this over a week period or two if needed.
The best way I could think of doing this was to grab a batch of users in a cursor and loop through, deleting their corresponding TRANSACTION records.
Does anyone have any ideas on a better way? And what is going to happen to my indexes during this time?
Thanks !!!
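One sketch of a low-impact approach: delete the TRANSACTION rows in small batches first, then the USER rows, spreading the work over a week or two. LastVisitDate is a hypothetical column, and SET ROWCOUNT is used because DELETE TOP (n) does not exist on SQL 7. Since the clustered index leads on USERID, each user's rows sit together, so the batches touch fairly contiguous pages.
DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    SET ROWCOUNT 5000
    DELETE [TRANSACTION]
    FROM [TRANSACTION]
    JOIN [USER] ON [USER].USERID = [TRANSACTION].USERID
    WHERE [USER].LastVisitDate < DATEADD(month, -12, GETDATE())   -- "not visited in a while"
    SET @rows = @@ROWCOUNT
    SET ROWCOUNT 0
    WAITFOR DELAY '00:00:02'   -- throttle so production traffic keeps breathing room
END
-- then delete the USER rows themselves with a similar loop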
View 3 Replies
Apr 16, 2015
I am using SQL Server 2012 SE. I am trying to delete rows from a couple of tables (GetPersonValue has 250 million rows, of which I am trying to delete 50 million; GetPerson has 35 million rows, of which I am trying to delete 20 million). These tables are in transactional replication. The plan is to delete data older than 400 days.
I tried moving the last 400 days of data to new tables and it took about 11 hours. If I delete data in chunks of 500,000, rebuilding the indexes takes a long time (delete plus index rebuild: 13 hours). Since I am using Standard Edition, partitioning won't work.
The DDL is below:
GO
CREATE TABLE [dbo].[GetPerson](
[GetPersonId] [uniqueidentifier] NOT NULL,
[LinedActivityPersonId] [uniqueidentifier] NOT NULL,
[CTName] [nvarchar](100) NULL,
[SNum] [nvarchar](50) NULL,
[PHPrimary] [nvarchar](50) NULL,
[code]....
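A sketch of a batched, age-based delete that avoids one giant transaction; the date column name is a guess, since the posted DDL is truncated, and each batch is delivered to the subscriber as it commits:
DECLARE @cutoff datetime = DATEADD(DAY, -400, GETDATE());

WHILE 1 = 1
BEGIN
    DELETE TOP (50000)
    FROM [dbo].[GetPersonValue]
    WHERE CreatedDate < @cutoff;     -- CreatedDate is hypothetical

    IF @@ROWCOUNT = 0 BREAK;
END
-- repeat the same loop for [dbo].[GetPerson]
Rebuilding (or just reorganizing) the indexes once at the end is usually cheaper than rebuilding after every chunk.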
View 1 Replies
Aug 21, 2015
There is a stored procedure that uses a linked server. As we will be migrating to the Amazon cloud, our architect has instructed us not to replace the linked server with OPENQUERY.
View 8 Replies
Jun 21, 2007
Hi!
I am using SQL 2005 with SP1.
I want to delete 30-40 million rows from a transactional table. What's the fastest way to delete these rows? Just deleting 300,000 rows takes 30 minutes, and I don't want to truncate the table.
Any help would be appreciated.
Thanks
View 9 Replies
Mar 14, 2001
We have a huge table with 12 million records, and when I ran the following script it took 50 hours. Can anyone help? Thanks.
update TableA
set In=e.In, EA=e.ea, We=e.we
from TableA c, TableB e
where c.code=e.code
TableA has 12,000,000 records.
TableB has 750,000 records.
Both tables have a clustered index on the code field.
View 6 Replies
May 18, 2001
I need to alter a table (expand a column from varchar(10) to varchar(255)), and the table has 200 million rows.
Please suggest the best and fastest method to achieve this. The database is on SQL 7.0.
Thanks
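For what it's worth, on later versions widening a varchar column is normally a metadata-only change (no data movement); whether SQL 7.0 behaves the same is not guaranteed, so treat this as a sketch to test first, with placeholder names:
ALTER TABLE dbo.BigTable
ALTER COLUMN SomeColumn varchar(255) NOT NULL;   -- keep the column's original NULL / NOT NULL setting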
View 1 Replies
May 13, 2008
Hello
I want to ask about the backup time cost for a huge table (a table with many tera-records). Can anyone please help me estimate roughly how long it would take?
View 2 Replies
Jul 1, 2005
Hi,
I have a table with 52 million rows which resides on the PRIMARY filegroup in my database. Because of the huge number of rows, performance has degraded badly, and I would like to break the table into parts.
Can anyone suggest the steps for doing this, and how many parts should be made? The table is named Account_Transactions and contains policy information in an insurance database.
Rajat
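If the server is SQL Server 2005 Enterprise, table partitioning is the built-in way to split it; on SQL 2000 the equivalent is a partitioned view over several physical tables. A rough sketch only, with made-up boundary values and partitioning column:
CREATE PARTITION FUNCTION pf_AcctTran (int)
AS RANGE RIGHT FOR VALUES (10000000, 20000000, 30000000, 40000000);

CREATE PARTITION SCHEME ps_AcctTran
AS PARTITION pf_AcctTran ALL TO ([PRIMARY]);

-- Then rebuild / create the clustered index on the scheme, e.g.:
-- CREATE CLUSTERED INDEX IX_AcctTran ON dbo.Account_Transactions (TransactionID)
-- ON ps_AcctTran (TransactionID);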
View 2 Replies
Mar 12, 2008
Hey guys,
I have a table with about 80 columns and 400 million records. Each column has different responses that I need to get frequencies for. I need to get counts for each response across all the columns... I have a query that does it, but it will run forever... What is the best way to do this?
My starting query:
select res, sum(cnt) from
(
select col1 res, count(*) as cnt from table1 with (nolock)
group by col1
union all
select col2 res, count(*) as cnt from table1 with (nolock)
group by col2
........................
select col80 res, count(*) as cnt from table1 with (nolock)
group by col80
)a group by res
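One alternative worth testing, a sketch that scans the table once instead of 80 times; it assumes the 80 response columns share (or can be cast to) a common type, and note that UNPIVOT drops NULLs, whereas the UNION ALL version counts them as a group:
SELECT res, COUNT(*) AS cnt
FROM table1
UNPIVOT (res FOR colname IN (col1, col2, /* ...all 80 columns... */ col80)) AS u
GROUP BY res;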
View 1 Replies
Aug 23, 2006
Here is my issue: I am new to SQL Server 2005, and I am trying to take my old data, which is exported to a txt file, and import it into tables in SQL. The old database is non-relational, and I made several exports to match the way I want to build my tables. I built my packages fine and everything works until I start building relationships. If I remove my foreign key, the table with the primary key gets updated by the package again. I need to load the data into SQL daily, and once it is in, it will only be updated by the package until the database is moved over.
It will run and update with a primary key until I add a foreign key to another table.
Here is my error when running the package while table 2 has a foreign key:
[Execute SQL Task] Error: Executing the query "TRUNCATE TABLE [consumer].[dbo].[Client] " failed with the following error: "Cannot truncate table 'consumer.dbo.Client' because it is being referenced by a FOREIGN KEY constraint.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
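TRUNCATE is refused whenever any FOREIGN KEY references the table, even a disabled one, so the usual workaround is to drop the constraint in the package, truncate and reload, then recreate it. A sketch with made-up constraint, table, and column names:
ALTER TABLE dbo.ClientDetail DROP CONSTRAINT FK_ClientDetail_Client;

TRUNCATE TABLE dbo.Client;

-- ... the data flow reloads dbo.Client (and dbo.ClientDetail) here ...

ALTER TABLE dbo.ClientDetail
    ADD CONSTRAINT FK_ClientDetail_Client
    FOREIGN KEY (ClientID) REFERENCES dbo.Client (ClientID);
The simpler (slower) alternative is to swap the TRUNCATE for DELETE FROM [consumer].[dbo].[Client], which is allowed with the foreign key in place as long as no orphan rows would result.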
View 3 Replies
Jan 15, 2004
Hi,
My DB size (Right click on DB Name, Data Files tab, Space Allocated field) was 10914 MB.
I deleted a huge table (1.2 million records * 15 columns).
I checked the DB size again. It didn't change.
Shouldn't it decrease, since I deleted a huge table?
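Dropping the table frees pages inside the data file, but the file keeps its allocated size until it is shrunk, so "Space Allocated" will not move on its own. A sketch (the logical file name is a placeholder, and shrinking fragments indexes, so use it sparingly):
EXEC sp_spaceused;                       -- shows unallocated space inside the database
DBCC SHRINKFILE (N'MyDb_Data', 9000);    -- target size in MB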
View 14 Replies
Mar 16, 2004
I have a huge table with 4 primary key columns. I need to delete data from this table (approx. 5.6 million records to be deleted). It takes a very long time to delete with a normal query.
Can someone please suggest a better way?
Any help will be appreciated.
View 14 Replies
Mar 23, 2008
Hi Guys,
What is the fastest way to move a huge table (77 million records, 25 columns) across servers? The servers are not linked, though.
Thanks for the help.
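Since the servers are not linked, the usual route is bcp out on the source, copy the file, and bulk load on the destination; a sketch with placeholder paths and names (native format avoids type conversion):
-- On the source server (command line):
--   bcp SourceDb.dbo.BigTable out D:\dump\BigTable.dat -n -S SourceServer -T

-- On the destination server, into an empty copy of the table:
BULK INSERT dbo.BigTable
FROM 'D:\dump\BigTable.dat'
WITH (DATAFILETYPE = 'native', TABLOCK, BATCHSIZE = 500000);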
View 3 Replies
May 30, 2008
Hi Everyone,
We have a large test database with millions of records for more than one company site code. Sometimes we want to refresh the data in that database for one or more site codes.
To do that, I first have to delete all records for the site codes we want to refresh in the test database, then copy a new set of data over from the production database. Since we refresh data based on the site code, I have to use the DELETE command instead of TRUNCATE.
Since this is a huge database with thousands of tables and millions of records per table, I have performance issues with the DELETE command. What would be the best way to delete a large number of records without writing any information to the database log file?
FYI: The Recovery model of this database is Simple
Regards,
Jdang
View 9 Replies
Jul 12, 2007
Hey Guys
I have met the same problem, too.
I created a table with 1,380,000 rows of data; the real DB size is about 114 MB.
The primary key is nchar(6).
When I use RDA Pull, I find that the primary key disappears on the PDA, so it takes a long time to get a query response.
But when I delete rows down to 680,000 rows of data and then pull, the primary key does come across from SQL Server.
PS: I didn't change any code, just deleted some rows.
Is that a SQL Mobile bug?
PS: 1. The database and temp database limits are both 384 MB.
2. If I use Query Analyzer to add the primary key, it works! So strange!
3. The Pull process returns "S_OK".
4. After the Pull process finishes, the DB connection is still alive, so it doesn't seem like a timeout problem.
5. Local connection string: "Data Source='%s\%s';SSCE:Database Password='%s';SSCE:Encrypt Database='true';SSCE:Max Database Size=384;SSCE:Temp File Max size=384;SSCE:Temp File Directory=%s"
View 1 Replies
Jan 24, 2008
Hi
I have a table (SQL Server 2000) which has 14 cost columns for each record, and now, due to a new requirement, I have 2 taxes which need to be applied based on two more fields called Share1 and Share2.
e.g
Sales tax = 10%
Use Tax = 10%
Share1 = 60%
Share2 = 40%
So Sales Tax amount (A) = Cost1 * Share1 * Sales Tax
and Use Tax amount (B) = Cost1 * Share2 * Use Tax.
The same calculation applies for all the costs, and then total cost with Sales Tax = Cost1 + A, Cost2 + A, and so on,
and total cost with Use Tax = Cost1 + B, Cost2 + B, etc.
So around 14 new fields are required to save the Sales Tax amount for each cost, and another 14 new fields to store Cost with Sales Tax and Cost with Use Tax, which increases the table size.
Some of these fields might be used for making reports.
I was wondering which is a better approach out of the below 3:
1) To calculate these fields dynamically while displaying them on the User interface and not save in DB (while making reports, again calculate these fields dynamically and show), or
2) Add new formula field columns in database table to save each field, which would make the table size bigger, but reporting becomes easier.
3) Add only those columns in database on which reports needs to be made, calculate rest of the fields dynamically on screen.
Your help is greatly appreciated.
Thanks
View 3 Replies
Oct 31, 2005
I have the next question, and I would like to hear what you think about it, and whether there is a better solution for "my problem". Here is the question: I have a huge table with 60 GB of data (image files). The problem always happens when I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4)... SQL Server then performs the ALTER TABLE command... which must be something similar to "insert into a new table + drop the actual table", and for that I need about 60 GB of space for my LOG file, and it takes hours to complete the operation. Is this the only way to alter a single field in my table? I would like to hear your opinions... Thanks... Alberto
View 2 Replies
Jan 24, 2008
Hi
I have a table (SQL Server 2000) which has 14 cost columns for each record, and now, due to a new requirement, I have 2 taxes which need to be applied based on two more fields called Share1 and Share2.
e.g
Sales tax = 10%
Use Tax = 10%
Share1 = 60%
Share2 = 40%
So Sales Tax amount (A) = Cost1 * Share1 * Sales Tax
and Use Tax amount (B) = Cost1 * Share2 * Use Tax.
The same calculation applies for all the costs, and then total cost with Sales Tax = Cost1 + A, Cost2 + A, and so on,
and total cost with Use Tax = Cost1 + B, Cost2 + B, etc.
So around 14 new fields are required to save the Sales Tax amount for each cost, and another 14 new fields to store Cost with Sales Tax and Cost with Use Tax, which increases the table size.
Some of these fields might be used for making reports.
I was wondering which is a better approach out of the below 4:
1) To calculate these fields dynamically while displaying them on the User interface and not save in DB (while making reports, again calculate these fields dynamically and show), or
2) Add new formula field columns in database table to save each field, which would make the table size bigger, but reporting becomes easier.
3) Add only those columns in database on which reports needs to be made, calculate rest of the fields dynamically on screen.
4) Create a view just for reports, and calculate values dynamically in UI and not adding any computed values in table.
Your help is greatly appreciated.
Thanks
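For whatever it's worth, option 4 can stay very simple: the amounts are pure arithmetic over existing columns, so a view (or computed columns) avoids storing 28 extra fields. A sketch with illustrative table and column names and the rates hard-coded from the example:
CREATE VIEW dbo.CostsWithTax
AS
SELECT  CostID,
        Cost1,
        Cost1 * 0.60 * 0.10         AS SalesTaxAmt1,      -- Cost1 * Share1 * SalesTax
        Cost1 * 0.40 * 0.10         AS UseTaxAmt1,        -- Cost1 * Share2 * UseTax
        Cost1 + Cost1 * 0.60 * 0.10 AS Cost1WithSalesTax,
        Cost1 + Cost1 * 0.40 * 0.10 AS Cost1WithUseTax
        -- ... repeat for Cost2 through Cost14 ...
FROM dbo.Costs;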
View 4 Replies
Oct 18, 2015
I want to append a column to the transaction table (it has 60 million records in it).
Our transaction table is being used in production, but I have only a very small window of time.
Instead of ALTER TABLE (if we use ALTER and take a backup of the table while doing the processing, it will take more time), is there any way to append the column to the transaction table?
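If the new column can be NULLable (or NOT NULL with a default, on Enterprise 2012+), ALTER TABLE ... ADD is a metadata-only change and finishes almost instantly even on 60 million rows; a sketch with placeholder names:
ALTER TABLE dbo.TransactionTable
ADD NewColumn int NULL;     -- metadata only: no data pages are rewritten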
View 8 Replies