Large Table Loads & Transaction Log Issues

Apr 7, 2004

I have a table that's about 3 GB; using this table and a few others, I'm building another table. The problem is that while building the new table, my transaction log inflates so much that I'm running out of disk space. What can I do to prevent this, or to keep the transaction log size under control?
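For what it's worth, two common ways to keep the log small during a build like this (a sketch only; the database, table, and column names below are made up): use SELECT INTO, which is minimally logged under the SIMPLE or BULK_LOGGED recovery model, or load in batches with log backups in between so the same log space is reused.

-- Option 1: minimally logged build (SIMPLE or BULK_LOGGED recovery)
SELECT a.id, a.col1, b.col2
INTO dbo.NewTable
FROM dbo.BigTable a
JOIN dbo.OtherTable b ON b.id = a.id

-- Option 2: batch the load so log space is reused between batches
-- (assumes dbo.NewTable was created empty beforehand)
WHILE 1 = 1
BEGIN
    INSERT INTO dbo.NewTable (id, col1, col2)
    SELECT TOP 100000 a.id, a.col1, b.col2
    FROM dbo.BigTable a
    JOIN dbo.OtherTable b ON b.id = a.id
    LEFT JOIN dbo.NewTable n ON n.id = a.id
    WHERE n.id IS NULL
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG MyDb TO DISK = 'D:\backup\MyDb_log.bak'  -- frees committed log space for reuse
END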

View 4 Replies



Table Loads From DB2 - Parallel Processes, Linked Server, And Lightweight Pooling Issue

Aug 17, 2007

Hello. I have a 32-bit SQL 2005 (SP1) server that is my current production server. I also have a 64-bit SQL 2005 (SP1) server that will become my production server. I have several SSIS packages that load/refresh data nightly in a few of my databases from DB2 (MVS). The packages are set up on the current production server (32-bit) and all is working well through the Microsoft OLE DB Provider for DB2. However, on the 64-bit server the SSIS packages are failing on large loads: loads of 500K rows or fewer run without issue (through SQL Agent jobs), but larger table loads fail.

I do have a linked server set up on the 64-bit server pointing to the 32-bit server, for other processes, and because of that I have lightweight pooling turned off on the 64-bit server (it conflicts with distributed querying). Lightweight pooling is turned on on the 32-bit server. Could this be part of what is causing my issue? With lightweight pooling off on the 64-bit server, am I not getting the proper throughput for my eight dual-core-CPU server?
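For what it's worth, the setting is easy to compare on the two servers (this just inspects it, it isn't a fix):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'lightweight pooling'  -- run_value 1 = fiber mode enabled

Note that distributed queries and linked servers are not supported in fiber mode, which is why the option has to stay off on the 64-bit server; in most workloads leaving lightweight pooling off costs very little, so the failures on the large loads probably stem from something else (for example, the 64-bit OLE DB provider for DB2 or memory settings).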

Thanks
Scottye

View 1 Replies View Related

Transaction Log Too LARGE

Jan 30, 2007

In my job I often come across the following scenario:

A client rings up and says they have run out of server space because the SQL 2000 transaction log has consumed all of it, or a very large portion of it.

What is the correct procedure for resolving this ASAP when working with full-recovery-mode SQL 2000 databases? I ask because the other guys within my company all do different things, sometimes with good results and sometimes with bad ones.

The procedure I use is the following:

METHOD 1

1. Back up both the database and the transaction log.
2. Right-click the database and select Detach, which from my understanding is a clean detach method that ensures committed transactions are written to the database.
3. Rename the old transaction log to .ldfOLD.
4. Reattach the database, which creates a new transaction log.

METHOD 2

1. I don't use this method, but I'm pretty sure it's risky; if possible, can someone give me the reasons why:

1. Change the database recovery model from full to simple, shrink the logs, and then change back to full.

Basically, what I am asking is: what is the fastest way to sort out the above issue most effectively, with the ability to roll back successfully?

PLEASE don't comment on why the transaction log is so big, as I don't want to look into that right now. All I am asking for is the most effective method to shrink the log down and save space.
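For reference, the sequence most often recommended instead of either method above (a sketch; the database name and the logical log file name, which sp_helpfile reports, are placeholders):

BACKUP LOG MyDb TO DISK = 'D:\backup\MyDb_log.bak'  -- truncates the inactive portion
DBCC SHRINKFILE (MyDb_Log, 500)                     -- then shrink the file, here to 500 MB
-- On SQL 2000, if no log backup is wanted:
-- BACKUP LOG MyDb WITH TRUNCATE_ONLY
-- (this breaks the log backup chain, so take a full backup immediately afterwards)

This avoids both the detach (the database is briefly unavailable and unprotected) and the switch to simple recovery, which also breaks the log backup chain and therefore point-in-time recovery; that loss of rollback capability is presumably the risk being asked about in METHOD 2.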

View 13 Replies View Related

SQL Server Large Transaction Log

Feb 3, 2006

I'm working with a database that has a relatively small amount of data; the data file is in the 100-110 MB range. What is slightly troubling and baffling is that the transaction log for this same database is significantly larger than the data file, at about 2.5 GB. What could be happening that causes the transaction log to grow larger than the actual database?
Thx
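A quick way to see whether that 2.5 GB is actually in use (no changes made):

DBCC SQLPERF(LOGSPACE)  -- log size and percent used, one row per database

If the database is in the FULL recovery model and transaction log backups are never taken, the log is never truncated and grows without bound; that is the most common cause of a log dwarfing its data file.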

View 1 Replies View Related

Transaction Log Space Too Large

Feb 24, 2000

I am new to SQL and might be missing something very easy. I have a situation where the space allocated to the transaction log of a database is extremely large (5 GB), and I cannot manually reduce it; I get "Error 21335: [SQL-DMO] The new DBfile size must be larger than its current size." This is a problem because the growth has taken all available space on the server.

Mike

View 3 Replies View Related

Transaction Log Extremely Large

Oct 10, 2000

Hello everyone,
I'm not sure if this is a problem, but I've got a database which is about 1700 MB in size (at least that's the allocated space on disk), and the log file is over 4600 MB. I've truncated the log file but it still keeps growing. None of our other databases are this large, and there are a lot of transactions performed regularly, but it looks odd to me that the log is this big when the data is half the size. How can I find out exactly how much space is being taken up by the data, and is there anything I can do to shrink the size of the log file? I am not really a DBA, so I'm not sure how crucial this is in the grand scheme of things.
Thanks
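sp_spaceused answers the first part of the question directly (the database and table names here are placeholders):

USE MyDb
EXEC sp_spaceused                   -- database size, unallocated space, data/index totals
EXEC sp_spaceused 'dbo.MyBigTable'  -- or per table

For the log, note that truncating only marks space inside the file as reusable; the file itself only gets smaller via DBCC SHRINKFILE on the log's logical file name.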

View 1 Replies View Related

Large Transaction Log Files

Apr 4, 2007

I have a few tables in SQL Server 2005 Express; currently they hold roughly 30-40K records each. I have my log file set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience: for tables holding millions of records, what average log file size could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours, so that regardless of the number of records I have, I'll always start the day with a minimal log file?
Do large log files have any effect on SQL performance?
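Rather than shrinking nightly (the file just regrows the next day, and every growth is a pause while the new portion is zeroed), it is usually cheaper to size the log for the workload once. A sketch, with hypothetical database and logical file names:

ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_log, SIZE = 512MB, MAXSIZE = 2048MB, FILEGROWTH = 64MB)

Log size for millions of rows depends far more on the size of the largest single transaction (and, under FULL recovery, on log backup frequency) than on the row count itself.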

View 3 Replies View Related

How To Split A Large Transaction?

Jan 23, 2008

I have a huge table with ~30 GB of data, and a column needs to be updated. To avoid one huge transaction, I set up a loop that updates part of the records in each iteration. The query is like the following:

Declare @mo smalldatetime
Declare MOs cursor for
    Select [a month] from [a table]

Open MOs
Fetch Next from MOs into @mo
while @@FETCH_STATUS = 0
Begin
    exec sp_UpdateColumn @mo
    -- PRINT @mo
    Fetch Next from MOs into @mo
end
close MOs
deallocate MOs

sp_UpdateColumn is the query that actually does the update, the code is like following:

Create Procedure sp_UpdateColumn @updMoDate smalldatetime
AS
[do some calculation here, store results in a temp table #Temp1]
Begin Tran
    Update A set col = b.col
    from [big table] A, #Temp1 B
    where [some matching conditions]
Commit
GO


The BEGIN TRAN and COMMIT lines were meant to break up the transaction; however, our database support people still tell me that a huge transaction generated a GB-sized log file that filled the drive. Unless the transaction wasn't really split, this should not happen. Can someone take a look at the code and tell me whether anything is wrong? Thanks
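For what it's worth, the COMMIT does end each transaction, but committed log records still cannot be discarded until a CHECKPOINT (SIMPLE recovery) or a log backup (FULL recovery) runs, so the log keeps growing across iterations. A sketch of the same loop with that added (the database name and backup path are hypothetical):

while @@FETCH_STATUS = 0
Begin
    exec sp_UpdateColumn @mo
    BACKUP LOG MyDb TO DISK = 'D:\backup\MyDb_log.bak'  -- FULL recovery: free committed space
    -- or, under SIMPLE recovery, just: CHECKPOINT
    Fetch Next from MOs into @mo
end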

View 6 Replies View Related

Transaction Log Size Growing Very Large

Oct 10, 2006

hi all,


my tlogs at the subscriber are growing very large regardless of my recovery model. any help?

using snapshot-push replication

thanks,

joey



View 8 Replies View Related

Shrinking Unreasonably Large Transaction Log File.

Oct 12, 2002

Gurus,

I have inherited a SQL 2000 database (I am new to being a SQL DBA) and found this when checking the db properties: the transaction log has grown bigger than the actual data file. I thought transaction log backups would truncate the inactive portion of the log file and shrink the transaction log, but that seems not to be the case; they may truncate the inactive portion of the log, but they do not shrink the file. This site does not have a job for shrinking the data/log files periodically. What is the best way to handle this situation, and how can I shrink the transaction log quickly?

All your suggestions are welcome.

TIA,
-Jay

View 2 Replies View Related

SQL Server 2008 :: Making Use Of A Large Transaction File To Delete Records?

Jun 5, 2015

Currently we have a database about 300 GB in size. Because our backup system failed for some time in the past, we were left with a transaction log file which grew to about 160 GB. However, our backups are working again and everything is working fine. My understanding is that the transaction log file is now practically empty, but its capacity remains at 160 GB.

When you delete records, the deleted rows get logged to the transaction log. My understanding is that when a log backup is done, these log records get discarded from the log file.

Could I make use of this relatively large transaction log file and start deleting records without actually adding to the log file's size?

The plan is to delete records from logging tables that are not referenced by any other table, without this increasing the transaction log file. For example, over a period of a few weeks we could delete a chunk of records from a table; then, after a backup has completed, delete another chunk out of this table, until we have got the table down to the records that we now need. Will this work?
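Broadly yes: if the deletes are batched so that no single transaction needs more log space than the file already has, and log backups keep running between chunks, the space inside the 160 GB file is reused and the file should not grow. A sketch (table name, batch size, and cutoff are hypothetical):

WHILE 1 = 1
BEGIN
    DELETE TOP (50000)
    FROM dbo.LoggingTable
    WHERE LogDate < '20140101'
    IF @@ROWCOUNT = 0 BREAK
END

Note it is the transaction log backup, not the full backup, that lets log space be reused.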

View 2 Replies View Related

Does It Store All The Results To Tempdb Database When I Query Against A Large Table Which Joins Another Table?

Jun 25, 2007

Hi, all experts here,

I am wondering whether tempdb stores all results temporarily whenever I query a large fact table (over 4 million records) that joins another dimension table. Each time I run the query, tempdb grows to nearly 1 GB, which nearly exhausts the space on my local system drive and drags performance down completely. Is there any way to fix this problem? Thanks a lot in advance; I look forward to your kind advice.

With best regards,



Yours sincerely,
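For what it's worth, hash joins and sorts over a 4-million-row join commonly spill to tempdb. One way to watch where the space is going while the query runs (SQL 2005 dynamic management view; it reports tempdb):

SELECT SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_mb,
       SUM(user_object_reserved_page_count) * 8 / 1024 AS user_mb
FROM sys.dm_db_file_space_usage

A large internal_mb during the query points at sort/hash spills; an index supporting the join keys, or moving tempdb off the system drive, are the usual remedies.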



View 11 Replies View Related

SQL 7 Loads / MTS Use

Feb 24, 1999

I am at a turning point with MS and SQL 6.5. I really need to make a decision for the large-scale growth of a project. I was hoping people could share their experiences with SQL 7, with some rough load numbers (concurrent connections, DB size, transactions per second or per day). Also, is anyone using this product with MTS?

Production numbers only please. So many people have worked with the Beta and are doing testing, but it doesn't really count until it is in production.

thanks in advance.

View 3 Replies View Related

Large Table-Table Partition, View Or Other Method?

Aug 27, 2007

Hi everyone,

I use SQL 2005. What is the best practice for dealing with a large table (more than a million rows)? Table partitioning, views, or some other method?

Can you please give some suggestions? It would be very helpful if you could post some references or examples.

Thank you!
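For reference, a minimal SQL 2005 range-partitioning sketch (all names and boundary values are hypothetical):

CREATE PARTITION FUNCTION pfByYear (datetime)
AS RANGE RIGHT FOR VALUES ('20060101', '20070101')

CREATE PARTITION SCHEME psByYear
AS PARTITION pfByYear ALL TO ([PRIMARY])

CREATE TABLE dbo.Orders (
    OrderID   int      NOT NULL,
    OrderDate datetime NOT NULL
) ON psByYear (OrderDate)

Partitioning pays off mainly for sliding-window loads and purges; for plain query speed on a million-row table, good indexing is usually enough.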

View 12 Replies View Related

Bulk Loads Using OLE DB

Feb 28, 2008

Basically, I have two questions:

Is it a requirement that the OLE DB provider implement the IRowsetChange interface so that it can be used to configure an OLE DB destination?

Is there any way to configure a destination for bulk, faster loads? The normal OLE DB destination via IRowsetChange loads one row at a time (InsertRow()).

Thanks,
Vivek.

View 1 Replies View Related

VS IDE Loads All Packages During Run

Jul 31, 2006

I recently got a new computer and reinstalled Visual Studio 2005, SQL, and all the goodies. When I build an SSIS project with multiple packages and run it, VS loads all of the packages in the project into the IDE before execution. None of the packages are related, and I only want the current package to be loaded. I don't remember it working this way before. I've looked through the package, project, solution, and VS options and can't find anything that might control this behavior.

Does anyone know what controls this behavior?

View 1 Replies View Related

Get A Variable From Db When A FormView Loads

Apr 21, 2006

Hello, I have a FormView, and when I load it I would like to check a variable "Cover" from the database, to see whether or not it is empty.
But how on earth do I get the variables from the SqlDataSource in my function FormView1_Load(...)?

View 1 Replies View Related

Database Loads Very Slow

Aug 23, 2000

I have a database that's 2.5 GB but only has about 17 MB of actual data. I've set up a standby server that I load my dumps into. The load takes about 10 minutes. The dump takes about a minute and a half (which also seems slow to me for that small amount of data). I don't expect it to take that long to load 8800 pages into a database. The standby server is the same hardware as the production server (single 500 MHz Xeon, 2 GB RAM, RAID 5). The server has only a single RAID 5 array to store the OS and all the SQL data; however, I still don't think it should take that long to load. Let me know what you think.


--Buddy

View 1 Replies View Related

ASP + SQL Loads Really Slow Over Intranet

Jul 23, 2005

Hi guys, I've created a web application using ASP together with SQL Server as our db source, running through IIS 6 on a Windows Server 2003 platform. This application retrieves a list of customer codes from our db, so records returned could be as many as 2000+ for any single transaction. The application runs fine for users from the same state. However, our interstate colleagues have noticed that it takes more than 3-4 mins for the page to load, while it only takes me < 2 secs to load. Our intranet server is located in the same state as I am, so anyone from within this state has no problems loading the page. All other states are finding it unbearable. I've done some debugging, and it appears to be a server factor. I saved the page with the longest list to a local drive and opened it locally in IE, and it loads quickly. Does anyone have any suggestions as to how to speed this application up for our interstate users? Any ideas would be appreciated. Thanks, Shawn

View 3 Replies View Related

Paralellizing File Loads

Jul 19, 2007

I have a package that reads a table listing the files that arrive every night and must be loaded into a staging table. These files range from 50 MB to 1.5 GB. The entire load-and-transform process takes about 50 minutes, which is about a 300% increase over our current production environment. I am looking at ways to improve performance by loading all of the files into the staging table at the same time.

Is this possible, and do you guys think it would improve performance significantly?

View 5 Replies View Related

SQL Server Loads Over Network Take Forever

Feb 16, 1999

When I attempt to load a database from dump format across a network (100 Mb Ethernet), it takes forever (15 hours for 16 GB!). Can anyone help me find a starting point to troubleshoot this?

Thanks!

-Chris

P.S. File Copies of the same size move at a rapid fashion, and I cannot find any bottlenecks in the network.

View 2 Replies View Related

DTS Loads Text File In Wrong Sequence

Apr 27, 2005

I am trying to load a text file into a temporary table in MS-SQL using DTS.

The file is coming from a unix machine and contains comma-delimited entries.

The DTS script runs every minute, and the transformation maps values in the text file into corresponding columns in the temp table.

The problem is that the sequence of the entries in the text file does not match the sequence of the values imported.

I have several instances of this script (all identical) but only one file seems to be out of whack. Does anyone know what could be causing this? This is for a real-time trading system so the sequence has to be perfect.

Thanks
Alastair

View 8 Replies View Related

SQL 2012 :: Availability Groups And Bulk Loads

Sep 8, 2015

We have some tables that are bulk-loaded every day, and they have no RI to the other tables in the database.

To ease pressure on the logs, I had the idea of spinning them off to another database on the same AG in the SIMPLE or BULK_LOGGED recovery model and using synonyms to point to them, so the code base would not need changing.

I know an earlier bug existed in 2005 that basically made the optimizer ignore indexes if a table was accessed via a synonym.
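The synonym half of the plan is a one-liner (names hypothetical):

USE MainDb
CREATE SYNONYM dbo.DailyBulkTable FOR BulkLoadDb.dbo.DailyBulkTable

One caveat worth checking: a database must be in the FULL recovery model to join an availability group, so a SIMPLE or BULK_LOGGED BulkLoadDb would have to live outside the AG on the same instance, and would need to be re-created on the secondary after a failover (e.g. by the load job itself).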

View 0 Replies View Related

SQL 2012 :: SSIS MERGE For Incremental Loads

Sep 28, 2015

I inherited an SSIS package that is rather simple. It grabs data from a SQL query and loads it into a SQL table. The first step of the process TRUNCATEs the destination table and then reloads it for the current year. The table has over a million rows, and the source DB we pull from is not in our domain, so one can imagine how long this takes.

This process is working fine (given the 45 minutes it takes to repopulate the destination table), but what I really need is a way to load only the rows that are NEW or UPDATED. I would also need functionality to DELETE the rows that have been removed (sounds like a MERGE, right?). I tried the Merge and Merge Join transformations, but they seem to be different from the T-SQL MERGE statement: Merge behaves like a slow UNION, and Merge Join only seems to work with SELECTs.
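Right: the SSIS Merge and Merge Join transformations are unrelated to the T-SQL MERGE statement. A common pattern is to land the source rows in a staging table and run the statement from an Execute SQL Task (a sketch; all names, and the RowHash change-detection column, are hypothetical):

MERGE dbo.DestTable AS t
USING dbo.StagingTable AS s
    ON t.BusinessKey = s.BusinessKey
WHEN MATCHED AND t.RowHash <> s.RowHash THEN
    UPDATE SET t.Col1 = s.Col1, t.RowHash = s.RowHash
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Col1, RowHash)
    VALUES (s.BusinessKey, s.Col1, s.RowHash)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;

Only the changed, new, and removed rows touch the destination, so the 45-minute truncate-and-reload becomes a delta load.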

View 3 Replies View Related

Sql Server 2005 Loads Data Slowly

Oct 1, 2007

This is the code i use.

Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset



Private Sub Form_Load()


Set cnn = New ADODB.Connection
cnn.ConnectionString = "driver={SQL Server};" & "server=SCHS-SQL;uid=sa;pwd=sa;database=Library"
cnn.Open

Call loadrst

End Sub

Public Sub loadrst()
Set rst = New ADODB.Recordset
Dim sql1 As String
sql1 = "select * from Books order by srno"
rst.Open sql1, cnn, adOpenDynamic, adLockOptimistic, adCmdText

If rst.EOF = True Then
MsgBox ("No records are present")
Command1.Enabled = False
Else
Call display
Command1.Enabled = True
End If


End Sub



This is the code I use, basically, to connect my VB6 application to SQL Server 2005. I started out lately trying to use SQL Server instead of Access. So far none of the programs have given any problems, as their databases have at most 120 records or so. But the one this code connects to has about 5200 records. I had imported the tables from Access into SQL Server. The size of the database was around 17.67 MB, so I shrank it and it became 4 MB. But it still takes roughly 2 minutes for the user to see the records in the grid. Could you tell me what to do?

Sheldon

View 3 Replies View Related

I Have A Flat File With 500,000 Rows But It Only Loads 249,999

Nov 27, 2007

Hi,

In a data flow there is a Flat File Source that loads a file with 500,000 rows, but when I execute the package it only loads 249,999. I have verified the file and everything looks fine; it is a CSV, and MS Excel recognizes it perfectly.

I'm wondering if there is any technical limitation regarding the number of rows per file?

And what else can I verify to discover the problem?

thanks

View 7 Replies View Related

Enterprise Manager For 2000 No Longer Loads

Nov 8, 2007

After installing the DTS designer tools for SSIS/Visual Studio,
I can no longer open Enterprise Manager; I get the following error:

mmc.exe - entry point not found

The procedure entry point ?ProcessExecute@@YAXPAUXHWND_@@PBG1@Z could not be located in the dynamic link library SEMSFC.dll.

View 4 Replies View Related

PK On A Large Table

Nov 16, 2007

I am developing an application that has a table with lots of records (network traffic), but the data is summarized every so often to create summary records, and the old records are deleted. The problem is that I have a PK based on an auto-increment ID (int) that will run out of numbers. However, this ID is not referenced anywhere: it is not a foreign key from another table, it is not used for deletion, and there are no updates in this table whatsoever.

So my possibilites are:
1.- reseed the id when it is about to run out.
2.- make the id bigint
3.- remove the id and change the PK to 2 other fields
4.- remove the id and without PK

I am leaning toward option 4, because I do not see the need for a PK, but I understand that it is quite out of the norm, so I would like to hear from other people (I do not have much experience with DBs).

I also like option 3. I already have a index on one of the other fields (time).

Any input will be appreciated.

Claudio Robles
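For reference, options 1 and 2 are both short (the table and constraint names here are hypothetical):

-- Option 1: reseed at the bottom of the int range; safe here since the ID
-- is never referenced, updated, or used for deletes
DBCC CHECKIDENT ('dbo.TrafficLog', RESEED, -2147483648)

-- Option 2: widen to bigint; the PK must be dropped and re-created around
-- the change, and the table is rewritten
ALTER TABLE dbo.TrafficLog DROP CONSTRAINT PK_TrafficLog
ALTER TABLE dbo.TrafficLog ALTER COLUMN id bigint NOT NULL
ALTER TABLE dbo.TrafficLog ADD CONSTRAINT PK_TrafficLog PRIMARY KEY CLUSTERED (id)

Option 1 only postpones the problem if the summarization keeps deleting old rows ahead of the counter, so option 2 is the safer long-term fix.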

View 7 Replies View Related

BCP Large Table.

Jul 23, 2005

If I use BCP to export a very large table, will that table be blocked for writes during the export process? I don't want to prevent users from accessing that table during the bcp process. Thank you, TFD.

View 1 Replies View Related

Storing Large PDF's In Table

May 26, 2008

Hi,
I have this page that uploads PDFs to a table. In principle this works fine, until I try to upload large files (3 to 4 MB). I need to upload even larger files than that (I don't really know yet what users are going to come up with), and I get timeout problems. Now some people say it is not possible to exceed a limit of about 4 MB, but that there is a workaround by changing something in the web.config file. Can somebody give me info about that? (I am quite a novice really.) I tried to change it like this, but to no avail:
<system.web>
  <httpRuntime maxRequestLength="102400"
               enable="True"
               requestLengthDiskThreshold="102400"
               useFullyQualifiedRedirectUrl="True"
               executionTimeout="102400"/>
</system.web>
Thanks for any help!

View 2 Replies View Related

Move Large Table From DB To DB

Mar 3, 2004

I have a table of approx 1/2 million rows.

On a nightly basis, this table gets rebuilt in a temporary database. Once the table has been built and scrubbed, I need to move it into our web server's db.

I'd like to do this with minimal interruption to the website.

Possible techniques:

1) I could set up a DTS package to copy the table object overwriting the destination table

2) I could export to a flat file and then bulk import into the live table (after truncating it)

3) I could run a process that updates smaller chunks of data at a time, running delete and insert queries.

Does anybody have a thought on the best way to do this, so that web users would be virtually unaware that anything is happening?
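A variant of technique 1 with near-zero downtime is a rename swap: load and scrub into a staging table in the web server's database, then swap names inside a transaction (a sketch; the table names are hypothetical):

BEGIN TRAN
    EXEC sp_rename 'dbo.LiveTable', 'LiveTable_old'
    EXEC sp_rename 'dbo.LiveTable_staging', 'LiveTable'
COMMIT
DROP TABLE dbo.LiveTable_old

Web users see the old table right up until the swap, which holds its schema locks for milliseconds rather than for the length of the copy.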

View 4 Replies View Related

Duplicate In Large Table

Mar 1, 2002

Hi,

I am absolutely innocent as far as T-SQL is concerned. I need to detect all duplicates (the key consists of 5 fields) in the table and delete the duplicates.
I tried different approaches, like joins etc., but no luck.
Any help is appreciated.
Thanks
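One SQL 2000-friendly pattern, assuming the five key fields (k1-k5 here, hypothetical names) should be unique and any extra columns can be taken from an arbitrary member of each duplicate group:

-- Keep one representative row per duplicated 5-column key
SELECT k1, k2, k3, k4, k5, MAX(other_col) AS other_col
INTO #keep
FROM dbo.MyTable
GROUP BY k1, k2, k3, k4, k5
HAVING COUNT(*) > 1

-- Remove every row belonging to a duplicated key...
DELETE t
FROM dbo.MyTable t
JOIN #keep k ON t.k1 = k.k1 AND t.k2 = k.k2 AND t.k3 = k.k3
            AND t.k4 = k.k4 AND t.k5 = k.k5

-- ...then put the single kept row back
INSERT INTO dbo.MyTable (k1, k2, k3, k4, k5, other_col)
SELECT k1, k2, k3, k4, k5, other_col
FROM #keep

Adding a unique constraint on (k1, k2, k3, k4, k5) afterwards stops the duplicates from coming back.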

View 2 Replies View Related

Help! Indexing LARGE Table

Aug 6, 2004

OK, I imported 680 million records into an unindexed table. That went well.

Then I went into Enterprise Manager and added a two-column, non-unique clustered index to that table to speed access.

It's been running for ~36 hours and I have no idea when it will complete. I have deadlines that I'm going to miss and am very nervous; what can I do?

SQL Server 2000 Enterprise Edition (8.00.818 - sp3 + hotfixes)
Dual 3Ghz Xeon (two physical CPUs each have HyperThreading enabled)
Windows 2000 SP4
4GB RAM (although I just noticed the 3GB OS switch wasn't on)
SCSI boot drive
tempdb, data, and transaction log are on a FibreChannel RAID SAN

Help! Thanks in advance!
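Little can be done mid-build except waiting, or killing it and paying for the rollback. For the retry, two options that usually cut the build time (a sketch; the index and column names are hypothetical):

-- Sort in tempdb (on the FibreChannel SAN) instead of inside the user database
CREATE CLUSTERED INDEX IX_BigTable_c1_c2
ON dbo.BigTable (col1, col2)
WITH SORT_IN_TEMPDB

An alternative that avoids the 680-million-row sort entirely: create the empty table with the clustered index already in place, then bulk-load the data pre-sorted in clustered-key order.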

View 8 Replies View Related






