SQL Server Large Transaction Log

Feb 3, 2006

I'm working with a database that holds a relatively small amount of data; its data file is only in the 100-110 MB range. What is troubling and baffling is that the transaction log file for this same database is significantly larger than the data file, at about 2.5 GB. What could be happening that causes this transaction log to be larger than the actual database?

Thx
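
A minimal diagnostic sketch (the database name is a placeholder): the most common cause is a database in the FULL recovery model whose transaction log is never backed up, so the log is never truncated and keeps growing.

SELECT DATABASEPROPERTYEX('MyDB', 'Recovery')   -- FULL, BULK_LOGGED or SIMPLE
DBCC SQLPERF(LOGSPACE)   -- reports the size and percent-used of every database's log

If the recovery model is FULL and there are no log backups, scheduling regular BACKUP LOG jobs (or switching to SIMPLE if point-in-time recovery is not needed) stops the growth.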

View 1 Replies



SQL Server 2008 :: Making Use Of A Large Transaction File To Delete Records?

Jun 5, 2015

Currently we have a database about 300 GB in size. Because our backup system failed for a while, we were left with a transaction log file which grew to about 160 GB. Our backups are working again and everything is fine. My understanding is that the transaction log file is now practically empty, but its capacity remains at 160 GB.

When you delete records, the deleted transactions get logged to the transaction log. My understanding is that when a log backup is done, these transactions get discarded from the log.

Could I make use of this relatively large transaction log file and start deleting records without actually adding to the transaction log file's size?

The plan is to delete records from logging tables that are not referenced by any other table, without increasing the transaction log file. For example, over a period of a few weeks we could delete a chunk of records from a table, then after a backup has completed delete another chunk, until we have got the table down to the records that we now need. Will this work?
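
The premise is sound as long as regular log backups keep running: the space inside the log file is made reusable after each log backup, so the file itself should not grow if each chunk's logging fits comfortably within 160 GB. A minimal sketch of the loop, assuming SQL Server 2008 and the FULL recovery model; table, column, date, and path names are placeholders.

DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    -- delete in small chunks so each transaction stays short
    DELETE TOP (10000) FROM dbo.LoggingTable
    WHERE LogDate < '20150101';
    SET @rows = @@ROWCOUNT;
END
-- a log backup after each batch run lets the used log space be reused
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn';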

View 2 Replies View Related

Transaction Log Too LARGE

Jan 30, 2007

I often in my job come across the following scenario:

Client rings up and says: "Run out of server space; the SQL 2000 transaction log has consumed all the space," or it has consumed a very large portion of it.

What is the correct procedure for resolving this ASAP when working with SQL 2000 databases in full recovery mode? I ask because the other guys within my company all do different things, with sometimes good and sometimes bad results.

The procedure i use is the following:

METHOD 1

1. Back up both the database and the transaction log.
2. Right-click the database and select Detach, which from my understanding is a clean detach method that ensures uncommitted transactions are committed to the database.
3. Rename the old transaction log to .ldfOLD.
4. Reattach the database, which creates a new transaction log.

METHOD 2

1. I don't use this method, but I'm pretty sure it's risky; if possible, can someone provide me with the reasons why:

Change the database from full to simple recovery mode, shrink the logs,
and then change back to full mode.



Basically, what I am asking is: what is the fastest and most effective way to sort out the above issue, while keeping the ability to roll back successfully?

PLEASE don't comment on why the transaction log is so big, as I don't want to look into that now. All I am asking is: what is the most effective method to shrink the log down and save space?
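
For comparison, a minimal sketch of the log-backup-then-shrink route (SQL 2000 syntax; names and the target size are placeholders). Unlike the detach method it never takes the database offline, and unlike switching to simple mode it keeps the log backup chain intact, so the ability to restore and roll back is preserved.

BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn'   -- truncates the inactive portion
DBCC SHRINKFILE (MyDB_Log, 500)                       -- shrinks the physical file to ~500 MB
-- if the file does not shrink, the active portion is at the end of the file;
-- run another log backup and repeat the shrink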

View 13 Replies View Related

Transaction Log Space Too Large

Feb 24, 2000

I am new to SQL and might be missing something very easy. I have a situation where the space allocated to the transaction log of a database is extremely large (5 GB). I cannot manually reduce it; that gives me "Error 21335: [SQL-DMO] The new DBfile size must be larger than its current size." This is a problem because the growth has taken all available space on the server.

Mike

View 3 Replies View Related

Transaction Log Extremely Large

Oct 10, 2000

Hello everyone,
I'm not sure if this is a problem, but I've got a database which is about 1700 MB in size (at least that's the allocated space on disk), and the log file is over 4600 MB. I've truncated the log file, but it still keeps growing. None of our other databases are this large. There are a lot of transactions performed regularly, but it looks odd to me that the log is this big when the data is half the size. How can I find out exactly how much space is being taken up by the data, and is there anything I can do to shrink the size of the log file? I am not really a DBA, so I'm not sure how crucial this is in the grand scheme of things.
Thanks
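
A minimal sketch of how to measure and then shrink (run in the database in question; SQL 7.0/2000-era syntax, and the logical log file name is a placeholder). Note that truncating only marks log space as reusable; it never reduces the physical file, which is what DBCC SHRINKFILE is for.

EXEC sp_spaceused               -- database size plus reserved/data/index/unused breakdown
DBCC SHRINKFILE (MyDB_Log, 500) -- physically shrinks the log file to ~500 MB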

View 1 Replies View Related

Large Transaction Log Files

Apr 4, 2007

I have a few tables using SQL Server 2005 Express. Currently they hold roughly 30-40k records each. I have my log files set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience: for tables that have millions of records, what average log file sizes could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours, so that regardless of the number of records I have, I'll always start the day with a minimal log file?
Do large log files have any effect on SQL performance?
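
On the nightly-shrink idea: a shrunken log just has to grow again the next day, and every growth pauses transactions while the new space is zero-initialized, so it is usually better to pick a size that accommodates the busiest day and leave it alone. A minimal sketch (names and sizes are placeholders, not recommendations):

ALTER DATABASE MyDB
MODIFY FILE (NAME = MyDB_log, SIZE = 512MB, FILEGROWTH = 128MB);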

View 3 Replies View Related

How To Split A Large Transaction?

Jan 23, 2008

I have a huge table with ~30 GB of data, and a column needs to be updated. To avoid one huge transaction, I set up a loop that updates part of the records on each pass. The query is like the following:



Declare @mo smalldatetime
Declare MOs cursor
for
Select [a month] from [a table]

Open MOs
Fetch Next from MOs into @mo
while @@FETCH_STATUS = 0
Begin
exec sp_UpdateColumn @mo
-- PRINT @mo
Fetch Next from MOs into @mo
end
close MOs
deallocate MOs

sp_UpdateColumn is the query that actually does the update, the code is like following:

Create Procedure sp_UpdateColumn @updMoDate smalldatetime
AS
[do some calculation here, store results in a temp table #Temp1]
Begin Tran
Update A set col = b.col
from [big table] A, #Temp1 B
where [some matching conditions]
Commit
GO


The BEGIN TRAN and COMMIT lines were meant to break up the transaction; however, our database support people still tell me that a huge transaction generated a GB-sized log file that filled the drive. Unless the transaction wasn't really split, this should not happen. Can someone take a look at the code and tell me whether anything is wrong? Thanks
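
One likely explanation: committing does not by itself free log space. In the FULL recovery model every committed chunk stays in the log until a log backup runs, so the file keeps growing even though the transactions really are separate. A minimal sketch of the loop body, assuming the SIMPLE recovery model (in FULL recovery, substitute a BACKUP LOG call):

exec sp_UpdateColumn @mo
CHECKPOINT   -- in SIMPLE recovery this lets the space from the committed chunk be reused
Fetch Next from MOs into @mo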

View 6 Replies View Related

Transaction Log Size Growing Very Large

Oct 10, 2006

hi all,


my transaction logs at the subscriber are growing very large, regardless of the recovery model. I'm using snapshot push replication. Any help?

thanks,

joey
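
A minimal diagnostic sketch (SQL Server 2005 syntax; the database name is a placeholder). If log_reuse_wait_desc reports REPLICATION, the log is being held for commands the replication agents have not yet processed, which happens regardless of the recovery model:

SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MySubscriberDB';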



View 8 Replies View Related

Shrinking Unreasonably Large Transaction Log File.

Oct 12, 2002

Gurus,

I have inherited a SQL 2000 database (I am new to SQL DBA), and I found this when checking the database properties: the transaction log has grown bigger than the actual data file. I thought transaction log backups would truncate the inactive portion of the log file and shrink the transaction log, but apparently not: they may truncate the inactive portion of the log, but they do not shrink the file. This site does not have a job for shrinking the data/log files periodically. What is the best way to deal with this situation, and how can I shrink the transaction log quickly?

All your suggestions are welcome.

TIA,
-Jay

View 2 Replies View Related

Large Table Loads && Transaction Log Issues

Apr 7, 2004

I have a table that's about 3 GB. Using this table and a few others, I'm building another table. The problem is that while making the new table, my transaction log inflates so much that I'm running out of disk space. What can I do to prevent this, or to keep the transaction log size under control?
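
One option, sketched below on the assumption that the new table is built with SELECT INTO and that switching the recovery model during the load is acceptable: SELECT INTO is minimally logged under the BULK_LOGGED (or SIMPLE) model, so the log barely grows. Names and paths are placeholders.

ALTER DATABASE MyDB SET RECOVERY BULK_LOGGED
SELECT a.KeyCol, a.Col1, b.Col2
INTO dbo.NewTable
FROM dbo.BigTable a
JOIN dbo.OtherTable b ON b.KeyCol = a.KeyCol
ALTER DATABASE MyDB SET RECOVERY FULL
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn'   -- keeps the log backup chain usable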

View 4 Replies View Related

SQL Server Admin 2014 :: Restore Lost Transaction From Transaction Log File

Jun 10, 2015

I have a full database backup up to the previous day, and the transaction log file containing today's transactions. My database has crashed. I have restored the previous day's full backup, but I am having difficulty restoring today's transactions from today's transaction log. What are the steps to restore the full database backup plus one day's transaction log? Note: there is no differential database backup and no transaction log backup.
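
A minimal sketch of the sequence, assuming the crashed database's .ldf is still readable so its tail can be backed up; names and paths are placeholders.

-- 1. capture the transactions still sitting in the damaged database's log
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_tail.trn' WITH NO_TRUNCATE
-- 2. restore the full backup, leaving the database ready for more log restores
RESTORE DATABASE MyDB FROM DISK = 'D:\Backups\MyDB_full.bak' WITH NORECOVERY, REPLACE
-- 3. restore the tail (and any intermediate log backups), recovering on the last one
RESTORE LOG MyDB FROM DISK = 'D:\Backups\MyDB_tail.trn' WITH RECOVERY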

View 8 Replies View Related

Error 8525: Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

May 31, 2008

Hi All

I'm getting this when executing the code below, going from W2K/SQL2K SP4 to XP/SQL2K SP4 over a dial-up link.

If I take away the BEGIN TRAN and COMMIT it works but, of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.

I've tried both with and without the SET XACT_ABORT, and also tried with SET IMPLICIT_TRANSACTIONS OFF.

set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN
set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE
set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT
set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN


It's got me stumped, so any ideas gratefully received. Thx

View 1 Replies View Related

SSIS, Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Feb 22, 2007

I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.

For reading the data I have written dynamic T-SQL based on some conditions, and it uses 25 different functions to populate 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.

I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting the CommandTimeout property to 0.

If I use 0 for CommandTimeout, then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

and

Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.

Please help me, it's very urgent.

View 3 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Feb 6, 2007

I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.

Does anybody have any ideas?!

View 1 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction.

Dec 22, 2006

I have a sequence container, and in it a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these sit inside a For Each Loop container. When I run the package it works fine on the first loop, but it gives me an error on the second execution.

The message is like this:

Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.

View 8 Replies View Related

Distributed Transaction Completed. Either Enlist This Session In A New Transaction Or The NULL Transaction. (HELP)

Jan 8, 2008

Hi,

I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."

My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.



Thanks in advance.


View 2 Replies View Related

Transaction Scope - The Operation Could Not Be Performed Because OLE DB Provider SQLNCLI For Linked Server XXX_LINKED_SERVER Was Unable To Begin A Distributed Transaction

May 15, 2008

Hello, I have a problem with software developed in C# on the .NET Framework 2.0. This is the error I receive:

The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "XXX_LINKED_SERVER" was unable to begin a distributed transaction. OLE DB provider "SQLNCLI" for linked server "XXX_LINKED_SERVER" returned message "No transaction is active.".

If I retry the process directly, it works fine. This is the process:

1. In C#: call a query that selects from the linked server (a SQL 2005 database) and inserts into a SQL 2005 table.
2. In C#: using (TransactionScope scope = new TransactionScope()), insert into a table in SQL 2005 which is a linked server.

Is there someone who can help me? Thanks in advance.

View 1 Replies View Related

Sending Large Binary To SQL Server Via ADO.NET

Feb 27, 2007

Hi, I am having some trouble with my ASP page sending a file to a SQL Server 2005 database running on another machine. An exception is generated as follows:

A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)

It seems to work OK with smaller files, but when I attempt to upload an 11 MB file, it gives me the above exception. I upped the httpRuntime maxRequestLength in web.config to 2048. Any ideas? Here is my code:

if (FileUploader.HasFile)
{
    // call the stored proc
    SqlConnection conn = new SqlConnection();
    conn.ConnectionString = "Password=password;Persist Security Info=True;User ID=MyUser;Initial Catalog=MyDb;Data Source=MySqlServerMachine;Connect Timeout=300";
    conn.Open();

    SqlCommand cmd = new SqlCommand();
    cmd.Connection = conn;
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandText = "spAddTestBinary";

    SqlParameter param = new SqlParameter();
    param.ParameterName = "@binaryParam";
    param.SqlDbType = SqlDbType.VarBinary;
    param.Direction = ParameterDirection.Input;
    param.Value = FileUploader.FileBytes;
    cmd.Parameters.Add(param);

    cmd.ExecuteNonQuery();

    conn.Close();
}

Here is my table:

CREATE TABLE BinaryTable(
    [id] [int] IDENTITY(1,1) NOT NULL,
    [myBinary] [varbinary](max) NULL,
    CONSTRAINT [PK_BinaryTable] PRIMARY KEY CLUSTERED
    (
        [id] ASC
    ) WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

My stored proc:

ALTER PROCEDURE [dbo].[spAddTestBinary]
    -- Add the parameters for the stored procedure here
    @binaryParam varbinary(MAX)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    Insert into BinaryTable
        (myBinary)
    Values
        (
        @binaryParam
        )
END

View 3 Replies View Related

Large Move From Access To SQL Server 200

Jan 12, 2004

Hi,


My client has a rather large database with some very large reports. Some of the reports have around 20 sub-reports a piece. We have decided to move the client's application to a .NET web application and would migrate them to SQL Server 2000.

The only problem now is designing the reports. I have tried doing what Microsoft says (converting to stored procedures and views), but I keep getting syntax errors on the SQL side of things when I cut and paste.

For example, the following code is taken from Access :

SELECT tblProjects.fldCountry, tblProjects.fldDescription, tblOrganizations.fldAcronym, tblProjects.fldProjID, Max(tblProjYears.fldStartDate) AS MaxOffldStartDate, Max(tblProjYears.fldEndDate) AS MaxOffldEndDate, qryProjLocsWithFEData.fldProjPeriodID
FROM (tblProjects INNER JOIN tblOrganizations ON tblProjects.fldOrgID = tblOrganizations.fldOrgID) INNER JOIN (tblProjYears INNER JOIN qryProjLocsWithFEData ON tblProjYears.fldProjPeriodID = qryProjLocsWithFEData.fldProjPeriodID) ON tblProjects.fldProjID = tblProjYears.fldProjID
GROUP BY tblProjects.fldCountry, tblProjects.fldDescription, tblOrganizations.fldAcronym, tblProjects.fldProjID, qryProjLocsWithFEData.fldProjPeriodID
ORDER BY tblProjects.fldCountry, tblOrganizations.fldAcronym, tblProjects.fldProjID;

But when I try that in SQL Query Analyzer I get the error: "The text, ntext, and image data types cannot be compared or sorted, except when using IS NULL or LIKE operator."

I'm pretty sure it's the tblProjects.fldDescription in the GROUP BY, but if I leave it out, it still throws an error. Anybody have any ideas?

Thanks
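
One workaround, sketched on the assumption that fldDescription became ntext in the migration (Access Memo fields map to ntext) and that qryProjLocsWithFEData has been recreated as a view: cast the column to a comparable type before grouping and sorting.

SELECT p.fldCountry, CAST(p.fldDescription AS nvarchar(4000)) AS fldDescription,
       o.fldAcronym, p.fldProjID,
       MAX(y.fldStartDate) AS MaxOffldStartDate, MAX(y.fldEndDate) AS MaxOffldEndDate,
       q.fldProjPeriodID
FROM tblProjects p
INNER JOIN tblOrganizations o ON p.fldOrgID = o.fldOrgID
INNER JOIN tblProjYears y ON p.fldProjID = y.fldProjID
INNER JOIN qryProjLocsWithFEData q ON y.fldProjPeriodID = q.fldProjPeriodID
GROUP BY p.fldCountry, CAST(p.fldDescription AS nvarchar(4000)), o.fldAcronym,
         p.fldProjID, q.fldProjPeriodID
ORDER BY p.fldCountry, o.fldAcronym, p.fldProjID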

View 5 Replies View Related

Best Configuration For Large MSSQL 2K5 Server?

Apr 27, 2007

Hey Everyone,

I'm building a new PC for a set of very large MS SQL databases (currently weighing in at approx 5 TB, and they should grow significantly going forward). This is for personal business-related use; I do not have access to a database expert, I am not in an enterprise (self employed), etc., so I am turning here for help.

I'm wondering what type of machine I should get to handle this type of database. It will not be under particularly heavy load (just a few simultaneous connections, not too many queries running at once), but it is obviously a very large database (over 22 billion rows of data). I don't know what type of hardware setup would be optimal for this machine (mainly processors).

What I have spec'd out at the moment is:
11 * SAMSUNG SpinPoint T Series HD501LJ 500GB 7200 RPM SATA 3.0Gb/s Hard Drives
1 * ASUS DSBF-D/SAS Dual Socket 771 Intel 5000P SSI EEB 3.61 Server Motherboard
1 * Areca ARC-1220 PCI-Express x8 SATA II Controller Card
1 * Kingston 4GB(2 x 2GB) 240-Pin DDR2 FB-DIMM DDR2 667 (PC2 5300) ECC Fully Buffered Dual Channel Kit Server Memory Model KVR667D2D4F5K2/4G
2 * Intel Xeon 5060 Dempsey 3.2GHz Socket 771 Active or 1U Processor Model BX805555060A

Priced on newegg, comes to a total of $3638.34

In the <$4000 range, can you think of anything that would be superior? I want to run the 11 500 GB drives in RAID 50, and the Areca card seems to be one of the best on the market (not to mention it has 24 ports for future scalability). The RAM is FB-DIMMs, the CPUs are Xeons... am I missing anything?

Also, I'm a bit concerned about performance on the Xeons. The ones I chose are dual-core. Since there aren't going to be too many simultaneous queries (often really just one at a time), should I try for single-core processors?

Anyways, thanks for the help.

View 11 Replies View Related

Storing Large Files In The SQL Server 2005

Feb 9, 2006

I have a table that I'm inserting a file into, using the Image data type to store the binary object. The code below works fine for files around 1.5 MB, but for anything larger it's as if the code won't even execute, and I get a "Page Not Found" error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem, but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength) 'byte array, set to file size

'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
conn.Close()
End Try

View 1 Replies View Related

SQL Server Is Occupying Large Memory Space

Oct 3, 2001

In an intranet application using Windows NT, Apache, Tomcat, and SQL Server, the memory used by SQL Server is drastically increasing until the system finally crashes. Nearly 40 people are accessing the system. The hardware is a P2 processor with 393 MB RAM and 2 GB virtual memory; SQL Server, the web server, and the servlet engine all run on the same machine.
Within three hours, SQL Server occupies 200 MB of memory, system performance degrades, and finally the Tomcat servlet engine stops.
Does anybody have any idea about this? We have nearly 1500 JSP pages, 200 bean files, and 300 tables in SQL Server.
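
One thing worth checking: by default SQL Server grabs memory as it needs it and releases it only under OS pressure, so on a box shared with a web server and servlet engine it is common to cap it. A minimal sketch for SQL Server 7.0/2000; 128 MB is an illustrative value, not a recommendation:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'max server memory', 128   -- megabytes
RECONFIGURE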

View 2 Replies View Related

Transferring A LARGE Database To A New Server - QUICKLY!!!!

Nov 13, 2000

Good morning one and all,

I need to transfer a database (containing one table) with over 35 million records from one server to another. I have two options at present:
(a) Use DTS to do the transfer
(b) Copy the .mdf file across and sp_attach_db it

Does anyone have a better idea, or does anyone know which of the two methods will be the quickest?

TIA

Gurmi
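
A third option that is often the quickest for a single large table: plain backup and restore, sketched below with placeholder names and paths. Unlike detach/attach it leaves the source database online the whole time.

-- on the source server
BACKUP DATABASE MyDB TO DISK = '\\newserver\share\MyDB.bak'
-- on the destination server
RESTORE DATABASE MyDB FROM DISK = 'D:\Backups\MyDB.bak'
WITH MOVE 'MyDB_Data' TO 'D:\Data\MyDB.mdf',
     MOVE 'MyDB_Log' TO 'D:\Data\MyDB.ldf'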

View 1 Replies View Related

How To Store Large Character Data In MS SQL Server ???

Sep 11, 1998

I am developing a simple ASP-based form that stores user info in MS SQL Server. I have created a table in the SQL Server to store the data and defined the body field with this line: `body char(255)`. The problem is that if the user inputs a string longer than 255 characters, it gets chopped off. How would you suggest solving this problem? Should I use the `text` datatype instead? Any comments are very appreciated!
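
For what it's worth: in SQL Server 6.5, char/varchar columns top out at 255 characters, so text is the usual escape hatch for longer input (on 7.0 and later, varchar goes up to 8000). A minimal sketch with a placeholder table name:

CREATE TABLE UserInfo (
    id int IDENTITY(1,1) NOT NULL,
    body text NULL
)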

View 2 Replies View Related

SQL Server 2008 :: Add ID And Primary Key To Large Table(s)

Apr 24, 2015

Background:

* SQL Server 2008 R2
* Database was created from a third party product. The product writes to the 3 tables that I need to make changes to 24/7 and downtime is not an option. All changes must be done live.
* Database overall size is ~200 GB
* The 3 tables I must update make up ~190 GB of that space.
* Tables have no primary key or ID columns. Therefore, the data is highly fragmented.
* Of the ~190 GB of space allocated for the tables, there is roughly 70 GB of actual data.
* Rows of the table are not guaranteed to be unique. In fact, on one of the tables, tests were run with a small sample of data and duplicates were very much evident.

What I'm trying to accomplish here is to get an ID column added to the 3 tables and set that ID field as the primary key. Doing so will force the data to become much less fragmented than it is currently and with purging and new inserts, eventually fragmentation will be nearly non-existent.

Problem:
Making table changes on tables this large while data is constantly being added poses many risks and can cause data loss. This was tried on a smaller table than these three and the entire table was lost in the process. Restore from backup was needed to get back to most recent log backup point.

Original Solution:
My original plan was to create a backup of each table and run the script below to migrate the majority of the data temporarily into the new table. I could then update the original table (which now would contain much less data) and then migrate the data back.

CREATE TABLE #temp
(
MsgDate varchar(10)
,MsgTime varchar(8)
,MsgPriority varchar(30)
,MsgHostname varchar(255)

[Code] ....

Original Solution Problem:
The problem with the solution above is that it calls the DELETE function on the original table using the values from the temporary table. When there are duplicate rows, which have not all been inserted into the backup table yet, they will all be removed from the original table because there is nothing unique to separate them out. In my testing, I had 10,000 rows in the original table and ended up with 9,959 rows in the backup table.

Question 1: Is my approach to making these table changes reasonable?
Question 2a: If so, how can I make sure I don't lose data as part of this temporary migration of the data to my backup tables?
Question 2b: If not, what would be a better approach that isn't going to cause disruption to the application that INSERTs data 24/7 and won't have any risk of data loss?
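
For comparison, a minimal sketch of the in-place alternative, on the assumption that a window of reduced throughput is tolerable: adding the IDENTITY column directly avoids the temporary migration and its duplicate-row hazard entirely, though both statements are fully logged, size-of-data operations that hold schema locks while they run. Table and column names are placeholders.

ALTER TABLE dbo.MsgLog ADD MsgID bigint IDENTITY(1,1) NOT NULL
ALTER TABLE dbo.MsgLog ADD CONSTRAINT PK_MsgLog PRIMARY KEY CLUSTERED (MsgID)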

View 9 Replies View Related

SQL Server 2008 :: Large Tables In OLTP

Jul 14, 2015

How many records does a table need before it is considered a large table?

We are getting a lot of deadlocks. We are using the default isolation level. Read and insert statements are blocking each other and causing deadlocks.

I am thinking that purging might reduce the deadlocks.

The table has 15 million records. Is this table considered a large table or not in OLTP systems?

In general, how many records does a table need before we should treat it as large?
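
On the deadlocks specifically, one commonly used mitigation for reader/writer conflicts under the default READ COMMITTED isolation is row versioning; a minimal sketch with a placeholder database name (test the tempdb impact before enabling in production):

ALTER DATABASE MyOLTPDB SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE
-- readers now see the last committed version instead of blocking on writers,
-- which removes most read/insert lock conflicts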

View 1 Replies View Related

Problems Moving LARGE DB From MySQL To SQL Server

Apr 9, 2008

Hi gang,

This is a cross-post from the MSDN Managed Newsgroups, which I never seem to get a response on... hoping someone over here can help me out!

I am having a problem and could also use some help understanding something...

We have a DB in MySQL (5.0.51) that I am transferring to SQL Server 2K5
using DTS. I am going through the MySQL ODBC v3.51 driver, and bringing data
over via query.

My query is just SELECT * FROM <table>, and it takes approximately 1 minute to run on the MySQL server.

First, the problem: the source table has 5 columns (2 ints, 1 float, 2 varchar(255)s) and 22M rows. Looks to me like about 600 MB of data.

When I go through the DTS wizard it runs for about an hour and then I get a
"MySQL client out of memory" error. After a bit of research, I found a
switch in the ODBC driver to use forward-only cursors, which seems to have
alleviated that problem, however... and this is where I need help in
understanding the process, when I get to the page in the wizard where I enter
the query, I click next, and the machine gets busy for an hour. Looking at
the processes on the MySQL server, I can see the entire result set is
streaming across the wire to my SQL Server box. Then the page comes back,
and I specify the table to import into, click Edit Mappings... and wait for
another 30+ minutes for the mapping screen to come up. Once I enter the
mappings, I can complete the wizard, but the Performing Operation page takes
about an hour again to complete.

The duration of all of this seems ridiculous to me, especially for moving such a (relatively) trivial amount of data (the MySQL server has 100s of gigs of hard drive space available plus 8 GB of RAM, my SQL Server box has 3 GB of RAM and plenty of drive space, and we are running gigabit Ethernet).

It appears that the query page is retrieving all of the rows just to take me to the page where I map the columns. Presumably that is the case because on that page I can preview the data. Is there any way to suppress this?

Is there anything anyone can suggest to speed this whole ordeal up?

I am wondering (I haven't tried this yet) whether, if I save the DTS package and run that, it will execute more rapidly, because much of the work is already done by that point.

Any info would be greatly appreciated!

Cheers,

Chris
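
One alternative worth trying, sketched on the assumption that 'Ad Hoc Distributed Queries' is enabled on the SQL Server 2005 box and the MySQL ODBC 3.51 driver is installed there; the connection string and table names are placeholders. This pulls the rows in a single pass, without the wizard's preview and mapping round trips:

SELECT *
INTO dbo.ImportedTable
FROM OPENROWSET('MSDASQL',
     'Driver={MySQL ODBC 3.51 Driver};Server=mysqlhost;Database=mydb;User=me;Password=pw;Option=3',
     'SELECT * FROM mytable')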

View 3 Replies View Related

Large Data Warehouse Handled Bei SQL Server?

Jul 20, 2005

Hi, I would like to know if anyone out there really uses SQL Server 2000 (which edition?) to hold the data for a data warehouse. How much data does it handle efficiently?

TIA,
Frank

View 1 Replies View Related

Restoring An SQL 2005 Server Express DB With A LOG That Is To Large

Jul 31, 2006

SQL 2005 Express - Database Restore size problem
Hi,

I'm trying to restore a SQL Server DB backup from a SQL Server DB server onto my laptop (SQL 2005 Express).

When I execute a RESTORE FILELISTONLY command on the backup file, it seems that the database included is 1 GB, but the log file is 91 GB in size, which exceeds my disk space.

I can restore the data on its own without the log file, but then the database stays in "restoring" mode. I've tried to switch the restore flag off (update sys.databases set state = 0 where name = 'G001'), but I can't seem to do it, even if I try to allow updates via:

sp_configure 'allow updates', 1

GO

RECONFIGURE WITH OVERRIDE

GO



Any ideas how I can restore the database without restoring the enormous log file?

Thanks in advance...
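
A restore always recreates every file at its recorded size, so the log cannot simply be skipped. A minimal sketch of the usual workaround, assuming you have access to the source server: shrink the log there and take a fresh backup. The logical log file name, paths, and target size are placeholders.

-- on the source server
BACKUP LOG G001 TO DISK = 'D:\Backups\G001_log.trn'
DBCC SHRINKFILE (G001_Log, 500)
BACKUP DATABASE G001 TO DISK = 'D:\Backups\G001_small.bak'
-- restoring this new backup on the laptop recreates the log at roughly 500 MB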



View 3 Replies View Related

SQL Server 2005 Large IO And Logical Reads

May 22, 2008

Queries against a table in one of my databases are running very slowly. The IO is very high; below is a printout from the SET STATISTICS IO ON command, run with a common query used on the table:


(4162 row(s) affected)

Table 'WebProxyLog'. Scan count 3, logical reads 873660, physical reads 3493, read-ahead reads 505939, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

I have a clustered unique index and a nonclustered index on the table. I have run SQL Profiler and opened the trace in Database Engine Tuning Advisor; DTA displays 0% improvement suggestions. I have a number of statistics on the table and index, which are all up to date, and fragmentation is less than 1%. I've tried a number of variations on indexes to improve performance, but to no avail. There is only one query which runs on the table, and the nonclustered index created on the table did significantly improve performance; however, the query still runs at around 23 seconds. The query does bring back a large amount of data, but I'm sure there is a way to bring down the IO and logical reads to improve performance.

The table and index scripts are below:




Code Snippet
-- =================== Table and Clustered index ===========================
CREATE TABLE [dbo].[WebProxyLog](
[ClientIP] [bigint] NULL,
[ClientUserName] [nvarchar](514) NULL,
[ClientAgent] [varchar](128) NULL,
[ClientAuthenticate] [smallint] NULL,
[logTime] [datetime] NULL,
[servername] [nvarchar](32) NULL,
[DestHost] [varchar](255) NULL,
[DestHostIP] [bigint] NULL,
[DestHostPort] [int] NULL,
[bytesrecvd] [bigint] NULL,
[bytessent] [bigint] NULL,
[protocol] [varchar](12) NULL,
[transport] [varchar](8) NULL,
[operation] [varchar](24) NULL,
[uri] [varchar](2048) NULL,
[mimetype] [varchar](32) NULL,
[objectsource] [smallint] NULL,
[rule] [nvarchar](128) NULL,
[SrcNetwork] [nvarchar](128) NULL,
[DstNetwork] [nvarchar](128) NULL,
[Action] [smallint] NULL,
[WebProxyLogid] [int] IDENTITY(1,1) NOT NULL,
CONSTRAINT [pk_webproxylog_webproxylogid] PRIMARY KEY CLUSTERED
(
[WebProxyLogid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

-- =================== Nonclustered Index ===========================

CREATE NONCLUSTERED INDEX [dta_ix_WebProxyLog_Kaction_clientusername_logtime_uri_mimetype_webproxylogid] ON [dbo].[WebProxyLog]
(
[Action] ASC
)
INCLUDE ( [ClientUserName],
[logTime],
[uri],
[mimetype],
[WebProxyLogid]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

-- =================== Query which is called regularly on the table ===========================

SELECT [User] = CASE
WHEN LEFT(clientusername,3) = 'domain' THEN RIGHT(clientusername,LEN(clientusername) - 3)
ELSE clientusername
END,
logtime AS [Date],
desthost AS [Site],
uri AS [Actual Site]
FROM webproxylog
WHERE CONVERT(Datetime,CONVERT(VarChar(25),logtime,106),106) BETWEEN '20 apr 2008' AND '14 may 2008'
AND(RIGHT(uri,4) NOT IN('.css','.jpg','.gif','.png','.bmp','.vbs'))
AND (RIGHT(uri,3) NOT IN('.js'))
AND LEFT(mimetype,6) = 'text/h'
AND (uri NOT LIKE '%sometext.local%')
AND (uri NOT LIKE '%sometext.co.uk%')
AND [action] = 9
AND (clientusername IN ('USERNAME'))
ORDER BY logtime ASC;





PS There are 60,078,605 rows in the table

Please help!

Many Thanks
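
The likely culprit is the WHERE clause: wrapping logtime in CONVERT (and uri/mimetype in RIGHT and LEFT) makes those predicates non-sargable, so the index cannot seek and the table gets scanned. A minimal sketch of the main fix, with an index keyed on the equality columns first; the index name is a placeholder, and the remaining uri/mimetype filters can stay as residual predicates on the narrowed row set:

SELECT clientusername, logtime, desthost, uri
FROM dbo.WebProxyLog
WHERE logtime >= '20080420'
  AND logtime < '20080515'   -- exclusive upper bound still covers all of 14 May 2008
  AND [action] = 9
  AND clientusername = 'USERNAME'

CREATE NONCLUSTERED INDEX ix_WebProxyLog_user_action_logtime
ON dbo.WebProxyLog (clientusername, [action], logtime)
INCLUDE (desthost, uri, mimetype)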

View 6 Replies View Related

Not Able To Restore SQL Server Database From A Large Backup File

Aug 12, 2006

Hi everybody,
On executing the RESTORE command of SQL Server to restore from a 78.3 MB backup, the "Server Application Unavailable" error message comes up. The error message in the Application log is as follows: aspnet_wp.exe (PID: 2184) was recycled because memory consumption exceeded 152 MB (60 percent of available RAM).
However, using Query Analyzer I am able to restore the database.
What is the solution to this problem?

View 2 Replies View Related

Read Large Binary Data From Sql Server 2005

Jul 14, 2007

Hi, I've followed a tutorial on how to write and read varbinary(max) data to and from a database. But when I try to read the data, I get an error that the data would be truncated, but only when the varbinary(max) is greater than 8 KB. I've used a system stored procedure (sp_tableoption) to set the table that holds the data to store data outside rows.

To select the data I'm using a stored procedure:

SELECT imageData, MIMEType FROM Pictures WHERE (imageTitle = @imageTitle)

And then using an .aspx page to Response.Write the data:

Using conn As New sql.SqlConnection
    conn.ConnectionString = ConfigurationManager.ConnectionStrings("myConnectionString").ToString
    Dim getLogoCommand As New sql.SqlCommand
    getLogoCommand.CommandType = Data.CommandType.StoredProcedure
    getLogoCommand.CommandText = "GetPicture"
    getLogoCommand.Connection = conn
    Dim imageTitleParameter As New sql.SqlParameter("@imageTitle", Data.SqlDbType.NVarChar, 200)
    imageTitleParameter.Value = Request("imageTitle")
    imageTitleParameter.Direction = Data.ParameterDirection.Input
    getLogoCommand.Parameters.Add(imageTitleParameter)
    conn.Open()
    Using logoReader As sql.SqlDataReader = getLogoCommand.ExecuteReader
        logoReader.Read()
        If logoReader.HasRows = True Then
            Response.Clear()
            Response.ContentType = logoReader("MIMEtype").ToString()
            Response.BinaryWrite(logoReader("imageData"))
        End If
    End Using
    conn.Close()
End Using

Can anyone please help me with this?!

View 2 Replies View Related

How To Read And Store A Large File From SQL Server 2000?

Feb 23, 2006

hi..
I want to store an RMVB file in SQL Server 2000 and read it back, so that I can play the RMVB file on the web. The file is more than 300 MB but less than 1 GB, so the image datatype can hold it.
Now my question is: how can I store and read the RMVB file from SQL Server 2000?
I used SqlInsertCommand.ExecuteNonQuery() in my program, but it is too slow and raises an unknown error.
Thank you for your help.

View 6 Replies View Related






