TableDiff Out Of Memory Exception On Large Tables.

Sep 20, 2007

Hello,
I hope I am posting this in the right forum.

I am using tableDiff.exe to create a diff SQL script for a very large table (~4 million rows).


After a few minutes, I receive a "System.OutOfMemoryException".

I have 4 GB of RAM on the machine executing the table diff.
The server is 32-bit, so adding RAM is not an option.

I am executing the following command line:





Code Snippet

"TableDiff.exe" -sourceserver "SERVER" -sourcedatabase "SourceDB" -sourcetable "Table1" -destinationserver "SERVER" -destinationdatabase "DestDB" -destinationtable "Table1" -f "C:\TableDiffs\Table1"

I have seen reports of other users running tableDiff against 2 million row tables.

Is there any way to buffer tableDiff so that I do not run out of memory on the server?
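There is no documented switch that buffers or streams the comparison itself, but a couple of options listed in the SQL Server 2005 tablediff documentation may be worth a try (a sketch, not a guaranteed fix): -et/-dt write the differences to a result table instead of building one big fix script, and -q does a quick row-count and schema comparison only. The result-table name below is just an example.

Code Snippet

"TableDiff.exe" -sourceserver "SERVER" -sourcedatabase "SourceDB" -sourcetable "Table1" -destinationserver "SERVER" -destinationdatabase "DestDB" -destinationtable "Table1" -et "Table1_diff" -dt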

Could anything else be causing this error?

Thanks,
Dave

View 3 Replies



SSIS Package Out Of Memory Exception

Aug 23, 2007

I have an SSIS Package that loads data from a log file. Prior to loading the data I need to prepare the file. I run a script that cleans the file. Then I import the flat file into SQL Server.

Log File Management Task
1. Run Unix Log File Task
2. Import the new log file (flat file) into SQL Server

Error
i.Unix.dtsx
Message: The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown.

Is this because the system is running out of memory? The RAM on the server is 4 GB. Below is a sample of the script. The job doesn't always fail; there are times when it executes successfully and other times when it fails.

Script Source Code
-----------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Collections.Generic
Imports System.IO
Imports System.Text
Imports System.Diagnostics
Imports System.Globalization
Imports Microsoft.VisualBasic
Imports System.Text.RegularExpressions
Public Class ScriptMain
'********** Begin Error Log Settings **********
'Dim sSource As String = "i.SSIS.Unix.FileManager"
'Dim sLog As String = "Application"
'Dim sMachine As String = "."
'Dim ELog As New EventLog(sLog, sMachine, sSource)
'********** End Error Log Settings **********

Public Sub Main()
'variables for the unix log file
Dim newFile As String = "D:\iLog\unixlog.txt"
Dim copyFile As String = "\\server16\iLog\unixlog.txt"
'variables for working log files
Dim oldFile As String = "D:\i\temp\unixlog.txt"
Dim difFile As String = "D:\i\temp\unixdiff.txt"
Dim trimdiff As String = "D:\i\temp\unixdifft.txt"
Dim formatTemp As String = "D:\i\temp\unixlog_formatted.txt"
Dim errorFile As String = "D:\i\temp\unixlog_bad.txt"

'delete unixlog.txt copy unixlog.txt
'if the file is on the local server delete it and copy the new file over
'if the file is not present copy the new file over
Try
If File.Exists(newFile) Then
File.Delete(newFile)
File.Copy(copyFile, newFile)
Else
File.Copy(copyFile, newFile)
End If
While Not File.Exists(newFile)
System.Threading.Thread.Sleep(1000)
End While
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'open the old file; read backwards until we reach the carriage
'return and store that "seek" position; now open the new file and
'seek to that stored position. finally, read the rest of the file
'and write that data to the difference file.
' determine position of last line in the old file
Dim lastLine As Long = GetLastLinePosition(oldFile)
' get all data in new file starting at position determined above
Dim fi As New FileInfo(newFile)
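' NOTE: the next line allocates a single byte array covering everything appended to the log
' since the last run; if that delta grows very large, this allocation is a likely source of the
' OutOfMemoryException. Reading and writing in fixed-size chunks would keep memory usage flat.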
Dim buffer(fi.Length - lastLine) As Byte
Dim fs As New FileStream(newFile, FileMode.Open)
Try
fs.Seek(lastLine, SeekOrigin.Begin)
fs.Read(buffer, 0, buffer.Length)
fs.Close()
' write that new data to the difference file
fs = New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None)
fs.Write(buffer, 0, buffer.Length)
fs.Close()
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'remove the partial row from the difference file
Try
TrimFinal(difFile, trimdiff)
'ELog.WriteEntry("TrimFinal.Call.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Call.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
'perform the file formatting
sFormatFile(trimdiff, formatTemp, errorFile)
'
Dts.TaskResult = Dts.Results.Success
End Sub

Function GetLastLinePosition(ByVal fileName As String) As Long
Dim pos As Long = -1
Dim fs As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
Try
fs.Seek(-2, SeekOrigin.End) ' -2 to skip a potential vbcrlf at the end of file
While fs.Position > 0
fs.Seek(-1, SeekOrigin.Current)
If fs.ReadByte = 10 Then
pos = fs.Position
Exit While
Else
fs.Seek(-1, SeekOrigin.Current)
End If
End While
fs.Close()
'ELog.WriteEntry("GetLastLinePosition.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("GetLastLinePosition.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
Return pos
End Function

Sub TrimFinal(ByVal difFile As String, ByVal trimdiff As String)
Dim fi2 As New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Read)
Dim fo2 As New FileStream(trimdiff, FileMode.OpenOrCreate, FileAccess.Write)
Dim sr2 As New StreamReader(fi2)
Dim sw2 As New StreamWriter(fo2)
Dim line2 As String
Try
Do While sr2.Peek <> -1
line2 = sr2.ReadLine()
If (sr2.Peek <> -1) Then
sw2.WriteLine(line2)
End If
Loop
sw2.Flush() : sw2.Close()
sr2.Close()
fi2.Close() : fo2.Close()
'ELog.WriteEntry("TrimFinal.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
End Sub

Sub sFormatFile(ByVal currentFile As String, ByVal tempFile As String, ByVal errorFile As String)
Dim tfp As New Microsoft.VisualBasic.FileIO.TextFieldParser(currentFile)
Dim sw As New System.IO.StreamWriter(tempFile)
Dim swErrorFile As New System.IO.StreamWriter(errorFile)
tfp.TextFieldType = FileIO.FieldType.Delimited
tfp.SetDelimiters(",")
tfp.HasFieldsEnclosedInQuotes = True
tfp.TrimWhiteSpace = True
Dim fields() As String
Try
While Not tfp.EndOfData
Try
fields = tfp.ReadFields()
If fields.Length <> 23 Then
'write bad rows to error-file
swErrorFile.WriteLine(String.Join(",", fields))
Else
If fields(3) = "" And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mm:ss") And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mm:ss") And fields(3) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mm:ss") _
And IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mm:ss") Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
Else
swErrorFile.WriteLine(String.Join(",", fields))
End If
End If
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.TFP.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
End Try
End While
tfp.Close()
sw.Close()
swErrorFile.Close()
File.Delete(currentFile)
File.Move(tempFile, currentFile)
'ELog.WriteEntry("sFormatFile.Success".ToString(), EventLogEntryType.SuccessAudit, 0, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
Finally
GC.Collect()
End Try
End Sub
End Class
-------------------------

Does my script look okay as far as releasing server memory goes?

Thanks.

View 1 Replies View Related

ReportViewer Control Throws OutOfMemory Exception For Large Data

Nov 28, 2007

I have a dataset with 500,000 records and I'm getting the following error with the ReportViewer control for a local report: "An error has occurred during local report processing. An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.

View 1 Replies View Related

Scheduled Job Fails With 'EXCEPTION - Insufficient Memory'

May 31, 2001

A series of export/import jobs are scheduled on a dozen databases sitting on one of our servers, and are run at regular intervals through the day. Some of the jobs are failing with the following error recorded in the 'View Job History..':

EXCEPTION: Insufficient memory for this operation. Process Exit Code 2. The step failed.

Will this be cured by increasing the memory available to SQL Server (it has 512 MB already, half of the total physical RAM)? Also, why are only some jobs failing and others completing? Should I run Performance Monitor when the next schedule runs?

Thanks

Derek

View 1 Replies View Related

Out Of Memory Exception When Running A Package With XML Task

Jun 1, 2007

Hi..



I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT.



The file is about 2 MB on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 MB. That is where my problem is.



I run this on my 2 GB memory workstation, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory exception".



I also ran this package on an 8 GB RAM server, and the same applies.



Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped, but nothing.



Thanks

View 10 Replies View Related

SQL Server 2012 :: Processing Of XML With OPENROWSET Gives Out Of Memory Exception

Jun 20, 2014

I have an XML file of about 25 MB. When I read it with OPENROWSET it gives me 'System.OutOfMemoryException'. The machine I'm running on has 16 GB of RAM and memory is definitely not exhausted. When I run smaller XML files it works fine.

I've read that older versions of SQL Server had this problem and that it was caused by the parser having a limited amount of memory. Is this still the case? Is there a way to change this?
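For reference, a minimal OPENROWSET(BULK ... SINGLE_BLOB) load of the kind described, with a placeholder file path; it can help confirm whether the error comes from the bulk read itself or from the XML processing done afterwards:

-- Placeholder path; loads the whole document as one BLOB and casts it to XML
SELECT CAST(x.BulkColumn AS XML) AS xml_data
FROM OPENROWSET(BULK 'C:\data\sample.xml', SINGLE_BLOB) AS x;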

View 7 Replies View Related

SQL Server Admin 2014 :: In-Memory Previous Transaction Aborted Exception

Sep 21, 2015

I'm working on a large scale project that is currently in production. We have a big process that recently changed to use In-Memory Tables with SQL 2014 for performance efficiency.

The Process uses:

51 In-Memory SQL tables.
50 stored procedures (not native) that load data (INSERT) from about 150 regular tables and In-Memory tables.
300 validations (short stored procedures, not native) selecting from those 50 In-Memory tables (and inserting into an In-Memory table that stores the validation errors, if any exist).

At the end of this process we clean the tables of the data relevant to each process (DELETE FROM ... WHERE).

By the way, no UPDATE STATISTICS is used on the In-Memory tables; when we tested it, it slowed the process down and caused some locks.

We are calling this process from ADO.NET: it runs the load stored procedures first and then the validations, and each SP uses a different SQL connection. In normal use, everything works fine and takes about 1.5 seconds.

Under stress test (6 clients x 100 tasks) for 30 minutes, after several minutes we start to get this SQL exception (1 SQL exception for every 20 tasks):

41301. A previous transaction that the current transaction took a dependency on has aborted, and the current transaction can no longer commit.

Transactions in Memory-Optimized Tables

The Exception is not clear. We are not using BEGIN TRANSACTION in the process. The SQL Exception occurs in different stored procedures each time.
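Error 41301 is one of the conflicts Microsoft documents as retryable for memory-optimized tables, so the usual mitigation is to retry the failed call rather than treat it as fatal. A minimal T-SQL sketch of that idea (the procedure name is a placeholder; the retry could equally live in the ADO.NET layer):

DECLARE @retries INT = 3;
DECLARE @done BIT = 0;

WHILE @done = 0 AND @retries > 0
BEGIN
    BEGIN TRY
        EXEC dbo.usp_RunValidation;   -- placeholder for one of the validation procedures
        SET @done = 1;                -- success, leave the loop
    END TRY
    BEGIN CATCH
        SET @retries = @retries - 1;

        IF ERROR_NUMBER() <> 41301 OR @retries = 0
            THROW;                    -- different error, or retries exhausted: re-raise
    END CATCH;
END;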

View 2 Replies View Related

SQL Server Is Occupying Large Memory Space

Oct 3, 2001

In an intranet application using Windows NT, Apache, Tomcat and SQL Server, the memory space used by SQL Server is drastically increasing and finally the system crashes. Nearly 40 people are accessing the system. The hardware configuration is a P2 processor with 393 MB RAM and 2 GB virtual memory. SQL Server, the web server and the servlet engine are running on the same machine.
Within three hours SQL Server occupies 200 MB of memory, system performance comes down, and finally the system stops the Tomcat servlet engine.
Anybody have any idea on this? We have nearly 1500 JSP pages, 200 bean files and 300 tables in SQL Server.
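One thing worth checking: by default SQL Server keeps growing its buffer cache until it reaches its configured ceiling, so on a box that also runs Apache and Tomcat it usually helps to cap it explicitly. A sketch (the 128 MB value is only an example for a 393 MB machine):

-- Cap SQL Server's memory so the web server and servlet engine keep some RAM (value is an example)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory', 128;
RECONFIGURE;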

View 2 Replies View Related

Asp.net Worker Process Runs Out Of Memory When Using A Large Dataset

Feb 27, 2006

Hi,
I'm running an application on a server which grabs data from a database table on another server using SqlConnection, SqlDataAdapter and DataSet.
The application then updates every row in that DataSet's DataTable and the updates are saved back using DataAdapter. The code is pretty much straightforward code that you would find on MSDN documentation for using DataSets. The table contains a little over a million rows.
When I run the application, I get an error saying the Server Application is not available. Upon looking into the application event log, I get this message.
aspnet_wp.exe was recycled because memory consumption exceeded the 306 MB (60 percent of available RAM)
How do I get round this? I thought DataSets were supposed to handle large datatables comfortably without having memory issues.
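If the per-row change can be expressed in SQL, one way around the worker-process memory ceiling is to skip the DataSet entirely and let the server do a set-based UPDATE. A rough sketch with placeholder table and column names:

-- Placeholder names; assumes the per-row logic can be written as a single expression
UPDATE dbo.Orders
SET StatusCode = CASE WHEN ShippedDate IS NULL THEN 'OPEN' ELSE 'SHIPPED' END;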
-Thanks

View 1 Replies View Related

Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16 GB or 32 GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003 I typically see pagefile usage of no more than 12% for a 2 GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32 GB of physical memory and make the pagefile only 1.5 x 32 GB, I have a 48 GB pagefile. 10% of this is 4.8 GB, which I would hope I never see consumed.

Any thoughts?

Thanks, Dave

View 11 Replies View Related

SQL Server 2008 :: How To Find Statements That Cause Large Memory Paging

Apr 22, 2015

I am monitoring our production server, and noticed that periodically we have spikes of Memory Paging Rate (pages/sec).

How can I find the particular queries/stored procedures that are causing this?
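One place to start is the plan cache: sorting cached statements by physical reads often shows what lines up with the paging spikes (bearing in mind that pages/sec is an OS-level counter, so memory pressure from outside SQL Server can also be the cause). A sketch using the SQL 2005+ DMVs:

-- Top cached statements by physical reads since the plan was cached
SELECT TOP (20)
       qs.total_physical_reads,
       qs.execution_count,
       SUBSTRING(st.text,
                 qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                      WHEN -1 THEN DATALENGTH(st.text)
                      ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_physical_reads DESC;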

View 5 Replies View Related

Visual Studio/BIDS Out Of Memory Errors Opening Large DTSX Files

Jul 26, 2006

I'm currently experiencing major problems with SSIS when opening and editing large .DTSX package files that contain Exec DTS 2000 Tasks which have the package data loaded internally. I have no issues if I point the task to a .DTS file, or to an actual DTS package on a SQL 2000 server - but if I load the package internally then once the underlying .DTSX file gets over around 17 MB or so in size (which doesn't take long when making a few edits to even fairly simple packages now and then), I start to experience major issues with VS/BIDS 2005 crashing randomly when I try to perform any action with the package (open, save etc). Things like OutOfMemory exception errors, followed by the properties of the Exec DTS 2000 task being deleted, and sometimes accompanied by messages about the application not being installed properly.

Again, it's ONLY when the underlying .DTSX file reaches a certain size limit, and only when I've got an Exec DTS 2000 task with the package loaded internally. I've replicated the issue using several different package files on several different machines (even on servers with lots of memory, fwiw).



Can anyone out there help me with this? SSIS - namely SSIS Exec DTS 2000 package tasks - are our lifeblood at my company and this trend of random and serious crashing on large package files is very disturbing to say the least.



thanks,



Wil

View 1 Replies View Related

SQL Server 2008 :: How To Find Which Queries / Processes Causing Large Memory Paging Rate

Mar 30, 2015

Our monitoring tool shows that our production system is periodically experiencing a large paging rate - up to 800 memory pages/sec. How can I find out which particular queries, stored procedures, or processes initiate this?

View 3 Replies View Related

'((System.Exception)($exception)).Message' Threw An Exception Of Type 'System.NotSupportedException'

Jan 16, 2008

Greetings everyone, I am attempting to build my first application using Microsoft's SQL databases. It is a Windows Mobile application, so I am using SQL Server Compact 3.5 with Visual Studio 2008 Beta 2. When I try to insert a new row into one of my tables, the app throws the error message shown in the title of this topic.
'((System.Exception)($exception)).Message' threw an exception of type 'System.NotSupportedException'



My table has 4 columns (I have since changed my FavoriteAccount datatype from bit to integer).
http://i85.photobucket.com/albums/k71/Scionwest/table.jpg

Account type will either be "Checking" or "Savings"; when a new row is added, the user will select what they want from a combo box.

Next is a snap shot of my startup form.
http://i85.photobucket.com/albums/k71/Scionwest/form.jpg



Where it says "Favorite Account: None" in the top panel, I am using a link label. When a user clicks "None" it will go to an account creation wizard and set the first account as their primary/favorite. As more accounts are added the user can select which will be his/her primary/favorite. For now I am just creating a sample account when the label is clicked, in an attempt to get something working. Below is the code used.


private void lnkFavoriteAccount_Click(object sender, EventArgs e)

{

FinancesDataSet.BankAccountRow account = this.financesDataSet.BankAccount.NewBankAccountRow();

account.Name = "MyBank Checking Account";

account.AccountType = "Checking";

account.Balance = Convert.ToDecimal("15.03");

account.FavoriteAccount = 1;//datatype is an integer, I have changed it since I took the screenshot.

financesDataSet.BankAccount.Rows.Add(account);
//The next three lines where added while I was trying to get this to work.
//I don't know if I really need them or not, I receive the error regardless if these are here or not.



this.bankAccountTableAdapter1.Update(financesDataSet);

this.financesDataSet.AcceptChanges();

refreshDatabase();

}


the refreshDatabase() code is here:


private void refreshDatabase()

{

this.bankAccountTableAdapter1.Fill(this.financesDataSet.BankAccount);

//Acquire a count of accounts the user has

int numAccounts = financesDataSet.BankAccount.Count;

//Loop through each account and see which one is the primary.

for (int num = 0; num != numAccounts; num++)

{
//Works ok in frmMain_Load, but when my lnkFavoriteAccount_click calls this, it throws the error.
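//NOTE (assumption): rows saved before FavoriteAccount was given a value hold DBNull, and the typed
//accessor below throws StrongTypingException for DBNull. Guarding with the generated
//IsFavoriteAccountNull() helper that typed DataSets provide avoids the exception.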

if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)

{
//Display the primary account on our home page. User can click the link label & be taken to their account register.

this.lnkFavoriteAccount.Text = this.financesDataSet.BankAccount[num].Name.ToString();

this.lnkFavoriteFunds.Text = this.financesDataSet.BankAccount[num].Balance.ToString();

break;

}

}

}


and my form_load code

private void frmMain_Load(object sender, EventArgs e)

{

refreshDatabase();

}


So, when I click on the lnkFavoriteAccount label, and my new row gets added, the app stops at the following line in my DataSet.Designer

[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]

public byte FavoriteAccount {

get {

try {

return ((byte)(this[this.tableBankAccount.FavoriteAccountColumn]));

}

catch (global::System.InvalidCastException e) {
//Stops at the following line, this error was caused by 'if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)'

throw new global::System.Data.StrongTypingException("The value for column 'FavoriteAccount' in table 'BankAccount' is DBNull.", e);

}

}

set {

this[this.tableBankAccount.FavoriteAccountColumn] = value;

}

}


I have no idea what I am doing wrong; all of the code I used I retrieved from Microsoft's help documentation included with VS2008. I have tried using my TableAdapter.Insert() method and it still failed when it got to

if (this.financesDataSet.BankAccount[num].FavoriteAccount == 1)

in my refreshDatabase() method.

When I look, the data has been added into the database; it's just that when I try to retrieve it now, it bails on me. Am I retrieving the information wrong?

Thanks for any help you guys can offer.

Johnathon

View 1 Replies View Related

Large Tables In SQL 7.0

Jul 12, 2001

We currently have a data warehouse running on SQL 7.0, SP2. One of our primary fact tables now has well over 155 million rows in it. The table is not very wide, as it only contains 17 columns, most of which are defined as integers. The entire database is only 20 GB.

The issue is that the loads from the staging table to this fact table have significantly deteriorated over the last month or so, dropping from over 400 transactions per second to around 85. We drop all the indexes on the fact table before we load the data into it.

Are there issues with manageable table size in SQL 7.0 that we need to be concerned about? And should we consider partitioning the table into several smaller tables and joining them with a "union all" view?
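For what it's worth, the "union all" view mentioned above is the classic partitioned-view pattern; a minimal sketch with hypothetical table names (on SQL 2000 and later, a CHECK constraint on the partitioning column in each member table lets the optimizer skip partitions it doesn't need):

-- Hypothetical member tables, e.g. one per year
CREATE VIEW dbo.FactSales
AS
SELECT * FROM dbo.FactSales_2000
UNION ALL
SELECT * FROM dbo.FactSales_2001;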

I really need to get this performance issue resolved, as our IT support vendor is pushing us to port the data warehouse to UDB because they tell us that SQL server is not scalable enough to handle this volume of data.

Thanks for any help you can provide.

George M. Parker

View 6 Replies View Related

Large Tables

Aug 10, 2000

Hi,

How can I partition the large tables so that the inserts and updates I am doing on them take less time?

I want to know how I can partition large tables, and if I do that, how the performance is going to be improved.

Thanks.

View 1 Replies View Related

Large Tables

Mar 13, 2001

How can I find the largest 5 or 10 tables in a database?
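One quick approach on SQL 7.0/2000 is to read sysindexes directly (reserved is counted in 8 KB pages, and the figures are approximate); a sketch:

-- Ten largest tables by reserved space; indid 0 = heap, 1 = clustered index
SELECT TOP 10
       OBJECT_NAME(id) AS table_name,
       rows AS approx_rows,
       reserved * 8 AS reserved_kb
FROM sysindexes
WHERE indid IN (0, 1)
ORDER BY reserved DESC;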

Thanks in advance
Chan

View 2 Replies View Related

Searching In Large Tables........

Mar 7, 2007

Hi there, I am having a problem related to SQL Server. I have a table called ZipCodes with around 80,000 rows, and the size of the table is around 100 MB. The table is now on the web server. My problem is that when I fire a query that needs to go through the whole table, the estimated time to execute the query comes to 13 seconds, while the cursor threshold is set to 7 seconds (and I can't change that), so SQL Server cancels the query. I need some methodology/technique through which I can search large tables with minimum calculations in minimum time. (Any ideas?)
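Before fighting the timeout itself, it is worth checking whether the column being searched is indexed at all; on an 80,000-row table an index on the filtered column usually turns the full scan into a quick seek. A sketch with a guessed column name:

-- Hypothetical column; index whichever column the slow queries filter on
CREATE INDEX IX_ZipCodes_City ON ZipCodes (City);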

View 3 Replies View Related

COMPRESSING LARGE TABLES

Mar 19, 2001

Is it possible to compress the large tables in the database,

like the COMPRESS or ARCHIVE options we use to reduce the size of files stored on an operating system.

I know there is a difference between a file stored on disk and a table created in the database, but currently I am facing space problems wherein I have to manage my database within the space available, so please advise me whether the option is available in SQL Server 6.5 or 7.
I will be happy if I get the solution immediately, as currently I am facing this problem and waiting for your reply.
Thank you
Amol

View 1 Replies View Related

Restore Two Tables From Large DB Into A New DB

Feb 12, 2008

I am fairly new to SQL, so please forgive me if my question is a bit elementary. I need to pull two individual tables out of a massive DB into a new DB for testing.
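If both databases can sit on the same server for the copy, a plain SELECT INTO (or INSERT ... SELECT) across databases is often the simplest route; a sketch with placeholder names (it copies data only, so keys and indexes have to be recreated on the new table):

-- Placeholder database/table names; copies data only
SELECT *
INTO TestDB.dbo.Table1
FROM MassiveDB.dbo.Table1;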

Thanks for the help.

View 2 Replies View Related

Manipulating Large Tables

Feb 15, 2008

I'm in the midst of a long file conversion job. Today I found one of the tables (converted from CSV) to be 6.7 million records. My SQL script, which I use to reconfigure the weird original date format into something the rest of the planet uses, times out due to the size.

Does anyone know of a file utility to automagically split SQL Server 2005 tables for later recombining, once my scripts have successfully completed their task on the smaller tables?
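As an alternative to physically splitting the table, the reformatting can be run in batches so that no single statement runs long enough to hit the timeout; a rough SQL Server 2005 sketch (table name, column names and the date conversion are placeholders):

-- Update 50,000 rows per pass until nothing is left to fix
WHILE 1 = 1
BEGIN
    UPDATE TOP (50000) dbo.ImportedData
    SET FixedDate = CONVERT(datetime, RawDate, 103)
    WHERE FixedDate IS NULL;

    IF @@ROWCOUNT = 0 BREAK;
END;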

View 7 Replies View Related

Partioning Large Tables

Nov 14, 2007

I am making a warehouse management system. The system will contain a lot of data, but only a small portion of it will be accessed frequently. Most of the data will be accessed only seldom, but the customer wants to keep all historic data (just in case they should need it sometime). I have figured I need to partition the tables somehow to keep what is fresh in one place and historical data in another. What is the best way to do this? I am thinking about making historical tables. For example, I can have a table named PickList and another table named PickListHistorical. When a picklist is processed/complete I can move it over to the PickListHistorical table, but when the users need to search for a specific picklist I have to look in both tables. I can of course create a view for this to make it transparent. SQL Server 2005 introduced some automatic partitioning. Would it be better to use that than to create my own historical tables? If so, can you please tell me how I do it?
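Both routes work; the built-in SQL Server 2005 table partitioning (an Enterprise Edition feature) keeps a single table name while physically separating old rows. A minimal sketch of a date-based partition function and scheme, with placeholder boundary dates, filegroup and columns:

-- Rows are split by CreatedDate; ALL TO ([PRIMARY]) is only for the sketch
CREATE PARTITION FUNCTION pfPickListDate (datetime)
AS RANGE RIGHT FOR VALUES ('2006-01-01', '2007-01-01');

CREATE PARTITION SCHEME psPickListDate
AS PARTITION pfPickListDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.PickList
(
    PickListId  INT      NOT NULL,
    CreatedDate DATETIME NOT NULL,
    StatusCode  INT      NOT NULL
) ON psPickListDate (CreatedDate);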

Thank you!

View 11 Replies View Related

Comparing Large Tables

Oct 19, 2007



I've successfully created SSIS packages where I compare two tables in different databases on different servers. This works well for comparing hundreds of thousands of records quickly, but it becomes a huge performance problem when the tables I'm comparing each contain tens of millions of records.


One database is on a SQL 2005 box and the other DB is SQL 7.0 so the lookup component fails for this type of SQL Server. I've been implementing merge joins and conditional components to do my standard table comparisons.

Is there another way to implement this process or maybe partition it somehow to take pieces of the table at a time and compare them? I'm open to ideas.
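One way to take pieces at a time is to compare coarse checksums per key range first and only diff the ranges that disagree; a hedged sketch with placeholder names (BINARY_CHECKSUM/CHECKSUM_AGG need SQL 2000 or later, so the SQL 7.0 side would need a different hash or a plain row count per range):

-- Run on each side and compare: only buckets whose checksum or row count differ need a detailed diff
SELECT KeyCol / 100000 AS bucket,
       COUNT(*) AS row_count,
       CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS bucket_checksum
FROM dbo.BigTable
GROUP BY KeyCol / 100000;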

View 11 Replies View Related

Tablediff Bug?

Sep 12, 2006

Hey,

I noticed some strange behaviour when using tablediff. Because I don't want to replicate all columns, I create two views on selected attributes and compare them.

I have two tables:
create table t1 (timestamp timestamp, b int PRIMARY KEY, c varchar(30))
create table t2 (b int PRIMARY KEY, c varchar(30))

and two views:
create view vt1 as select b, c from t1
create view vt2 as select b, c from t2

Compare the views with tablediff and get a failure:
... vt1 and vt2 ... have different schemas and cannot be compared.

Now I have just found out that it works when table t2 also has a timestamp column.

drop table t2
create table t2 (b int PRIMARY KEY, c varchar(30), ts timestamp)

Compare the views again with tablediff and get:
... vt1 and vt2 ... are identical.

What do you think: do I always need the same datatypes in the source and target tables I want to compare, even if I create a subselection via views?

Cheers,

View 5 Replies View Related

What Is Most Efficient Way To Use DataAdapters With Large SQL Tables

Jan 9, 2008

I'm using DataAdapters with my SQL database, with the intention that all the SELECT, UPDATE, INSERT and DELETE commands will be automatically generated. One table is huge, so I'm wondering: is it more efficient to "SELECT TOP(1) * FROM hugetable" instead of "SELECT * FROM hugetable" in order to facilitate the generation of commands? I hope this isn't too confusing. Thanks, Geoff

View 2 Replies View Related

4 Seperate Tables Or One Large Table?

May 10, 2008

I have 4 tables with the respective amount of records
1) 6755
2) 2021
3) 2021
4) 355

They all have the same columns. However, they need to be separate, or at least when I query them. I'll be accessing this database via the web. I was at first afraid that a large database would cause major slowdown when accessing the db, so I broke it up into 4 tables. If I combined all 4 tables into one large table and just had a column that differentiated the 4, how significant would the change in speed be when accessing the table? It's not a big deal to keep them separate, it's just that when I have to add or remove a column from one table I have to remove it from all the tables. Furthermore, I'm using a module from DevExpress (don't know if anyone has heard of it), but when you use a gridview it loads up the entire table even though you're paging (which I think is retarded), so for that reason I was afraid it would slow up my access to the db. Any thoughts?

View 2 Replies View Related

Slow Inserts Into Large Tables

Nov 29, 2000

We are inserting into a table which includes an identity primary key column. When the table gets really large (i.e. 1.5 million records), the performance of the inserts degrades.

I noticed that when we insert into the table an exclusive lock on the table is obtained. Do inserts into tables with identities always lock the table?

Given the table size is unavoidable, does anyone have a suggestion to improve the performance?

Thanks,
Matt

View 6 Replies View Related

Temp Tables Vs. Large Table

Aug 4, 2005

I have a few hundred users, maybe a dozen or two active at any given time, accessing the same database via ASP. The database has many tables, one being a very large orders table with a few million records, against which I have created a view. A view only because I need to allow the user to filter quite extensively against the results. The users typically only need to view records for the last 30 days, and results for each user might be five thousand records or less.

My question is this. Would I be better off writing each user's resultset to a temp table for that user's session, letting the user's filtering and sorting go against that temp table, and increasing my hardware requirements to accommodate that, possibly to the point of creating a database cluster? Or would I be better off leaving it as is, where each user uses the same view?

FYI... each user may need visibility into only a handful of fields, but overall the view must maintain many fields.

Any thoughts on this would be greatly appreciated. Thanks in advance.

Dave

View 2 Replies View Related

Large Number Of Tables And Performance

Jan 25, 2008

Hi gurus, I'm creating a web application where I will have a large number of tables (between 10k and 20k). This is done for the sake of scalability, as tables will be moved to different database servers as the application grows, and also for performance (smaller indexes). I'm worried, though, about how having a large number of tables could affect the performance of SQL Server, as the application will start on a single database server. I tried to find some resources on this on the internet but couldn't find any.

I would really appreciate if you can give me some advice and if you have any good links that would be great...

View 10 Replies View Related

Query Optimization Help For Very Large Tables

Nov 1, 2007

I have the following table structure:
tableA (~85,000 rows) primary key = [colA,colB]
tableB (~850,000 rows) primary key = [colA,colC]
tableC (~120,000,000 rows) primary key = [colA,colB,colC]

IMPORTANT: colC is DATETIME

For a SET of rows in tableA (about 50,000) I need to pull the MOST RECENT (given a date) corresponding values from tables B and C. The only way I can think of doing this is the following:

SELECT tableA.colA
,(SELECT TOP 1 colX FROM tableB WHERE colA = tableA.colA AND colC <= @INPUTDATE ORDER BY colC desc)
,(SELECT TOP 1 colY FROM tableB WHERE colA = tableA.colA AND colC <= @INPUTDATE ORDER BY colC desc)
,... --some more columns from tableB
,(SELECT TOP 1 colX FROM tableC WHERE colA = tableA.colA AND colB = tableA.colB AND colC <= @INPUTDATE ORDER BY colC desc)
,(SELECT TOP 1 colY FROM tableC WHERE colA = tableA.colA AND colB = tableA.colB AND colC <= @INPUTDATE ORDER BY colC desc)
,... --some more columns from tableC
FROM tableA
WHERE tableA.colX = 'some criteria'


Is there any other way anyone can suggest? Unfortunately, because tableC is so large, the disk IO (I think) causes this query to take over an hour. (If I had monster RAM and super fast disks this wouldn't be as big an issue, but that's not an option right now.)
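On SQL Server 2005 or later, one variation worth testing is OUTER APPLY, which probes each table once per row for all the columns you need instead of once per column; a sketch using the same names as above:

SELECT a.colA,
       b.colX, b.colY,   -- more columns from tableB as needed
       c.colX, c.colY    -- more columns from tableC as needed
FROM tableA AS a
OUTER APPLY (SELECT TOP 1 colX, colY
             FROM tableB
             WHERE colA = a.colA AND colC <= @INPUTDATE
             ORDER BY colC DESC) AS b
OUTER APPLY (SELECT TOP 1 colX, colY
             FROM tableC
             WHERE colA = a.colA AND colB = a.colB AND colC <= @INPUTDATE
             ORDER BY colC DESC) AS c
WHERE a.colX = 'some criteria';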

Thanks in advance!

View 7 Replies View Related

Large Number Of Tables And Performance

Jan 25, 2008

Hi gurus, I'm creating a web application where I will have a large number of tables (between 10k and 20k). This is done for the sake of scalability, as tables will be moved to different database servers as the application grows, and also for performance (smaller indexes). I'm worried, though, about how having a large number of tables could affect the performance of SQL Server, as the application will start on a single database server. I tried to find some resources on this on the internet but couldn't find any.

I would really appreciate if you can give me some advice and if you have any good links that would be great...

Waleed Eissa
http://www.waleedeissa.com

View 9 Replies View Related

Performance Issues With Large Tables

Dec 5, 2007

Hi,

I have a table with over 61 million records that has a clustered index on an identity column (primary key). Simple count queries are taking minutes to execute on this table (e.g. select count(1) from table1). I have checked the statistics on the primary key, which showed me a histogram with the 39-millionth record as the RANGE_HI_KEY. I updated the statistics on this column and tried re-querying, but it still took at least 5 minutes to give me the count of records in the table. Also, there were no users using the table when I queried. Inserts into this table were working fine. I have other tables in my database with 41 million records having no such issues. Can anyone point me to the problem areas in such scenarios?
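If an exact number is not strictly required, the row counts SQL Server already keeps in metadata come back instantly; a sketch (needs SQL 2005 or later):

-- Approximate row count from partition metadata; no table scan involved
SELECT SUM(p.row_count) AS approx_rows
FROM sys.dm_db_partition_stats AS p
WHERE p.object_id = OBJECT_ID('dbo.table1')
  AND p.index_id IN (0, 1);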


Thanks,
Harish

View 6 Replies View Related

The Script Threw An Exception: Exception Of Type 'System.OutOfMemoryException' Was Thrown.

Jan 31, 2007

Hi,

I got an strange problem with one of my packages.

When running the package in Visual Studio it runs properly, but if I let this package run as part of a SQL Server Agent job, I get the message "The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown." in my log and the package ends up with an error.

Both times it is exactly the same package on the same server, so I don't know how to debug it, or even whether there is anything I need to debug.

Regards,

Jan

View 2 Replies View Related






