Database Load Fails With Message "Insufficient Size" (ver 6.5)
Jul 14, 2000
Hi All,
I am trying to restore a DB from my production server to my standby server. It gives me the message "Msg 3105, Level 16, State 1
Data on dump will not fit into current database. Need 6500 Mbyte database."
The production server DB size is 5000 MB and I have increased the size of the standby DB to 6500 MB, but I still get the same message...
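From what I've read, the size in Msg 3105 is the total allocated size of the dumped database (data and log together), so the standby database has to be at least that big overall before the load will go through. In case it matters, here's roughly how I've been checking and growing the standby database; the device name and the 200 MB increment below are just examples from my setup:

Code Snippet
-- how much space (data and log) the standby database currently has
EXEC sp_helpdb 'StandbyDB'

-- grow the standby database a little beyond what the dump says it needs
-- (device name and size are illustrative)
ALTER DATABASE StandbyDB ON standby_data = 200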
This thing is driving me crazy. I have a simple flat .txt file I'm trying to import into a DB, and this USED to work perfectly with SQL Server 2000. With 2005, I get "product level insufficient."
I've read about that error and I've looked around. It seems like the solution is to actually connect to the Windows box that the server itself is running on (where SSIS is installed) and import there? But that's a pain: it means I'd have to Remote Desktop into the server every time I want to import a text file. Surely there's some other way to do this?
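One workaround I'm considering is skipping the wizard entirely and doing a plain BULK INSERT from Management Studio. Something roughly like this; the table name, file path, and delimiters are just placeholders for my file, and the path has to be one the SQL Server service itself can reach:

Code Snippet
BULK INSERT dbo.MyImportTable
FROM 'C:\imports\myfile.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 1);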
A series of export/import jobs are scheduled on a dozen databases sitting on one of our servers, and are run at regular intervals through the day. Some of the jobs are failing with the following error recorded in the 'View Job History..':
EXCEPTION: Insufficient memory for this operation. Process Exit Code 2. The step failed.
Will this be cured by increasing the memory available to SQL Server (it has 512 MB already, half of the total physical RAM)? Also, why are only some jobs failing while others complete? Should I run Performance Monitor when the next scheduled run occurs?
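Before the next run I'm going to double-check how memory is actually configured on the instance; as I understand it, the export/import packages run in their own process outside SQL Server's buffer pool, so the shortage may not be inside SQL Server at all. The check I have in mind:

Code Snippet
-- show current memory settings (values are in MB; 0 / 2147483647 means dynamic)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)';
EXEC sp_configure 'max server memory (MB)';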
I have a SSIS package which I had created on my machine, running Windows XP with SQL2K5 Development Edition. If I run the package through the BI studio or by clicking the dtsx file, it runs without any issues.
When the same package is deployed onto another machine (SBS 2003 with SQL2K5 Workgroup Edition), the package errors with the above message when the dtsx file is clicked and run. However, if the source files are opened up and run from within the BI studio, it all runs OK.
I've run through the SQL setup (through Add/Remove Programs) and Integration Services is installed, however the SSIS service is not on the machine. I've read, however, that the service is not required to run the package, so what gives? Plus, this doesn't explain why running it through BI studio is fine but not when it is run by itself.
The data flow task that the package fails on contains 2 SQL sources, 1 SQL destination, 3 sorts, 1 multicast and a couple of merge joins. Are some of these pieces not available in the Workgroup Edition? This would still not explain why it can run in the BI studio on that machine...
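To rule out an edition or service-pack mismatch, the first thing I'm going to do is check what the SBS box actually reports for its SQL 2005 instance:

Code Snippet
SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductLevel') AS ProductLevel,
       SERVERPROPERTY('ProductVersion') AS ProductVersion;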
I'm using the import wizard to create a new table from a flat file source. The table gets created but no data gets copied. What's wrong? Here's the report:
I have read previous threads on this, still not working.
New installation of SQL Server 2005 on a Windows Server 2003 box. I have installed SP1 on both the server & the client, do I have to re-create the package?
Some time ago I had to edit a DTS package in SSIS. To do that I installed the SQL 2005 DTS components. It worked fine at the time. But now I'm trying to open a DTS package in SSIS after installing SP2 and it doesn't work. Every time I try to open a DTS package I get the following error.
Error HRESULT E_FAIL has been returned from a call to a COM component. (Microsoft Visual Studio)
------------------------------ Program Location:
at DTS.CDTSLegacyDesignerClass.ShowDesigner() at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.GeneralView.btnEdit_Click(Object sender, EventArgs args)
I don't understand why this is happening. One thing did change: I used to be on Windows Server 2003 and now I'm on XP SP2. I don't know if that is important, but I thought I'd mention it.
MSI (s) (80!8C) [11:25:26:067]: Transforming table Error.
MSI (s) (80!8C) [11:25:26:067]: Note: 1: 2262 2: Error 3: -2147287038
MSI (s) (80!8C) [11:25:26:067]: Product: Microsoft SQL Server 2005 Tools (64-bit) -- Error 29549. Failed to install and configure assemblies C:\Program Files (x86)\Microsoft SQL Server\90\NotificationServices\9.0.242\Bin\microsoft.sqlserver.notificationservices.dll in the COM+ catalog. Error: -2147024894 Error message: The system cannot find the file specified. Error description: Could not load file or assembly 'System.EnterpriseServices.Wrapper.dll' or one of its dependencies. The system cannot find the file specified.
Error 29549. Failed to install and configure assemblies C:\Program Files (x86)\Microsoft SQL Server\90\NotificationServices\9.0.242\Bin\microsoft.sqlserver.notificationservices.dll in the COM+ catalog. Error: -2147024894 Error message: The system cannot find the file specified. Error description: Could not load file or assembly 'System.EnterpriseServices.Wrapper.dll' or one of its dependencies. The system cannot find the file specified.
<EndFunc Name='LaunchFunction' Return='-2147024894' GetLastError='0'>
MSI (s) (80:70) [11:25:26:333]: Executing op: Header(Signature=1397708873,Version=301,Timestamp=914840316,LangId=1033,Platform=589824,ScriptType=2,ScriptMajorVersion=21,ScriptMinorVersion=4,ScriptAttributes=1)
MSI (s) (80:70) [11:25:26:333]: Executing op: DialogInfo(Type=0,Argument=1033)
MSI (s) (80:70) [11:25:26:333]: Executing op: DialogInfo(Type=1,Argument=Microsoft SQL Server 2005 Tools (64-bit))
MSI (s) (80:70) [11:25:26:333]: Executing op: RollbackInfo(,RollbackAction=Rollback,RollbackDescription=Rolling back action:,RollbackTemplate=[1],CleanupAction=RollbackCleanup,CleanupDescription=Removing backup files,CleanupTemplate=File: [1])
MSI (s) (80:70) [11:25:26:333]: Executing op: RegisterBackupFile(File=C:\Config.Msi\6d67d1.rbf)
I seem to be stuck on some Script Task problems in SSIS on an x64 environment which already has SP2 installed.
If I'm running the package in 32-bit mode, I'm continuously getting the following error "Warning: Precompiled script failed to load. Attempting to recompile. For more information, see the Microsoft Knowledge Base article, KB931846 (http://go.microsoft.com/fwlink/?LinkId=81885)."
on one of the package's scripts -- the given link advises installing SP2... what a useless hint.
And if I switch the project to run in Run64BitRuntime=True , another script tasks complains "ScriptTask_78888010d24c477ea7c8d69f4f5568a5 is not a valid Win32 application. (Exception from HRESULT: 0x800700C1)
-- whatever that might mean...
Maybe I should add that the packages had been running in 32-bit mode for the last 12 months, without any problems like that.
These effects have only been occurring since we put SP2 onto the server...
Any comments on this topic? Similar experiences? Maybe yet some solutions too?
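The only workaround I've come up with so far is forcing the packages through the 32-bit DTExec from a CmdExec job step, so the scripts load the same way they did before SP2. Just a sketch of what I mean; the job name and both paths are placeholders for our setup, and the step is added to an existing Agent job:

Code Snippet
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'MyNightlyLoad',
    @step_name = N'Run package via 32-bit DTExec',
    @subsystem = N'CmdExec',
    @command = N'"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /File "D:\Packages\MyPackage.dtsx"';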
I've got a merge replication set up between boxes. They're on separate sites both behind an ADSL Nat modem router. The publisher connects to the subscriber via a port forward / Nat translation at the router. This replication set up has been running for some months now. This morning I got this failure message from the merge agent on the publisher.
"the specified remote server name may not be the network name of the remote server or the remote server is unreachable due to network problems. The step failed"
Currently the Enterprise Mgr at the publisher can see the subscriber as can query analyzer running on the publisher.
The host name of the subscriber is the same name used in the server registration at the publisher. The registration uses a Client Network Utility alias to resolve the name.
If I run a ping command at the publisher using the subscriber's name, I get replies (the resolution of that name is done via a hosts file entry).
Both machines are Win2k Server boxes running SQL Server 2000 Standard.
Any ideas why the agent can't see the subscriber despite the fact that its usual communications channel is working just fine?
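Next I'm planning to compare what the subscriber calls itself with how the publisher has it registered, in case the alias and the actual network name have drifted apart. The first query runs at the subscriber, the second at the publisher:

Code Snippet
-- at the subscriber: the name SQL Server thinks it has
SELECT @@SERVERNAME;

-- at the publisher: how remote servers are registered (name vs. network name)
SELECT srvname, srvnetname FROM master.dbo.sysservers;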
Since applying July 2007 Microsoft patches it was noted that we were not able to open the editor for the Script Task component in an SSIS package. We received an error dialog with text similar to 'Engine failed with unknown error. (Microsoft.VisualBasic.Vsa.Dt).' All other solutions provided in the forum did not resolve our issue. It was noted somewhere else that the OLE32.dll seemed to have been affected. We reregistered this file using the command 'regsvr32.exe ole32.dll' and suddenly we were able to get into the editor. I don't know why this fixed the issue, but it did for us in our environment. Good luck!
I have a SQL Server scheduled for replication. This works fine when the replication is to a database on the same machine. If it has to be done to another server, the replication fails with the following error message.
08001 [Microsoft][ODBC SQL Server Driver][dbnmpntw]ConnectionOpen (CreateFile())
The following error occurred but did not provide any additional info.
[DTS.Pipeline] Error: The PrimeOutput method on component "Pgrs - tr_hist" (14596) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
For this package I am trying to use DataDirect's ODBC driver for a Progress OpenEdge 9.1E database to move data to SQL Server 2005 in order to make use of SQL's BI suite. I have consulted DataDirect and they confirm that the driver is functioning properly; it works perfectly from another application. They suggest that the problem is in the DataReader's or the connection manager's implementation of the driver.
Please tell me what is the cause of this or at least how to troubleshoot this problem.
I am having an interesting SSIS problem where the package fails to load with the following error message:
Code: 0xC0010018 Source: {BE86A659-AB44-403A-9C89-3524821879E0} Description: Error loading value "<DTS:PropertyExpression xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:Name="SqlStatementSource">"Select dbo.fnGetLastOpenExtract('" + @[User::in_ExtractName] + "') as eh_ID"</DTS:PropertyExpression>" from node "DTS:PropertyExpression".
This very same package runs on our test server, but fails to even load on UAT server.
SSIS packages are the same on both Test and UAT servers (I compared not just dates and sizes - they are literally the same, byte-for-byte). DTExec version is 9.00.3042.00 on both servers. HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\90\DTS\Setup\Version = 9.2.3042.00 on both machines.
This started to happen when the UAT machine was upgraded to Service Pack 2 of SQL Server 2005. Please note that the UAT server only runs SSIS packages and does not have SQL 2005 database engine installed. There is, however, an older installation of SQL Server 2000 on UAT machine (I am not sure if Test machine has it - will check tomorrow).
Any help is greatly appreciated.
Thanks,
Alex
Here is the complete output from DTExec:
Code Snippet
D:AM5Jobs>"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /File "D:ExtractsGBG_ExtractSSISImport_ExtractStartComplete_03.dtsx" /Checkp OFF /Cons MT /Set Package.Variables[User::in_ExtractName].Properties[Value];SagittaMapping_Replication /Set Package.Variables[User::in_StartComplete].Properties[Value];Start
Microsoft (R) SQL Server Execute Package Utility
Version 9.00.3042.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 12:33:21 PM
Error: 2007-07-17 12:34:52.98 Code: 0xC0010018 Source: {BE86A659-AB44-403A-9C89-3524821879E0} Description: Error loading value "<DTS:PropertyExpression xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:Name="SqlStatementSource">"Select dbo.fnGetLastOpenExtract('" + @[User::in_ExtractName] + "') as eh_ID"</DTS:PropertyExpression>" from node "DTS:PropertyExpression". End Error
Error: 2007-07-17 12:34:52.98 Code: 0xC0010018 Source: {BE86A659-AB44-403A-9C89-3524821879E0} Description: Error loading a task. The contact information for the task is "Execute SQL Task; Microsoft Corporation; Microsoft SQL Server v9; © 2004 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1". This happens when loading a task fails. End Error
Error: 2007-07-17 12:34:52.98 Code: 0xC0010021 Source: Description: Element "{1c66489c-2a3f-4c8a-b9e7-0161875427a2}" does not exist in collection "Executables". End Error
Error: 2007-07-17 12:34:52.98 Code: 0xC0010018 Source: Description: Error loading value "<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts" IDREF="{1c66489c-2a3f-4c8a-b9e7-0161875427a2}" DTS:IsFrom="-1"/>" from node "DTS:Executable". End Error
Error: 2007-07-17 12:34:52.98 Code: 0xC0010018 Source: Description: Error loading value "<DTS:PrecedenceConstraint xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="Value">0</DTS:Property><DTS:Property DTS:Name="EvalOp">2</DTS:Property><DTS:Property DTS:Name="LogicalAnd">-1</DTS:Property><DTS:Property DTS:Name="Expression"></" from node "DTS:PrecedenceConstraint". End Error
Could not load package "D:ExtractsGBG_ExtractSSISImport_ExtractStartComplete_03.dtsx" because of error 0xC0010014. Description: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails. Source:
Started: 12:33:21 PM
Finished: 12:34:53 PM
Elapsed: 91.938 seconds
I'm seeing some strange behavior from the OLE DB Destination when using the "fast load" access mode and setting the "Maximum insert commit size".
When I do not set the "Rows per batch" or the "Maximum insert commit size", the package I'm working with inserts 123,070 rows using a single "insert bulk" statement. The data seems to flow through the pipeline until it gets to the OLE DB Destination and then I see a short pause. I'm assuming the pause is from the "insert bulk" statement handling all of the rows at once.
When I set the "Rows per batch" option but leave the "Maximum insert commit size" alone, I generally see the same behavior -- a single "insert bulk" statement that handles all 123,070. In this case, however, the "insert bulk" statement has a "ROWS_PER_BATCH" option appended to the statement that matches the "Rows per batch" setting. This makes sense. I'm assuming the "insert bulk" then "batches" the rows into multiple insert statements (although I'm unsure of how to confirm this). This version of the "insert bulk" statement appears to run in about the same time as the case above.
When I set the "Maximum insert commit size" option and leave the "Rows per batch" statement alone, I see multiple "insert bulk" statements being executed, each handling the lower of either the value I specify for the "Maximum insert commit size" or the number of rows in a single buffer flowing through the pipeline. In my testing, the number of rows in a buffer was 9,681. So, if I set the "Maximum insert commit size" to 5,000, I see two "insert bulk" statements for each buffer that flows into the OLE DB Destination (one handling 5,000 rows and one handling 4,681 rows). If I set the "Maximum insert commit size" to 10,000, I see a single "insert bulk" statement for each buffer that flows into the OLE DB Destination (handling 9,681 rows).
Now the problem. When I set the "Maximum insert commit size" as described in the last case above, I see LONG pauses between buffers being handled by the OLE DB Destination. For example, I might see one buffer of data flow through (and be handled by one or more "insert bulk" statements based on the "Maximum insert commit size" setting), then see a 2-3 minute pause before the next buffer of data is handled (with its one or more "insert bulk" statements being executed). Then I might see a 4-5 minute pause before the next buffer of data is handled. The pause between the buffers being passed through the OLE DB Destination (and handled via the "insert bulk" statements) is sometimes shorter, sometimes longer.
Using Profiler, I don't see any other activity going on within the database or within SQL Server itself that would explain the pauses between the buffers being handled by the OLE DB Destination and the resulting "insert bulk" statements...
Can anyone explain what is going on here? Is setting the "Maximum insert commit size" a bad idea? What are the differences between it and the "Rows per batch" setting and what are the recommended uses of these two options to try to improve the performance of the insert (particularly when handling millions of rows)?
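Not sure it will explain anything, but during the next long pause I plan to check what the destination's session is actually waiting on (I can pick out the SPID from the "insert bulk" statements in Profiler):

Code Snippet
SELECT session_id, status, command, wait_type, wait_time, last_wait_type, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50;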
We have changed the Network Domain name of a SQL server and restarted before changing any of the settings in SQL server. When we now try to start the SQL server service manager we get the following error
Your SQL server is either corrupt or has been tampered with. Unknown package id please rerun setup
Any suggestions on how to get the SQL Server back online without having to take the server down and change the network domain name back?
This is my first posting, and I have a big problem which I need to resolve immediately:
I have a 38 GB database on my SQL Server which I want to full-backup. Until now the backup was created on the same volume where the database resides, but that volume is now out of disk space. So we connected an external 500 GB USB disk drive to the SQL Server and tried to back up there.
But our first and all consecutive backup attempts failed with error SQL-DMO (ODBC SQLState: 42000). It works fine for the first few minutes, and we can see the backup file continuously growing, but then it abruptly aborts with the above error message.
Can anyone figure out why? Any help is appreciated.
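The next thing I'm going to try is running the backup directly from Query Analyzer instead of through SQL-DMO, to get the real server-side error rather than the generic SQLState 42000 wrapper, and then look in the SQL Server error log around the time it aborts. The database name and target path below are just ours, adjust as needed:

Code Snippet
BACKUP DATABASE MyBigDB
TO DISK = 'E:\SQLBackups\MyBigDB_Full.bak'
WITH INIT, STATS = 5;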
I'm converting a replication script from SQL 2000 to SQL 2005.
I am getting an error with push merge with no way to figure out what is wrong.
I've configured replication on a single XP server in SQL 2005 RTM version.
I have a push merge set up between A and B and between B and C. All 3 databases are 9.0 compatibility.
The snapshot and merge jobs for the A to B run fine with no errors, and merge replicates ok.
The snapshot for B to C fails with this message:
Message
2006-03-09 17:30:35.94 ---------------------------------------------
2006-03-09 17:30:35.94 -BcpBatchSize 100000
2006-03-09 17:30:35.94 -HistoryVerboseLevel 2
2006-03-09 17:30:35.94 -LoginTimeout 15
2006-03-09 17:30:35.94 -QueryTimeout 1800
2006-03-09 17:30:35.94 ---------------------------------------------
2006-03-09 17:30:35.95 Connecting to Publisher 'MyInstance'
2006-03-09 17:30:35.97 Publisher database compatibility level is set to 90.
2006-03-09 17:30:35.97 Retrieving publication and article information from the publisher database 'MyInstance.MyDB'
2006-03-09 17:30:36.22 [0%] The replication agent had encountered an exception.
2006-03-09 17:30:36.22 Source: Replication
2006-03-09 17:30:36.22 Exception Type: Microsoft.SqlServer.Replication.ReplicationAgentSqlException
2006-03-09 17:30:36.22 Exception Message: Data is Null. This method or property cannot be called on Null values.
2006-03-09 17:30:36.22 Message Code: 52006
2006-03-09 17:30:36.22
Love that exception message: "Data is Null" - very helpful to someone who is clairvoyant perhaps. I checked the snapshot bcp files. The tables being merged all have data.
If you have any ideas on how to fix this, I'd be most grateful. As it is after 6pm I probably won't read this again until morning. Thanks for any suggestions.
I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.
The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc present in the middle.
Previously I used SQL 2000 DTS to load the files in, and it was just a Column Transformation with the Col001 from the text file loading straight to my table column. After the load, if I select len(col) it gives me 750 for all rows.
Once I started to migrate this to SSIS, I set up a Data Flow Task with the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the data flow task it copies only 181 or 231 characters out of the 750 required. It seems to stop where it finds the spaces and skips the rest.
I specified row delimiters of CR and LF. I checked the file in UltraEdit and there were no special characters in the file that would cause the problem.
Any suggestions how I can get it to load the full data?
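To see exactly where it is cutting off, I'm checking the loaded rows with something like the query below (table and column names are mine). Comparing LEN and DATALENGTH should also show whether the rest of the string is really missing or just trailing spaces, since LEN ignores trailing blanks:

Code Snippet
SELECT TOP 10 LEN(MyCol) AS char_len, DATALENGTH(MyCol) AS byte_len, MyCol
FROM dbo.MyTable;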
I have a simple SSIS package that reads a flat file and copies it into a SQL Server table.
When the flat file is on the C drive I have no problem running this package from SQL Server Agent, but as soon as I update the path to a network location the package only works when I run it manually, and fails when it is executed via the SQL Server Agent job.
The error says "cannot open the datafile", while the datafile location is valid.
Is this a limitation of SQL Server Agent, that only local files are allowed to be processed?
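My guess is that the Agent job runs under a service account that can't see the share, so I'm wondering whether I need a credential and proxy so the job step runs under an account that can. Something along these lines is what I'd try (the Windows account and all the names are placeholders), then pick that proxy in the job step's "Run as" box:

Code Snippet
-- a credential holding a Windows account that has rights to the network share
CREATE CREDENTIAL SsisFileShareCred WITH IDENTITY = 'DOMAIN\SsisUser', SECRET = 'password';

-- a SQL Agent proxy based on that credential
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'SsisFileShareProxy',
     @credential_name = N'SsisFileShareCred', @enabled = 1;

-- check which subsystem_id is the SSIS package subsystem on this instance,
-- then grant the proxy to it (11 on my box)
SELECT subsystem_id, subsystem FROM msdb.dbo.syssubsystems;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SsisFileShareProxy', @subsystem_id = 11;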
Hello, I am trying to bulk copy some data from a text file to SQL Server. In my case, the table in SQL Server is simple. It has two columns: Symbol <nchar(5), Primary Key> and Company <nvarchar(50)>. Each row in the text file is Symbol and Company separated by a "#". Below is the code of my bulk copy:

public static void StartImport(string sourceFile)
{
    SqlBulkCopy bulkCopy = new SqlBulkCopy(connString_local, SqlBulkCopyOptions.TableLock);
    bulkCopy.DestinationTableName = "dbo.NasdaqSymbols";
    DataTable dt = CreateSymbolDataTable(sourceFile);
    bulkCopy.WriteToServer(dt);
}

private static DataTable CreateSymbolDataTable(string filePath)
{
    DataTable dt = new DataTable();
    DataColumn dc;
    DataRow dr;

    dc = new DataColumn();
    dc.DataType = Type.GetType("System.String");
    dc.ColumnName = "Symbol";
    dc.Unique = true;
    dt.Columns.Add(dc);

    dc = new DataColumn();
    dc.DataType = Type.GetType("System.String");
    dc.ColumnName = "Company";
    dc.Unique = false;
    dt.Columns.Add(dc);

    StreamReader sr = new StreamReader(filePath);
    string input;
    while ((input = sr.ReadLine()) != null)
    {
        string[] s = input.Split(new string[] { "#" }, StringSplitOptions.None);
        dr = dt.NewRow();
        dr["Symbol"] = s[0].Trim();
        dr["Company"] = s[1].Trim();
        dt.Rows.Add(dr);
    }
    sr.Close();
    return dt;
}

The problem is, I got the following exception when I tried to call my StartImport method (thrown from SqlBulkCopy.WriteToServer):

System.InvalidOperationException: The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

It turned out that the problem does not seem to be String to nvarchar, because when I use a source text file which contains only about a dozen rows, it works! I have no idea why SqlBulkCopy.WriteToServer works fine on a small set of data. Or is there something I overlooked? Thank you for your time and help.
Gary
I have developed an SSIS package that includes a Script Task on a 32-bit machine. The PrecompileScriptIntoBinaryCode property is set to True. After I build the package, the .dtsx file includes a <BinaryItem> element for that Task. Package runs fine on the dev machine, both in BIDS and as SQL Server Agent job.
When I deploy the package to a 64-bit server, it runs fine when I execute the package ad hoc from SQL Server Management Studio. However, when I schedule the package for execution as a SQL Server Agent job, the package fails with the message: "the script files failed to load."
I have reviewed posts on this error from late 2005, but the solutions don't work in this case. Specifically:
1. The Precompile property is already set to True.
2. I have already verified that the script was compiled.
Any further suggestions?
Hi, I use this script to show me the size of each table and sum up all the table sizes.
SELECT X.[name],
       REPLACE(CONVERT(varchar, CONVERT(money, X.[rows]), 1), '.00', '') AS [rows],
       REPLACE(CONVERT(varchar, CONVERT(money, X.[reserved]), 1), '.00', '') AS [reserved],
       REPLACE(CONVERT(varchar, CONVERT(money, X.[data]), 1), '.00', '') AS [data],
       REPLACE(CONVERT(varchar, CONVERT(money, X.[index_size]), 1), '.00', '') AS [index_size],
       REPLACE(CONVERT(varchar, CONVERT(money, X.[unused]), 1), '.00', '') AS [unused]
FROM (SELECT CAST(object_name(id) AS varchar(50)) AS [name],
             SUM(CASE WHEN indid < 2 THEN CONVERT(bigint, [rows]) END) AS [rows],
             SUM(CONVERT(bigint, reserved)) * 8 AS reserved,
             SUM(CONVERT(bigint, dpages)) * 8 AS data,
             SUM(CONVERT(bigint, used) - CONVERT(bigint, dpages)) * 8 AS index_size,
             SUM(CONVERT(bigint, reserved) - CONVERT(bigint, used)) * 8 AS unused
      FROM sysindexes WITH (NOLOCK)
      WHERE sysindexes.indid IN (0, 1, 255)
        AND sysindexes.id > 100
        AND object_name(sysindexes.id) <> 'dtproperties'
      GROUP BY sysindexes.id WITH ROLLUP) AS X
ORDER BY X.[name]
The problem is that the sum of all the tables does not match the size of a full database backup. For example, when I run this query against my database I see a total of 111,899 KB, which is about 111 MB, but when I do a full backup of that database the backup file is 1.5 GB. Why is that, and where does this size come from?
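One thing I want to rule out: the query above only counts user tables (id > 100), and as far as I know a full backup also contains system tables, index pages and the active part of the transaction log. So I'm going to compare against what the database as a whole reports:

Code Snippet
-- total database size, unallocated space, and data/index usage for the current database
EXEC sp_spaceused;

-- log file size and how much of it is in use, for every database on the server
DBCC SQLPERF(LOGSPACE);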
I am trying to resize a database's log file initial size from 500 MB to 2 MB. I'm using:
ALTER DATABASE <DBNAME> MODIFY FILE ( NAME = <DBLOGFILENAME>, SIZE = 2 )
And I'm getting "MODIFY FILE failed. Specified size is less than current size." I tried going into the database properties and setting the log file to 2 MB, but it doesn't keep the changes.
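I've since read that MODIFY FILE can only specify a size larger than the current one (which would explain the error), and that shrinking a file is done with DBCC SHRINKFILE instead, so this is what I'm going to try next; the logical log file name comes from sp_helpfile, and the target size is in MB:

Code Snippet
USE <DBNAME>;
-- find the logical name of the log file
EXEC sp_helpfile;

-- shrink the log file down to roughly 2 MB
DBCC SHRINKFILE (<DBLOGFILENAME>, 2);

From what I understand, how far it actually shrinks depends on where the active part of the log currently sits.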
Hi, I am using exec sp_helpdb and dbcc sqlperf(logspace) to get the database size and log size. Does this give the correct database size and log size, or is there another way to get them from Query Analyzer?
I would appreciate it if somebody could take a moment and help me here. I have a production database backing up every night at 10:00 PM. I have a development server sitting alongside it, and I want this development server to load the production backup copy every night. The problem I'm encountering is that when there are users connected to my development server, the load fails saying "Database is in use".
Is there a command to kill all the users in one shot so that a task can then load the database? If so, please let me know how to do this.
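What I'm thinking of trying before the restore is something like the following in the job (the database name and backup path are placeholders for ours); as I understand it, SET SINGLE_USER WITH ROLLBACK IMMEDIATE kicks off the existing connections so the restore can take the database over:

Code Snippet
ALTER DATABASE DevDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE DATABASE DevDB
FROM DISK = '\\ProdServer\Backups\ProdDB_Full.bak'
WITH REPLACE;

ALTER DATABASE DevDB SET MULTI_USER;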
Does anyone have experience loading a big database? My SQL Server 2000 instance is running on Windows 2000. The database is 4 GB, and the biggest table has 12 million records. Thanks a lot.