SqlWb.exe : Application Error- Memory Could Not Be Read Error
Sep 14, 2007
Hi,
I am trying to execute an SSIS package from a client through BIDS, but when I start BIDS I get the following error:
SqlWb.exe : Application error - The instruction at "0X77D...." referenced memory at "0X00000002". The memory could not be "read".
Click on OK to terminate the program.
Click on CANCEL to debug the program.
Please help.
Other information:
I have tried running the package on the server and it executes properly. I really don't know why this is a problem with the SQL Server clients alone. I have also tried Googling around and could not find any resolution. Can anyone point me in the right direction, please?
Hi, I've got a DTS package in SQL Server 2000 SP4 that has several transformation tasks with an Oracle database (10g) as the destination. The package executes successfully when run through the DTS designer, but when run using dtsrun I get the following error at the completion of all tasks in the DTS package. The data inserts into the Oracle database, but I always get this application error:
The Instruction at "0x7c8327f9" referenced memory at "0xffffffff". The memory could not be "read"
Does anyone know how I can fix this? Thanks, Lyn
I have created a few reports using SSRS 2005. I am using an Oracle database in the Data Source to fetch my data. It works fine and shows the report correctly, but after running the report 8 to 10 times it starts giving me a memory error. To get rid of it, I need to recycle (stop and start) Reporting Services from IIS.
The exact error I am getting is:
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
I don't understand the actual problem. Why does it give a memory error only after running a few times? Please let me know if anyone is facing the same problem or knows the solution.
Database 'C:\INETPUB\DEMO.TRUTHSTONE.COM\APP_DATA\ASPNETDB.MDF' cannot be upgraded because it is read-only or has read-only files. Make the database or files writeable, and rerun recovery. Cannot open user default database. Login failed. Login failed for user 'NT AUTHORITY\NETWORK SERVICE'.
Hi, strange one. I have copied my working website to the production server and get this message. I tried copying a new set of database files and reset the server to ensure no applications could be accessing the database files. I still get the same error. It works fine on my Visual Studio VWD Express version.
Mitch. P.S. I am not trying to upgrade the database, just use it!
I'm trying to import data from a Sybase ASE 12.0 database called "OurTestDatabase" into MS SQL Server 2005. I started the SSIS Wizard and indicated "Sybase ASE OLEDB Provider" as the source and SQL Native Client as the target. I'm getting the following error message:
The same data source worked with DTS when we thought we'd convert to MS SQL Server 2000. Is this a bug in SSIS? What can be done? Using ".Net Framework Provider for ODBC" is not a good option because this doesn't allow me to choose any tables from the Sybase source.
Hi, I am getting an error while opening the Database Engine in SQL Server Management Studio. We applied SP2 and restarted the server, but no luck. Error message: Attempted to read or write protected memory. This is often an indication that other memory is corrupt (mscorlib).
When we try to run aggregation or purge queries on some tables we get the following message:
"error [I/O error (bad page ID) detected during read at offset 0x000001ad65a000 in file 'E:\MSSQL2K\Data\Genesys_DataMart\Genesys_Datamart.mdf'. Severity 24, State 2, Procedure 'PWMGENESYSDB1 null', Line 1]"
After this we executed DBCC CHECKDB; I am attaching the output obtained after executing this command. To fix these errors we then executed DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS; I am attaching the output for this as well. Please go through the logs and let me know what the problem could be and how it can be addressed.
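For reference, a minimal sketch of the commands typically involved, assuming the database is named Genesys_Datamart as the file path suggests (REPAIR_ALLOW_DATA_LOSS can discard data, so a full backup first is essential):

    -- Check integrity first and report all errors
    DBCC CHECKDB ('Genesys_Datamart') WITH NO_INFOMSGS, ALL_ERRORMSGS;

    -- Repair requires single-user mode; treat it as a last resort
    ALTER DATABASE Genesys_Datamart SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DBCC CHECKDB ('Genesys_Datamart', REPAIR_ALLOW_DATA_LOSS);
    ALTER DATABASE Genesys_Datamart SET MULTI_USER;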
Hi, I am creating a Service Broker application between two different instances. When I initiate a dialog from the source, my message remains in sys.transmission_queue, but its transmission_status column is empty.
I attached the Profiler to both source and target, including all the Service Broker events.
In my source Profiler I am getting an error like -- Connection attempt failed with error: '10061(No connection could be made because the target machine actively refused it.)' -- with the event Broker:Connection.
And in the target Profiler the error is -- This message could not be delivered because the security context could not be retrieved -- with the event Broker:Message Undeliverable.
I have also checked my port using telnet, both remotely and locally, and it works fine. I am using port number 4001 and I have specified that port in the address of the route.
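For comparison, a minimal sketch of how a route to port 4001 is usually defined, and how a dialog can be opened with encryption turned off to test whether the "security context could not be retrieved" error comes from dialog security (certificates). All names below are placeholders, not taken from the post:

    -- Route on the initiator pointing at the target instance's Service Broker endpoint
    CREATE ROUTE RouteToTarget
    WITH SERVICE_NAME = 'TargetServiceName',
         ADDRESS = 'TCP://TargetServer:4001';

    -- Test dialog without certificate-based dialog security
    DECLARE @handle UNIQUEIDENTIFIER;
    BEGIN DIALOG CONVERSATION @handle
        FROM SERVICE [InitiatorServiceName]
        TO SERVICE 'TargetServiceName'
        ON CONTRACT [ContractName]
        WITH ENCRYPTION = OFF;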
I recently installed SQL Server 2005 Enterprise on a machine running Server 2003. I have successfully configured Reporting Services (see below for a summary of settings):
- Used the defaults for the ReportServer and Report Manager virtual directories
- Windows Service Identity set to a domain user; the domain user is part of the administrator group on the machine and has sysadmin rights to the database
- Web Service Identity set to NT AUTHORITY\NetworkService
When I open http://localhost/reports/, I get the following error:
I have checked a bunch of forums, but with no success. Any advice would be greatly appreciated!
Server Error in '/Reports' Application.
Compilation Error Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.
Compiler Error Message: CS0016: Could not write to output file 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\App_global.asax.th5hkjqv.dll' -- 'The directory name is invalid. '
Source Error:
[No relevant source lines] Source File: Line: 0
Show Detailed Compiler Output:
c:\windows\system32\inetsrv> "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\csc.exe" /t:library /utf8output /R:"C:\WINDOWS\assembly\GAC_32\System.Web\2.0.0.0__b03f5f7f11d50a3a\System.Web.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\a890e9c0\068591f_f54cc701\ReportingServicesFileShareDeliveryProvider.DLL" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Web.Mobile\2.0.0.0__b03f5f7f11d50a3a\System.Web.Mobile.dll" /R:"C:\WINDOWS\assembly\GAC_32\System.Data\2.0.0.0__b77a5c561934e089\System.Data.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Web.Services\2.0.0.0__b03f5f7f11d50a3a\System.Web.Services.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Configuration\2.0.0.0__b03f5f7f11d50a3a\System.Configuration.dll" /R:"C:\WINDOWS\assembly\GAC_32\System.EnterpriseServices\2.0.0.0__b03f5f7f11d50a3a\System.EnterpriseServices.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.IdentityModel\3.0.0.0__b77a5c561934e089\System.IdentityModel.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.ServiceModel\3.0.0.0__b77a5c561934e089\System.ServiceModel.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System\2.0.0.0__b77a5c561934e089\System.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\4a099978\068591f_f54cc701\ReportingServicesEmailDeliveryProvider.DLL" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\mscorlib.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\de5a2332\0958a20_f54cc701\ReportingServicesWebUserInterface.DLL" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Xml\2.0.0.0__b77a5c561934e089\System.Xml.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\498aee86\042473c_93d0c501\ReportingServicesCDOInterop.DLL" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Runtime.Serialization\3.0.0.0__b77a5c561934e089\System.Runtime.Serialization.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System.Drawing\2.0.0.0__b03f5f7f11d50a3a\System.Drawing.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\cd47089b\0f2a80e_f54cc701\Microsoft.ReportingServices.Interfaces.DLL" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\f180608d\0083daf_b16dc701\Microsoft.ReportingServices.Diagnostics.DLL" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\assembly\dl3\23449648\068591f_f54cc701\ReportingServicesNativeClient.DLL" /out:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\App_global.asax.th5hkjqv.dll" /debug- /optimize+ /w:4 /nowarn:1659;1699;1701 "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\App_global.asax.th5hkjqv.0.cs" "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\App_global.asax.th5hkjqv.1.cs"
Microsoft (R) Visual C# 2005 Compiler version 8.00.50727.1433 for Microsoft (R) Windows (R) 2005 Framework version 2.0.50727 Copyright (C) Microsoft Corporation 2001-2005. All rights reserved.
error CS0016: Could not write to output file 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reports\2cbaf422\c4330628\App_global.asax.th5hkjqv.dll' -- 'The directory name is invalid. '
Version Information: Microsoft .NET Framework Version:2.0.50727.1433; ASP.NET Version:2.0.50727.1433
I have been researching this problem on the net. I have everything set up; here is the weird part. I made a test Windows application, and that application connects to my database just fine. I have SQL DEV 2005, and everything is on my laptop. I have websites that I have written that connect fine.
I have an application that gives me this error. The only thing is that I wrote this Windows app on a different machine, with a different SQL 2005 server. I had to move everything onto my laptop so I could finish on site at the client's place. So I grabbed the project, copied it over, and opened it on my laptop. No problems. I created a new database, scripted everything from the other server, and ran those scripts on my laptop to create the tables, stored procedures, etc.
Now, the test application I created on my laptop can connect to the database just fine with this string.
I can also connect to the database that I brought over from the other server. Could there be something with the DataSets, or something like that? Any thoughts?
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed both locally and on server, also.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I have SQL Server 2005 Standard Edition SP1 installed on Windows 2003 Standard Edition, configured for transactional replication (single Publisher, no clustered environment). Replication has been working fine for the past two months; now a Distrib.exe application error is coming up.
Because of this my job (Distributor to Subscriber) is failing. I am attaching the file. Thanks, Sandeep
I'm going nuts with this SQL Server notification thing. I have gone through this article, which tells how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (see the SQL script below), but then I get this error: "A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)"

This is my SQL script:

    use [master]
    GO

    -- Ensuring that Service Broker is enabled
    ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
    GO

    -- Switching to our database
    use [DatabaseName]
    GO

    CREATE SCHEMA schemaname AUTHORIZATION username
    GO

    ALTER USER username WITH DEFAULT_SCHEMA = schemaname
    GO

    /*
     * Creating two new roles. We're not going to set the necessary permissions
     * on the user-accounts, but we're going to set them on these two new roles.
     * At the end of this script, we're simply going to make our two users
     * members of these roles.
     */
    EXEC sp_addrole 'sql_dependency_subscriber'
    EXEC sp_addrole 'sql_dependency_starter'

    -- Permissions needed for [sql_dependency_starter]
    GRANT CREATE PROCEDURE to [sql_dependency_starter]
    GRANT CREATE QUEUE to [sql_dependency_starter]
    GRANT CREATE SERVICE to [sql_dependency_starter]
    GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
    GRANT VIEW DEFINITION TO [sql_dependency_starter]

    -- Permissions needed for [sql_dependency_subscriber]
    GRANT SELECT to [sql_dependency_subscriber]
    GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
    GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
    GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]

    -- Making sure that my users are member of the correct role.
    EXEC sp_addrolemember 'sql_dependency_starter', 'username'
    EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'
This is my first time to use SQL Server with VB, so if I've left out any relevant information, please let me know.
Here's my problem: When I try to execute the RDO AddNew method, I get an error -- "The Resultset is Read Only." When I click on the debug option, the code has stopped at the .addnew line.
When I connect to the SQL database, I use this code:
I am getting error 8645 on a two-processor, 1 GB RAM machine running SQL Server 7.0 Service Pack 1 on Windows NT Service Pack 4. It has a 4 GB database which is used by a VB application running on another server. Initially I left the memory setting on dynamic and got error 8645 when more users tried to access the server. After a lot of complaints I changed the memory configuration and fixed it at 800 MB. The other application running on the server is Seagate Backup Exec, but the problem existed even before this application; I stopped its service just to rule out the backup software. The server is completely dedicated to SQL Server. The number of user connections is set to 400. I see an average of 200 connections every day, but active connections are almost 20. When the error pops up in the SQL Agent display log, users complain about screens freezing up and not being able to do anything. A machine of this size should be able to handle that many users without any glitch. The page file size is 1.5 GB.
After careful examination I made the following changes on SQL Server:
1. Configured it to use "Boost SQL Server priority on Windows NT"
2. Configured it to use "Use NT fibers"
3. Dropped the number of connections from 500 to 400
4. Changed the memory option back to dynamic allocation
After these options were changed there was no problem for a couple of days, and users were happy that not a single outage occurred. But this morning it started acting up again; it does not show any error message in the SQL Server error log, only in the SQL Agent error log. My questions are: What areas do I need to look into to improve performance and avoid these memory errors? Why are these messages appearing in the SQL Agent error log? Should I apply Service Pack 5 on NT and Service Pack 2 on SQL Server? Do I need to increase RAM?
I will really appreciate it if you address this problem as a priority.
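For reference, a minimal sketch of how those memory and connection settings are usually adjusted from T-SQL on SQL Server 7.0; the values shown are only illustrative, not recommendations:

    -- Expose the advanced options first
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE WITH OVERRIDE;

    -- Return to dynamic memory by widening the min/max range (SQL Server 7.0 option names)
    EXEC sp_configure 'min server memory (MB)', 0;
    EXEC sp_configure 'max server memory (MB)', 2147483647;
    EXEC sp_configure 'user connections', 400;
    RECONFIGURE WITH OVERRIDE;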
I use a linked server to get data from Oracle, but sometimes I get the error below:
Server: Msg 7399, Level 16, State 1, Procedure GetDataFromERP, Line 21 OLE DB provider 'OraOLEDB.Oracle' reported an error. 2006-02-09 [OLE/DB provider returned message: ] [OLE/DB provider returned message: ROW-00001: Cannot allocate memory] OLE DB error trace [OLE/DB Provider 'OraOLEDB.Oracle' ICommandText::Execute returned 0x80004005: ].
I think it may be because the data set is very large. How do I solve this problem?
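One common workaround, sketched below with a placeholder linked server name, schema, and key column, is to pull the Oracle data in bounded ranges through OPENQUERY so the provider does not have to buffer the entire result at once:

    -- Placeholder names; repeat with different ranges instead of selecting the whole table in one go
    SELECT *
    FROM OPENQUERY(ORACLE_LINK,
        'SELECT col1, col2 FROM erp_schema.big_table WHERE id BETWEEN 1 AND 100000');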
I'm trying to generate a very large report. My server has 32 GB of memory, so memory shouldn't be an issue. However, when I try to generate the report, I get the following out-of-memory error message in the error log on the server:
The w3wp.exe process maxes out at ~1,250 MB. I imagine I just have to change a setting somewhere to allow this to grow beyond that, but I can't seem to find that setting. What do I have to do?
I have gotten two 614 error messages over the past two weeks. I also have a known memory leak on my production server. After identifying the data on the page, I have gone back to requery that page and have not had a problem. I am unable to run CHECKDB on the whole database because it is too big and I am 24x7. My question/assumption is: the data on this page gets loaded into cache, and because of the memory leaks I am getting glitches like this. Has anybody out there had this type of problem?
From BOL Error 614 is as follows:
Message Text A row on page %Id was accessed that has an illegal length of %d in database `%.*s`.
Explanation: This error occurs when an attempt is made to access a row on a page and the actual length of the row is greater than expected. The error indicates that there is a structural problem on the database page accessed during a read or write operation, and it involves a specific row from that page. This structural problem can occur as a result of the following:
- A hardware problem related to, but not limited to, the hard drive, controller, or the controller's implementation of write caching
- A structural problem detected when loading a database dump; it may indicate an integrity problem with the database dump file
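Since running CHECKDB against the whole database is not practical here, one lighter option is to check just the suspect table; a minimal sketch with a placeholder table name (PHYSICAL_ONLY is available in SQL Server 2000 and later):

    -- Quick physical check of a single table, cheaper on a 24x7 system
    DBCC CHECKTABLE ('dbo.SuspectTable') WITH PHYSICAL_ONLY;

    -- Full logical check of the same table during a quieter window
    DBCC CHECKTABLE ('dbo.SuspectTable') WITH ALL_ERRORMSGS, NO_INFOMSGS;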
My company has a database that is throwing a weird error. We've tried reinstalling both the OS and the SQL instance, and the error still persists. We think this error might have to do with some .NET code we've written to run on the box, but I cannot find anything out on the internet about it. Here is the Enterprise Manager Error Log:
Insufficient memory available.. Error: 17803, Severity: 20, State: 4
Query Memory Manager: Grants=0 Waiting=0 Maximum=97638 Available=97638
Global Memory Objects: Resource=912 Locks=42 SQLCache=67 Replication=2 LockBytes=2 ServerGlobal=20 Xact=12
Dynamic Memory Manager: Stolen=2138 OS Reserved=1048 OS Committed=1026 OS In Use=1022 Query Plan=1777 Optimizer=0 General=1066 Utilities=12 Connection=262
Procedure Cache: TotalProcs=488 TotalPages=1787 InUsePages=542
Buffer Counts: Commited=5168 Target=131072 Hashed=1917 InternalReservation=191 ExternalReservation=0 Min Free=128 Visible=131072
Buffer Distribution: Stolen=351 Free=1113 Procedures=1787 Inram=0 Dirty=599 Kept=0 I/O=0, Latched=23, Other=1295
WARNING: Failed to reserve contiguous memory of Size= 65536.
I can find information if I do a Google search on "Error: 17803, Severity: 20" But as soon as I add "State: 4" to the query I get no results. Also, the articles that I have seen that give the same error messages (but different states) tend to deal with servers that have more than 4GB of memory. This server has ONLY 4GB of memory and in order to try and resolve this issue, we have limited the server to 1GB of physical memory to no avail.
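A couple of diagnostic commands that are often run when "Failed to reserve contiguous memory" appears; they only report state and change nothing on the server:

    -- Snapshot of the memory managers at the time of the error
    DBCC MEMORYSTATUS;

    -- Check whether max server memory is already capped (value shown in MB)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)';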
We have an application running on a SQL cluster (Windows 2003) and SQL 2005 SP2 within its own instance - 12 databases and about 100 GB of data in total. The node this instance is on has 64 GB of RAM, with 16 GB allocated to this instance (only 8 GB allocated to other instances currently).
Now to the problem: there is one process that, when running, produces the error below, and we cannot figure out how to correct it. The process runs 8 times a day and had been running great, and then all of a sudden it stopped running with the memory error. I am in search of any tips to diagnose or correct this issue.
[298] SQLServer Error: 701, There is insufficient system memory to run this query. [SQLSTATE 42000]
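Some read-only checks that may help narrow down where the 16 GB is going when the failing process runs; column names are the SQL Server 2005 DMV names:

    -- Current min/max memory settings for this instance (values in MB)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)';
    EXEC sp_configure 'min server memory (MB)';

    -- Which memory clerks are consuming the most memory right now
    SELECT TOP (10) type, SUM(single_pages_kb + multi_pages_kb) AS total_kb
    FROM sys.dm_os_memory_clerks
    GROUP BY type
    ORDER BY total_kb DESC;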
I am currently encountering an error while testing an SSIS package on the server.
The package runs fine on my laptop, but not in the server.
I would appreciate any of your input, comments, and suggestions.
The package pulls records (150K rows) from a DB2 table and inserts them into another DB2 table (12,754,715 rows).
The server,
Windows Server 2003 Enterprise
SQL server 2005 sp1
8 CPUs
6 GB RAM
Native OLE DB\IBM OLE DB Provider for DB2
My laptop,
Windows 2000 Professional
SQL server 2005 sp1
2 CPUs
2 GB RAM
Native OLE DB\IBM OLE DB Provider for DB2
It fails on the insertion part (see the error message at the bottom). I have been playing with the "DefaultBufferMaxRows" and "DefaultBufferSize" properties, but still no luck so far.
For testing purposes, I even selected only 2 rows from the source table, but it still fails with the same error message. Strangely, it still takes a very long time to process (for just two rows); on my laptop, it only takes a few seconds to finish.
I have really been pulling my hair out trying to figure this out. Any of your help/input is highly appreciated! Thanks!
======================
Error Message
[POLICY [1683]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".
I am reading multiple XML files and I am using a Foreach Loop to get the file names. These XML files are created by another process. If a file is in use, I get the following error.
[Read XML file] Error: The component "Read XML file" was unable to process the XML data. There is an unclosed literal string. Line 1300, position 26.
I know that this error is coming up because the file is still being written by the other process, but I want to skip any file which returns this error; I don't want to read any data from it.
In the Data Flow task I am using the following steps: 1. Reading the file through an XML Source; 2. Inserting the data through a Script Component.
This is my first time trying to import data from an XML file. It seems wonderfully straightforward. I currently have my Data Flow set up correctly. However, when I try to run my package I get a series of errors. One error is a 0x80131500, which is a very generic error. It says it's having a hard time with a 'Url' field in the XML. The size of the field is set to 255, which should take care of any truncation issues. The only issue that I can see with that field is that it contains a query string, which of course contains the special XML character '&' (which should appear escaped as &amp;). When I set the error handling to ignore that error on that column, it imports just fine; however, NONE of the URLs get inserted. So there must be a problem with every URL in the XML. Here are the error messages I received from the Progress section:
[XML Source [1]] Error: The "component "XML Source" (1)" failed because error code 0x80131500 occurred, and the error row disposition on "output column "Url" (59)" specifies failure on error. An error occurred on the specified object of the specified component. [XML Source [1]] Error: The component "XML Source" (1) was unable to process the XML data. Pipeline component has returned HRESULT error code 0xC0209029 from a method call.
---- And then a lot of DTS.Pipeline errors.
I have done a lot of searching for this error in regards to SSIS, but information is scarce. Any and all help would be greatly appreciated. Thank you for your time in this matter.
P.S. Just started using SSIS last week and I love it! The team did an excellent job!
Application popup: oprd.exe - Application Error : The application failed to initialize properly (0xc0000142). Click on OK to terminate the application.
The application log keeps generating this error message and I can't seem to find any information on it. Can anyone shed some light?
Event filter with query "select * from __InstanceModificationEvent within 10 where TargetInstance isa 'Win32_Service'" could not be (re)activated in namespace "//./root/Microsoft/SqlServer/ComputerManagement" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
We are running a newspaper website on Windows 2003, IIS, and MSSQL 2005.
I'm facing the error shown below. I'm not an MSSQL expert, nor an ASP or ASP.NET developer.
I need some help here to pinpoint the issue. Is it an OS / DB / MSSQL / ASP / ASP.NET / developer bug?
The error
Server Error in '/' Application.
Transaction (Process ID 64) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Transaction (Process ID 64) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
Stack Trace:
[SqlException (0x80131904): Transaction (Process ID 64) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.]
System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +857466
System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +735078
System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +188
System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +1838
System.Data.SqlClient.SqlDataReader.HasMoreRows() +150
System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout) +214
System.Data.SqlClient.SqlDataReader.Read() +9
System.Data.SqlClient.SqlCommand.CompleteExecuteScalar(SqlDataReader ds, Boolean returnSqlValue) +39
System.Data.SqlClient.SqlCommand.ExecuteScalar() +148
database.setonline(String pagetitle) +262
_Default.Page_Load(Object sender, EventArgs e) +3831
System.Web.UI.Control.OnLoad(EventArgs e) +99
System.Web.UI.Control.LoadRecursive() +47
System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1061
Version Information: Microsoft .NET Framework Version:2.0.50727.42; ASP.NET Version:2.0.50727.42
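To see which statements and lock resources are actually involved in the deadlock, one option on SQL Server 2005 is the deadlock trace flag; this is a diagnostic step, not a fix, and it is sketched here only as a starting point:

    -- Write deadlock graphs to the SQL Server error log (instance-wide, lost after a restart)
    DBCC TRACEON (1222, -1);

    -- A session that can tolerate being rolled back can also volunteer to be the victim
    SET DEADLOCK_PRIORITY LOW;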
Hello. I'm working with SQL Server 2005 Standard Edition. I have a Java program that loads PDF files into the database. I have a table called T08_entity which, among others, has two IMAGE columns. The first IMAGE column is for the original PDF file; the second one is for the PDF file with modified permissions (printing, saving, etc.), produced using the iText library. The program looks at a disk folder, reads its contents, and inserts the PDF files one by one (along with other fields, like the file name, an ID, etc., but these are varchar or int fields, so no problem with those). When the folder has only small files (smaller than 7-8 MB), it loads them into the database without any problem. But when the folder has bigger files (>10 MB, more or less) I get an OUT OF MEMORY error. I'm using the latest sqljdbc.jar driver (v1.2.2727). My server computer has only 1 GB of RAM, but I've read that this latest driver can load large amounts of binary data using the connection property "responseBuffering=adaptive". Here is a sample of my code (at least the most relevant lines):
This is my connection code:
    // Builds the JDBC URL with adaptive response buffering and server-side cursors enabled
    public String getConnectionUrl() {
        return "jdbc:sqlserver://" + serverName + ":" + portNumber
            + ";databaseName=" + databaseName
            + ";responseBuffering=adaptive;selectMethod=cursor";
    }

    public java.sql.Connection getConnection() {
        try {
            ...
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            con = java.sql.DriverManager.getConnection(getConnectionUrl(), userName, password);
            // Turn off auto-commit so the insert and commit are controlled explicitly
            if (con.getAutoCommit()) {
                con.setAutoCommit(false);
            }
            ...
        } catch (Exception e) {
            System.out.println("etc, etc...");
        }
        return con;
    }
The following is a loop where each loop represents a file in the folder:
...And this is the insertDirectorio procedure which inserts each file; pdffile and pdffilenoperm are the IMAGE columns, and the rest are varchar or int columns:
    public void insertDirectorio(File archivo) {
        ...
        if (archivo.isFile()) {
            pstmt = con.prepareStatement("INSERT INTO temp_carga " +
                "(directory, name, dir_sup, filetype, pfilesize, pdffile, pdffilenoperm)" +
                " values (?,?,?,?,?,?,?)");
        }
        ...
        long tamano = archivo.length();

        // INSERTS ORIGINAL FILE
        int fileLength = Integer.MIN_VALUE;
        is = new FileInputStream(pdffile);
        fileLength = (int) pdffile.length();
        pstmt.setBinaryStream(6, is, fileLength);

        // INSERTS FILE WITHOUT PERMISSIONS (this part of the code is long and irrelevant; it just uses
        // the iText library to modify the PDF file. At the end, I have the file in an output stream,
        // as shown here:)
        ByteArrayInputStream inputnoimp = new ByteArrayInputStream(outnoimp.toByteArray());
        pstmt.setBinaryStream(7, inputnoimp, (int) outnoimp.size());
        } catch (Exception e) {
            err = e.toString();
        }
        }
        pstmt.executeUpdate();
        con.commit();
        pstmt.close();
        this.closeConnection();
        } catch (java.sql.SQLException e) {
            err = e.toString();
        }
    }
Well, as I said, when the program reads the smaller files there's no problem, but when it gets a big file I get the OUT OF MEMORY error. I have another application that reads PDF files ONE AT A TIME, using code very much like this, and it reads big files (>30 MB) with no problems. The problem is with this one. Any help will be appreciated. If you have any questions to clarify the problem, just ask. Thanks in advance. Eric.
Hi, is there any setting in Integration Services that I should have adjusted in order to avoid this message?
Information: 0x4004800C at EXTRACT from MSCRM and AX (From Source to Working Tables for Dimension), DTS.Pipeline: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 124 buffers were considered and 124 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked