We have built two test apps for sending and receiving files across the network reliably, using SQL Express as the database backend. The apps seem to work fine under light load. However, during stress testing we always get the following exception:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
During the stress test, both the sender and the receiver run on the same machine. The sender creates file fragments, stores them in the sender database, and then sends them out over the network. File fragments are deleted from the sender database when the sender receives an acknowledgement from the receiver. On the receiver side, file fragments are stored in the receiver database as they come in from the network. The corresponding file fragments are deleted from the receiver database when a complete file has been received.
There is a maximum of about 1500 updates and 1500 deletes per second on the sender database. On the receiver side, the maximum is about 300 updates and 300 deletes per second. Our goal is to send 30 GB of data (it should run for about 10 hours). As mentioned, we have never had a successful complete test run; a "timeout" exception is always thrown from the sender app (when it tries to end a transaction). It can happen as early as 1.5 hours after we start the test. Note that although we are sending 30 GB of data, at any point in time the database shouldn't be too big (it should be well within the 4 GB limit) because we delete file fragments relatively soon.
Next we changed the "query wait" setting (on the Advanced page of the server properties in Management Studio) from the default of -1 to a very large number, and then we had a successful run sending 30 GB of data.
- First of all, are we not doing this properly in terms of dealing with SQL Express? Is SQL Express able to handle long-running, heavy-load transactions for hours?
- We also noticed that even before we got the timeout exception, the memory usage of sqlservr.exe kept growing. Maybe it doesn't have a chance to clean up internally. If the app hammers SQL Express for hours, I wonder how it handles fragmentation? I assume it needs some sort of de-fragmentation, otherwise performance will degrade significantly...
- The "query wait" setting seems to play an important role here. Are there any guidelines on how to pick a reasonable value? Or should we pick a relatively small number and retry in our app when we get timeout exceptions (see the sketch after this list)?
- Is it possible that we are running into some SQL Express resource limit? Any idea how we can tell, other than watching the VM size of sqlservr.exe?
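For illustration, here is a rough C# sketch of the retry idea raised above: catch the client-side timeout (SqlException.Number == -2), back off briefly, and run the batch again. The helper name and the DeleteAcknowledgedFragments callback are invented for the example; this is a sketch, not the app's actual code.

using System;
using System.Data.SqlClient;
using System.Threading;

static class RetryHelper
{
    // Runs one unit of database work, retrying when the client-side timeout is hit.
    public static void ExecuteWithRetry(Action work, int maxAttempts)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                work();
                return;
            }
            catch (SqlException ex)
            {
                // -2 is the SqlClient error number for "Timeout expired".
                if (ex.Number != -2 || attempt >= maxAttempts)
                    throw;
                // Back off a little so SQL Express can catch up before the next attempt.
                Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
            }
        }
    }
}

// Example usage from the sender, with a hypothetical delete batch:
// RetryHelper.ExecuteWithRetry(() => DeleteAcknowledgedFragments(), 3);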
Any help or suggestions would be greatly appreciated!
Hi, I'm running a CLR stored procedure from my web application through table adapters, as follows: res = BLL.contractRateAdviceAdapter.AutoGenCRA() (BLL being the business logic layer that hooks into the DAL containing the table adapters). The AutoGen stored procedure runs fine when executed directly from within Management Studio, but times out after 30 seconds when run from my application. It's quite a complex stored procedure and will often take longer than 30 seconds to complete. The stored procedure contains a number of queries and updates which all run as a single transaction. The transaction is defined as follows:
----------------------------------------------------------------------------------------------------------------------
options.IsolationLevel = Transactions.IsolationLevel.ReadUncommitted
options.Timeout = New TimeSpan(1, 0, 0)
Using scope As New TransactionScope(TransactionScopeOption.Required, options)
    'Once we've opened this connection, we need to pass it through to just about every
    'function so it can be used throughout. Opening and closing the same connection
    'doesn't seem to work within a single transaction.
    Using conn As New SqlConnection("Context Connection=true")
        conn.Open()
        ProcessEffectedCRAs(dtTableInfo, arDateList, conn)
        scope.Complete()
    End Using
End Using
----------------------------------------------------------------------------------------------------------------------
As I said, the code encompassed within this transaction performs a number of database table operations, using the one connection. Each of these operations uses its own instance of SqlCommand. For example:
----------------------------------------------------------------------------------------------------------------------
Dim dt As DataTable
Dim strSQL As String
Dim cmd As New SqlCommand
cmd.Connection = conn
cmd.CommandType = CommandType.Text
cmd.CommandTimeout = 0
Dim rdr As SqlDataReader
strSQL = "SELECT * FROM " & Table
cmd.CommandText = strSQL
rdr = cmd.ExecuteReader
SqlContext.Pipe.Send(rdr)
rdr.Close()
----------------------------------------------------------------------------------------------------------------------
Each instance of SqlCommand throughout the stored procedure specifies cmd.CommandTimeout = 0, which is supposed to mean unlimited. The fact that the stored procedure succeeds when run directly from Management Studio indicates to me that the stored procedure itself is fine, and I also know from output messages that there are no issues with the database connection. I've set the ASP.NET configuration properties in IIS accordingly. Are there any other settings that I need to change? Can I set a timeout property when I'm calling the stored procedure in the first place? Any advice would be appreciated.
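One thing worth checking here (offered as a sketch, not a confirmed fix): the 30-second limit is most likely the default SqlCommand.CommandTimeout on the command that the table adapter uses to call the stored procedure from the web app, which is separate from the CommandTimeout = 0 set on the commands inside the CLR procedure. Generated table adapters keep their commands in a protected CommandCollection, so the usual workaround is a partial class that exposes a setter. The example below is C# (the VB equivalent has the same shape), and the namespace and adapter class name are assumptions; they must match the generated DAL code.

namespace MyDal.DataSetTableAdapters   // assumed: namespace of the generated adapters
{
    public partial class ContractRateAdviceTableAdapter   // assumed: generated adapter's class name
    {
        // CommandCollection is protected on the generated adapter, so this
        // has to live in a partial class of the adapter itself.
        public void SetCommandTimeout(int seconds)
        {
            foreach (System.Data.SqlClient.SqlCommand cmd in this.CommandCollection)
                cmd.CommandTimeout = seconds;
        }
    }
}

// Then, before the call in the BLL:
// BLL.contractRateAdviceAdapter.SetCommandTimeout(3600)
// res = BLL.contractRateAdviceAdapter.AutoGenCRA()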
When I execute a long running procedure, I get timeout errors when other users try to execute other procedures with UPDATE or INSERT statements.
I suspect that the other procedures are trying to execute DML statements on tables that are locked by the long running procedure.
I have a shared trigger on all my tables that creates and updates records in the tables AuditLogDetails and AuditLogParent to keep a log of modifications. I suspect that AuditLogDetails and AuditLogParent are locked by the long-running procedure.
How can I change the locking behavior of the long-running procedure to fix the timeout errors that I get?
The long-running procedure is shown below.
ALTER PROCEDURE [dbo].[spPostPresenceToHistory2]
@PostDate DateTime,
@Department Int,
@Division Int,
@Testing Bit = 0,
@XDoc xml OUTPUT,
@XDoc2 xml OUTPUT,
@ModifierID varchar(20),
@Comment varchar(200)
AS
BEGIN
BEGIN TRANSACTION
DECLARE @PostCount Int,@PreCount Int,@DiffCount Int
I have a new server with 32 GB of RAM installed, and I have user databases on this server. I am using SQL Server 2000 Enterprise Edition, and the platform is Windows 2003 Advanced Server, which supports up to 128 GB of memory.
sp_configure 'awe enabled' is set to 1 and at OS level, AWE is enabled as well.
max server memory (MB) is 2147483647
I was doing some stress testing on this server, but memory usage doesn't go beyond 180 MB. Can someone suggest a test for physical RAM?
How can I make sure that the application will make full use of the available physical memory?
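As a starting point, one way to see how much memory SQL Server 2000 itself thinks it is using is the 'Total Server Memory (KB)' counter exposed through master.dbo.sysperfinfo; Task Manager's figure for sqlservr.exe does not include AWE-mapped pages, so it can sit near 180 MB even when the buffer pool is much larger. A small C# sketch follows; the server name in the connection string is a placeholder, and the SELECT can just as well be pasted into Query Analyzer.

using System;
using System.Data.SqlClient;

class CheckSqlMemory
{
    static void Main()
    {
        // Placeholder connection string; point it at the 32 GB server.
        const string connStr = "Data Source=MYSERVER;Initial Catalog=master;Integrated Security=SSPI";
        const string sql =
            "SELECT cntr_value FROM master.dbo.sysperfinfo " +
            "WHERE object_name LIKE '%Memory Manager%' " +
            "AND counter_name = 'Total Server Memory (KB)'";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            long totalKb = Convert.ToInt64(cmd.ExecuteScalar());
            Console.WriteLine("Total Server Memory: {0} MB", totalKb / 1024);
        }
    }
}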
I have been asked to perform a performance stress test on a SQL server with new hardware that we are going to be receiving.
How have some of you performed your stress analysis against new or existing hardware?
This hardware that I am going to receive will have to be configured within a high-availability environment. I want to take this opportunity to really put a beat down on this server.
I have a Windows 2003 server running SQL 2005. The server has 32 GB of memory and I have enabled AWE in SQL. I have also configured the min and max SQL memory as 1 GB and 28 GB, respectively. However, this server currently has very low activity, so I'm not sure whether my AWE-related changes worked. The SQLSERVR.EXE process takes up about 100 MB of memory. Are there any tools or scripts that I can use to memory-stress SQL Server and confirm that AWE is really in effect?
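To push memory usage up on a box with so little activity, the buffer pool has to be given something to cache. A rough C# load sketch follows: it scans a large table from a few threads so pages get pulled into memory. The table name and connection string are placeholders; while it runs, watch the 'Total Server Memory (KB)' counter (sys.dm_os_performance_counters in 2005) rather than Task Manager, since AWE-mapped memory is not charged to the sqlservr.exe working set.

using System;
using System.Data.SqlClient;
using System.Threading;

class BufferPoolWarmup
{
    static void Main()
    {
        // Placeholder connection string and table name.
        const string connStr = "Data Source=MYSERVER;Initial Catalog=MyBigDb;Integrated Security=SSPI";

        for (int i = 0; i < 4; i++)   // a few concurrent scanners
        {
            new Thread(() =>
            {
                using (SqlConnection conn = new SqlConnection(connStr))
                using (SqlCommand cmd = new SqlCommand(
                    "SELECT COUNT_BIG(*) FROM dbo.BigTable WITH (NOLOCK)", conn))
                {
                    cmd.CommandTimeout = 0;   // let the scan run as long as it needs
                    conn.Open();
                    cmd.ExecuteScalar();
                }
            }).Start();
        }
    }
}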
If I start a long-running query on a background thread, is there a way to abort it so that it does not continue running on SQL Server?
The query would be running on SQL Server 2005 from a Windows Forms application using the BackgroundWorker component, so the query would have been started from the BackgroundWorker's DoWork event using ADO.NET. If the user clicks an abort button in the UI, I would want the query to die so that it does not continue to use SQL Server resources.
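One approach (a sketch, with invented names) is to keep a reference to the SqlCommand the DoWork handler is executing and call SqlCommand.Cancel() from the abort button. Cancel() sends an attention request to SQL Server, which stops executing the batch rather than letting it run to completion.

using System.Data.SqlClient;

public partial class MainForm
{
    // Set on the worker thread, cancelled from the UI thread.
    private SqlCommand _runningCommand;

    private void backgroundWorker1_DoWork(object sender, System.ComponentModel.DoWorkEventArgs e)
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))     // placeholder
        using (SqlCommand cmd = new SqlCommand("EXEC dbo.LongRunningProc", conn)) // hypothetical proc
        {
            cmd.CommandTimeout = 0;
            _runningCommand = cmd;
            conn.Open();
            try
            {
                cmd.ExecuteNonQuery();
            }
            catch (SqlException)
            {
                // A cancelled command surfaces here ("Operation cancelled by user");
                // swallow it if the abort was requested.
            }
            finally
            {
                _runningCommand = null;
            }
        }
    }

    private void abortButton_Click(object sender, System.EventArgs e)
    {
        SqlCommand cmd = _runningCommand;
        if (cmd != null)
            cmd.Cancel();   // attention signal; the query stops consuming server resources
    }
}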
Hello! I am looking for someone who has solved this problem that millions of people seem to have. EVERYONE seems to have this problem.

I'm creating a DataSet and populating it with a call to a stored proc. It's a complex stored proc with the end result being an insert into a temp table; then, still inside the stored proc, I do a select from the temp table. I get the following SqlException on the following line:

DataAdapterName.Fill(DataSetName, "TableName")

The error is: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

My connection string looks like this:

<add key="cnITDevWinUser" value="Data Source=server; Integrated Security=SSPI; Initial Catalog=dbname; pooling=false; connection reset=false; connection lifetime=5; min pool size=1; max pool size=10; connection timeout=120" />

I have admin rights on that db. I have set my Command.Timeout to 500.

If I run this same code in a Windows application, it works fine. If I use a DataReader with the same stored proc, it works fine. If I run this same code on a simple select (hello world), it also works fine. If I run this stored proc in Query Analyzer, it works fine and is done within 6 seconds. If I run this on a different machine, it produces the same result.

I am using SQL 2000 with VB.NET in VS 2003. I have looked everywhere for the answer and can't find it anywhere. PLEASE SOMEONE HELP.

Regards, Stas K. (a.k.a. Sorcerdon)
I created a new test database on my database server using a daily backup of the live database. I did a structure and data compare and it is identical. From looking at the permissions, it looks identical too. The problem is that when I run an update proc, the database connection times out. I've changed the connection string to use the sa login and it still times out. I've also tested it by changing the database name to the live db, and it works fine then, so I must be missing something. I've also tried running "EXEC sp_change_users_login 'auto_fix', 'sa'" to see if it would help, but nothing. The select statements seem to work, though. Thanks for any help in advance! Ryan
Fellow .NET'ers, I have a stored procedure that I know takes a bit of time to complete. I have searched the Internet looking for ways to extend my timeout period for an ASP.NET 2.0 page. I still only get 30 seconds ([SqlException (0x80131904): Timeout Expired]). I have tried:
a) Server.ScriptTimeout = 90; (in the Page_Load)
b) Connection object setting: Connect Timeout=90
Some have also recommended a SQL command property, but I can't find it on the new v2 SqlDataSource object. I'm sure this is a common need; how do you get more time for your procedures to complete? Please advise. Thanks
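The 30 seconds here is almost certainly the SqlCommand.CommandTimeout on the command the SqlDataSource builds, not Server.ScriptTimeout or the connection's Connect Timeout. SqlDataSource doesn't expose it as a property, but it can be set in the Selecting (or Updating/Inserting) event. A minimal C# code-behind sketch, assuming the control is named SqlDataSource1 and its OnSelecting attribute is wired to this handler:

protected void SqlDataSource1_Selecting(object sender,
    System.Web.UI.WebControls.SqlDataSourceSelectingEventArgs e)
{
    // Default SqlCommand.CommandTimeout is 30 seconds; this is what is expiring.
    e.Command.CommandTimeout = 120;
}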
I've moved this to this forum to see if I can get an answer; there seem to be a lot of other people having the same problem, but no real answer. Please, does anyone really understand this problem? I've been searching for months and am at my wits' end.
The problem: every time I load my MDF database, when it reaches the code line "me.Inventorytableadapter.fill...." I get a timeout expired error message. Does anyone know how to correct this problem? I keep reading about changing the SqlCommand execution timeout, but I have no idea how that is done. If I changed my database to an Access database, would that resolve my problem? Any ideas will be appreciated.
I've been getting this error a lot lately when trying to connect, and I cannot find a particular pattern. It happens with the two databases I'm working with. I tried disabling the AutoClose option and I still get it, though not as much.
Also, I'm getting tons of login failed errors in the Event Viewer. I don't know how to debug those.
This is my code to connect to SQL Server:
SqlConnection con = new SqlConnection("Data Source=OIT;Initial Catalog=big_db;User ID=sa; Password=");
SqlDataAdapter cmd = new SqlDataAdapter("select * from myDB", con);
SqlCommand sqlCmd = new SqlCommand();
DataTable dt = new DataTable();
cmd.Fill(dt); // It throws an exception
When the myDB table has a lot of data, it throws an exception like this: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." I have enabled TCP/IP for SQL Server, but it still throws the SqlException. What can I do? Help me please!!!
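The adapter's Fill call runs its SelectCommand with the default CommandTimeout of 30 seconds, which is what expires on a large table. A hedged sketch of the same code with a longer command timeout (selecting only the columns and rows actually needed would help even more than raising the limit):

using System.Data;
using System.Data.SqlClient;

SqlConnection con = new SqlConnection("Data Source=OIT;Initial Catalog=big_db;User ID=sa; Password=");
SqlDataAdapter da = new SqlDataAdapter("select * from myDB", con);
da.SelectCommand.CommandTimeout = 300;   // default is 30 seconds; 0 means no limit
DataTable dt = new DataTable();
da.Fill(dt);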
Hi, I have a website where the user uploads DBF files, and then I call a sproc to scrub the data and move it from the staging database to the production database. If the folder size is small, maybe less than 6 MB, it transfers the data from staging to production. But if the folder is bigger than about 6 MB, I get a timeout exception, even though I know the data has been put into a few tables in the production database.
For example, if the size of the file is 33 MB, it takes around 15 minutes or so for the sproc to process. I have tried setting the connect and command timeouts, which has not helped. I tried increasing the server script timeout and it fails too.
This is the error in my Event Viewer:
Error: System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
No Changes made to the ClientPlan table.
Inserting records(s) in ClientPlan table.
No Changes made to ClientPlan table.
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
at Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteNonQuery(SqlConnection connection, CommandType commandType, String commandText, SqlParameter[] commandParameters)
at Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteNonQuery(String connectionString, CommandType commandType, String commandText, SqlParameter[] commandParameters)
at Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteNonQuery(String connectionString, String spName, Object[] parameterValues)
at icc.BaseClasses.DL.DLImportPlan.Import(Int32 ClientId, DateTime StartDate, DateTime EndDate, Int32 UserId) in C:Documents and SettingskroslundMy DocumentskareniccPlanStatementsiccImportDLDLImportPlan.cs:line 120
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
I am getting a SQL timeout exception after a long run of 15-20 hours (please find the attachment for more details). My SQL queries are not taking much time to execute; they are simple UPDATE/INSERT statements against a local database.
I observed Activity Monitor as well, and it looks fine. Another application connected to the same database (INSERT statements) works fine. In the connection string I have set ConnectTimeout = 2147483647 (the maximum).
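Note that ConnectTimeout in the connection string only governs how long SqlClient waits while opening the connection; the "Timeout expired" error on an UPDATE or INSERT during execution is governed by SqlCommand.CommandTimeout, which defaults to 30 seconds and is set per command. A small sketch with a placeholder statement and connection string:

using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(
    "Data Source=.;Initial Catalog=LocalDb;Integrated Security=SSPI"))   // placeholder
using (SqlCommand cmd = new SqlCommand(
    "UPDATE dbo.SomeTable SET Col1 = @v WHERE Id = @id", conn))          // illustrative statement
{
    cmd.CommandTimeout = 120;   // this, not ConnectTimeout, is what expires during execution
    cmd.Parameters.AddWithValue("@v", 1);
    cmd.Parameters.AddWithValue("@id", 42);
    conn.Open();
    cmd.ExecuteNonQuery();
}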
I have a program in C# that I used to run on Visual Studio 2000 and SQL Server 2000 without any problem.
Now I have upgraded to VS 2005 and SQL Server 2005 and get an exception on an
insert via SqlCommand.ExecuteNonQuery(). The insert comes after many inserts to the same table by that program. The CommandTimeout is set to 600 (in the 2000 version I never had to change the default value of 30 seconds).
The insert goes through a connection that is kept open throughout the program for more than 30 minutes.
the exception message is:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
the trace is:
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async) at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe) at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
The call produces the exception "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." on the ExecuteNonQuery() line.
It's not a problem with the CommandTimeout or the ConnectionTimeout (from within the connection string), they are both set to 300 (and even when set to 0, the problem is the same).
It's very strange, because only two stored procedures don't work; the others work without any problems, and in SQL Server (using EXEC stored_procedure) all of them work fine. It seems like the stored procedure is executed, but no response is returned to the client (the WinForms app), hence the timeout exception.
We have been bothered by this problem for a while. Usually it happens in the early morning. Later on, after the error has cleared for one user, it never happens again for any user that day. Is this a web server problem or an aspx.vb coding error? Thanks, Jeffrey

Server Error in '/SSSSS' Application.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.Source Error:
Line 270:     oDBCommand.Connection.Open()
Line 271:
Line 272:     Dim myReader As SqlDataReader _
Line 273:         = oDBCommand.ExecuteReader(CommandBehavior.CloseConnection)
Line 274:

Source File: E:SSSSSScheduling.aspx.vb    Line: 272

Stack Trace:
[SqlException (0x80131904): Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.] System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +862234 System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +739110 System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +188 System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +1956 System.Data.SqlClient.SqlDataReader.ConsumeMetaData() +31 System.Data.SqlClient.SqlDataReader.get_MetaData() +62 System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString) +297 System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async) +903 System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result) +132 System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method) +32 System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method) +122 System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior) +62 Soma.Scheduling.GetNextAutoAppointment() in E:SSSSSAScheduling.aspx.vb:272 Soma.Scheduling.Page_Load(Object sender, EventArgs e) in E:SSSSSScheduling.aspx.vb:61 System.Web.UI.Control.OnLoad(EventArgs e) +99 System.Web.UI.Control.LoadRecursive() +47 System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1061
Hello all - I need to check whether the database is running before I begin all sorts of processing, and figured I would do something like this. Any better ways to do this? I want to check that the db is up and running and also make sure I can connect to it.

Private Function DatabaseRunning() As Boolean
    'Checks to see if the database is up and running.
    Dim objDataConnection As SqlConnection = New SqlConnection(ConnectionString)
    Try
        objDataConnection.Open()
    Catch ex As Exception
        Return False
    End Try
    objDataConnection.Close()
    Return True
End Function
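A small variation on the same idea (shown in C#; the VB shape is identical): use a short Connect Timeout so the "server is down" case fails fast instead of waiting out the default 15 seconds, and a using block so the connection is disposed even when Open() throws. This is a sketch; only the probe timeout and disposal differ from the function above.

using System.Data.SqlClient;

static class DbCheck
{
    public static bool DatabaseRunning(string connectionString)
    {
        // Shorten the timeout just for this probe.
        SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder(connectionString);
        builder.ConnectTimeout = 3;

        try
        {
            using (SqlConnection conn = new SqlConnection(builder.ConnectionString))
            {
                conn.Open();
                return true;
            }
        }
        catch (SqlException)
        {
            return false;
        }
    }
}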
We have six SQL Server 2012 failover cluster environments on Windows 2012 R2 Standard Edition. We have intermittent connectivity issues when using Windows Authentication, with the error "test connection failed because of an error in initializing provider. login timeout expired". I am checking by using a UDL file and have tried the following:
1) Made port changes to use the static port 1433.
2) Enabled Shared Memory, and am using Named Pipes and TCP/IP via cliconfg.
3) Turned off the firewall.
4) Loopback is disabled.
5) SQL Browser is running; I changed its 'Built-in Account' setting from 'Local Service' to 'Network Service', but with no effect.
Still I am getting intermittent connectivity issues.
My requirement is to sling a rowset from one place in SQL server into a table in another place in the most performant way. I want this to be parameterizable - I want to provide just a connection string and some SQL for the source and a connection string and a table name for the destination. The package should do the rest.
The solution I chose was a 2014 SSIS package with the source and destination as ADO.NET connections configured from project variables. The package has a script task to bulk copy the data. For performance, I disable the non-clustered indexes first.
But this performance precaution causes the bulk copy to time out after delivering the correct row count to the destination table. What can I do to avoid this error?
Here's my script code:
//get hold of the source and a data reader from it
SqlConnection sqlconnSource = new SqlConnection();
sqlconnSource = (SqlConnection)(Dts.Connections["source"].AcquireConnection(Dts.Transaction) as SqlConnection);
SqlCommand sourcesqlCommand = new SqlCommand(SourceSQL, sqlconnSource);
sourcesqlCommand.CommandTimeout = 1500;
[Code] ....
This takes 128 seconds to put 13 million thin rows into my empty destination table and then throws an exception with this message:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
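If the script task copies the rows with SqlBulkCopy (which the snippet suggests), one thing worth checking is SqlBulkCopy.BulkCopyTimeout: it defaults to 30 seconds and is completely separate from the CommandTimeout = 1500 set on the source command, so a long copy or its final commit can trip it even though the rows have been sent. A sketch of the destination side, continuing the snippet above; the connection name "destination" and the variable "User::DestTableName" are assumptions about the package:

SqlConnection sqlconnDest =
    (SqlConnection)Dts.Connections["destination"].AcquireConnection(Dts.Transaction);

using (SqlDataReader reader = sourcesqlCommand.ExecuteReader())
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlconnDest))
{
    bulkCopy.DestinationTableName = Dts.Variables["User::DestTableName"].Value.ToString();
    bulkCopy.BulkCopyTimeout = 0;    // 0 = no time limit; the default 30 seconds is a likely culprit
    bulkCopy.BatchSize = 10000;      // commit in chunks instead of one huge final batch
    bulkCopy.WriteToServer(reader);
}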
I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT stylesheet.
The file is about 2 MB on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 MB. That is where my problem is.
I run this on my workstation with 2 GB of memory, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory" exception.
I also ran this package on a server with 8 GB of RAM, and the same applies.
Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped, but nothing.
I have a pretty complex query that aggregates lots of data and inserts multiple rows of that data into a reporting table. When I call this SPROC from SQL Server Management Studio, it executes in under 3 seconds. When I try to execute the same SPROC using .NET's SqlCommand object, the query runs indefinitely until the CommandTimeout is reached. Why would this SPROC behave differently with the same inputs when called from .NET? Thanks for your help!
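A commonly cited cause of exactly this symptom (fast in Management Studio, timing out from SqlClient with the same inputs) is that the two sessions run with different SET options: SSMS turns ARITHABORT ON by default while SqlClient leaves it OFF, so each gets its own cached plan and the .NET side can end up with a badly sniffed one. Below is a quick experiment, not a fix, with placeholder procedure name, parameter, and connection string: if matching SSMS's setting makes the call fast, the issue is plan/parameter sniffing rather than the data access code.

using System;
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(
    "Data Source=.;Initial Catalog=Reporting;Integrated Security=SSPI"))   // placeholder
{
    conn.Open();

    // Match the SSMS session default so the call sees the same kind of cached plan.
    using (SqlCommand setCmd = new SqlCommand("SET ARITHABORT ON;", conn))
        setCmd.ExecuteNonQuery();

    using (SqlCommand cmd = new SqlCommand("dbo.BuildReportRows", conn))   // hypothetical proc name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandTimeout = 300;
        cmd.Parameters.AddWithValue("@ReportDate", DateTime.Today);        // illustrative parameter
        cmd.ExecuteNonQuery();
    }
}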
Hi everyone.... I'm trying to execute this update statement... It takes an eternity... any ideas on how to rewrite or speed it up?
It's a several-step process; below is everything that I run, one step at a time. The final update statement is what takes so long. It should only affect about 2,600 rows out of a potential 9,000, which is why I'm confused about the response time.
select d.olddevicename, de.device, d.newdevicename into #temp9 from dns d, devices de where de.device = d.olddevicename
update #temp9 set device = newdevicename where olddevicename = device
update devices set device = #temp9.device from #temp9, devices where #temp9.device in (select #temp9.device from #temp9, devices where #temp9.olddevicename = devices.device)
I have three scheduled jobs: one runs once a day, one runs once per hour, and another runs every 17 minutes. It is a NetIQ application. I scheduled a SQL Server maintenance job last night which ran at 2:00 AM and 4:00 AM. This morning, I came into the office and found all my jobs were still running, and they were all blocked by the first three jobs. I had to kill all of them. This afternoon, I kicked off one of my many DTS packages, which usually runs for about 40 minutes, but it failed. I tried several times with no luck. I suspected that a user table or one of the stored procedures was corrupted. After I recycled the server, dropped the table and the stored procedure, and recreated them, the package went fine. The stored procedure involves many updates and inserts.
The question I have is: is it possible that this problem was caused by killing the unfinished jobs (especially the SQL maintenance job)?
NOTE: the SQL maintenance job does not include the backup of the database and transaction log.
My backups are running 5-6 hours on SQL 2000. I'm sure they used to take only an hour or so. On another server, backing up the same database (both about 50 GB), the backup only takes 45 minutes to an hour. What can I look at to see why it's taking so long?
I'm trying to come up with a way to monitor (without Profiler, hopefully with a job and a select statement) a specific SQL Agent job that may cause a problem if its duration is too long. It seems that there is an SP called sp_sqlagent_log_jobhistory that shoves a record into sysjobhistory, but only after all the job steps run. Has anyone tried this before?
Hello gurus, I am using SQL 2005 and one job's status is "executing" in the job monitor. How can I check how long this job has been running? Please advise.