Remote Database Updates Take Long Time To Complete
Jul 20, 2005
Hi
I have two linked SQL Servers and I am trying to get remote writes
working correctly (and fast).
I have configured the DB link on both machines to point at each other's DB.
I have security set up to map each other's server logins, and the Server Options
Collation Compatible, Data Access, RPC, RPC Out, and Use Remote Collation are all checked.
My problem is that when a SP performs
Begin Transaction
Update Local Table
Update Remote Table
Commit Tran
It takes several seconds to complete (about 7 seconds, which is not acceptable to us).
This is due to the remote update - how can I improve the response time?
declare @intError int
set @intError = @@error          -- capture immediately; @@ERROR is reset by every statement

if (@intError <> 0)
begin
    rollback tran mytran
    set @strResult = 'Failure'
    set @strErrorDesc = 'Fail @ Step : ' + @strStep + ' Error : ' + cast(@intError as varchar(10))
    return -1969
end
else
begin
    set @strResult = 'Success'
    set @strErrorDesc = ''
end
-- commit tran if we started it
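For context, a minimal sketch of the pattern being described, using hypothetical object names (dbo.LocalTable, RemoteServer.RemoteDB.dbo.RemoteTable, @OrderID). The four-part-name update inside the transaction is what escalates it to a distributed (MSDTC) transaction; one commonly suggested mitigation is to push the remote write into a stored procedure that lives on the remote server.

Code Snippet

begin transaction mytran

    -- local write
    update dbo.LocalTable
    set    Status = 'Sent'
    where  OrderID = @OrderID

    -- remote write via the linked server (four-part name); this statement is
    -- processed as a distributed query and is usually the slow part
    update RemoteServer.RemoteDB.dbo.RemoteTable
    set    Status = 'Received'
    where  OrderID = @OrderID

commit transaction mytran

-- Commonly suggested alternative (names are hypothetical): let the remote box run
-- its own UPDATE inside a procedure called via RPC, which avoids distributed query
-- processing, although MSDTC is still involved while the local transaction is open.
exec RemoteServer.RemoteDB.dbo.usp_SetRemoteStatus @OrderID = @OrderID, @Status = 'Received'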
Hi There,

I have an update statement to update a field of a table (~15,000,000 records). It took around 3 hours to finish 2 weeks ago. After that no one touched the server and no configuration changed. Yesterday I re-ran it and it took more than 18 hours and still had not finished! What's wrong with it? I could run it successfully before. I have tried two times but the result was still the same.

My SQL statement is:

update [all_sales] a
set a.accounting_month = b.accounting_month
from date_map b
where a.sales_date >= b.start_date and a.sales_date < b.end_date;

An index on [all_sales].sales_date was built successfully. A composite index on ([date_map].start_date, [date_map].end_date) was built successfully.

My server config is:
SQL Server 2000 with Service Pack 3
Windows 2000 with Service Pack 4
DELL PowerEdge 6650 Server
DUAL XEON 1900MHz Processors
2G RAM
2G Page File on Drive C
2G Page File on Drive D
DELL Diagnostics on all SCSI hard disks all PASSED.

Could any expert simply give me some help? Thanks x 1,000,000,000
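For a 15-million-row single-statement update like this, one widely used technique (offered here only as a sketch, not as the poster's own solution) is to update in batches so each transaction stays small; on SQL Server 2000, SET ROWCOUNT is the usual way to cap the batch size. The sketch assumes accounting_month starts out NULL, so it can double as the progress marker.

Code Snippet

set rowcount 100000                  -- cap each pass at 100,000 rows (SQL 2000 style)

declare @rows int
set @rows = 1

while @rows > 0
begin
    update a
    set    a.accounting_month = b.accounting_month
    from   all_sales a, date_map b
    where  a.sales_date >= b.start_date
      and  a.sales_date <  b.end_date
      and  a.accounting_month is null    -- assumption: unprocessed rows are NULL

    set @rows = @@rowcount
end

set rowcount 0                       -- always reset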
I made a website in ASP.NET using SQL Server 2005 as the database. There is some data processing that needs a long time (about 20 minutes) and involves big data. It works fine in the dev box, but when I place it on shared hosting and some people access it, it crashes and the website can no longer be accessed. Hosting support told me maybe I need to reprogram my code. Does anybody have a solution for this problem? Should I create a new thread?
I had a database of electronic resources which had 28,000 records earlier and was working fine. Now we have added a whole bunch to make it 800K records, which has increased the search time to 14-22 seconds; this is not acceptable. I have all the tables indexed.
Please help me with how to solve this problem. Let me know what other information I should put up here to make my problem understandable. Thanks in advance, Archana
Just curious if there's a reason why, every time I try to apply the Microsoft SQL Server 2005 Express Edition Service Pack 2 from Windows Update, it sits at "Installing" and appears to do nothing (no hard drive activity, but the MSI and setup processes are chewing a little bit of CPU time). How long should I give this update to complete? I am running a 2-CPU dual-core Xeon 3.0GHz w/ 2GB of RAM setup, so I would have thought it would complete very quickly. :(
Edit:
Disregard; I had to wait for a full 15 minutes! :O
I'm trying to rebuild my master database for SQL Server 2000. The rebuild process started fine, but it has now been almost 4 hours since it began. I am performing it on a test system. Getting doubtful, I started the same on another test system; the issue is the same and it has been almost 2 hours there. The DB size is less than 100 MB in both cases. IS IT NORMAL? I tried the same for SQL Server 2005 and it finished in a couple of minutes. Please advise.
Try this script to see what queries are taking over a second. To get some real output, you need a long-running query. Here's one (estimated to take over an hour):

PRINT GETDATE()
select count_big(*)
from sys.objects s1, sys.objects s2, sys.objects s3,
     sys.objects s4, sys.objects s5
PRINT GETDATE()

Output is:

session_id  elapsed  task_alloc  task_dealloc  runningSqlText                                                                                        FullSqlText  query_plan
51          32847    0           0             select count_big(*) from sys.objects s1, sys.objects s2, sys.objects s3, sys.objects s4, sys.objects s5   SQL      Plan

Clicking on SQL opens the full SQL batch as a .txt file, including the PRINT statements. Clicking on Plan allows you to see the .sqlplan file in MSSMS.

========

Title: Using a VB Script to show long-running queries, complete with query plans.

Today (July 14th), I found a query running for hours on a development box. Rather than kill it, I decided to use this opportunity to develop a script to show long-running queries, so I could see what was going on. (Reference Roy Carlson's article for the idea.)

This script generates a web page which shows long-running queries with the currently-executing SQL command, full SQL text, and .sqlplan files. The full SQL query text and the sqlplan file are output to files in your temp directory. If you have SQL Management Studio installed on the local computer, you should be able to open the .sqlplan to see the query plan of the whole batch for any statement.

'LongestRunningQueries.vbs
'By Aaron W. West, 7/14/2006
'Idea from:
'http://www.sqlservercentral.com/columnists/rcarlson/scriptedserversnapshot.asp
'Reference: Troubleshooting Performance Problems in SQL Server 2005
'http://www.microsoft.com/technet/prodtechnol/sql/2005/tsprfprb.mspx

Sub Main()
    Const MinimumMilliseconds = 1000

    Dim srvname
    If WScript.Arguments.count > 0 Then
        srvname = WScript.Arguments(0)
    Else
        srvname = InputBox("Enter the server Name", "Server", ".", VbOk)
        If srvname = "" Then
            MsgBox("Cancelled")
            Exit Sub
        End If
    End If

    Const adOpenStatic = 3
    Const adLockOptimistic = 3
    Dim i
    ' making the connection to your sql server
    ' change yourservername to match your server
    Set conn = CreateObject("ADODB.Connection")
    Set rs = CreateObject("ADODB.Recordset")
    ' this is using the trusted connection; if you use sql logins
    ' add username and password, but I would then encrypt this
    ' using Windows Script Encoder
    conn.Open "Provider=SQLOLEDB;Data Source=" & _
        srvname & ";Trusted_Connection=Yes;Initial Catalog=Master;"

    ' The query goes here
    sql = "select " & vbCrLf & _
        " t1.session_id, " & vbCrLf & _
        " t2.total_elapsed_time AS elapsed, " & vbCrLf & _
        " -- t1.request_id, " & vbCrLf & _
        " t1.task_alloc, " & vbCrLf & _
        " t1.task_dealloc, " & vbCrLf & _
        " -- t2.sql_handle, " & vbCrLf & _
        " -- t2.statement_start_offset, " & vbCrLf & _
        " -- t2.statement_end_offset, " & vbCrLf & _
        " -- t2.plan_handle," & vbCrLf & _
        " substring(sql.text, statement_start_offset/2, " & vbCrLf & _
        " CASE WHEN statement_end_offset < 1 THEN 8000 " & vbCrLf & _
        " ELSE (statement_end_offset-statement_start_offset)/2 " & vbCrLf & _
        " END) AS runningSqlText," & vbCrLf & _
        " sql.text as FullSqlText," & vbCrLf & _
        " p.query_plan " & vbCrLf & _
        "from (Select session_id, " & vbCrLf & _
        " request_id, " & vbCrLf & _
        " sum(internal_objects_alloc_page_count) as task_alloc, " & vbCrLf & _
        " sum(internal_objects_dealloc_page_count) as task_dealloc " & vbCrLf & _
        " from sys.dm_db_task_space_usage " & vbCrLf & _
        " group by session_id, request_id) as t1, " & vbCrLf & _
        " sys.dm_exec_requests as t2 " & vbCrLf & _
        "cross apply sys.dm_exec_sql_text(t2.sql_handle) AS sql " & vbCrLf & _
        "cross apply sys.dm_exec_query_plan(t2.plan_handle) AS p " & vbCrLf & _
        "where t1.session_id = t2.session_id and " & vbCrLf & _
        " (t1.request_id = t2.request_id) " & vbCrLf & _
        " AND total_elapsed_time > " & MinimumMilliseconds & vbCrLf & _
        "order by t1.task_alloc DESC"

    rs.Open sql, conn, adOpenStatic, adLockOptimistic
    'rs.MoveFirst

    pg = "<html><head><title>Top consuming queries</title></head>" & vbCrLf
    pg = pg & "<table border=1>" & vbCrLf
    If Not rs.EOF Then
        pg = pg & "<tr>"
        For Each col In rs.Fields
            pg = pg & "<th>" & col.Name & "</th>"
            c = c + 1
        Next
        pg = pg & "</tr>"
    Else
        pg = pg & "Query returned no results"
    End If
    cols = c

    dim filename
    dim WshShell
    set WshShell = WScript.CreateObject("WScript.Shell")
    Set WshSysEnv = WshShell.Environment("PROCESS")
    temp = WshShell.ExpandEnvironmentStrings(WshSysEnv("TEMP")) & "\"
    filename = temp & filename

    Dim fso, f
    Set fso = CreateObject("Scripting.FileSystemObject")

    i = 0
    Dim c
    Do Until rs.EOF
        i = i + 1
        pg = pg & "<tr>"
        For c = 0 to cols-3
            pg = pg & "<td>" & RTrim(rs(c)) & "</td>"
        Next
        'Output FullSQL and Plan Text to files, provide links to them
        filename = "topplan-sql" & i & ".txt"
        Set f = fso.CreateTextFile(temp & filename, True, True)
        f.Write rs(cols-2)
        f.Close
        pg = pg & "<td><a href=""" & filename & """>SQL</a>"
        filename = "topplan" & i & ".sqlplan"
        Set f = fso.CreateTextFile(temp & filename, True, True)
        f.Write rs(cols-1)
        f.Close
        pg = pg & "<td><a href=""" & filename & """>Plan</a>"
        'We could open them immediately, eg:
        'WshShell.run temp & filename
        rs.MoveNext
        pg = pg & "</tr>"
    Loop
    pg = pg & "</table>"

    filename = temp & "topplans.htm"
    Set f = fso.CreateTextFile(filename, True, True)
    f.Write pg
    f.Close

    Dim oIE
    SET oIE = CreateObject("InternetExplorer.Application")
    oIE.Visible = True
    oIE.Navigate(filename)
    'Alternate method:
    'WshShell.run filename

    ' cleaning up
    rs.Close
    conn.Close
    Set WshShell = Nothing
    Set oIE = Nothing
    Set f = Nothing
End Sub

Main
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQL CE 3.0. My PPC hardware runs at 400MHz.
The question is: every time after the program starts, when it tries to insert the first record into the .sdf database, it takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts, do all the Insert, Update and Delete operations in this dataset, and write it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn) 'SQL = Select * From Table
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08s to write one record back to the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I go through these four steps, the write time is almost 1s or even more!
Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to write back a dataset with only one inserted record while the program is running. (Even when all inserted records are exactly the same in data, just different in the integer key.)
However, when I give up the dataset and using the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn) ' I have built the insert SQL beforehand (Insert Into Table values(... all fields))
I found that it is still the same that the first inserted record takes more time, but just about 0.2s. And the normal insert time is around 0.02s. It is 4 times faster!!!
I am running an update query. It is taking a long time. To find the estimated completion time I checked sys.dm_exec_requests, sys.dm_exec_sessions, and sp_who2, but there is no clue; it is showing as zero.
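For what it's worth, sys.dm_exec_requests does have percent_complete and estimated_completion_time columns, but they are only populated for a handful of operations (backup/restore, DBCC checks, rollbacks and the like), which is why an ordinary UPDATE shows zero. A minimal check, with the session id as a placeholder:

Code Snippet

-- Replace 53 with the spid of the session running the update.
select session_id,
       command,
       status,
       percent_complete,             -- remains 0 for ordinary DML
       estimated_completion_time,    -- milliseconds; also 0 for ordinary DML
       total_elapsed_time,
       wait_type
from   sys.dm_exec_requests
where  session_id = 53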
As in the title: is there any tool for this? I'm asking because I have some big databases, and processing may take a lot of time (at least a few hours), so I'd be glad if it were possible to know the estimated time before a query runs. I'm using MS SQL 2005 Developer Edition.
I have to create a task, and in that task I have to check the execution time of a package and, if it is more than a specified time, send a mail with a custom message to some specific users.
or
Can I write an event which occurs after a specific time and sends a mail?
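One possible approach, sketched here with a hypothetical job name, mail profile and recipient, is to read the job's last run duration from msdb and send a Database Mail message when it exceeds the threshold (run_duration is stored as an HHMMSS integer, hence the arithmetic):

Code Snippet

-- Sketch only: 'MyPackageJob', 'DBA Mail' and the recipient are placeholders.
declare @seconds int

select top 1
       @seconds = (h.run_duration / 10000) * 3600
                + ((h.run_duration / 100) % 100) * 60
                + (h.run_duration % 100)
from msdb.dbo.sysjobhistory h
join msdb.dbo.sysjobs j on j.job_id = h.job_id
where j.name = 'MyPackageJob'
  and h.step_id = 0                       -- job outcome row
order by h.instance_id desc

if @seconds > 600                         -- threshold: 10 minutes
begin
    exec msdb.dbo.sp_send_dbmail
         @profile_name = 'DBA Mail',
         @recipients   = 'team@example.com',
         @subject      = 'Package exceeded expected run time',
         @body         = 'The SSIS package job ran longer than the allowed threshold.'
end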
I have been successful with DTS packages and various SQL statements. However, I have a new challenge. I have a table in a SQL Server database. One of the columns is employee number, and there is a column for department number (which is not populated). In a remote AS400 file I have the employee number and department number. I want to create a package to connect to the remote table and update the SQL Server table with the department number where the two tables match on the employee number.
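Assuming a linked server to the AS/400 has been (or can be) defined, the update itself can be written as a joined UPDATE; every name below (the AS400 linked server, PRODLIB.EMPMAST, EMPNO, DEPTNO, dbo.Employee) is a placeholder, not taken from the original post:

Code Snippet

update e
set    e.department_number = r.DEPTNO
from   dbo.Employee e
join   openquery(AS400, 'select EMPNO, DEPTNO from PRODLIB.EMPMAST') r
       on e.employee_number = r.EMPNO
where  e.department_number is null      -- only fill in the unpopulated column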
I have an MS Time Series model using a database of over a thousand products, each of which has hundreds of cases. It amazingly takes only a few minutes to finish processing the model, but when I click Mining Model Viewer to view the models, it takes many hours to show up. Once the window is open, I can choose the model for different products almost instantly. Is this normal?
I am looking at building multiple SSIS packages. There will be some similarities, and flexibility is of the highest importance. The main packages will need to connect to SQL Server1 as a source and SQL Server2 as a destination to transfer over dimension data from multiple databases. (Other SSIS packages may need to use SQL Server2 as a source and SQL Server1 as a destination.)
For a single dimension table containing the column dim_id on the target server (SQLServer2), I need to pass the results of the following SQL and insert them into SQLServer2.database.dim_table:
select dim.id from SQLServer1.database08.dim_table
union
select dim.id from SQLServer1.database07.dim_table
union
select dim.id from SQLServer1.database06.dim_table
Now next year the names of the databases on SQLServer1 will be database09,database08,database07!
So far my best thought is creating views in my destination SQL Server, so I need some way of dropping and recreating the views. Previously in DTS I would expect to see a SQL Server connection that I could use as both source and destination; now I can see a SQL Server destination but not a source? Also, how do I just use SSIS to run some SQL, i.e. execute a stored procedure, or drop and create views?
Many thanks, Ells. P.S. Flexibility is the key; in the last three months all the IP and server names have changed more than once, so I need to be as flexible as possible.
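As a rough sketch of the view idea (the view name and dbo schema are assumptions; the database names are the poster's), the destination server could drop and recreate the view each year, and in SSIS this kind of ad hoc SQL or stored procedure call is normally run from an Execute SQL Task:

Code Snippet

-- Runs on SQLServer2; SQLServer1 is assumed to be a linked server.
if object_id('dbo.vw_dim_id_all_years', 'V') is not null
    drop view dbo.vw_dim_id_all_years
go

create view dbo.vw_dim_id_all_years
as
    select dim_id from SQLServer1.database08.dbo.dim_table
    union
    select dim_id from SQLServer1.database07.dbo.dim_table
    union
    select dim_id from SQLServer1.database06.dbo.dim_table
go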
We are running a SQL Server 2008 R2 64-bit database system on a Windows 2012 R2 64-bit Standard system. I have noticed in recent weeks that our differential backups periodically take longer than expected to complete. The usual amount of time is about one hour, but on several occasions it has taken upwards of five hours. The nights when the job takes longer to complete are Fridays.
I did some checking online, and one possible reason for this issue is my scheduling the reindexing of the database on the morning of the differential backup. For example, this past Friday the reindexing occurred at 1:00 AM with the differential running at 10:00 PM that night. The article that I read suggested the reindexing, which takes several minutes at most to complete, should be scheduled to run just before the full backup job.
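To confirm whether the Friday differentials are actually larger after the reindexing (rather than just slower to write), the backup history recorded in msdb can be compared; the database name below is a placeholder:

Code Snippet

-- type = 'I' is a differential backup; 'D' would be a full backup.
select database_name,
       backup_start_date,
       backup_finish_date,
       datediff(minute, backup_start_date, backup_finish_date) as duration_minutes,
       backup_size / 1048576.0 as backup_size_mb
from   msdb.dbo.backupset
where  database_name = 'MyDatabase'     -- placeholder
  and  type = 'I'
order by backup_start_date desc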
If my application updates a SQL Express database at the same time as my service is performing synchronisation of the same database via RMO, intermittently some of the rows updated will not be included the next time synchronisation is performed.
Does anyone know what's happening and what I can do about it?
I have this scenario which is working fine, but would like to know if others have tried it or can recommend a better approach. Below is a brief description, but code should fully explain:
A process updates a SQL Server table via a stored proc, which in turn writes information to a service broker target queue. In my client/server architecture, I would like clients to see this new information as soon as it gets written to the target queue. To do this, I have a WCF service that calls a stored procedure; the stored proc contains an infinite loop which has a RECEIVE statement with an infinite timeout. Once RECEIVE reads data from the SB queue, data is retrieved via a SqlDataReader and results are sent to clients via a pub/sub. I then wait for more rows until RECEIVE unblocks and so on.
Stored Proc
Code Snippet
-- ...
WHILE 1 = 1
BEGIN
    -- Param declarations
    ...

    BEGIN TRANSACTION;
    WAITFOR (
        -- Blocks while TargetQueue is empty
        RECEIVE TOP(1)
            @conversation_handle = conversation_handle,
            @message_type_name = message_type_name,
            @conversation_group_id = conversation_group_id,
            @message_body = CASE WHEN validation = 'X'
                                 THEN CAST(message_body AS XML)
                                 ELSE CAST(N'' AS XML)
                            END
        FROM [dbo].[TargetQueue]
        -- No time out!
    )

    -- Handle errors ...

    -- Return received information. After this statement is executed,
    -- reader.NextResult unblocks and reader.Read() can read these new values
    SELECT
        'Conversation Group Id' = @conversation_group_id,
        'Conversation Handle'   = @conversation_handle,
        'Message Type Name'     = @message_type_name,
        'Message Body'          = @message_body;

    COMMIT TRANSACTION
END -- WHILE
C# Code
Code Snippet
// Create a SqlCommand, and initialize to execute above stored proc
// (set command timeout to zero!)
...

// ExecuteReader blocks until RECEIVE reads data from the target queue
using (SqlDataReader reader = cmd.ExecuteReader())
{
    // Begin an infinite loop
    while (true)
    {
        // Process data retrieved by RECEIVE
        while (reader.Read())
        {
            Trace.WriteLine("Conversation Group Id: " + reader.GetGuid(0) +
                            " Conversation Handle: " + reader.GetGuid(1) +
                            " Message Type Name: " + reader.GetString(2) +
                            " Message Body: " + reader.GetString(3));

            // Send data via pub/sub ...
        }

        // Blocks until the stored procedure returns another select statement,
        // i.e., blocks until RECEIVE unblocks and retrieves more data from the queue
        reader.NextResult();
    } // while
}
I have a stored procedure that is called from a VB.NET application and that takes an enormously long time to execute. In QA it only takes 10 seconds, but from the application it takes ages. The stored procedure is as follows:
PROCEDURE NAME IS SPTOPTWENTYUSERS
SELECT TOP 20 STRUSERNAME, SUM(INTBYTESRECVD) AS INTDOWNLOAD
FROM TBLISAWEBLOGS
WHERE DTELOGDATE BETWEEN @BEGINDATE AND @ENDDATE
GROUP BY STRUSERNAME
ORDER BY INTDOWNLOAD DESC
The code that runs it is as follows:
sSQLString = SPTOPTWENTYUSERS
Using cnn As New SqlConnection(GetPath)
    Try
        Dim cmd As New SqlCommand(sSQLString, cnn)
        Dim dr As SqlDataReader
        With cmd
            .CommandType = CommandType.StoredProcedure
            .CommandTimeout = 0
            .Parameters.Add("@BEGINDATE", SqlDbType.DateTime)
            .Parameters.Add("@ENDDATE", SqlDbType.DateTime)
            .Parameters("@BEGINDATE").Value = dtpStartDate.Value
            .Parameters("@ENDDATE").Value = dtpEndDate.Value
        End With
        cnn.Open()
        dr = cmd.ExecuteReader
Any help on why this happens would be much appreciated.
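One possibility that is often suggested for the "fast in Query Analyzer, slow from the application" pattern is a cached plan compiled for unrepresentative date parameters (parameter sniffing), or differing connection SET options producing different plans. A quick diagnostic, assuming SQL Server 2005 or later, is to force a fresh plan for the statement inside the procedure:

Code Snippet

-- Diagnostic sketch only; the query itself is unchanged from the procedure above.
SELECT TOP 20 STRUSERNAME, SUM(INTBYTESRECVD) AS INTDOWNLOAD
FROM TBLISAWEBLOGS
WHERE DTELOGDATE BETWEEN @BEGINDATE AND @ENDDATE
GROUP BY STRUSERNAME
ORDER BY INTDOWNLOAD DESC
OPTION (RECOMPILE)   -- compile a plan for the actual parameter values on every run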
I set up DB mirror between a primary (SQL1) and a mirror (SQL2); no witness. I have a problem when I issue command:
alter database DBmirrorTest
Set Partner = N'TCP://SQL2.mycom.com:5022';
go
The error message is:
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
I have the steps below prior to the command. (Note that both servers' service accounts use the same domain account. The domain account I login to do db mirror setup is a member of the local admin group.)
1. backup database DBmirrorTest on SQL1
2. backup database log
3. copy db and log backup files to SQL2
4. restore db with norecovery
5. restore log with norecovery
6. create endpoints on both SQL1 and SQL2
CREATE ENDPOINT [Mirroring]
STATE=STARTED
AS TCP (LISTENER_PORT = 5022, LISTENER_IP = ALL)
FOR DATABASE_MIRRORING (ROLE = PARTNER)
7. enable mirror on mirror server SQL2
:connect SQL2
alter database DBmirrorTest
Set Partner = N'TCP://SQL1.mycom.com:5022';
go
8. Enable mirror on primary server SQL1
:connect SQL1
alter database DBmirrorTest
Set Partner = N'TCP://SQL2.mycom.com:5022';
go
This is where I got the error.
The remote copy of database "DBmirrorTest" has not been rolled forward to a point in time that is encompassed in the local copy of the database log.
Hi There, we have developed an application in VB connected to SQL Server 6.5, with some stored procedures that bring back the data from SQL Server 6.5. This application has been running for some months. When we run this application it usually takes only one minute to generate the report, but for the last couple of days it has been taking 25 minutes. Even when I run the stored procedure directly in Query Analyzer on the server it takes 15-20 minutes to return the result. Can anyone help in identifying the problem? What are all the things I need to check to identify it? Please give me the solution.
Problem: I schedule a job that calls a stored procedure which loads around 1.5 million records. The job takes 19 hours to complete. However, if I run that stored procedure manually in Query Analyser it takes only 45 minutes.
Has anyone faced this problem? Is this a known problem? Any suggestions/recommendations?
I have a CTE query that is used to fill in nulls on a history table. The With statement executes just fine. sub 2 seconds on 974 records, however the main query is what's turning the whole query into a turtle. I know that it's the looping that it's doing there that is causing the slow down, but I'm just not sure how to fix it. I've tried inserting it into a temp table, refactored the code a hundred times, but nothing seems to be working.
Code is below and the execution plan is attached. Server Version: 12.0.2342.0 Enterprise: 64bit
;WITH BuildTable AS (
    SELECT [GEGTH].[ID]
         , [GEGTH].[Changed By]
         , CAST([dbo].[GetWeekStarting]([GEGTH].[Changed Date], 2) AS DATE) AS WeekOf
         , [GEGT].[Title]
When I login using QA to my SQL Server database, it takes 15-20 seconds to establish a connection and open a query window. Drilling into a database via Enterprise Manager is similar. Once the connection is established, the server runs plenty fast however.
Can someone tell me why it could take a long time for a connection to be established? This behavior occurs when I am local on the box.
Thanks,
John
Hi,
I've a strange problem with an INSERT query. It's taking a long time to execute. The format is like this:

INSERT INTO table1
SELECT ..
FROM table2

Executing the SELECT .. FROM table2 on its own takes 30 seconds. The result is nothing: no records are selected. When I include the INSERT part it takes 12 hours to complete. There is an index on the table, and when I delete it, the problem is still there. Keh?
Greetz,
Hennie
I have a query which returns approximately 50000 records. I am using a linked server to connect to two databases and retrieve data. For some reason it is taking a little more than an hour to execute the query; in the MS SQL Server query window it returns after a few minutes, but from the application the query runs for a long time.
How can I expedite my query execution process?
Environment details
Database: MS SQL Server 2005 64-bit
MS SQL jar file: sqljdbc_1.2.jar
OS: Windows (both server and client)
I'm running a query (see below) on my development server and it's taking around 45 seconds. It hosts 18 user databases ranging from 3 MB to 400 MB. The production server, which is very similar but has only one 25 MB user database, runs the query in less than 1 second. Both servers have been running on VMware for almost 1 year with no problems. However, last week I applied SP2 to the development server, and yesterday I applied Critical Update KB934458. The production server is still running SQL Server 2005 Standard SP1. Other than that, both servers are identical and running Windows 2003 Server Standard SP1. I'm not seeing this discrepancy with other queries running against user databases.
use MyDatabase
GO
select db_name(database_id) as 'Database', o.name as 'Table',
s.index_id, index_type_desc, alloc_unit_type_desc, index_level, i.name as 'Index Name',
If I use the following query for a Dataset, the execution takes a few seconds to show results:
SELECT *
FROM dbo.ICParameter
WHERE (PatientID = @ID ) AND (LogTime > DATEADD(day, -1, GETDATE()))
ORDER BY LogTime
If I replace '99010200101' with @ID and enter '99010200101' when prompted for the ID, the execution takes forever. Actually, I have never gotten any results even after waiting for 10 minutes.
It seems inserting records takes a relatively long time. My guess is it needs time to allocate disk space for the extra space needed. Assuming this is true, are there any DB settings that allow space to be allocated in bigger chunks? I am looking for something like a "DB growth factor" or "table growth factor".
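There is no per-table growth factor, but the database files themselves can be pre-sized or given a larger auto-growth increment so that inserts don't repeatedly wait on small auto-grow events; a sketch with placeholder database and logical file names (check the real ones with sp_helpfile):

Code Snippet

-- 'MyDatabase', 'MyDatabase_Data' and 'MyDatabase_Log' are placeholders.
alter database MyDatabase
    modify file (name = 'MyDatabase_Data', size = 10240MB, filegrowth = 512MB)   -- size pre-grows the file; optional

alter database MyDatabase
    modify file (name = 'MyDatabase_Log', filegrowth = 256MB)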
Dear friends, our package normally takes 2-2:30 minutes to fetch data over VPN with some select queries. But sometimes its job runs for hours and hours. What could be the exact reason behind it? I guess its queries get stuck somewhere, but not when we run it from BIDS or run the job manually. Please help. Thanks.