Strange Performance Issue With UPDATE FROM
Jun 26, 2007
Hello!
I have this piece of SQL code:
UPDATE a
SET Field1 = c.Field1
FROM a
INNER JOIN b ON a.GUID1 = b.GUID1
INNER JOIN c ON b.GUID2 = c.GUID2
WHERE c.Type = 1
AND @date BETWEEN b.DateFrom AND b.DateTo
This query takes hours to complete.
Now while trying to find out what's causing the poor performance (it
surely looks simple enough!) I've rewritten it to use temp tables:
SELECT a.GUID1, a.Field1, c.Type, b.DateFrom, b.DateTo INTO #temptable
FROM a
INNER JOIN b ON a.GUID1 = b.GUID1
INNER JOIN c ON b.GUID2 = c.GUID2
WHERE c.Type = 1
AND @date BETWEEN b.DateFrom AND b.DateTo
UPDATE a SET Field1 = subsel.Field1
FROM (SELECT * FROM #temptable) AS subsel
WHERE subsel.GUID1 = a.GUID1
Now it completes in 10 seconds.
My question is why? Am I wrong in saying that the two batches above
produce the same results? Is there something I've missed about the UPDATE
FROM syntax? Why would the first query perform THAT poorly?
Table sizes:
a: 24k rows
b: 268k rows
c: 260k rows
GUIDs are of type uniqueidentifier.
Any answers appreciated!
Regards,
// Richard
View 8 Replies
Apr 23, 2008
Hi All,
Not sure if this question belongs in this area or the server area, but I'll start here. Here is my problem. We have a C/C++ app that was originally written for SQL 2000 and uses DBLibrary. We have converted it to SQL 2005 and are using ODBC/Native Client to access the SQL 2005 database. This all works great.
So we were doing some performance testing and we noticed that our update performance seems slower in the SQL 2005/ODBC case than it did in the SQL 2000/DBLibrary case. Inserts and queries all perform great; in fact the inserts are significantly faster in the SQL 2005/ODBC case, which is good. We are using array inserts/updates/queries wherever possible as this is obviously faster. In our update case, it takes 1.14 seconds to update 2000 rows in a table in the SQL 2005/ODBC case, while the SQL 2000/DBLibrary case takes .39 seconds to do the exact same thing. The table in question is a 12-column table with all integer columns, with an index on the first three columns.
So my question is this. Is there something different about array updates in SQL 2005? I've looked through the list of hot fixes available since SQL 2005 SP2 and haven't seen anything that directly mentions updates, so I'm hesitant to go off and start applying the hot fixes to our server to see if the behavior changes. It seems strange to me that array inserts would be very fast but updates wouldn't be. I've checked the ODBC Data Source and we aren't doing anything fancy there. I'm not actually even sure if this problem is client side or server side, as I said earlier.
If anyone has any ideas or thoughts, that would be great, since this is really bugging me.
Here is a sample of what our C code is doing. This is simplified and I've removed a lot of our own code, but these are the SQL calls we are making, in order, so maybe someone can see something wrong with what I'm doing.
//----Sample Code -------------------------------------------------------------------------------------
pSQL = "Update bob set VV=? where VI=?" // not done this way actually in our code but just to show the update text
status = SQLPrepare ( hS, (UCHAR *)pSQL, SQL_NTS );
// Called twice with nCol = 1 and then with nCol = 2
status = SQLSetParam (
hS,
abs(nCol),
cType, // cType = 5 = SQL_C_LONG
sqlType, // sqlType = 4 = SQL_INTEGER
38, // size needed in case the column is numeric or decimal
0,
p16Data, // Pointer to my array of data
NULL
);
SQLSetStmtAttr(hS, SQL_ATTR_PARAMSET_SIZE, (SQLPOINTER)numrows, 0); // numrows = 2000
SQLSetStmtAttr(hS, SQL_ATTR_PARAMS_PROCESSED_PTR, &p16cRows, 0);
status = SQLExecute (hS);
SQLParamOptions(hS, 1, NULL);
SQLFreeStmt ( hS, SQL_RESET_PARAMS );
//----Sample Code -------------------------------------------------------------------------------------
Thanks,
Nick
View 5 Replies
Jul 20, 2005
I have a strange performance question hopefully someone can clarify for me. I take a production database and make a copy of it, called test, on the same instance on the same server, both running at the same time. All the same things are running for each database and no one is using each database but me. From Query Analyzer I run a SQL statement against production three times in a row and it takes 1 minute 40 seconds on the last/best run. I then run the same SQL against the test copy three times in a row and the last/best time is 12 seconds. Can anyone explain this behavior? If so, I hope this points to something I can do to my production database to get the same performance.
Thanks,
MGB
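What is often suggested when a freshly restored copy outruns the original on the same hardware is index fragmentation or stale statistics in the original database; a hedged sketch of the SQL 2000-era checks usually tried first (the table name is a placeholder, not taken from the question):
-- Check fragmentation of a table's indexes (hypothetical table name)
DBCC SHOWCONTIG ('dbo.MyLargeTable')
-- Rebuild the table's indexes, which also refreshes its statistics with a full scan
DBCC DBREINDEX ('dbo.MyLargeTable')
-- Or refresh statistics alone
UPDATE STATISTICS dbo.MyLargeTable WITH FULLSCAN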
View 5 Replies
May 12, 2006
Hi,
I have a strange performance issue. I have the following query which takes 40+ seconds to run:
SELECT Count(x)
FROM view WHERE x = 347
AND y = 10056
AND z = 2
AND w = '01'
But if I switch it to the query below, it returns the one result quickly (1 second):
SELECT x
FROM view WHERE x = 347
AND y = 10056
AND z = 2
AND w = '01'
If the view is returning results quickly, why is it so much trouble for SQL to run the aggregate function on the results?
Any help is appreciated.
-Brian
View 3 Replies
Nov 28, 2005
Hi,
I have a really interesting one for you guys... SQL Server 2000 SP3 on Windows Server 2003.
If I run this query:
declare @find varchar(50)
SET @find = 'TTLD0006423203'
SELECT TOP 250 ConsignmentID, c.Created
FROM tblConsignment c WITH (NOLOCK)
WHERE c.ConNoteNum LIKE @find + '%'
ORDER BY c.Created DESC
It takes 5 - 7 seconds with an Index Scan on the Consignment table.
HOWEVER, if I run either of the next two queries below they are instant (under 1 second) with no scan, only an Index Seek:
declare @find2 varchar(50), @SQL nvarchar(4000)
SET @find2 = 'TTLD0006423203'
SET @SQL = 'SELECT TOP 250 ConsignmentID, c.Created
FROM Tranzbranch_archive.dbo.tblConsignment c WITH (NOLOCK)
LEFT JOIN tblCustomer cu WITH (NOLOCK) ON c.FreightPayerCode = cu.CustCode
WHERE c.ConNoteNum LIKE ''' + @find2 + '%''
ORDER BY c.Created DESC'
execute sp_executesql @stmt = @SQL
OR
SELECT TOP 250 ConsignmentID, c.Created
FROM tranzbranch_archive.dbo.tblConsignment c WITH (NOLOCK)
LEFT JOIN tblCustomer cu WITH (NOLOCK) ON c.FreightPayerCode = cu.CustCode
WHERE c.ConNoteNum LIKE 'ttld0006423203%'
ORDER BY c.Created DESC
Can you please help me, as this is causing huge issues in our live system and I really don't want to rewrite 400+ stored procedures!!!!
Thank you thank you thank you in advance.... :-)
Auday
View 2 Replies
Apr 21, 2008
Dear All,
I've got a strange performance issue with my existing application. The application ran fine for a period of time. However, it now gets a timeout error every time, since the number of records has grown to a certain size.
The program uses a Typed DataSet to access a SQL Server 2005 database. The connection is made over a VPN. The same program and SQL run in a LAN environment perform reasonably well. However, when run over the VPN, the CPU and I/O usage jump to very high numbers. In SQL Profiler, I found that the duration of execution is less than 20ms in the LAN environment, while the VPN gives me a figure around 30000ms (which means a timeout).
I don't want to simply increase the timeout of my connection and command to solve this problem temporarily. What's the actual cause of such a huge difference in performance?
Please give me some hints! Thanks a lot!
Regards,
Alex
View 9 Replies
Apr 12, 2007
Hi,
I've got a strange performance issue:
I'm using a SQL statement with a CTE to recursively get all node IDs from a tree, beginning with a root node. In a second statement I use this CTE in a SELECT ... WHERE nodeID IN (SELECT ID FROM CTE_Nodes) query to fetch the data belonging to the nodes related to that root node.
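A minimal sketch of the kind of statement being described; the actual schema isn't shown above, so the table and column names here (Nodes, ID, ParentID, DataTable) are placeholders:
-- Hypothetical schema: Nodes(ID, ParentID), DataTable(nodeID, ...)
DECLARE @rootID int
SET @rootID = 1
;WITH CTE_Nodes AS
(
    SELECT n.ID
    FROM Nodes AS n
    WHERE n.ID = @rootID                  -- anchor member: the root node
    UNION ALL
    SELECT child.ID
    FROM Nodes AS child
    INNER JOIN CTE_Nodes AS parent
        ON child.ParentID = parent.ID     -- recursive member: walk down the tree
)
SELECT d.*
FROM DataTable AS d
WHERE d.nodeID IN (SELECT ID FROM CTE_Nodes)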
In our ASP.NET 2.0 application these statements are very slow or time out with an Exception. When I execute the same statement in Management Studio, it executes in less than one second.
BTW, we're using SQL Server 2005 Standard Ed. (9.0.3050 + 9.0.3054) in SQL Server 2000 compatibility mode with SQL Authentication.
I'm a bit frustrated, because the statements are the same.
Thanks in advance,
Klaus
Update:
BTW, the SQL Server is on a server machine, and Management Studio and the ASP.NET application are on my developer machine. For data access we're using Enterprise Library 2.0 with System.Data.SqlClient.
What options do we have to trace this issue? Please help.
View 9 Replies
Jun 14, 2006
Hi,
I have a SQL Server stored procedure that gets called in the ASP.NET application. For a while it will return results quickly. After an unknown amount of time the stored procedure starts taking forever to execute. If I go into Query Analyzer and execute the same stored procedure using the exact same input parameters the results return quickly. If I then go back into the application and run the code that executes the stored procedure, the results return quickly again.
So basically every time the application call to the stored procedure begins to slow, I run that stored procedure in query analyzer and then it runs fine in the application again.
Has anyone else experienced anything like this or have any ideas as to why this would happen?
-Brian
View 1 Replies
Nov 23, 2006
Jezemine,
No, the number of reads is approximately the same. I can also confirm the disk read speed is the same on the test vs. production server. Update stats is run regularly on the production server - as a test, I ran sp_updatestats and then immediately ran the query a few times, but it didn't affect the duration. Apart from the durations in the profiler traces, I can't see any differences. Clearly, something is causing the increased duration on the prod server, but I don't know where to look to find it. It's definitely within SQL Server 2000.
Clive
View 8 Replies
Feb 7, 2008
I have a simple stored procedure that sends a message on a conversation:
Code Snippet
CREATE PROCEDURE [Vxml].[sp_SendMessageToWarehouse]
-- Add the parameters for the stored procedure here
@Message XML,
@EntityId uniqueidentifier,
@ConversationHandle uniqueidentifier OUTPUT
AS
BEGIN
BEGIN TRY
;SEND ON CONVERSATION @ConversationHandle MESSAGE TYPE VxmlEngineXMLMessage(@Message);
END TRY
BEGIN CATCH
INSERT INTO [Vxml].SBError (ErrorMsg) VALUES (N'Error <' + ERROR_MESSAGE() + N'> for entity <' + cast(@EntityId as NVARCHAR(max)) + N'> on conversation <)' + cast(@ConversationHandle as NVARCHAR(max)) + N'> with body ' + cast(@Message as NVARCHAR(max)))
END CATCH;
END
This completes in less than 100ms (avg 30ms), except about once in every 10000 executes (with only one execute running at a time), when it takes about 3 seconds to complete.
When performing the same test and calling the stored procedure simultaneously from 5 to 20 threads, we see up to 3 such spikes (in 10000 executes) taking 2 to 4 seconds.
Conversations are created before the test and closed after. No ODBC errors occur and the SBError table is empty. This is run on a SQL Express instance; the conversation target is a SQL Server 2005 (Standard Edition).
Can anyone explain these strange performance spikes ?
View 1 Replies
Dec 1, 2005
Hello everyone, I am hoping someone can help me with this problem. I will say up front that I am not a SQL Server DBA, I am a developer. I have an application that sends about 25 simultaneous queries to a SQL Server 2000 Standard Edition SP4 running on Windows 2000 Server with 2.5 GB of memory. About 11 of these queries are over views (all over the same table) and these queries are all done from JDBC, but I am not sure that matters. Anyway, initially I had no problem with these queries on the tables and the views with about 4 years of information (I don't know how many rows off hand). Then we changed the tables to replicated tables from another server; that increased the amount of data to 15 years' worth and also required a simple inner join on 2 columns to another table for those views.
Now here is the issue. After times of inactivity, or other times during the day with enough time between my test query runs, I get what looks like blocking behavior on the queries to the views (remember these all go to the same tables). I run my 25 queries and the 11 view queries all take about 120 seconds each to return (they all are within milliseconds of each other, like they all sat there and then were released for processing at the same time). The rest of the queries are fine. Now if I turn around and immediately run the 25 queries again, they all come back in a few seconds, which is the normal amount of time. Also, if I run a query on one of the views first (just one) and then run the 25 queries, they all come back in a few seconds as well.
This tells me that some caching must be involved since the times are so different between identical queries, but I would expect that one of the queries would cache and thus take longer while the other 10 would be fast, not all block for 2 minutes. What is more puzzling is that this behavior didn't occur before, where now the only differences are:
1) 3 times more data (but that shouldn't cause a difference from 3 seconds to 120, and all tables have been through the index wizard with a SQL trace file to recommend indexes)
2) There is now a join between 2 tables where there wasn't before
3) The tables are replicated throughout the day.
I would appreciate any insight into this problem as 120 seconds is way too long to wait. Thanks in Advance.
Chris
View 1 Replies
Feb 22, 2008
Hello,
I have the following queries that run on a view called EntrySummary:
1)
exec sp_executesql N'SELECT COUNT (1) FROM [dbo].[EntrySummary] WHERE [EntrySummary].[SubmissionStatusID] = @SubmissionStatusID0 AND [EntrySummary].[CreatedBy] = @CreatedBy1',N'@SubmissionStatusID0 int,@CreatedBy1 nvarchar(20)',@SubmissionStatusID0=4,@CreatedBy1=N'domainaperson'
2)
exec sp_executesql N'SELECT COUNT (1) FROM [dbo].[EntrySummary] WHERE [EntrySummary].[CreatedBy] = @CreatedBy1 AND [EntrySummary].[SubmissionStatusID] = @SubmissionStatusID0',N'@SubmissionStatusID0 int,@CreatedBy1 nvarchar(20)',@SubmissionStatusID0=4,@CreatedBy1=N'domainaperson'
(The only difference between the two queries being the order of the where clauses)
Both return the correct answer (4144), but Query 2 takes 10-15 seconds whereas Query 1 takes less than a second.
If the same query is resubmitted several times, this doesn't affect the times.
(The vast majority of the records have a status of 4, but only about 3% will be created by the person).
Replacing Count(1) with count(*) makes both queries return quickly.
Is sp_executesql creating poor execution plans ? Can anyone explain this behaviour?
Regards,
AndyM
View 1 Replies
Dec 10, 2007
Hi all,
We had a SQL Server (2005) that was performing very badly. We thought of doing an UPDATE STATISTICS, thinking that the response times would improve.
BUT, UNFORTUNATELY, THE SERVER PERFORMANCE WENT FROM BAD TO WORSE.
Ex: Stored procedures that took 2 minutes are now running for 22 minutes. Queries that ran for 17 seconds are now running for 14 minutes.
Has anybody faced this kind of issue before? Any help would be appreciated.
Thanks & Regards,
Hariarul
View 2 Replies
Apr 19, 2007
Can anyone help me understand what this means:
Code Snippet
java.sql.BatchUpdateException: com.microsoft.sqlserver.jdbc.SQLServerException: The IOBuffer.process operation returned an unknown packet type:0. Index:41. End:83.TDS_DONEINPROC(-1) TDS_DONEPROC(-2) TDS_DONE(-3) TDS_COLMETADATA(-127)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(Unknown Source)
I'm trying to do a very simple batch update.
View 7 Replies
Jan 23, 2008
Hi,
My friend expresses his concern over the performance of his update query.
He says the execution takes around 120 ms.
The query looks like:
Update table set col1 = 'value', col2 = 'val', col3= 'val'............ col40='val'
where col1 = 'value' and col2 = 'value' and col3 = 'val' and col4 = 'val'
He says he's updating around 40 columns at a stretch with filter conditions as shown, and only 1 row is affected by this. Col1, col2, col3 and col4 form the composite primary key and have the default clustered index created on them.
Is the execution time he reports really a performance issue? I'm confused on this.
Experts, please post your comments; any tips on improving UPDATE queries in general will help immensely.
thanks,
View 3 Replies
Oct 20, 2002
I should add an Identity field (Identity=True) and a row version field (timestamp) to my table, and avoid spreading tables across different databases - is this true in general?
View 4 Replies
Nov 6, 2007
Recently we moved a few of our databases from SQL 2000 to SQL 2005 (SP2) using backup and restore. After the restore I did a reindex and updated stats on the databases. Since then we have observed performance issues on the SQL 2005 databases, but the problem vanishes the moment we run sp_updatestats. Is this a problem with SQL 2005, that we have to run sp_updatestats 2 or 3 times a day? In SQL 2000 we ran it only once a week and still never had any performance issues. Is there any config change we need to make to fix this problem in SQL 2005?
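For reference, a hedged sketch of what is commonly recommended after restoring a 2000 database onto 2005, since sp_updatestats only samples rows while a full-scan update rebuilds statistics completely (sp_MSforeachtable is an undocumented but widely used helper; run this once in each restored database):
-- Rebuild statistics on every user table with a full scan after the upgrade
EXEC sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN'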
Thanks !
View 11 Replies
May 8, 2006
Hi,
I have an UPDATE statement which runs very slowly (about 1-2 minutes) on the Web as well as in Query Analyzer. It's a very plain UPDATE statement; it uses indexes, etc. I thought the reason might be the table being disorganized because of row expansions due to updating an existing column value with a larger value. Therefore, I ran a maintenance plan which reorganizes data and rebuilds indexes. But it seems that there is no improvement.
Any thoughts or advice will be greatly appreciated.
Sincerely,
Pelin Bali.
View 1 Replies
Jul 20, 2005
Dear SQL Server experts:
First off, I am no SQL Server expert :)
A few months ago I put a database into a production environment. Recently, it was brought to my attention that a particular query that executed quite quickly in our dev environment was painfully slow in production. I analyzed the plan on the production server (it looked good), and then tried quite a few tips that I'd gleaned from reading newsgroups. Nothing worked. Then on a whim I performed an UPDATE STATISTICS on a few of the tables that were being queried. The query immediately went from executing in 61 seconds to under 1 second. I checked to make sure that statistics were being "auto updated" and they were.
Why did I need to run UPDATE STATISTICS? Will I need to again?
A little more background info: The database started empty, and has grown quite rapidly in the last few months. One particular table grows at a rate of about 300,000 records per month. I get fast query times due to a few well placed indexes.
A quick question: If I add an index, do statistics get automatically updated for this new index immediately?
Thanks in advance for any help,
Felix
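For reference, a hedged sketch of the statements usually used to check how old a table's statistics are and to refresh them explicitly (the table name dbo.Orders is a placeholder, not from the question):
-- When were the statistics on each index of the table last updated? (SQL 2000-style system table)
SELECT name, STATS_DATE(id, indid) AS stats_last_updated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.Orders')
-- Refresh them explicitly with a full scan
UPDATE STATISTICS dbo.Orders WITH FULLSCAN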
View 17 Replies
Jul 20, 2005
Hi,
For an unavoidable reason, I have to use row-by-row processing (update) on a temporary table to update a history table every day. I have around 60,000 records in the temporary table and about 2 million in the history table.
Could anyone please suggest different methods to improve the runtime of the query? Would highly appreciate it!
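One commonly suggested alternative to row-by-row processing is a single set-based UPDATE joining the history table to the temporary table; a minimal sketch with hypothetical table and column names (History, #DailyTemp, KeyCol, ValueCol), since the actual schema isn't shown above:
-- Set-based equivalent of the row-by-row loop: one UPDATE driven by a join
UPDATE h
SET h.ValueCol = t.ValueCol
FROM History AS h
INNER JOIN #DailyTemp AS t
    ON t.KeyCol = h.KeyCol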
View 9 Replies
Jul 20, 2005
Below is a simple UPDATE that I have to perform on a table that has about 2.5 million rows (about 4 million in production). This query runs for an enormous amount of time (over 1 hour). Both ChangerRoleID and ChangerID are indexed (not unique). Is there any way to performance tune this? Controlling the physical drive of the log file isn't possible at our client sites (we don't have control) and the recovery model needs to be set to "Full".
UPDATE CLIENTSHISTORY SET ChangerRoleID = ChangerID WHERE ChangerRoleID IS NULL
Any help would be greatly appreciated!
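One approach often suggested for large one-off updates under the FULL recovery model is to do the work in small batches, so each transaction stays short and the log can be backed up between passes; a hedged sketch using SQL 2000-style SET ROWCOUNT batching and the table/columns named above:
-- Update at most 10000 rows per pass; repeat until nothing is left to change
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    UPDATE CLIENTSHISTORY
    SET ChangerRoleID = ChangerID
    WHERE ChangerRoleID IS NULL

    IF @@ROWCOUNT = 0 BREAK   -- no qualifying rows remain
END
SET ROWCOUNT 0   -- restore normal behaviour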
View 2 Replies
Oct 2, 2007
If I run sp_updatestats, does it affect performance?
My testing also reveals that canceling sp_updatestats doesn't cause a rollback - is that right?
View 4 Replies
Oct 16, 2013
I need to figure out the correct update statement syntax for the following integration.
I have a "Performance Table" which i insert weekly performance numbers into for each store. The table is constructed w/ columns such as Store, Weekenddate, Sales, Refunds, #ofPatients
In a "Averages Table" i have every weekenddate for each store populated. So 52 Weeks for 10 stores = 520 Rows of Store numbers & WeekendDates.
What i would like to do is run a loop or update statement which would update the store average for each weekendate based on the last 13 weeks.
This is my query
update performancestore_avgs set SalesAvg =
(select sum(SalesHit)/Count(Store) from performance_store where performance_store.weekenddate >= performancestore_avgs.weekenddate-84 and performancestore_Avgs.store = performance_store.store)
The update statement runs but the averages are completely wrong.
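One thing that stands out in the query as written is that the date filter has only a lower bound, so every row from 84 days before the weekend date onward - including later weeks - goes into the average rather than just the trailing 13 weeks. A hedged sketch of the same update with an upper bound added (column names taken from the query above; the 1.0 guards against integer division in case SalesHit is an integer column):
update performancestore_avgs set SalesAvg =
    (select sum(1.0 * ps.SalesHit) / count(ps.Store)
     from performance_store ps
     where ps.store = performancestore_avgs.store
       and ps.weekenddate >= dateadd(day, -84, performancestore_avgs.weekenddate)
       and ps.weekenddate <= performancestore_avgs.weekenddate)   -- upper bound keeps it to the trailing 13 weeks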
View 3 Replies
Oct 25, 2007
I installed the SQL Server 2005 SP2 update 2 rollup on my 64-bit server and the performance has tanked! I installed rollup 3 on some of them, but that did not seem to help. I thought it was just a linked server performance issue, but my optimization started running today on one of the "update 2" instances and so far it's been running about 10 hours longer than it normally does. Rollup 3 fixed our stack dumping issues, but we NEED to have this performance thing fixed! I saw that MS came out with update 4 last week - it doesn't say anything about fixing this, though.
Has anyone else experienced this? I'm not necessarily expecting anyone to have a fix for this, just want to know I'm looking in the right place before I call MS.
View 3 Replies
Feb 21, 2007
We recently implemented merge replication. The replication is between 2 SQL Servers (2005) on the same network, and since we introduced the replication, performance has degraded considerably on the subscriber end.
1) One thing that should be mentioned is that it's a unidirectional flow: changes go from publisher towards subscriber (only one publisher, acting as distributor as well, and one subscriber).
2) Updates are more frequent than inserts, and just one article, let's say "Article1", has up to 2000 updates per day. I am seeing that dbo.MSmerge_upd_sp_Article1_GUID takes more CPU time. What should we do?
On the subscriber database, response times are getting slow and I am experiencing a large number of lock timeouts on the application end.
Can anyone also suggest server-level settings for avoiding lock timeouts?
Looking for any experienced solution/suggestion.
Thanks in advance.
View 3 Replies
Jul 19, 2006
Hi everyone,
I'm seeing odd behaviour when I use BIDS. Concretely, when I open a dtsx from my project (it has 10 packages), many times the Sequence Container and Data Flow tasks are invisible. I mean, their outlines are not visible at all whereas their titles are; what you see is just a white box…
Then, when I go to the Data Flow layer I have to double-click the tasks and they become visible, but on the Control Flow I don't see how to fix it.
Curiously, on our development and production servers such behaviour doesn't happen (we access them by means of Terminal Server from our workstations).
How odd! Everything is fine except this.
I want to point out that this project was copied from the server, that is, these packages were built on the server.
Thanks for your thoughts or ideas,
View 5 Replies
Sep 12, 2004
1. Use the MS SQL Server Agent service to handle the schedule
2. Use a .NET Windows service with timers to call SqlClientConnection
Of the two options above, which would be faster and give better performance?
View 2 Replies
Jun 23, 2006
Hello everyone,
I have a very complex performance issue with our production database. Here's the scenario. We have a production web server and a development web server. Both are running SQL Server 2000.
I encountered various performance issues with the production server with a particular query. It would take approximately 22 seconds to return 100 rows, that's about 0.22 seconds per row. Note: I ran the query in single user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plan and I found that they were the exact same in both cases. Then I took a look at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming that the issue is related to some external hardware issue like disk space, memory, etc. Or could it be OS or software related issues, like service packs, SQL Server configurations, etc.
Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There is 55 GB of free space on the disk.
2. Applied SQL Server SP4 service packs
3. Defragmented the primary hard drive
4. Applied all Windows Server 2003 updates
Here are the prod server's system specs:
2x Intel Xeon 2.67GHz
Total Physical Memory 2GB, Available Physical Memory 815MB
Windows Server 2003 SE w/ SP1
Here are the dev server's system specs:
2x Intel Xeon 2.80GHz
2GB DDR2-SDRAM
Windows Server 2003 SE w/ SP1
I'm not sure what else to do; the query performance is an order of magnitude different and I can't explain it. To me it is a hardware or operating system related issue.
Any ideas would help me greatly!
Thanks,
Brian T
View 2 Replies
Mar 6, 2007
I have a datasource where I assign the control parameter value depending on whether there's a value in the querystring or a GridView SelectedValue is indeed selected. When I do this using a querystring it works fine; however, when I do it via gridView_SelectedIndexChanged it doesn't work. Here's the code:
<asp:SqlDataSource ID="jobDetailsDS" runat="server" ConnectionString="<%$ ConnectionStrings:promanConn %>"
    SelectCommand="SELECT tblJobs.intJobId, tblJobs.intJobStaffId, tblJobs.intJobUserId, tblJobs.intJobCategoryId, tblJobs.strJobTitle, tblJobs.strJobDesc, tblJobs.strJobNotes, tblJobs.dateJobLogged, tblJobs.dateJobDeadline, tblJobs.dateJobCompleted, tblJobs.intJobPriorityId, tblJobs.strJobSolution, tblJobs.boolJobCompleted, tblPriority.priorityName, tblPriority.id, tblPriority.priorityOrder, tblStaff.staffName, tblStaff.id AS staffID, tblProblemCategoriesLookup.strProblemCatName, tblProblemCategoriesLookup.intProblemCatId, tblStaff_1.id AS Expr1, tblStaff_1.staffName AS UserName FROM tblJobs INNER JOIN tblPriority ON tblJobs.intJobPriorityId = tblPriority.id INNER JOIN tblStaff ON tblJobs.intJobStaffId = tblStaff.id INNER JOIN tblProblemCategoriesLookup ON tblJobs.intJobCategoryId = tblProblemCategoriesLookup.intProblemCatId INNER JOIN tblStaff AS tblStaff_1 ON tblJobs.intJobUserId = tblStaff_1.id WHERE (tblJobs.intJobId = @intJobId)"
    InsertCommand="INSERT INTO tblJobs(intJobStaffId, intJobUserId, intJobCategoryId, strJobTitle, strJobDesc, strJobNotes, dateJobLogged, dateJobDeadline, intJobPriorityId, boolJobCompleted) VALUES (@intJobStaffId, 56, 39, @strJobTitle, @strJobDesc, @strJobNotes, { fn NOW() }, { fn NOW() }, @intJobPriorityId, 0);">
    <SelectParameters>
        <asp:Parameter Name="intJobId" Type="int32" />
    </SelectParameters>
    <InsertParameters>
        <asp:Parameter Name="intJobStaffId" Type="Int32" />
        <asp:Parameter Name="strJobTitle" Type="String" />
        <asp:Parameter Name="strJobDesc" Type="String" />
        <asp:Parameter Name="strJobNotes" Type="String" />
        <asp:Parameter Name="intJobPriorityId" Type="Int32" />
        <asp:Parameter Name="intJobUserId" Type="Int32" />
    </InsertParameters>
</asp:SqlDataSource>
Code behind:
    ElseIf dest > 0 Then
        jobDetailsDS.SelectParameters.Item("intJobId").DefaultValue = Request.QueryString("dest")
        notesDS.SelectParameters.Item("intJobId").DefaultValue = Request.QueryString("dest")
        notesDS.InsertParameters.Item("intJobId").DefaultValue = Request.QueryString("dest")
        gvActiveJobs.Visible = "False"
    End If
End Sub

Protected Sub gvActiveJobs_SelectedIndexChanged(ByVal sender As Object, ByVal e As System.EventArgs) Handles gvActiveJobs.SelectedIndexChanged
    gvActiveJobs.Visible = "False"
    jobDetailsDS.SelectParameters.Item("intJobId").DefaultValue = gvActiveJobs.SelectedValue
    notesDS.SelectParameters.Item("intJobId").DefaultValue = gvActiveJobs.SelectedValue
    notesDS.InsertParameters.Item("intJobId").DefaultValue = gvActiveJobs.SelectedValue
End Sub
I also debugged and checked the value of gvActiveJobs.SelectedValue and it is what I was expecting, but it's still no good. Can anyone help?
Thanks
Matt
View 2 Replies
Nov 19, 2001
I moved a table from one filegroup to another filegroup. After moving the table, the row count shows 0, but when I open the table it returns all the rows. Is something wrong with the table? Do I have to create a clustered index and then drop it on the table?
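One explanation sometimes offered for a zero row count after this kind of operation is that the row-count metadata SQL Server keeps in sysindexes (which sp_spaceused and the table properties display) has gone stale, rather than anything being wrong with the data; a hedged sketch of the usual check and fix (the table name is a placeholder):
-- Recalculate the stored row and page counts for the table, then re-check them
DBCC UPDATEUSAGE (0, 'dbo.MyMovedTable')
EXEC sp_spaceused 'dbo.MyMovedTable'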
View 1 Replies
Aug 2, 2006
Hi,
I have a table in my db that's used as a temporary table. At the end of the day it contains thousands of records which get summarized, and then all of them are deleted. I recently moved the application over to a new server with pretty much the same hardware config and noticed a big performance hit when running queries on this temp table (455 milliseconds as opposed to 1). In short, the fix was dropping the table and re-creating it.
Anybody know how this could be? I know that even if you delete records from a table, the table still seems to retain its physical size on the hard drive; could this have something to do with it?
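For reference, a minimal sketch of the alternative often suggested for this clear-out-daily pattern, since TRUNCATE deallocates the table's pages instead of deleting row by row (the table name is a placeholder, not from the description above):
-- Empty the staging table and release its pages in one cheap operation
TRUNCATE TABLE dbo.DailyStagingTable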
Thanks
View 3 Replies