I have a procedure (used to create a report) that was running under SQL 7.0 Service Pack 3.
The problem is that we are upgrading to SQL 2000, and this procedure now takes 1 minute and 30 seconds to execute versus 10 seconds previously.
Everything is the same between the SQL 7 and SQL 2000 servers, i.e. database size, indexes, hardware, etc.
I looked at the query execution plan, and it seems to do a sort which takes the majority of the resources on SQL 2000, even though no sort statement is issued in the procedure itself.
Any help would be appreciated. I am curious to find out why this is the case when all the variables are the same between the two servers, yet SQL 2000 performance is much worse than SQL 7.0. It should be the other way around!
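For reference, a common first step after an upgrade is to refresh statistics and force a recompile, since plans and statistics carried over from SQL 7.0 may no longer suit the SQL 2000 optimizer. A minimal hedged sketch (the procedure name ReportProc is a placeholder for the poster's procedure):

    EXEC sp_updatestats                  -- refresh statistics in the current database
    EXEC sp_recompile 'dbo.ReportProc'   -- force a fresh plan the next time the procedure runs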
Hi All,

I have a table that currently contains approx. 8 million records. I'm running a SELECT query against this table that in some circumstances is either very quick (i.e. results returned in Query Analyzer almost instantaneously), or very slow (i.e. 30 to 40 seconds to return results), and I'm trying to work out how I improve performance.

Essentially the query I'm running is nothing more complex than:

    SELECT TOP 1 * FROM Table1 WHERE tier = n ORDER BY member_id

[tier] is a smallint column with a non-clustered, non-unique index on it. [member_id] is a numeric column with a clustered, unique index on it.

When I supply a [tier] value of 1, it returns results instantaneously. I have no idea if this is meaningful, but the tier = 1 records were loaded first into the table, and comprise approximately 5 million records.

When I supply a [tier] value of 2, the results take 30 to 40 seconds. tier = 2 records were loaded second, and comprise approximately 3 million records.

I've tried running an execution plan, and while I'm no expert, it appears to me that the index on tier isn't being used, even if I use tier = CAST(2 AS SMALLINT).

I'm wondering if anyone can give me ANY advice on how to get any better performance out of this SELECT statement? Also, out of curiosity, can a disk defragment have a positive impact on SELECT query performance?

Any help very much appreciated!

Much warmth,
Murray
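One avenue worth sketching: a composite index covering both the filter and the sort would let the TOP 1 be satisfied by a single seek, avoiding either a big sort or a clustered-index scan. A hedged example (the index name is a placeholder):

    CREATE INDEX IX_Table1_tier_member_id ON Table1 (tier, member_id)

With [tier] leading and [member_id] second, rows within each tier are already stored in member_id order, so SELECT TOP 1 ... WHERE tier = 2 ORDER BY member_id can stop after reading one row.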
Can I roll back a query (insert/update) executed in one page if a query (insert/update) executed in another page fails, in ASP.NET? (I am using SQL Server 2000 as the back end.)

Scenario: in webpage1 I have an insert query into a master table, and in page2 I have an insert query to store data in a sub table. I need to roll back the insert command for the sub table if the insert command to the master table in webpage1 fails. (The query in webpage2 executes first; only then does the query in webpage1 execute.) Can I use System.Transactions to solve this? Thanks in advance.
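As an aside, System.Transactions against SQL Server 2000 escalates to a distributed MSDTC transaction, so if both inserts can instead be funneled through a single stored procedure, the rollback can be handled entirely server side. A hedged T-SQL sketch only; the procedure, table, and column names are placeholders for the poster's master/sub tables (SQL 2000 has no TRY/CATCH, hence the @@ERROR checks):

    CREATE PROCEDURE dbo.InsertSubAndMaster
        @subValue NVARCHAR(100),
        @mastValue NVARCHAR(100)
    AS
    BEGIN TRANSACTION
        INSERT INTO dbo.SubTable (SubColumn) VALUES (@subValue)
        IF @@ERROR <> 0
        BEGIN
            ROLLBACK TRANSACTION   -- undo the sub-table insert
            RETURN
        END
        INSERT INTO dbo.MasterTable (MastColumn) VALUES (@mastValue)
        IF @@ERROR <> 0
        BEGIN
            ROLLBACK TRANSACTION   -- master insert failed: undo both
            RETURN
        END
    COMMIT TRANSACTION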
Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
    Try
        ' Load the package named "NEW" from the local server, then run it
        Dim objDTSPackages As New DTS.Package
        objDTSPackages.LoadFromSQLServer("(local)", "sa", "", _
            DTS.DTSSQLServerStorageFlags.DTSSQLStgFlag_Default, _
            "", "", "", "NEW")
        objDTSPackages.Execute()
    Catch ex As Exception
        lblmsg.Text = ex.Message
    End Try
End Sub

Using the above code I am getting an error. Can anyone solve this problem?
Hi all, I'm using SQL Server 2000, and sometimes I need to run my queries in Query Analyzer before using them in my application, just to test them... BUT most of the time when I run a query in Query Analyzer for the second time, Query Analyzer returns the results (records) much quicker than the first time. Apparently it caches the query! For some reasons I don't want this, so how can I prevent Query Analyzer from doing so? Thanks in advance. Regards.
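Strictly speaking, it is the server rather than Query Analyzer that caches: data pages stay in the buffer pool and plans in the procedure cache. For timing tests on a non-production box, the caches can be cleared between runs. A hedged sketch (both DBCC commands require sysadmin and should not be run on a live server):

    DBCC FREEPROCCACHE      -- discard cached execution plans
    CHECKPOINT              -- flush dirty pages so the next command can drop them
    DBCC DROPCLEANBUFFERS   -- empty the buffer pool, forcing physical reads again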
Please kindly help me with creating a database. My problem: first we create a database through Enterprise Manager, and the second step is that I want to create tables.
When I create any table in the database, and afterwards execute a query / show the table, this message is displayed:
"The query cannot be executed because some files are either missing or not registered. Run setup again to make sure the required files are registered."
How do I handle this problem? What can I do?
SELECT *,
       ISNULL(dbo.fn_1(a), '') AS f_a,
       ISNULL(dbo.fn_1(b), '') AS f_b
FROM tablea WITH (NOLOCK)
WHERE ID = 12345
  AND ID < 10000000
  AND dbo.fn_1(7) = 'asdfasdf'
  AND Active = 'Y'

Does it affect performance that I am using a UDF here? Also, let me know the order in which the conditions will be applied.
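One detail worth noting: dbo.fn_1(7) compares two constants, yet SQL Server may still evaluate the scalar UDF per row rather than folding it. A hedged sketch of hoisting the constant call out (assuming fn_1 returns a string; @const is a placeholder name):

    DECLARE @const VARCHAR(200)
    SET @const = dbo.fn_1(7)          -- evaluate the constant call once

    SELECT *,
           ISNULL(dbo.fn_1(a), '') AS f_a,
           ISNULL(dbo.fn_1(b), '') AS f_b
    FROM tablea WITH (NOLOCK)
    WHERE ID = 12345
      AND ID < 10000000               -- redundant given ID = 12345, kept from the original
      AND @const = 'asdfasdf'
      AND Active = 'Y'

As for ordering: SQL Server makes no guarantee about the order in which WHERE conditions are evaluated; the optimizer is free to rearrange them.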
I want to know whether there is any way to see which query the engine is going to execute, prior to its execution. I think if this is possible then it will affect performance, but it would be very helpful for me.
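If the goal is to see what the server would do without actually running the statement, SQL Server can return the estimated plan instead of executing. A hedged sketch (MyTable is a placeholder; the SELECT is not actually executed while SHOWPLAN is on):

    SET SHOWPLAN_TEXT ON
    GO
    SELECT * FROM MyTable WHERE SomeColumn = 1
    GO
    SET SHOWPLAN_TEXT OFF
    GO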
I have a SQL query to search for fields in a rather big view. If I execute the query in SQL Server Enterprise Manager, the results are displayed in less than 6 seconds. However, if I execute it using ASP.NET, it takes very long (more than 2 minutes).
The query is a simple one like "SELECT * FROM myview WHERE name LIKE '%Microsoft%'". And the code I use to execute it in ASP.NET is
Dim dsRtn As DataSet
Dim objConnection As OleDbConnection
Try
    objConnection = GetOleDbConnection()
    objConnection.Open()
    ' Fill a DataSet with the results of the search query
    Dim objDataAdapter As New OleDbDataAdapter(strSearch, objConnection)
    Dim objDataSet As New DataSet()
    objDataAdapter.Fill(objDataSet, strTableName)
    dsRtn = objDataSet
Catch ex As Exception
    dsRtn = Nothing
Finally
    If objConnection.State = ConnectionState.Open Then
        objConnection.Close()
    End If
End Try
where strSearch is the SQL search string.
I don't have any problem using such code for other queries.
Could somebody suggest the cause of the problem and how to solve it? Thanks!
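A frequent cause of this exact symptom is that Query Analyzer / Enterprise Manager and OLE DB connections use different SET options (notably ARITHABORT), so the server compiles a different plan for the same query text. A hedged experiment rather than a guaranteed fix: issue the same option on the ASP.NET connection before the query and compare timings:

    SET ARITHABORT ON
    SELECT * FROM myview WHERE name LIKE '%Microsoft%'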
I have a query where I am joining eight different tables. As I join each additional table, the execution speed drops. Even on my local server it takes nearly 2 to 3 minutes to execute the query. How can I increase the speed of execution of my query?
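A reasonable first diagnostic step is to find which table the time is going to. A hedged sketch using the built-in statistics output (run in Query Analyzer and compare the per-table figures):

    SET STATISTICS IO ON
    SET STATISTICS TIME ON
    -- run the eight-table join here; the table with the highest logical
    -- reads is usually the one missing an index on its join column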
I wanted to know whether we have an execution plan option in SQL 6.5 as we have in SQL 7.0 and SQL 2000, i.e., when we execute a query with 'Show Execution Plan' enabled, it creates a map and shows the vital statistics. If that is available in SQL 6.5, then I am missing that tool.
How can I have it installed on my SQL 6.5 server?
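From memory of the 6.5 tooling, there is no graphical plan to install; the graphical plan arrived with 7.0. What 6.5 does offer is a text plan from ISQL/w. A hedged sketch (MyTable is a placeholder; pairing with SET NOEXEC avoids actually running the statement):

    SET SHOWPLAN ON
    SET NOEXEC ON
    GO
    SELECT * FROM MyTable WHERE KeyCol = 1
    GO
    SET NOEXEC OFF
    SET SHOWPLAN OFF
    GO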
I am using a stored procedure, and in this SP I have a simple SELECT statement. I have found that when I execute this SP in Query Analyzer it takes about 8-10 minutes to show the output. The table has thousands of records. I can rebuild indexes on the table, but apart from this, what else can I do to speed up the query?
I know there is a way to use indexes explicitly in a SQL query. Is that true? If yes, please show me how to use it by giving an example.
Also, is there any other way to run the query much faster?
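For the explicit-index part of the question: SQL Server accepts an index hint in the FROM clause. A hedged illustration only, since hints are usually a last resort and the optimizer normally chooses well when statistics are current (table, index, and column names are placeholders):

    SELECT col1, col2
    FROM dbo.MyTable WITH (INDEX(IX_MyTable_Col1))
    WHERE col1 = 42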
We have SQL Server 2000, and on it is an Oracle linked server. I'm trying to run the following query...
SELECT DISTINCT
       a.auf_nr AS OrderNo,
       e.ku_name AS Customer,
       d.bestell_dat AS OrdDate,
       d.liefer_dat AS DelvDate,
       CAST(SUM(b.anz) AS FLOAT) AS Qty,
       CAST(SUM((CAST(c.breite AS FLOAT) / 1000 * CAST(c.hoehe AS FLOAT) / 1000) * b.anz) AS FLOAT) AS SQM,
       CAST(SUM(a.liefer_offen) - (SUM(a.anz) - SUM(b.anz)) AS FLOAT) AS AvailDelv,
       CAST(SUM(a.liefer_anz) AS FLOAT) AS Delvd,
       CAST(SUM(c.sum_brutto * a.anz) AS FLOAT) AS Value
FROM liorder..LIORDER.AUF_STAT a,
     liorder..LIORDER.AUF_LIP_STATUS b,
     liorder..LIORDER.AUF_POS c,
     liorder..LIORDER.AUF_KOPF d,
     liorder..LIORDER.KUST_ADR e
WHERE a.auf_nr = b.auf_nr
  AND b.auf_nr = c.auf_nr
  AND c.auf_nr = d.auf_nr
  AND d.kunr = e.ku_nr
  AND a.auf_pos = b.auf_pos
  AND b.auf_pos = c.auf_pos
  AND b.lip_status = 7
  AND c.ver_art != 'V'
  AND a.history = 0
  AND a.rg_stat != 2
  AND e.ku_name IS NOT NULL
  AND e.ku_vk_ek = 0
  AND d.bestell_dat BETWEEN '01/01/2005' AND '12/17/2005'
GROUP BY a.auf_nr, d.liefer_dat, b.lip_status, d.bestell_dat, e.ku_name, d.kopf_tour, d.kopf_firma
HAVING CAST(SUM(a.liefer_offen) - (SUM(a.anz) - SUM(b.anz)) AS FLOAT) > 0
...and it takes around 2 minutes to show the results, even if the date range covers a single date. I tried using an indexed column, but I still get the same slow execution time. I even tried to create a UDF so that the WHERE clause would be resolved remotely on the Oracle DB, but still the same. Is there any way I can do this more efficiently and faster?
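With four-part names, SQL Server may pull whole tables across the link and join them locally, which can dominate the run time. One commonly suggested alternative is OPENQUERY, which ships the entire statement to Oracle so the joins, filters, and grouping run remotely. A hedged, abbreviated sketch only; the inner text must be valid Oracle SQL, and only the shape is shown here:

    SELECT *
    FROM OPENQUERY(liorder,
        'SELECT a.auf_nr, e.ku_name /* ...other columns and aggregates... */
         FROM LIORDER.AUF_STAT a, LIORDER.AUF_LIP_STATUS b,
              LIORDER.AUF_POS c, LIORDER.AUF_KOPF d, LIORDER.KUST_ADR e
         WHERE a.auf_nr = b.auf_nr /* ...remaining joins, filters, grouping... */')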
I have a .NET application that calls a stored procedure. When it does, the execution runs and never ends (I have to kill the Windows process). When I call the SP from within Management Studio, it also never finishes executing and I have to cancel the query. But when I call it again immediately afterwards, it takes 45 seconds to complete.
Now, the SP has several parts, and I have made it print a message at the end of each part so that I can see where it stops. Strangely enough, it completes all parts except the last one, which has the form INSERT INTO myLocalTable SELECT * FROM MyRemoteTable. But if I execute the SELECT independently, I discover that it returns no rows! Also, many of the @@ROWCOUNT values printed after the other parts show zero rows involved, or just a few. I am not using cursors; each part is an UPDATE statement or an INSERT.
TestMachine1 runs SQL 2005 SP2 and has myRemoteServer (a SQL 2000 server) as a linked server. The stored procedure on TestMachine1 inserts rows into a table on myRemoteServer and brings back some rows.
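One experiment that sometimes isolates this kind of linked-server hang is to force the remote SELECT through OPENQUERY, so the remote portion executes entirely on the SQL 2000 side before the local INSERT starts. A hedged sketch using the names from the post (the dbo schema on the remote table is an assumption):

    INSERT INTO myLocalTable
    SELECT * FROM OPENQUERY(myRemoteServer, 'SELECT * FROM dbo.MyRemoteTable')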
Can anybody help me solve the error below? When calling the SP from the front-end application, the error below is thrown, but if I execute the SP in the back end, no error is thrown and a result set is produced.
Error thrown: The query has been canceled because the estimated cost of this query (7) exceeds the configured threshold of 6. Contact the system administrator.
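That message comes from the query governor cost limit, which cancels any query whose estimated cost exceeds the configured value (6 here). A hedged sketch of the two places it can be changed, assuming you have the permissions to do so:

    -- per connection: override the limit for this session only (0 = no limit)
    SET QUERY_GOVERNOR_COST_LIMIT 0

    -- server-wide: raise or disable the threshold (an advanced sp_configure option)
    EXEC sp_configure 'query governor cost limit', 0
    RECONFIGURE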
I have a strong feeling that this isn't possible but I thought I might as well ask...
I'm developing a database application (SQL Server 2000 back end) where the client and the server are separated by a slow network connection (satellite). There are parts of the application where I will have to query a large result set, so I was wondering if there is a way to determine the percent complete of a query, so I can put a progress bar on the interface and the user can see it loading data instead of a frozen form.
I thought about splitting the query into several smaller ones and updating the bar as each one completes, but I'd rather not do that because in the application development environment I'm working in, I have to close the result set and reopen it every time I run a query... unless this isn't a big deal, but I'm under the impression it's something to avoid.
Hi there - I'm hoping someone can help me!

I'm having a problem with a live database that I'm running on MSDE - it seems to have slowed down quite considerably from the test environment (even when all the data is the same). This is notably different on one particular query that takes 1 sec on the test machine and almost 1 min on the live machine.

The total number of user connections on the live machine is normally 4 or so (found out through Performance Monitor), so I can't see that it's MSDE's performance throttler...

Has anybody got any ideas on things I can check for?

Many thanks
James
We have a console.php which takes in SQL queries and displays them in a result.php web page. Sometimes the query takes minutes to execute, or crashes the PHP application. Is it possible to cancel a query during execution? If yes, how does one go about it? The PHP console is used by multiple users simultaneously. The queries are executed on a remote database.
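If the client side cannot cancel cleanly, a runaway query can be killed from the database side instead. A hedged sketch, assuming the remote database is SQL Server (the SPID 57 is a made-up example; identify the real one first):

    EXEC sp_who2   -- find the SPID running the console's query
    KILL 57        -- terminate that session, rolling back its open work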
Hi,

We are trying to solve a real puzzle. We have a stored procedure that exhibits *drastically* different execution times depending on how it's executed. When run from QA, it can take as little as 3 seconds. When it is called from an Excel VBA application, it can take up to 180 seconds. Although, at other times, it can take as little as 20 seconds from Excel.

Here's a little background. The 180 second response time *usually* occurs after a data load into a table that is referenced by the stored procedure. A check of DBCC SHOW_STATISTICS shows that the statistics DO get updated after a large amount of data is loaded into the table.

*** So, my first question is: do the updated statistics force a recompile of the stored procedure?

Next, we checked syscacheobjects to see what was going on with the execution plan for this stored procedure. What I expected to see was ONE execution plan for the stored procedure. This is not the case at all. What is happening is that TWO separate COMPILED PLANs are being created, depending on whether the SP is run from QA or from Excel. In addition, there are several EXECUTABLE PLANs that correspond to the two COMPILED PLANs. Depending on *where* the SP is run, the usecount increases for the various EXECUTABLE PLANs.

To me, this does not make any sense! Why are there *multiple* compiled and executable plans for the SAME sp? One theory we have is that we need to call the SP with the dbo qualifier, i.e. EXEC dbo.sp.

Has anyone seen this? I just want to get to the bottom of this and find out why sometimes the query takes 180 seconds and other times only takes 3 seconds!! Please help.

Thanks much
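A plausible piece of the puzzle: SQL Server keys cached plans not just on the procedure but also on the connection's SET options (the setopts column in syscacheobjects), and QA and ADO/Excel typically connect with different defaults, which by itself yields two compiled plans. A hedged way to check, using the SQL 2000 system table named in the post (spMyProc is a placeholder):

    SELECT cacheobjtype, objtype, usecounts, setopts, sql
    FROM master..syscacheobjects
    WHERE sql LIKE '%spMyProc%'

If the two compiled plans show different setopts values, the two clients are compiling under different SET options. Calling the procedure as EXEC dbo.spMyProc also avoids an extra cache lookup, though on its own that rarely explains a 3-vs-180-second gap.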
I have a group of reports using a shared data source. Previewing the report works fine in the report designer, but when I try to view it from a browser (deployed on a website), it gives the error:
"An error has occurred during report processing.
Query execution failed for data set 'DataSet1_ticketInfo'.
Failed to parse SQL.[long sql query here]"
If there's a problem with it, I don't get why it works in preview mode. I'm using SQL Server 2005.
We recently upgraded from SQL 6.5 to SQL 7. I have a few .sql files that were each running in around 5-8 minutes under 6.5. These same files now each take over 30 minutes to run. Has anybody had problems with their queries taking longer to run under 7.0? These files are quite large and are comprised of 3-4 batches with several queries in each batch. If anybody has any thoughts on the cause, please let me know.
If a T-SQL query runs perfectly in development, and when I execute it in production I want it to use the execution plan from development, how can I do that using the cache? I know we can use the hint USE PLAN, but I want to do it with the cache.
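Plans are not portable between servers through the cache itself; each instance rebuilds its own cache. What SQL Server 2005 does offer is capturing the development plan as XML and pinning it to the statement in production with a plan guide. A hedged, abbreviated sketch (names are placeholders, the statement text must match exactly, and the XML plan is elided):

    -- in development: capture the XML plan for the statement of interest
    SELECT qp.query_plan
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
    -- filter here to locate the statement you care about

    -- in production: attach the captured plan with a plan guide
    EXEC sp_create_plan_guide
        @name = N'MyPlanGuide',
        @stmt = N'SELECT ... your statement text ...',
        @type = N'SQL',
        @module_or_batch = NULL,
        @params = NULL,
        @hints = N'OPTION (USE PLAN N''<ShowPlanXML ...>...</ShowPlanXML>'')'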
I need to implement logic similar to a rule engine. Below is an example. How do I execute all the queries in the order given by the second table? What is the best way to implement this?
DECLARE @tblRules AS TABLE (
    RuleNo INT,
    RuleDesc NVARCHAR(500),
    RuleQuery NVARCHAR(MAX),
    QueryExecutionInterval NVARCHAR(50)
)
DECLARE @tblRuleResults AS TABLE (
    RuleResultID INT,
    RuleNo INT,
    ExecuteTime DATETIME,
    NextExecutionTime DATETIME,
    Result NVARCHAR(10)
)

INSERT INTO @tblRules VALUES ('1', 'Fail - 2 times within 1 Hour', 'XXX', 'Every 15 Minutes')
INSERT INTO @tblRules VALUES ('2', 'Fail - 2 times within 2 Hour', 'YYY', 'Every 30 Minutes')
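A minimal sketch of one way to drive this: walk the rules in order and execute each RuleQuery dynamically, logging a result row as you go. This assumes RuleQuery holds a complete executable statement (the 'XXX'/'YYY' placeholders above would be real SQL), and it ignores the interval scheduling, which a fuller implementation would handle by comparing NextExecutionTime to the current time:

    DECLARE @no INT, @q NVARCHAR(MAX)

    DECLARE rule_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT RuleNo, RuleQuery FROM @tblRules ORDER BY RuleNo
    OPEN rule_cur
    FETCH NEXT FROM rule_cur INTO @no, @q
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sp_executesql @q   -- run this rule's SQL text

        -- record the execution; the NextExecutionTime and Result values here
        -- are placeholders that would be derived from QueryExecutionInterval
        -- and the rule's actual outcome
        INSERT INTO @tblRuleResults (RuleResultID, RuleNo, ExecuteTime, NextExecutionTime, Result)
        VALUES (@no, @no, GETDATE(), DATEADD(MINUTE, 15, GETDATE()), 'Pass')

        FETCH NEXT FROM rule_cur INTO @no, @q
    END
    CLOSE rule_cur
    DEALLOCATE rule_cur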