Slow Query Execution, SQL Server 2000
Jul 20, 2005
Hi All,
I have a table that currently contains approx. 8 million records.
I'm running a SELECT query against this table that is sometimes very
quick (i.e. results returned in Query Analyzer almost instantaneously)
and sometimes very slow (i.e. 30 to 40 seconds to return results), and
I'm trying to work out how to improve performance.
Essentially the query I'm running is nothing more complex than:
SELECT TOP 1 * FROM Table1 WHERE tier=n ORDER BY member_id
[tier] is a smallint column with a non-clustered, non-unique index on
it. [member_id] is a numeric column with a clustered, unique index on
it.
When I supply a [tier] value of 1, it returns results instantaneously.
I have no idea if this is meaningful, but the tier = 1 records were
loaded first into the table, and comprise approximately 5 million
records.
When I supply a [tier] value of 2, the results take 30 to 40 seconds.
tier =2 records were loaded second, and comprise approximately 3
million records.
I've tried running an execution plan, and while I'm no expert, it
appears to me that the index on tier isn't being used, even if I use:
tier = CAST(2 as SMALLINT)
I'm wondering if anyone can give me ANY advice on how to get any
better performance out of this SELECT statement?
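One possible explanation (an assumption, not something I can confirm from here): the plan scans the clustered index in member_id order, so TOP 1 finds a tier = 1 row almost immediately but has to scan past millions of rows before it hits the first tier = 2 row. A sketch of two things to try, assuming the nonclustered index on [tier] is named IX_Table1_tier (the name and the composite-index idea are assumptions, not a known fix):
-- Force a seek on the [tier] index so the optimizer doesn't scan the clustered index:
SELECT TOP 1 * FROM Table1 WITH (INDEX (IX_Table1_tier))
WHERE tier = 2
ORDER BY member_id
-- Or create a composite index that serves both the filter and the ORDER BY:
CREATE NONCLUSTERED INDEX IX_Table1_tier_member ON Table1 (tier, member_id)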
Also, out of curiosity, can a disk defragment have a positive impact
on SELECT query performance?
Any help very much appreciated!
Much warmth,
Murray
View 4 Replies
Dec 2, 2003
Hi all,
I have a SQL query to search for fields in a rather big view. If I execute the
query in SQL Server Enterprise Manager, the results are displayed in
less than 6 seconds. However, if I execute it from ASP.NET, it takes
very long (more than 2 minutes).
The query is a simple one like "SELECT * FROM myview WHERE name LIKE
'%Microsoft%'". And the code I use to execute it in ASP.NET is
Dim dsRtn As DataSet
Dim objConnection As OleDbConnection
Try
objConnection = GetOleDbConnection()
objConnection.Open()
Dim objDataAdapter As New OleDbDataAdapter(strSearch, objConnection)
Dim objDataSet As New DataSet()
objDataAdapter.Fill(objDataSet, strTableName)
dsRtn = objDataSet
Catch ex As Exception
dsRtn = Nothing
Finally
If objConnection.State = ConnectionState.Open Then
objConnection.Close()
End If
End Try
Where strSearch is the sql search string.
I don't have any problem using such code for other queries.
Could somebody suggest the cause of the problem and how to solve it? Thanks!
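One thing worth checking (a sketch; that this is the cause here is only an assumption): Enterprise Manager and an ADO.NET/OLE DB connection can run with different session SET options, so the two sessions may end up with different cached plans for the same text. Comparing the options from both connections can confirm or rule this out:
-- Run this from Enterprise Manager / Query Analyzer, and also from the application's
-- connection (e.g. executed through the same OleDbCommand), then compare the output:
DBCC USEROPTIONS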
Best regards,
David
View 9 Replies
View Related
Jul 1, 2004
Hi,
I have a query that joins eight different tables. Each additional join slows the execution down further. Even on my local server it takes nearly 2 to 3 minutes to execute the query. How can I increase the execution speed of this query?
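A minimal sketch of the usual first step, with made-up table and column names (nothing below comes from the actual query): make sure every column used in a JOIN ... ON clause is indexed, then re-check the execution plan in Query Analyzer for remaining table scans.
-- Hypothetical example: if the query joins Orders to OrderDetail on OrderID,
-- index the joining column on the many-row side; repeat for each join in the query.
CREATE NONCLUSTERED INDEX IX_OrderDetail_OrderID ON dbo.OrderDetail (OrderID)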
Thanks in advance,
Uday
View 1 Replies
View Related
May 21, 2002
I have a procedure (used to create a report) that was used on SQL 7.0 Service Pack 3.
The problem is that we are upgrading to SQL 2000, and this procedure now takes 1 minute and 30 seconds to execute vs. 10 seconds previously.
Everything is the same between the SQL 7 and SQL 2000 servers, i.e. database size, indexes, hardware etc.
I looked at the query execution plan and it seems to do a sort, which is taking the majority of the resources on SQL 2000, even though there is no sort statement issued in the procedure itself.
Any help would be appreciated. I am more curious to find out why this is the case when all the variables are the same between the two servers, yet SQL 2000 performance is much worse than SQL 7.0. It should be the other way around!
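A sketch of the usual first step after an upgrade (assuming stale statistics are what push the 2000 optimizer into the extra sort; the procedure name below is a placeholder):
-- Refresh statistics so the SQL 2000 optimizer costs the plan against current data,
-- then force the procedure to recompile using the refreshed statistics:
EXEC sp_updatestats
EXEC sp_recompile 'dbo.YourReportProc'   -- hypothetical name; substitute the real procedure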
View 2 Replies
View Related
Oct 29, 2007
Scenario 1: A sproc executed on the local server against local tables that used to take 40 seconds to run now takes 30 minutes.
- No blocking locks.
- Sometimes "NOP" appears in the command column when sp_who2 is run.
- Perfmon shows nothing out of the ordinary for server resources (memory, processors, etc.); there have been NO configuration changes.
- Occasional lost packets (every 10th) with ping -t.
- I flushed the procedure cache and rebooted the server.
Scenario 2: The same sproc executed on another server, accessing the Scenario 1 server's tables via a server link, runs with no problems in 30 seconds.
SQL Server 2000 SP3a.
View 10 Replies
View Related
Oct 15, 2007
We have a quick query regarding SQL performance. We have SQL Server 2000 (32-bit) and SQL Server 2005 (64-bit) as two separate instances on a DB server.
We were analysing the execution times for the same stored procedure on both instances:
1. Through Remote Desktop on the actual DB server
2. Through Query Analyser on my local machine
The results were as follows:
1. Through Remote Desktop on the actual DB server
Iteration   SP Execution Time (in secs)
            SQL 2000    SQL 2005
1           28          5
2           27          3
3           27          3
4           40          3
5           38          3
Average     32          3
2. Through Query Analyser on my local machine
Iteration   SP Execution Time (in secs)
            SQL 2000    SQL 2005
1           37          96
2           32          77
3           35          84
4           27          79
5           43          91
Average     35          85
Could you please shed some light on why case 2 is slow, and any suggestions to improve it?
Thanks in advance!
View 3 Replies
View Related
May 14, 2007
I have a SQL Server 2005 Std. Ed. 64-bit installation. There is one instance supporting a single production database. I have a CLR udf. This udf uses the XMLDocument object to retrieve XML from a URL. When the CLR udf is executed, there seems to be an initial slow response time. Subsequent response times are very fast. If the CLR udf is not called for a few minutes and then called, the slowdown appears again.
Is there something happening behind the scenes with compilation or something like that which could cause this slowdown?
Any guidance is appreciated.
Thanks in advance.
View 5 Replies
View Related
Apr 19, 2008
Hi,
I am using SQL2K for an intranet web application database, and I have several databases in it.
After a long period of time (about 6 months) the application slows down when accessing the data.
I use the full recovery model (full logging) for the database.
I also set an automatic backup for the database every 15 minutes, but I don't think this is what causes the slowdown.
The database seems to slow down more and more every day.
I keep only the last 3 days of transaction data, and every day I delete transaction data older than 4 days, so the number of records in the transaction table is only about 3000 to 5000.
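A sketch of the routine maintenance that often helps in this situation (database and table names below are placeholders): in full recovery mode the log keeps growing unless it is backed up, and the daily deletes fragment the transaction table's indexes.
-- Back up (and thereby truncate the inactive part of) the transaction log:
BACKUP LOG YourDb TO DISK = 'D:\Backup\YourDb_log.bak'
-- Rebuild the indexes on the heavily deleted table and refresh its statistics:
DBCC DBREINDEX ('dbo.YourTransactionTable')
UPDATE STATISTICS dbo.YourTransactionTable WITH FULLSCAN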
Please help me. Thanks.
View 3 Replies
View Related
Sep 30, 2007
Hi,
We have a server that's too slow. The tables have between about 1 million and 3 million records each. We have about 30 tables for production. The other tables are mostly lookup tables (just for personnel, categories and so on; only static tables).
1/ Do you think that's too much data, I mean about 3 million records in a table? The manager said that maybe if we archive some records and delete them from the tables, the system might become quick and performant again.
2/ When we check the Activity Monitor under Management in Enterprise Manager, we find a process ID 55 that has the value 1 in the Blocking column, and other processes have that process ID 55 in the Blocked By column.
Does the fact that the Blocking column for a process equals 1 mean that the process is having a problem, and how do we know what causes the problem for sure? (See the sketch below.)
3/ When we check Performance Monitor (please note that we have RAID 5 with 5 disks), we see the Avg. Disk Queue Length counter ranging between 20 and 70, and if we run even a simple SELECT query against one of the tables, the Avg. Disk Queue Length counter goes through the roof to 100% in Performance Monitor.
I don't think we have a problem with the CPU; it looks good, not going through the roof.
For your info, we have the web server on that same SQL Server machine as well, but the web server is working well; only SQL Server and the client application that accesses it are slow, while the web pages are pretty fast.
Am I following the right path to diagnose what's wrong? It's my first time doing this, and I am trying to learn.
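For question 2, a sketch of how to see what process 55 is actually doing (55 is just the example SPID from above):
-- Show status, command and blocking information for the blocking process:
EXEC sp_who2 55
-- Show the last statement that process submitted, which usually identifies the culprit query:
DBCC INPUTBUFFER (55)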
I appreciate your guidance.
View 4 Replies
View Related
Sep 13, 2006
We have been working with SSIS for a while and we have not found a solution or a reason for this. We have a master package that calls 10 packages in sequential order (as shown below). If we execute each package separately, it runs in less than 2 minutes, but when we call them through the master package the execution time starts increasing as follows: Child 1 (2 min), Child 2 (3 min), Child 3 (4 min), Child 4 (6 min), Child 5 (7 min), and so on. The Execute Package task has ExecuteOutOfProcess = false (when we set it to True it takes even longer to execute; it was creating a dtsHost.exe process for each child, which always remained in memory after the package finished executing). Can someone please provide a solution or a workaround for this? Any help would be appreciated.
Master
|------ Child 1
|------ Child 2
|------ Child 3
|  ...
|  ...
|------ Child 11
Thanks,
View 4 Replies
View Related
Jun 29, 2007
Hello,
I have a big problem with slow execution of a stored procedure in SQL Server 2005, and I really don't understand the reason. I have a database with a large table (about 400 million rows) and a simple stored procedure to get data from that table (one SELECT statement returning the time and value columns).
The strange thing is that if I call that stored procedure from a .NET application (native SqlClient provider) it takes about 6 seconds to execute, but if I call the same procedure with the same parameters from within SQL Server Management Studio it takes only 25 milliseconds to execute!
I've noticed that from .NET the procedure is called as an RPC with binary data, while in Management Studio a SQL script is executed, so I copied/pasted the script from Management Studio into the .NET code and again the same thing happens (6 seconds from .NET and 25 ms from Management Studio). I traced the executions with SQL Profiler and everything seems to be identical for both applications, except that it takes much longer for the .NET application.
Both SQL Server Management Studio and the .NET application are on the same machine, and SQL Server is on another.
This is the query that when executed in Management Studio takes 25ms:
EXEC [dbo].[GetRawData] @pcu = N'DV_ZERJ_HEV1',@tag = N'MJERENO',@from = N'20070629 07:00:00',@to = N'20070629 08:00:00'
This is the same query in .net application code that takes 6 seconds to execute:
sqlCommand = new SqlCommand("EXEC [dbo].[GetRawData] @pcu = N'DV_ZERJ_HEV1',@tag = N'MJERENO',@from = N'20070629 07:00:00',@to = N'20070629 08:00:00'",sqlConnection);
sqlReader = sqlCommand.ExecuteReader();
At first I thought that Management Studio somehow caches results but if I change parameters of stored procedure it always takes less than 30ms to execute.
I really don't understand this. Please, help!
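One common explanation worth ruling out (a sketch; that it applies here is only an assumption): Management Studio runs with SET ARITHABORT ON by default while a .NET SqlClient connection does not, so the two sessions can end up with different cached plans for the same procedure. You can try to reproduce the slow behaviour inside Management Studio:
-- Mimic the ADO.NET session setting, then run the same call and compare timings:
SET ARITHABORT OFF
EXEC [dbo].[GetRawData] @pcu = N'DV_ZERJ_HEV1', @tag = N'MJERENO',
     @from = N'20070629 07:00:00', @to = N'20070629 08:00:00'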
View 7 Replies
View Related
Sep 27, 2001
Hello ,
I'm a newbie. I wrote a stored procedure, and when I run it in Query Analyzer
I get results very fast, but when I add it to a job, the job runs very, very slowly
(I add records and delete records in the SP). Why?
TIA!
James
View 2 Replies
View Related
Jul 20, 2005
I have two tables:
T1: Key as bigint, Data as char(20) - size: 61M records
T2: Key as bigint, Data as char(20) - size: 5M records
T2 is the smaller, with 5 million records. They both have clustered indexes on Key. I want to do:
update T1 set Data = T2.Data
from T2
where T2.Key = T1.Key
The goal is to match Key values, and only update the Data field of T1 if they match. SQL Server seems to optimize this query fairly well, doing an inner merge join on the Key fields; however, it then does a hash match to get the data fields, and this is taking FOREVER. It takes something like 40 minutes to do the above query, where it seems to me the data could be updated much more efficiently. I would expect to see just a merge and update, like I would see in the following query:
update T1 set Data = [someconstantdata]
from T2
where T2.Key = T1.Key and T2.Data = [someconstantdata]
The above works VERY quickly, and if I were to perform the above query 5 million times (assuming that my data is completely unique in T2 and I would need to), it would finish very quickly, much sooner than the previous query. Why won't SQL Server just match these up while it is merging the data and update in one step? Can I make it do this? If I extracted the data in sorted order into a flat file, I could write a program in ten minutes to merge the two tables and update in one step, and it would fly through this, but I imagine that SQL Server is capable of doing it and I am just missing it.
Any advice would be GREATLY appreciated!
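A sketch of one thing to try (the hint is a suggestion, not a guaranteed fix): tell the optimizer to stay with a merge join for the whole update, so it does not add the hash match step.
UPDATE T1
SET Data = T2.Data
FROM T1
    INNER JOIN T2 ON T2.[Key] = T1.[Key]
OPTION (MERGE JOIN)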
View 3 Replies
View Related
Jan 21, 2008
Hello there,
I'm using a SQL Server 2000 with a dozen databases. The databases are rather small (all sizes together sum up to 10 GB).
The entire server gets extremely slow from time to time (lasting a few days when it happens and then suddenly coming back to normal). A Profiler trace doesn't show anything strange (besides a lot of SQL Agent entries).
I pretty much tried to isolate every single application that makes use of the databases on that server and see whether it was the cause of the problem, but I couldn't find any correlation.
I know the computer this server runs on is quite fragmented. Is there any way I could find out whether this is the cause of my performance issues?
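On the fragmentation question, a sketch (the table name is a placeholder): disk-level fragmentation is measured with the OS defrag tool, but index-level fragmentation inside the databases is often the more likely culprit and can be checked directly:
-- Check fragmentation of one of the larger tables; low Scan Density or high Logical
-- Fragmentation suggests the indexes need rebuilding (DBCC DBREINDEX):
DBCC SHOWCONTIG ('dbo.YourLargeTable') WITH TABLERESULTS, ALL_INDEXES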
I don't know if this is related, but once the server simply went down for some 3 hours, and I wasn't able to bring it up in any way. It eventually started working again by itself. The only thing I did in the meantime was to run DBCC CHECKDB a few times, always getting the response "No error found on the database".
Any hint on that?
View 1 Replies
View Related
Dec 26, 2007
Hi everybody! I have a problem with a SQL Server 2000 database, and I need someone to help me.
I have a PC with the following hardware configuration:
+ RAM: 512 MB
+ CPU: 2.4 GHz (Intel Pentium 4)
+ System: Microsoft Windows Server 2003 Enterprise Edition, Service Pack 1
(call this Machine1)
and another server machine with the following hardware configuration:
+ RAM: 2 GB
+ CPU: 3.6 GHz (Intel Xeon)
+ System: Microsoft Windows Server 2003 Enterprise Edition, Service Pack 2
(uses RAID 5)
(call this Machine2)
SQL Server 2000 is installed on both machines.
I created one table and one script to insert several records into this table. The insert script:
declare @count int , @max int
set @count =1
set @max = 5000
WHILE (@count < @max or @count = @max)
BEGIN
insert into TBAT_STOCKEOD(TRDT,SEQNO,COMPID,TRANSTYPE) values('20071219',cast(@count as varchar(5)),'13215','1')
set @count = @count +1
END
I execute this script on Machine1 and Machine2:
+ Machine1: takes 3 seconds.
+ Machine2: takes 30 seconds.
I don't understand why Machine2 is slower than Machine1, while Machine2's configuration is better than Machine1's. I don't know what is happening.
I checked resource usage in Perfmon while SQL Server executed the script, with the following results:
Machine1: the processor was at 100%,
whereas Machine2 had the physical disk at 100%.
Is there a way to configure SQL Server to improve the disk write speed?
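A sketch of one experiment (the explanation is an assumption): each INSERT commits on its own, so the log is flushed to disk 5000 times, and a RAID 5 volume pays a heavy penalty for many small writes. Wrapping the loop in a single transaction flushes the log far less often:
BEGIN TRANSACTION
declare @count int, @max int
set @count = 1
set @max = 5000
WHILE (@count <= @max)
BEGIN
    insert into TBAT_STOCKEOD (TRDT, SEQNO, COMPID, TRANSTYPE)
    values ('20071219', cast(@count as varchar(5)), '13215', '1')
    set @count = @count + 1
END
COMMIT TRANSACTION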
View 5 Replies
View Related
Apr 11, 2006
I have some VB.NET code that starts a transaction and after that executes a lot of queries one by one. Somehow, when I take out the transaction part, my queries finish in around 10 minutes. With the transaction in place it takes more than 30 minutes on one query, and then I get a timeout.
I have checked sp_lock for my process ID and I've noticed there are a lot of exclusive locks on different objects. Using sp_who I could not see any deadlocks.
I even tried to set the isolation level to READ UNCOMMITTED and still have the same problem.
As I said, once I execute my queries without being in a transaction everything works great.
Can you help me to find out the problem?
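A sketch of how to narrow this down while the slow query is sitting there (the database name is a placeholder): check which transaction is being held open and which session is waiting on which.
-- Oldest active transaction in the database, with its SPID:
DBCC OPENTRAN ('YourDb')
-- Look at the BlkBy column to see who is blocking whom:
EXEC sp_who2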
Thanks,
Laura
View 11 Replies
View Related
Feb 12, 2008
I have a parent package which executes 14 child packages in parallel; on average they take ~10 seconds each to complete when I execute the parent package using BIDS or DTEXEC.
However, if I run the parent package using SQL Server Management Studio (Integration Services > Stored Packages > MSDB > Right Click > Run Package), each package takes in excess of 10 minutes to run, getting progressively slower as each package starts.
Surely the package is executing in exactly the same way as with BIDS/DTEXEC, just through a different UI?
Can anyone explain why this happens?
Thanks in advance,
Leigh
View 9 Replies
View Related
Sep 27, 2006
Hey. I've got a problem and I think I know the answer, but I still want to confirm. We are using SQL 2000 and SSRS 2000. The problem is that we have custom reports which a customer can build and run, and I wonder how one can write stored procedures for that. The way it's written right now is a dynamic SELECT clause, then a dynamic FROM, a dynamic WHERE, and a dynamic GROUP BY, all appended together and run via an EXECUTE command. I know it's dynamic SQL, and execution plans and so on will hurt me, but some of these reports take forever. Is there anything that can be done to speed these reports up? And if the SELECT is dynamic and the WHERE is dynamic, does it even make sense to use a stored procedure? Is it ever going to reuse the same execution plan? When I run DBCC MEMORYSTATUS, the procedure cache takes up most of the memory. Does the use of dynamic SQL explain that?
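One thing that usually helps with this pattern (a sketch on a made-up table; only the technique, sp_executesql with parameters, is the point): keep building the dynamic text, but pass the user's values as parameters so reports with the same shape can reuse a cached plan instead of compiling a fresh one on every run.
DECLARE @sql nvarchar(4000), @from datetime, @to datetime
SET @from = '20070101'
SET @to   = '20080101'
-- The column list / grouping can still vary per report; only the values are parameterised:
SET @sql = N'SELECT CustomerID, SUM(Total) AS Total FROM dbo.Orders '
         + N'WHERE OrderDate >= @fromDate AND OrderDate < @toDate GROUP BY CustomerID'
EXEC sp_executesql @sql,
     N'@fromDate datetime, @toDate datetime',
     @fromDate = @from, @toDate = @to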
Thank you for your time and effort in replying.
View 3 Replies
View Related
Oct 14, 2007
Dear friends, I have a problem with a simple select statement and I don't know why it is happening.
I have 2 tables, Fees and FeesDataRoles. Fees holds all the fees, and FeesDataRoles is a middle table between Fees and the Roles table, so each fee can have multiple Roles and a Role can have many Fees. Now I have a select statement:
Select *
From Fees Inner Join FeesDataRoles ON Fees.FeeID = FeesDataRoles.FeeID
Where (FeesDataRoles.DataRoleID = @DataRoleID) AND (FeesDataRoles.RecordStatus = 1) AND (FeesDataRoles.ValidFrom >= getdate()) AND (FeesDataRoles.ValidTo <= getdate() OR FeesDataRoles.ValidTo is null)
It shouldn't take that long to execute this procedure, but surprisingly, sometimes when I insert a value into the table and then execute this stored procedure, it does not show the data just added. Very strange! I ran the procedure 5 times after inserting an item, and nearly 1 out of 5 runs does not return the right result (it does not include the recently inserted rows). Anyone have any idea?
I used the Tuning Advisor; no suggestion. I changed the clustered index in FeesDataRoles from FeesDataRoleID (the primary key of the table) to DataRoleID to increase performance; it still happens sometimes. Is my Where clause so costly that it causes this problem?
Please help. I really appreciate your help. Regards, Mehdi
View 2 Replies
View Related
Feb 9, 2007
Hi,
I have created a package that has
2 Execute SQL tasks, one loop container, 2 Data Flow tasks, 1 Foreach Loop container, and 1 FTP task. The data flow tasks have 1 OLE DB source, 1 flat file source, 1 Row Count transformation, 1 Recordset destination and 1 OLE DB destination.
When I load the package into BIDS it takes 125 MB of memory and then everything is slow: the Properties panel slides in and out slowly, the objects in the package are not painted properly, and making changes and running takes a lot of time.
Am I doing anything wrong here? Why is it consuming so much memory?
Thanks in advance for your help.
Regards,
$wapnil
View 2 Replies
View Related
Sep 9, 2004
I have had a problem with Enterprise Manager connecting to SQL Server. At first this problem was experienced with one particular network user... whichever PC he logged onto, Enterprise Manager took ages to connect to SQL Server and every operation was painfully slow. Creating a new Windows NT logon fixed the problem.
I now have this same problem but only on my PC. It doesn't matter which Windows NT logon I use, using Enterprise Manager is painfully slow. I've tried creating a new Windows NT profile, checking the hard drive for errors, defragmenting the disk, reinstalling Enterprise Manager etc but nothing works.
What is strange is that connections from VB applications on my PC are fast. It is only Enterprise Manager that is slow.
I am using the latest service pack for SQL Server.
I thought it could be a problem with the Enterprise Manager registry values but don't want to start messing with them!
Has anyone else experienced this problem? Any advice would be great, as I can't find any help from Microsoft other than installing the latest service pack for SQL Server.
Thanks
View 4 Replies
View Related
Sep 6, 2006
I have a problem with bad performance importing data from a SQL Server 2000 database. I use an OLE DB data source in my SSIS package to connect to the SQL Server 2000 database. My 2005 server runs 64-bit, but I don't think this is an issue. With this configuration the import is VERY slow; we are talking about 40+ minutes to get 3.5 million rows with about 20 columns. When I create a test DTS package on the SQL Server 2000 server itself and run it, it's blazingly fast. Has anyone run into something similar?
View 2 Replies
View Related
Apr 18, 2008
Hi there,
I was wondering if someone can point out the error, or the thing I shouldn't be doing, in a stored procedure on SQL Server 2005. I want to switch from SQL Server 2000 to SQL Server 2005, which all seems to work just fine, but one stored procedure is causing me a headache.
I could pin the problem down to this query:
DECLARE @Package_ID bigint
DECLARE @Email varchar(80)
DECLARE @Customer_ID bigint
DECLARE @Payment_Type tinyint
DECLARE @Payment_Status tinyint
DECLARE @Booking_Type tinyint
SELECT @Package_ID = NULL
SELECT @Email = NULL
SELECT @Customer_ID = NULL
SELECT @Payment_Type = NULL
SELECT @Payment_Status = NULL
SELECT @Booking_Type = NULL
CREATE TABLE #TempTable(
PACKAGE_ID bigint,
PRIMARY KEY (PACKAGE_ID))
INSERT INTO
#TempTable
SELECT
PACKAGE.PACKAGE_ID
FROM
PACKAGE (nolock) LEFT JOIN BOOKING ON PACKAGE.PACKAGE_ID = BOOKING.PACKAGE_ID
LEFT JOIN CUSTOMER (nolock) ON PACKAGE.CUSTOMER_ID = CUSTOMER.CUSTOMER_ID
LEFT JOIN ADDRESS_LINK (nolock) ON ADDRESS_LINK.SOURCE_TYPE = 1 AND ADDRESS_LINK.SOURCE_ID = CUSTOMER.CUSTOMER_ID
LEFT JOIN ADDRESS (nolock) ON ADDRESS_LINK.ADDRESS_ID = ADDRESS.ADDRESS_ID
WHERE
PACKAGE.PACKAGE_ID = ISNULL(@Package_ID,PACKAGE.PACKAGE_ID)
AND PACKAGE.CUSTOMER_ID = ISNULL(@Customer_ID,PACKAGE.CUSTOMER_ID)
AND PACKAGE.PAYMENT_TYPE = ISNULL(@Payment_Type,PACKAGE.PAYMENT_TYPE)
AND PACKAGE.PAYMENT_STATUS = ISNULL(@Payment_Status,PACKAGE.PAYMENT_STATUS)
AND BOOKING.BOOKING_TYPE = ISNULL(@Booking_Type,BOOKING.BOOKING_TYPE)
-- If the line below is included the request takes about 90 seconds, whereas it takes 1 second if it is commented out
--AND ADDRESS.EMAIl LIKE '%' + ISNULL(@Email, ADDRESS.EMAIL) + '%'
GROUP BY
PACKAGE.PACKAGE_ID
DROP TABLE #TempTable
The request performs quite well on SQL Server 2000, but on SQL Server 2005 it takes much longer. I have already installed SP2 x64; I'm running SQL Server 2005 in an x64 environment.
As I stated in the comment in the query, it takes 90 seconds to finish with the line included, but only 1 second if I exclude the line.
I think there must be something wrong with the joins, or something else which has maybe changed in SQL Server 2005. All the joined tables have a primary key.
Maybe you folks can spot the error / mistake / wrong type of doing things easily.
I would appreciate any help you can offer me to solve this problem.
On the web I saw that there is a Cumulative Update 4 for the SP2 which fixes the following:
942659 (http://support.microsoft.com/kb/942659/)
FIX: The query performance is slower when you run the query in SQL Server 2005 than when you run the query in SQL Server 2000
Anyhow, I think the problem is something else; I haven't tried the cumulative update yet, as I think it is something different, something more general about why this query takes ages to process.
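One more idea, sketched on a throw-away table (the names are made up; whether this is the actual cause is an assumption): with @Email left NULL, the commented-out line makes every row evaluate EMAIL LIKE '%' + EMAIL + '%'. Rewriting the catch-all so the LIKE is skipped when the parameter is NULL, and asking for a statement-level recompile so the optimizer plans for the actual values, is a common way to tame this pattern on 2005:
CREATE TABLE #Demo (Id int PRIMARY KEY, Email varchar(80))
DECLARE @Email varchar(80)
SELECT @Email = NULL
-- Rewritten predicate: the LIKE is only applied when a value was actually supplied
SELECT Id
FROM #Demo
WHERE (@Email IS NULL OR Email LIKE '%' + @Email + '%')
OPTION (RECOMPILE)
DROP TABLE #Demo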
Thanks again for any help
Best regards,
Pascal
View 9 Replies
View Related
Feb 10, 2006
Please kindly help me with creating a database.
My problem is this:
First, we create the database through Enterprise Manager.
The second step is that I want to create tables.
When I create any table in the database and afterwards execute a query / show the table, this message is displayed:
"The query cannot be executed because some files are either missing or not registered. Run setup again to make sure the required files are registered."
How do I handle this problem? What can I do?
View 3 Replies
View Related
Sep 19, 2007
Hi,
When I execute a DTS package to copy one database from one server to another, I get the following error:
[Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB 'SQL OLEDB' reported an error.Authentication Failed.
[Microsoft][ODBC SQL Server Driver][SQL Server]OLE/DB provider returned message: Invalid authorization specification]
[Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB error trace [OLE/DB Provider 'SQLOLEDB' IDB Initialize: Initiliaze returned 0x80040e4d: Authentication failed.].
Please help me solve it.
View 1 Replies
View Related
May 8, 2006
An SSIS package to transfer data from a DB instance on SQL Server 2005 to SQL Server 2000 is extremely slow. The package uses an OLE DB Source to OLE DB Destination data flow, which basically transfers one table from SQL Server 2005 to SQL Server 2000. The job takes 5 minutes to transfer about 400 rows at night, when there is very little activity on the server. During the day the job almost always times out.
On the SQL Server 2000 instance the job ran in minutes as the old 2000 (DTS) package.
Is there an alternative to this? The Transfer Objects task does not work, as there is apparently a defect according to Microsoft. Please let me know if there is any other option, other than using an Execute DTS 2000 Package task or an ActiveX script to read records from the source and insert them into the destination, which I am not certain how long it might take or how viable it would be.
Any inputs will be much appreciated.
Thanks,
MShah
View 5 Replies
View Related
Mar 13, 2007
I am using two almost identical laptops, one with XP and one with Vista; the only differences are that the XP laptop has 1 GB of RAM and runs Office XP, while the Vista laptop has 2 GB of RAM and runs Office 2007.
I have a MS Access database that has linked tables to a SQL Server 2000 database. The performance of the Access database on Vista is 5-10 times slower on the Vista machine. Just flipping through records or opening forms can take 5 - 15 seconds on the Vista machine while the XP machine takes 1 sec or less.
What gives? I looked at the CPU performance and the network performance while the Access database was busy flipping through records, the network traffic was < 2% and the CPU would spike to 40% on one of the CPUs (dual core) but would remain under 5% most of the time.
I also previously had Office XP installed on the Vista machine and it had the same performance issue, so I bought and installed Office 2007 on the Vista machine, and it did not solve the problem.
It seems that Vista is doing something that is slowing down Access with linked tables. Is this a issue between Vista and using an ODBC connection to SQL Server?
Thanks in advance for any help on this
View 1 Replies
View Related
Nov 30, 2007
I have the following problem. I have written a stored procedure in SQL Server 2000 as follows:
CREATE PROCEDURE dbo.pa_rellena
@pFechaInicio datetime
AS
declare @pFechaFin datetime
declare @auxcod_cen char(10)
declare @importeEfectivo decimal(17,2)
declare @importeTarjetas1 decimal(17,2)
declare @importeTarjetas2 decimal(17,2)
declare @importeVales decimal(17,2)
declare @importeTalones decimal(17,2)
declare @importeGastos decimal(17,2)
select @pFechaFin=@pFechaInicio+1
--Drop the temporary tables if they were created earlier and have not been dropped
if object_id('tmpCentros') is not null
drop table tmpCentros
if object_id('tmpCentros2') is not null
drop table tmpCentros2
if object_id('tmpMaxCajas') is not null
drop table tmpMaxCajas
if object_id('tmpCajasCentro') is not null
drop table tmpCajasCentro
if object_id('tmpVales') is not null
drop table tmpVales
if object_id('tmpDiarioEfectivo') is not null
drop table tmpDiarioEfectivo
if object_id('tmpDiarioTalones') is not null
drop table tmpDiarioTalones
if object_id('tmpDiarioTarjetas') is not null
drop table tmpDiarioTarjetas
if object_id('tmpDiarioSegundaForma') is not null
drop table tmpDiarioSegundaForma
if object_id('tmpDiarioGastosTarjetas') is not null
drop table tmpDiarioGastosTarjetas
if object_id('temp1') is not null
drop table temp1
--Select all the Salvador Bachiller centres
select * into tmpCentros2
from centros
where centros.tienda=1
order by cod_cen
--Select the maximum till number for each centre
select cod_cen, max(cod_caja) as cajas into tmpMaxCajas
from cierrecaja
where fecha>=@pFechaInicio and fecha<@pFechaFin
group by cod_cen
order by cod_cen
--Join the centres with their maximum till number
select c.cod_cen, c.Centro, c.Direccion, c.localidad, c.provincia, c.cpostal, c.telefono, m.cajas, operaciones, cajas_tot, tienda, franquicia into tmpCentros
from tmpCentros2 as c left outer join tmpMaxCajas as m on c.cod_cen=m.cod_cen
--Tills per centre
select distinct cod_cen as cod_cen, cod_caja as cod_caja into tmpCajasCentro
from cierrecaja
where fecha>=@pFechaInicio and fecha<@pFechaFin
--Vouchers for each centre
select cod_cen,sum(importe) as imp1 into tmpVales
from vales where
fecha>=@pFechaInicio and fecha<@pFechaFin
group by cod_cen
--Cash for each centre
select cod_cen,'01' as vendedor,'EFECTIVO' as descripcion, (sum(diario.TotEuro)-Sum(Diario.Imppa2)) as importe1,0 as exp1, (sum(Diario.TotEuro)-sum(Diario.imppa2)) as importe2 into tmpDiarioEfectivo
from diario
where fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and diario.cod_pago='01'
group by cod_cen
--Cheques per centre
select centros.cod_cen,'02' as vendedor,'TALONES' as descripcion, sum(diario.TotEuro) as importe1,0 as exp1, sum(Diario.TotEuro) as importe2 into tmpDiarioTalones
from centros inner join diario on centros.cod_cen=diario.cod_cen
where fecha>=@pFechaInicio and fecha<@pFechaFin and diario.cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and diario.cod_pago='02'
group by centros.cod_cen
--Cards per centre
select cod_cen,'03' as vendedor,'TARJETAS' as descripcion, sum(diario.TotEuro) as importe1,0 as exp1, sum(Diario.TotEuro*(FPago.Descuento/100)) as importe2, sum(Diario.TotEuro) - sum(Diario.TotEuro*(FPago.Descuento/100)) as importe3 into tmpDiarioTarjetas
from FPago left join Diario on fpago.Cod_pago=Diario.cod_pago
where fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0
group by cod_cen
--Second form of payment
select cod_cen,'03' as vendedor,'TARJETAS' as descripcion,sum(diario.imppa2) as importe1 into tmpDiarioSegundaForma
from fpago left join Diario on Fpago.cod_pago=diario.cod_pa1
where fPago.cod_pago<>'99' and fecha>=@pfechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0
group by cod_cen
--Payment card commissions
select cod_cen,'10' as vendedor, 'GASTOS (-)' as descripcion, sum(Diario.imppa2*(fPago.Descuento/100)) as importe2 into tmpDiarioGastosTarjetas
from Fpago left join Diario on FPago.cod_pago= Diario.cod_pa1
where fPago.cod_pago<>'99' and fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0
group by cod_cen
/*
--Net sales per centre
declare cursortemporal cursor for select cod_cen from TmpCentros2
open cursortemporal
delete detallecaja_aux
fetch next from cursortemporal into @auxcod_cen
while @@fetch_status=0
Begin
select @importeVales=imp1 from tmpVales where cod_cen=@auxcod_Cen
select @importeEfectivo=importe2 from tmpDiarioEfectivo where cod_cen=@auxcod_Cen
select @importeTalones=importe2 from tmpDiarioTalones where cod_cen=@auxcod_cen
select @importeTarjetas1=importe3 from tmpDiarioTarjetas where cod_cen=@auxcod_cen
select @importeTarjetas2=importe1 from tmpDiarioSegundaForma where cod_cen=@auxcod_cen
select @importeGastos=importe2 from tmpDiarioGastosTarjetas where cod_cen=@auxcod_cen
select @importeVales=isnull(@importeVales,0)
select @importeEfectivo=isnull(@importeEfectivo,0)
select @importeTalones=isnull(@importeTalones,0)
select @importeTarjetas1=isnull(@importeTarjetas1,0)
select @importeTarjetas2=isnull(@importeTarjetas2,0)
select @importeGastos=isnull(@importeGastos,0)
print @auxcod_cen
print @importeVales
print @importeEfectivo
print @importeTalones
print @importeTarjetas1
print @importeTarjetas2
print @importeGastos
insert into detallecaja_aux (cod_cen,importe1)
values(@auxcod_cen, @importeVales+@importeEfectivo+@ImporteTalones+@ImporteTarjetas1+@importeTarjetas2-@importeGastos)
fetch next from cursortemporal into @auxcod_cen
select @importeVales=0
select @importeEfectivo=0
select @importeTalones=0
select @importeTarjetas1=0
select @importeTarjetas2=0
select @importeGastos=0
end
close cursortemporal
*/
select * from detallecaja_aux
GO
When I try to run it from Visual Basic it slows down the SQL Server.
What can I do?
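One change worth trying, sketched for just one of the work tables (that this is the cause is an assumption): the procedure builds permanent tmp* tables with SELECT INTO, so concurrent calls from the Visual Basic clients collide on the same tables and block each other. Session-local temporary tables avoid that:
-- Sketch: inside the procedure, replace the permanent table with a #temp table
-- (the variables are declared here only so this fragment stands alone):
declare @pFechaInicio datetime, @pFechaFin datetime
set @pFechaInicio = '20071219'
set @pFechaFin = @pFechaInicio + 1
if object_id('tempdb..#tmpMaxCajas') is not null
    drop table #tmpMaxCajas
select cod_cen, max(cod_caja) as cajas
into #tmpMaxCajas
from cierrecaja
where fecha >= @pFechaInicio and fecha < @pFechaFin
group by cod_cen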
View 2 Replies
View Related
Mar 18, 2008
Hi
I have a database, and when I run a query on it the query takes 10 minutes to complete. I am running the following query:
SELECT t103.cs_flag, t103.pr_flag, SUM (t103.amount), COUNT (t103.record_id)
FROM br_data t103
WHERE t103.acct_id = 12 AND (t103.state = 3 OR t103.state = 7)
GROUP BY t103.cs_flag, t103.pr_flag
The br_data table doesn't seem to be using its indexes, and it has around a million records. Now when I export the database and import it onto another SQL Server and then run the same query as above, it only takes 1 or 2 seconds.
On the server that we are having problems with I have tried to rebuild the indexes using DBCC DBREINDEX (br_data,' ',0), but this hasn't helped. I have also tried backing up the database, deleting the database, then restoring it; this also hasn't helped. I have no idea why the query runs slowly on the original box but quickly when I transfer it to another server.
Both servers are running Windows 2003 with SQL 2000 SP4. There are no resource problems such as CPU / memory. Any ideas?
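A sketch of two more things to try on the problem server (the index name is made up, and stale statistics are only a guess):
-- Rebuild the statistics the optimizer uses to choose the plan:
UPDATE STATISTICS br_data WITH FULLSCAN
-- A composite index matching the WHERE clause (acct_id plus state) gives the
-- optimizer a cheap way in even if the existing indexes are being ignored:
CREATE NONCLUSTERED INDEX IX_br_data_acct_state ON br_data (acct_id, state)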
Thanks
View 1 Replies
View Related
Nov 8, 2006
Hello,
I have created an RPC from my SQL Server which queries a database on a linked server (remote server).
The query result is very, very slow (it is a mid-size query, with many JOINs on tables with many entries). Let's say it takes about two minutes to get about 3000 results.
The query uses five tables in the database of the linked server, runs a few (let's say about 5-8) JOIN clauses and selects the entries. Except for two tables (out of 8), each table has about 1000-2000 entries; the two larger ones have about 40,000 entries.
Is this normal?!
Is there anyway i can optimize my query?
I also tested my query asking for only 100 results as opposed to all 3000. There was only a 2-3 second difference in getting the results back, which indicates that it is not the remote connection but the query itself that is slow.
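One sketch worth trying (the server, database and table names below are invented): push the whole join to the remote server with OPENQUERY, so only the ~3000 result rows cross the link instead of whole tables being pulled over for a local join.
SELECT *
FROM OPENQUERY(REMOTESRV,
    'SELECT o.OrderID, c.Name
     FROM Sales.dbo.Orders o
     JOIN Sales.dbo.Customers c ON c.CustomerID = o.CustomerID
     WHERE o.OrderDate >= ''20061001''')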
any help would be greatly appreciated!!
View 1 Replies
View Related
Feb 15, 2006
I have a very unusual situation: we converted a client from DB2 7.2 to MS SQL Server 2000, SP3. There is one report that runs very quickly when run on the database server, but takes a long time to complete when run from a client system. This query is run from within the application and not from within Query Analyzer.
Has anyone else here ever encountered this issue? What did it turn out to be? I am leaning away from it being a network issue.
Thanks in advance.
View 2 Replies
View Related
Oct 17, 2006
Hello,
I have 2 servers (say MAINSRV and SECSRV) running SQL 2000 Standard SP3 on Windows 2000 Advanced within an NT (!) domain, and each server is linked to the other.
My problem is that if I run a query returning few dozens of rows like:
SELECT * FROM MAINSRV.DbName.dbo.TblName TBLA
WHERE Fieldx = 'anyval'
from a client connected to the SECSRV server, it takes something like 35 minutes to complete, while the same query completes in no time when run on clients connected to MAINSRV.
Even the simplest SELECT Count(*) FROM... takes more than one minute from SECSRV while completing in a fraction of second from MAINSRV.
I tried to change the linked server security options (SQL/Windows), but the remote query remains slow.
There are no locks active on the table, both the servers have almost no load (CPU less than 10%, when tested) and the query returns just a few KBytes, so communication overhead will not be the problem.
Any suggestions will be very appreciated, thank you!!!
View 2 Replies
View Related
Jun 14, 2006
Hello, can someone explain this to me and give me some advice? I have sqlserver1 and sqlserver2, and I have a linked server set up [ipaddress] on both servers. When I pass a variable to the stored proc or the query, it takes up to 20 seconds to return 1 row, but if I replace my variables with literals, like email='sqlboy@coxnet' and firstname='ted' and lastname='clien', it returns the one row I'm looking for in about 2 seconds. If I pass email=@email, fname=@fname, lname=@lname I get the 20-second behaviour. The query is a stored proc being called from an ASP app, and I get the same results when I run it in Query Analyzer.
Here is the query.
DECLARE @email VARCHAR(75),@fname VARCHAR(50),@lname VARCHAR(100)
IF EXISTS (SELECT TOP 1 email
FROM [ipaddress].{database}.dbo.{tablename}--server1
WHERE email = @email )
BEGIN
set nocount on
SELECT TOP 1 email,
FIRSTNAME,
MIDDLENAME,
LASTNAME,
TITLE,
COMPANYNAME,
ADDRESS1,
ADDRESS2,
CITY,
STATE,
ZIP,
COUNTRY
FROM [ipaddress].{database}.dbo.{tablename} --server2
WHERE email in(SELECT EMAIL
FROM [ipaddress].{database}.dbo.{tablename} where email=@email--server1)
AND RecordStatusID ='1' and FirstName=@fname and LastName=@lname
END
ELSE
IF EXISTS (SELECT top 1 email,
FIRSTNAME,
MIDDLEINITIAL,
LASTNAME,
TITLE,
COMPANYNAME,
ADDRESS1,
ADDRESS2,
CITY,
STATE,
ZIP,
COUNTRY
FROM {databases}.dbo.attendees--server1
WHERE email =@email and FirstName=@fname and LastName=@lname)
BEGIN
SELECT TOP 1 email,
firstname,
MIDDLEINITIAL AS MIDDLENAME,
lastname,
title,
companyname,
address1,
address2,
city,
state,
zip,
country
FROM {database}.dbo.attendees --server1
WHERE email in (SELECT DISTINCT email
FROM server1.dbo.ebooksrequests--server1
where email=@email) and email=@email and FirstName=@fname and LastName=@lname
SET NOCOUNT OFF
END
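A sketch of one workaround (keeping the same placeholder names as in the query above, so this is not runnable as-is): when the filter is a variable, SQL Server often pulls the remote table across the link and filters locally, whereas a literal value lets the WHERE clause be evaluated on the remote server, which matches the fast behaviour you see with hard-coded values. Building the remote filter as a literal through dynamic SQL reproduces that:
DECLARE @email varchar(75), @sql nvarchar(4000)
SET @email = 'sqlboy@coxnet'   -- value taken from the example above
SET @sql = N'SELECT TOP 1 email FROM [ipaddress].{database}.dbo.{tablename} '
         + N'WHERE email = ''' + @email + N''''
EXEC sp_executesql @sql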
View 4 Replies
View Related