Application Slow - Running Double Take 4.4 On SQL 2000
Jun 6, 2005
Hi,
We are running SQL 2000 Advanced Server.
It is a 4-processor machine with hyperthreading and 4 GB of memory, and it is a high-volume transactional OLTP server. We are also running transactional replication on that server.
We recently bought Double Take and implemented it on the server with the File Difference with block checksum option, for disaster recovery purposes.
The queue and the log file folder are on a local drive that nothing except Double Take uses.
We are replicating from source to target over the WAN (DS3). We are replicating approximately 200 GB of data, but only the differences.
CPU usage is normal and memory utilization is normal, but the network is a major problem: the applications connecting to the server time out and run very slowly.
We have set everything up just as Tech Support recommended.
I would appreciate it if someone could give us some recommendations for running Double Take without these problems.
Recently our database was migrated from SQL 2000 to SQL 2005 on a new server (machine) running Windows 2003 (previously Windows 2000). If the database is kept on the same machine but under a named SQL 2005 instance, the application (WebSphere 5.1) behaves normally, whereas if I point the application at the new server, some of the queries (but not all) run slowly.
This change will have to be implemented in production very soon. Any advice would be of great benefit.
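A common first step in this situation, offered here only as a hedged suggestion, is to refresh the statistics and usage counters carried over from the SQL 2000 server, since the 2005 optimizer can otherwise pick poor plans for a handful of queries. A minimal sketch (MyMigratedDb is a placeholder name):

USE MyMigratedDb                       -- placeholder database name
GO
DBCC UPDATEUSAGE (0)                   -- correct row and page counts inherited from SQL 2000
GO
EXEC sp_updatestats                    -- refresh every table's statistics for the new optimizer
GO
-- Once testing passes, the compatibility level can also be moved to native SQL 2005 behaviour
EXEC sp_dbcmptlevel 'MyMigratedDb', 90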
I have the following problem. I have written a stored procedure in SQL Server 2000, as follows:

CREATE PROCEDURE dbo.pa_rellena @pFechaInicio datetime
AS
declare @pFechaFin datetime
declare @auxcod_cen char(10)
-- Per-centre amounts used inside the cursor loop (these declarations were missing in the original; money is an assumed type)
declare @importeVales money, @importeEfectivo money, @importeTalones money, @importeTarjetas1 money, @importeTarjetas2 money, @importeGastos money
-- @pFechaFin was never assigned in the original, so every date filter below returned nothing; one day after the start date is an assumption
set @pFechaFin = dateadd(day, 1, @pFechaInicio)

-- Drop the working tables if they were created on a previous run and not dropped
if object_id('tmpCentros') is not null drop table tmpCentros
if object_id('tmpCentros2') is not null drop table tmpCentros2
if object_id('tmpMaxCajas') is not null drop table tmpMaxCajas
if object_id('tmpCajasCentro') is not null drop table tmpCajasCentro
if object_id('tmpVales') is not null drop table tmpVales
if object_id('tmpDiarioEfectivo') is not null drop table tmpDiarioEfectivo
if object_id('tmpDiarioTalones') is not null drop table tmpDiarioTalones
if object_id('tmpDiarioTarjetas') is not null drop table tmpDiarioTarjetas
if object_id('tmpDiarioSegundaForma') is not null drop table tmpDiarioSegundaForma
if object_id('tmpDiarioGastosTarjetas') is not null drop table tmpDiarioGastosTarjetas
if object_id('temp1') is not null drop table temp1

-- Select all the Salvador Bachiller centres
select * into tmpCentros2 from centros where centros.tienda=1 order by cod_cen

-- Highest till number per centre
select cod_cen, max(cod_caja) as cajas into tmpMaxCajas from cierrecaja where fecha>=@pFechaInicio and fecha<@pFechaFin group by cod_cen order by cod_cen

-- Join the centres to their highest till number
select c.cod_cen, c.Centro, c.Direccion, c.localidad, c.provincia, c.cpostal, c.telefono, m.cajas, operaciones, cajas_tot, tienda, franquicia into tmpCentros from tmpCentros2 as c left outer join tmpMaxCajas as m on c.cod_cen=m.cod_cen

-- Tills per centre
select distinct cod_cen as cod_cen, cod_caja as cod_caja into tmpCajasCentro from cierrecaja where fecha>=@pFechaInicio and fecha<@pFechaFin

-- Vouchers for each centre
select cod_cen, sum(importe) as imp1 into tmpVales from vales where fecha>=@pFechaInicio and fecha<@pFechaFin group by cod_cen

-- Cash for each centre
select cod_cen, '01' as vendedor, 'EFECTIVO' as descripcion, (sum(diario.TotEuro)-sum(Diario.Imppa2)) as importe1, 0 as exp1, (sum(Diario.TotEuro)-sum(Diario.imppa2)) as importe2 into tmpDiarioEfectivo from diario where fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and diario.cod_pago='01' group by cod_cen

-- Cheques per centre
select centros.cod_cen, '02' as vendedor, 'TALONES' as descripcion, sum(diario.TotEuro) as importe1, 0 as exp1, sum(Diario.TotEuro) as importe2 into tmpDiarioTalones from centros inner join diario on centros.cod_cen=diario.cod_cen where fecha>=@pFechaInicio and fecha<@pFechaFin and diario.cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and diario.cod_pago='02' group by centros.cod_cen

-- Cards per centre
select cod_cen, '03' as vendedor, 'TARJETAS' as descripcion, sum(diario.TotEuro) as importe1, 0 as exp1, sum(Diario.TotEuro*(FPago.Descuento/100)) as importe2, sum(Diario.TotEuro) - sum(Diario.TotEuro*(FPago.Descuento/100)) as importe3 into tmpDiarioTarjetas from FPago left join Diario on fpago.Cod_pago=Diario.cod_pago where fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0 group by cod_cen

-- Second form of payment
select cod_cen, '03' as vendedor, 'TARJETAS' as descripcion, sum(diario.imppa2) as importe1 into tmpDiarioSegundaForma from fpago left join Diario on Fpago.cod_pago=diario.cod_pa1 where fPago.cod_pago<>'99' and fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0 group by cod_cen

-- Payment card commissions
select cod_cen, '10' as vendedor, 'GASTOS (-)' as descripcion, sum(Diario.imppa2*(fPago.Descuento/100)) as importe2 into tmpDiarioGastosTarjetas from Fpago left join Diario on FPago.cod_pago=Diario.cod_pa1 where fPago.cod_pago<>'99' and fecha>=@pFechaInicio and fecha<@pFechaFin and cod_cen in (select cod_cen from tmpCentros) and cod_caja in (select cod_caja from tmpCajasCentro) and Fpago.Descuento<>0 group by cod_cen

-- Net sales per centre (the original opened a /* comment here that was never closed, and the loop was missing its END, CLOSE and DEALLOCATE)
declare cursortemporal cursor for select cod_cen from TmpCentros2
open cursortemporal
delete detallecaja_aux
fetch next from cursortemporal into @auxcod_cen
while @@fetch_status=0
begin
    select @importeVales=imp1 from tmpVales where cod_cen=@auxcod_cen
    select @importeEfectivo=importe2 from tmpDiarioEfectivo where cod_cen=@auxcod_cen
    select @importeTalones=importe2 from tmpDiarioTalones where cod_cen=@auxcod_cen
    select @importeTarjetas1=importe3 from tmpDiarioTarjetas where cod_cen=@auxcod_cen
    select @importeTarjetas2=importe1 from tmpDiarioSegundaForma where cod_cen=@auxcod_cen
    select @importeGastos=importe2 from tmpDiarioGastosTarjetas where cod_cen=@auxcod_cen
    insert into detallecaja_aux (cod_cen, importe1) values(@auxcod_cen, @importeVales+@importeEfectivo+@importeTalones+@importeTarjetas1+@importeTarjetas2-@importeGastos)
    fetch next from cursortemporal into @auxcod_cen
end
close cursortemporal
deallocate cursortemporal
GO
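Once it compiles, the procedure only needs the start date; a call would look like this (the date is only an example value):

exec dbo.pa_rellena '20050601'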
Dear all, I finally completed my project. Thanks to everyone who helped me do it. Now I have the biggest problem: in my application the DataGrid is filled with data from a SQL Server table which has a large number of records. When I run my queries in SQL Server Management Studio they run very fast, but filling the DataGrid takes a lot of time. How can I reduce this time and increase the performance of my application? Thanks, Janaka
I am experiencing issues with database files that have been moved using Double Take. When I try to bring up the database, it behaves as though the DBs are corrupted. Bottom line: it is not working. Can someone who has this working, or who has experienced similar issues, shed some light? Thanks in advance.
We are using MSSQL Server 2000 on Windows 2000 Advanced Server, with PowerBuilder 7.0 as the client/server front-end tool.
Until a few days ago our application worked fine; now it gets slow or stops responding after a few (4-5) transactions. We don't know what it is. If anyone has had the same experience, anything you can share would be helpful for us.
We are having issues with a stored proc call in our application. When I run the proc through a query window in Management Studio, it comes back in 0.3 seconds. However, when the application runs the proc (with the same parameters), it takes 30 seconds to complete. When I run a trace of the proc call through Management Studio it shows about 4,000 reads, but through the app it shows 4 million reads. What is happening? (The app uses Hibernate 3.2 / Java 1.5 / JDBC.)
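A hedged observation on the pattern above: "fast in Management Studio, slow from JDBC with the same parameters" very often comes down to the two connections using different SET options (most commonly ARITHABORT), so they get separate cached plans and the application's copy is a badly parameter-sniffed one. The sketch below uses a made-up procedure name and parameter; it only illustrates how to reproduce and then flush the suspect plan:

-- Management Studio defaults to SET ARITHABORT ON; JDBC and ADO.NET connections usually run with it OFF
SET ARITHABORT OFF
EXEC dbo.usp_GetData @CustomerId = 42      -- hypothetical procedure and parameter, stand-ins for the real call

-- If the reads now match the application's numbers, flushing the plan forces the next caller
-- to compile against its actual parameter values
EXEC sp_recompile 'dbo.usp_GetData'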
I have a DTS job set up to transfer 550,000 records from a DBF file into SQL Server. If I just let it run, there is a 9-10 minute delay before it starts. If I try to schedule it as a job, it fails completely. I looked up ways to get it to execute more quickly, mainly going to the Advanced tab of the transform arrow and setting the inserts to 1,000 rows at a time, locking the table, and turning constraints off. Any advice on how to speed it up, or on why the scheduled job is failing?
Last week we upgraded from SQL 6.5 to SQL 2000. Everything went fine and we recompiled all the procedures. But when I try to run one procedure, it now runs for a long time - more than 10 hours (in SQL 6.5 it ran in 50 minutes). Do I need to set any procedure cache?
Also, the server's CPU usage is constantly high - more than 80%. It was fine until last week.
I've got a football (soccer for the Yanks!) predictions league website that is driven by an Access database. It basically calculates points scored for a user getting certain predictions correct. This is the URL:
http://www.pool-predictions.co.uk/home/index.asp
There are two sections of the site, however, that have almost ground to a halt now that more users have registered through the season. The players section and the league table section have loaded progressively more slowly throughout the year and are now taking almost 2 minutes to load.
All the calculations are performed in the Access database I've written, and there are Access SQL queries to get the data out.
My question is: how can I speed the bloody thing up?! Someone has also suggested that I use stored procedures and SQL Server to speed things up. I've never used SQL Server before, so I am a bit scared about using it (I'm only a hobbyist), and I don't even know what an SP is or does. How easy will it be to upgrade the whole thing to SQL Server, and will it be worth the hassle, bearing in mind I expect my user base to keep growing? Do SPs help speed things up significantly? I would appreciate some advice!
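For context on the stored procedure question: an SP is just a named, precompiled batch of SQL that lives inside the database, can take parameters, and is called by name from the page instead of sending the whole query text each time, so the heavy calculation stays on the server. A minimal sketch of what one looks like (the Predictions table and its columns are invented for illustration):

CREATE PROCEDURE dbo.usp_GetUserPoints
    @UserId int
AS
    -- Total points scored by one user, worked out in the database rather than in the ASP page
    SELECT SUM(PointsScored) AS TotalPoints
    FROM dbo.Predictions            -- hypothetical table
    WHERE UserId = @UserId
GO

-- Called from ASP through ADO, or tested directly in a query window:
EXEC dbo.usp_GetUserPoints @UserId = 1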
Our server is running slowly. There are no locks, and the server has been rebooted, but the problem is still there. This has been going on for some time now. I intend to restart the server again. Does anybody have a quick solution? Please help. Thanks for your assistance!
The first time I start my SQL Server it runs fast. After 10 to 15 days, SQL Server performance becomes very slow. After I restart the SQL service, it goes back to normal.
Hi list, I'm a long time lurker on this list and really enjoy the discussions, although I rarely get a chance to participate.
Here is my situation: we are importing chunks of data (500 records at a time) from a C++ interface. The records have to be transformed before being inserted into the target table, which I am doing with a stored proc that is working fine. The records are in memory in C++, and the programmer is looping through them, building inserts into a temp table through ADO (which my proc picks up). The server business object is using the Connection.Execute method, which inserts one record at a time. That part of the process is taking over 15 seconds for 500 records, which is the bulk of the total time.
My question is: using ADO, is there a better way to insert these records into the temp table? I see mention of a Recordset interface, but my programmers are new to ADO, and since I am the DBA and have never used ADO, I am not sure what to tell them.
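One low-effort option, offered as a sketch rather than a recommendation: since each Connection.Execute is a separate round trip, the C++ side can concatenate several rows into a single batch per Execute. On SQL 2000 that can be a multi-row INSERT built with UNION ALL; the staging table and its columns below are made up for illustration:

-- Illustrative staging table (the real temp table already exists in this setup)
CREATE TABLE #LoadBuffer (CustomerCode char(10), Amount money, LoadDate datetime)

-- One Execute carrying several rows instead of one round trip per row
INSERT INTO #LoadBuffer (CustomerCode, Amount, LoadDate)
SELECT 'C001', 10.50, '20050606' UNION ALL
SELECT 'C002', 99.00, '20050606' UNION ALL
SELECT 'C003', 7.25, '20050606'

Wrapping each 500-row chunk in a single BEGIN TRAN / COMMIT also avoids a log flush per row, which on its own often accounts for much of that kind of delay.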
I have linked three SQL Servers together, and have written a stored proc with a cursor that joins three tables from one of the linked servers. When I pull the SQL out of the cursor definition and run it in a query window it runs fine, but when I run the stored proc that simply steps through the same select result set, it is too slow for words. It also throws a warning about serializable isolation levels. Is there any way I can fix this?
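A workaround that often helps here, sketched with made-up linked-server, table and column names: materialise the remote join into a local temp table once, then declare the cursor over the local copy, so the remote query is not re-negotiated (with its distributed isolation-level warnings) while the cursor is processed.

-- Pull the remote join across in one pass (LINKEDSRV and the tables are hypothetical)
SELECT o.OrderId, d.Amount
INTO #RemoteRows
FROM LINKEDSRV.SalesDb.dbo.Orders AS o
JOIN LINKEDSRV.SalesDb.dbo.OrderDetails AS d ON d.OrderId = o.OrderId

-- Then step through the local copy
DECLARE @OrderId int, @Amount money
DECLARE cur CURSOR LOCAL FAST_FORWARD FOR SELECT OrderId, Amount FROM #RemoteRows
OPEN cur
FETCH NEXT FROM cur INTO @OrderId, @Amount
WHILE @@FETCH_STATUS = 0
BEGIN
    -- existing per-row processing goes here
    FETCH NEXT FROM cur INTO @OrderId, @Amount
END
CLOSE cur
DEALLOCATE cur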
Our main production server has started running slowly. It is a dual Xeon box with plenty of RAM, so hardware is not the issue.
Basically, a service connects to the database and executes a few stored procs, and the only way I can get the system back up to speed is to recompile one of the SPs, but that is only a temporary fix.
Has anyone had a similar thing?
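When a manual recompile is the only thing that helps, the cached plan for that proc is usually going stale, either from out-of-date statistics or from parameter sniffing. A hedged sketch of the two usual follow-ups (the table and procedure names are placeholders):

-- Keep the optimizer's statistics current on the tables that proc hits
UPDATE STATISTICS dbo.Orders WITH FULLSCAN      -- hypothetical table name

-- What the manual fix is effectively doing: throw away the cached plan so the
-- next call compiles against its actual parameter values
EXEC sp_recompile 'dbo.usp_ProcessQueue'        -- hypothetical procedure name

If the pattern keeps coming back, defining that one procedure WITH RECOMPILE trades a compile on every call for a plan that always matches the current parameters.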
Can anyone give me help on performance tuning in SQL Server 2000?
I have a DTS package on SQL 2000 which transfers data from an AS400 to SQL 2000 using an ODBC Client Access 5.1 driver (the DSN was configured by a sysadmin on the AS400, so it is set up properly). When I execute the package manually (by right-clicking it and choosing "Execute Package"), it runs fine and returns the data in no time (approximately 30,000 rows in 15 seconds).
The problem starts when a job executes the same package. When I start the job, the DTS package runs very, very slowly - sometimes it takes hours to return a few rows, and it seems to be stuck.
The SQL Agent runs as an NT account with administrator rights and has full access to the AS400, so the problem is not the Agent.
By monitoring the AS400, I have noticed that the job/DTS retrieves the first fetch quickly and then sits in a waiting status.
I have tried everything and can't seem to get this problem fixed.
Does anyone know what the problem could be? I need help quickly. Thank you.
Hello all, I have two procedures that are run one after the other. When I run proc1 it runs for about 15 minutes. proc2 is dependent on proc1; when I run proc2 it runs for 45 minutes. If I run both procs together through .NET code it takes more than an hour. Can anyone tell me where the problem might be?
I'm developing a desktop application using CAB and DevExpress 6.3 controls. I use the ReportViewer control to render .rdlc reports in local mode. I load the data source from a DataSet and the template definition from a string, then call the Refresh method of ReportViewer.LocalReport; so far, all is OK. But once I call the RefreshReport method of the ReportViewer, the whole application slows down, even after I close the report viewer form. Operations like switching between menu items, opening a new form and resizing the main form become two or three times slower, and CPU usage stays at 100% while doing them. I have no clue how refreshing a report affects all that UI drawing. However, if I break in the debugger and step over the RefreshReport method, an empty report form shows and everything stays fine. And in the end, all the reports can be rendered and shown correctly.
It seems to have nothing to do with the templates or the data source: the templates are very simple, and even if I render the reports with empty data tables, it still slows down my application. I tried creating and running the form in a new thread, and tried upgrading to ReportViewer 9.0, but both seem useless.
Could anybody help me with this problem? Thanks in advance.
I am performing a series of calculations where accuracy is very important, so have a quick question about single vs double precision variables in SQL 2008.
I'm assuming that there is an easy way to cast a variable that is currently stored as a FLOAT as a DOUBLE prior to these calculations for reduced rounding errors, but I can't seem to find it.
I've tried CAST and CONVERT, but get errors when I try to convert to DOUBLE.
For example...
SELECT CAST(1.0/7.0 AS FLOAT)
SELECT CONVERT(FLOAT, 1.0/7.0)
both give the same 6 decimal place approximation, and the 6 decimals make me think this is single precision.
But I get errors if I try to change the word FLOAT to DOUBLE in either one of those commands...
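Worth noting here: in SQL Server, FLOAT with no length defaults to float(53), which is already double precision; the single-precision type is REAL, and the keyword SQL Server accepts is DOUBLE PRECISION (a synonym for float(53)), not DOUBLE, which is why those conversions error out. The six decimals in the example come from 1.0/7.0 being done in decimal arithmetic before the cast, not from any single-precision limit. A quick check:

-- 1.0/7.0 is decimal divided by decimal, so it is rounded to six decimal places before the cast happens
SELECT CAST(1.0/7.0 AS FLOAT)               -- 0.142857

-- Make one operand FLOAT first and the division itself runs in double precision
SELECT CAST(1.0 AS FLOAT) / 7.0             -- 0.142857142857143

-- DOUBLE PRECISION is the synonym SQL Server accepts in declarations; plain DOUBLE is not T-SQL
DECLARE @d DOUBLE PRECISION
SET @d = CAST(1.0 AS FLOAT) / 7.0
SELECT @d

-- REAL is the single-precision (32-bit) type, if reduced precision were ever wanted
SELECT CAST(CAST(1.0 AS FLOAT) / 7.0 AS REAL)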
Every night we connect to a remote server using a linked server and copy details from that database into a loading table, then load them into the 'real' table in our own environment. The remote database we load from has indexes/primary keys that match the 'real' table; the 'loading' table itself, however, does not have any indexes or primary keys. Both machines are SQL Server 2005.
We first truncate the loading table and run an insert/select from the remote server; we then truncate the 'real' table and load it from the 'loading' table.
The issue is that when we attempted to load the 'real' table from our loading table there was a duplicate row, and our process failed with a primary key violation.
I checked the source, which does have the same primary keys; it did not contain a duplicate row. I then checked the loading table, and that did contain a duplicate row.
My question is: in what circumstances could this happen?
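For reference, a sketch of the nightly pattern with a duplicate check added between the copy and the final load, so the offending keys are visible before the primary key violation occurs; the table, column and linked-server names are invented:

-- Nightly refresh of the loading table
TRUNCATE TABLE dbo.Loading_Customers

INSERT INTO dbo.Loading_Customers (CustomerId, Name, ModifiedDate)
SELECT CustomerId, Name, ModifiedDate
FROM REMOTESRV.SourceDb.dbo.Customers          -- hypothetical linked server and source table

-- Fail fast, and see the duplicated keys, before the real table is touched
SELECT CustomerId, COUNT(*) AS copies
FROM dbo.Loading_Customers
GROUP BY CustomerId
HAVING COUNT(*) > 1

If that last query ever returns rows, it at least narrows the problem to the cross-server copy (for example, the source changing while it is being read) rather than the load into the real table.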
I have SQL Server 6.5 SP5a running on Windows NT 4.0 SP6 with 4 GB of RAM and 4 processors.
Suddenly the SQL 6.5 jobs on the production server started running very, very slowly. A job that is supposed to run in 30 minutes is taking around 2 hours, although it completes successfully.
(I suspect the Norton AntiVirus automatic LiveUpdate may be the reason, but not the second vulnerability mentioned in the Microsoft bulletin last week.)
I checked SQL Server, ran Performance Monitor, and checked the page files, disk space, databases, memory and tempdb. Everything seems to be normal.
I rebooted the server and checked whether any other process was slowing it down, but no luck.
Please help me out with this issue, as this is a production server and the clients' CRM applications use this database server.
I have been tapped to help fix a reporting tool. We have a SQL Server database / Crystal Reports (10) setup. I haven't had a chance to look at the tables in the DB yet, but I was told that aggregate tables were used. In my past experience with Crystal Reports we used database views to feed the reports (in Oracle). I was thinking that I could somehow use views instead of tables, and then try to re-index the base tables and recompute statistics (if there is such a thing in SQL Server). I was also going to look into bottlenecks and locking (table locking as opposed to row or page locking for the lookup tables, to reduce overhead on the main tables), but I'm not sure it will make a difference since this is just a demo server with no major traffic hitting it yet.
The question is: does anyone have experience with Crystal Reports running slowly against SQL Server, and what should I look out for?
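On the "is there such a thing" point: SQL Server does have direct equivalents of Oracle's rebuild/analyze steps. A hedged sketch, with a made-up table name:

-- Rebuild the indexes on a base table and refresh its statistics
-- (SQL 2000 syntax; on SQL 2005 and later, ALTER INDEX ... REBUILD does the same job)
DBCC DBREINDEX ('dbo.FactSales')                -- hypothetical aggregate/base table
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN

-- Or refresh statistics across the whole database in one pass
EXEC sp_updatestats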
I'm having what you might call an optimisation issue - but I'm also not sure if my approach to this problem is right. I've spent the whole day trying various methods but none seem to be performing as admirably as I'd hoped.
Basically, here's the scenario:
1. Log files are being inserted into a SQL table via Log Parser 2.2.
2. I have a lookup table of IP address ranges to countries and other geographic data.
3. Once the log row has been inserted from Log Parser, I want to update the same log table with data from the lookup table.
The main problem seems to be the updating (the initial insert from Log Parser is lightning quick).
I initially wrote a trigger on AFTER INSERT on the log table that converted the actual user's IP address into IPNumber format using a simple algorithm, then pumped the IPNumber into a quick select query which retrieved the geolocation data. Then, in the same trigger, there was an update statement which basically said:
update dbo.Logs set [Country] = @Country where [IpNumber] = @IpNumber and [Country] = Null
Therein lies the problem. The update statement slows everything down to almost a standstill and also seems to obtain a lock on the table.
Critical factors:
1. The log table must remain readable.
2. The query must execute in seconds -- not over half an hour :)
I've experimented with various combinations of indexing, so far to no avail.
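Two things stand out in the trigger as described, offered as a sketch rather than a definitive fix. First, [Country] = Null never matches under the default ANSI_NULLS setting; it has to be [Country] IS NULL. Second, updating by @IpNumber touches the whole Logs table for every inserted row, whereas a set-based trigger can touch only the rows just inserted by joining the inserted pseudo-table to the range lookup. The IpCountryRanges table, its columns and the LogId key are assumed names:

CREATE TRIGGER trg_Logs_SetCountry ON dbo.Logs
AFTER INSERT
AS
    -- Update only the rows from this insert, matching each IP number to its range
    UPDATE l
    SET    l.Country = r.Country
    FROM   dbo.Logs AS l
    JOIN   inserted AS i ON i.LogId = l.LogId            -- assumes a LogId key on dbo.Logs
    JOIN   dbo.IpCountryRanges AS r                      -- hypothetical lookup table
           ON i.IpNumber BETWEEN r.StartIpNumber AND r.EndIpNumber
    WHERE  l.Country IS NULL                             -- IS NULL, not = Null
GO

With an index on IpCountryRanges(StartIpNumber, EndIpNumber), and one on Logs(IpNumber) if the original update form is kept, this kind of lookup normally stays inside the "seconds" requirement without holding a long table lock.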
I am having a problem: SQL Server is running very slowly, but only on some days. For example, my stored procedure usually runs in less than 2 minutes, but some days it takes 13 minutes. I don't understand the problem, and all the stored procedures are affected the same way. Sorry, I am not a DBA - basically a developer. We take a DB backup every morning, and indexes are already applied. The DB size is 10,461.06 MB, RAM is 4 GB, and CPU usage is less than 50%. This is a cinema database, so a lot of users are accessing it at the same time (web, IVR, cinema ticket counters, etc.). We use SQL reports, and because the stored procedures run slowly, we cannot view the reports. Please advise.
Please help me. If you need any more information, please ask.
Hi friends, after working in Visual Studio (on my report model project) for a few minutes, it becomes very slow - I mean clicking on entities and attributes takes ages to finish. I opened Task Manager and I see devenv.exe is using more than 800,000 K!
I am using SQL Server 2005 Standard Edition. Have you seen a similar problem? Thanks for your help.
We have a stored procedure which runs fine on SQL Server 2000 from Query Analyzer. However, when we try to execute the same stored procedure from ADO.NET in an executable, the execution hangs or takes extremely long. Does anyone have any ideas or suggestions about how this could happen and how to fix it? Thanks.
I support a website (ASP.NET 2.0) where recently users have been unable to insert data due to timeout issues. The functionality executes a query that inserts a single row into a SQL Server 2005 db table. I tried running this query from the backend, and it took 4:48 to insert a single row! Interestingly, after that initial agony any similar inserts I tried took no time whatsoever. I have checked the execution plan, but unfortunately don't really know what I'm looking for with inserts, as most of my experience with execution plans is with select statements. Any resources anyone could point me to for troubleshooting this would be much appreciated.
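That "agonisingly slow once, instant afterwards" pattern usually means the first insert was waiting on something one-off, most often a blocking lock or a data/log file autogrow. While a slow insert is running, a second query window on SQL 2005 can show what it is waiting on; this is only a diagnostic sketch:

-- Run in a second window while the slow INSERT is executing
SELECT session_id, status, wait_type, wait_time, last_wait_type, blocking_session_id
FROM   sys.dm_exec_requests
WHERE  session_id > 50            -- user sessions only

Autogrow events (and how long they took) are also recorded in the default trace, so an oversized growth increment on the data or log file is easy to confirm or rule out.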
I have a SELECT TOP query to return the top x records from a table which has the indexing service enabled on it, such as this:
SELECT TOP(15) * FROM [TableName] ORDER BY [ColumnName]
and also there are not that many records (at most 100 rows) kept in the table at the moment; however, it will grow later. The issue stems from the ORDER BY [ColumnName] part of the syntax, which changes the TOP selection order and makes the query run very slowly, as I have also confirmed in the SQL Server Query Analyzer. Anyhow, I need the ORDER BY clause so I can show the data based on different columns, either ascending or descending. There might be workarounds to achieve the same goal that I am not aware of. Any thoughts are appreciated.
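A hedged note on why the ORDER BY hurts and the usual workaround: without an index whose key matches the ORDER BY column, SQL Server has to sort the whole table before it can pick the top 15; with such an index it just reads the first 15 rows in order. Sticking with the placeholder names from the query above:

-- Lets TOP (15) ... ORDER BY ColumnName read 15 rows instead of sorting the table
CREATE INDEX IX_TableName_ColumnName ON dbo.TableName (ColumnName)

-- Descending requests can scan the same index backwards, or use a dedicated one
CREATE INDEX IX_TableName_ColumnName_Desc ON dbo.TableName (ColumnName DESC)

SELECT TOP (15) * FROM dbo.TableName ORDER BY ColumnName DESC

With only about 100 rows the sort itself should still be quick, so if the query is already slow at that size it is worth checking the execution plan for something else as well (for example, the cost of SELECT * pulling every column).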