SQL Server Data Transferring Very Slow

Apr 28, 2008

Hi,
I am using SQL 2000 for an HMS application database, and there are several databases on the server.
After a long period of time (about 6 months), the application slows down when accessing the data.
I use the full recovery model for the database.

I keep only the last 3 days of transaction data; every day I delete transaction data older than 4 days, so the transaction table holds only about 3,000 to 5,000 records.
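A minimal sketch of the daily cleanup described above; the table and column names are assumptions, not the poster's actual schema:

DELETE FROM dbo.HmsTransaction                        -- hypothetical transaction table
WHERE TransactionDate < DATEADD(day, -4, GETDATE())  -- keeps roughly the last 3 days of rows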
Please help me. Thanks.



Please help, it is very urgent.


View 4 Replies



Very Slow Data Access On SQL Server 2005?

Jul 5, 2006

Hi

Here is a brief description of my problem. We had our database on SQL Server 2000 and Windows 2000. That machine had 2 GB of RAM and dual Pentium 3 processors, and about 25-30 users were connected all the time. The size of the database is around 2 GB. Even on this setup the rate of data retrieval was good; we never had any issues.

We moved to SQL Server 2005 and Windows 2003. This machine has 2 Pentium Xeon 3.4 GHz processors and 2 sticks of Kingston 1024 MB 333 MHz DDR DIMM ECC CL2.5 dual-rank x4 (Intel) memory. The rate of data retrieval is awful; it is very slow. SQL Server uses about 1.7 to 1.9 GB of RAM all the time, page file usage is about 2.07 GB and virtual memory usage is about 1.7 GB.

I don't quite understand why it is so slow to get data. We use bespoke software, so nothing has changed there. The hardware specification of our server is far better than the recommended system requirements for SQL Server 2005. Am I missing something, or haven't I set up SQL Server properly? Any help would really be appreciated.

Mits
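One setting often checked after this kind of move is the instance's memory cap; a sketch only, with the value below chosen purely as an example for a 2 GB box, not a recommendation:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1536;   -- example cap, leaving room for the OS
RECONFIGURE;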

View 3 Replies View Related

Copying Table Data From SQL Server 2005 To SQL Server 2000 - Very Slow When Using OLEDB Source And Destination?

May 8, 2006

An SSIS package that transfers data from a DB instance on SQL Server 2005 to SQL Server 2000 is extremely slow. The package uses an OLEDB Source and an OLEDB Destination to transfer what is basically one table from SQL Server 2005 to SQL Server 2000. The job takes 5 minutes to transfer about 400 rows at night, when there is very little activity on the server. During the day the job almost always times out.

Between SQL Server 2000 instances, the old 2000 package ran in minutes.

Is there an alternative to this? The Transfer Objects task does not work, as there is apparently a defect according to Microsoft. Please let me know if there is any option other than using an Execute 2000 Package task, or an ActiveX Script to read records from one source and insert them into the destination - I am not certain how long that might take or how viable it would be.
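For what it's worth, one further option sometimes tried is a plain T-SQL push over a linked server instead of a data flow; a sketch only, assuming a linked server named LEGACY2000 pointing at the SQL Server 2000 instance, and hypothetical database, table and column names:

INSERT INTO LEGACY2000.TargetDb.dbo.TargetTable (Col1, Col2)   -- four-part name over the linked server
SELECT Col1, Col2
FROM SourceDb.dbo.SourceTable;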

Any inputs will be much appreciated.

Thanks,

MShah

View 5 Replies View Related

Slow SQL Server Data Transfers When Internet Connected

Mar 18, 2008

We have an information retrieval application in which there is a single connection to a database followed by multiple table open, read, and close commands. Response time is consistently less than 1 second on a LAN.
When Internet connected (not VPN), the first table read is typically fast, but the response time becomes slower and slower after multiple table open, read, and close commands. There seems to be a considerable amount of handshaking based on monitoring of the router's status lights.

View 7 Replies View Related

Import Data From SQL Server 2000 Incredibly Slow

Sep 6, 2006

I have a problem with poor performance when importing data from a SQL Server 2000 database. I use an OLEDB data source in my SSIS package to connect to the SQL Server 2000 database. My 2005 server runs 64-bit, but I don't think that is the issue. With this configuration the import is VERY slow; we are talking about 40+ minutes to get 3.5 million rows with about 20 columns. When I create a test DTS package on the SQL Server 2000 server itself and run it, it is blazingly fast. Has anyone run into something similar?

View 2 Replies View Related

Data Insertion Too Too Slow...

Nov 27, 2007

Hi, the environment is MS SQL Server 2000.

DDL:

create table srchPool(
    tid int identity primary key,   -- identity assumed here so the inserts below can omit tid;
                                    -- SQL Server automatically creates a clustered index for the PK
    taid int,
    s tinyint,
    uid tinyint
);

DML:

insert into srchPool(taid, s, uid)
select article_id, 99, 1484
from targetTBL
where article_content LIKE '% presentation %';

insert into srchPool(taid, s, uid)
select article_id, 55, 1484
from targetTBL
where article_content LIKE '% demonstration %';

-- a few more similar queries ...

The insertions above take about 2000 ms to execute, which is far too slow. Would it be much faster if I inserted the data sets into a temp table instead, like this, and dropped it once its use is finished?

select article_id as taid, 99 as s, 1484 as uid
into #srchPool
from targetTBL
where article_content LIKE '% presentation %';

Many thanks.

View 7 Replies View Related

Data Retrieval Terribly Slow

Jul 27, 2006

Hi,
I'm using ASP.NET 1.1 and SQL Server 2000:
I followed the ASP.NET 1.1 Starter Kit's Commerce application and applied the same data-retrieval principles to the web application I created. For example, I've written this function in a class to return a SqlDataReader:
Public Function GetAdvanceSearch(ByVal s As String, ByVal Ext As Integer, ByVal fdate As DateTime, ByVal tdate As DateTime) As SqlDataReader
    Dim oDrAdSearch As SqlDataReader
    Dim oCmdGetSearch As New SqlCommand("spAdvanceSearch", oComConn)
    With oCmdGetSearch
        .CommandType = CommandType.StoredProcedure
        .Parameters.Add(New SqlParameter("@DialNo", SqlDbType.VarChar)).Value = s
        .Parameters.Add(New SqlParameter("@FDate", SqlDbType.DateTime)).Value = fdate
        .Parameters.Add(New SqlParameter("@TDate", SqlDbType.DateTime)).Value = tdate
        .Parameters.Add(New SqlParameter("@Ext", SqlDbType.Int)).Value = Ext
    End With
    oComConn.Open()
    oDrAdSearch = oCmdGetSearch.ExecuteReader(CommandBehavior.CloseConnection)
    If oDrAdSearch.HasRows Then
        Return oDrAdSearch
    Else
        Return Nothing
    End If
End Function
And when I call this function, I write it this way (assuming the function is in a class called "Calls"):
Dim objCalls As New Calls
DataGrid1.DataSource = objCalls.GetAdvanceSearch(<PARAMS.......>)
DataGrid1.DataBind()
My application is a telephone call recording system and can expect a vast amount of data. On average, a month may produce approximately 50,000 records or more. So when querying a month's data through my web application, the application either gets stuck or retrieval becomes drastically slow, even though I'm using DataReaders for every query scenario. My web application is hosted on a Windows 2000 server and accessed via the local network or intranet.
What are the ways I could make this retrieval faster and more efficient? I would like to hear from anyone who has come across this problem or anyone who could help me with it.
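As a sketch only, this is the kind of index often suggested for a date-range search like this one, with the table and column names guessed from the stored procedure's parameters rather than taken from the real schema:

CREATE NONCLUSTERED INDEX IX_Calls_CallDate_DialNo
ON dbo.Calls (CallDate, DialNo)    -- hypothetical call-log table and columns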
Thanks in advance. Looking forward to a reply from someone.
 

View 3 Replies View Related

Data Access Very Slow In .net As Compared To VB

Jan 9, 2006

 

Hi,

I have migrated my app from VB to VB.Net. A 3-tier app with remoting and COM+.

I am experiencing wait times about 3 times longer than in the VB app.

I am using the DataAdapter.Fill method to fill the DataTable.

I have tried:

Using a DataReader (made things worse)

Using BeginLoadData and EndLoadData

Creating a DataSet and calling Fill with it, so that the round trip to the middle tier is saved.

But whatever I do, the problem seems to lie with the Fill method itself.

Is there any alternative?

Please suggest. This is one of the most important issues; if it cannot be solved, it may lead to scrapping the idea of upgrading to .NET.

Shri

View 10 Replies View Related

Slow OLEDB Source In Data Flow

Feb 2, 2008


Hi All,

I have a simple data flow task composed of only an OLEDB Source, a Conditional Split, and two Execute SQL statements (both insert statements, one after the other). When I run my package in Visual Studio for debugging, I noticed that after pushing around ~9,800 records through the first insert statement and another ~9,800 through the second, the OLEDB Source takes around 3 or 4 minutes to fetch the next set of ~9,800 records. I have set the DefaultBufferMaxRows property of the data flow to 10,000. The query that retrieves the 700,000 records runs for about 2-3 minutes on its own (which I think should be decent enough). Is this expected behavior for SSIS? The expected number of records is 700,000, and it takes forever to finish transferring them. Please help.

View 4 Replies View Related

WMI Data Reader Task Painfully Slow

Nov 16, 2006

HI,

I am querying event log information for a single day using the WMI Data Reader task, and it is taking forever. I am querying the event log of remote servers, but the server executing the process and the queried servers are all in the same domain.

Kindly advise.

Thanks

Shafiq



View 2 Replies View Related

Data Flow Task Running Very Slow

Jun 11, 2007

Hello,



I developed an SSIS package that does a nightly load into a data warehouse. We have an 8-hour loading window - currently the package takes 16 hours to complete.

I isolated the problem to a Data Flow task where about 35% of the time is spent. This task is pretty straightforward:

- OLE DB source, reading roughly 800,000 rows from a SQL Server database

- 13 Lookups in sequence, to get surrogate keys from dimension tables. Lookups are all on GUIDs.

- An aggregation

- OLE DB target, a fact table in a SQL Server database.

It seems unreasonable for this task to take over 5 hours. The majority of the time is spent on the lookups - not so much at the source, target and aggregation.



Any comments and advice will be greatly appreciated.



Thanks.



(PS some machine details:



OS Name Microsoft(R) Windows(R) Server 2003, Standard Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name ARK-SQL
System Manufacturer HP
System Model ProLiant DL380 G5
System Type X86-based PC
Processor x86 Family 6 Model 15 Stepping 6 GenuineIntel ~1866 Mhz
Processor x86 Family 6 Model 15 Stepping 6 GenuineIntel ~1866 Mhz
BIOS Version/Date HP P56, 9/18/2006
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone South Africa Standard Time
Total Physical Memory 3,327.30 MB
Available Physical Memory 938.20 MB
Total Virtual Memory 1.10 GB
Available Virtual Memory 2.78 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys)

View 6 Replies View Related

Export/data Dump To Flat File Is Too Slow

Mar 10, 2006

I am trying to export a table with ~10 million rows to a flat file, and it is taking forever with the SQL 2005 export functionality. I have tried creating an SSIS package with a flat-file destination and the results are the same. In each case it does the operation in chunks of about 9,900+ rows, and each chunk takes ~1-2 minutes, which sounds unreasonable.

I tried bcp, and it fails after a few thousand rows. I tried moving the data to SQL2000 first then to flat file from SQL2K, but the move from SQL2005->SQL2000 was going at the same rate as above.

So the bottleneck seems to be data going out of SQL 2005, no matter what the destination is. I'm wondering if there is some setting that I am missing that would make this run in a reasonable amount of time?

View 1 Replies View Related

Slow Running Query With A High Lob Logical Reads, Lob Data

Jun 1, 2007

We had some slowdown complaints lately, and this query seems to be the culprit almost every single time. The estimated execution plan is a clustered index seek, as there is a clustered index on the uidcustomerid column. Setting profile statistics on shows that every time it executes it does an index seek.

A profiler session showed a huge number of reads for these queries, depending on the value being looked up: 1,500 through 50,000. I set profile IO on, and the culprit is lob logical reads; everything else is 0 or very low. In this case lob logical reads is over 1,700.

Three of the columns in the select statement are text columns. When I take them out of the query, the lob logical reads drop to 0 and then go up incrementally as I add each column back in.

Is there any way to improve the performance without changing the data types to varchar(max)?


select SID,Last_name,Name_2,First_name,Middle_initial,Descriptives,Telephone_number,mainline,Residence,ADL,
DID_number,Svce_street,Svce_town,Svce_state,Svce_appt,Mailing_street,Mailing_town,Mailing_state,Mailing_appt,
Mailing_zip,Listing,Addl_listing,Published,Listed,Gold_number,PIN,status,SSnumber,tax_jurisdiction,
Bill_date,Past_balance,Service_start_date,Service_end_date,LOA,FCC_type,Line_type,I_W,Jacks,Voice_messaging,
vms_ring_cycles,CCS,phonesmarts,ringmate,voice_dialing,Bill_detail,Contact_Number,Contact_extension,
Best_Time,suspend,suspend_start,suspend_end,credits_allowed,credits_granted,home_region,Calling_Plan,Local_Plan,
Local_Plan_Rate,Flat_Rate,Sales_agent,Community,Building_Mgmt,How_Heard,Incentive_1,Incentive_1a,Incentive_1b,
Incentive_1c,Incentive_2,Incentive_2a,Incentive_2b,Incentive_2c,Incentive_3,Incentive_3a,Incentive_3b,
Incentive_3c,block_operator,block_collect,block_group,block_adult,block_call_return,block_repeat_dialing,
block_call_trace,block_caller_id,block_anonymous,block_all_high_toll,block_regional_and_ld,block_DA_Call_Completion,
block_DA,block_3rd_party,bank,prepayment,dial_around_number,custid,waive_interest,Financial_Treatment,
Other_Feature_1_code,Other_Feature_1_rate,Other_Feature_2_code,Other_Feature_2_rate,Other_Feature_3_code,
Other_Feature_3_rate,Other_Feature_4_code,Other_Feature_4_rate,Partial_Account,mail_date,snp_1_date,snp_2_date,
terminate_date,snp1notified,snp1peak,snp1offpeak,snp2notified,snp2peak,snp2offpeak,avg_days_paid,Pulled_Ld,SNP1,
SNP2,Treatment,Collections,Installment,Nynex_BTN,LD_rate,local_discount,to_month,rounds_up,full_package_made,
local_made,PIC,LPIC,tax_exempt_local,tax_exempt_federal,CommissionedAgent,LDRateID,UidCustomerId,
accVchLineClassUSOC,block_Inter_Reg_LD,block_international,block_DA_3rd_Collect,block_DH2,block_ISP_2,block_ISP_3,
block_ISP4_3_GBAS,block_ISP3_3_GBAS,block_collect_only,block_LD_Reg_DA,block_usage_based,block_ISP5_3_GBAS,
block_ISP5_2_GBAS,block_group_adult,csr_PIC,csr_LPIC,csr_SA,csr_exception,cutover_status,cutover_datetime,
OutsideAgent,prfVchAttributes,uidResellerID,Category,uidDealID
from profiles where UidCustomerID in (352199267)
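For reference, one option sometimes suggested when text columns drive up lob logical reads (it is not raised in the thread) is storing small text values in the data row itself; a sketch only, assuming the table is dbo.profiles:

EXEC sp_tableoption 'dbo.profiles', 'text in row', '2000'
-- values up to 2000 bytes are stored in-row; existing rows only move in-row when the text is next updated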

View 3 Replies View Related

Data Transfer From Lotus Notes Very Slow Compared To SQL 2000 DTS

Sep 7, 2006

To extract data from an ODBC source, try the following:

Add an ADO.Net Connection Manager.
Edit the Connection Manager editor and select the ODBC Data Provider
Configure the Connection Manager to use your DSN or connection string
Add a Data Flow Task to your package.
Add a Data Reader Source adapter to your data flow
Edit the Data Reader source adapter to use the ADO.Net connection manager that you added.
Edit the Data Reader source to query for the data you wish to extract.

hth

Donald

Using the steps outlined above as described by Donald Farmer in another post on this forum, I have created an SSIS package which retrieves data from Lotus Notes 6.55. The DSN referenced by the ADO.Net Connection Manager connects to Lotus Notes via the NotesSQL ODBC driver 3.02g.

When I execute the dataflow, data is transferred from Lotus Notes, but the data transfer rate is extremely slow compared to SQL 2000 DTS. In SQL 2000 DTS, we can retrieve just under half a million records from Lotus Notes in about 13 minutes. Utilizing the same DSN on the same machine, SQL 2005 SSIS completes the transfer in about 57 minutes.

Is there anything that can be done to improve the performance in SSIS to retrieve data from Lotus Notes via ADO.Net ODBC?

Thanks!

View 3 Replies View Related

Slow Running Insert After Changing OLE DB Destination Data Access Mode

Jan 4, 2007

Hello,

In a Data Flow Task, I have an insert that occurs into a SQL Server 2000 table from a fixed-width flat file. The SQL Server table that the data goes into is accessed through an OLE DB connection manager that uses the Native OLE DB\Microsoft OLE DB Provider for SQL Server.

In the OLE DB Destination, I changed the access mode from 'Table or view - fast load' to 'Table or view' because I needed to implement the OLE DB Destination error output. The error output goes to a SQL Server 2000 table that uses the same connection manager.

In the OLE DB Destination Editor, the Error Output 'Error' option is configured to redirect the row; 'Set this value to selected cells' is set to 'Fail component'.

Was changing the access mode the simple reason why the insert from the flat file takes so much longer, or could there be other problems?

Thank you for your help!

cdun2

View 32 Replies View Related

Load Fact Table, Very Slow With Lookup - Data Flow Transformation

Apr 9, 2008



Hello,


I have developed some packages to load data into "Fact" tables in the data warehouse.
Some packages are OK, others are not. What is the problem? Some packages load fact tables using lots of "Lookup" data flow transformations inside the data flow task (lookups against dimension tables), and they are very, very slow - far too slow to be chosen as a solution.

Do you have any other solutions that avoid the "Lookup" data flow transformation? Any other solution (SSIS, T-SQL and so on) is welcome to speed up the fact table loading process.
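One T-SQL alternative sometimes used instead of per-row Lookup transformations is resolving surrogate keys with set-based joins against the dimension tables during the load; a rough sketch with hypothetical staging, dimension and fact table names:

INSERT INTO dbo.FactSales (ProductKey, CustomerKey, Amount)
SELECT p.ProductKey, c.CustomerKey, s.Amount
FROM staging.Sales AS s
JOIN dbo.DimProduct  AS p ON p.ProductCode  = s.ProductCode    -- hypothetical business keys
JOIN dbo.DimCustomer AS c ON c.CustomerCode = s.CustomerCode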

Thanks in advance

View 7 Replies View Related

SQL Server 2005 Stored Procedure Is Very Slow Vs. SQL Server 2000

Apr 18, 2008

Hi there,

I was wondering if someone can point out the error, or the thing I shouldn't be doing, in a stored procedure on SQL Server 2005. I want to switch from SQL Server 2000 to SQL Server 2005, which all seems to work just fine, but one stored procedure is causing me a headache.

I could pin the problem down to this query:


DECLARE @Package_ID bigint
DECLARE @Email varchar(80)
DECLARE @Customer_ID bigint
DECLARE @Payment_Type tinyint
DECLARE @Payment_Status tinyint
DECLARE @Booking_Type tinyint

SELECT @Package_ID = NULL
SELECT @Email = NULL
SELECT @Customer_ID = NULL
SELECT @Payment_Type = NULL
SELECT @Payment_Status = NULL
SELECT @Booking_Type = NULL

CREATE TABLE #TempTable(
    PACKAGE_ID bigint,
    PRIMARY KEY (PACKAGE_ID))

INSERT INTO #TempTable
SELECT PACKAGE.PACKAGE_ID
FROM PACKAGE (nolock)
    LEFT JOIN BOOKING ON PACKAGE.PACKAGE_ID = BOOKING.PACKAGE_ID
    LEFT JOIN CUSTOMER (nolock) ON PACKAGE.CUSTOMER_ID = CUSTOMER.CUSTOMER_ID
    LEFT JOIN ADDRESS_LINK (nolock) ON ADDRESS_LINK.SOURCE_TYPE = 1 AND ADDRESS_LINK.SOURCE_ID = CUSTOMER.CUSTOMER_ID
    LEFT JOIN ADDRESS (nolock) ON ADDRESS_LINK.ADDRESS_ID = ADDRESS.ADDRESS_ID
WHERE
    PACKAGE.PACKAGE_ID = ISNULL(@Package_ID, PACKAGE.PACKAGE_ID)
    AND PACKAGE.CUSTOMER_ID = ISNULL(@Customer_ID, PACKAGE.CUSTOMER_ID)
    AND PACKAGE.PAYMENT_TYPE = ISNULL(@Payment_Type, PACKAGE.PAYMENT_TYPE)
    AND PACKAGE.PAYMENT_STATUS = ISNULL(@Payment_Status, PACKAGE.PAYMENT_STATUS)
    AND BOOKING.BOOKING_TYPE = ISNULL(@Booking_Type, BOOKING.BOOKING_TYPE)
    -- If the line below is included, the request takes about 90 seconds; it takes 1 second if it is commented out
    --AND ADDRESS.EMAIL LIKE '%' + ISNULL(@Email, ADDRESS.EMAIL) + '%'
GROUP BY
    PACKAGE.PACKAGE_ID

DROP TABLE #TempTable


The request performs quite well on SQL Server 2000, but on SQL Server 2005 it takes much longer. I have already installed SP2 x64; I'm running SQL Server 2005 in an x64 environment.
As I stated in the comment in the query, it takes 90 seconds to finish with the line included, but only 1 second if I exclude it.
I think there must be something wrong with the joins, or something else that has maybe changed in SQL Server 2005. All the joined tables have a primary key.
Maybe you folks can spot the error / mistake / wrong way of doing things easily.
I would appreciate any help you can offer to solve this problem.
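For illustration only (this is not the poster's code): the commented-out line is a "catch-all" style filter, and one rewrite sometimes suggested is to test the parameter for NULL explicitly instead of wrapping the column in ISNULL. Note that rows with a NULL e-mail address are treated slightly differently than with the original form. A sketch of the predicate in isolation, using the ADDRESS table from the query above:

DECLARE @Email varchar(80)
SELECT @Email = NULL

SELECT ADDRESS.ADDRESS_ID
FROM ADDRESS (nolock)
WHERE (@Email IS NULL OR ADDRESS.EMAIL LIKE '%' + @Email + '%')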

On the web I saw that there is a Cumulative Update 4 for the SP2 which fixes the following:





942659 (http://support.microsoft.com/kb/942659/)
FIX: The query performance is slower when you run the query in SQL Server 2005 than when you run the query in SQL Server 2000

Anyhow, I think the problem is something else; I haven't tried the cumulative update yet, as I believe the reason this query takes ages to process is something different and more general.

Thanks again for any help

Best regards,
Pascal

View 9 Replies View Related

SQL Server Very Slow

Mar 29, 2001

Hi,

we have performance problems with MS SQL Server 2000. We upsized an Access
2000 application to MS SQL server, using linked tables. Most of the time
the performance is fine (there are at most 10 users connected to the server
at the same time). However, it regularly happens that the database stops
responding. Queries, which normally take 20 ms to execute, require 20
seconds or more. In the Access client this looks as if Access has hung, it
is not responding, even though it eventually comes back to life. What I
have found out is that if I restart the SQL server, the problem disappears
and the performance stays fine for some time. This in mind, I set up a
batch which stops and restarts the SQL at night. However, recently the
problem started appearing even when the SQL server was running for only a
few hours.
I also looked at the performance monitor at both the client workstation and
the server, and even when the response times are slow, the processor usage
both at the workstations and the server is under 10 percent.
I wonder whether anybody could help me with this problem. I realise that
using linked tables in Access is not the best thing for achieving good
performance, but I still would expect at least decent performance. At the
moment the situation is worse than if we were using just Access.
For your information, the computer which the SQL server runs on is a dual
processor Pentium Pro 200 MHz, with 320 MB RAM and a SCSI RAID. The server
is the only Windows 2000 domain controller on the network and it runs
Active directory. Plus there is also Exchange server 5.5 installed on this
server. This looks like a lot for a single server, but please bear in mind
that there are only 15 users on the network.
It may be also interesting to know that we have only recently upgraded from
SQL server 7 to SQL server 2000, but we were experiencing the same problem
before, even though not so often.

Thank you for any help.

Tomas Koutny
Elink Ltd

View 4 Replies View Related

SQL Server 7 Slow

Jul 15, 2004

When I run a Windows Server 2003 machine with a web server on it on the same network, SQL clients connected to the SQL Server on another computer experience slow queries. What could the problem be? When I disconnect the Windows 2003 server, everything goes faster. Why?

View 2 Replies View Related

SQL Server Very Slow

Jul 20, 2004

Hiya folks,
This is more a request for some input from people with more experience of SQL than myself. A problem has shown itself on my SQL server over the last week or so: the server will 'slow down' intermittently, almost as if the connection to the server has been lost for about 30 seconds. All will be fine for another minute or so, and then the same problem occurs.

The only way I've found to get round this problem is to stop the SQL server and completely restart the server that SQL resides on, then restart SQL. This cures the problem for about a day.

Any offers??

View 9 Replies View Related

SQL-Server Very Slow Via VPN (WAN)???

Jan 14, 2005

Hello,

I've written a program which communicates via ODBC with multiple database platforms. On a local network everything seems to be OK, but when I connect via VPN (2 MBit/s S-DSL) to MSSQL (2000 SP3), the connection is not only very slow, it seems that MSSQL only uses 1% of the bandwidth. I don't think 0.25 KByte/s is normal speed. A query takes about 5-10 minutes. (And I do a lot of queries...)

If I connect to an Oracle DB, the full bandwidth is used (125 KByte/s).

Is there a problem with SQL 2000? How can I fix this behaviour?

so long

View 10 Replies View Related

SQL Server Slow

Nov 4, 2005

Hi,

I am having a problem accessing my SQL Server database using either Enterprise Manager or Query Analyzer. It is awfully slow. Each time I click to expand a database in Enterprise Manager, it takes about 25 minutes to do anything. I was running a DTS package yesterday which failed, and I have had this problem since. If I access the database via my app, everything seems to run at normal speed. In Task Manager, the SQL Server process is using 750 MB of memory and 750 MB of virtual memory.

Has anyone seen this and knows how to fix it?

Thanks in advance

View 2 Replies View Related

Server Slow Down

Jun 16, 2008

Does anyone have any experience with a server slowing down over time to the point that it must be rebooted? This occurs over a time frame of a few days to as long as a week. The server has a single Xeon 3.6 processor and 8 GB of RAM. It executes production SQL scripts against databases contained on the server, as well as a data warehouse stored on an AS400 server.

After rebooting, all jobs seem to execute in a reasonable time frame, according to their size and scope.

View 9 Replies View Related

Slow Server

Jul 13, 2007

The server was running out of space, so I dropped a database to free up space. The server is slow now; it takes more time to query or delete records than normal. What happened, and how do I fix it?

=============================
http://www.sqlserverstudy.com

View 3 Replies View Related

SQL Server Too Slow

Jul 20, 2005

When I query or browse databases or tables in SQL Server 2000, it works extremely slowly. It started behaving like this from one day to the next. I tried reinstalling, but it stayed the same. Now I'm installing Service Pack 3, and curiously, executing the replsys.sql, replcom.sql and repltran.sql scripts in the installation is extremely slow too. Has anyone experienced this or have any ideas? Thanks in advance.

View 5 Replies View Related

SQL Server Management Studio Run Extremely Slow On Server.

Mar 21, 2007

I installed SQL 2005 on my server (a very high-spec server), but I wonder why every time I launch Management Studio it takes forever to load... about 5 minutes.

When I right-click on a table and click to view its properties, it can take up to 5 minutes to display the details.

Is there any setting / configuration that I have configured wrongly?

View 4 Replies View Related

Slow Running Server

May 2, 2001

Our server is running slowly. There are no locks, and the server has been rebooted, but the problem is still there. This has been going on for some time now. I intend to restart the server. Does anybody have a quick solution? Please help, and thanks for your assistance!

View 4 Replies View Related

Slow Running Server

May 17, 2001

What should I look out for in 6.5 if I want to troubleshoot slow running?
Thanks.

View 2 Replies View Related

Server Slow Down Problem

Aug 23, 2001

Our SQL Server needs to be rebooted every two weeks, sometimes even sooner. Otherwise it gets extremely slow, and I can't even open any tables in Enterprise Manager. Also, users cannot type any info into the application screens, and it takes forever to change from one screen to another. Can somebody please suggest how to avoid this situation, or any idea of why it happens?

Thanks a lot.

View 2 Replies View Related

SQL Server Running Slow Down

Jun 6, 2001

Hi,

When I first start my SQL Server, it runs fast. 10 to 15 days later, SQL Server performance is very slow. After I restart the SQL service, it becomes normal again.

Thanks
Mohan

View 1 Replies View Related

Slow Performance In Sql Server

Jun 5, 2004

Hi

We are facing a performance-related problem using SQL Server 2000.

We have one stand-alone P4 PC (128 MB RAM), and around 30 users access the SQL Server over the network.

We have written our application in VB 6 with SQL Server 2000 as the back end. We have used stored procedures wherever necessary, and the cursor location is server-side.

When we start with 5 users it is not slow, but when all the users, say 30, come in, it slows down.

Can someone help to find out what the problem is?

Thanks

View 1 Replies View Related

Help With SLOW SQL Server Performance

Nov 29, 2005

I'm still new to SQL Server, so some of my lingo/verbiage may be incorrect; please bear with me.

The company I work for relies strictly on ASP and SQL Server for 85% of its daily operations. We have some Access projects and some VB projects as well, but the majority is ASP and SQL Server.

Previously we had 2 T1 lines with something like 3 MB apiece and a handful of Dell servers. Our main server is also a Dell running Windows Server 2003 and is hosted through a reputable company here in town. They have a host of fiber lines running all over, so I know we're getting good throughput. We've actually just upgraded to a DS3, but we're still working out the kinks with that. Anyway, I just want to eliminate that up front - we have great connection speeds.

The problem lies, I believe, in our database design. The company supposedly had a DBA come in and help set up the design some 3 or 4 years ago; however, even with my limited knowledge I feel like something is just not working right.

Our main table is "Invoices", which is obviously all of our invoices, ever. This table has an identity field "JobID" which is also the clustered index. We have other indexes as well, but it appears they're just scattered about. The table has probably 30-40 fields per row and ONLY 740,000 rows - tiny in comparison to what I'm told SQL Server can handle.

However, our performance is embarrassing. We've just landed a new client who's going to be bringing us big business, and they're already complaining about the speed of their website. I am just trying to figure out ways to speed things up. SQL is on a dedicated machine, I believe with dual Xeon processors and a couple of gigs of RAM, so that should be OK. The Invoices table I spoke of is constantly accessed by all kinds of operations, as it's the heart of what we do. We also have other tables which are joined to this table to make up the reporting we do for clients.

So I guess my question is this: should the clustered index be the identity field, and is that causing us problems? We use this field a lot for accessing a single invoice at a time, and from what I understand this makes it a good clustered index, because the index IS the JobID we're looking for. But when it comes time to do reporting for a client, we're not looking at this field; we just pull the records for that client's number. And we only have 1,400 clients at this point. So if we were to make the "ClientID" field the clustered index, it would be much faster to zero in on the group of invoices we want, because the ClientID is ALWAYS included in our queries.
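For reference, a lower-risk step often tried before moving the clustered key is a nonclustered index on the client column, using the table and column names given above:

CREATE NONCLUSTERED INDEX IX_Invoices_ClientID
ON dbo.Invoices (ClientID)    -- JobID stays the clustered key; client-wide reporting queries can seek on this index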

But because a "DBA" came in to design this setup, everyone is afraid to change it. I guess it's hard to explain without people sitting here going through the code and look at the structures of all our tables - but I guess what I need is like a guide of what to do to easily increase performance on SQL Server and the proper use of Clustered and Non-Clustered Indexs and how to mix and match those.

Sorry I wrote a book.
Ideas? This place has always helped me before, so thanks in advance!

View 14 Replies View Related

New Bigger, Better SQL Server...slow

Mar 23, 2006

Hi,

We just put our main accounting database (50 GB total, 8 GB largest table - GLTRAN) on a new Windows Advanced 2003 server with 8 GB of memory. Everything is essentially the same as the old box, aside from the fact that it's on Windows Advanced 2003 Server and it's using LUNs as the E: drive where the SQL database is kept. It runs fine for the most part, except this one report takes literally 20 times longer to run than on the old box.

It's SQL Enterprise 2000 SP4 (also the same). Are there new config options for SQL when running on a 2003 server? Or is it how the OS is handling the SQL service? I'm perplexed. It's not indexes - I still have the old box and load the current database onto it for testing purposes, and the report runs like lightning on it.

Thanks,
Dave

View 14 Replies View Related






