I am presently tasked with improving the performance of a chatty application. What I mean by a chatty application is one that makes multiple calls to the database server for each user request. In many cases it appears that many of the windows are making multiple calls to the database to populate their dialog boxes, and in a couple of cases more than 10 dialog boxes are populated this way.
To me this is something of an old issue, but my response is (1) cache the dialog information as much as possible and (2) if it is not cached, make one call to the database to return all of the data for the dialog boxes. For update calls, again, make a single call to the database rather than one call for each individual row of a table that is updated.
I admit that some of this will complicate the application code; however, my perspective is that I am supposed to look at this from the database perspective and not from the perspective of "programmer convenience." There will be times when something that would otherwise get too hairy for a programmer will dictate an additional trip to the server, but this should not normally be the case.
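To illustrate the single-call idea (a sketch only; the procedure and table names below are hypothetical), one stored procedure can return every lookup list a dialog needs as separate result sets, so the client makes one round trip and walks the result sets with NextResult:

-- Hypothetical sketch: one procedure returns all of a dialog's lookup lists
-- as separate result sets, replacing several individual round trips.
CREATE PROCEDURE dbo.GetDialogLookups
AS
BEGIN
    SET NOCOUNT ON;

    SELECT CountryID, CountryName FROM dbo.Countries     ORDER BY CountryName;
    SELECT StatusID,  StatusName  FROM dbo.OrderStatuses ORDER BY StatusName;
    SELECT RegionID,  RegionName  FROM dbo.Regions       ORDER BY RegionName;
END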
SELECT T1.F3 FROM T1 INNER JOIN T2 ON T1.F4 = T2.F4 WHERE (T1.F1 > @iNum AND T2.F1 > @iNum) OR ( @iNum2 * (T1.F1 - T2.F1)/(T1.F2 - T2.F2) ) + (T1.F1 - ((T1.F1 - T2.F1)/(T1.F2 - T2.F2) * T1.F2) ) > @INum
As you can see, the second part of the WHERE (after the OR) is much more complicated than the part before the OR. My query would run a lot faster if it tried the first part of the OR and didn't bother with the second part if the first part was satisfied. Is there any way to do this?
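SQL Server makes no guarantee about the order in which predicates in a WHERE clause are evaluated, so there is no reliable short-circuit inside a single OR. One workaround worth testing (a sketch only, reusing your column names and assuming the compared columns are not nullable) is to split the OR into two UNION ALL branches so the expensive arithmetic only runs for rows that fail the cheap test:

SELECT T1.F3
FROM T1
INNER JOIN T2 ON T1.F4 = T2.F4
WHERE T1.F1 > @iNum AND T2.F1 > @iNum

UNION ALL

SELECT T1.F3
FROM T1
INNER JOIN T2 ON T1.F4 = T2.F4
WHERE NOT (T1.F1 > @iNum AND T2.F1 > @iNum)   -- only rows the cheap test rejected
  AND ( @iNum2 * (T1.F1 - T2.F1) / (T1.F2 - T2.F2) )
      + ( T1.F1 - ((T1.F1 - T2.F1) / (T1.F2 - T2.F2) * T1.F2) ) > @iNum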
Hi, I use SQL Server 2005 on my development machine, and whilst this machine isn't as powerful as the live server, it does at times seem a little slower than I would expect. So I've been wondering if there is any way for me to tune the machine so that SQL Server is better able to make use of the resources? I am using Windows Server 2003. Short of tweaking the actual database, which is still under development, does anyone have any tips to increase performance? Thanks, -- Dylan Parry http://electricfreedom.org | http://webpageworkshop.co.uk Programming, n: A pastime similar to banging one's head against a wall, but with fewer opportunities for reward.
I've written a small application that uses a Microsoft SQL Server 2000 database, and lately I've been experiencing a lot of performance problems with timeouts and data just taking too long to retrieve. I was wondering what I could do to help with this.
Details: the program I wrote simply takes Windows event logs and uploads them into a central database. I have also built a small ASP.NET page where users can write out queries or save them for later use. The program hits our servers, grabs the logs remotely, inserts them into SQL one by one, and clears out the logs when finished. It runs once a day and generates about 12,000 events daily (mostly logons and logoffs). Once it starts, it takes about two and a half hours to finish going through everything.
Database design: a single table, about 3.5 million rows now, taking up about 3 GB of hard drive space.
Column definitions (name, data type):
EventEntryID - INT, primary key (clustered), auto-increment identity
Log - varchar(300)
Type - varchar(300)
Date - smalldatetime
Source - varchar(300)
Category - varchar(300)
Message - varchar(8000)
EventID - varchar(300)
UserName - varchar(300)
Computer - varchar(300)
I know the database design can probably use some work, but it's hard to do so now because of the massive amount of data, without killing the transaction log.
I know it's a small server, but this is pretty much the largest thing this server deals with
I've tried to monitor SQL with Perfmon and the SQL performance counters and I just don't see much of anything going wrong with it there, so I'm thinking the hardware isn't an issue, but then again, I'm not very familiar with SQL.
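Without seeing the exact queries it is hard to be specific, but with a single 3.5 million row table the first thing worth checking is whether there are nonclustered indexes on the columns the saved queries filter on; the clustered index on EventEntryID only helps lookups by that ID. A sketch (the table name dbo.Events is a stand-in for whatever the table is actually called):

-- Sketch only: index the columns most searches filter on; adjust to the
-- WHERE clauses the users actually run from the ASP.NET page.
CREATE NONCLUSTERED INDEX IX_Events_Date     ON dbo.Events ([Date]);
CREATE NONCLUSTERED INDEX IX_Events_Computer ON dbo.Events (Computer, [Date]);
CREATE NONCLUSTERED INDEX IX_Events_EventID  ON dbo.Events (EventID, [Date]);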
Hello everyone, does anyone know some tips or articles that describe how SQL Server 2000 can be improved? I have an application which displays newspapers. The text of the newspapers is saved in the DB. Performance is OK, but I have performance problems when I do a full-text search.
1) Hardware: P4 Xeon dual core, 2.4 GHz, 2 GB RAM.
2) Number of articles in the database: ~255,000 records (between 10 and 400 words each).
3) The database is indexed (I am searching with CONTAINS and not with the LIKE keyword).
4) In my query I am searching in 3 columns (title, leadtext, content).
5) Results (searching for one word, stopping after 150 results): the time to get a result from the database lies between 5 and 10 seconds (average: 6.77 sec)! This is the time only for getting the dataset from SQL Server; the time for manipulating the dataset and binding it to my repeater is about 15 milliseconds. Using a SqlDataReader didn't improve the performance; I got the same results.
In future the number of records will be three to five times higher than now. I am afraid that the user will then have to wait half a minute to get a result of 150 search hits. My questions: are you seeing similar results in your applications? Are these times acceptable? The SQL Server 2000 (Professional) is installed in its default way with the default settings. Are there possibilities to modify these settings to improve performance? Where can I find info about that?
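One thing that sometimes helps in this situation is switching from CONTAINS in the WHERE clause to CONTAINSTABLE with its top_n_by_rank argument, so the full-text engine stops after the best 150 matches instead of ranking every hit before TOP is applied. A sketch, assuming a table called dbo.Articles with an integer key and a full-text index over the three columns (names are placeholders):

-- Sketch: let the full-text engine return only the best 150 matches.
-- The '*' searches every column in the full-text index (title, leadtext, content).
SELECT a.ArticleID, a.Title, ft.[RANK]
FROM CONTAINSTABLE(dbo.Articles, *, @searchWord, 150) AS ft
INNER JOIN dbo.Articles AS a ON a.ArticleID = ft.[KEY]
ORDER BY ft.[RANK] DESC;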
My colleague and I are having some difficulties regarding the speed of a LIKE query on our intranet system. Please see the quote below, as posted on another SQL forum (to which no one has responded yet).
Quote: Howdy,
I've taken control of an SQL Database with an ASP front end. The CPU usage of the SQL Server process periodically jumps up to between 75-99%. I've managed to identify the ASP page that is causing it and it's a search page.
It's basically doing something like this:
SELECT JobNumber, CustomerName, CustomerAddress, CustomerPostCode FROM Customers WHERE CustomerName LIKE '%query%' OR CustomerAddress LIKE '%query%' OR CustomerPostCode LIKE '%query%'
Where query is the value the user has searched on.
The CustomerName and CustomerPostCode fields are varchar fields, but the CustomerAddress field is a text field.
I know if I replace the CustomerAddress text field with a series of varchar fields then I could make them non-clustered indexes, but would this help speed things up with a LIKE query?
There's a lot of work required on the ASP side if I'm going to split the field out, so I only want to do it if I'm gonna get a significant benefit from doing it.
Failing this, is there a better way of performing a search like this?
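One thing to be aware of: a LIKE with a leading wildcard ('%query%') can never seek on an index, so splitting CustomerAddress into indexed varchar columns is unlikely to change much on its own. If the searches are word-oriented, full-text indexing is usually the bigger win. A sketch of what the query might look like once a full-text index covers the three columns (CONTAINS can use that index, unlike a leading-wildcard LIKE):

-- Sketch, assuming a full-text index has been created over the three columns;
-- '*' searches every column in that index. @query should be formatted for
-- CONTAINS, e.g. '"smith*"'.
SELECT JobNumber, CustomerName, CustomerAddress, CustomerPostCode
FROM Customers
WHERE CONTAINS(*, @query);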
I'm facing a performance issue with SELECT COUNT(*) when paginating my results. My actual query joins 4 tables with proper PK and FK constraints and indexes applied.
select top 100 * from table1 left outer join table2 on tid=rid left outer join table3 on tid=aid inner join shipment on sid=sid where datetime>=convert(datetime,'20070810 00:00:00') and datetime<=convert(datetime,'20070920 23:59:59')
The query above takes 200 ms to 600 ms to execute.
select count(*) from table1 left outer join table2 on tid=rid left outer join table3 on tid=aid inner join shipment on sid=sid where datetime>=convert(datetime,'20070810 00:00:00') and datetime<=convert(datetime,'20070920 23:59:59')
The query above takes 2,000 ms to 4,000 ms to run.
The total number of records that fulfil the WHERE clause is approximately 5,000 to 6,000. I checked the execution plan and found that the server is actually utilizing the indexes, with operations like index scans and index seeks.
Are there any other things that I can do to improve the counting of total records?
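One thing worth testing, if table2 and table3 match at most one row each (so the LEFT JOINs neither add nor remove rows): drop them from the count query entirely, since they cannot change the total. A sketch with the joins spelled out (the table prefixes on the columns are guesses, since the original query does not qualify them):

-- Sketch: count only what determines the row total. If the LEFT JOINs are
-- one-to-one they can be dropped from the count without changing it.
SELECT COUNT(*)
FROM table1
INNER JOIN shipment ON table1.sid = shipment.sid
WHERE table1.[datetime] >= '20070810'
  AND table1.[datetime] <  '20070921';   -- half-open range also catches 23:59:59.997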
I have a varchar(max) field that is on its way to a Term Lookup task, so it requires conversion to DT_NTEXT before it gets there. The conversion is taking a really, really, really long time. In my current example I have about 400 rows. Granted the length of the varchar(max) if I 'select max(len(myColumn))' is approximately 14,000,000 so I understand that I'm moving a lot of data. But I'm on a x64 machine with 4 dual core processors and watching Task Manager I can see that I'm only using 1 core out of 8. Anything I'm missing that can speed this up?
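One thing that might be worth trying (this is a guess at the package layout, and the object names below are made up): do the conversion in the source query itself with a CAST to NVARCHAR(MAX), so the column arrives in the pipeline as DT_NTEXT and the single-threaded Data Conversion transformation is not needed at all:

-- Sketch: cast in the OLE DB source so the column already enters the data
-- flow as DT_NTEXT; column and table names are placeholders.
SELECT DocumentID,
       CAST(MyColumn AS NVARCHAR(MAX)) AS MyColumnUnicode
FROM dbo.Documents;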
I have a number of complex search stored procedures that use the following syntax to try to simplify the code.
WHERE @SearchParam IS NULL OR SearchCol = @SearchParam
Unfortunately, it appears that this is really inefficient as far as the database is concerned.
If I run the following query on the AdventureWorks database (SQL Server 2005 with SP2 and fixes up to v3054)
SET STATISTICS IO ON
DECLARE @CustomerID int
SET @CustomerID = 1
SELECT SalesOrderID FROM Sales.SalesOrderHeader WHERE @CustomerID IS NULL OR CustomerID = @CustomerID
SELECT SalesOrderID FROM Sales.SalesOrderHeader WHERE CustomerID = @CustomerID
I see that the first SELECT results in 45 logical reads, whereas the second results in only 2 logical reads.
Does anyone have any idea how I can get the benefit of a search procedure that does not have loads of IF blocks, or dynamic SQL without this major performance issue?
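Two common workarounds on SQL Server 2005, sketched below against the same AdventureWorks table: add OPTION (RECOMPILE) so a fresh plan is compiled for the actual parameter values (cheap to try, though on 2005 the gain can be modest), or build only the predicates you actually need with sp_executesql, which keeps a reusable plan per distinct WHERE clause and avoids long IF blocks:

-- Option 1: recompile this statement on each execution.
SELECT SalesOrderID
FROM Sales.SalesOrderHeader
WHERE @CustomerID IS NULL OR CustomerID = @CustomerID
OPTION (RECOMPILE);

-- Option 2: build the WHERE clause dynamically and keep it parameterised.
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT SalesOrderID FROM Sales.SalesOrderHeader WHERE 1 = 1';
IF @CustomerID IS NOT NULL
    SET @sql = @sql + N' AND CustomerID = @CustomerID';
EXEC sp_executesql @sql, N'@CustomerID int', @CustomerID = @CustomerID;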
We have a table that is 800GB. We are planning to re-build the clustered index on this table to a different filegroup. The new filegroup and files associated with it will sit on a SAN which will have a 1.5TB allocation. Does anyone have any suggestions in regards to how many files to have associated with the filegroup to provide optimal performance? Apparently we could have 3 LUNS (500gb each), so would 1 file on each LUN provide additional performance as opposed to one file on 1 LUN?
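As a sketch of the layout being discussed (database name, drive letters, and sizes are made up): one data file per 500 GB LUN in the new filegroup, all pre-sized the same so SQL Server's proportional fill spreads the IO evenly across the three LUNs:

-- Sketch only: one equally sized data file per LUN in a new filegroup.
ALTER DATABASE MyBigDB ADD FILEGROUP FG_BigTable;
ALTER DATABASE MyBigDB ADD FILE
    (NAME = FG_BigTable_1, FILENAME = 'E:\Data\FG_BigTable_1.ndf', SIZE = 400GB),
    (NAME = FG_BigTable_2, FILENAME = 'F:\Data\FG_BigTable_2.ndf', SIZE = 400GB),
    (NAME = FG_BigTable_3, FILENAME = 'G:\Data\FG_BigTable_3.ndf', SIZE = 400GB)
TO FILEGROUP FG_BigTable;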
Does anyone know of any good components I can buy in to sit on top of my SQL Server to provide the kind of search functionality that Google has: 'Did you mean...', nearest match, etc.? I have set up SQL Server Full-Text Search, but building all the extra functionality will take some time and I think it will be more cost-effective to buy a component in. If anyone has used any, could you let me know of your experiences please? Cheers, Scott
Is there a configuration setting or a trick to improve the speed of access to the inserted and deleted tables within a trigger? Whenever a trigger is called, the access to inserted or deleted constitutes approximately 95% of the execution time.
Is there a way to improve access to inserted and deleted other than copying the data to another table?
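It is hard to say without seeing the trigger, but the usual culprit is referencing inserted/deleted several times, or row by row in a cursor. A minimal set-based sketch, with hypothetical table and column names, that touches each pseudo-table exactly once and joins on the primary key:

-- Hypothetical sketch: bail out early when nothing changed, then read
-- inserted and deleted once each with a set-based join on the key.
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER UPDATE
AS
    IF @@ROWCOUNT = 0 RETURN;      -- must be the first statement in the trigger
    SET NOCOUNT ON;

    INSERT INTO dbo.OrdersAudit (OrderID, OldStatus, NewStatus, ChangedAt)
    SELECT d.OrderID, d.Status, i.Status, GETDATE()
    FROM inserted AS i
    INNER JOIN deleted AS d ON d.OrderID = i.OrderID
    WHERE d.Status <> i.Status;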
I am using the ASP.NET 2.0 login control. This control connects to the ASPNET.MDF database. I built another application using a Windows service, and this application also connects to the ASPNET.MDF database. The problem is that the first application that connects to the database locks the database, so the other application cannot use the database until the first application is closed. I am going in circles about this and just don't know what to do. Please please please, love me do.
I have a SQL Express DB that I would like to be able to access from both a Windows app and a web app (both running on the same machine) at the same time. Is this possible? I've been able to connect either one or the other, but not both at the same time. Thanks
If there are 2 different web applications connecting to a SQL Server database through an ODBC connection, and both of them have full privileges to update, create, add columns, etc., would there be any SQL Server issues when they are actually live?
Hi to all. Does anyone know of some useful resources for programming accounting and general ledger applications (SQL scripts, books, etc.)? Thanks to all.
I am running SQL Server 2005 Developer Edition x86 with SSRS SP1 on Windows Server 2003 SP1.
My SQL Server is running and SSRS is working. When I come to run certain installs though, my Server name is not present in the dropdowns or in the browse for installed server lists.
I entered the name of my SQL Server manually, but when I ran the application it gave me an error: 00250 unable to run database locator service.
I am a computer science engineering student and we have an assignment to create a web project with a database.
Our lecturers will store our submissions on a DB server, but they want our projects to enable restoring our own databases.
That is, they want to be able to copy our databases into a new DB that they have just created, and they want us to enable this feature in an install.aspx page in our project. The new DB is guaranteed to have the same name as ours.
So now, what am I supposed to include in install.aspx?
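Assuming install.aspx is allowed to run ad-hoc T-SQL against the lecturers' server, one straightforward approach is to ship a .bak file with the project and have the page restore it into the database they created. A sketch (all names and paths below are placeholders you would read from configuration):

-- Sketch: restore the shipped backup into the pre-created database of the
-- same name, moving the files to the target server's data directory.
RESTORE DATABASE StudentProject
FROM DISK = N'C:\Install\StudentProject.bak'
WITH REPLACE,
     MOVE N'StudentProject_Data' TO N'D:\SQLData\StudentProject.mdf',
     MOVE N'StudentProject_Log'  TO N'D:\SQLData\StudentProject_log.ldf';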
Hi all. I have a 3-tier web application and I want to use SQL transactions, but not in each class. Example: I have 3 classes that each need to update their own table in one transaction; how do I use SQL transactions? I tried to do it this way, but it doesn't work:
sub cmdUpdate_Click
    dim a as new Class1
    dim b as new Class2
    dim c as new Class3
    dim cn as new SqlConnection(...)
    cn.Open()
    ' ... begin transaction ...
    a.cn = cn   ' I pass the same connection via a property to each class
    a.Update()
    b.cn = cn
    b.Update()
    c.cn = cn
    c.Update()
    ' ... commit transaction ...
    cn.Close()
end sub
Maybe that logic is incorrect. I don't know if my explanation is clear.
Is it possible to have a single database for various applications? For example, a users table that has name, email, and a registration field: one application inserts into this users table, and another application also inserts into the same table.
Our company wants to run a web-based application in the following way:
Browser --> Web Server --> SQL Server
The SQL Server is part of the corporate domain and hosts about 25 other databases.
Should we dedicate an extra SQL Server to run the web apps, or would it be safe to run the web apps on the corporate SQL Server? Can anyone point me to links on this subject? Thank you.
What is the correct way to move applications from one server (SQL 2000) to another server (SQL 2005)? I have to move all the databases as well as the applications from one server to another. Here is my procedure: 1) back up all 65 databases on the first server (600 GB total); 2) copy all the backup files to the destination server; 3) put all databases into single-user mode and take a transaction log backup; 4) restore all databases to the destination server WITH NORECOVERY and restore all transaction logs WITH RECOVERY; 5) stop server 1 and point the applications to the new server.
So how much downtime will I have to face, since it is a 24/7 production server? Can you please give me the correct details so I can apply them?
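To keep the outage short, the usual pattern is to restore each full backup WITH NORECOVERY on the new server ahead of time, then during the cut-over take only a final log backup per database, restore it WITH RECOVERY, and repoint the applications; the downtime is then roughly the time for the tail-log backups plus the switch-over, not the time to copy 600 GB. A sketch for one database (names and paths are placeholders):

-- On the old SQL 2000 server, during the cut-over window:
BACKUP LOG Sales TO DISK = N'\\NewServer\Backups\Sales_tail.trn';

-- On the new SQL 2005 server (full backup already restored WITH NORECOVERY):
RESTORE LOG Sales FROM DISK = N'D:\Backups\Sales_tail.trn' WITH RECOVERY;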
Hi All, during the past couple of years I have been maintaining the company's Access databases; in the coming weeks I will be migrating the data to SQL Server. I will be using the Access forms as a front end to access the data while we develop a new front end in VB.NET. I was wondering if anyone can recommend an application that makes an exact replica or backup of the SQL Server in case the SQL Server fails. With MS Access, if a record was corrupted we needed to go back to the previous day's backup of the database; we can't afford to have this issue any longer. Regards, John
Hi there! I'm programming in Delphi and am new to querying MS SQL Server. Is there a way to monitor the queries sent to the server and something of what they return? At least the execution time would be interesting... Thanks in advance, Fritz
I am looking for the right tools to do application monitoring. I'm hoping to find one single tool that can do the entire job, but if it does not exist then a few monitoring apps would do as well. I need the ability to do the standard things like testing for ping, checking that Windows services are running and restarting them after some threshold is met, and WMI. Some of the trickier things I need it to do are:
* Parse log files looking for specific error codes, with the ability to raise an alert only if it sees that error X times over some period of time.
* Run custom SQL queries like row counts and max values returned from a query, and alert if that condition is met X times over some period of time.
* Run an external app that does its own custom testing and act on its results, which could mean parsing the result file from the app.
Keep in mind cost is not the issue right now, so this can be anything from freeware to some type of enterprise solution that our company can use. Does anyone know of any tools that you can point me towards? Thanks...
I'm new to VB and SQL Server. I have a Windows Forms application and a web application that need to connect to the same database (named WebCenter). But when I try to connect to it from the web app while running the Windows Forms app, this error message appears:
Unable to open the physical file "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WebCenter_Data.MDF". Operating system error 32: "32 (The process cannot access the file because it is being used by another process.)". Unable to open the physical file "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WebCenter_Log.LDF". Operating system error 32: "32 (The process cannot access the file because it is being used by another process.)". Cannot open database "WebCenter" requested by the login. The login failed. Login failed for user 'CARLOS-PC\ASPNET'. File activation failure. The physical file name "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WebCenter_Log.LDF" may be incorrect.
Can anyone give me a hand or point me in the right direction?
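Operating system error 32 usually means two connection strings are both using AttachDbFilename / a user instance, so each application tries to attach the same .mdf exclusively. One way around it (a sketch; the file paths are taken from the error message above) is to attach WebCenter once to the SQL Express service and have both the Windows Forms app and the web app connect by database name instead of by file name:

-- Sketch: attach the database once, then use "Initial Catalog=WebCenter"
-- (no AttachDbFilename, no User Instance) in both connection strings.
CREATE DATABASE WebCenter
ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WebCenter_Data.MDF'),
   (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\WebCenter_Log.LDF')
FOR ATTACH;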
I have created a C# Windows Forms application that can be run connected directly to SQL Server 2005 (publisher) for in-office users and to a SQL Express (subscriber) on a tablet PC for remote users. The server is set through configuration and the remote users sync using replication and it all works.
The issue I'm having is that I've found it necessary to install SQL Server management objects (SQLServer2005_XMO) on the clients who sit at desktops and never use replication or SMO. I believe this is because SMO has to be installed in the GAC and I can't just distribute the required dlls with my app.
Is there any way I can deploy this app to always connected users without installing SMO on their machines?
I have a Windows application that uses a small MySQL database as its data source. I would like to convert it to SQL Express. I would like to continue to manage the data from the Windows application, but would also like to make it available to the web using Visual Web Developer. I can create the new database in VWD, but how do I access the data from my application? I can use either ODBC or ADO.