Speeding Up Performance on a WAN Network
Jan 20, 2006. I have an Access application which is split into two parts, with the back end on the network. The thing is, performance is very slow and I want to try to optimise it...
I have designed an Access DB with various forms to display data populated by queries. This runs fairly efficiently when on the same system.
As soon as I try to split the DB into a front and back end and place the data part in a network folder location (across a WAN) the performance is incredibly slow.
The strange thing is that when testing the queries out on SQL server (just as a test), they run quickly and the data can't be more than a few kb in size.
As it was explained to me the other day:
"Access is a file-oriented database. There is no client-server code, so all data manipulation is done on the client side anyway. Access has to load the data across the network."
This sounds to me like Access would be loading all 20 MB of data across the WAN and processing it on the client end, rather than running the query at the back end first and only sending through 10 KB of data.
Is this true?
I have a problem trying to improve performance on a database with linked tables across a network. I found in MS Access Help that you can do the following (see bullet below), but I have no idea how to use the OpenRecordset method. Can anyone give me an idea how to code this, or update the linked table with the information given below.
*You can greatly enhance performance when opening the main database and opening tables and forms by forcing the linked database to remain open. To do this, create an empty table in the linked database, and link the table in the main database. Then use the OpenRecordset method to open the linked table. This prevents the Microsoft Jet database engine from repeatedly opening and closing the linked database and creating and deleting the associated .ldb file.
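For reference, the OpenRecordset technique in that Help bullet can be sketched in DAO as below. This is a minimal sketch, assuming the empty table has already been created in the back end and linked into the front end; the table name tblDummy and the procedure names are placeholders.

```vba
' Keep a module-level recordset open on the linked dummy table so
' the back-end .mdb (and its .ldb lock file) stays open all session.
Private rsKeepOpen As DAO.Recordset

Public Sub OpenBackEndConnection()
    ' tblDummy is an empty table created in the back end and
    ' linked into the front end purely for this purpose.
    Set rsKeepOpen = CurrentDb.OpenRecordset("tblDummy")
End Sub

Public Sub CloseBackEndConnection()
    ' Call this when the application shuts down.
    If Not rsKeepOpen Is Nothing Then
        rsKeepOpen.Close
        Set rsKeepOpen = Nothing
    End If
End Sub
```

A typical arrangement is to call OpenBackEndConnection from the startup form's Open event and CloseBackEndConnection when the application closes.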
I'd appreciate your help.
Thanks in advance,
JYMALY
Guys,
I have built a database with various linked tables and spreadsheets, and have written a query that accesses a linked table and spreadsheet. The query populates fields in a subform each time a different record is selected.
But this query is slowing down my database, as there is a few-second delay when changing records. I know it is this query that is affecting the speed, because before I wrote the query the crossover between records was instantaneous.
The query only pulls forward six cells of data: from two linked tables via an ODBC connection, and from one refreshable spreadsheet via SQL.
Any thoughts as to how I can speed my database query up?
My Access DB has two Number fields and five Text fields (each with a field size of 200). They contain text. The DB is organised year-wise. I have now crossed 15,000 records and find that the search is very slow. The search is done on all these text fields for each word typed. Someone suggested that I index the fields to speed up the search, but when I tried to change the indexes to Yes, it didn't accept. Is there any other way to speed up the search?
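For what it's worth, an index can also be created in code with Access DDL; the table and field names below are placeholders.

```vba
Public Sub AddSearchIndex()
    ' Hedged sketch: create a single-field index on one of the
    ' text fields. tblRecords and Field1 are placeholder names.
    CurrentDb.Execute _
        "CREATE INDEX idxField1 ON tblRecords (Field1)", dbFailOnError
End Sub
```

Bear in mind that an index only speeds up searches Jet can restrict with it: a Like filter with a leading wildcard (Like "*word*") still has to scan every row, so indexing alone may not help a word-anywhere search.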
As an alternative, I created an append query to append the records of the year I select in my form to another table, so that I can search that table instead of the main table which contains all years' records. But the append query is not appending records.
Any solution would be appreciated
I have observed that if you refresh the paths of linked tables from an external database (the back end), and the back end is currently in use by someone else on the network (a LAN in my case), it takes a lot longer than when no one else is using it. Is there any way to get around this?
I have created a form to summarize sales data for the past two years. The form uses a tab control with 12 tabs (By Region, Country, State, Sales Rep, etc.). Each tab has its own subform linked to its own query (each query has two subqueries so that I can get the percent difference). The subforms also have total fields at the bottom, in the footer. The sales data comes out of two different tables so I can compare this year to last year. My problem is speed: it takes about 35 seconds for the form to open. Is there a smarter way to do this?
Hi all,
Through years of hard work I have created a fully functioning double-entry accounts system with epos and inventory management capabilities for my retail business.
Each transaction is stored once - either in tblPurchaseInvoices, tblSalesOrders, tblPayments, tblReceipts or tblJournalEntries.
The double-entry part is created automatically by queries, as is all the VAT/sales tax information, producing: qryPI_VAT, qrySO_Vat, qryPaymentVat, qryReceiptVat, qryTradeDetors, qryUnpaidInvoices.
The audit trail then combines all of these queries (a total of 11!) in a union query. This is obviously a very slow process and currently produces 63,354 records. Searching for information in this list is a nightmare, as are calculations!
Can anyone point me in the right direction for creating a more efficient audit trail? I would have thought one transaction table would be the way to go, but I can't see how it could be done.
Thanks in advance for any help.
Cheers,
Chris
Hi,
Could having two lookup queries (one on a form and the other on a table), both of which take info from the same field and store it in a different field on a different table, slow down the DB?
:)
Thanks,
B
Case 1:
Link the SQL Server tables to MS Access (an .mdb file), with 50 concurrent users accessing the same file on a network share drive.
Case 2:
Create an .adp file, with 50 concurrent users accessing the same file on a network share drive.
I know that if each client has their own copy on their local machine, there will not be any problem. But if we want to put only one file on the shared drive, and either the client or server computer (or the connection between them) fails during a transaction or another operation that writes to the database file, which case copes better?
Hi everyone
I'm starting a year-long university project which will be written in C#. The client app will run on a LAN, with 35-40 users accessing the database concurrently, connecting to a database held on the server.
Could anyone tell me if Access can withstand 35-40 users at the same time? I found out that the maximum number of users is 255, but what kind of performance impact (if any) will there be with 35-40?
I've searched all over the Internet for this, but I can only find references to using Access behind a website, which isn't quite what I need. Any help would be great.
Thanks.
Hi all,
I'm hoping someone can offer me some advice on performance for an FE/BE database that will eventually be accessed by up to 60 people during the same day (usually only 3-4 people searching/writing at the same time).
The system is to process complaints at a call centre, built from the ground up and my first true Access DB application. I have noticed extremely slow times in loading forms, which is a pain considering we are a performance-oriented workplace.
The FE is on the desktop, the BE on the server. Currently it is taking up to 10 seconds to open a form (even when the form contains virtually no information looked up from tables).
Could anyone offer any tips on how to improve performance?
Thanks
Robert :)
Hi all,
I work in a medium-sized (80-100 people) Government contact centre. As some people may know from previous posts, for several months on and off I have been developing a database which could best be described as a 'ticket of work' system for many of our transaction channels.
I am currently rolling out the 'Beta' version of the database, and am noticing some slowdown in performance.
The database is Access 2000, with the FE linking to two BEs (writing to one back end 90% of the time) sitting on one of the networked drives (the FE is local).
I have addressed some of the issues that relate to performance (such as persistent locking of the BE) and am seeking some further advice. Unfortunately, as our IS department does not support the use of Access, I am unable to seek advice at work.
1. Does anyone know of any tools or methods for measuring database performance?
I am particularly interested in response times and how these are impacted by traffic/load. It would be particularly beneficial if there were a tool available that recorded the data automatically (e.g. into an Excel worksheet) for later analysis.
2. A question re: code efficiency. I am primarily using ADO to open and manipulate recordsets. This may be a silly question, but being self-taught I have missed lots of the obvious stuff along the way. With ADO, I can either open an entire table, or use a SQL SELECT to open only a specific record. Do the two methods differ greatly with respect to performance (especially from the perspective of other users)?
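For what it's worth, the two approaches can be sketched like this; the table and field names (tblTickets, TicketID) are placeholders.

```vba
Public Sub CompareOpenStyles()
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset

    ' Approach 1: open the whole table. The engine may touch far
    ' more data (and hold locks longer) than you actually need.
    rs.Open "tblTickets", CurrentProject.Connection, _
            adOpenKeyset, adLockOptimistic, adCmdTable
    rs.Close

    ' Approach 2: restrict the rows in the SELECT, so only the
    ' matching record(s) have to be materialised.
    rs.Open "SELECT * FROM tblTickets WHERE TicketID = 1234", _
            CurrentProject.Connection, _
            adOpenKeyset, adLockOptimistic, adCmdText
    rs.Close

    Set rs = Nothing
End Sub
```

With an index on the filtered field, the SELECT version is generally the cheaper of the two, particularly as experienced by other users sharing the back end.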
Sorry about the long post, and thank you in advance for any advice you can offer!
Cheers,
Rob
I'd like to know whether other users have had performance problems with different versions of Access in a single environment.
In my office, two users are on A2007, and the rest of us are on A2003 or earlier. After some nasty conflicts in the (un-split) database when A2007 and A2003 users were in the same file, I split the database. Now the A2007 and A2003 people have individual front ends. Other than a garbled LDB file, I haven't had any other conflicts between the two programs.
However, I have had some massively irritating problems ever since I split the database.
1) The Design View on my front end runs very, very slowly (like, five beats after every action to process).
2) If I try to edit one particular, very simple form, it will not return to Form view... it is stuck in Design view. (That sounds like the form has been corrupted.)
3) The auto-save doesn't seem to be working... Access (2003) crashed on me yesterday and I lost an hour's work on a new form.
I've tried some of the database construction suggestions from other posts to speed up performance (forms based on queries, record locking, short paths, etc.) and have seen some improvement overall, but none of these should affect Design view. USING the database tends to be speedy-quick.
So, did our dalliance with A2007 mess up my database? Does any one else have experience with this? Or are these symptoms common with a slow network connection (the back end is on a server, front ends are on individual desktops)?
Thanks!
Bear with me on this one as the query looks a mouthful although it's fairly simple.
I started with the following query, which was working very quickly, bringing back results almost instantaneously. Essentially, it is a number of nested SELECT statements bringing back data and joining on the first table, Structure, to group and filter the results. I had to do it this way as there are no distinct relationships between the zarinvreg, zarageddebt, or baddebt tables (not 100%, anyway).
Attached on SQL_Ok.txt
I then had to add some extra joins in, and all of a sudden the query has slowed to 10 seconds. There's nothing particularly heavy about these extra joins, but they have a couple of WHERE clauses in them. I tried indexing all the fields in every table in the DB and that didn't help at all.
Attached on SQL_SLOW.txt
Any ideas on how to improve this, some things to try, or why there is such a massive delay in processing? The same query is pretty fast on SQL Server, though... just not in Access.
We're going live with a database today, and while running through some testing, some of the forms seem to freeze. It's only happened a couple of times, but my question is: what is the best way to distribute it?
At the moment it's just in a location on the network and the users in the team access it directly. Can anyone give me any suggestions? My neck is on the line here...
I have searched this forum for other threads like this, and the DB examples page, but didn't find anything. I have made a performance eval DB in MSA 2003. It works fine except for one part. I need to score the individual on about 20 different criteria, each ranging from 1-5. I am having problems getting all the entries to sum when I run the report. Should I use combo boxes, check boxes, radio buttons, or what? :confused: Then how do I get the individual scores to sum up when I am finished putting them in and run the report to print it? I have looked around the net for a Performance Evaluation template to see how it is done, but couldn't find one anywhere. Microsoft doesn't have one in their list of templates either. If anyone knows where to find one, I'd appreciate that too. TIA for your help! I appreciate it.
James
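One common approach to a problem like the one above, sketched under the assumption that the twenty criteria are normalized into a single scores table (tblScores with EvalID and Score, both placeholder names) rather than twenty separate fields:

```vba
Public Sub ShowEvalTotals()
    ' Hedged sketch: with one row per criterion in tblScores, a
    ' totals query sums each evaluation in a single pass. All
    ' table and field names here are placeholders.
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT EvalID, Sum(Score) AS TotalScore " & _
        "FROM tblScores GROUP BY EvalID")
    Do While Not rs.EOF
        Debug.Print rs!EvalID, rs!TotalScore
        rs.MoveNext
    Loop
    rs.Close
End Sub
```

Alternatively, with the report bound to the scores table and grouped on EvalID, a text box in the group footer with ControlSource =Sum([Score]) prints the total for each evaluation.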
Hi,
I'm developing a project database. A normal project will need 10,000 records in the biggest table. Does this affect performance? I mean, when they have done 8 projects, there will be something like 80,000 records in one table. Is this too much? Does this influence performance very much?
The database is going to be placed on a SQL Server.
Thanks for your reply, since this is a very important matter!!!
:confused:
I have split my database application, which was approaching 20 MB in size, into a front end (approx. 8 MB) with linked tables to a back-end database (approx. 12 MB).
The network is 100 Mb Ethernet.
However, since doing this, end users have noticed that scrolling through records, and especially running reports, takes significantly longer, sometimes 3-4x longer. I understood that splitting the DB would have a beneficial effect from a development/application 'release' point of view, and that maybe if I were to create an MDE file of the front end, I could also benefit from reduced network traffic, given that end users would be using a compiled executable, etc.
With the speed issues I have been experiencing, I have had no choice but to roll back to the original application format with everything in the one MDB file.
Has anyone else had to do the same, given similar speed degradation issues?
Thanks
Guido
Last week, my Access 97 DB, with back end and front end both residing on a network as they have been for the last 2 years, decided to start running at 1/4 of its usual speed. The DB is used by 16 users, with roughly 5 on at any given time (operating system: XP). Both front end and back end were compacted without any change in performance. No changes in programming or number of records were introduced of late. Checking with our IT department indicated that the performance of the network and drive have not changed and are up to snuff. I moved a copy of the FE and BE to my hard drive and found that performance returned to normal speed, although I am not sure if it always ran faster on a PC. Any experience with this irregularity, and options to check, would be greatly appreciated.
Hello,
I have a few questions after running the Access Performance Analyzer. Are these ideas good, or just generic recommendations? I don't know if I should act on all of them.
Does anyone know whether I should do all these things, and how I should go about it?
http://img236.imageshack.us/img236/4755/perf1ff3.png
http://img169.imageshack.us/img169/7370/perf2ay4.png
http://img236.imageshack.us/img236/5223/perf3bm4.png
Hi All,
A DB is split (FE/BE), with several FE users and the BE sitting on a network.
FE: Access 2003 (runtime).
The subform has its Recordset Type set to Snapshot.
Which of the following scenarios will perform fastest?
Scenario 1:
The FE queries a linked table and displays the results on a subform (datasheet format).
Scenario 2:
The BE table is copied to the FE (as a new table), the query is run against the new table, and the results are displayed on a subform (datasheet format).
The reason for this question is to attempt to reduce network traffic and further improve the speed of a split database.
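Scenario 2 can be sketched like this; the table names (tblOrders, tblOrdersLocal) are placeholders.

```vba
Public Sub RefreshLocalCopy()
    ' Hedged sketch: refresh a local copy of the back-end table
    ' before opening the form, so the subform query runs entirely
    ' against the front end.
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Drop the old local copy, ignoring the error if it does not
    ' exist yet.
    On Error Resume Next
    db.Execute "DROP TABLE tblOrdersLocal"
    On Error GoTo 0

    ' Pull the rows across the network once, into a local table.
    db.Execute "SELECT * INTO tblOrdersLocal FROM tblOrders"
End Sub
```

Whether Scenario 2 wins depends on the table size and how often the form requeries: it trades many small network round trips for one bulk transfer, which tends to pay off for a read-only (Snapshot) subform over a slow link.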
Garry
Hello,
I am attempting to convert some Access XP MDB files to Access 2007, and I am experiencing some horrendous performance problems. We are a bit unusual in that we use Access as a RAD tool, and almost all of our data resides in Oracle tables linked to by our Access front ends.
The issue I am experiencing concerns opening tables in the UI. If I open a table by double-clicking it, a datasheet view is displayed (rather quickly) as usual, filling the page with information. The UI is responsive while Access populates its recordset in the background (e.g. the record count on the record navigation bar is not displayed). As soon as I click in a column defined as a date, the UI hangs, network traffic and CPU usage rise dramatically, and the Access window displays "not responding". It appears that Access is attempting to retrieve all the rows in the table. After 3 minutes or so, the UI becomes responsive, CPU and network traffic return to normal, and the record count is displayed (roughly 900K records).
This is "bad" (and also not what Office XP did). Please can anyone help?
Thanks
Vince Tabor
Hi there,
I am having huge performance issues with a FE/BE split. As a background, I have the following Table layout.
http://img357.imageshack.us/img357/7374/diagramjv9.th.gif (http://img357.imageshack.us/my.php?image=diagramjv9.gif)
In basic terms, the contract table has basic contract info, resources can be assigned to a contract (via the Assign table) and we feed in exchange rate info also from another table.
Everything is so slow as soon as I put the BE on a share drive. I have done everything I can think of: I have changed to tLookups, I have changed the Auto options as recommended, analysis shows no issues, and I have compacted the DB. It is unusable!
I notice that on one form it takes 1 second to calculate a field. The field basically uses a tSum to find the total cost of a contract (it looks up Assign to sum all the attached resources). This seems to slow it down, but it does not explain the huge time it takes to load.
I am considering that the issue may be down to the share where it is located having too high a latency (it is in another country, and it feels slow just browsing through it).
Any other general ideas, or do you need more info?
Thank you.
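One change that often helps with per-record domain sums like the one described above, sketched with guessed table and field names (Contract, Assign, ContractID, Cost): compute all contract totals in a single aggregate query and join it into the form's record source, so the totals are calculated once per requery instead of once per row.

```vba
Public Sub CreateContractCostQuery()
    ' Hedged sketch with guessed names: one aggregate query totals
    ' every contract's Assign rows in a single pass.
    CurrentDb.CreateQueryDef "qryContractCost", _
        "SELECT ContractID, Sum(Cost) AS TotalCost " & _
        "FROM Assign GROUP BY ContractID"
    ' The form's record source can then join the totals in,
    ' replacing the per-record tSum:
    '   SELECT c.*, q.TotalCost
    '   FROM Contract AS c
    '   LEFT JOIN qryContractCost AS q
    '     ON c.ContractID = q.ContractID;
End Sub
```

Over a high-latency share this matters, because each domain-lookup call is its own round trip to the back end.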
Hello there,
I've lately come across some posts that condemn using lookup fields in tables. But how bad is this really? I mean, it's quite a nifty feature, and it'd be a real pity if it hogs resources, forcing us to leave it alone for performance reasons.
I'd appreciate it if any of you guys/gals with more experience on the topic would share some, so I can take a better stance for future development.
Regards,
Jaime
I have run across a performance issue and am looking to see if anyone knows why it is happening. First of all, I am using an ODBC connection to an AS400 system, querying a table with roughly 1,000,000 records. If I hard-code the criterion for a particular field, the query completes in less than 5 seconds. If I set the criterion to be a field in a local table, the same query runs for over a minute and a half. Any ideas?
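One common workaround, sketched below: read the criterion out of the local table first, then embed it as a literal so the filter can be sent to the AS/400 rather than evaluated locally against every fetched row. The names (tblLocalCriteria, CritValue, RemoteTable, KeyField) are placeholders.

```vba
Public Sub QueryWithLocalCriterion()
    ' Hedged sketch: look up the local value, then build the SQL
    ' string with it inlined, so the remote server can apply the
    ' filter itself.
    Dim crit As Variant
    Dim rs As DAO.Recordset

    crit = DLookup("CritValue", "tblLocalCriteria")

    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM RemoteTable WHERE KeyField = " & crit)

    Debug.Print rs.RecordCount
    rs.Close
End Sub
```

If the local value is text it would need quoting, and for heavier queries a pass-through query, which hands the whole statement to the AS/400 untouched, is the usual next step.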