Access Performance Analyzer
Jan 17, 2007
Hello,
I have a few questions after running the Access Performance Analyzer. Are its suggestions genuinely useful, or just generic recommendations? I'm not sure whether I should act on all of them.
Does anyone know whether I should do all of these things, and if so, how to go about it?
http://img236.imageshack.us/img236/4755/perf1ff3.png
http://img169.imageshack.us/img169/7370/perf2ay4.png
http://img236.imageshack.us/img236/5223/perf3bm4.png
View Replies
Dec 22, 2012
When I run the analyzer on all object types, it recommends changing the data type of the "zip" (zip code) field to Long Integer to:
"benefit that table and potentially other objects in my database"
The field is currently set to Text, and the same field in a separate table has the same setting, yet the analyzer makes no recommendation for that table.
Additionally, I don't seem to have a "Long Integer" option for the field's data type.
jeds - using Access 2010
View 5 Replies
View Related
Jun 8, 2005
Here is a program that queries an Access database using a SQL Query Analyzer-like tool.
Very cool indeed.
Hooks
View 6 Replies
View Related
Jul 24, 2005
Hi everyone
I'm starting a year-long university project which will be written in C#. The client app will run on a LAN, with 35-40 users connecting concurrently to a database held on the server.
Could anyone tell me whether Access can withstand 35-40 simultaneous users? I found that the maximum number of users is 255, but what kind of performance impact (if any) will there be with 35-40?
I've searched all over the Internet for this, but I can only find references to using Access behind a website, which isn't quite what I need. Any help would be great.
Thanks.
View 7 Replies
View Related
Oct 27, 2007
Hello,
I am attempting to convert some Access XP MDB files to Access 2007 and am experiencing some horrendous performance problems. We are a bit unusual in that we use Access as a RAD tool, and almost all of our data resides in Oracle tables linked from our Access front ends.
The issue concerns opening tables in the UI. If I open a table by double-clicking it, a datasheet view is displayed fairly quickly, as usual, filling the page with information. The UI stays responsive while Access populates its recordset in the background (e.g. the record count on the record navigation bar is not yet displayed). As soon as I click in a column defined as a date, the UI hangs, network traffic and CPU usage rise dramatically, and the Access window shows "Not Responding". It appears that Access is attempting to retrieve all the rows in the table. After three minutes or so, the UI becomes responsive again, CPU and network traffic return to normal, and the record count is displayed (roughly 900K records).
This is bad (and also not what Office XP did). Can anyone help?
Thanks
Vince Tabor
View 1 Replies
View Related
Oct 29, 2006
My situation is as follows:
I have an Access database in which the main tables are built and stored.
I also have two Access modules (separate files) that access information in the main tables via linked tables and process it in queries.
Question: will linking to tables in another database make the modules' performance worse? If so, by how much?
View 1 Replies
View Related
May 8, 2007
Hi all,
I am programming a database that contains approximately 3700 records stored in a relational table structure.
When I split the database and moved the tables to the server side, my performance worsened drastically: a query that runs in under 5 seconds when the tables are on the client side takes over 45 seconds in the split database (with tables on the server side).
I've tried a variety of solutions (including changing the Subdatasheet Name property to [None] on the tables and creating an MDE file from the MDB file), but they haven't helped much.
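For reference, the Subdatasheet Name change mentioned above can also be applied to every table in code (typically run inside the back-end file); this is only a rough sketch:
Dim db As DAO.Database
Dim tdf As DAO.TableDef
Set db = CurrentDb
For Each tdf In db.TableDefs
    If Left(tdf.Name, 4) <> "MSys" Then               ' skip the system tables
        On Error Resume Next
        tdf.Properties("SubdatasheetName") = "[None]"
        If Err.Number <> 0 Then                       ' property not created yet
            Err.Clear
            tdf.Properties.Append tdf.CreateProperty("SubdatasheetName", dbText, "[None]")
        End If
        On Error GoTo 0
    End If
Next tdf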
Any suggestions about how to speed performance are greatly appreciated! Thanks so much!
View 2 Replies
View Related
Jul 4, 2005
We are encountering severe performance problems when running our access database on our Network.
Symptom:
There is a huge drop-off in performance when the second and subsequent users refresh links to the chosen back-end database, i.e. whenever the back end is already open through another user's front end.
It can take up to 10 minutes to refresh the links to the 120 tables in the back-end MDB database.
The system is designed as a split front-end/back-end Access database. The back-end data MDB is located on a Windows 2000 server.
The first time a user logs on, they are required to locate and re-link the tables from the back-end database that holds data for the current year. If they wish to work on another year, they re-link the tables from the relevant back-end database.
When running locally, the front end takes between 5 and 10 seconds to refresh ALL the links to around 130 tables.
When the back end is on a network drive the process takes about twice as long, but is still quick.
If another user has the file open via their own (locally held) front end, then the refresh-link process takes between 5 and 10 seconds PER TABLE!
This means it takes up to 10 minutes to complete the re-link.
The behaviour is the same whether we use the Linked Table Manager or refresh the links in code (see the sketch at the end of this post).
As soon as the second user logs out of the database, the refresh-link process speeds up.
The behaviour has been observed on a variety of servers and customer networks.
General data access is also much slower once a second user is in, but just about bearable. The data files are small (few records) and occupy 7 MB on disk.
Development platforms: Windows XP / 2000 clients; a Windows 2000 Server hosts the back-end database.
The front end is Access XP (MDB or MDE file) with all the latest service packs and security updates installed.
We have experienced this problem with a number of different systems running in various environments, but it wasn't a problem with the old Access 1 database.
HELP! Access should be able to support 10-20 users; we can barely run 2!
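For reference, the code-based relink mentioned above is essentially a loop like the one below; the back-end path is a placeholder, and tdf.RefreshLink is the step that slows to a crawl once a second user has the back end open.
Dim db As DAO.Database
Dim tdf As DAO.TableDef
Dim strBackEnd As String
strBackEnd = "\\Server\Data\Year2005_be.mdb"          ' placeholder: back end chosen by the user
Set db = CurrentDb
For Each tdf In db.TableDefs
    If Len(tdf.Connect) > 0 Then                      ' only the attached (linked) tables
        tdf.Connect = ";DATABASE=" & strBackEnd
        tdf.RefreshLink
    End If
Next tdf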
View 4 Replies
View Related
Nov 2, 2005
Hi folks!
I've been experiencing a very frustrating problem recently.
One of my databases crashes randomly, and I'd like to know if any of you have tips to help me.
The database is split into a front end and a back end, both on the server. I know keeping the front end on each client PC would improve performance, but due to the high number of users (about 60 to 80) and the need for frequent updates, installing the front end on each machine is not an option for now.
Users are not all directly connected to the server. Those who are (in building A) have no problem. Those who aren't (building B) see long lags on opening (about 30 seconds) and on just about anything that runs a heavy query, most often leading to a crash: Access displays a message saying that the database needs to be repaired and restarts.
I tried several things to improve my queries, like eliminating most of the GROUP BY and UNIQUE clauses and avoiding the use of functions.
I have a lot of conditional formatting in the forms, but I don't think this is the problem, since it is executed on the local machine (as far as I know).
My theory is that the lags cause some calls to the DB to time out. I just don't know how to fix that. Is there an option to increase the timeout delay?
My other theory is that the problem is caused by packets being dropped during the connection to the DB. I tried downloading large files (100 MB to 250 MB) from the server several times and never had a problem. Although the connection is quite slow, the ping is very low, below 10-20 ms. Is there a better way to test a network connection, ideally without installing software (I have limited rights on the machine)?
Would using replicas be a solution? What are the pros and cons? I've never used them and know very little about them. My idea would be to put a back end on both servers, but I guess the data would not be as live.
BTW, I am using Access 2002 (2000 file format) on an XP SP2 PC.
Thanks to all of you patient enough to read all of this! I hope some of you have answers, tips, or tricks!
View 10 Replies
View Related
Jun 29, 2014
I have an Access 2010 database to which I will be adding about 25,000 records a day, so it will get pretty big fast: roughly 9 million records a year. Is this too big for Access? Or is it more an issue of file size? I know Access 2010 can go up to 2 GB. I am interested in how I can maximize performance and keep the database from bogging down. The data is as follows: one table that will really grow in size; the other tables will stay small.
Id .. Autonumber (primary key)
When .. datetime
Radio .. integer (indexed .. w duplicates)
Group .. integer (indexed .. w duplicates)
Type .. text 4
TransType .. text 2
I am considering a purge of old records if needed, e.g. keeping only the most recent 5 million records or some other reasonable number. In that case I would also consider numbering my records backwards (using a Long Integer assigned on import rather than an AutoNumber), i.e. record 1 would be the most recent one, and keeping only the last X of them for performance reasons. A sketch of such a purge is below.
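A rough sketch of such a purge (the table name and the 5-million cutoff are placeholders, and Id is assumed to be the numeric key):
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim lngCutoff As Long
Set db = CurrentDb
' Find the Id of the oldest record among the 5,000,000 newest ones...
Set rs = db.OpenRecordset( _
    "SELECT Min(Id) AS MinId FROM " & _
    "(SELECT TOP 5000000 Id FROM tblCalls ORDER BY Id DESC) AS Newest")
lngCutoff = rs!MinId
rs.Close
' ...then delete everything older than that and compact the file afterwards to reclaim space.
db.Execute "DELETE FROM tblCalls WHERE Id < " & lngCutoff, dbFailOnError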
View 8 Replies
View Related
Sep 10, 2006
Hi,
Could having two lookup queries, one on a form and the other on a table, both of which take information from the same field and store it in a different field in a different table, slow down the DB?
:)
Thanks,
B
View 4 Replies
View Related
Mar 28, 2007
Case 1: link the SQL Server tables into an MS Access MDB file; 50 concurrent users access the same file on a network share drive.
Case 2: create an ADP file; 50 concurrent users access the same file on a network share drive.
I know that if each client has their own copy on their local machine there will be no problem. But if we want to put only one file on the shared drive, then:
if either the client or the server computer (or the connection between them) fails during a transaction or another operation that writes to the database file,
which case copes with that better?
View 1 Replies
View Related
Jul 20, 2006
Hi all,
I'm hoping someone can offer me some advice on performance for an FE/BE database that will eventually be accessed by up to 60 people during the same day (usually only 3-4 people searching/writing at the same time).
The system processes complaints at a call centre; it was built from the ground up and is my first true Access DB application. I have noticed extremely slow form-loading times, which is a pain considering we are a performance-oriented workplace.
The FE is on the desktop and the BE is on the server. Currently it takes up to 10 seconds to open a form (even when the form contains virtually no information looked up from tables).
Could anyone offer any tips on how to improve performance?
Thanks
Robert :)
View 1 Replies
View Related
Jan 15, 2007
Hi all,
I work in a medium-sized (80-100 people) government contact centre. As some people may know from previous posts, for several months on and off I have been developing a database that could best be described as a 'ticket of work' system for many of our transaction channels.
I am currently rolling out the 'beta' version of the database and am noticing some slowdown in performance.
The database is Access 2K, with the F/E linked to two B/Es (writing to one back end 90% of the time) sitting on networked drives (the F/E is local).
I have addressed some of the performance-related issues (such as keeping a persistent lock on the B/E) and am seeking further advice. Unfortunately, as our I.S. department does not support the use of Access, I am unable to get advice at work.
1. Does anyone know of any tools or methods for measuring database performance?
I am particularly interested in response times and how they are affected by traffic/load. It would be particularly helpful if there were a tool that recorded the data automatically (e.g. in an Excel worksheet) for later analysis.
2. A question re: code efficiency. I primarily use ADO to open and manipulate recordsets. This may be a silly question, but being self-taught I have missed lots of the obvious stuff along the way. With ADO, I can either open an entire table or use a SQL SELECT to open only a specific record. Do the two methods differ greatly in performance (especially from the perspective of other users)? Both approaches are sketched below.
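A minimal ADO sketch of both approaches (table and field names are invented for the example):
Dim cn As ADODB.Connection
Dim rsAll As New ADODB.Recordset
Dim rsOne As New ADODB.Recordset
Set cn = CurrentProject.Connection
' Approach 1: open the whole table - the recordset's membership is every row.
rsAll.Open "tblTickets", cn, adOpenKeyset, adLockOptimistic, adCmdTableDirect
' Approach 2: ask only for the record(s) needed - the WHERE clause limits what
' has to be fetched, provided TicketID is indexed.
rsOne.Open "SELECT * FROM tblTickets WHERE TicketID = 123", _
    cn, adOpenKeyset, adLockOptimistic, adCmdText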
Sorry about the long post, and thank you in advance for any advice you can offer!
Cheers,
Rob
View 12 Replies
View Related
Sep 27, 2007
I'd like to know whether other users have had performance problems with different versions of Access in a single environment.
In my office, two users are on A2007 and the rest of us are on A2003 or earlier. After some nasty conflicts in the (un-split) database when A2007 and A2003 users were in the same file, I split the database, and the A2007 and A2003 people now have individual front ends. Apart from a garbled LDB file, I haven't had any further conflicts between the two versions.
However, I have had some massively irritating problems ever since I split the database.
1) Design View on my front end runs very, very slowly (it takes about five beats to process every action).
2) If I try to edit one particular, very simple form, it will not return to form view... it is stuck in Design view. (That sounds like the form has been corrupted.)
3) The auto-save doesn't seem to be working... Access (2003) crashed on me yesterday and I lost an hour's work on a new form.
I've tried some of the database construction suggestions to speed performance in other posts - forms based on queries, record locking, short paths, etc., and have seen some improvement in overall performance, but none of these should affect design view. USING the database tends to be speedy-quick.
So, did our dalliance with A2007 mess up my database? Does anyone else have experience with this? Or are these symptoms common with a slow network connection (the back end is on a server, the front ends are on individual desktops)?
Thanks!
View 4 Replies
View Related
Oct 2, 2006
Bear with me on this one as the query looks a mouthful although it's fairly simple.
I started with the following query, which was working very quickly and bringing back results almost instantaneously. Essentially, it is a number of nested SELECT statements bringing back data and joining on the first table, Structure, to group and filter the results. I had to do it this way as there are no distinct relationships between the zarinvreg, zarageddebt, and baddebt tables (not 100% anyway).
Attached as SQL_Ok.txt
I then had to add some extra joins and all of a sudden the query has slowed to 10 seconds. There's nothing particularly heavy about these extra joins, but they have a couple of WHERE clauses in them. I tried indexing all the fields in every table in the DB and that didn't help at all.
Attached as SQL_SLOW.txt
Any ideas on how to improve this, things to try, or why there is such a massive delay in processing? The same query is pretty fast on SQL Server, just not in Access.
View 1 Replies
View Related
Jan 17, 2005
We're going live with a database today, and while running through some testing, some of the forms seemed to freeze. It has only happened a couple of times, but my question is: what is the best way to distribute the database?
At the moment it just sits in a location on the network and the users in the team access it directly. Can anyone give me any suggestions? My neck is on the line here...
View 2 Replies
View Related
Oct 10, 2005
I have searched this forum for other threads like this, and the DB examples page, but didn't find anything. I have made a performance evaluation DB in MSA 2003. It works fine except for one part: I need to score the individual on about 20 different criteria, each ranging from 1-5, and I am having problems getting all the entries to sum when I run the report. Should I use combo boxes, check boxes, radio buttons or what? :confused: And how do I get the individual scores to sum up when I am finished entering them and run the report to print it? I have looked around the net for a performance evaluation template to see how it is done, but couldn't find one anywhere; Microsoft doesn't have one in their list of templates either. If anyone knows where to find one, I'd appreciate that too. TIA for your help! I appreciate it.
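One common approach, sketched below with assumed table and field names, is to store the 20 criterion scores in a normalized scores table rather than 20 separate columns; a totals query, or a =Sum([Score]) text box in the report's group footer, then does the adding.
' Assumed layout: tblScores(EvalID, Criterion, Score) holds one row per criterion (Score 1-5).
' The saved totals query sums the rows per evaluation; the report can be based on it.
CurrentDb.CreateQueryDef "qryEvalTotals", _
    "SELECT EvalID, Sum(Score) AS TotalScore " & _
    "FROM tblScores GROUP BY EvalID"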
James
View 3 Replies
View Related
Oct 24, 2005
Hi,
I'm developing a project database. A normal project will need 10,000 records in the biggest table. Does this affect performance? I mean, when they have done 8 projects, there will be around 80,000 records in one table. Is this too much? Does it hurt performance very much?
The database is going to be placed on a SQL Server.
Thanks for your reply, since this is a very important matter!
View 2 Replies
View Related
Nov 16, 2005
:confused:
I have split my database application, which was approaching 20 MB in size, into a front end (approx. 8 MB) with linked tables to a back-end database (approx. 12 MB).
The network is 100 Mb Ethernet.
However, since doing this, end users have noticed that scrolling through records, and especially running reports, takes significantly longer, sometimes 3-4x longer. I understood that splitting the DB would be beneficial from a development/application-release point of view, and that if I were to create an MDE file of the front end I might also benefit from reduced network traffic, given that end users would be running a compiled front end.
With the speed issues I have been experiencing, I have had no choice but to roll back to the original format with everything in the one MDB file.
Has anyone else had to do the same - given similar speed degradation issues?
Thanks
Guido
View 5 Replies
View Related
Jan 11, 2006
Last week my Access 97 DB, with back end and front end both residing on the network as they have been for the last two years, decided to start running at a quarter of its usual speed. The DB is used by 16 users, with roughly 5 on at any given time (operating system: XP). Both the front end and the back end were compacted without any change in performance. No changes in programming or in the number of records have been introduced lately. Checking with our IT department indicated that the performance of the network and the drive have not changed and are up to snuff. I moved a copy of the FE and BE to my hard drive and found performance returned to normal speed, although I am not sure whether it has always run faster on a local PC. Any experience with this irregularity, and suggestions of things to check, would be greatly appreciated.
View 2 Replies
View Related
Jan 20, 2006
I have an Access application which is split into two parts, with the back end on the network. The thing is, it's very slow performance-wise and I want to try to optimise it...
View 2 Replies
View Related
Mar 15, 2007
I have designed an Access DB with various forms to display data populated by queries. This runs fairly efficiently when everything is on one system.
As soon as I split the DB into a front and back end and place the data part in a network folder (across a WAN), performance is incredibly slow.
The strange thing is that when I try the queries out on SQL Server (just as a test), they run quickly, and the results can't be more than a few KB in size.
As it was explained to me the other day:
"Access is a file-oriented database. There is no client-server code, so all data manipulation is done on the client side anyway. Access has to load the data across the network."
This sounds to me like Access would be loading all 20 MB of data across the WAN and processing it on the client, rather than running the query at the back end first and only sending 10 KB of results across.
Is this true?
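For comparison: with a plain MDB back end on a file share there is no server process, so Jet does the query work on the client (indexes on the criteria fields reduce how much has to be pulled). When the data sits on a server engine such as SQL Server, a pass-through query hands the SQL to the server so only the result set travels back. A minimal DAO sketch, with placeholder names and connect string:
Dim db As DAO.Database
Dim qdf As DAO.QueryDef
Set db = CurrentDb
Set qdf = db.CreateQueryDef("qptSummary")             ' placeholder query name
' Placeholder ODBC connect string - adjust server, database and authentication.
qdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
qdf.SQL = "SELECT CustomerID, Sum(Amount) AS Total FROM Orders GROUP BY CustomerID"
qdf.ReturnsRecords = True                             ' the server runs the SQL; Access only receives rows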
View 5 Replies
View Related
Sep 23, 2007
Hi All,
A DB is split (FE/BE), with several FE users and the BE sitting on a network share.
The FE is Access 2003 (runtime).
The subform has its Recordset Type set to Snapshot.
Which of the following scenarios will perform faster?
Scenario 1:
The FE queries a linked table and displays the results on a subform (datasheet format).
Scenario 2:
The BE table is copied to the FE (as a new local table), the query is run against the new table, and the results are displayed on a subform (datasheet format).
The reason for the question is to try to reduce network traffic and further improve the speed of the split database. A rough sketch of Scenario 2 is below.
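With placeholder table names, Scenario 2 amounts to something like:
' Drop any previous local copy, then pull the back-end table into a local table.
On Error Resume Next
DoCmd.DeleteObject acTable, "tblLocalCopy"
On Error GoTo 0
CurrentDb.Execute "SELECT * INTO tblLocalCopy FROM tblLinkedData", dbFailOnError
' The snapshot subform's query is then pointed at tblLocalCopy instead of the linked table.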
Garry
View 5 Replies
View Related
Apr 8, 2008
Hi there,
I am having huge performance issues with an FE/BE split. As background, I have the following table layout.
http://img357.imageshack.us/img357/7374/diagramjv9.th.gif (http://img357.imageshack.us/my.php?image=diagramjv9.gif)
In basic terms, the Contract table holds basic contract info, resources can be assigned to a contract (via the Assign table), and exchange-rate info is fed in from another table.
Everything becomes very slow as soon as I put the BE on a shared drive. I have done everything I can think of: I have changed to tLookups, changed the Auto options as recommended, the analyzer shows no issues, and I have compacted the DB. It is unusable!
I notice that on one form it takes a second to calculate one field. The field basically uses a tSum to find the total cost of a contract (it looks up Assign and sums all the attached resources). This seems to slow things down, but it does not explain the huge time the form takes to load (a totals-query alternative is sketched at the end of this post).
I suspect the issue may be that the share where the BE is located has too high a latency (it is in another country, and it feels slow just browsing it).
Any other general ideas, or do you need more info?
Thank you.
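Regarding the tSum field mentioned above: the per-contract total can often be computed once in a saved totals query and joined into the form's record source, rather than looked up row by row. A rough sketch with assumed table and field names:
' Assumed fields: Assign(ContractID, Cost). The saved query sums resource cost per
' contract once; the contract form or its query can then join to it on ContractID.
CurrentDb.CreateQueryDef "qryContractCost", _
    "SELECT ContractID, Sum(Cost) AS TotalCost " & _
    "FROM Assign GROUP BY ContractID"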
View 6 Replies
View Related
May 28, 2007
Hello there,
I've recently come across some posts that condemn using lookup fields in tables. But how bad is this really? It's quite a nifty feature, and it would be a shame if it hogs resources and has to be left alone for performance reasons.
I'd appreciate it if any of you with more experience on the topic would share some, so I can take a better-informed stance for future development.
Regards,
Jaime
View 10 Replies
View Related