How To Speed Up/Improve Performance For International DB Users
Mar 7, 2006
Hello,
First time poster here so I hope this doesn’t sound too remedial. Here’s my situation…
I work for a large industrial company that has locations throughout the world. We have a DB that tracks product concepts and ideas and associated metrics for those ideas. The DB resides on a file server in North America (Raleigh, North Carolina to be exact). North American users have no trouble with the performance of the DB. It takes a moment to open (several seconds), but once it has opened there is virtually no lag time to add or edit records, run reports, view graphs, etc. However, users in Germany and the Netherlands encounter substantial lag time not only in opening, but also in updating and entering records, running reports, and viewing graphs. This is true even after they have waited for the DB to open.
The size of the DB is only around 2MB so I don’t think overall size is the issue.
There are probably no more than 3 or 4 users in the DB at the same time, with most occasions being a single user, so I don't think we are having a multiple-user issue.
The DB is self-contained – no references to external data or splitting of any kind.
So my questions are:
1. Do you think the poor performance is a function of our network or of Access or the DB design?
2. If it is the network, is there anything that I can do in Access to help get around the hardware/network issues?
I have a (quite) specific question, but I think it covers something I simply cannot answer.
I have three UPDATE queries running on linked tables in Microsoft Access (2000/XP).
My main data table (the one to be updated) has almost 1 million records.
My three information tables ALL have primary keys (which are used to link to the main table) and vary in size.
I have attached the three UPDATE queries plus descriptions of the field names used.
Table            Records    Time
Main DataTable   900,000    –
Mask nomenk      130        2 hours
Mask media       900        15 minutes
Mask brand       4,000      ?????
Query A:
UPDATE [Main DataTable] AS z
INNER JOIN [Mask nomenk] AS mn
    ON (z.nomCode1 = mn.nomCode1) AND (z.nomCode2 = mn.nomCode2)
   AND (z.nomCode3 = mn.nomCode3) AND (z.nomCode4 = mn.nomCode4)
SET z.NomenkMask1 = mn!NomenkMask1;

Query B:
UPDATE [Main DataTable] AS z
INNER JOIN [Mask media] AS mm
    ON (z.couCode = mm.couCode) AND (z.nomCode1 = mm.nomCode1)
   AND (z.pubCode = mm.pubCode)
SET z.MediaMask1 = mm!MediaMask1;

Query C:
UPDATE [Main DataTable] AS z
INNER JOIN [Mask brand] AS mb
    ON (z.couCode = mb.couCode) AND (z.nomCode1 = mb.nomCode1)
   AND (z.brCode1 = mb.brCode1) AND (z.brCode2 = mb.brCode2)
SET z.BrandMask1 = mb!BrandMask1;
My problem, quite simply, is the speed of running these queries. I know that Query B is the quickest, with Query A a distant second (I could not even complete the run of Query C and killed it after 6 hours).
What I need to know is WHY Query C is so much slower than Query B, when the only real difference I can see between them is that Query C has one extra field to join on.
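One thing worth checking before blaming the extra field: whether the Main DataTable side of each join is indexed. The mask tables' primary keys cover their side, but if couCode/nomCode1/brCode1/brCode2 on the 900,000-row table are unindexed, Jet has to scan it for every match. A sketch of the DDL (the index names are made up, and each CREATE INDEX has to run as its own query in Access):

CREATE INDEX idxNomenkJoin ON [Main DataTable] (nomCode1, nomCode2, nomCode3, nomCode4);
CREATE INDEX idxMediaJoin  ON [Main DataTable] (couCode, nomCode1, pubCode);
CREATE INDEX idxBrandJoin  ON [Main DataTable] (couCode, nomCode1, brCode1, brCode2);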
I am exporting a large query to a delimited text file. I'm finding that it takes more than 5 minutes just to get the Export Text Wizard to load, and I'm guessing that's because Access is running the query as it loads the wizard.
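A workaround that may help (a sketch, not gospel: qryExport, tmpExport, and the file path are all made-up names) is to materialise the query into a scratch table first, so the wizard and the export both open against static local data:

' run the query once into a scratch table, then export the table
CurrentDb.Execute "SELECT qryExport.* INTO tmpExport FROM qryExport", dbFailOnError
DoCmd.TransferText acExportDelim, , "tmpExport", "C:\Temp\Export.txt", True
CurrentDb.Execute "DROP TABLE tmpExport", dbFailOnError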
A DB is split (FE/BE), with several FE users and the BE sitting on a network share. The FE is Access 2003 (runtime). The subform has its Recordset Type set to Snapshot.
Which of the following scenarios will perform fastest?
Scenario 1: the FE queries a linked table and displays the results on a subform (Datasheet format).
Scenario 2: the BE table is copied to the FE (new table) and the query is run against the new table, which displays the results on a subform (Datasheet format).
The reason for this question is to try to reduce network traffic and further improve the speed of a split database.
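Scenario 2 can be scripted in one line; whether it wins depends on how much of the table the query actually returns, since pulling the whole table across the wire may cost more than letting Jet fetch only the rows it needs. A sketch (tblOrders and tblOrdersLocal are hypothetical names):

' copy the linked BE table into a local FE table, then query the copy
CurrentDb.Execute "SELECT tblOrders.* INTO tblOrdersLocal FROM tblOrders", dbFailOnError

Worth noting: with the subform's recordset type set to Snapshot, Jet already pulls the full result set to the client, so the local copy mainly pays off when the same data is re-queried repeatedly.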
I have a database that multiple users have access to. We all have new PCs of exactly the same spec. We all have the same Oracle ODBC driver, which links to tables in an Oracle database on a Sun F15K server. The linked tables are set up using my Oracle account with a saved password. We are using the same network. We are all using Access 97.
Queries that are run in this database are returning results in just a couple of seconds for other users, and 20-30 seconds for me.
Can anyone suggest any factors that may be causing this discrepancy? Or things that I can run/do to improve performance?
I've discovered I have a problem sending text between an Access Database and ASP page.
The database contains Turkish characters (e.g. ı, the i without a dot) and the output to a web page is fine when run on a server that has Turkey for its international settings. Interestingly, the database shows the correct data regardless of environment... it's just the written page that screws it up.
However, when the same data and pages are used on an "English" server, the Turkish characters are lost and modified to their closest Western cousins.
I'm not sure whether this is exactly an Access or an ASP problem... I think it's the latter, but maybe someone reading this forum has some ideas too. Since I need to use this data on a server hosted in the UK, is there any code I can add which will fool the server into reading and writing the Turkish characters correctly?
Obviously, I could set up a function on the page to convert the characters, but that seems like a waste of system resources.
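If it helps anyone with the same problem: classic ASP lets you pin the code page and charset per page, which (as far as I understand it) is exactly what differs between the Turkish and UK servers. A sketch using 1254, the Windows Turkish code page:

<%@ Language=VBScript CodePage=1254 %>
<%
    ' read/write strings using the Turkish code page
    Session.CodePage = 1254
    ' tell the browser how to decode the bytes we send
    Response.Charset = "windows-1254"
%>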
I have 2 queries to check whether there is any "double quote" character in any of the 12 month columns in a month table of 125K records. I use 2 queries since the maximum number of criteria in a query is only 8, so I use 8 criteria in the 1st query and 4 criteria to check the remaining months in the 2nd query.

The month table is refreshed and recreated every month, so the 12 month columns change from month to month: they cover the 12 months before the month the table is created. For example, if the month table is created in May 2013, the 12 month columns are "May 2012", "Jun 2012" ... "Apr 2013". If it is created in Jun 2013, the columns are "Jun 2012", "Jul 2012" ... "May 2013".

The end user has little knowledge of Access databases, so he is confused about how to update the 2 queries for each new month table. I am trying to see if there is a way to use 12 criteria in just one query, so the end user only has to deal with updating one query. If there is a way to automate/improve the update of the query, that would be best.
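For what it's worth, the 8-criteria ceiling appears to be a design-grid limitation; SQL view happily takes 12 ORed conditions in one WHERE clause. And since the column names change every month, the query can rebuild itself. A sketch, assuming the month table is called tblMonths, a saved query qryQuoteCheck already exists, and the 12 month columns are every field except the first (all assumptions to adjust):

Public Sub RebuildQuoteCheck()
    ' Rewrite qryQuoteCheck to test every month column for an embedded double quote
    Dim db As DAO.Database
    Dim fld As DAO.Field
    Dim strWhere As String

    Set db = CurrentDb
    For Each fld In db.TableDefs("tblMonths").Fields
        If fld.OrdinalPosition > 0 Then          ' skip the leading key column
            strWhere = strWhere & " OR [" & fld.Name & "] LIKE '*""*'"
        End If
    Next fld
    strWhere = Mid$(strWhere, 5)                 ' trim the leading " OR "

    db.QueryDefs("qryQuoteCheck").SQL = _
        "SELECT * FROM tblMonths WHERE " & strWhere & ";"
End Sub

The end user then only ever opens qryQuoteCheck; the routine can run from a button, or from whatever code recreates the month table.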
I have a growing Access database in a multi-user environment over a 10/100Mb network. The database is all in one file at the moment, on a shared directory of our XP-Pro 'Server', and the workstations have a mapped drive to it and are W98SE machines. All the machines are 1.2Ghz Fujitsu Siemens machines.
It is still under development, but is also in constant use, and I therefore have to develop on a copy, then get everyone out so I can copy in the changes. I would love it to be a client/server setup and split the db into a tables-only back end on the server and programs on the client, but when I tried, the result was a dramatic slow-down in the system... it became unusable.
I do have a budget for this, and could get a proper 'server' or maybe an Ethernet disk, but what is the best config for speed and admin purposes? Anyone doing something similar?
I noticed something strange in Access 2000: sometimes it takes a long time to calculate a report and other times it goes rapidly. I don't see any process taking a lot of CPU %.
When I do the same thing in Access 2003, it goes rapidly every time.
I have a number of queries which build two or three Union queries looking at 35,000 records, and when you open the Union queries or run reports it takes forever for them to open.
I have indexed all the tables on the common fields to see if this speeds up the queries, but they're still slow. The data they look at is in two different tables which are linked into the front end where the queries are!
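One thing to check: plain UNION de-duplicates the combined rows, which forces Jet to sort all 35,000 records before anything opens. If the branches can't produce duplicates (or duplicates don't matter), UNION ALL skips that sort entirely. A sketch with made-up names:

SELECT PartNo, Descr, Qty FROM tblCurrent
UNION ALL
SELECT PartNo, Descr, Qty FROM tblArchive;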
I query against a table with 380K records and growing.
There are approx 14 fields in the table, but I only retrieve 7 in my query. Does having those extra fields in there slow the query down, or does the query ignore them?
Which is faster: placing a calculation ItemNumber: IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) in a query, or placing =IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) as the Control Source of a text box on a form or report? :confused:
I was brought up to believe in Santa Claus and that DLookups were slow. I browsed last night and found many threads mentioning speed issues with DLookups.
My latest project consists of upgrading a database written at a Client Site that is chock full of (you guessed it) DLookups. My normal inclination was to replace them with parameter queries. But before I did, I ran some benchmarks of DLookup vs Parameter Query speed. In all cases (table1 = 11 records, table2 = 1,143 records) the DLookup was faster.
My test was to lookup a field based on the recordset's primary key. In table2 I tested a record mid way down and the last record. DLookup is the winner hands down.
Should I also stop believing in Santa Claus? Too bad, because I have a new notebook on my wish list.
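For anyone who wants to reproduce this kind of benchmark, a rough harness might look like the following; a sketch, assuming table2 has a Long primary key ID, a text field SomeField, and a saved query qryLookup defined as PARAMETERS pID Long; SELECT SomeField FROM table2 WHERE ID = pID; (all names hypothetical):

Public Sub BenchLookups()
    Dim t As Single, i As Long, v As Variant
    Dim qdf As DAO.QueryDef, rs As DAO.Recordset

    ' DLookup against the primary key
    t = Timer
    For i = 1 To 100
        v = DLookup("SomeField", "table2", "ID = 1143")
    Next i
    Debug.Print "DLookup:", Timer - t

    ' Parameter query against the same key
    Set qdf = CurrentDb.QueryDefs("qryLookup")
    t = Timer
    For i = 1 To 100
        qdf.Parameters("pID") = 1143
        Set rs = qdf.OpenRecordset()
        v = rs!SomeField
        rs.Close
    Next i
    Debug.Print "Parameter query:", Timer - t
End Sub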
Can anyone offer suggestions as to why, when I split my db, place the back end on the server, and open Form2, my front end grows from 2.25MB to 3.95MB?
I was using macros to filter the data for each type of project and that does not affect the db size nearly as much as this does.
I am using the following code on the On_Click event of the "View Projects" button on Form2 to build the criteria for the records to appear on my "Projects" form.
Private Sub cmdOK_Click()
    Dim varItem As Variant
    Dim strSubtype As String
    Dim strStatus As String
    Dim strSubtypeCondition As String
    Dim strSQL As String
    Dim strSortOrder As String
    Dim cat As New ADOX.Catalog
    Dim cmd As ADODB.Command

    ' Build the IN() list of selected subtypes (numeric values)
    For Each varItem In Me.lstSubtype.ItemsSelected
        strSubtype = strSubtype & "," & Me.lstSubtype.ItemData(varItem)
    Next varItem
    If Len(strSubtype) = 0 Then
        strSubtype = "Like '*'"
    Else
        strSubtype = Right(strSubtype, Len(strSubtype) - 1)
        strSubtype = "IN(" & strSubtype & ")"
    End If

    ' Build the IN() list of selected statuses (quoted text values)
    For Each varItem In Me.lstStatus.ItemsSelected
        strStatus = strStatus & ",'" & Me.lstStatus.ItemData(varItem) & "'"
    Next varItem
    If Len(strStatus) = 0 Then
        strStatus = "Like '*'"
    Else
        strStatus = Right(strStatus, Len(strStatus) - 1)
        strStatus = "IN(" & strStatus & ")"
    End If

    ' AND/OR between the two criteria
    If Me.optAnd.Value = True Then
        strSubtypeCondition = " AND "
    Else
        strSubtypeCondition = " OR "
    End If

    ' Build the sort order
    If Me.cboSortOrder1.Value <> "Not Sorted" Then
        strSortOrder = " ORDER BY tblProjectDetails.[" & Me.cboSortOrder1.Value & "]"
    Else
        strSortOrder = ""
    End If
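On the growth itself: the ADOX.Catalog and ADODB.Command variables suggest the rest of the routine creates (or deletes and recreates) a saved query on every click, and repeatedly creating and dropping objects is a classic cause of front-end bloat. If so, one common alternative is to create the saved query once and just overwrite its SQL each time; a sketch, assuming a saved query named qryProjects exists and strSQL holds the finished SELECT statement (both assumptions):

' reuse one saved query instead of recreating it per click
CurrentDb.QueryDefs("qryProjects").SQL = strSQL
DoCmd.OpenForm "Projects"

Compacting the front end should also reclaim the space already lost.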
I am making an internet game. For every registered player it stores quite a few variables in a database.
On nearly all pages the database is accessed, modified and closed again. Let's say each player has 20-25 variables stored in the database. Would this cause speed problems? Any ideas to solve this?
I have a database on a server that is accessed by mobile clients using laptops (broadband) when out of the office. They use a 'virtual private network' service to do this. (I did not set this up, I just design and program the front and back ends.) However, some report a slow response time when retrieving data from the database file.
Would 'Active Desktop' be any quicker? Any suggestions on how they might speed things up would be most welcome. 'Replication' comes to mind, but I think their data must remain up to date at all times.
Our office runs on a pretty large Access database (v2003). We are on a large hospital network and have about 15 users for our database. It tends to run VERY, VERY slow. Are there other options?
I have a database that is split into a FE/BE, with the BE running on a server and users accessing it through a dial-up connection. This is working very slowly due to the fact that I have combo boxes that users select data from that are based on different tables, and every time you click on a combo box it takes several minutes to open depending on the number of records. At first I thought that converting to SQL Server might solve this, but the more I read the less I think that will do the trick. Could someone please advise me on the best solution here?
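Independent of any back-end change, one trick that helps a lot over dial-up: leave each combo's RowSource empty in design view and only fill it when the user actually enters the control, so nothing is fetched at form load. A sketch with made-up names:

Private Sub cboCustomer_GotFocus()
    ' populate the combo on first use only
    If Len(Me.cboCustomer.RowSource) = 0 Then
        Me.cboCustomer.RowSource = _
            "SELECT CustomerID, CustomerName FROM tblCustomers ORDER BY CustomerName;"
    End If
End Sub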
Hi, I am doing a Left Join to try to look up values in a large (about 100,000 records) table. If the value isn't found, I'm using the nz function to supply a value.
This query runs very slowly (it takes about 2 minutes). I can understand why... I suppose that for every value it's trying to look up, it has to loop through all 100,000 records before it decides that it's not there.
So, I am just looking for ideas on how to make this run faster.
I do have indexes on all my join fields and criteria fields. Thanks for any suggestions.
Here's the SQL:
SELECT VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes,
       VM1a.repcode, VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
       CLng(Nz([FirstOfTruncatedOpSeq], 0)) AS ZOpSeq,
       Sum(VM1a.DefectQty) AS SumOfDefectQty
FROM VM1a LEFT JOIN BOMOutRefDesOnly
    ON (VM1a.RefDes = BOMOutRefDesOnly.RefDesOnly)
   AND (VM1a.BOM1 = BOMOutRefDesOnly.PCAItemNo)
GROUP BY VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes,
         VM1a.repcode, VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
         CLng(Nz([FirstOfTruncatedOpSeq], 0));
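One idea (an assumption on my part, not a guaranteed fix): because CLng(Nz(...)) sits in the GROUP BY, Jet must evaluate the expression for every row before it can group. Grouping on the raw column in an inner query and applying the conversion outside keeps the indexed fields bare; assuming FirstOfTruncatedOpSeq comes from BOMOutRefDesOnly and 0 never occurs as a real value, the results are the same:

SELECT q.row, q.Column, q.Noun, q.Rev, q.RefDes,
       q.repcode, q.Cell, q.InspPoint, q.DefectType,
       CLng(Nz(q.FirstOfTruncatedOpSeq, 0)) AS ZOpSeq,
       q.SumOfDefectQty
FROM (
    SELECT VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes,
           VM1a.repcode, VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
           BOMOutRefDesOnly.FirstOfTruncatedOpSeq,
           Sum(VM1a.DefectQty) AS SumOfDefectQty
    FROM VM1a LEFT JOIN BOMOutRefDesOnly
        ON (VM1a.RefDes = BOMOutRefDesOnly.RefDesOnly)
       AND (VM1a.BOM1 = BOMOutRefDesOnly.PCAItemNo)
    GROUP BY VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes,
             VM1a.repcode, VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
             BOMOutRefDesOnly.FirstOfTruncatedOpSeq
) AS q;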
I have a query which returns each entrant with the speed figures from previous races in descending order. I want the query to return the top 5 speed figures per entrant, or, if the entrant has fewer than 5 previous runs, all available data.
I am not VBA literate, so as simple as possible please, thanks.
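No VBA needed; a correlated TOP subquery does this per entrant, and it naturally returns fewer rows when an entrant has fewer than 5 runs. A sketch, assuming the data sits in a table or query called tblRuns with fields Entrant and SpeedFigure (hypothetical names; note that ties on the 5th-best figure will return extra rows):

SELECT t.Entrant, t.SpeedFigure
FROM tblRuns AS t
WHERE t.SpeedFigure IN
    (SELECT TOP 5 t2.SpeedFigure
     FROM tblRuns AS t2
     WHERE t2.Entrant = t.Entrant
     ORDER BY t2.SpeedFigure DESC)
ORDER BY t.Entrant, t.SpeedFigure DESC;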
I've created an Access DB on a Citrix server which is multi-user, so it has been split and uses linked tables. It runs quite slowly, however. At the moment I don't have time to convert it to unbound forms, but I have read that one solution to speeding it up is to create a table in the back end, then open a recordset on it and keep it open, to hold the link between the FE and the main DB open.
I know how to link the two, but can someone explain the open recordset part please? What do I have to do?
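The recordset part is just this: open a recordset on a back-end table when the app starts and never close it while the app runs, so Jet keeps the connection (and the .ldb lock file) open instead of re-establishing it on every fetch. A sketch, assuming a small linked table named tblPersist exists in the back end (hypothetical name):

' in a standard module
Public rsPersist As DAO.Recordset          ' module-level so it stays alive

Public Sub OpenPersistentLink()
    ' call once, e.g. from the startup form's Open event
    Set rsPersist = CurrentDb.OpenRecordset("tblPersist")
End Sub

Public Sub ClosePersistentLink()
    ' call when the application shuts down
    If Not rsPersist Is Nothing Then rsPersist.Close
    Set rsPersist = Nothing
End Sub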
The database has to be handled by a web service and/or a web application, making the comma-delimited option occupy more server time breaking down the string into usable user_IDs. But (and this is probably my actual problem) if I were to implement my second idea, I'm not sure what I should make the table's primary key, or the best way to relate it to the main user table.
My initial thought is to just have a (technically useless) ID be the primary key and implement a one (user) to many (friends entries) relationship... but I'm a bit of an Access n00b so I thought I'd better come ask for some advice.
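For what it's worth, the throwaway ID isn't strictly needed: a composite primary key on the two user columns both identifies each row and blocks duplicate friendship entries. A sketch in Access DDL (table and column names are made up):

CREATE TABLE tblFriends (
    UserID   LONG NOT NULL,
    FriendID LONG NOT NULL,
    CONSTRAINT pkFriends PRIMARY KEY (UserID, FriendID),
    CONSTRAINT fkUser   FOREIGN KEY (UserID)   REFERENCES tblUsers (UserID),
    CONSTRAINT fkFriend FOREIGN KEY (FriendID) REFERENCES tblUsers (UserID)
);

Listing a user's friends is then just WHERE UserID = [the user in question].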