Split DB Speed Performance?
Sep 23, 2007
Hi All,
A DB is split (FE/BE), with several FE users and the BE sitting on a network share.
FE is Access 2003 (runtime).
The subform's Recordset Type is set to Snapshot.
Which of the following scenarios will perform fastest?
Scenario 1:
The FE queries a linked table and displays the results on a subform (datasheet format).
Scenario 2:
The BE table is copied to the FE as a local table, the query is run against that local copy, and the results are displayed on a subform (datasheet format).
The reason for this question is to reduce network traffic and further improve the performance of the split database.
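For reference, a minimal sketch of how Scenario 2 might be wired up in the FE (tblOrders, tblOrders_Local and the subform control subResults are all made-up names):

Private Sub cmdRefreshLocal_Click()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Throw away any previous local copy, then pull the linked data across once
    On Error Resume Next
    db.Execute "DROP TABLE tblOrders_Local"
    On Error GoTo 0
    db.Execute "SELECT * INTO tblOrders_Local FROM tblOrders", dbFailOnError

    ' Bind the subform to the local copy so repeated filtering stays off the network
    Me.subResults.Form.RecordSource = "SELECT * FROM tblOrders_Local"
End Sub

Whether this beats querying the linked table directly depends on how much of the table the query actually touches; copying the whole table pulls every row over the wire once.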
Garry
Mar 7, 2006
Hello,
First time poster here so I hope this doesn’t sound too remedial. Here’s my situation…
I work for a large industrial company that has locations throughout the world. We have a DB that tracks product concepts and ideas and associated metrics for those ideas. The DB resides on a file server in North America (Raleigh, North Carolina to be exact). North American users have no trouble with the performance of the DB. It takes a moment to open (several seconds), but once it has opened there is virtually no lag time to add or edit records, run reports, view graphs, etc. However, users in Germany and the Netherlands encounter substantial lag time not only in opening, but also in updating and entering records, running reports, and viewing graphs. This is true even after they have waited for the DB to open.
The size of the DB is only around 2MB so I don’t think overall size is the issue.
There are probably no more than 3 or 4 users in the DB at the same time with most occasions being a single user so I don’t think we are having a multiple user issue.
The DB is self contained – no references to external data or splitting of any kind.
So my questions are:
1. Do you think the poor performance is a function of our network or of Access or the DB design?
2. If it is the network, is there anything that I can do in Access to help get around the hardware/network issues?
Nov 16, 2005
I have split my database application, which was approaching 20MB, into a front end (approx. 8MB) with linked tables to a back end database (approx. 12MB).
Network is 100Mb Ethernet.
However, since doing this, end users have noticed that scrolling through records, and especially running reports, takes significantly longer, sometimes 3x-4x longer. I understood that splitting the DB would be beneficial from a development / application 'release' point of view, and that if I were to create an MDE file of the front end I might also benefit from reduced network traffic, given that end users would be running a compiled front end.
With the speed issues I have been experiencing I have had no choice but to roll back to the original format, with everything in the one MDB file.
Has anyone else had to do the same - given similar speed degradation issues?
Thanks
Guido
Feb 27, 2007
So, I split up my database, housed the table (back end) part on a shared network drive, and am experimenting with the front end, which I've housed on my local computer. I've tried most of the recommendations: shortened the name of the DB, changed the FE to an MDE file, set the linked tables' Subdatasheet Name to [None], but my forms (only 2) are still taking a while to load. The forms are pretty substantial, and each has a form and a subform. Can anyone offer any recommendations to improve the speed?
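One pattern that sometimes helps with slow-opening forms, assuming a subform control named subDetails and a saved form named frmDetails (both names hypothetical), is to leave the subform control unbound at design time and load it only once the parent form is showing data:

Private Sub Form_Current()
    ' The subform control is left unbound at design time;
    ' bind it only once the parent form is actually displaying a record
    If Me.subDetails.SourceObject = "" Then
        Me.subDetails.SourceObject = "frmDetails"
    End If
End Sub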
Oct 18, 2005
Hi,
I have a growing Access database in a multi-user environment over a 10/100Mb network. The database is all in one file at the moment, on a shared directory of our XP-Pro 'Server', and the workstations have a mapped drive to it and are W98SE machines. All the machines are 1.2Ghz Fujitsu Siemens machines.
It is still under development, but is also in constant use, so I have to develop on a copy, then get everyone out so I can copy in the changes. I would love it to be a client/server setup, splitting the DB into a tables-only back end on the server and the programs on the clients, but when I tried, the result was a dramatic slow-down in the system... it became unusable.
I do have a budget for this, and could get a proper 'server' or maybe an Ethernet Disk, but what is the best config for speed and admin purposes. Anyone doing something similar??
Thanks
Apr 5, 2007
Hey,
I noticed something strange in Access 2000: sometimes it takes a long time to calculate a report and other times it runs quickly. I don't see any process taking a lot of CPU.
When I do the same thing in Access 2003, it runs quickly every time.
Can anyone help me?
Rik
Nov 2, 2005
I have a number of queries which feed two or three union queries that look at 35,000 records, and when you open the union queries or run reports it takes forever for them to open.
I have indexed all the tables on the common fields to see if this speeds up the queries, but they are still slow. The data they look at is in two separate tables which are linked into the front end, where the queries live.
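If the union does not actually need to remove duplicate rows, UNION ALL is worth trying in place of UNION: plain UNION has to sort and de-duplicate the whole combined result before anything is returned, while UNION ALL just concatenates the rows. A made-up example of the pattern:

SELECT InvoiceID, InvoiceDate, Amount FROM tblInvoices2004
UNION ALL
SELECT InvoiceID, InvoiceDate, Amount FROM tblInvoices2005;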
Mar 24, 2006
I query against a table with 380K records and growing.
There are approx 14 fields in the table, but I only retrieve 7 in my query. Does having those extra fields in there slow the query down, or does the query ignore them?
Just trying to figure out ways to improve speed.
Thanks.
May 23, 2007
Which is faster, placing a calculation
ItemNumber: IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM])
in a query or placing
=IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) as the Control Source of a text box on a form or report?
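For context, the query version carries the expression as an extra output column, along the lines of the following (the table name is assumed):

SELECT IIf([MANITEMNO] <> " ", [MANITEMNO], [ITEM]) AS ItemNumber, tblItems.*
FROM tblItems;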
Dec 19, 2007
I was brought up to believe in Santa Claus and that DLookups were slow. I browsed last night and found many threads mentioning speed issues with DLookups.
My latest project consists of upgrading a database written at a Client Site that is chock full of (you guessed it) DLookups. My normal inclination was to replace them with parameter queries. But before I did, I ran some benchmarks of DLookup vs Parameter Query speed. In all cases (table1 = 11 records, table2 = 1,143 records) the DLookup was faster.
My test was to lookup a field based on the recordset's primary key. In table2 I tested a record mid way down and the last record. DLookup is the winner hands down.
Should I also stop believing in Santa Claus? Too bad, because I have a new notebook on my wish list.
Thanks for your wisdom.
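In case anyone wants to reproduce the comparison, here is a rough timing sketch; table2, its ID and Description fields, and a saved parameter query qryByID (SELECT Description FROM table2 WHERE ID = [pID]) are all assumptions:

Sub BenchmarkLookup()
    Dim t As Single
    Dim i As Long
    Dim v As Variant
    Dim qdf As DAO.QueryDef

    ' Time 1,000 DLookups against the primary key
    t = Timer
    For i = 1 To 1000
        v = DLookup("Description", "table2", "ID = " & (i Mod 1143) + 1)
    Next i
    Debug.Print "DLookup:", Timer - t

    ' Time the same lookups through the saved parameter query
    t = Timer
    Set qdf = CurrentDb.QueryDefs("qryByID")
    For i = 1 To 1000
        qdf.Parameters("pID") = (i Mod 1143) + 1
        With qdf.OpenRecordset(dbOpenSnapshot)
            v = .Fields("Description")
            .Close
        End With
    Next i
    Debug.Print "Parameter query:", Timer - t
End Sub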
May 24, 2005
Can anyone offer suggestions as to why, when I split my db, place the back end on the server, and open Form2, my front end grows from 2.25MB to 3.95MB?
I was using macros to filter the data for each type of project and that does not affect the db size nearly as much as this does.
I am using the following code on the On_Click event of the "View Projects" button on Form2 to build the criteria for the records to appear on my "Projects" form.
Private Sub cmdOK_Click()
    Dim varItem As Variant
    Dim strSubtype As String
    Dim strStatus As String
    Dim strSubtypeCondition As String
    Dim strSQL As String
    Dim strSortOrder As String
    Dim cat As New ADOX.Catalog
    Dim cmd As ADODB.Command

    ' Build the subtype criteria from the multi-select list box
    For Each varItem In Me.lstSubtype.ItemsSelected
        strSubtype = strSubtype & "," & Me.lstSubtype.ItemData(varItem)
    Next varItem
    If Len(strSubtype) = 0 Then
        strSubtype = "Like '*'"
    Else
        strSubtype = Right(strSubtype, Len(strSubtype) - 1)
        strSubtype = "IN(" & strSubtype & ")"
    End If

    ' Build the status criteria from the multi-select list box
    For Each varItem In Me.lstStatus.ItemsSelected
        strStatus = strStatus & ",'" & Me.lstStatus.ItemData(varItem) & "'"
    Next varItem
    If Len(strStatus) = 0 Then
        strStatus = "Like '*'"
    Else
        strStatus = Right(strStatus, Len(strStatus) - 1)
        strStatus = "IN(" & strStatus & ")"
    End If

    ' AND / OR between the two criteria, from the option group
    If Me.optAnd.Value = True Then
        strSubtypeCondition = " AND "
    Else
        strSubtypeCondition = " OR "
    End If

    ' Build the sort order
    If Me.cboSortOrder1.Value <> "Not Sorted" Then
        strSortOrder = " ORDER BY tblProjectDetails.[" & Me.cboSortOrder1.Value & "]"
    Else
        strSortOrder = ""
    End If

    ' Build the SQL statement
    strSQL = "SELECT tblProjectDetails.* FROM tblProjectDetails " & _
        "WHERE tblProjectDetails.[subtypeid] " & strSubtype & _
        strSubtypeCondition & "tblProjectDetails.[status] " & strStatus & _
        strSortOrder & ";"

    ' Rewrite the saved query "Query1" with the new SQL, then open the form filtered by it
    cat.ActiveConnection = CurrentProject.Connection
    Set cmd = cat.Views("Query1").Command
    cmd.CommandText = strSQL
    Set cat.Views("Query1").Command = cmd
    Set cmd = Nothing
    Set cat = Nothing

    DoCmd.OpenForm "Projects", , "Query1"
End Sub
I have cleaned up my code, compacted and repaired, compiled, got rid of unneeded tables, queries, etc., and no change.
I have read up on all of the possible causes/solutions and can't seem to narrow it down.
Thanks for any help,
Toni
Dec 27, 2004
I am making an internet game. It stores quite a few variables in a database for every registered player.
On nearly all pages the database is accessed, modified and closed again. Let's say each player has 20-25 variables stored in the database. Would this cause speed problems? Any ideas to solve this?
Oct 4, 2006
I have a database on a server that is accessed by mobile clients using laptops (broadband) when out of the office. They use a 'virtual private network' service to do this.
(I did not set this up, I just design and program the front and back ends)
However, some report a slow response time when retrieving data from the database file.
Would 'Active desktop' be any quicker?
Any suggestions on how they might speed things up, would be most welcome
'Replication' comes to mind but I think their data must remain up to date at all times.
Jun 5, 2007
Our office runs on a pretty large Access database (v2003). We are on a large hospital network and have about 15 users for our database. It tends to run VERY, VERY slow. Are there other options?
Jun 24, 2005
I have a database that is split into a FE / BE, with the BE running on a server and users accessing it over a dial-up connection. It is working very slowly because I have combo boxes, based on different tables, that users select data from, and every time you click on a combo box it takes several minutes to open, depending on the number of records. At first I thought that converting to SQL Server would solve this, but the more I read the less I think that will do the trick. Could someone please advise me on the best solution here?
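One workaround often used over very slow links is to leave each combo box's RowSource empty and only populate it after the user has typed a few characters, so the back end is only asked for a short, filtered list. A sketch, with cboCustomer and tblCustomers as hypothetical names:

Private Sub cboCustomer_Change()
    ' Only hit the back end once the user has narrowed the list down
    If Len(Me.cboCustomer.Text) >= 3 Then
        Me.cboCustomer.RowSource = _
            "SELECT CustomerID, CustomerName FROM tblCustomers " & _
            "WHERE CustomerName LIKE '" & Me.cboCustomer.Text & "*' " & _
            "ORDER BY CustomerName"
    Else
        Me.cboCustomer.RowSource = ""
    End If
End Sub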
Thank you
Jim
Jan 20, 2006
Hi,
I am doing a Left Join to try to look up values in a large (about 100,000 records) table. If the value isn't found, I'm using the nz function to supply a value.
This query runs very slowly (takes about 2 minutes). I can understand why... I suppose that for every value it's trying to look up, it has to loop through all 100,000 records before it decides that it's not there.
So, I am just looking for ideas on how to make this run faster.
I do have indexes on all my join fields and criteria fields.
Thanks for any suggestions.
Here's the SQL:
SELECT VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes, VM1a.repcode,
       VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
       CLng(Nz([FirstOfTruncatedOpSeq], 0)) AS ZOpSeq,
       Sum(VM1a.DefectQty) AS SumOfDefectQty
FROM VM1a LEFT JOIN BOMOutRefDesOnly
     ON (VM1a.RefDes = BOMOutRefDesOnly.RefDesOnly)
     AND (VM1a.BOM1 = BOMOutRefDesOnly.PCAItemNo)
GROUP BY VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes, VM1a.repcode,
         VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
         CLng(Nz([FirstOfTruncatedOpSeq], 0));
Apr 23, 2007
I have a query which returns each entrant's speed figures for previous races in descending order. I want the query to return the top 5 speed figures per entrant, or, if the entrant has fewer than 5 previous runs, all the available figures.
I am not VBA literate, so as simple as possible please, thanks.
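No VBA should be needed; the usual Access pattern for "top N per group" is a correlated TOP subquery. A sketch, assuming a table or query named qrySpeedFigs with fields Entrant, RaceDate and SpeedFig (all names hypothetical); entrants with fewer than 5 runs automatically return everything they have:

SELECT q.Entrant, q.RaceDate, q.SpeedFig
FROM qrySpeedFigs AS q
WHERE q.SpeedFig IN
    (SELECT TOP 5 q2.SpeedFig
     FROM qrySpeedFigs AS q2
     WHERE q2.Entrant = q.Entrant
     ORDER BY q2.SpeedFig DESC)
ORDER BY q.Entrant, q.SpeedFig DESC;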
Freddy67
Apr 24, 2007
I have a lot of queries based on queries. These all work as desired, however they can be slower than I'd like.
Given that my company has no intention of changing to another piece of software I am, therefore, limited to whatever speed I can get out of Access.
Are there any general rules or guidelines that a more experienced person could recommend to ensure that all these queries run as quickly as possible?
Dec 14, 2004
I've created an Access DB on a Citrix server; it is multi-user, so it has been split and the front end uses linked tables. It runs quite slowly, however. At the moment I don't have time to convert it to unbound forms, but I have read that one way of speeding it up is to create a small table in the back end, open a recordset on it from the front end, and keep that recordset open so the connection between the two stays open.
I know how to link the two, but can someone explain the open recordset part please? What do I have to do?
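For reference, a minimal sketch of that persistent-connection trick; the table name tblKeepOpen is an assumption (any small back-end table will do):

' In a standard module
Private rsKeepOpen As DAO.Recordset   ' module-level so it stays alive

Public Sub OpenPersistentConnection()
    ' Open a recordset on a small linked back-end table and leave it open.
    ' Jet then keeps the connection (and the .ldb lock file) open instead of
    ' re-establishing it every time a form or query touches the back end.
    Set rsKeepOpen = CurrentDb.OpenRecordset("tblKeepOpen", dbOpenSnapshot)
End Sub

Public Sub ClosePersistentConnection()
    If Not rsKeepOpen Is Nothing Then
        rsKeepOpen.Close
        Set rsKeepOpen = Nothing
    End If
End Sub

Call OpenPersistentConnection from the startup form's Open event and ClosePersistentConnection when the application shuts down.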
Cheers,
Recall.
Jul 27, 2005
I have a table with 7 million records. On my continuous form, I have been right-clicking a field and entering a filter parameter, e.g. *my search term*, and it takes some time before the results are shown. This gives a reduced list of, say, 1,000 records. But then, when I click in a column and try to sort, it takes ages to sort the records.
So, my problem is two fold: firstly, it takes some time for the first filter to work; secondly, it takes time to sort.
Should I be doing this in a different way? Any other tips for filtering and sorting faster?
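One thing worth trying (field, form and table names below are made up): set the form's RecordSource to a query that filters and sorts in a single pass, and avoid a leading wildcard where possible, since Like 'term*' can use an index on the field while Like '*term*' cannot:

Private Sub cmdSearch_Click()
    Dim strTerm As String
    strTerm = Me.txtSearch & ""

    ' Filter and sort in one shot against the indexed field
    Me.RecordSource = "SELECT * FROM tblBigTable " & _
                      "WHERE PartDescription LIKE '" & strTerm & "*' " & _
                      "ORDER BY PartDescription"
End Sub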
Thanks,
Dave
Jun 29, 2006
We have SPEED Ferret 4.1, but we've upgraded to Access 2003, which 4.1 does not support. Does anyone know if a new version is coming out? Does 2003 have its own enhanced find-and-replace functionality?
Nov 6, 2006
Hello,
I have split a database using the Database Splitter wizard, but I still have network speed problems. What I am wondering is whether anyone knows if using an ODBC connection between the front and back ends is more efficient than file sharing across the network?
Thanks for any information on this
Tim
Jul 27, 2006
All,
I'm not sure how well I've managed to search on this as I'm not too sure where to start!
I have an append query as follows:
INSERT INTO tbl_Employee ( Company_No )
SELECT tbl_Co_Data.Company_No
FROM tbl_Co_Data
WHERE (((tbl_Co_Data.Company_No) Not In (select Company_No from tbl_Employee)))
ORDER BY tbl_Co_Data.Company_No;
Basically this query is run a number of times a day and appends new company numbers into a table, 'tbl_Employee'. It's badly named - it hasn't got much to do with employees. Anyway, it takes a good 3 minutes to run, with about 20k records in tbl_Co_Data and probably 18k records in tbl_Employee.
It looks to me like it's looping through each record in one table for each record in the other - which is plain daft.
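That is roughly what the Not In subquery forces. The same append is often written as an outer join with an Is Null test, which lets Jet use the index on Company_No instead; a sketch of the equivalent query (same tables, untested against your data):

INSERT INTO tbl_Employee ( Company_No )
SELECT tbl_Co_Data.Company_No
FROM tbl_Co_Data LEFT JOIN tbl_Employee
    ON tbl_Co_Data.Company_No = tbl_Employee.Company_No
WHERE tbl_Employee.Company_No Is Null
ORDER BY tbl_Co_Data.Company_No;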
I'm currently experimenting with a DTS package that puts tbl_Co_Data in to SQL server first before the query would run (tbl_Employee is already there) with a view to running a SP and ditching the query.
Does anybody have any other ideas as I'm having problems with the DTS in that it appears to be pretty slow in itself!
Many thanks in advance for any response.
Oct 23, 2006
I'm running this query off a table of about 1.5 million records. If I leave the date parameter off, it comes back in under a minute. When I add it back, it takes 5 minutes before the query even starts running. All the fields used to filter the query are indexed. Any ideas to speed this up?
SELECT CallsEntered.[Work Order Nbr], CallsEntered.[Date Entered], CallsEntered.[Time Entered],
       CallsEntered.[Primary Locator Code] AS [ASC], CallsEntered.Headend, CallsEntered.Node,
       CallsEntered.[Grid Id], CallsEntered.[Q Code], CallsEntered.[Problem Code 01],
       CallsEntered.[Primary Finding Code], CallsEntered.[Primary Solution Code],
       CallsEntered.[Cancel Code], CallsEntered.[Scheduled Date], CallsEntered.[Wo Status],
       CallsEntered.[Date CheckIn], CallsEntered.[Assigned Installer],
       Calendar.Week, Calendar.Year
FROM Calendar INNER JOIN CallsEntered ON Calendar.Date = CallsEntered.[Date Entered]
WHERE (((CallsEntered.[Date Entered] >= Forms!frmServiceCalls!txtStartDate Or Forms!frmServiceCalls!txtStartDate Is Null) = True)
  And ((CallsEntered.[Date Entered] <= Forms!frmServiceCalls!txtEndDate Or Forms!frmServiceCalls!txtEndDate Is Null) = True)
  And ((CallsEntered.Node = Forms!frmServiceCalls!txtnode Or Forms!frmServiceCalls!txtnode Is Null) = True)
  And ((CallsEntered.[Primary Locator Code] = Forms!frmServiceCalls!cboASC Or Forms!frmServiceCalls!cboASC Is Null) = True)
  And ((CallsEntered.[Q Code] = Forms!frmServiceCalls!cboQCode Or Forms!frmServiceCalls!cboQCode Is Null) = True));
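One pattern that often helps with this style of optional parameter is to add each criterion only when its form control is actually filled in, instead of the (Field = control Or control Is Null) = True construction, which tends to stop Jet from using the indexes. A rough sketch using the same form controls (the report name is hypothetical, and only three of the criteria are shown):

Private Sub cmdRunReport_Click()
    Dim strWhere As String

    ' Add each criterion only if the user supplied it
    If Not IsNull(Me.txtStartDate) Then
        strWhere = strWhere & " AND [Date Entered] >= #" & Me.txtStartDate & "#"
    End If
    If Not IsNull(Me.txtEndDate) Then
        strWhere = strWhere & " AND [Date Entered] <= #" & Me.txtEndDate & "#"
    End If
    If Not IsNull(Me.txtnode) Then
        strWhere = strWhere & " AND Node = '" & Me.txtnode & "'"
    End If

    If Len(strWhere) > 0 Then strWhere = Mid(strWhere, 6)   ' drop the leading " AND "
    DoCmd.OpenReport "rptServiceCalls", acViewPreview, , strWhere
End Sub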
Thanks
Aug 29, 2007
I am trying to issue multiple INSERT statements in a row - but it seems that only the first succeeds. If I put a 1-second delay between attempts, suddenly they all succeed (so I know the statements themselves are all valid, it's not a data issue).
These are all being issued in a loop from the same thread, so unless the data is being inserted asynchronously, I can't see any problem. And if it is being done asynchronously, how are we supposed to know one INSERT succeeded so that we can issue the next one?
Anyone have any idea why this might happen? I don't want to leave 'magic delays' in the code!
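If the loop happens to run through DAO, one thing worth trying (a sketch under that assumption, not a diagnosis of the delay) is Execute with dbFailOnError inside an explicit transaction, so each statement either raises a trappable error or is committed with the rest of the batch; tblLog and its Seq field are made-up names:

Dim ws As DAO.Workspace
Dim db As DAO.Database
Dim i As Long

Set ws = DBEngine.Workspaces(0)
Set db = CurrentDb

ws.BeginTrans
For i = 1 To 10
    ' dbFailOnError makes Execute raise an error instead of failing silently
    db.Execute "INSERT INTO tblLog (Seq) VALUES (" & i & ")", dbFailOnError
Next i
ws.CommitTrans   ' all ten rows are written together here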
Mar 8, 2008
I have a (quite) specific question, but I think it covers something I simply cannot answer.
I have three UPDATE queries running on linked tables in Microsoft Access (2000/XP).
My main data table (the one to be updated) has almost 1million records
My three information tables ALL have primary keys (which are used to link the main table) and vary in size
I have attached the three UPDATE queries plus descriptions of the field names used.
Table            Records   Query run time
Main DataTable   900,000   -
Mask nomenk      130       2 hours
Mask media       900       15 minutes
Mask brand       4,000     ?????
Query A
UPDATE [Main DataTable] AS z
INNER JOIN [Mask nomenk] AS mn ON (z.nomCode1 = mn.nomCode1) AND (z.nomCode2 = mn.nomCode2) AND (z.nomCode3 = mn.nomCode3) AND (z.nomCode4 = mn.nomCode4)
SET z.NomenkMask1 = mn!NomenkMask1;
Query B
UPDATE [Main DataTable] AS z
INNER JOIN [Mask media] AS mm ON (z.couCode = mm.couCode) AND (z.nomCode1 = mm.nomCode1) AND (z.pubCode = mm.pubCode)
SET z.MediaMask1 = mm!MediaMask1;
Query C
UPDATE [Main DataTable] AS z
INNER JOIN [Mask brand] AS mb ON (z.couCode = mb.couCode) AND (z.nomCode1 = mb.nomCode1) AND (z.brCode1 = mb.brCode1) AND (z.brCode2 = mb.brCode2)
SET z.BrandMask1 = mb!BrandMask1;
Fieldname   FieldType
couCode     Text
pubCode     Text
nomCode1    Long Integer
nomCode2    Long Integer
nomCode3    Long Integer
nomCode4    Long Integer
brCode1     Long Integer
brCode2     Long Integer
My problem, quite simply, is the speed of running these queries. I know that query B is the quickest, with query A a distant second (I could not even complete query C and killed it after 6 hours).
What I need to know is WHY query C is SO much slower than query B, when the only real difference I can see between them is that query C joins on one extra field.
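If [Mask brand] (or the main table) only has single-field indexes on those columns, one thing worth checking, purely as a guess since I can't see the actual index design, is whether a composite index covering all four join fields changes the behaviour, e.g.:

CREATE INDEX idxMaskBrandJoin
ON [Mask brand] (couCode, nomCode1, brCode1, brCode2);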