I have a table with 7 million records. Using my continuous form, I have been right-clicking and entering a parameter, e.g. *my search term*, to filter, and this takes some time before the results are shown. This gives a reduced list of, say, 1,000 records. But then, when I click in the column and try to sort, it takes ages to sort the records.
So, my problem is twofold: firstly, it takes some time for the first filter to work; secondly, it takes time to sort.
Should I be doing this in a different way? Any other tips for filtering and sorting faster?
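One pattern that can help (a sketch only; the form, table, and control names here are hypothetical) is to re-bind the form to a query that filters and sorts in one pass, so the 7 million rows are only traversed once:

Code:
Private Sub cmdSearch_Click()
    ' Bind the form to a pre-filtered, pre-sorted result set.
    ' An exact or prefix match on an indexed field lets Jet use the
    ' index; a leading-wildcard filter like "*term*" forces a full
    ' table scan no matter how the filtering is done.
    Me.RecordSource = "SELECT * FROM tblBig " & _
                      "WHERE SearchField = '" & Me.txtSearch & "' " & _
                      "ORDER BY SortField;"
End Sub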
It is not really 1.74 million records, but Access thinks it is, for some reason. Here's what happens: I get a CSV file in with 2,196 lines. There are 2 date fields that are formatted poorly; sometimes a date is mddyyyy and sometimes it is mmddyyyy. I import the data into one table and then export everything except those two fields to another table. There are two date/time fields in the new table that are left empty at first. I then run 2 update queries to format and convert these poorly formatted date fields. Each query simply joins the two tables on 3 fields and then updates the date field.

When I hit the preview button on the query, it takes about a second and says it will update 2,196 records. Perfect. When I actually run the query, it takes about half an hour and tells me it will be updating 1.74 million records. Any ideas why this is happening? After the query runs, there are still only 2,196 records in both tables.
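As a general point, an update whose row count balloons like this usually means the join is not one-to-one: if the 3 join fields are not unique in both tables, every duplicate pairing multiplies the rows Jet tries to write. Separately, the conversion itself can be done in a single expression; a sketch of one such update query, with all table and field names hypothetical, handling both the 7-digit (mddyyyy) and 8-digit (mmddyyyy) forms:

Code:
UPDATE tblImport AS i INNER JOIN tblClean AS c
    ON (i.KeyA = c.KeyA) AND (i.KeyB = c.KeyB) AND (i.KeyC = c.KeyC)
SET c.DateEntered = IIf(Len(i.RawDate) = 8,
    DateSerial(CInt(Right(i.RawDate, 4)), CInt(Left(i.RawDate, 2)), CInt(Mid(i.RawDate, 3, 2))),
    DateSerial(CInt(Right(i.RawDate, 4)), CInt(Left(i.RawDate, 1)), CInt(Mid(i.RawDate, 2, 2))));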
I have a number of queries which build two or three union queries looking at 35,000 records, and when you open the union queries or run reports it takes forever for them to open.

I have indexed all the tables which have the common fields to see if this speeds up the queries, but they're still slow. The data they look at is in two different tables which are linked into the front end where the queries are!
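One thing worth checking, hedged since the queries themselves aren't shown: plain UNION de-duplicates its output, which forces a sort and compare over the whole combined result set. If the branches cannot overlap, UNION ALL skips that work entirely (table and field names hypothetical):

Code:
SELECT CommonField1, CommonField2 FROM tblSiteA
UNION ALL
SELECT CommonField1, CommonField2 FROM tblSiteB;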
I query against a table with 380K records and growing.
There are approx 14 fields in the table, but I only retrieve 7 in my query. Does having those extra fields in there slow the query down, or does the query ignore them?
I'm not sure how well I've managed to search on this as I'm not too sure where to start!
I have an append query as follows:
INSERT INTO tbl_Employee ( Company_No )
SELECT tbl_Co_Data.Company_No
FROM tbl_Co_Data
WHERE tbl_Co_Data.Company_No Not In (SELECT Company_No FROM tbl_Employee)
ORDER BY tbl_Co_Data.Company_No;
Basically this query is run a number of times a day and appends new company numbers into a table, 'tbl_Employee'. It's badly named; it's not got much to do with employees. Anyway, it takes a good 3 minutes to run with about 20k records in tbl_Co_Data and probably 18k records in tbl_Employee.
It looks to me like it's looping through each record in one table for each record in the other - which is plain daft.
I'm currently experimenting with a DTS package that puts tbl_Co_Data into SQL Server first, before the query would run (tbl_Employee is already there), with a view to running a stored procedure and ditching the query.
Does anybody have any other ideas? I'm having problems with the DTS in that it appears to be pretty slow in itself!
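Before reaching for SQL Server, it may be worth trying the standard Access rewrite of Not In as an unmatched-records ("frustrated") outer join. Jet tends to optimize this far better than a correlated Not In, which behaves exactly like the row-by-row looping described above. A sketch using the same tables:

Code:
INSERT INTO tbl_Employee ( Company_No )
SELECT c.Company_No
FROM tbl_Co_Data AS c LEFT JOIN tbl_Employee AS e
    ON c.Company_No = e.Company_No
WHERE e.Company_No Is Null
ORDER BY c.Company_No;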
I'm running this query off a table of about 1.5 million records. If I leave the date parameter off, it comes up in under a minute. When I add it back, it takes 5 minutes before the query even starts running. All the fields used in filtering the query are indexed. Any ideas to speed this up?
SELECT CallsEntered.[Work Order Nbr], CallsEntered.[Date Entered], CallsEntered.[Time Entered],
       CallsEntered.[Primary Locator Code] AS [ASC], CallsEntered.Headend, CallsEntered.Node,
       CallsEntered.[Grid Id], CallsEntered.[Q Code], CallsEntered.[Problem Code 01],
       CallsEntered.[Primary Finding Code], CallsEntered.[Primary Solution Code],
       CallsEntered.[Cancel Code], CallsEntered.[Scheduled Date], CallsEntered.[Wo Status],
       CallsEntered.[Date CheckIn], CallsEntered.[Assigned Installer], Calendar.Week, Calendar.Year
FROM Calendar INNER JOIN CallsEntered ON Calendar.Date = CallsEntered.[Date Entered]
WHERE ((CallsEntered.[Date Entered] >= Forms!frmServiceCalls!txtStartDate Or Forms!frmServiceCalls!txtStartDate Is Null)
  And (CallsEntered.[Date Entered] <= Forms!frmServiceCalls!txtEndDate Or Forms!frmServiceCalls!txtEndDate Is Null)
  And (CallsEntered.Node = Forms!frmServiceCalls!txtnode Or Forms!frmServiceCalls!txtnode Is Null)
  And (CallsEntered.[Primary Locator Code] = Forms!frmServiceCalls!cboASC Or Forms!frmServiceCalls!cboASC Is Null)
  And (CallsEntered.[Q Code] = Forms!frmServiceCalls!cboQCode Or Forms!frmServiceCalls!cboQCode Is Null));
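The (field >= parameter Or parameter Is Null) pattern is flexible, but it stops Jet from using the index on [Date Entered], which fits the symptom of the query stalling only when the date test is added. One hedged alternative for the date range: fold the null handling onto the parameter side, so the field itself is compared against plain values and the index stays usable. The date part of the WHERE clause might become:

Code:
WHERE CallsEntered.[Date Entered]
      Between Nz(Forms!frmServiceCalls!txtStartDate, #1/1/1900#)
          And Nz(Forms!frmServiceCalls!txtEndDate, #12/31/9999#)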
I have a (quite) specific question, but I think it covers something I simply cannot answer.
I have three UPDATE queries running on linked tables in Microsoft Access (2000/XP).
My main data table (the one to be updated) has almost 1 million records.
My three information tables ALL have primary keys (which are used to link the main table) and vary in size
I have attached the three UPDATE queries plus descriptions of the field names used.
Table            Records    Time
Main DataTable   900,000    -
Mask nomenk      130        2 hours
Mask media       900        15 minutes
Mask brand       4,000      ?????
Query A:
UPDATE [Main DataTable] AS z INNER JOIN [Mask nomenk] AS mn
    ON (z.nomCode1 = mn.nomCode1) AND (z.nomCode2 = mn.nomCode2)
   AND (z.nomCode3 = mn.nomCode3) AND (z.nomCode4 = mn.nomCode4)
SET z.NomenkMask1 = mn.NomenkMask1;

Query B:
UPDATE [Main DataTable] AS z INNER JOIN [Mask media] AS mm
    ON (z.couCode = mm.couCode) AND (z.nomCode1 = mm.nomCode1)
   AND (z.pubCode = mm.pubCode)
SET z.MediaMask1 = mm.MediaMask1;

Query C:
UPDATE [Main DataTable] AS z INNER JOIN [Mask brand] AS mb
    ON (z.couCode = mb.couCode) AND (z.nomCode1 = mb.nomCode1)
   AND (z.brCode1 = mb.brCode1) AND (z.brCode2 = mb.brCode2)
SET z.BrandMask1 = mb.BrandMask1;
My problem, quite simply, is the speed involved in running these queries. I know that Query B is the quickest, with Query A a distant second (I could not even complete the running of Query C and killed it after 6 hours).

What I need to know is WHY Query C is soooo much slower than Query B, when the only real difference I can see between them is that Query C has an extra field to join on.
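Hedged, since the indexing on [Main DataTable] isn't shown: the Mask tables' primary keys only cover one side of each join. If the main table has no composite index matching the four Query C join fields, Jet has to scan up to 900,000 rows per lookup. A sketch of the index that pattern usually calls for (the index name is hypothetical):

Code:
CREATE INDEX idxBrandJoin
ON [Main DataTable] (couCode, nomCode1, brCode1, brCode2);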
The guy who created this cluster of a database no longer works here. There is a section in it where a user selects an option (0-30) and then clicks a button. When that button is clicked, it runs a series of queries based on the selection. Each selection chosen takes 90 minutes to update. I am looking to see if there is any way to do the same in less time. Additionally, I am having to select 1, click the button, wait 90 minutes; select 2, click the button, wait 90 minutes; select 3, click the button, wait 90 minutes... and so on. Therefore I am also trying to come up with a button that will do selections 1-10 all in one run.
Code:
Private Sub btn_download_Click()
On Error GoTo Err_btn_download_Click
Dim db As Database, qwo As QueryDef, rs As Recordset, x, numrec As Long, rt As Recordset, rwo As Recordset
Dim stDocName As String, i As Long, rwot As Recordset, qwot As QueryDef, qs As QueryDef, xcount As Long, xmsg As Integer
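Only the declarations are shown above, so this is a sketch under an assumption: if the per-selection work is pulled out into one procedure that takes the option number (RunSelection below is hypothetical), a second button can batch selections 1-10 in a single run:

Code:
Private Sub btn_download_all_Click()
    Dim i As Long
    For i = 1 To 10
        RunSelection i   ' the same work btn_download_Click does for one option
    Next i
End Sub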
When I enter a number equal to or over ten million, the last two digits get rounded, and I don't want them to be. My field is set up as Single, Standard format, two decimals. When I tested it I typed in 123456789, and what I got was 123,456,800.00.
enter 9999999, get 9,999,999.00
enter 10000199, get 10,000,200.00
enter 10000001.75, get 10,000,000.00
Single is supposed to handle up to 10^38
I have checked my "Region and Language" settings for the OS (W7) and there is nothing in there about rounding or maximum number size. I would like to leave the data type at single for the space considerations, and because it should work as single.
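For reference, this matches standard single-precision behavior rather than anything regional: 10^38 is the *range* of a Single, but its 24-bit significand only resolves about seven significant decimal digits. A quick demonstration for the Immediate window:

Code:
Sub ShowSinglePrecision()
    Dim s As Single
    s = 123456789            ' snaps to the nearest representable value
    Debug.Print s            ' displays rounded to ~7 significant digits
    Debug.Print 2 ^ 24       ' 16777216: above this, the gap between
                             ' consecutive Singles already exceeds 1
End Sub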
I have a growing Access database in a multi-user environment over a 10/100Mb network. The database is all in one file at the moment, on a shared directory of our XP-Pro 'Server', and the workstations have a mapped drive to it and are W98SE machines. All the machines are 1.2Ghz Fujitsu Siemens machines.
It is still under development, but is also in constant use, so I have to develop on a copy, then get everyone out so I can copy in the changes. I would love it to be a client/server setup and split the db into a tables-only backend on the server and programs on the client, but when I tried, the result was a dramatic slow-down in the system... it became unusable.
I do have a budget for this, and could get a proper 'server' or maybe an Ethernet Disk, but what is the best config for speed and admin purposes? Anyone doing something similar??
I noticed something strange in Access 2000: sometimes it takes a long time to calculate a report and other times it goes rapidly. I don't see any process taking a lot of CPU%.

When I do the same thing in Access 2003, it goes rapidly every time.
Which is faster: placing a calculation ItemNumber: IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) in a query, or placing =IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) as the Control Source of a text box on a form or report?
I was brought up to believe in Santa Claus and that DLookups were slow. I browsed last night and found many threads mentioning speed issues with DLookups.
My latest project consists of upgrading a database written at a Client Site that is chock full of (you guessed it) DLookups. My normal inclination was to replace them with parameter queries. But before I did, I ran some benchmarks of DLookup vs Parameter Query speed. In all cases (table1 = 11 records, table2 = 1,143 records) the DLookup was faster.
My test was to look up a field based on the recordset's primary key. In table2 I tested a record midway down and the last record. DLookup is the winner hands down.
Should I also stop believing in Santa Claus? Too bad, because I have a new notebook on my wish list.
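For anyone wanting to reproduce this, a rough harness along the lines described (field, table, and query names are hypothetical; it assumes a saved parameter query qryLookup with a pID parameter):

Code:
Sub BenchmarkLookup()
    Dim t As Single, v As Variant, i As Long
    Dim qdf As DAO.QueryDef, rs As DAO.Recordset

    t = Timer
    For i = 1 To 100
        v = DLookup("SomeField", "table2", "ID = 1000")
    Next i
    Debug.Print "DLookup:", Timer - t

    Set qdf = CurrentDb.QueryDefs("qryLookup")
    t = Timer
    For i = 1 To 100
        qdf.Parameters("pID") = 1000
        Set rs = qdf.OpenRecordset(dbOpenSnapshot)
        v = rs!SomeField
        rs.Close
    Next i
    Debug.Print "Parameter query:", Timer - t
End Sub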
Can anyone offer suggestions as to why, when I split my db, place the backend on the server, and open Form2, my front end grows from 2.25MB to 3.95MB?
I was using macros to filter the data for each type of project and that does not affect the db size nearly as much as this does.
I am using the following code on the On_Click event of the "View Projects" button on Form2 to build the criteria for the records to appear on my "Projects" form.
Private Sub cmdOK_Click()
    Dim varItem As Variant
    Dim strSubtype As String
    Dim strStatus As String
    Dim strSubtypeCondition As String
    Dim strSQL As String
    Dim strSortOrder As String
    Dim cat As New ADOX.Catalog
    Dim cmd As ADODB.Command

    For Each varItem In Me.lstSubtype.ItemsSelected
        strSubtype = strSubtype & "," & Me.lstSubtype.ItemData(varItem)
    Next varItem
    If Len(strSubtype) = 0 Then
        strSubtype = "Like '*'"
    Else
        strSubtype = Right(strSubtype, Len(strSubtype) - 1)
        strSubtype = "IN(" & strSubtype & ")"
    End If

    For Each varItem In Me.lstStatus.ItemsSelected
        strStatus = strStatus & ",'" & Me.lstStatus.ItemData(varItem) & "'"
    Next varItem
    If Len(strStatus) = 0 Then
        strStatus = "Like '*'"
    Else
        strStatus = Right(strStatus, Len(strStatus) - 1)
        strStatus = "IN(" & strStatus & ")"
    End If

    If Me.optAnd.Value = True Then
        strSubtypeCondition = " AND "
    Else
        strSubtypeCondition = " OR "
    End If

    'Build the sort order
    If Me.cboSortOrder1.Value <> "Not Sorted" Then
        strSortOrder = " ORDER BY tblProjectDetails.[" & Me.cboSortOrder1.Value & "]"
    Else
        strSortOrder = ""
    End If
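The snippet ends before strSQL is used, so what follows is an assumption about the rest of the routine: if it uses the ADOX.Catalog declared above to create and save a new query object on every click, that alone would bloat the front end, since space from replaced objects isn't reclaimed until a compact. Applying the criteria directly to the form avoids creating saved objects at all (the field names [Subtype] and [Status] are guesses):

Code:
    ' Hypothetical continuation: build the SQL and bind it straight
    ' to the Projects form instead of saving a query object.
    strSQL = "SELECT * FROM tblProjectDetails " & _
             "WHERE [Subtype] " & strSubtype & strSubtypeCondition & _
             "[Status] " & strStatus & strSortOrder

    DoCmd.OpenForm "Projects"
    Forms!Projects.RecordSource = strSQL
End Sub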
I am making an internet game. For every registered player it stores quite a few variables in a database.

On nearly all pages the database is accessed, modified and closed again. Let's say each player has 20-25 variables stored in the database. Would this cause speed problems? Any ideas on how to solve this?
I have a database on a server that is accessed by mobile clients using laptops (broadband) when out of the office. They use a 'virtual private network' service to do this. (I did not set this up; I just design and program the front and back ends.) However, some report a slow response time when retrieving data from the database file.

Would 'Active Desktop' be any quicker? Any suggestions on how they might speed things up would be most welcome. 'Replication' comes to mind, but I think their data must remain up to date at all times.
Our office runs on a pretty large Access database (v2003). We are on a large hospital network and have about 15 users for our database. It tends to run VERY, VERY slow. Are there other options?
A DB is split (FE/BE) with several FE users and the BE sitting on a network. The FE is Access 2003 (runtime). The subform has its recordset type set to Snapshot.
Which of the following scenarios will perform fastest?
Scenario 1: The FE queries a linked table and displays the results on a subform (Datasheet format).

Scenario 2: The BE table is copied to the FE (new table), the query is run against the new table, and the results are displayed on a subform (Datasheet format).

The reason for this question is to attempt to reduce network traffic and further improve the speed of a split database.
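A sketch of Scenario 2, hedged since the table names are made up: pull the BE table across once per refresh, then let the subform's query hit the local copy. Whether this wins depends on how much of the table the query would otherwise drag across the wire:

Code:
Sub RefreshLocalCopy()
    ' Drop the stale copy if it exists, then re-create it from the
    ' linked backend table in a single pass.
    On Error Resume Next
    DoCmd.DeleteObject acTable, "tblLocalCopy"
    On Error GoTo 0
    CurrentDb.Execute _
        "SELECT * INTO tblLocalCopy FROM tblLinkedOrders;", dbFailOnError
End Sub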
I have a database that is split into a FE/BE, with the BE running on a server and users accessing it through a dial-up connection. This is working very slowly due to the fact that I have combo boxes that users select data from that are based on different tables, and every time you click on a combo box it takes several minutes to open, depending on the number of records. At first I thought that maybe converting to SQL Server would help solve this, but the more I read, the less I think that will do the trick. Could someone please advise me on the best solution here?
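One pattern that helps over slow links, sketched with hypothetical names: leave each combo's RowSource empty and fill it only when the control is first used, so nothing crosses the dial-up line until it's actually needed:

Code:
Private Sub cboCustomer_GotFocus()
    ' Populate on first use only; later clicks reuse the loaded list.
    If Me.cboCustomer.RowSource = "" Then
        Me.cboCustomer.RowSource = _
            "SELECT CustomerID, CustomerName FROM tblCustomers " & _
            "ORDER BY CustomerName;"
    End If
End Sub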
Hi, I am doing a Left Join to try to look up values in a large (about 100,000 records) table. If the value isn't found, I'm using the Nz function to supply a value.

This query runs very slowly (it takes about 2 minutes). I can understand why... I suppose that for every value it's trying to look up, it has to loop through all 100,000 records before it decides that it's not there.
So, I am just looking for ideas on how to make this run faster.
I do have indexes on all my join fields and criteria fields. Thanks for any suggestions.
Here's the SQL:
SELECT VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes, VM1a.repcode,
       VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
       CLng(Nz([FirstOfTruncatedOpSeq], 0)) AS ZOpSeq,
       Sum(VM1a.DefectQty) AS SumOfDefectQty
FROM VM1a LEFT JOIN BOMOutRefDesOnly
    ON (VM1a.RefDes = BOMOutRefDesOnly.RefDesOnly)
   AND (VM1a.BOM1 = BOMOutRefDesOnly.PCAItemNo)
GROUP BY VM1a.row, VM1a.Column, VM1a.Noun, VM1a.Rev, VM1a.RefDes, VM1a.repcode,
         VM1a.Cell, VM1a.InspPoint, VM1a.DefectType,
         CLng(Nz([FirstOfTruncatedOpSeq], 0));
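Hedged, since "indexes on all my join fields" may mean single-column indexes: this join matches on a *pair* of fields, and a single composite index covering both columns of the lookup table lets Jet do one seek per row instead of intersecting two separate indexes (the index name is hypothetical):

Code:
CREATE INDEX idxBOMLookup
ON BOMOutRefDesOnly (RefDesOnly, PCAItemNo);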
I have a query which returns each entrant with the speed figures for previous races in descending order. I wish my query to return the top 5 speed figures per entrant, or, if the entrant has fewer than 5 previous runs, to return all available data.
I am not VBA literate, so as simple as possible please, thanks.
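Happily this one needs no VBA at all; the usual Access pattern is a correlated TOP 5 subquery. A sketch with hypothetical table and field names (entrants with fewer than 5 runs come back in full automatically, though ties on the 5th figure can return extra rows):

Code:
SELECT r.Entrant, r.RaceDate, r.SpeedFigure
FROM tblRuns AS r
WHERE r.SpeedFigure In
      (SELECT TOP 5 r2.SpeedFigure
       FROM tblRuns AS r2
       WHERE r2.Entrant = r.Entrant
       ORDER BY r2.SpeedFigure DESC)
ORDER BY r.Entrant, r.SpeedFigure DESC;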
I've created an Access DB on a Citrix server which is multi-user, so it has been split and uses linked tables. It runs quite slowly, however. At the moment I don't have time to convert it to unbound forms, but I have read that one way to speed it up is to create a table in the back end, link it to the main DB, and then use OpenRecordset to keep the connection between the two open.
I know how to link the two, but can someone explain the OpenRecordset part, please? What do I have to do?
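As usually described (the table name here is hypothetical), you open a recordset on the linked backend table when the front end starts and simply never close it until exit, so the connection and lock-file handshake aren't re-established on every data access:

Code:
' In a standard module, so the recordset stays in scope:
Dim rsKeepAlive As DAO.Recordset

Sub OpenKeepAlive()        ' call from the startup form's Open event
    Set rsKeepAlive = CurrentDb.OpenRecordset("tblKeepAlive", dbOpenSnapshot)
End Sub

Sub CloseKeepAlive()       ' call when the application shuts down
    If Not rsKeepAlive Is Nothing Then rsKeepAlive.Close
    Set rsKeepAlive = Nothing
End Sub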
We have SPEED Ferret 4.1, but we've upgraded to Access 2003, which 4.1 does not support. Does anyone know if a new version is coming out? Does 2003 have its own enhanced find-and-replace functionality?
Hello, I have split a database using the Database Splitter wizard, but I still have network speed problems. What I am wondering is if anyone knows whether using an ODBC connection between the front and back ends is more efficient than file sharing across the network? Thanks for any information on this. Tim