I'm running this query off a table of about 1.5 million records. If I leave the date parameter off, it comes back in under a minute. When I add it back, it takes 5 minutes before the query even starts running. All the fields used in filtering the query are indexed. Any ideas to speed this up?
SELECT CallsEntered.[Work Order Nbr], CallsEntered.[Date Entered], CallsEntered.[Time Entered],
       CallsEntered.[Primary Locator Code] AS [ASC], CallsEntered.Headend, CallsEntered.Node,
       CallsEntered.[Grid Id], CallsEntered.[Q Code], CallsEntered.[Problem Code 01],
       CallsEntered.[Primary Finding Code], CallsEntered.[Primary Solution Code],
       CallsEntered.[Cancel Code], CallsEntered.[Scheduled Date], CallsEntered.[Wo Status],
       CallsEntered.[Date CheckIn], CallsEntered.[Assigned Installer], Calendar.Week, Calendar.Year
FROM Calendar INNER JOIN CallsEntered ON Calendar.Date = CallsEntered.[Date Entered]
WHERE ((CallsEntered.[Date Entered] >= Forms!frmServiceCalls!txtStartDate Or Forms!frmServiceCalls!txtStartDate Is Null) = True)
  AND ((CallsEntered.[Date Entered] <= Forms!frmServiceCalls!txtEndDate Or Forms!frmServiceCalls!txtEndDate Is Null) = True)
  AND ((CallsEntered.Node = Forms!frmServiceCalls!txtnode Or Forms!frmServiceCalls!txtnode Is Null) = True)
  AND ((CallsEntered.[Primary Locator Code] = Forms!frmServiceCalls!cboASC Or Forms!frmServiceCalls!cboASC Is Null) = True)
  AND ((CallsEntered.[Q Code] = Forms!frmServiceCalls!cboQCode Or Forms!frmServiceCalls!cboQCode Is Null) = True);
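The usual fix for this pattern (a sketch, not gospel: the control and field names below are copied from the SQL above, but check the data types before borrowing the quoting) is to assemble the WHERE clause in VBA and skip any blank controls. Jet can't use an index on (field = param Or param Is Null) = True, but it can on the plain comparisons this produces:

Code:
Private Sub cmdRunQuery_Click()
    ' Build only the conditions the user actually filled in, so the
    ' optimizer sees plain, indexable comparisons.
    Dim strWhere As String, strSQL As String

    If Not IsNull(Me.txtStartDate) Then
        strWhere = strWhere & " AND CallsEntered.[Date Entered] >= #" & Format(Me.txtStartDate, "mm\/dd\/yyyy") & "#"
    End If
    If Not IsNull(Me.txtEndDate) Then
        strWhere = strWhere & " AND CallsEntered.[Date Entered] <= #" & Format(Me.txtEndDate, "mm\/dd\/yyyy") & "#"
    End If
    If Not IsNull(Me.txtnode) Then
        strWhere = strWhere & " AND CallsEntered.Node = '" & Me.txtnode & "'"
    End If
    If Not IsNull(Me.cboASC) Then
        strWhere = strWhere & " AND CallsEntered.[Primary Locator Code] = '" & Me.cboASC & "'"
    End If
    If Not IsNull(Me.cboQCode) Then
        strWhere = strWhere & " AND CallsEntered.[Q Code] = '" & Me.cboQCode & "'"
    End If

    If Len(strWhere) > 0 Then strWhere = " WHERE " & Mid(strWhere, 6)

    ' Quoting above assumes text fields - drop the quotes for numeric ones.
    strSQL = "SELECT CallsEntered.*, Calendar.Week, Calendar.Year " & _
             "FROM Calendar INNER JOIN CallsEntered " & _
             "ON Calendar.[Date] = CallsEntered.[Date Entered]" & strWhere & ";"
    Me.RecordSource = strSQL   ' or write it into the saved query the report uses
End Sub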
I am truly stumped. I have a query that sometimes takes less than 1 second to run, and sometimes takes close to 5 minutes. I am making NO changes to the query. I first thought it was a network issue, so I moved the whole db, both front-end and back-end, offline. No change. So I thought maybe it was a linked table issue, so I moved all the tables used by the query to the front end. No change. Every once in a while, the query runs at lightning speed, less than 1 second. Most of the time, it takes 3-4 minutes, or longer.
Here's the SQL:

SELECT PCAInternalPareto1a.row, PCAInternalPareto1a.column, PCAInternalPareto1a.Noun,
       PCAInternalPareto1a.Rev, PCAInternalPareto1a.InspPoint, PCAInternalPareto1a.cell,
       PCAInternalPareto1a.DefectType, CLng(Nz([FirstOfTruncatedOpSeq],0)) AS ZOpseq,
       Sum(PCAInternalPareto1a.DefectQty) AS SumOfDefectQty,
       PCAInternalPareto1a.poo, PCAInternalPareto1a.repcode
FROM PCAInternalPareto1a LEFT JOIN BOMOutRefDesOnly
     ON (PCAInternalPareto1a.RefDes = BOMOutRefDesOnly.RefDesOnly)
    AND (PCAInternalPareto1a.partno = BOMOutRefDesOnly.PCAItemNo)
WHERE ((Nz([FirstOfComponentItemNo],0) Like "*"))
GROUP BY PCAInternalPareto1a.row, PCAInternalPareto1a.column, PCAInternalPareto1a.Noun,
         PCAInternalPareto1a.Rev, PCAInternalPareto1a.InspPoint, PCAInternalPareto1a.cell,
         PCAInternalPareto1a.DefectType, CLng(Nz([FirstOfTruncatedOpSeq],0)),
         PCAInternalPareto1a.poo, PCAInternalPareto1a.repcode;
PCAInternalPareto1a is itself a query, which always runs fast. BOMOutRefDesOnly is a table with about 84,000 records. I have indexes on the joined fields.
I know the WHERE clause looks useless, but I have my reasons for doing it this way. I tried deleting the WHERE clause and it ran really fast. Then I re-added it, and it still ran really fast. Then slow again. I can't find any relationship between anything I've tried and the speed of the query! What's going on?
I know this is probably the dumbest thing to ask, but I am just not getting it. I'm trying to run a query (I assume it's a SELECT query) for Last Name, First Name, SSN. The table containing this information [tbl_Employees] automatically places the dashes in the SSN through an input mask. However, for one type of report I will later need to generate, the SSN cannot contain the dashes. What is the criteria for this?!? I am so lost! I know it's probably something so simple that I'm just overlooking it, but I'm clueless. Can someone help me out? Thanks so much! :)
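Assuming the dashes are really stored in the field (an input mask only stores its literal characters when it was saved with the ;0 option; otherwise the data never had dashes to begin with), a calculated column strips them. A sketch, using Replace(); in Access 2000 you may need to wrap Replace() in a one-line VBA function before queries will accept it:

SELECT [Last Name], [First Name], Replace([SSN], "-", "") AS SSNNoDashes
FROM tbl_Employees;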
...where each number comes from a query field. Can I add, say, Migrated and Excluded within the report, and possibly other calculations that I may wish to include?
I am trying to create a query that takes values from a search form and provides records. I was having issues with getting results when some boxes on the form were left blank. I found a solution to that and it worked with a small number of fields. However, when I make the full form query (about 8-10 fields) and run it, Access says the query is too complex. I wrote the SQL by hand as I could not get the designer to do what I wanted. Attached is the SQL that works and the SQL that is "too complex".
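The "blank box" trick (every field wrapped in an Or Is Null test) multiplies the number of conditions Jet has to expand, which is exactly what trips the "too complex" limit at 8-10 fields. Building the criteria in code and skipping blank boxes keeps the WHERE clause short. A sketch; the control, field and subform names are placeholders:

Code:
Private Sub cmdSearch_Click()
    ' Pair each search box with its field, then only add criteria
    ' for the boxes the user filled in.
    Dim ctlNames As Variant, fldNames As Variant
    Dim i As Integer, strWhere As String

    ctlNames = Array("txtLastName", "txtDept", "txtCity")   ' ...up to 10
    fldNames = Array("[LastName]", "[Dept]", "[City]")

    For i = LBound(ctlNames) To UBound(ctlNames)
        If Not IsNull(Me(ctlNames(i))) Then
            strWhere = strWhere & " AND " & fldNames(i) & " = '" & Me(ctlNames(i)) & "'"
        End If
    Next i

    If Len(strWhere) > 0 Then strWhere = Mid(strWhere, 6)   ' strip leading " AND "
    Me.sbfResults.Form.Filter = strWhere                    ' subform showing results
    Me.sbfResults.Form.FilterOn = (Len(strWhere) > 0)
End Sub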
I have a number of queries which feed two or three Union queries looking at 35,000 records, and when you open the Union queries or run reports it takes forever for them to open.
I have indexed all the common fields to see if this speeds the queries up, but they're still slow. The data they look at is in two different tables, which are linked into the front end where the queries are!
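One thing worth trying before anything else: plain UNION has to sort and de-duplicate all 35,000 rows every time it opens. If the member queries can't produce duplicate rows (or duplicates don't matter), UNION ALL skips that whole step. A sketch with placeholder names:

SELECT Field1, Field2 FROM qryNorth
UNION ALL
SELECT Field1, Field2 FROM qrySouth;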
I query against a table with 380K records and growing.
There are approx 14 fields in the table, but I only retrieve 7 in my query. Does having those extra fields in there slow the query down, or does the query ignore them?
I have a table with 7 million records. On my continuous form, I have been using a right mouse click and entering a parameter, e.g. *my search term*, to filter, and this takes some time before the results are shown. This gives a reduced list of, say, 1,000 records. But then, when I click in the column and try to sort, it takes ages to sort the records.
So my problem is twofold: firstly, it takes some time for the first filter to work; secondly, it takes time to sort.
Should I be doing this in a different way? Any other tips for filtering and sorting faster?
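Two things usually dominate here. A filter like *my search term* with a leading wildcard can't use any index, so Jet reads all 7 million rows; and sorting afterwards re-sorts the fetched result. Where the term can be anchored at the start (term*), putting both the filter and the sort into the form's RecordSource lets the index do the work once. A sketch; the table, control and field names are placeholders:

Code:
' Apply filter and sort together, so Jet plans both in one pass.
Dim strSQL As String
strSQL = "SELECT * FROM tblBig " & _
         "WHERE [SearchField] Like '" & Me.txtTerm & "*' " & _
         "ORDER BY [SortField];"
Me.RecordSource = strSQL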
I'm not sure how well I've managed to search on this as I'm not too sure where to start!
I have an append query as follows:
INSERT INTO tbl_Employee ( Company_No ) SELECT tbl_Co_Data.Company_No FROM tbl_Co_Data WHERE (((tbl_Co_Data.Company_No) Not In (select Company_No from tbl_Employee))) ORDER BY tbl_Co_Data.Company_No;
Basically this query is run a number of times a day and appends new company numbers to a table, 'tbl_Employee'. It's badly named - it's not got much to do with employees. Anyway, it takes a good 3 minutes to run with about 20k records in tbl_Co_Data and probably 18k records in tbl_Employee.
It looks to me like it's looping through each record in one table for each record in the other - which is plain daft.
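That hunch is right: Jet re-evaluates a Not In subquery for each of the 20k rows. The usual rewrite is the unmatched ("frustrated") outer join, which can use the indexes on Company_No; the ORDER BY can go too, since an INSERT doesn't preserve order anyway. Worth benchmarking before reaching for DTS:

INSERT INTO tbl_Employee ( Company_No )
SELECT tbl_Co_Data.Company_No
FROM tbl_Co_Data LEFT JOIN tbl_Employee
    ON tbl_Co_Data.Company_No = tbl_Employee.Company_No
WHERE tbl_Employee.Company_No Is Null;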
I'm currently experimenting with a DTS package that puts tbl_Co_Data into SQL Server first, before the query would run (tbl_Employee is already there), with a view to running a stored procedure and ditching the query.
Does anybody have any other ideas as I'm having problems with the DTS in that it appears to be pretty slow in itself!
I have a (quite) specific question, but I think it covers something I simply cannot answer.
I have three UPDATE queries running on linked tables in Microsoft Access (2000/XP).
My main data table (the one to be updated) has almost 1 million records.
My three information tables ALL have primary keys (which are used to link the main table) and vary in size
I have attached the three UPDATE queries plus descriptions of the field names used.
Table            Records    Time
Main DataTable   900,000    -
Mask nomenk      130        2 hours
Mask media       900        15 minutes
Mask brand       4,000      ?????
Query A
UPDATE [Main DataTable] AS z INNER JOIN [Mask nomenk] AS mn
    ON (z.nomCode1 = mn.nomCode1) AND (z.nomCode2 = mn.nomCode2) AND (z.nomCode3 = mn.nomCode3) AND (z.nomCode4 = mn.nomCode4)
SET z.NomenkMask1 = mn!NomenkMask1;
Query B
UPDATE [Main DataTable] AS z INNER JOIN [Mask media] AS mm
    ON (z.couCode = mm.couCode) AND (z.nomCode1 = mm.nomCode1) AND (z.pubCode = mm.pubCode)
SET z.MediaMask1 = mm!MediaMask1;

Query C
UPDATE [Main DataTable] AS z INNER JOIN [Mask brand] AS mb
    ON (z.couCode = mb.couCode) AND (z.nomCode1 = mb.nomCode1) AND (z.brCode1 = mb.brCode1) AND (z.brCode2 = mb.brCode2)
SET z.BrandMask1 = mb!BrandMask1;
My problem, quite simply, is the speed involved in running these queries. I know that query B is the quickest, with query A a distant second (I could not even complete the run of query C, and killed it after 6 hours).
What I need to know is WHY query C is soooo much slower than query B, when the only real difference I can see between them is that query C joins on one extra field.
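One thing to check: separate single-field indexes on couCode, nomCode1, brCode1 and brCode2 don't give Jet one index that covers the whole four-field join, so it may fall back to scanning combinations. A composite index on the mask table over exactly the join fields is cheap to try (a sketch; names taken from Query C, and the same treatment on [Main DataTable]'s side of each join is worth testing too):

CREATE INDEX idxBrandJoin
ON [Mask brand] (couCode, nomCode1, brCode1, brCode2);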
The guy who created this cluster of a database no longer works here. There is a section in it where a user selects an option (0-30) and then clicks a button. When that button is clicked, it runs a series of queries based on the selection. Each selection takes 90 minutes to update. I am looking to see if there is any way to do the same in less time. Additionally, I am having to select 1, click the button, wait 90 minutes; select 2, click the button, wait 90 minutes; select 3, click the button, wait 90 minutes... and so on. So I am also trying to come up with a way to have a button that will do selections 1-10 all in one run (see the sketch after the code below).
Code:
Private Sub btn_download_Click()
On Error GoTo Err_btn_download_Click
    Dim db As Database, qwo As QueryDef, rs As Recordset, x, numrec As Long, rt As Recordset, rwo As Recordset
    Dim stDocName As String, i As Long, rwot As Recordset, qwot As QueryDef, qs As QueryDef, xcount As Long, xmsg As Integer
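For the run-all-ten part, a sketch of one way to wrap the existing routine; fraSelection is a placeholder for whatever control holds the 0-30 selection the queries key on:

Code:
Private Sub btn_download_all_Click()
    ' Run the existing download routine once per selection.
    Dim i As Long
    For i = 1 To 10
        Me.fraSelection = i      ' set the selection the queries read
        btn_download_Click       ' reuse the existing routine as-is
    Next i
End Sub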
Amongst many other things I would like to be able to see which teams in our company have access to which database. We currently use up to seven!!! different databases, which makes for tremendous fun... just kidding.
Since every Team has access to more than one Database, and every Database obviously gets used by a multitude of Teams, I thought the best way to go would be a junction table. Whilst building this, however, it occurred to me that I could have just created a field for each Database in the Team table, which could then be ticked.
So here's my question: is it more space-consuming to create a junction table? It certainly appears to be more work right now...
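For what it's worth, the junction table is the usual answer: one small row per Team/Database pair instead of seven yes/no columns per team, and Database number eight becomes a new record rather than a table redesign. A sketch of the DDL, assuming key fields named TeamID and DatabaseID in tblTeams and tblDatabases:

CREATE TABLE tblTeamDatabase (
    TeamID LONG NOT NULL,
    DatabaseID LONG NOT NULL,
    CONSTRAINT pkTeamDatabase PRIMARY KEY (TeamID, DatabaseID),
    CONSTRAINT fkTeam FOREIGN KEY (TeamID) REFERENCES tblTeams (TeamID),
    CONSTRAINT fkDatabase FOREIGN KEY (DatabaseID) REFERENCES tblDatabases (DatabaseID)
);

Space-wise it's roughly a wash at this scale; the real win is that "which teams use Database X" becomes a simple query.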
I have 3 PC's on my Network accessing a back end which is sitting on the main PC.
i.e.
Main PC has Front End and Back End
PC 2 has Front End and is linked to Main PC's Back End
PC 3 has Front End and is linked to Main PC's Back End
PC 2 and PC 3 take about 5 minutes to load the initial main form. Once it is loaded, the performance is OK. I have a Gigabit network, so I know it's not just a slow network.
I have had a look at many other threads and also read http://www.granite.ab.ca/access/performancefaq.htm but nothing has helped.
Any ideas about settings, say on the Options page, that I may have set wrong?
FYI I have Default Open mode set to shared, Default record locking set to No Locks, and Open databases using record-level locking ticked. Not sure if these are correct or if there are other settings to consider here.
I have also found that once you open the initial form, it loads quickly if you close it and open it again; but if you close the database front end and re-open it, it takes 5 minutes again.
The only answer I have at the moment is to never shut the darn thing...!
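Oddly enough, "never shut the darn thing" is close to the standard workaround: hold one connection to the back end open for the whole session, so the lock-file handshake on the shared folder happens once instead of on every form load. A sketch; tblAnyBackendTable stands in for any small linked table:

Code:
' In a standard module. Call OpenPersistentConnection once, from the
' startup form's Load event, and just leave the recordset open.
Private mrsPersist As DAO.Recordset

Public Sub OpenPersistentConnection()
    ' Any linked back-end table will do; the point is keeping the
    ' connection (and the .ldb handshake) alive between form loads.
    Set mrsPersist = CurrentDb.OpenRecordset("tblAnyBackendTable", dbOpenSnapshot)
End Sub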
We have a split database with the front end on 3 user PCs and the back end on the network.
I have the only full version of Access on my PC, for editing the front end, and the others have Access Runtime to use the database.
Problem: since I split the database this way, it takes horrendously long to make any modifications to the front-end design of queries, forms or reports. I mean, when I go to save changes on a form it takes about 10 mins to do it. Likewise, design modifications to a form are very slow, e.g. click a field to move it and there's a delay of 5 seconds before the field is selected, then another 5 seconds to respond to the next mouse move. Frustrating.
Can anyone shed any light on the reason for this and/or suggest a solution?
BTW, it doesn't seem to make a difference whether the other users have the database open or not.
I have a form that takes a long time to open – around a minute and sometimes longer. The form is based on a query, and this query on its own opens almost right away. Why would a form take so long to load up? What factors can increase a form’s load time?
I have the following problem: I have a form in continuous mode. Users are supposed to filter data in it to their liking using the built-in filters in Access. After that there is a button that should calculate total weights in the filtered records (at this point; I'll add more calculations once I have solved this issue). The code of the button is as follows:
Code: Private Sub btnCalculateWeight_Click()
Dim rst As Recordset
Dim weight As Long
Dim material As Long
Dim subtype As Long
Dim locus As Long
[Code] ....
The loop goes through the recordset the correct number of times, but for some unknown reason it takes the value only from the first record and sums it N times, where N is the number of records in the recordset. The recordset seems to be correctly assigned, according to the message boxes I added to pinpoint the problem. PotteryWeights is a custom function I made and it works properly.
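That symptom is the classic signature of a loop whose counter advances while the recordset cursor never moves - usually a missing rst.MoveNext. A sketch of the shape that works, looping over the form's filtered rows; the argument list passed to PotteryWeights is a guess, so match it to your function:

Code:
Private Sub btnCalculateWeight_Click()
    Dim rst As DAO.Recordset
    Dim lngTotal As Long

    ' RecordsetClone reflects exactly the rows the user has filtered to
    Set rst = Me.RecordsetClone
    Do Until rst.EOF
        ' Argument list is a guess - match PotteryWeights' real signature
        lngTotal = lngTotal + PotteryWeights(rst!material, rst!subtype, rst!locus)
        rst.MoveNext   ' without this, every pass re-reads the first record
    Loop
    MsgBox "Total weight: " & lngTotal
End Sub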
I have a form that I created that has a tab control. When the form loads, it automatically sets the focus to a field in the subform. This wouldn't be a problem, except that when it does, it hides the tabs at the top of the form, so I have to scroll back up every time I click a tab in order to see the tabs. Is there a way to adjust this? I have tried hiding a field and setting the focus to it, and that didn't work. I also tried setting the focus to the actual tab, but that also did not work.
I'm experimenting with unbound object frames to embed an Excel sheet in Access. If I have an Excel application already open and busy, and I then open my form, Access freezes completely and won't do anything. My users will probably be running Excel VBA programs when they attempt to view this form; if there is no workaround I'll have to drop the idea of embedding Excel in Access.
Whenever an "external" thing wants to use Excel (opening a file in Explorer, etc.), it uses the very first Excel application open on the computer at that moment, instead of creating its own instance, or at least checking how busy the open application is. Because of that, I think that even loading the preview in the Access form will try to use the current Excel process, even though it's completely busy.
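If the embedded frame has to go, automation from VBA can sidestep the busy-instance problem, because New Excel.Application (or CreateObject) always starts a private Excel process instead of attaching to the one your users have tied up. A sketch:

Code:
Private Sub ShowWorkbook()
    ' New Excel.Application starts a private instance, so the user's
    ' busy Excel session is never touched. Needs a reference to the
    ' Microsoft Excel object library (or use CreateObject instead).
    Dim xlApp As Excel.Application
    Set xlApp = New Excel.Application
    xlApp.Workbooks.Open "C:\Data\Embedded.xls"   ' path is a placeholder
    xlApp.Visible = True
End Sub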
I have a growing Access database in a multi-user environment over a 10/100Mb network. The database is all in one file at the moment, on a shared directory of our XP-Pro 'Server', and the workstations have a mapped drive to it and are W98SE machines. All the machines are 1.2Ghz Fujitsu Siemens machines.
It is still under development, but is also in constant use, and I therefore have to develop on a copy, then get everyone out so I can copy in the changes. I would love it to be a client/server setup and split the db into a tables-only back end on the server and programs on the client, but when I tried, the result was a dramatic slow-down in the system... it became unusable.
I do have a budget for this, and could get a proper 'server' or maybe an Ethernet disk, but what is the best config for speed and admin purposes? Anyone doing something similar?
I noticed something strange in Access 2000: sometimes it takes a long time to calculate a report and other times it goes rapidly. I don't see any process taking a lot of CPU %.
When I do the same thing in Access 2003, it goes rapidly every time.
Which is faster: placing a calculation ItemNumber: IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) in a query, or placing =IIf([MANITEMNO]<>" ",[MANITEMNO],[ITEM]) as a Control Source in a text box on a form or report? :confused:
I was brought up to believe in Santa Claus and that DLookups were slow. I browsed last night and found many threads mentioning speed issues with DLookups.
My latest project consists of upgrading a database written at a client site that is chock full of (you guessed it) DLookups. My normal inclination was to replace them with parameter queries. But before I did, I ran some benchmarks of DLookup vs parameter query speed. In all cases (table1 = 11 records, table2 = 1,143 records) the DLookup was faster.
My test was to look up a field based on the recordset's primary key. In table2 I tested a record midway down and the last record. DLookup is the winner, hands down.
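For anyone who wants to repeat the comparison, the harness needs nothing fancier than Timer; a sketch, with the field name, key value and saved query invented for illustration:

Code:
Dim sngStart As Single, vResult As Variant, i As Long
Dim qdf As DAO.QueryDef, rs As DAO.Recordset

sngStart = Timer
For i = 1 To 1000
    vResult = DLookup("[SomeField]", "table2", "[ID] = 1000")
Next i
Debug.Print "DLookup:", Timer - sngStart

' qryLookup: SELECT SomeField FROM table2 WHERE ID = [pID]
Set qdf = CurrentDb.QueryDefs("qryLookup")
sngStart = Timer
For i = 1 To 1000
    qdf.Parameters("pID") = 1000
    Set rs = qdf.OpenRecordset(dbOpenSnapshot)
    vResult = rs![SomeField]
    rs.Close
Next i
Debug.Print "Parameter query:", Timer - sngStart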
Should I also stop believing in Santa Claus? Too bad, because I have a new notebook on my wish list.
Can anyone offer suggestions as to why, when I split my db, place the back end on the server, and open Form2, my front end grows from 2.25MB to 3.95MB?
I was using macros to filter the data for each type of project and that does not affect the db size nearly as much as this does.
I am using the following code on the On_Click event of the "View Projects" button on Form2 to build the criteria for the records to appear on my "Projects" form.
Code:
Private Sub cmdOK_Click()
    Dim varItem As Variant
    Dim strSubtype As String
    Dim strStatus As String
    Dim strSubtypeCondition As String
    Dim strSQL As String
    Dim strSortOrder As String
    Dim cat As New ADOX.Catalog
    Dim cmd As ADODB.Command

    For Each varItem In Me.lstSubtype.ItemsSelected
        strSubtype = strSubtype & "," & Me.lstSubtype.ItemData(varItem)
    Next varItem
    If Len(strSubtype) = 0 Then
        strSubtype = "Like '*'"
    Else
        strSubtype = Right(strSubtype, Len(strSubtype) - 1)
        strSubtype = "IN(" & strSubtype & ")"
    End If

    For Each varItem In Me.lstStatus.ItemsSelected
        strStatus = strStatus & ",'" & Me.lstStatus.ItemData(varItem) & "'"
    Next varItem
    If Len(strStatus) = 0 Then
        strStatus = "Like '*'"
    Else
        strStatus = Right(strStatus, Len(strStatus) - 1)
        strStatus = "IN(" & strStatus & ")"
    End If

    If Me.optAnd.Value = True Then
        strSubtypeCondition = " AND "
    Else
        strSubtypeCondition = " OR "
    End If

    'Build the sort order
    If Me.cboSortOrder1.Value <> "Not Sorted" Then
        strSortOrder = " ORDER BY tblProjectDetails.[" & Me.cboSortOrder1.Value & "]"
    Else
        strSortOrder = ""
    End If
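Given the ADOX.Catalog and ADODB.Command declarations, one likely culprit for the growth: if the rest of the routine creates (or deletes and recreates) a saved query through the catalog on every click, Jet doesn't reclaim that space until a compact. Rewriting the SQL of a single existing saved query avoids creating objects at all; a sketch, assuming a saved query named qryProjects that the Projects form is bound to:

Code:
' Reuse one saved query instead of appending a new view per click;
' assigning .SQL rewrites the definition in place.
CurrentDb.QueryDefs("qryProjects").SQL = strSQL & strSortOrder
DoCmd.OpenForm "Projects"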
I am making an internet game. It stores quite a few variables in the database for every registered player.
On nearly all pages the database is accessed, modified and closed again. Let's say each player has 20-25 variables stored in the database. Would this cause speed problems? Any ideas to solve this?