Can my Access 2010 VBA code be improved to run faster?

Posted 2019-08-02 18:00

Question:

I have an Access 2010 database stored on a network drive. It does the following:

  1. Pull a list of 250,000 parts (this is fine)
  2. Analyse the part numbers in VBA
  3. Save the results of the analysis to a 2D array
  4. Write the list back out to a table in the database for use by other databases

If I run this process with the database saved to my desktop, it takes about 50 seconds to complete.

If I run it with the database saved on the network drive, it takes about 300 times as long. The step that seems to take the longest is writing the results from the array back to the table:

Sub WritePMROutput()
    Status "Writing output to tblPmrParts"
    Set db = CurrentDb
    Dim rstOutput As DAO.Recordset
    Set rstOutput = db.OpenRecordset("tblPMRParts")

    db.Execute "DELETE tblPMRParts.* FROM tblPMRParts;" 'clear the table of all data before writing new data

    intPMR = 0
    Do Until intPMR = intArrPMRSze 'loop until the counter reaches the size of the array
        With rstOutput
            .AddNew 'new row in the output table
            !PMRID = arrPMR(intPMR, 0)
            !strayChars = arrPMR(intPMR, 1)
            !partno = arrPMR(intPMR, 2)
            !extension = arrPMR(intPMR, 3)
            'the If checks seem to be required here: assigning "" to a number field was causing data type errors
            If Not arrPMR(intPMR, 4) = "" Then !PartPatternID = arrPMR(intPMR, 4)
            If Not arrPMR(intPMR, 5) = "" Then !ExtPatternID = arrPMR(intPMR, 5)
            .Update
            intPMR = intPMR + 1
        End With
    Loop

    LogUpdate "PDM", "tblPMRParts"

End Sub

Can anyone suggest how I can improve on the efficiency of this operation? Or am I simply limited by the speed of my network?

Interestingly, I can copy the database to my desktop, run it, compact it, and copy it back to the network in about 5-10 mins.

Thanks in advance!

EDIT:

Following the very successful implementation of the fix suggested by Kim, I thought I would report back with the results. The situation was: on the local PC, the sub took about 60 seconds to run; across the network, it took about 15,000 seconds.

After implementing a simple transaction (begin at the start of the sub, commit at the end): on the local PC, 35 seconds (nearly twice as fast); across the network, 500 seconds (30 times faster!).

The only issue I had with the implementation was that the number of record locks required threw an error, but a quick Google search yielded a simple line of code to temporarily increase the record lock limit: DAO.DBEngine.SetOption dbMaxLocksPerFile, 300000. Hope this helps someone else in future!

Answer 1:

I refer to this source as it is explained pretty well:
http://support.microsoft.com/kb/146908

Example syntax is provided here too:

Private Sub Form_Load()
     Dim Starttime, Endtime
     Dim db As Database
     Dim t As Recordset
     Dim i As Integer
     Dim tempName As String
     Dim tempPhone As String
     Set db = Workspaces(0).OpenDatabase("c:\vb\BIBLIO.MDB") ' Uses a
      ' copy of BIBLIO.MDB.
     Set t = db.OpenRecordset("Publishers", dbOpenTable)
     Starttime = Now
     'BeginTrans  ' Add this and CommitTrans (below) for greater speed.
     For i = 1 To 100
        tempName = "testname" & Str$(i) ' Make an arbitrary unique
                                        '  string.
        tempPhone = Str$(i)             ' Make arbitrary number.
        t.AddNew ' AddNew clears copy buffer to prepare for new record.
        t!PubID = 30 + i  ' Set primary key to unique value.
        t!Name = tempName  ' Set Name field to unique value.
        t!Telephone = tempPhone  ' Set Telephone field to unique value.
        t.Update   ' Write the record to disk or to transaction buffer.
     Next i
     'CommitTrans  ' Add this and BeginTrans (above) for greater speed.
     Endtime = Now
     MsgBox "Time required= " & Format(Endtime - Starttime, "hh:mm:ss")
     t.Close
     db.Close
End Sub

You can use database transactions to reduce the number of read-writes to your table.
You perform your loops and logic in memory, but only commit your recordset to the database once.
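As a sketch of how this applies to the asker's sub: the whole array-to-table loop can be bracketed by BeginTrans/CommitTrans on the default workspace so the writes are flushed in one batch. The field list is abbreviated, and the dbMaxLocksPerFile value (taken from the asker's edit) is an assumption you may need to tune:

```
' Sketch only: wraps the array-to-table write in a single DAO transaction.
' Table/field names come from the question; the lock limit is an assumption.
Sub WritePMROutputTransacted()
    Dim ws As DAO.Workspace
    Dim db As DAO.Database
    Dim rstOutput As DAO.Recordset
    Dim intPMR As Long

    ' Raise the lock limit so one large transaction doesn't overflow it
    DAO.DBEngine.SetOption dbMaxLocksPerFile, 300000

    Set ws = DBEngine.Workspaces(0)
    Set db = CurrentDb

    ws.BeginTrans                        ' start buffering writes
    db.Execute "DELETE FROM tblPMRParts;", dbFailOnError
    Set rstOutput = db.OpenRecordset("tblPMRParts")
    For intPMR = 0 To intArrPMRSze - 1
        With rstOutput
            .AddNew
            !PMRID = arrPMR(intPMR, 0)
            ' ... remaining fields as in the original sub ...
            .Update
        End With
    Next intPMR
    rstOutput.Close
    ws.CommitTrans                       ' one flush to disk/network
End Sub
```

Note that the Execute and OpenRecordset calls must use the same workspace as the transaction (the default workspace here) for the BeginTrans/CommitTrans pair to cover them.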

Quote from the website:

If you do not use the BeginTrans and CommitTrans statements, this program reports 17 seconds to add 100 records on a 486/66 PC. When you add BeginTrans and CommitTrans as shown in the program comments above, the program takes less than 1 second on that same computer. Performance may vary on different computers.



Answer 2:

The problem with Access is that it's a Client-Side Database. Whenever Access runs a process against a table, it has to pull the entire contents of the table to your local space, perform the operation, and then push the results back out to the database. This is why most people end up opting to use SQL Server as a back-end, as it's Server-Side and can do processing (in the form of Stored Procedures and Views) on the server to minimize what you bring to the front-end.

The most efficient way to delete records in a table is to use a pre-constructed DELETE query. You may pick up a little time that way, as opposed to using a db.Execute statement. However, you can also try changing that statement to:

CurrentProject.Connection.Execute "DELETE * FROM tblPMRParts;", , , adCmdText
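As a sketch of the saved-query approach the answer describes, assuming a stored delete query has been created in the database (the name qryDeletePMRParts is hypothetical):

```
' Sketch: run a saved (pre-compiled) delete query instead of an ad-hoc SQL string.
' "qryDeletePMRParts" is a hypothetical saved query containing:
'   DELETE * FROM tblPMRParts;
CurrentDb.QueryDefs("qryDeletePMRParts").Execute dbFailOnError
```

Because a saved query's execution plan is compiled when the query is saved, Access can skip the parse/plan step that an ad-hoc SQL string incurs on every run.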