I have to generate about 800 Excel files from an Access database.
For the first 10-15 of them it works fine, a few seconds per Excel file, but it keeps getting slower: by the 150th file it takes 10 minutes.
Here is my code:
It does this for each nrliste in the Access table (about 800 of them):
Dim lista = From ls In Liste _
            Where ls!Concatenare = nrliste(i) _
            Select ls
Dim table = lista.CopyToDataTable
Dim DataArr(table.Rows.Count, 30)
For x = 0 To table.Rows.Count - 1
    For y = 0 To 30
        DataArr(x, y) = table.Rows(x).Item(y)
    Next
Next
Dim tempname As String = nrliste(i).ToString
Dim filename As String = "LISTA INV OBI(MF) LA 30.09.2009_" & tempname.Replace("#", "_")
Dim filetocopy As String = Environment.CurrentDirectory & "\MACHETA.xls"
Dim newcopy As String = FolderBD.SelectedPath & "\" & filename & ".xls"
'copy the template, overwriting any existing file
System.IO.File.Copy(filetocopy, newcopy, True)
'excel file
Dim xlWBook As Excel.Workbook = xlApp.Workbooks.Open(newcopy)
Dim xlSheet As Excel.Worksheet = CType(xlWBook.Worksheets("Lista inventar OBI de natura MF"), Excel.Worksheet)
'insert the required rows
For n = 11 To ((lista.Count - 1) + 11)
    With xlSheet
        .Rows(n).Insert(Excel.XlDirection.xlDown, 1)
    End With
Next
'copy the data
With xlSheet
.Range(.Cells(11, 1), .Cells(table.Rows.Count + 11, 31)).Value = DataArr
End With
Are you closing the workbook after you are done with it (inside the loop)?
Look at this thread, which explains the need to release all COM interfaces.
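In case it helps, here is a minimal sketch of the cleanup that should happen at the end of each loop iteration, assuming xlWBook and xlSheet are the variables from your code and a single xlApp instance is reused across iterations:

```vbnet
' End-of-iteration cleanup sketch (assumes Imports System.Runtime.InteropServices).
xlWBook.Close(SaveChanges:=True)

' Release the COM interfaces so Excel can free the underlying objects;
' without this, the runtime callable wrappers pile up and each
' iteration gets slower.
Marshal.ReleaseComObject(xlSheet)
Marshal.ReleaseComObject(xlWBook)
xlSheet = Nothing
xlWBook = Nothing

' After the whole loop, also shut down Excel itself:
' xlApp.Quit()
' Marshal.ReleaseComObject(xlApp)
```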
EDIT: I saw your comments in reply to @1800 Information.
What does this LINQ query do?
EDIT 2: Try running the SQL inside MS Access to see how it performs. Also, I suggest discarding LINQ for now and using a plain old ADO.NET Command object with a parameterized query.
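Roughly, the parameterized ADO.NET version could look like this; the connection string is a placeholder, and I'm assuming the table name (Liste) and column name (Concatenare) from your LINQ query:

```vbnet
Imports System.Data
Imports System.Data.OleDb

' Sketch only: connString is a placeholder for your Jet/ACE connection string.
Using conn As New OleDbConnection(connString)
    conn.Open()
    Using cmd As New OleDbCommand( _
        "SELECT * FROM Liste WHERE Concatenare = ?", conn)
        cmd.Parameters.AddWithValue("?", nrliste(i))
        Dim table As New DataTable()
        Using da As New OleDbDataAdapter(cmd)
            da.Fill(table)
        End Using
    End Using
End Using
```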
Alternatively, a simple (if crude) way would be to pull all records (where the ID is between your min and max values) into a DataTable, filter it in memory, and do some kind of direct transfer (without using an array, and avoiding writing values row by row, cell by cell).
I will try to find out if that is possible (i.e. using a filtered DataSet and writing it to the Excel file).
Hope that gives you some hints on how to proceed.
A quick look at your code makes me question the call to .Rows(n).Insert(Excel.XlDirection.xlDown, 1) for every row. You should be able to call Insert once for all the rows. Inserting rows into a worksheet is expensive even if you insert just one row, especially if you are inserting into a large worksheet or into a workbook with many formulas.
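For example, something like this (untested, assuming the data should start at row 11 as in your code) shifts everything down in a single operation:

```vbnet
' Insert all lista.Count rows in one call instead of
' one Insert call per row.
With xlSheet
    .Range(.Rows(11), .Rows(11 + lista.Count - 1)) _
        .Insert(Excel.XlInsertShiftDirection.xlShiftDown)
End With
```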
SpreadsheetGear for .NET typically speeds up applications such as yours (you can see some quotes confirming this here, on the right-hand side of the page). SpreadsheetGear also has an IRange.CopyFromDataTable method, so you will not have to copy the data to a temporary array. The SpreadsheetGear API is similar to Excel's API, so converting your code is straightforward. You can download a free trial here if you want to try it out.
Disclaimer: I own SpreadsheetGear LLC
You could try using DoCmd.TransferSpreadsheet, as this should be faster.
Then you can always open the file using Excel automation afterwards.
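Something along these lines, driving Access through automation; the query name ("qryLista") is a placeholder for a saved query that selects one list's rows:

```vbnet
' Sketch: export a saved Access query straight to an .xls file.
' accApp is assumed to be an Access.Application instance, and
' "qryLista" is hypothetical.
accApp.DoCmd.TransferSpreadsheet( _
    Access.AcDataTransfer.acExport, _
    Access.AcSpreadSheetType.acSpreadsheetTypeExcel9, _
    "qryLista", newcopy, True)
```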
I don't know, but I would look at how many files you end up having open in Excel at the same time. Do you close the files once you have finished writing to them? Maybe it is keeping them all open, so by the time the 150th worksheet is open it could be struggling for memory. Also, try stepping through the code in a debugger to see which bit is slow (or getting slower over time); this will help narrow down the cause of the problem.
Instead of inserting rows one at a time, I'd try CopyFromRecordset to write an entire recordset at once. You'd have to substantially rework your logic, of course.
But more importantly, where are you closing the Excel workbook object once you're done with it?
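The CopyFromRecordset approach would look roughly like this; note it expects a classic ADO recordset, so rs here is an assumed ADODB.Recordset holding the rows for one list, not an ADO.NET DataTable:

```vbnet
' Sketch: dump an entire ADO recordset into the sheet,
' starting at cell A11, in a single call.
xlSheet.Range("A11").CopyFromRecordset(rs)
```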