I am having serious trouble processing a large number of XML files with XmlDocument. The idea is to extract certain data from about 5000 .xml documents (approx. 20 MB each), save it in text format, and then import it into a MySQL DB. This task is supposed to run every day.
My problem is that after each XML file is processed, its memory is not released. The documents keep piling up until all the RAM is occupied and the application slows to a crawl (once the system starts paging to disk).
I am working with existing source code, so switching to other classes such as XmlReader is not an option; I am stuck with XmlDocument.
The XML loading function is called like this:
foreach (string s in xmlFileNames)
{
    i++;
    if (mytest.LoadXml(s))
        mytest.loadToExchangeTables();
}
The function looks like this:
public bool LoadXml(string fileName)
{
    XmlDocument myXml = new XmlDocument();
    myXml.Load(fileName);
    .............
    //searching for needed data
    .............
}
Any ideas what the problem might be? And why is garbage collection not happening?
Thank you very much in advance!
Try commenting out the part marked

// searching for needed data

and run the test again. It may be that you are not freeing something IDisposable (either with a using block or by calling Dispose() directly). The same goes for loadToExchangeTables().
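For illustration, here is a minimal sketch of that pattern, assuming loadToExchangeTables() writes the extracted values to a text file and then into MySQL via Connector/NET (the connection string, table, and column names below are hypothetical):

using System.IO;
using MySql.Data.MySqlClient; // assumes the MySQL Connector/NET package is referenced

public void LoadToExchangeTables(string outputPath, string extractedValue)
{
    // 'using' guarantees Dispose() runs even if an exception is thrown,
    // so file handles and connections are released immediately instead of
    // lingering until a garbage collection.
    using (StreamWriter writer = new StreamWriter(outputPath))
    {
        writer.WriteLine(extractedValue);
    }

    // Hypothetical connection string and table; shown only to illustrate
    // wrapping the connection and command in 'using' blocks.
    using (MySqlConnection connection = new MySqlConnection("server=localhost;database=exchange;uid=user;pwd=pass"))
    using (MySqlCommand command = new MySqlCommand("INSERT INTO exchange_data (value) VALUES (@v)", connection))
    {
        connection.Open();
        command.Parameters.AddWithValue("@v", extractedValue);
        command.ExecuteNonQuery();
    }
}

If memory still grows with the search part commented out, the leak is most likely inside loadToExchangeTables() itself.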