Low memory when using XmlDocument

Posted 2019-08-16 19:16

I have a function that generates a dataset and creates an XML file from it. The function works perfectly. The problem is that I get an "Out of memory" error after running the report several times. When testing the report, I found that memory usage increases sharply when it reaches the XmlDocument code below. I tried forcing garbage collection, but it made no difference. Any suggestions?

if (ddlDir.SelectedItem != null && ddlSec.SelectedItem != null)
{
    using (DataSet dsClosedKPICalls = GetReportData())
    {
        dsClosedKPICalls.DataSetName = "ClosedKPICalls";
        foreach (DataTable table in dsClosedKPICalls.Tables)
        {
            table.TableName = "ServiceInfo";
        }
        XmlContent = dsClosedKPICalls.GetXml();
    }

    XmlDocument XML_Data = new XmlDocument();           // contains the resultant XML data

    XML_Data.LoadXml(XmlContent);
    XmlNodeList TablesList = XML_Data.SelectNodes("ClosedKPICalls/ServiceInfo");

    for (int i = 0; i < TablesList.Count; i++)
    {
        XmlDocument innerXML = new XmlDocument();

        using (DataSet dsTaskDetails = getSplitupRecords(TablesList.Item(i).SelectSingleNode("ServiceNo").InnerText))
        {
            dsTaskDetails.DataSetName = "TaskDetails";
            foreach (DataTable tbl in dsTaskDetails.Tables)
            {
                tbl.TableName = "RequestInfo";
            }

            innerXML.LoadXml(dsTaskDetails.GetXml());

            TablesList.Item(i).AppendChild(XML_Data.ImportNode(innerXML.SelectSingleNode("TaskDetails"), true));

            innerXML = null;
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
    }
}

1 Answer
Melony?
Answered 2019-08-16 20:09

How large are your XML documents? Specifically are they larger than about 85K?

The .NET Framework uses something called the Large Object Heap (LOH) for allocations larger than roughly 85K. If your XML documents are larger than this, the large objects involved (such as the strings returned by GetXml) will probably be allocated on the LOH. The problem with the LOH is that it is not compacted, so it is susceptible to memory fragmentation, which can in turn cause OutOfMemoryExceptions to be thrown even when the process should normally be able to satisfy the allocation. This may be the cause of your problem.
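If you are unsure whether your XML strings are landing on the LOH, one quick way to check is to ask the GC which generation a freshly allocated object is in; LOH objects are reported as generation 2. This is a minimal sketch, and the ~85,000-byte threshold is an implementation detail of the .NET Framework GC rather than a documented contract:

```csharp
using System;

class LohDemo
{
    static void Main()
    {
        // An array well under the ~85,000-byte threshold starts on the
        // small object heap, in generation 0.
        byte[] small = new byte[80 * 1024];

        // An array at or above the threshold is allocated directly on the
        // Large Object Heap, which the GC reports as generation 2.
        byte[] large = new byte[85 * 1024];

        Console.WriteLine(GC.GetGeneration(small)); // typically 0
        Console.WriteLine(GC.GetGeneration(large)); // typically 2 (LOH)
    }
}
```

You could apply the same check to the string returned by dsClosedKPICalls.GetXml() to confirm where it is being allocated.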

For solutions you can either try

  • Reduce the use of the LOH by keeping the size of your objects below 85K. This can be difficult to achieve depending on your application, but it may be possible, for example, to split your large XML documents into many smaller documents.
  • Try targeting .NET Framework 4.5 and see if that helps. It includes improvements to the LOH that may help in this case.
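As a related option, later framework versions expose a way to compact the LOH explicitly. The sketch below assumes .NET Framework 4.5.1 or newer, where the GCSettings.LargeObjectHeapCompactionMode property was introduced; it requests a one-off LOH compaction rather than changing the GC's behavior permanently:

```csharp
using System;
using System.Runtime;

class CompactLoh
{
    static void CompactLargeObjectHeap()
    {
        // Ask the GC to compact the Large Object Heap during the next
        // full blocking collection, then trigger that collection once.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
        // The setting automatically resets to Default after the compacting
        // collection has run.
    }
}
```

This is a targeted mitigation for fragmentation, not a substitute for reducing large allocations in the first place, so it is best called sparingly (for example, after a memory-heavy report finishes).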

As an aside, your calls to GC.Collect and GC.WaitForPendingFinalizers are a bad idea: forcing a collection can artificially promote lower-generation objects into higher generations and generally interferes with the garbage collector's own tuning. These calls will also almost certainly not help in this case. As a general rule, you should steer well clear of manually invoking a garbage collection unless you know exactly what you are doing.
