How do I avoid a memory leak with LINQ-To-SQL?

Published 2020-02-02 18:45

I have been having some issues with LINQ-To-SQL around memory usage. I'm using it in a Windows Service to do some processing, and I'm looping through a large amount of data that I'm pulling back from the context. Yes - I know I could do this with a stored procedure but there are reasons why that would be a less than ideal solution.

Anyway, what I basically see is that memory is not released even after I call context.SubmitChanges(). So I end up having to do all sorts of workarounds, like only pulling back 100 records at a time, or creating several contexts and having each one do a separate task. If I keep the same DataContext and use it later for other calls, it just eats up more and more memory. Even if I call Clear() on the "var tableRows" collection that the query returns, set it to null, and call System.GC.Collect(), it still doesn't release the memory.

Now I've read a bit about how you should create DataContexts and dispose of them quickly, but it seems like there ought to be a way to force the context to dump all of its data (or all of its tracking data for a particular table) at a certain point, to guarantee the memory is freed.

Anyone know what steps guarantee that the memory is released?

5 Answers
劳资没心,怎么记你
#2 · 2020-02-02 18:51

If you don't need object tracking, set DataContext.ObjectTrackingEnabled to false. If you do need it, you can use reflection to call the internal DataContext.ClearCache(), although you have to be aware that since it's internal, it is subject to disappear in a future version of the framework. And as far as I can tell, the framework itself doesn't call it, but it does clear the object cache.
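A minimal sketch of what that reflection call could look like, assuming the standard System.Data.Linq.DataContext. Since ClearCache is an internal, undocumented member, this is best-effort: the lookup may return null on framework versions where the method has been renamed or removed.

```csharp
using System.Data.Linq;
using System.Reflection;

static class DataContextExtensions
{
    // Invokes the internal DataContext.ClearCache() via reflection.
    // ClearCache is not part of the public API, so guard against it
    // being absent rather than assuming it exists.
    public static void ClearCache(this DataContext context)
    {
        MethodInfo clearCache = typeof(DataContext).GetMethod(
            "ClearCache",
            BindingFlags.Instance | BindingFlags.NonPublic);

        if (clearCache != null)
            clearCache.Invoke(context, null);
    }
}
```

With this extension in scope you could call `context.ClearCache()` after each `SubmitChanges()`, but creating a fresh context per batch remains the supported approach.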

Fickle 薄情
#3 · 2020-02-02 18:56

As David points out, you should dispose of the DataContext using a using block.

It seems that your primary concern is about creating and disposing a bunch of DataContext objects. This is how LINQ to SQL is designed: the DataContext is meant to have a short lifetime. Since you are pulling a lot of data from the DB, it makes sense that there will be a lot of memory usage. You are on the right track by processing your data in chunks.

Don't be afraid of creating a ton of DataContexts. They are designed to be used that way.
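A sketch of that pattern, with one short-lived context per batch so each context's tracked objects become collectible as soon as it is disposed. `MyDataContext`, the `Orders` table, and the batch size are illustrative assumptions, not taken from the original post.

```csharp
using System.Data.Linq;
using System.Linq;

static void ProcessInBatches()
{
    const int batchSize = 100;
    int skip = 0;

    while (true)
    {
        // A fresh context per batch: its identity map only ever holds
        // one batch worth of tracked objects.
        using (var context = new MyDataContext())
        {
            var batch = context.Orders
                               .OrderBy(o => o.Id) // stable order so paging is deterministic
                               .Skip(skip)
                               .Take(batchSize)
                               .ToList();

            if (batch.Count == 0)
                break;

            foreach (var order in batch)
            {
                // make changes to order
            }

            context.SubmitChanges();
            skip += batch.Count;
        } // context disposed here; its cached rows can be garbage collected
    }
}
```

Note the `using` sits inside the loop, which is the key difference from keeping one context alive for the whole run.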

Lonely孤独者°
#4 · 2020-02-02 19:06

Thanks guys - I will check out the ClearCache method. Just for clarification (for future readers), the situation in which I was seeing the memory usage was something like this:

using (DataContext context = new DataContext())
{
   int skipAmount = 0;
   while (true)
   {
      var rows = context.Tables
                        .Where(x => x.Dept == "Dept")
                        .Skip(skipAmount)
                        .Take(100)
                        .ToList();

      // break out of the loop when out of rows
      if (rows.Count == 0)
         break;

      foreach (Table t in rows)
      {
         // make changes to t
      }

      context.SubmitChanges();
      skipAmount += rows.Count;

      rows.Clear();
      rows = null;

      // At this point, even though the rows have been cleared and changes have
      // been submitted, the context is still holding onto a reference somewhere
      // to the tracked rows. So unless you create a new context, memory usage
      // keeps on growing.
   }
}
Rolldiameter
#5 · 2020-02-02 19:13

A DataContext tracks every object it has ever fetched, and it won't release those objects until it is itself garbage collected. Also, as it implements IDisposable, you must call Dispose or use the using statement.

This is the right way to go:

using (DataContext myDC = new DataContext())
{
   // Do stuff
} // DataContext is disposed
贼婆χ
#6 · 2020-02-02 19:13

I just ran into a similar problem. In my case, setting DataContext.ObjectTrackingEnabled to false helped. But it only works when iterating through the rows, as follows:

using (var db = new DataContext())
{
    db.ObjectTrackingEnabled = false;
    var documents = from d in db.GetTable<T>()
                     select d;
    foreach (var doc in documents)
    {
        ...
    }
}

If, for example, you call ToArray() or ToList() on the query, it has no effect: the whole result set is materialized in memory at once.
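To make the distinction concrete, here is a sketch of the two shapes side by side (`Document` is a hypothetical entity class, not from the original post):

```csharp
using (var db = new DataContext())
{
    db.ObjectTrackingEnabled = false;

    // Streams rows one at a time as the foreach advances; with tracking
    // off, the context keeps no reference to rows already processed.
    foreach (var doc in db.GetTable<Document>())
    {
        // process doc
    }

    // Buffers the entire result set into one in-memory list up front, so
    // peak memory is proportional to the table size regardless of the
    // ObjectTrackingEnabled setting.
    var all = db.GetTable<Document>().ToList();
}
```

The memory saving comes from streaming, so anything that forces full materialization gives it away.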
