Say we have a table with a large text field containing JPG files' binary data. The task is to get those files out of the database onto disk. So, at first I decided to do the following:
MyDataContext dc = new MyDataContext();
foreach (ImageTable t in dc.ImageTable.OrderBy(i => i.Id))
{
    using (StreamWriter writer = new StreamWriter(new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew), Encoding.GetEncoding(1251)))
    {
        writer.Write(t.Data);
    }
}
But as soon as the table had about 20 thousand rows, after a while I got an OutOfMemoryException.
In the end, to avoid loading all the rows into one data context, I did the following:
MyDataContext dc = new MyDataContext();
foreach (int id in dc.ImageTable.OrderBy(i => i.Id).Select(i => i.Id))
{
    // Load each row with its own short-lived context so the entity can be collected.
    using (MyDataContext _dc = new MyDataContext())
    {
        ImageTable t = _dc.ImageTable.FirstOrDefault(i => i.Id == id);
        using (StreamWriter writer = new StreamWriter(new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew), Encoding.GetEncoding(1251)))
        {
            writer.Write(t.Data);
        }
    }
}
So each row is loaded by a separate data context, and there is no memory problem any more! But surely it's not the best way to do the task.
Could anyone suggest something?
If it already works, resolving the memory issue with performance that fits the needs of your application, then it's a good solution.
If you're still not satisfied with the results, you may think about leaving LINQ to SQL and looking at raw SQL: use a SqlDataReader with a read-only, forward-only cursor to get maximum efficiency in the read operation; a sketch follows below. Hope this helps.
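A minimal sketch of that approach, assuming Data is a binary (varbinary/image) column; connectionString is a placeholder, and the ImageTable/Name/Data/Id names are carried over from the question:

// Requires: using System.Data; using System.Data.SqlClient; using System.IO;
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand("SELECT Name, Data FROM ImageTable ORDER BY Id", connection))
{
    connection.Open();
    // SequentialAccess streams large columns instead of buffering each row in memory;
    // columns must then be read in order (Name first, then Data).
    using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        byte[] buffer = new byte[8192];
        while (reader.Read())
        {
            string name = reader.GetString(0);
            using (FileStream file = new FileStream(string.Concat(name, ".jpg"), FileMode.CreateNew))
            {
                long offset = 0;
                long read;
                // GetBytes copies the next chunk of the column; it returns 0 at the end.
                while ((read = reader.GetBytes(1, offset, buffer, 0, buffer.Length)) > 0)
                {
                    file.Write(buffer, 0, (int)read);
                    offset += read;
                }
            }
        }
    }
}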
You could try switching off object tracking:
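A minimal sketch, reusing MyDataContext and the writing loop from the question; ObjectTrackingEnabled must be set before the first query runs, and it effectively makes the context read-only:

using (MyDataContext dc = new MyDataContext())
{
    // Must be set before any query executes; materialized entities are then
    // no longer cached by the context, so memory stays flat during the scan.
    dc.ObjectTrackingEnabled = false;
    foreach (ImageTable t in dc.ImageTable.OrderBy(i => i.Id))
    {
        using (StreamWriter writer = new StreamWriter(new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew), Encoding.GetEncoding(1251)))
        {
            writer.Write(t.Data);
        }
    }
}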