Parallel.ForEach - System Out of Memory Exception

Posted 2019-08-18 03:12

Question:

I have a problem with my website crawler: I get a System.OutOfMemoryException after it crawls around 700 URLs. Memory usage rises from the start, and at some point the program just stops.

It is a console application written in C#.

I think the problem is that I instantiate 6 new objects in every iteration of the loop. Then I go through them, get their property values with reflection, and create the final object that I save to the DB.
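To illustrate that step, the mapping looks roughly like the sketch below. Company matches the type in my code; FinalRecord and the properties are placeholders, not the real code:

    using System.Reflection;

    // Rough sketch only: "FinalRecord" and the property names are
    // placeholders for the real object that gets saved to the DB.
    class Company { public string Name { get; set; } }
    class FinalRecord { public string Name { get; set; } }

    static class Mapper
    {
        public static FinalRecord BuildRecord(Company parsed)
        {
            var record = new FinalRecord();
            // Read every public property of the parsed Company via reflection.
            foreach (PropertyInfo prop in typeof(Company).GetProperties())
            {
                object value = prop.GetValue(parsed);
                // ... copy value onto the record that is saved to the DB
            }
            return record;
        }
    }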

I expected .NET to destroy those objects once they are no longer used, but that is not the case. What are my options? Is BackgroundWorker any better?
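My expectation was roughly what this standalone sketch shows. Company here is just a stub; GC.GetTotalMemory(true) forces a full collection before measuring, so it reports only memory that is still reachable:

    using System;

    class Company { }  // stub for illustration

    class GcSketch
    {
        static void Main()
        {
            // Objects created in a loop with no surviving references become
            // eligible for collection; the GC reclaims them on its own schedule.
            for (int i = 0; i < 1_000_000; i++)
            {
                var c = new Company(); // no reference survives the iteration
            }

            long bytes = GC.GetTotalMemory(forceFullCollection: true);
            Console.WriteLine($"Managed heap after collection: {bytes} bytes");
        }
    }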

My code is something like this:

    Parallel.ForEach(Globals.Urls, url =>
    {
        progCtrl.indexIsSet = false;

        var urlHelper = url.Split(';')[1].TrimStart('\t');
        // var urlHelper = Globals.replaceGermanUmlauts(url.Split(';')[1].TrimStart('\t'));
        HtmlDocument htm = new HtmlDocument();

        try
        {
            Company comp0 = new Company();
            Company comp1 = new Company();
            Company comp2 = new Company();
            Company comp3 = new Company();
            Company comp4 = new Company();
            Company comp5 = new Company();
            Company comp6 = new Company();

            // then I do some logic, add those companies to a list, and go further.

How do I destroy them? I have tried making them IDisposable, but that didn't help.
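The attempt looked roughly like this (the Dispose body is illustrative):

    using System;

    // Rough sketch of the attempt: implementing IDisposable does not free
    // managed memory by itself. Dispose is for releasing unmanaged resources;
    // the garbage collector still decides when the object's memory goes away.
    class Company : IDisposable
    {
        public string Name { get; set; }

        public void Dispose()
        {
            // nothing unmanaged to release here
        }
    }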

Thanks.