The accepted answer to the question "Why does this Parallel.ForEach code freeze the program up?" advises replacing the List with a ConcurrentBag in a WPF application.

Could a BlockingCollection be used in this case instead?
You can indeed use a BlockingCollection, but there is absolutely no point in doing so.

First off, note that BlockingCollection is a wrapper around a collection that implements IProducerConsumerCollection<T>. Any type that implements that interface can be used as the underlying storage:
When you create a BlockingCollection<T> object, you can specify not only the bounded capacity but also the type of collection to use. For example, you could specify a ConcurrentQueue<T> object for first in, first out (FIFO) behavior, or a ConcurrentStack<T> object for last in, first out (LIFO) behavior. You can use any collection class that implements the IProducerConsumerCollection<T> interface. The default collection type for BlockingCollection<T> is ConcurrentQueue<T>.
This includes ConcurrentBag<T>, which means you can have a blocking concurrent bag. So what's the difference between a plain IProducerConsumerCollection<T> and a blocking collection? The documentation of BlockingCollection says (emphasis mine):
BlockingCollection<T> is used as a wrapper for an IProducerConsumerCollection<T> instance, allowing removal attempts from the collection to block until data is available to be removed. Similarly, a BlockingCollection<T> can be created to enforce an upper-bound on the number of data elements allowed in the IProducerConsumerCollection<T> [...]
Since in the linked question there is no need to do either of these things, using BlockingCollection simply adds a layer of functionality that goes unused.
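For contrast, here is a hedged sketch of the kind of producer/consumer setup where those two features (blocking removal and a bounded capacity) actually pay off; the item strings and counts are made up for illustration:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Bounded to 100 items: Add blocks once the collection is full.
var queue = new BlockingCollection<string>(boundedCapacity: 100);

var consumer = Task.Run(() =>
{
    // GetConsumingEnumerable blocks until items arrive and
    // finishes once CompleteAdding has been called.
    foreach (string item in queue.GetConsumingEnumerable())
    {
        Console.WriteLine(item);
    }
});

for (int i = 0; i < 1000; i++)
{
    queue.Add($"item {i}"); // blocks if the consumer falls behind
}
queue.CompleteAdding();
consumer.Wait();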
Yes, you could use BlockingCollection for that. finishedProxies would be defined as:
BlockingCollection<string> finishedProxies = new BlockingCollection<string>();
and to add an item, you would write:
finishedProxies.Add(checkResult);
And when the parallel loop is done, you could create a list from the contents.
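As a rough end-to-end sketch, assuming the loop in the linked question looks something like this (CheckProxy and the proxy list are hypothetical stand-ins for the original code):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

var finishedProxies = new BlockingCollection<string>();

// Placeholder input; the real data comes from the linked question.
var proxies = new List<string> { "1.2.3.4:8080", "5.6.7.8:3128" };

Parallel.ForEach(proxies, proxy =>
{
    string checkResult = CheckProxy(proxy); // hypothetical helper
    finishedProxies.Add(checkResult);       // thread-safe add
});

// BlockingCollection<T> implements IEnumerable<T>, so LINQ works here.
List<string> results = finishedProxies.ToList();

static string CheckProxy(string proxy) => $"{proxy}: OK";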