Futures for blocking calls in Scala


Question:

The Akka documentation says:

you may be tempted to just wrap the blocking call inside a Future and work with that instead, but this strategy is too simple: you are quite likely to find bottlenecks or run out of memory or threads when the application runs under increased load.

They suggest the following strategies:

  • Do the blocking call within a Future, ensuring an upper bound on the number of such calls at any point in time (submitting an unbounded number of tasks of this nature will exhaust your memory or thread limits).

  • Do the blocking call within a Future, providing a thread pool with an upper limit on the number of threads which is appropriate for the hardware on which the application runs.

Do you know of any implementations of these strategies?

Answer 1:

Futures run within execution contexts. This is apparent from the Future API: any call that attaches callbacks to a future, or that builds a future from an arbitrary computation or from another future, requires an implicitly available ExecutionContext. So you can control the concurrency setup for your futures by tuning the ExecutionContext in which they run.

For instance, to implement the second strategy you can do something like this:

import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, Future}

object Main extends App {

  val ThreadCount = 10

  // Back the ExecutionContext with a fixed-size thread pool so that at most
  // ThreadCount futures execute at any given time.
  implicit val executionContext: ExecutionContext =
    ExecutionContext.fromExecutor(Executors.newFixedThreadPool(ThreadCount))

  val f = Future {
    println(s"Hello! I'm running in an execution context with $ThreadCount threads")
  }

}
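
The first strategy additionally asks for a bound on how many blocking calls can be pending at once. A minimal sketch of one way to get that (the ThreadCount and QueueSize values here are arbitrary and would need tuning for your hardware) is to back the ExecutionContext with a ThreadPoolExecutor whose work queue is bounded, so that excess submissions are rejected instead of piling up without limit:

import java.util.concurrent.{ArrayBlockingQueue, ThreadPoolExecutor, TimeUnit}
import scala.concurrent.{ExecutionContext, Future}

object BoundedBlocking extends App {

  val ThreadCount = 10
  val QueueSize   = 100

  // A ThreadPoolExecutor with a bounded work queue: once ThreadCount tasks are
  // running and QueueSize tasks are waiting, further submissions fail with a
  // RejectedExecutionException (the default AbortPolicy) rather than
  // accumulating in memory.
  val executor = new ThreadPoolExecutor(
    ThreadCount, ThreadCount,
    0L, TimeUnit.MILLISECONDS,
    new ArrayBlockingQueue[Runnable](QueueSize))

  implicit val boundedContext: ExecutionContext =
    ExecutionContext.fromExecutor(executor)

  val f = Future {
    // the blocking call would go here
    Thread.sleep(1000)
  }

}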


Answer 2:

Akka itself implements all of this: you can wrap your blocking calls in actors and then use dispatchers to control the execution thread pools.
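
A minimal sketch of that approach, assuming Akka classic actors and a dedicated dispatcher named blocking-io-dispatcher (the name and pool size are arbitrary, and the configuration would normally live in application.conf but is inlined here to keep the example self-contained):

import akka.actor.{Actor, ActorSystem, Props}
import com.typesafe.config.ConfigFactory

object BlockingWithActors extends App {

  // Dedicated dispatcher backed by a fixed-size thread pool, parsed inline
  // instead of being read from application.conf.
  val config = ConfigFactory.parseString(
    """
    blocking-io-dispatcher {
      type = Dispatcher
      executor = "thread-pool-executor"
      thread-pool-executor {
        fixed-pool-size = 10
      }
      throughput = 1
    }
    """).withFallback(ConfigFactory.load())

  // Actor that performs the blocking work; all of its messages are processed
  // on the dedicated dispatcher, so the system's default dispatcher is never
  // starved by blocking calls.
  class BlockingActor extends Actor {
    def receive = {
      case request: String =>
        Thread.sleep(1000) // stand-in for the real blocking call
        sender() ! s"handled $request"
    }
  }

  val system = ActorSystem("example", config)
  val blockingActor = system.actorOf(
    Props(new BlockingActor).withDispatcher("blocking-io-dispatcher"))

  blockingActor ! "some work"

}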