I'm writing a load-testing application in Java, and have a thread pool that executes tasks against the server under test. So to make 1000 jobs and run them in 5 threads I do something like this:
ExecutorService pool = Executors.newFixedThreadPool(5);
List<Runnable> jobs = makeJobs(1000);
for(Runnable job : jobs){
    pool.execute(job);
}
However I don't think this approach will scale very well, because I have to make all the 'job' objects ahead of time and have them sitting in memory until they are needed.
I'm looking for a way to have the threads in the pool go to some kind of 'JobFactory' class each time they need a new job, and for the factory to build Runnables on request until the required number of jobs have been run. The factory could maybe start returning 'null' to signal to the threads that there is no more work to do.
I could code something like this up by hand, but it seems like a common enough use-case, so I was wondering whether there is anything in the wonderful but complex 'java.util.concurrent' package that I could use instead?
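For illustration, the hand-rolled version I have in mind would be roughly this (the 'JobFactory' name and the nextJob method are just placeholders I made up):

interface JobFactory {
    // Builds the next job on demand, or returns null once all jobs have been handed out.
    Runnable nextJob();
}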
Hrm. You could create a BlockingQueue<Runnable> with a fixed capacity and have each of your worker threads dequeue a Runnable and run it. Then you could have a producer thread that puts the jobs into the queue.

The main thread would do something like:
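(A rough sketch; the capacity of 100, the five workers, and the Worker/JobSubmitter classes below are illustrative names and numbers, not anything prescribed by java.util.concurrent.)

BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(100);  // capacity bounds how many jobs sit in memory

// start the worker threads
Thread[] workers = new Thread[5];
for (int i = 0; i < workers.length; i++) {
    workers[i] = new Thread(new Worker(queue));
    workers[i].start();
}

// start the producer that feeds jobs into the queue
new Thread(new JobSubmitter(queue, 1000, workers.length)).start();

// wait for the workers to drain the queue and exit
// (join() throws InterruptedException, so the enclosing method has to handle it)
for (Thread worker : workers) {
    worker.join();
}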
The worker would look something like:
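(Again just a sketch; the POISON_PILL sentinel is one way to tell the workers there is no more work, along the lines of the 'null' signal you mentioned.)

class Worker implements Runnable {
    // sentinel job the producer enqueues to say "no more work"
    static final Runnable POISON_PILL = () -> {};

    private final BlockingQueue<Runnable> queue;

    Worker(BlockingQueue<Runnable> queue) {
        this.queue = queue;
    }

    @Override
    public void run() {
        try {
            while (true) {
                Runnable job = queue.take();   // blocks until a job is available
                if (job == POISON_PILL) {
                    break;                     // producer says we're done
                }
                job.run();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}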
And the job submitter would look something like:
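(Sketch again; makeJob(int) stands in for however you build a single job on demand.)

class JobSubmitter implements Runnable {
    private final BlockingQueue<Runnable> queue;
    private final int jobCount;
    private final int workerCount;

    JobSubmitter(BlockingQueue<Runnable> queue, int jobCount, int workerCount) {
        this.queue = queue;
        this.jobCount = jobCount;
        this.workerCount = workerCount;
    }

    @Override
    public void run() {
        try {
            for (int i = 0; i < jobCount; i++) {
                queue.put(makeJob(i));          // blocks while the queue is full
            }
            // one poison pill per worker so every worker eventually exits
            for (int i = 0; i < workerCount; i++) {
                queue.put(Worker.POISON_PILL);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private Runnable makeJob(int jobNumber) {
        return () -> { /* hit the server under test */ };
    }
}

Because put() blocks whenever the queue is full, jobs are built one at a time as space frees up, so you never have all 1000 of them sitting in memory at once.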
You can do all the work in the executing threads of the thread pool, using an AtomicInteger to track the number of runnables executed.
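(A minimal sketch of that idea; makeJob(int) is a stand-in for however you build one job on demand.)

ExecutorService pool = Executors.newFixedThreadPool(5);
AtomicInteger counter = new AtomicInteger();
int totalJobs = 1000;

// one long-running task per thread; each keeps claiming job numbers until they run out
for (int t = 0; t < 5; t++) {
    pool.execute(() -> {
        int jobNumber;
        while ((jobNumber = counter.getAndIncrement()) < totalJobs) {
            makeJob(jobNumber).run();   // build the job on demand and run it right here
        }
    });
}
pool.shutdown();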
You can also store the returned Futures in a List and call get() on each one to await completion (among other mechanisms).
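(For example, with the pool above; workerTask is whichever Runnable you hand to each thread, and get() throws InterruptedException/ExecutionException that the caller has to deal with.)

List<Future<?>> futures = new ArrayList<>();
for (int t = 0; t < 5; t++) {
    futures.add(pool.submit(workerTask));   // submit() returns a Future, execute() does not
}
for (Future<?> f : futures) {
    f.get();   // blocks until that task has completed
}
pool.shutdown();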