I currently have a Rails 3.0 project, with Ruby 1.9.2 and Resque.
My application has multiple worker classes and multiple queues that are created dynamically at runtime. Several workers are started and left free to work on any queue, because no queues exist at start time and they cannot be predicted:
$ COUNT=3 QUEUE=* rake resque:workers
Queues are created based on the project's id:
@queue = "project_#{project.id}".to_sym
For a given queue, jobs have to be processed in order, one at a time. My problem is that, with multiple workers, jobs from the same queue get processed in parallel.
Is there a way to set the maximum number of workers per queue (to 1)? Is there a way to lock a queue while one of its jobs is being processed?
Thanks!
I finally came to a fairly simple solution using resque-retry and locks stored in Redis (I am doing this per user; just do the same per project): https://stackoverflow.com/a/10933666/745266
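Roughly, the idea looks like the sketch below (class name, lock key format and retry settings are illustrative, not the exact code from the linked answer): the job takes a per-project lock in Redis with SETNX, and if the lock is already held it raises so that resque-retry re-enqueues it later.

require 'resque'
require 'resque-retry'

# Raised when another job already holds this project's lock.
class ProjectLocked < StandardError; end

class ProjectJob
  extend Resque::Plugins::Retry

  # Retry only when the project is locked. Non-zero delays also
  # require resque-scheduler to be set up.
  @retry_exceptions = [ProjectLocked]
  @retry_delay      = 5
  @retry_limit      = 10

  def self.lock_key(project_id)
    "lock:project_#{project_id}"
  end

  def self.perform(project_id)
    # SETNX returns false when the key already exists, i.e. the lock is taken.
    raise ProjectLocked unless Resque.redis.setnx(lock_key(project_id), Time.now.to_i)
    begin
      # ... do the actual work for this project ...
    ensure
      Resque.redis.del(lock_key(project_id))
    end
  end
end

Worth noting: a retried job is pushed back onto the tail of its queue, so this keeps jobs one-at-a-time per project but does not by itself guarantee strict ordering.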
The first solution I thought of was to check whether any worker is already working on a given queue before another worker reserves a job from that same queue. This could be done by reimplementing Resque::Job.reserve(queue):
module Resque
  class Job
    def self.reserve(queue)
      # Skip the queue entirely if some worker is already processing a job from it.
      Resque::Worker.working.each do |worker|
        if worker.job['queue'] == queue
          return
        end
      end

      # Otherwise behave like the original implementation.
      return unless payload = Resque.pop(queue)
      new(queue, payload)
    end
  end
end
A possible issue would be a race condition: two workers could both see the queue as free and pop jobs from it at roughly the same time. Thoughts?
Resque-pool can help you specify the number of workers per queue.
https://github.com/nevans/resque-pool
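In case it helps: the pool is configured in config/resque-pool.yml, mapping a queue (or comma-separated list of queues) to a worker count, so capping a queue at one worker looks roughly like this (queue names are just examples):

# config/resque-pool.yml
project_1: 1
project_2: 1
"low,archive": 2

As far as I know the config file is read when the pool starts (and re-read on HUP), so with dynamically created queues you would need to regenerate the file and reload the pool when new queues appear.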