Python CGI queue

Posted 2019-02-28 09:18

I'm working on a fairly simple CGI with Python; I'm about to move it into Django, etc. The overall setup is a pretty standard server-side one (i.e. the computation is done on the server):

  1. User uploads data files and clicks "Run" button
  2. Server forks jobs in parallel behind the scenes, using lots of RAM and processor power. About 5-10 minutes later (in the average case), the program terminates, having written an output file and some .png figure files.
  3. Server displays web page with figures and some summary text

I don't think hundreds or thousands of people will be using this at once; however, the computation takes a fair amount of RAM and processor power (each instance farms out its most CPU-intensive task using Python's multiprocessing Pool), so I'd rather not have many instances running at the same time.
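For reference, here is a minimal sketch of that per-job fan-out with multiprocessing.Pool; analyze_chunk and the chunked input are hypothetical stand-ins, not something from this post:

# Illustrative only: analyze_chunk stands in for the real CPU-heavy computation.
from multiprocessing import Pool

def analyze_chunk(chunk):
    # placeholder for the expensive per-chunk work
    return sum(chunk)

def run_job(data_chunks, workers=4):
    # Each running job holds its own Pool, so several concurrent jobs
    # multiply the RAM and CPU load, hence the desire to queue them.
    with Pool(processes=workers) as pool:
        return pool.map(analyze_chunk, data_chunks)

if __name__ == "__main__":
    print(run_job([[1, 2, 3], [4, 5, 6]]))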

I'm wondering whether it would be worth the trouble to use a queueing system. I came across a Python module called beanstalkc, but its page describes it as an "in-memory" queueing system.

What does "in-memory" mean in this context? I worry about memory, not just CPU time, and so I want to ensure that only one job runs (or is held in RAM, whether it receives CPU time or not) at a time.

Also, I was trying to decide whether

  • the result page (served by the CGI) should tell you its position in the queue (until the job runs, at which point it displays the actual results page)

    OR

  • the user should submit their email address to the CGI, which will email them a link to the results page when the job is complete (a rough sketch of this option follows the list).
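A bare-bones sketch of that second option, using the standard library's smtplib and assuming an SMTP server on localhost; the addresses and URL below are invented:

# Hypothetical example: the addresses, subject, and results URL are invented.
import smtplib
from email.message import EmailMessage

def notify_user(recipient, results_url):
    msg = EmailMessage()
    msg["Subject"] = "Your analysis has finished"
    msg["From"] = "noreply@example.com"
    msg["To"] = recipient
    msg.set_content("Your results are ready: %s" % results_url)
    with smtplib.SMTP("localhost") as server:  # assumes a local SMTP server
        server.send_message(msg)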

What do you think is the appropriate design approach for a light-traffic CGI for a problem of this sort? Advice is much appreciated.

Tags: python cgi queue
1 Answer
虎瘦雄心在 · 2019-02-28 09:59

Definitely use Celery. You can run an AMQP broker, or I think you can use the database as the queue for the messages. It lets you run tasks in the background, and it can spread the processing across multiple worker machines if you want. It can also run database-backed periodic (cron-style) jobs if you use django-celery.
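As a rough sketch of that setup (the broker URL, result backend, and task body below are placeholders, not something from this answer), the app is configured once and the worker is started with a concurrency of 1 so only one heavy job sits in RAM at a time:

# tasks.py (illustrative sketch only; broker URL and task body are placeholders)
from celery import Celery

# Point the broker at RabbitMQ (AMQP); with django-celery the Django database
# can serve as the broker/result store instead.
app = Celery("tasks", broker="amqp://guest@localhost//", backend="rpc://")

@app.task
def run_analysis(upload_path):
    # ... the long-running, RAM-hungry computation would go here ...
    return "results for %s" % upload_path

Starting the worker with "celery -A tasks worker --concurrency=1" keeps the processing serialized, which addresses the one-job-at-a-time concern.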

It's as simple as this to run a task in the background:

from celery.task import task  # on recent Celery versions, use: from celery import shared_task

@task
def add(x, y):
    return x + y
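For completeness (not part of the original answer), dispatching and checking such a task uses the standard delay()/AsyncResult API; fetching the return value assumes a result backend is configured:

# Queue the task; the calling web request returns immediately.
result = add.delay(4, 4)

# Later (e.g. from a status page) poll the state or block for the value.
print(result.state)            # "PENDING", "STARTED", "SUCCESS", ...
print(result.get(timeout=10))  # 8, assuming a result backend is configured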

In a project of mine, it's distributing the work over 4 machines and it works great.
