I need to update the Solr index on a schedule with the command:
(env)$ ./manage.py update_index
I've looked through the Celery docs and found info on scheduling, but I haven't been able to find a way to run a Django management command on a schedule and inside a virtualenv. Would this be better run as a normal cron job? And if so, how would I run it inside the virtualenv? Does anyone have experience with this?
Thanks for the help!
To run your command periodically from a cron job, just wrap the command in a bash script that loads the virtualenv. For example, here is what we do to run manage.py commands:
django_cmd.sh:
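A minimal sketch of such a wrapper, assuming the virtualenv lives at /path/to/venv and the project at /path/to/project (both are placeholders; adjust to your layout):

    #!/bin/bash
    # Activate the project's virtualenv, then run whatever manage.py command
    # was passed as arguments (e.g. update_index).
    source /path/to/venv/bin/activate
    cd /path/to/project
    ./manage.py "$@"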
Crontab:
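For example, an entry along these lines would rebuild the index once a day at 3:00 AM (the schedule and log path are only illustrative):

    # m h dom mon dow command
    0 3 * * * /path/to/project/django_cmd.sh update_index >> /var/log/update_index.log 2>&1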
Django Celery task scheduling. Project structure:
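The layout sketched below is an assumption based on the names used in this answer (appname as the Django project, project1 as the app that holds the tasks):

    appname/
        manage.py
        appname/
            __init__.py
            settings.py
            celery.py
        project1/
            __init__.py
            tasks.py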
Add the following configuration to your settings.py file:
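A minimal sketch of the Celery-related settings, assuming Redis as the broker (the broker URL and timezone are placeholders):

    # settings.py
    CELERY_BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
    CELERY_TIMEZONE = 'UTC'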
celery.py (holds the Celery app and the task scheduler):
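A sketch following the standard Celery/Django integration; the project name appname and the task path project1.tasks.update_search_index are assumptions:

    # appname/celery.py
    import os
    from celery import Celery

    # Point Celery at the Django settings module.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'appname.settings')

    app = Celery('appname')
    # Read all CELERY_* settings from settings.py.
    app.config_from_object('django.conf:settings', namespace='CELERY')
    # Discover tasks.py modules in installed apps.
    app.autodiscover_tasks()

    # Beat schedule: run the task every 30 seconds.
    app.conf.beat_schedule = {
        'update-index-every-30-seconds': {
            'task': 'project1.tasks.update_search_index',
            'schedule': 30.0,
        },
    }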
__init__.py:
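The standard boilerplate, so the Celery app is loaded when Django starts:

    # appname/__init__.py
    from .celery import app as celery_app

    __all__ = ('celery_app',)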
tasks.py (in project1):
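A sketch of the task itself; update_search_index is a made-up name, and it simply invokes the same management command as ./manage.py update_index:

    # project1/tasks.py
    from celery import shared_task
    from django.core.management import call_command

    @shared_task
    def update_search_index():
        # Equivalent to running `./manage.py update_index`.
        call_command('update_index')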
The task will run every 30 seconds, as set in the beat schedule.
Requirements for Windows:
Both the Celery worker and Celery beat must be running; run each of the commands below in a separate terminal:
celery -A appname worker -l info
celery -A appname beat -l info
Requirements for Linux:
Both the Celery worker and Celery beat must be running; on Linux, beat and the worker can be started together on the same server with a single command:
celery -A appname worker -l info -B
@tzenderman, please let me know if I missed something. For me this is working fine.
I actually found a nice way of doing this using Fabric + Celery, and I'm working on it now:
In app/tasks.py, create a Fabric function with the manage.py commands you need, then decorate it with @periodic_task, add it to your Celery schedule, and it should be good to go.

UPDATE: I wasn't able to actually use Fabric + Celery, because using Fabric in the module caused it to be recognized as a fabric file, and the Celery calls in the file didn't work.