I have to run tasks on approximately 150k Django objects. What is the best way to do this? I am using the Django ORM as the broker. The database backend is MySQL, and it chokes and dies while `task.delay()` is being called for all of the tasks. Relatedly, I also wanted to kick this off from the submission of a form, but the resulting request took so long to respond that it timed out.
Try using RabbitMQ instead.
RabbitMQ is used by a lot of bigger companies, and people really rely on it, since it's such a great broker.
Here is a great tutorial on getting started with it.
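Switching is mostly a settings change. A rough sketch, assuming a RabbitMQ server on localhost; the user, password, and vhost are placeholders you would create yourself with `rabbitmqctl`:

```python
# settings.py -- point Celery at RabbitMQ instead of the Django ORM.
# 'myuser', 'mypassword', and 'myvhost' are placeholders for your setup.
# (Newer Celery + Django setups may namespace this as CELERY_BROKER_URL.)
BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'
```

With the ORM transport out of the picture, enqueuing 150k messages no longer hammers MySQL.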
I use beanstalkd (http://kr.github.com/beanstalkd/) as the engine. Adding a worker and a task is pretty straightforward for Django if you use django-beanstalkd: https://github.com/jonasvp/django-beanstalkd/
It's very reliable for my usage.
Example of a worker:
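Something along these lines; a minimal sketch modeled on the django-beanstalkd example app (the app name and counting logic are illustrative). Jobs live in a `beanstalk_jobs.py` module of an installed app and are registered with the `beanstalk_job` decorator; the single argument arrives as a string:

```python
# myapp/beanstalk_jobs.py -- django-beanstalkd discovers jobs here.
import time

from django_beanstalkd import beanstalk_job


@beanstalk_job
def background_counting(arg):
    """Count up to int(arg), sleeping one second per step."""
    for i in range(int(arg)):
        print(i)
        time.sleep(1)
```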
To launch a job/worker/task:
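A sketch assuming the job above lives in an app called `myapp`; the job name is `"<app_name>.<function_name>"` and the argument is passed as a string:

```python
from django_beanstalkd import BeanstalkClient

client = BeanstalkClient()
client.call('myapp.background_counting', '5')
```

The workers themselves run as separate processes, started via the library's `beanstalk_worker` management command.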
(adapted from the example app of django-beanstalkd)
Enjoy!
I would also consider using something other than the database as the "broker"; it really isn't suitable for this kind of work.
Still, you can move some of this overhead out of the request/response cycle by launching a task whose only job is to create the other tasks:
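A minimal sketch, assuming a hypothetical model `MyModel` with a `do_work()` method (the names are illustrative, not from the question):

```python
from celery import shared_task  # celery.decorators.task in very old Celery

from myapp.models import MyModel  # hypothetical model


@shared_task
def process_object(obj_id):
    # The actual per-object work goes here.
    MyModel.objects.get(pk=obj_id).do_work()


@shared_task
def dispatch_all():
    # Runs on a worker, so the 150k .delay() calls happen in the
    # background instead of inside the web request.
    for obj_id in MyModel.objects.values_list('pk', flat=True):
        process_object.delay(obj_id)
```

The form view then enqueues a single task, `dispatch_all.delay()`, and returns immediately.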
Also, since you probably don't have 150,000 processors to process all of these objects in parallel, you could split the objects into chunks of, say, 100 or 1000:
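Continuing the sketch above (same hypothetical `MyModel`), each task handles a batch of primary keys, so 150k objects produce roughly 150 broker messages instead of 150,000:

```python
from celery import shared_task

from myapp.models import MyModel  # hypothetical model


@shared_task
def process_chunk(obj_ids):
    # One task per batch keeps broker traffic and task overhead low.
    for obj in MyModel.objects.filter(pk__in=obj_ids):
        obj.do_work()


@shared_task
def dispatch_in_chunks(chunk_size=1000):
    ids = list(MyModel.objects.values_list('pk', flat=True))
    for i in range(0, len(ids), chunk_size):
        process_chunk.delay(ids[i:i + chunk_size])
```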