Pointing to multiple S3 buckets in s3boto

Posted 2019-03-11 03:56

In settings.py I have:

STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'xxxxxxxxxxxxx'
AWS_SECRET_ACCESS_KEY = 'xxxxxxxxxxxxx'
AWS_STORAGE_BUCKET_NAME = 'static.mysite.com'

This points to my S3 bucket static.mysite.com and works fine: when I run manage.py collectstatic, all the static files are uploaded to that bucket. However, I have another bucket which I use for different purposes and would like to use in certain areas of the website, for example with a model like this:

class Image(models.Model):
    myobject = models.ImageField(upload_to='my/folder')

Now when Image.save() is invoked, it still uploads the file to the S3 bucket named in AWS_STORAGE_BUCKET_NAME, but I want this Image.save() to point to another S3 bucket. Is there a clean way of doing this? I don't want to change settings.py at runtime, nor implement any practices that violate the key principles of Django, i.e. having a pluggable, easy-to-change storage backend.

1 Answer
祖国的老花朵
Answered 2019-03-11 04:41

The cleanest way would be to create a subclass of S3BotoStorage and override the default bucket name in its __init__ method.

from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class MyS3Storage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        # Read the bucket name for this storage from its own settings key,
        # leaving AWS_STORAGE_BUCKET_NAME untouched for the static files bucket.
        kwargs['bucket'] = getattr(settings, 'MY_AWS_STORAGE_BUCKET_NAME')
        super(MyS3Storage, self).__init__(*args, **kwargs)

Then specify this class as your DEFAULT_FILE_STORAGE and leave STATICFILES_STORAGE as it is, or vice versa.
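For example, a minimal sketch of how this could be wired up (the MY_AWS_STORAGE_BUCKET_NAME setting, the bucket name, and the myapp.storage module path are assumptions for illustration, not part of the question):

# settings.py
MY_AWS_STORAGE_BUCKET_NAME = 'media.mysite.com'      # assumed second bucket
DEFAULT_FILE_STORAGE = 'myapp.storage.MyS3Storage'   # assumed path to the subclass above

Alternatively, if only a few fields should go to the second bucket, you can leave DEFAULT_FILE_STORAGE alone and pass the storage per field, since FileField/ImageField accept a storage argument:

# models.py
from django.db import models
from myapp.storage import MyS3Storage  # assumed module path

class Image(models.Model):
    # Files for this field go to MY_AWS_STORAGE_BUCKET_NAME;
    # everything else keeps using the default storage.
    myobject = models.ImageField(upload_to='my/folder', storage=MyS3Storage())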
