I'm trying to set up django-compressor and django-staticfiles so that the compressed CSS/JavaScript and images are served from Amazon S3.
I've managed to set up staticfiles with S3 as the backend, so the collectstatic command sends the files to S3 instead of STATIC_ROOT.
However, it all seems to fall apart for me when I add django-compressor to the mix. Following the documentation on setting up remote storages, I've created a subclass of the boto storage backend by copying the example into storage.py. Once I start using this cached backend, the files are copied into static_media and not to S3. After the first page load, the CACHE folder appears both on S3 and in the static_media folder.
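For reference, storage.py is essentially the documented example, i.e. something like:

from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    """
    S3 storage backend that saves the files locally, too.
    """
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name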
Setting STATICFILES_STORAGE and COMPRESS_STORAGE back to boto's plain S3 class (storages.backends.s3boto.S3BotoStorage) results in the static assets being collected into the S3 bucket and no static_media folder. However, reloading the page then throws the error:
Caught NotImplementedError while rendering: This backend doesn't support absolute paths.
The traceback highlights {% compress css %} as the tag and compressor/base.py as the origin.
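The template uses the compress tag in the usual way; roughly the following (the stylesheet path here is just an example):

{% load compress %}
{% compress css %}
<link rel="stylesheet" href="{{ STATIC_URL }}css/screen.css" type="text/css">
{% endcompress %}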
The s3/staticfiles/compressor section of my settings.py:
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'key'
AWS_SECRET_ACCESS_KEY = 'secret'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
S3_URL = 'http://my-bucket.s3.amazonaws.com/'
MEDIA_ROOT = 'client_media'
MEDIA_URL = '/media/'
STATIC_ROOT = 'static_media'
STATIC_URL = S3_URL
ADMIN_MEDIA_PREFIX = S3_URL + 'admin/'
STATICFILES_DIRS = (
    join(DIRNAME, 'static'),
)
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'compressor.finders.CompressorFinder',
)
COMPRESS_ENABLED = True
COMPRESS_URL = S3_URL
COMPRESS_ROOT = STATIC_ROOT
COMPRESS_STORAGE = 'storage.CachedS3BotoStorage'
STATICFILES_STORAGE = COMPRESS_STORAGE
So where am I going wrong? Have I perhaps misconfigured something in the CachedS3BotoStorage custom storage?
Your settings look correct. However, you should keep both STATICFILES_STORAGE and COMPRESS_STORAGE set to storage.CachedS3BotoStorage, and not switch back to storages.backends.s3boto.S3BotoStorage.
According to this django-compressor issue, the problem is with the way django-staticfiles saves files during the collectstatic process (it uses shutil.copy2). This has been fixed in a newer version of django-staticfiles, which can be used instead of the one that ships with Django 1.3.
pip install django-staticfiles==dev
And in your settings.py
, switch to the updated version:
STATICFILES_FINDERS = (
    #"django.contrib.staticfiles.finders.FileSystemFinder",
    #"django.contrib.staticfiles.finders.AppDirectoriesFinder",
    "staticfiles.finders.FileSystemFinder",
    "staticfiles.finders.AppDirectoriesFinder",
    "compressor.finders.CompressorFinder",
)
INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    #'django.contrib.staticfiles',
    'staticfiles',
    #...
)
After running python manage.py collectstatic again, both the CACHE directory from django-compressor and the collected static files should show up on S3.
Using django_compressor==1.2 worked for me. I am not sure why you would need to install django-staticfiles; as far as I can tell, all versions of django_compressor except 1.2 have that issue.
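If you go that route, pinning the version is all that's needed:
pip install django_compressor==1.2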
After several days of hard work and research I was finally able to get this working, and I decided to write a detailed guide about it, including how to serve the files gzipped.
Basically you need to do a few things:
- Use AWS_IS_GZIPPED = True.
- If your S3 bucket is outside the US, you need to create a custom S3Connection class where you override the DefaultHost variable with your S3 endpoint, for example s3-eu-west-1.amazonaws.com (a sketch of such a class is at the end of this answer).
- If you're using a dotted bucket name, for example subdomain.domain.tld, you need to set AWS_S3_CALLING_FORMAT = 'boto.s3.connection.OrdinaryCallingFormat'.
- You have to set non_gzipped_file_content = content.file in the save method of your CachedS3BotoStorage.
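In settings.py the gzip-related bits from the list above would look roughly like this (STATICFILES_LOCATION is whatever prefix you want inside the bucket; it is referenced by the storage class below):

AWS_IS_GZIPPED = True
# only needed for dotted bucket names such as subdomain.domain.tld
AWS_S3_CALLING_FORMAT = 'boto.s3.connection.OrdinaryCallingFormat'
# prefix inside the bucket, used by the storage class below
STATICFILES_LOCATION = 'static'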
This is the CachedS3BotoStorage
class you need:
from django.conf import settings
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

# EUConnection is the custom S3Connection subclass described below;
# adjust the import path to wherever you defined it.
from mysite.s3utils import EUConnection


class CachedS3BotoStorage(S3BotoStorage):
    """
    S3 storage backend that saves the files locally, too.
    """
    connection_class = EUConnection
    location = settings.STATICFILES_LOCATION

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        # keep a reference to the un-gzipped content, since the S3 backend
        # replaces content.file with the gzipped version when AWS_IS_GZIPPED is on
        non_gzipped_file_content = content.file
        name = super(CachedS3BotoStorage, self).save(name, content)
        content.file = non_gzipped_file_content
        # also save the un-gzipped file locally so django-compressor can read it
        self.local_storage._save(name, content)
        return name
Note that EUConnection is a custom class where I set DefaultHost to my S3 location. Check the much longer and detailed guide for the complete custom storages and settings.py.
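A minimal sketch of such a connection class, assuming the eu-west-1 endpoint, would be:

from boto.s3.connection import S3Connection

class EUConnection(S3Connection):
    # override boto's default US endpoint with the bucket's regional endpoint
    DefaultHost = 's3-eu-west-1.amazonaws.com'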
Try this post, which completes the above solution with a few extra lines to fix the problem of multiple manifest_%.json files being created in Amazon S3:
https://stackoverflow.com/a/31545361/1359475