I want to serve my compressed CSS/JS from CloudFront (they live on S3), but I can't work out how to do it via the compressor settings. In settings.py I have the following:
COMPRESS_OFFLINE = True
COMPRESS_URL = 'http://static.example.com/'  # same as STATIC_URL, so unnecessary; just here for clarity
COMPRESS_STORAGE = 'my_example_dir.storage.CachedS3BotoStorage'  # subclass suggested in the [docs][1]
COMPRESS_OUTPUT_DIR = 'compressed_static'
COMPRESS_ROOT = '/home/dotcloud/current/static/'  # location of static files on the server
Despite the COMPRESS_URL, my files are being read from my S3 bucket:
<link rel="stylesheet" href="https://example.s3.amazonaws.com/compressed_static/css/e0684a1d5c25.css?Signature=blahblahblah;Expires=farfuture;AWSAccessKeyId=blahblahblah" type="text/css" />
I guess the issue is I want to write the file to S3, but read it from CloudFront. Is this possible?
I made a few different changes to settings.py:
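Roughly these (the storage module path is carried over from the question and may differ in your project; the CachedS3BotoStorage subclass is the one from the compressor docs linked below):

```python
# settings.py (sketch; module path is an assumption)
STATICFILES_STORAGE = 'my_example_dir.storage.CachedS3BotoStorage'
COMPRESS_STORAGE = STATICFILES_STORAGE  # compressor writes through the same backend

# Point both URLs at the CloudFront domain rather than the S3 bucket,
# so rendered <link>/<script> tags use the CDN host.
STATIC_URL = 'http://static.example.com/'
COMPRESS_URL = STATIC_URL
```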
Compressor Docs
The solution above saved the files locally as well as uploading them to S3, which let me compress the files offline. If you aren't gzipping, the above ought to work for serving compressed files from CloudFront.
Adding gzip adds a wrinkle:
settings.py
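The gzip side of it is a django-storages flag, roughly (the content-type tuple is an assumption; django-storages has a similar default):

```python
# settings.py (sketch)
AWS_IS_GZIPPED = True  # django-storages gzips matching files before uploading to S3

# Optionally restrict which content types get gzipped (assumed values)
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
)
```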
though this resulted in an error whenever a compressible file (css and js, according to storages) was pushed to S3 during collectstatic.
This was due to some bizarre error, having to do with the compression of the css/js files, that I don't understand. I need these files locally, unzipped, and not on S3, so I could avoid the problem altogether by tweaking the storage subclass referenced above (and provided in the compressor docs).
new storage.py
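A sketch of what that tweaked subclass might look like — the class name, the `admin/` exclusion, and the extension check are assumptions reconstructed from the description below:

```python
# storage.py (sketch)
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class CachedS3BotoStorage(S3BotoStorage):
    """Keep uncompressed css/js local only; push everything else to S3."""

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            'compressor.storage.CompressorFileStorage')()

    def save(self, name, content):
        # Uncompressed css/js only need to exist locally for offline
        # compression; keeping them off S3 sidesteps the gzip error.
        # Admin assets still go to S3 (served uncompressed from CloudFront).
        if name.endswith(('.css', '.js')) and not name.startswith('admin/'):
            self.local_storage._save(name, content)
        else:
            name = super(CachedS3BotoStorage, self).save(name, content)
        return name
```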
This then saved all .css and .js files locally (excluding the admin files, which I serve uncompressed from CloudFront) while pushing the rest of the files to S3 (and not bothering to save them locally, though one could easily add the self.local_storage._save line).
But when I run compress, I want my compressed .js and .css files pushed to S3, so I created another subclass for compressor to use:
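Something along these lines — a plain pass-through subclass (the class name is an assumption):

```python
# storage.py (sketch, continued)
from storages.backends.s3boto import S3BotoStorage


class CompressorS3BotoStorage(S3BotoStorage):
    """Used by compressor only: no local-only diversion here, so the
    compressed .css/.js output lands on S3 for CloudFront to serve."""
    pass
```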
Finally, given these new subclasses, I need to update a few settings:
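Wiring the two subclasses together might look like this (module paths are assumptions):

```python
# settings.py (sketch)
STATICFILES_STORAGE = 'my_example_dir.storage.CachedS3BotoStorage'   # collectstatic
COMPRESS_STORAGE = 'my_example_dir.storage.CompressorS3BotoStorage'  # compress output
COMPRESS_URL = STATIC_URL  # both read back through CloudFront
```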
And that is all I have to say about that.
I wrote a wrapper storage backend around the one provided by boto.
myapp/storage_backends.py:
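Presumably along the lines of the subclass in the compressor docs, which saves a local copy alongside the S3 upload so compressor can read files back without a round-trip:

```python
# myapp/storage_backends.py (sketch, based on the subclass in the compressor docs)
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class CachedS3BotoStorage(S3BotoStorage):
    """S3 storage backend that also saves the files locally."""

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            'compressor.storage.CompressorFileStorage')()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
```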
Where my settings.py file has...
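Something like the following (the domain and module path are assumptions):

```python
# settings.py (sketch)
COMPRESS_STORAGE = 'myapp.storage_backends.CachedS3BotoStorage'
STATICFILES_STORAGE = COMPRESS_STORAGE
COMPRESS_URL = 'http://static.example.com/'  # the CloudFront domain
```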
Additionally, for streaming distributions it's useful to override the `url` function to allow `rtmp://` URLs.

Actually, this also seems to be an issue in django-storages: when compressor compares the hashes of files on S3, django-storages doesn't unpack the content of the gzipped files, but tries to compare different hashes. I've opened https://bitbucket.org/david/django-storages/pull-request/33/fix-gzip-support to fix that.
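The scheme rewrite for such a `url` override can be kept as a small helper (the function name and extension list are assumptions); inside an S3BotoStorage subclass's `url()` you would apply it to the result of `super().url(name)`:

```python
def rtmp_url(url, name, extensions=('.mp4', '.flv')):
    """Swap the http(s) scheme for rtmp:// on streamable files,
    as CloudFront streaming distributions expect. Helper sketch."""
    if name.endswith(tuple(extensions)):
        return 'rtmp://' + url.split('://', 1)[1]
    return url
```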
FWIW, there is also https://bitbucket.org/david/django-storages/pull-request/32/s3boto-gzip-fix-and-associated-unit-tests, which fixes another issue of actually saving files to S3 when AWS_IS_GZIPPED is set to True. What a yak that has been.
It seems the problem was actually fixed upstream in Django: https://github.com/django/django/commit/5c954136eaef3d98d532368deec4c19cf892f664
The problematic _get_size method could probably be patched locally to work around it for older versions of Django.
EDIT: Have a look at https://github.com/jezdez/django_compressor/issues/100 for an actual work around.