django-pipeline with S3 storage not compressing

Posted 2019-06-07 16:29

I'm trying to use S3 as my production storage for static files. When I collect the static files, I can see them being uploaded to the S3 bucket, but the compressed bundles (the ones named in output_filename) are never created or uploaded.

Here are the relevant bits of my settings:

PIPELINE_YUI_BINARY = '/usr/bin/yui-compressor'
PIPELINE_JS_COMPRESSOR = 'pipeline.compressors.yui.YUICompressor'
PIPELINE_CSS_COMPRESSOR = 'pipeline.compressors.yui.YUICompressor'

PIPELINE_JS = {
    'main': {
        'source_filenames': (
            ...
        ),
        'output_filename': 'build/app.min.js',
    }
}

PIPELINE_CSS = {
    'main': {
        'source_filenames': (
            ...
        ),
        'output_filename': 'build/app.min.css',
        'variant': 'datauri',
    }
}

# Amazon S3 static storage
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_QUERYSTRING_AUTH = False
AWS_IS_GZIPPED = True
AWS_HEADERS = {
    'Cache-Control': 'max-age=86400',
}

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
# these next two aren't used, but staticfiles will complain without them
STATIC_URL = "https://%s.s3.amazonaws.com/" % os.environ['AWS_STORAGE_BUCKET_NAME']
STATIC_ROOT = ''

STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'pipeline.finders.PipelineFinder',
)

No build directory is being created, and I don't think the compressors even run. The console shows no complaints whatsoever. What am I doing wrong?
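One thing I'm unsure about: from what I understand of django-pipeline, during collectstatic the packager/compressors are only triggered through the storage's post_process() hook, so STATICFILES_STORAGE may need to mix pipeline.storage.PipelineMixin into the S3 backend rather than use S3BotoStorage directly. A minimal sketch of what I mean (the module name custom_storages is just a guess on my part):

# custom_storages.py -- module name is an assumption
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage

class S3PipelineStorage(PipelineMixin, S3BotoStorage):
    # PipelineMixin's post_process() runs the pipeline packager
    # (compression included) before handing the files to the S3
    # backend for upload; plain S3BotoStorage has no post_process().
    pass

# settings.py
STATICFILES_STORAGE = 'custom_storages.S3PipelineStorage'

Is something along those lines required, or should the plain S3BotoStorage setup above work on its own?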
