django-pipeline and s3boto storage don't seem to work together


Question:

I'm trying to use django-pipeline-1.1.27 with s3boto to compress and filter static files, and then upload them to an s3 bucket. If I just use:

PIPELINE_STORAGE = 'pipeline.storage.PipelineFinderStorage'

Then it works and I get a static folder with the nice versioned file that I configured. As soon as I switch to

PIPELINE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

I get

Traceback (most recent call last):
  File "manage.py", line 15, in <module>
    execute_manager(settings)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/management/commands/synccompress.py", line 39, in handle
    packager.pack_stylesheets(package, sync=sync, force=force)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/packager.py", line 52, in pack_stylesheets
    **kwargs)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/packager.py", line 60, in pack
    package['output'], package['paths'])
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/__init__.py", line 45, in need_update
    version = self.version(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/__init__.py", line 20, in version
    return getattr(self.versioner, 'version')(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 37, in version
    buf = self.concatenate(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 27, in concatenate
    return '\n'.join([self.read_file(path) for path in paths])
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 31, in read_file
    file = storage.open(path, 'rb')
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/files/storage.py", line 33, in open
    file = self._open(name, mode)
  File "/my/virtual/env/lib/python2.7/site-packages/storages/backends/s3boto.py", line 177, in _open
    raise IOError('File does not exist: %s' % name)
IOError: File does not exist: css/style.css

which is one of my source files. So why does pipeline no longer want to do the filter/concatenate/compress steps when I switch to s3boto storage?

It may be that I'm doing something wrong. Here is my other config in case it helps:

INSTALLED_APPS = (
    ...
    'pipeline',
    'storages',
)

STATICFILES_FINDERS = (
    'pipeline.finders.PipelineFinder',
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
)

STATIC_ROOT = "/some/path/outside/django_project/deploy_static"
STATICFILES_DIRS = () # All statics in this site are in apps

STATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'
PIPELINE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

PIPELINE = True
PIPELINE_AUTO = True
PIPELINE_VERSION = True
PIPELINE_VERSION_PLACEHOLDER = 'VERSION'
PIPELINE_VERSIONING = 'pipeline.versioning.hash.SHA1Versioning'

PIPELINE_CSS = {
    'standard': {
        'source_filenames': (
          'css/style.css',
          ...
        ),
        'output_filename': 'css/all-VERSION.css',
        'extra_context': {
            'media': 'screen,projection',
        },
    }
}

My site is on Django 1.3.1.

The command I'm running is:

python manage.py synccompress --force

The AWS creds are also in settings, but that's moot because it's not even getting to that point.
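
For reference, the credential settings the s3boto backend reads look roughly like this; the values below are placeholders, not my real keys:

AWS_ACCESS_KEY_ID = 'AKIAXXXXXXXXXXXXXXXX'        # placeholder
AWS_SECRET_ACCESS_KEY = 'not-a-real-secret-key'   # placeholder
AWS_STORAGE_BUCKET_NAME = 'my-static-bucket'      # placeholder bucket name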

UPDATE: Added the full stack trace and settings requested in comments.

UPDATE: At the request of the library author, I tried upgrading to the latest beta. Observations from that so far:

  1. I don't know how to get versioned compressed files now
  2. collectstatic leaves me with the compressed files and the originals
  3. Still getting the same error from django-pipeline when boto storage is configured: it wants to send my source files to s3, but I can't even see where it's staging my assets. Nothing gets placed in STATIC_ROOT.

UPDATE: I've created the simplest project that works with finder storage and then breaks with S3Boto. I've pushed it to GitHub, and included a capture of the stack trace.

https://github.com/estebistec/simple_pipeline https://raw.github.com/estebistec/simple_pipeline/master/STACKTRACE

I would be ecstatic if I could be told I'm doing something really dumb and this should all just work.

Answer 1:

django-pipeline 1.1.x is a bit dumb about how you should use staticfiles; it prefers to have everything in one place. I suggest you try django-pipeline 1.2 with the latest django-staticfiles or Django 1.4.

Use a custom storage class like this:

STATICFILES_STORAGE = 'your.app.S3PipelineStorage'

The code looks like this:

from staticfiles.storage import CachedFilesMixin
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage


class S3PipelineStorage(PipelineMixin, CachedFilesMixin, S3BotoStorage):
    pass

That should fix your application, but there is still a bug with compiled files unless you use version 1.2c1: https://gist.github.com/1999564



Answer 2:

I just experienced this same error on a Django 1.6 project with django-pipeline==1.3.23, and the solution was simply removing the PIPELINE_STORAGE setting.
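
In other words, only the staticfiles storage setting remains; a sketch of the resulting configuration, reusing the custom class name from Answer 1 as an example:

# No PIPELINE_STORAGE setting at all -- the combined storage class is
# referenced only through STATICFILES_STORAGE.
STATICFILES_STORAGE = 'your.app.S3PipelineStorage'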



Answer 3:

There is another problem with a similar error message that affects both earlier versions and the current version (1.5.4) of django-pipeline.

The error message is IOError: File does not exist, and it happens in s3boto.py's open() and packager.pack_stylesheets(). You might hit the problem if you use any of the compilers (Compass, Sass, Less, etc.). I suspect it would also affect the JS compilers, but I have not confirmed that.

In a nutshell, the compiler writes its output file to the local static storage, but the next step, compression, tries to find that output in the S3 storage.

If it affects you, you might want to take a look at https://github.com/cyberdelia/django-pipeline/issues/473. There are two pull requests (patches), one made by skirsdeda and another made by thomasyip (me). Both might solve your problem. If you would like the compiled (but pre-compressed) files to be copied to S3 and made available to the app, then use my (thomasyip's) patch.
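
To make the idea concrete, the general shape of the workaround is to push the compiler's local output into the remote staticfiles storage before compression runs. A rough sketch under that assumption (illustrative only, not the actual patch code; the helper name is made up):

import os

from django.contrib.staticfiles.storage import staticfiles_storage
from django.core.files.base import File


def copy_compiled_to_remote(local_root, relative_path):
    # Read the file the compiler just wrote locally and save it through the
    # (remote) staticfiles storage, so the compressor can open it there.
    with open(os.path.join(local_root, relative_path), 'rb') as f:
        if staticfiles_storage.exists(relative_path):
            staticfiles_storage.delete(relative_path)
        staticfiles_storage.save(relative_path, File(f))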

Here is the full traceback for the problem:

Traceback (most recent call last):
  File "apps/manage.py", line 16, in <module>
    execute_from_command_line(sys.argv)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/management/__init__.py", line 385, in execute_from_command_line
    utility.execute()
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/management/__init__.py", line 377, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/management/base.py", line 288, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/management/base.py", line 338, in execute
    output = self.handle(*args, **options)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/management/base.py", line 533, in handle
    return self.handle_noargs(**options)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 171, in handle_noargs
    collected = self.collect()
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 117, in collect
    for original_path, processed_path, processed in processor:
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/storage.py", line 26, in post_process
    packager.pack_stylesheets(package)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/packager.py", line 96, in pack_stylesheets
    variant=package.variant, **kwargs)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/packager.py", line 106, in pack
    content = compress(paths, **kwargs)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/compressors/__init__.py", line 73, in compress_css
    css = self.concatenate_and_rewrite(paths, output_filename, variant)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/compressors/__init__.py", line 137, in concatenate_and_rewrite
    content = self.read_text(path)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/compressors/__init__.py", line 220, in read_text
    content = self.read_bytes(path)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/pipeline/compressors/__init__.py", line 214, in read_bytes
    file = staticfiles_storage.open(path)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/django/core/files/storage.py", line 35, in open
    return self._open(name, mode)
  File "/Users/thomas/Dev/Project/main-server/venv/lib/python2.7/site-packages/storages/backends/s3boto.py", line 366, in _open
    raise IOError('File does not exist: %s' % name)
IOError: File does not exist: sheets/sass/sheets.css


Answer 4:

Complementing the other answers, you can also enable GZIP when compressing:

from django.contrib.staticfiles.storage import CachedFilesMixin
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage


class S3PipelineStorage(PipelineMixin, CachedFilesMixin, S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(S3PipelineStorage, self).__init__(*args, **kwargs)
        # Set after the parent __init__ so it isn't overwritten by the
        # AWS_IS_GZIPPED default that S3BotoStorage applies.
        self.gzip = True

Then use the following settings:

COMPRESS_STORAGE = STATICFILES_STORAGE = 'my.apps.main.S3PipelineStorage'
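
If you prefer to keep this in settings rather than overriding __init__, django-storages also reads a global flag for the same behavior (assuming a django-storages version that supports it):

# Equivalent to setting self.gzip = True on the storage instance.
AWS_IS_GZIPPED = True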


Answer 5:

I'm not sure how this worked for everyone else. I followed the solution above and kept getting the following error:

Traceback (most recent call last):
  File "manage.py", line 24, in <module>
    execute_from_command_line(sys.argv)
  File "python3.4/site-packages/django/core/management/__init__.py", line 338, in execute_from_command_line
    utility.execute()
  File "python3.4/site-packages/django/core/management/__init__.py", line 330, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "python3.4/site-packages/django/core/management/base.py", line 390, in run_from_argv
    self.execute(*args, **cmd_options)
  File "python3.4/site-packages/django/core/management/base.py", line 441, in execute
    output = self.handle(*args, **options)
  File "python3.4/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 168, in handle
    collected = self.collect()
  File "python3.4/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 114, in collect
    for original_path, processed_path, processed in processor:
  File "python3.4/site-packages/pipeline/storage.py", line 26, in post_process
    packager.pack_stylesheets(package)
  File "python3.4/site-packages/pipeline/packager.py", line 96, in pack_stylesheets
    variant=package.variant, **kwargs)
  File "python3.4/site-packages/pipeline/packager.py", line 105, in pack
    paths = self.compile(package.paths, force=True)
  File "python3.4/site-packages/pipeline/packager.py", line 99, in compile
    return self.compiler.compile(paths, force=force)
  File "python3.4/site-packages/pipeline/compilers/__init__.py", line 56, in compile
    return list(executor.map(_compile, paths))
  File "/usr/local/lib/python3.4/concurrent/futures/_base.py", line 549, in result_iterator
    yield future.result()
  File "/usr/local/lib/python3.4/concurrent/futures/_base.py", line 402, in result
    return self.__get_result()
  File "/usr/local/lib/python3.4/concurrent/futures/_base.py", line 354, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.4/concurrent/futures/thread.py", line 54, in run
    result = self.fn(*self.args, **self.kwargs)
  File "python3.4/site-packages/pipeline/compilers/__init__.py", line 42, in _compile
    outdated = compiler.is_outdated(input_path, output_path)
  File "python3.4/site-packages/pipeline/compilers/__init__.py", line 85, in is_outdated
    return self.storage.modified_time(infile) > self.storage.modified_time(outfile)
  File "python3.4/site-packages/storages/backends/s3boto.py", line 480, in modified_time
    return parse_ts(entry.last_modified)
AttributeError: 'NoneType' object has no attribute 'last_modified'

It wasn't until I came across this solution that I found what worked for me. Here's the storage I ended up using; it saves the file locally as well as to S3, which got me past all the errors:

from django.contrib.staticfiles.storage import ManifestFilesMixin
from django.core.files.storage import get_storage_class
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage


class StaticStorage(PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    """Custom storage for static content."""

    def __init__(self, *args, **kwargs):
        super(StaticStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            'django.contrib.staticfiles.storage.StaticFilesStorage')()

    def save(self, name, content):
        name = super(StaticStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
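
For completeness, a class like this is then wired up in settings with a dotted path (the module path below is hypothetical; point it at wherever you define StaticStorage):

STATICFILES_STORAGE = 'myproject.storages.StaticStorage'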