Rails CarrierWave private files on S3 and CloudFront

Posted 2020-02-25 22:50

I have both public and private files which I serve from Amazon CloudFront. The public files work fine, but now I'd like to secure some of them as private with an authenticated read.

The private files have their own uploader, DocumentUploader. Do the files need to be stored in separate buckets? As it is now they are all in the one bucket.

I've done something similar with Paperclip a while back, but I can't seem to find a good resource for doing it with CarrierWave using a timed authenticated_url.

I see they have something like it here:

http://www.rdoc.info/github/jnicklas/carrierwave/5d1cb7e6a4e8a4786c2b/CarrierWave/Storage/Fog/File#authenticated_url-instance_method

But I'm not sure how to implement it.

Any tips would be greatly appreciated.

3 Answers
虎瘦雄心在
#2 · 2020-02-25 23:30

It depends how secure you need it to be, but you can set the file permissions on the particular uploader class itself, overriding the defaults like so:

class SomeUploader < CarrierWave::Uploader::Base

  # Store this uploader's files privately instead of with a public-read ACL
  def fog_public
    false
  end

  # How long signed URLs stay valid, in seconds from now (default is 10.minutes)
  def fog_authenticated_url_expiration
    5.minutes
  end

  # ...
end

That will automatically cause URLs for files from this uploader to be appended with temporary AWS access key and expiration parameters, and future uploads will be set to private, i.e. not publicly accessible:

https://s3.amazonaws.com/uploads/something/1234/124.pdf?AWSAccessKeyId=AKIAJKOSTQ6UXXLEWIUQ&Signature=4yM%2FF%2F5TV6t4b1IIvjseenRrb%2FY%3D&Expires=1379152321
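
For context, here is a minimal sketch of how such a URL gets consumed; the Report model and its mounted column are illustrative assumptions, not from the question. Because fog_public returns false, CarrierWave's fog storage signs the URL returned by #url:

class Report < ApplicationRecord
  # Hypothetical model; attach the private uploader from above
  mount_uploader :file, SomeUploader
end

# With fog_public false, #url returns a time-limited signed URL that
# expires after fog_authenticated_url_expiration (5 minutes here).
report = Report.last
report.file.url
# => "https://s3.amazonaws.com/uploads/.../124.pdf?AWSAccessKeyId=...&Signature=...&Expires=..."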

【Aperson】
#3 · 2020-02-25 23:34

As far as I can see, you may need to create another bucket for the secured files.

You can implement the security for your 'private' files on your own: in your model (if you have one) you can add a field that indicates whether the file is secure or not, then handle that scenario in your controller.

One nice gem you can use is CanCan. With it you can manage the model and its attributes (the secure field) and grant or deny authorization based on your user's profile.
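
As a rough sketch of that approach (the Document model, its secure boolean column, and the mounted file uploader are assumptions for illustration), the ability and controller could look something like this:

# app/models/ability.rb
class Ability
  include CanCan::Ability

  def initialize(user)
    # Anyone may read non-secure documents
    can :read, Document, secure: false
    # Signed-in users may also read their own secure documents
    can :read, Document, user_id: user.id if user
  end
end

# app/controllers/documents_controller.rb
class DocumentsController < ApplicationController
  load_and_authorize_resource  # raises CanCan::AccessDenied when not allowed

  def show
    # Redirect to the (possibly signed) file URL once authorized
    redirect_to @document.file.url
  end
end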

够拽才男人
#4 · 2020-02-25 23:38

You can set up the CarrierWave config in a separate uploader, like this.

This uses gem 'aws-sdk', '~> 2.10' and gem 'carrierwave-aws', '~> 1.1'.
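
For reference, the Gemfile entries for those versions would be:

    # Gemfile
    gem 'aws-sdk',         '~> 2.10'
    gem 'carrierwave-aws', '~> 1.1'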

    class BusinessDocumentUploader < CarrierWave::Uploader::Base

      def initialize(*)
        super

        CarrierWave.configure do |config|
          config.storage    = :aws
          config.aws_bucket = Rails.application.secrets.aws_bucket

          # acl accepts private, public-read, public-read-write, authenticated-read,
          # aws-exec-read, bucket-owner-read, bucket-owner-full-control
          config.aws_acl    = 'private'

          # Optionally define an asset host for configurations that are fronted by a
          # content host, such as CloudFront.
          config.asset_host = Rails.application.secrets.aws_asset_host

          # The maximum period for authenticated_urls is only 7 days.
          config.aws_authenticated_url_expiration = 60 * 60 * 24 * 7
          # config.aws_authenticated_url_expiration = 2

          # Set custom options such as cache control to leverage browser caching
          config.aws_attributes = {
            expires: 1.week.from_now.httpdate,
            cache_control: 'max-age=604800'
          }

          config.aws_credentials = {
            access_key_id:     Rails.application.secrets.aws_access_key_id,
            secret_access_key: Rails.application.secrets.aws_secret_access_key,
            region:            Rails.application.secrets.aws_region # Required
          }
        end
      end
    end
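
As a usage sketch (the BusinessDocument model is an assumption for illustration): with aws_acl set to 'private', carrierwave-aws serves signed, expiring URLs from #url instead of plain public ones.

    # Hypothetical model mounting the uploader configured above
    class BusinessDocument < ApplicationRecord
      mount_uploader :file, BusinessDocumentUploader
    end

    # Because the ACL is private, #url yields a signed URL valid for up to
    # aws_authenticated_url_expiration seconds (7 days in the config above).
    BusinessDocument.last.file.url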