I have both public and private files which I serve from Amazon CloudFront. The public files work fine, but now I'd like to secure some of them as private, with an authenticated read.
The private files have their own uploader, DocumentUploader. Do the files need to be stored in separate buckets? As it is now, they are all in the one bucket.
I've done something similar with Paperclip a while back, but can't seem to find a good resource for doing it with CarrierWave using a timed authenticated URL.
I see they have something like it here:
But I'm not sure how to implement it.
Any tips would be greatly appreciated.
It depends how secure you need it, but you can set the file permissions on the particular uploader class itself, overriding the default permissions, like so:
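A minimal sketch of that override, assuming fog storage and the existing DocumentUploader (fog_public and fog_authenticated_url_expiration are CarrierWave's fog storage options; the expiration value here is just an example):

```ruby
# app/uploaders/document_uploader.rb
class DocumentUploader < CarrierWave::Uploader::Base
  storage :fog

  # Store these files with a private ACL instead of the default public-read.
  def fog_public
    false
  end

  # How long generated authenticated URLs stay valid, in seconds.
  def fog_authenticated_url_expiration
    600 # 10 minutes
  end
end
```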
URLs for files from this uploader will then automatically be appended with the temporary AWS expiration and access keys, and future uploads will be set to private, i.e. not publicly accessible.
https://s3.amazonaws.com/uploads/something/1234/124.pdf?AWSAccessKeyId=AKIAJKOSTQ6UXXLEWIUQ&Signature=4yM%2FF%2F5TV6t4b1IIvjseenRrb%2FY%3D&Expires=1379152321
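Reading the URL back out works the same as before; this is a sketch assuming a Document model that mounts the uploader on a file column:

```ruby
# Assuming: mount_uploader :file, DocumentUploader in the Document model.
document = Document.find(params[:id])
document.file.url # => a signed, expiring S3 URL once fog_public is false
```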
As far as I can see, you may need to create another bucket for the secured files.
You can implement the security for your 'private' files yourself. In your model (if you have one) you can add a field that marks whether the file is private or not, and then manage this scenario in your controller, as in the sketch below.
One nice gem you can use is CanCan. With it you can manage the model and some attributes (the secure field) and grant or deny authorization based on your user's profile.
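A rough sketch of that idea; the Document model, its private boolean column, and the controller action are assumptions here, and authorize! is CanCan's standard check:

```ruby
# app/controllers/documents_controller.rb
# Assumes a documents table with a boolean `private` column and a Document
# model that mounts DocumentUploader on `file`.
class DocumentsController < ApplicationController
  def show
    @document = Document.find(params[:id])

    # CanCan ability check; raises CanCan::AccessDenied for unauthorized users.
    authorize! :read, @document if @document.private?

    # With a private uploader, `url` returns a signed, expiring link.
    redirect_to @document.file.url
  end
end
```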
Using gem 'aws-sdk', '~> 2.10' and gem 'carrierwave-aws', '~> 1.1', you can set up the CarrierWave config in a separate uploader, like this.
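A sketch of what that per-uploader setup could look like; the bucket name and environment variables are placeholders, and it assumes carrierwave-aws's documented options (aws_acl, aws_authenticated_url_expiration, aws_credentials) can be set on the uploader class via CarrierWave's usual config accessors:

```ruby
# app/uploaders/document_uploader.rb
class DocumentUploader < CarrierWave::Uploader::Base
  storage :aws

  self.aws_bucket = ENV['S3_PRIVATE_BUCKET']       # placeholder bucket name
  self.aws_acl    = 'private'                      # objects are not publicly readable
  self.aws_authenticated_url_expiration = 60 * 10  # signed URLs valid for 10 minutes
  self.aws_credentials = {
    access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region:            ENV['AWS_REGION']
  }

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
```

A public uploader can keep the global defaults (e.g. aws_acl 'public-read'), so the two uploaders can point at the same bucket or at different ones, whichever you prefer.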