I am using Amazon S3, but I am facing two problems:
1. I can't upload files directly to the Amazon server when I submit the form. Right now I have to save the images to an upload folder first, and then retrieve them from there and upload them to S3. Is there a way to upload images directly to S3 when we click on submit?
2. If I pass 'public' in the S3 put object call, then I can access and view the files, but making them public means everyone can view them. I need to protect all the files and make them viewable only by authenticated users. Can anyone suggest how to fix this issue?
try {
    $s3 = \Storage::disk('s3');
    // Third argument sets the object's visibility
    $s3->put($strFileName, file_get_contents($img_path.$strFileName), 'public');
} catch (\Aws\S3\Exception\S3Exception $e) {
    // Use "." (not "+") to concatenate strings in PHP
    echo "There was an error uploading the file.\n" . $e->getMessage();
}
Before asking this question I read many answers on Stack Overflow, but they didn't help me fix my issue. Thanks.
1. is there a way to upload images directly when we click on submit
Yes
How:
You need to do this using JavaScript (with AJAX) in two parts;
a) when the user clicks "submit", trap this event and first upload the file (see http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/browser-examples.html for an example), and b) then submit the form through AJAX and handle the response.
However:
This allows the user to upload anything, which may cause problems. There are tips (just below the example) for creating authenticated URLs valid for 15 minutes (configurable), but what happens if a user takes longer than 15 minutes, tries to upload 100 files in 15 minutes, uploads something other than an image file, or uploads a badly formatted image file, etc.?
It's much safer to pull the file onto your server, verify it is an image and of the type/size you need, and then upload it from the server (a sketch of this follows below).
Of course, if this is a simple admin tool and you are controlling who accesses the code then go for it - hopefully you'll only upload what you expect.
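As a rough illustration of the server-side approach, here is a minimal Laravel controller sketch, assuming Laravel 5.x where $request->file()->store() is available; the controller name, the "image" form field and the "uploads" folder are my own placeholders, not from the question:

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class ImageUploadController extends Controller
{
    public function store(Request $request)
    {
        // Verify the upload really is an image of an acceptable type and size
        $this->validate($request, [
            'image' => 'required|image|mimes:jpeg,png,gif|max:2048', // max size in KB
        ]);

        // Stream the verified file straight to the s3 disk; no local upload folder needed
        $path = $request->file('image')->store('uploads', 's3');

        return back()->with('status', "Stored on S3 as {$path}");
    }
}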
2. i need to protect all files and view only to the authenticated user
By "authenticated user": if you mean "the user that uploaded the image" then s3 alone does not provide the functionality, but CloudFront does. You can issue pre-authorised URLs or signed cookies: http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-choosing-signed-urls-cookies.html
If "authenticated user" means the person that uploaded the file, then according to the docs, this is not possible in Laravel class without getting access to the underlying client.
public and private are your only visibility options by default, and they are translated to the public-read and private ACLs, but you need authenticated-read, bucket-owner-read or one of the other canned grants (ref: http://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl). If authenticated-read or the other canned ACL grants don't give the permissions profile you need, you can create your own (details further up on that same page).
The solution is to grab the underlying client and then call put-object directly (a sketch follows below). And if you go that far, you may as well ditch the Laravel library, pull in the S3 SDK and do it all yourself - then you have full control of everything.
For your first question: yes, it is possible to upload images directly to AWS S3.
You just need to specify the destination path and pass in the file you get from the form.
Here is the solution to your question, which I'm also using, but without Laravel.
1. For uploading any file directly to a specific folder in an Amazon AWS S3 bucket, you can do it like this.
HTML - a standard form with enctype="multipart/form-data" and a file input, posting to upload.php
PHP - upload.php (a combined sketch of the four steps below is shown after them)
Include aws php sdk
Initialize S3 Client
Create file upload entity
Upload to s3 Bucket
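A minimal sketch of what upload.php might look like, assuming the AWS SDK for PHP v3 installed via Composer; the region, bucket name, form field name and the uploads/ folder are placeholders:

<?php
// 1. Include aws php sdk (installed via: composer require aws/aws-sdk-php)
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// 2. Initialize S3 Client
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',                 // placeholder region
    'credentials' => [
        'key'    => getenv('AWS_ACCESS_KEY_ID'),
        'secret' => getenv('AWS_SECRET_ACCESS_KEY'),
    ],
]);

// 3. Create file upload entity from the submitted form field
$file = $_FILES['file'];                          // assumes <input type="file" name="file">
$key  = 'uploads/' . basename($file['name']);     // specific folder inside the bucket

// 4. Upload to s3 Bucket
try {
    $s3->putObject([
        'Bucket'     => 'your-bucket-name',       // placeholder bucket
        'Key'        => $key,
        'SourceFile' => $file['tmp_name'],
    ]);
} catch (S3Exception $e) {
    echo 'Upload failed: ' . $e->getMessage();
}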
2. Now, to serve the uploaded files to authenticated users only
Firstly, you will need to create a private bucket policy for the S3 bucket.
Bucket Policy - To generate the bucket policy you can use the AWS Policy Generator. Using this you can generate a policy like the one copied from the Improve.dk website.
Secondly, you will need to set up a private web distribution of CloudFront in front of the S3 bucket. Only then will you be able to serve your content to authenticated users only, through an AWS signed URL or signed cookie.
Thirdly, to generate the signed URL, you will need the .pem (private key) file, which you can only get from the AWS console.
Generate signed url
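A minimal sketch, assuming the AWS SDK for PHP v3 and its CloudFront getSignedUrl helper; the distribution domain, object path, key pair ID and .pem path are placeholders:

<?php
require 'vendor/autoload.php';

use Aws\CloudFront\CloudFrontClient;

$cloudFront = new CloudFrontClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Signed URL valid for 10 minutes for one object in the private distribution
$signedUrl = $cloudFront->getSignedUrl([
    'url'         => 'https://d111111abcdef8.cloudfront.net/uploads/photo.jpg', // placeholder
    'expires'     => time() + 600,
    'private_key' => '/path/to/cloudfront-private-key.pem',                      // placeholder
    'key_pair_id' => 'APKAEXAMPLEKEYID',                                         // placeholder
]);

echo $signedUrl;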
I recently tackled this problem. First off, yes, you can upload directly to S3. Here is what I used for some info on this: http://docs.aws.amazon.com/AmazonS3/latest/dev/HTTPPOSTExamples.html
First off, you need to create a policy and signature server side to add to your HTML form for uploading files (a sketch of this follows below).
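A minimal server-side sketch of generating those form fields, assuming the AWS SDK for PHP v3 and its PostObjectV4 helper; the bucket name, key prefix and expiry are placeholders, not from the answer:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',                   // placeholder region
]);

// Conditions the browser upload must satisfy
$options = [
    ['acl' => 'private'],
    ['bucket' => 'your-bucket-name'],           // placeholder bucket
    ['starts-with', '$key', 'uploads/'],
];

$formInputs = ['acl' => 'private', 'key' => 'uploads/${filename}'];

// Policy and signature valid for 15 minutes
$postObject = new PostObjectV4($client, 'your-bucket-name', $formInputs, $options, '+15 minutes');

// Render these into the upload <form>: action/method attributes and hidden inputs
$formAttributes = $postObject->getFormAttributes();
$hiddenInputs   = $postObject->getFormInputs();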
Now on the frontend, my form doesn't use a submit button. You could use one, but you would need to catch the submit event and prevent the form from actually submitting until the upload finishes.
When we click save, it generates an md5 hash (installed via npm) to use as the filename, so that file names can't really be guessed randomly; it then uses AJAX to upload the file to S3. After this finishes, it puts the file data and the returned AWS data in a hidden input and submits the form.
The controller listening for the submit then parses the JSON from that input field and creates an instance of Media (a model I created), storing the awsData and fileData for each image. Then, instead of pointing HTML image tags directly at the S3 file, I point them at a route in my application.
That route can then go through the normal auth middleware, which is all you need to do in Laravel. Finally, the route points to a controller action (sketched below) that simply issues a 301 redirect and sets the Location header to the actual AWS file. Since we generate an md5 filename when we upload the file to AWS, each filename is an md5 hash, so people can't randomly guess file URLs in the bucket.
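A rough sketch of that controller action; the Media model's awsUrl attribute, the route parameter and the controller name are my own assumptions:

<?php

namespace App\Http\Controllers;

use App\Media;

class MediaController extends Controller
{
    // Route: GET /media/{id}, protected by the auth middleware
    public function show($id)
    {
        $media = Media::findOrFail($id);

        // 301 redirect with the Location header set to the actual S3/CloudFront URL
        return redirect($media->awsUrl, 301);
    }
}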