How to unlock the file after AWS S3 Helper upload

Posted 2020-03-01 07:29

I am using the official PHP SDK, with its official Laravel service provider, to upload an image to Amazon S3. The image is stored temporarily on my server and should be deleted after the upload. Here is the code I use to upload and delete:

$temp_path = "/screenshot_temp/testing.png";

$client = AWS::createClient('s3');
$result = $client->putObject(array(
        'Bucket'     => self::$bucketName,
        'Key'        => 'screenshot/testing.png',
        'SourceFile' => $temp_path,
        'ACL'    => 'public-read'
    ));

chown($temp_path, 777);
unlink($temp_path);

The upload is successful: I can open the image via the returned link, and I can see it in the Amazon console. The problem is that the delete fails, with the following error message:

ErrorException: unlink(... path of my file ...): Permission denied

I am sure my file permissions are correct, and I am able to delete the file when the S3 upload section is commented out. So the file must be locked while it is being uploaded. Is there a way to unlock and delete my file?

4 Answers
The star · 2020-03-01 07:43

When you use the SourceFile option with putObject, the S3Client opens the file but does not close it after the operation.

In most cases you can simply unset() $client and/or $result to close the open file handles, but unfortunately not in this case.

Use the Body option instead of SourceFile:

// open the temp file yourself, so you control when the handle is closed
$file = fopen($temp_path, "r");

// pass the resource, not the path
$result = $client->putObject(array(
        'Bucket'     => self::$bucketName,
        'Key'        => 'screenshot/testing.png',
        'Body'       => $file,
        'ACL'        => 'public-read'
    ));

fclose($file);

unlink($temp_path);
霸刀☆藐视天下 · 2020-03-01 07:45

I'm not a PHP guy, but I'd try popping that bad boy into a stream and then passing the stream to the SDK.

That way, you can explicitly close the stream and then delete the temp file. You may even be able to eliminate the temporary file altogether and deal solely with streams, if that's allowed by your specific use-case.

Looks like this SO post might set you on the right track.
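To illustrate the all-streams idea, here is a minimal sketch that uploads directly from an in-memory stream and never touches a temp file. The `$imageBinary` variable is a placeholder for the raw PNG bytes (it is not from the question), and the bucket/key mirror the asker's example:

```php
// Sketch: upload straight from an in-memory stream, skipping the temp file.
// $imageBinary is assumed to already hold the raw PNG bytes.
$stream = fopen('php://memory', 'r+');
fwrite($stream, $imageBinary);
rewind($stream);

$client->putObject(array(
    'Bucket' => self::$bucketName,
    'Key'    => 'screenshot/testing.png',
    'Body'   => $stream,
    'ACL'    => 'public-read'
));

// We own the handle, so we can close it explicitly -- and there is
// no temp file left on disk to unlink.
fclose($stream);
```

Since the handle is opened and closed in your own code, nothing in the SDK can keep the file locked after the upload.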

beautiful° · 2020-03-01 07:46

Yes, the upload locks the file until it finishes. Try either of these two approaches:

$client = AWS::createClient('s3');
// read the file into memory so no file handle stays open
$fileContent = file_get_contents($temp_path);
$result = $client->putObject(array(
    'Bucket'     => self::$bucketName,
    'Key'        => 'screenshot/testing.png',
    'Body'       => $fileContent,
    'ACL'        => 'public-read'
));

unlink($temp_path);

or

$client = AWS::createClient('s3');
$fileContent = file_get_contents($temp_path);
$result = $client->putObject(array(
    'Bucket'     => self::$bucketName,
    'Key'        => 'screenshot/testing.png',
    'Body'       => $fileContent,
    'ACL'        => 'public-read'
));

// force garbage collection so any lingering handle references are released
gc_collect_cycles();
unlink($temp_path);
小情绪 Triste * · 2020-03-01 07:51

EDIT: I just noticed that the string in your $temp_path begins with a "/" slash character. A leading slash is resolved from the filesystem root, not from your project directory. Are you sure this is the correct location? Use PHP's getcwd() function to find out which directory PHP thinks it is running in.
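As a quick check, something along these lines (a debugging sketch, not part of the asker's code) will show what PHP actually sees for that path:

```php
// Debugging sketch: verify the path before calling unlink()
echo getcwd() . PHP_EOL;                       // directory PHP resolves relative paths from
var_dump(realpath($temp_path));                // false if the path does not resolve to a real file
var_dump(is_writable(dirname($temp_path)));    // unlink() needs write permission on the directory
```

Note the last line: on Unix-like systems, deleting a file requires write permission on its parent directory, not on the file itself, which is often the real cause of a "Permission denied" from unlink().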

I understand you believe the permissions are correct, but given the "Permission denied" error I still think it is telling you something relevant.

I also see you are calling chown() on the file; did you perhaps mean chmod()? chown() changes the owner and expects a user name or id, while chmod() changes permissions and takes an octal mode such as 0777. If you can SSH to your server, you may have more luck running this command:

chmod -R 777 /your-website-dir/screenshot_temp

Or even try changing the chown() call to chmod() in your PHP code.
