Amazon S3: copy a directory to another directory

Published 2020-08-09 10:11

Question:

How do I copy/duplicate a folder that contains sub-folders and files into another directory within the same S3 bucket, using the PHP API?

$s3->copy_object copies only the folder object itself, not the files and sub-folders inside it.

Do I have to use $s3->list_objects to get all the files and directories, then run $s3->copy_object on every single file/directory?

Answer 1:

S3 is not a filesystem, it's an object store. Folders don't actually exist in any tangible sense; a folder is just something you can call a shared prefix. Put another way, if you create path/to/one and path/to/two, it doesn't also cause path and path/to to exist. If you see them, that's because some component took a list of objects, split their keys on /, and decided to display that list as a hierarchy.

You want to "duplicate a folder into another folder". Rephrasing this into S3 terms, you want to "duplicate all objects with the same prefix into objects with a different prefix". Saying it that way makes the method clear: get a list of objects with the one prefix, then copy each of them.
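The key-rewriting step in that recipe can be sketched as pure string logic. This is a minimal illustration, not any SDK's API; the function names are my own, and in a real script each (source, destination) pair would drive one copy-object request against keys returned by a list-objects call:

```python
def remap_key(src_key: str, src_prefix: str, dst_prefix: str) -> str:
    """Map a key under src_prefix to the corresponding key under dst_prefix."""
    if not src_key.startswith(src_prefix):
        raise ValueError(f"{src_key!r} is not under {src_prefix!r}")
    return dst_prefix + src_key[len(src_prefix):]

def plan_folder_copy(keys, src_prefix, dst_prefix):
    """Return (source, destination) key pairs for every object under src_prefix.

    `keys` stands in for the result of listing objects with Prefix=src_prefix;
    keys outside the prefix are skipped.
    """
    return [(k, remap_key(k, src_prefix, dst_prefix))
            for k in keys if k.startswith(src_prefix)]

# "path/to/one/" has two objects (one in a sub-folder); "other/c.txt" is ignored.
pairs = plan_folder_copy(
    ["path/to/one/a.txt", "path/to/one/sub/b.txt", "other/c.txt"],
    "path/to/one/", "path/to/two/")
# pairs == [("path/to/one/a.txt", "path/to/two/a.txt"),
#           ("path/to/one/sub/b.txt", "path/to/two/sub/b.txt")]
```

Note that sub-folders need no special handling: they are just longer keys sharing the prefix, so the same rewrite covers them.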



Answer 2:

Scala code (copying between folders within one bucket):

import com.amazonaws.{AmazonClientException, AmazonServiceException}
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.transfer.{Copy, TransferManager, TransferManagerBuilder}
import org.slf4j.LoggerFactory
import scala.collection.JavaConverters._

val log = LoggerFactory.getLogger("s3-folder-copy")
val s3 = AmazonS3ClientBuilder.standard.build

def copyFolders(bucketName: String, srcFolder: String, targetFolder: String): Unit = {
  val transferManager: TransferManager = TransferManagerBuilder.standard.build
  try {
    // Note: a single listObjects call returns at most 1,000 keys; for larger
    // folders, page through the results (e.g. listNextBatchOfObjects).
    for (file <- s3.listObjects(bucketName, s"$srcFolder/").getObjectSummaries.asScala) {
      val fileName = file.getKey.replace(s"$srcFolder/", "")
      if (!fileName.isEmpty) {
        val transferProcess: Copy = transferManager.copy(bucketName, file.getKey,
          bucketName, s"$targetFolder/$fileName")
        log.info(s"Old key = ${file.getKey}")
        log.info(s"New key = $targetFolder/$fileName")
        transferProcess.waitForCompletion()
      }
    }
  } catch {
    case e: AmazonServiceException =>
      log.error(e.getErrorMessage, e)
      System.exit(1)
    case e: AmazonClientException =>
      log.error("Amazon client error: " + e.getMessage, e)
      System.exit(1)
    case e: InterruptedException =>
      log.error("Transfer interrupted: " + e.getMessage, e)
      System.exit(1)
  }
}

Usage:

copyFolders("mybucket", "somefolder/srcfolder", "somefolder/targetfolder")


Answer 3:

One way to do it is to list the objects and copy each object one by one. Another way is to use s3fs (s3fs-fuse), which mounts your S3 bucket as a local directory; then you can just apply a simple command like 'mv' to move the files. (Under the hood each 'mv' is still a copy-and-delete, since S3 has no rename operation.)



Answer 4:

Here is some code taken right from Amazon's documentation. It copies a single object, then duplicates that same object three times to the target as a batch; what you need to do is change it so that it loops through each key returned by a list-objects call and adds one copy command per key to the batch.

<?php

// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$sourceBucket = '*** Your Source Bucket Name ***';
$sourceKeyname = '*** Your Source Object Key ***';
$targetBucket = '*** Your Target Bucket Name ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Copy an object.
$s3->copyObject(array(
    'Bucket'     => $targetBucket,
    'Key'        => "{$sourceKeyname}-copy",
    'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
));

// Perform a batch of CopyObject operations.
$batch = array();
for ($i = 1; $i <= 3; $i++) {
    $batch[] = $s3->getCommand('CopyObject', array(
        'Bucket'     => $targetBucket,
        'Key'        => "{$sourceKeyname}-copy-{$i}",
        'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
    ));
}
try {
    $successful = $s3->execute($batch);
    $failed = array();
} catch (\Guzzle\Service\Exception\CommandTransferException $e) {
    $successful = $e->getSuccessfulCommands();
    $failed = $e->getFailedCommands();
}
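The "one batch entry per listed key" adaptation the answer describes can be sketched as pure logic. This is an illustration, not the PHP SDK itself: each dict below mirrors the parameter array you would pass to $s3->getCommand('CopyObject', ...) and append to $batch, and the function name is my own:

```python
def build_copy_batch(keys, source_bucket, target_bucket, src_prefix, dst_prefix):
    """Build one CopyObject parameter dict per listed key, rewriting the prefix.

    `keys` stands in for the keys returned by listing source_bucket with
    Prefix=src_prefix; keys outside the prefix are skipped.
    """
    batch = []
    for key in keys:
        if not key.startswith(src_prefix):
            continue
        batch.append({
            "Bucket": target_bucket,
            "Key": dst_prefix + key[len(src_prefix):],
            "CopySource": f"{source_bucket}/{key}",
        })
    return batch

batch = build_copy_batch(
    ["src/a.txt", "src/sub/b.txt"], "bucket-a", "bucket-b", "src/", "dst/")
# batch[0] == {"Bucket": "bucket-b", "Key": "dst/a.txt",
#              "CopySource": "bucket-a/src/a.txt"}
```

Building the full parameter list first, then executing it as one batch, keeps the listing logic separate from the copying and matches the try/catch structure of the sample above, where failed commands can be collected and retried.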