Laravel 5: How do you copy a local file to Amazon S3?

Posted 2019-04-29 16:55

I'm writing code in Laravel 5 to periodically backup a MySQL database. My code thus far looks like this:

    $filename = 'database_backup_'.date('G_a_m_d_y').'.sql';
    $destination = storage_path() . '/backups/';

    $database = \Config::get('database.connections.mysql.database');
    $username = \Config::get('database.connections.mysql.username');
    $password = \Config::get('database.connections.mysql.password');

    $sql = 'mysqldump ' . escapeshellarg($database)
        . ' --password=' . escapeshellarg($password)
        . ' --user=' . escapeshellarg($username)
        . ' --single-transaction > ' . $destination . $filename;

    exec($sql, $output, $status); // TODO: handle $status != 0 (0 means success)

    // Copy database dump to S3

    $disk = \Storage::disk('s3');

    // ????????????????????????????????
    //  What goes here?
    // ????????????????????????????????

I've seen solutions online that would suggest I do something like:

$disk->put('my/bucket/' . $filename, file_get_contents($destination . $filename));

However, for large files, isn't it wasteful to use file_get_contents()? Are there any better solutions?

4 Answers
我只想做你的唯一
#2 · 2019-04-29 17:06

Looking at the documentation, the only way is the put method, which needs the file contents. There is no method to copy a file between two filesystems, so the solution you gave is probably the only one at the moment.

If you think about it, when copying a file from the local filesystem to S3 you ultimately need the file's contents in order to put them into S3, so in my opinion it isn't really that wasteful.

Bombasti
#3 · 2019-04-29 17:11

You can always pass a file resource to stream the file (advisable for large files) by doing something like this:

Storage::disk('s3')->put('my/bucket/' . $filename, fopen('path/to/local/file', 'r'));

(Mode 'r' is sufficient here; write access to the local file is not needed.)

An alternative suggestion is proposed here. It uses Laravel's Storage facade to read the stream. The basic idea is something like this:

    $inputStream = Storage::disk('local')->getDriver()->readStream('/path/to/file');
    $destination = 'my/bucket/' . $filename; // path relative to the s3 disk
    Storage::disk('s3')->getDriver()->putStream($destination, $inputStream);
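
Under the hood, stream-based approaches like this avoid buffering because PHP copies stream-to-stream in fixed-size chunks. A minimal plain-PHP sketch of the same idea, outside Laravel (the file names `source.sql` and `dest.sql` are hypothetical):

```php
<?php
// Create a small sample source file for the demonstration (hypothetical name).
file_put_contents('source.sql', str_repeat("INSERT INTO t VALUES (1);\n", 1000));

$source = fopen('source.sql', 'rb'); // read the dump as a stream
$target = fopen('dest.sql', 'wb');   // write the copy as a stream

// stream_copy_to_stream() moves data between the two handles in chunks,
// so peak memory stays small no matter how large the file is.
$bytesCopied = stream_copy_to_stream($source, $target);

fclose($source);
fclose($target);

echo "Copied {$bytesCopied} bytes\n";
```

This is essentially what Flysystem's putStream does against the S3 API, which is why it scales to files that would not fit in memory.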
趁早两清
#4 · 2019-04-29 17:11

You can try this code:

    $contents = Storage::get($file);
    Storage::disk('s3')->put($newfile, $contents);

According to the Laravel documentation, this is the easiest way I found to copy data between two disks.
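
One caveat: Storage::get() reads the whole file into a PHP string, so memory usage grows with the file size, which is exactly the concern raised in the question. A quick plain-PHP illustration of the difference (the file name `big.sql` is hypothetical):

```php
<?php
// Write a ~5 MB sample file (hypothetical name).
file_put_contents('big.sql', str_repeat('a', 5 * 1024 * 1024));

// Loading the contents, as Storage::get() does internally,
// allocates a string at least as large as the file itself.
$before = memory_get_usage();
$contents = file_get_contents('big.sql');
$allocated = memory_get_usage() - $before;

// Opening a stream handle instead costs almost nothing up front;
// data is only pulled through in chunks when the handle is consumed.
$handle = fopen('big.sql', 'rb');

echo 'String approach allocated at least the file size: ';
echo ($allocated >= 5 * 1024 * 1024) ? "yes\n" : "no\n";
fclose($handle);
```

For small database dumps this is fine; for large ones, the stream-based answers are the safer choice.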

爷的心禁止访问
#5 · 2019-04-29 17:26

There is a way to copy files without needing to load the file contents into memory.

You will also need to import the following:

use League\Flysystem\MountManager;

Now you can copy the file like so:

$mountManager = new MountManager([
    's3' => \Storage::disk('s3')->getDriver(),
    'local' => \Storage::disk('local')->getDriver(),
]);
$mountManager->copy('local://path/to/file.txt', 's3://path/to/file.txt');

Use move() instead of copy() if the local file should be deleted once the upload is done.