How to point a subdomain at an S3 bucket?

Posted 2019-01-22 08:18

Question:

Good morning,

I am using an Amazon S3 bucket as my image server, and I want to address the bucket through a subdomain of my site. For example, a picture currently lives at https://s3-sa-east-1.amazonaws.com/nomeBucket/pasta/imag.png, and I access it through that same link.

I would like it to look like this instead: imagens.mydomain.com.br/folder/imag.png. Is there any way I can do this, i.e. point a subdomain at a bucket? I've tried Amazon Route 53 with a CNAME pointing to this: https://s3-sa-east-1.amazonaws.com/nomeBucket/

I tested it yesterday, but apparently it did not work. Has anyone done something similar, and/or knows how to help me?

Note: I'm using nginx. Do I also need to configure it for the subdomain?

Thank you

Answer 1:

You need to rename your bucket to match the custom domain name (e.g. imagens.mydomain.com.br) and set up that domain as a CNAME to

<bucket-name>.s3.amazonaws.com.

(in your case, imagens.mydomain.com.br.s3.amazonaws.com).

The full instructions are available here:

http://docs.aws.amazon.com/AmazonS3/latest/dev/VirtualHosting.html
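As a concrete sketch, the resulting DNS record (shown here in BIND zone-file syntax; the hostname comes from the question, and this assumes the bucket has already been renamed to match it) would look something like:

```
imagens.mydomain.com.br.  300  IN  CNAME  imagens.mydomain.com.br.s3.amazonaws.com.
```

In Route 53 this corresponds to a CNAME record whose name is imagens.mydomain.com.br and whose value is imagens.mydomain.com.br.s3.amazonaws.com.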



Answer 2:

I'm going to build on the other answers here for completeness.

I have moved my bucket to a subdomain so that the contents can be cached by Cloudflare.

  • Old S3 Bucket Name: autoauctions
  • New S3 Bucket Name: img.autoauctions.io
  • CNAME record: img.autoauctions.io.s3.amazonaws.com

Now you'll need to copy all of your objects since you cannot rename a bucket. Here's how to do that with AWS CLI:

pip install awscli
aws configure
  • Go to https://console.aws.amazon.com/iam/home and create a user, or open an existing one.
  • Go to the user's Security credentials tab.
  • Click Create access key and copy the access key ID and secret; aws configure will prompt for them, along with your default AWS region.

Now you'll copy your old bucket contents to your new bucket.

aws s3 sync s3://autoauctions s3://img.autoauctions.io

I found this to be too slow for the 1TB of images I needed to copy, so I increased the number of concurrent connections and re-ran the sync from an EC2 instance.

aws configure set default.s3.max_concurrent_requests 400

Sync it up!


Want to make folders within your bucket public? Create a bucket policy like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": [
                "arn:aws:s3:::img.autoauctions.io/copart/*",
                "arn:aws:s3:::img.autoauctions.io/iaai/*"
            ]
        }
    ]
}
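If you have many folders to expose, you can generate the policy instead of writing it by hand. Here is a minimal Python sketch; the helper function name is hypothetical (not part of any SDK), and the commented-out boto3 call at the end is the standard way to apply a policy if you have credentials with s3:PutBucketPolicy:

```python
import json

def public_read_policy(bucket, prefixes):
    """Build an S3 bucket policy granting public read access to the
    given prefixes (folders). Hypothetical helper for illustration."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                # One ARN per folder; /* matches every object under it.
                "Resource": [f"arn:aws:s3:::{bucket}/{p}/*" for p in prefixes],
            }
        ],
    }

policy = public_read_policy("img.autoauctions.io", ["copart", "iaai"])
print(json.dumps(policy, indent=4))

# To apply it (requires boto3 and appropriate credentials):
# import boto3
# boto3.client("s3").put_bucket_policy(
#     Bucket="img.autoauctions.io", Policy=json.dumps(policy))
```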

And now the image loads from img.autoauctions.io via Cloudflare's cache.

Hope this helps some people!